Advanced search

Media (0)

No media matching your criteria is available on this site.

Other articles (16)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP means:
    Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it manages a set of dynamic tags for use in the context of the Semantic Web.
    XMP makes it possible to record, in the form of an XML document, information about a file: title, author, history (...)

  • Accepted formats

    28 January 2010, by

    The following commands provide information about the formats and codecs supported by the local ffmpeg installation (see the short example after this list):
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)

  • General document management

    13 May 2011, by

    MédiaSPIP never modifies the original document uploaded to it.
    For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while keeping the original downloadable in case it cannot be read in a web browser; and retrieving the original document's metadata to describe the file textually.
    The tables below explain what MédiaSPIP can do (...)
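
    As a quick complement to the "Accepted formats" teaser above, here is a hedged example (the output depends entirely on the local ffmpeg build) of filtering those two commands for a given codec or container:

    ffmpeg -codecs | grep -i h264
    ffmpeg -formats | grep -i flv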

On other sites (5011)

  • ffmpeg: StreamNotFound error

    27 January 2017, by KMG

    I'm using Adobe Media Server and pushing streams to this server using FMLE.

    My stream name: appvideo

    When I try to pull the video from the server using ffmpeg/ffplay, I get a "StreamNotFound" error, whereas rtmpdump can stream the video.

    Looking at the logs, all the ffmpeg requests use a stream name with the extension .flv appended, like [appvideo.flv], but rtmpdump does not append this extension (the name stays [appvideo]).

    FFmpeg/ffplay log on the Media Server:

    play    stream  2017-01-27  02:26:17    GMT appvideo    10.11.12.202    32715   11  17  _defaultRoot_   _defaultVHost_  live    _definst_   0   404 183.82.250.50   rtmp    -   rtmp://10.11.12.202:443/live    rtmp://10.11.12.202:443/live    -   --  4702122229742256497 3135    3631    normal  appvideo    -   -   rtmp://10.11.12.202:443/live/appvideo.flv   rtmp://10.11.12.202:443/live/appvideo.flv   -   flv 0   0.000000    0   -   0   0   -   -   -   -   1   -   --  -   -   -   -   -   -   -1  -1.000000   -

    Not sure why the request appends the .flv extension at the end, like this: rtmp://10.11.12.202:443/live/appvideo.flv
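
    A hedged workaround sketch, assuming ffmpeg was built with its native RTMP implementation (not librtmp): the application and play path can be given explicitly, and -rtmp_playpath is sent as specified rather than derived from the URL. The address and stream name below are the ones from the question; whether this stops the .flv suffix from showing up in the AMS logs is an assumption to verify.

    ffplay -rtmp_app live -rtmp_playpath appvideo -rtmp_live live rtmp://10.11.12.202:443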

  • ffmpeg version 2.6.8: Stream specifier ':a' in filtergraph description matches no streams

    16 December 2020, by Ashitaka

    I am not getting why this isn't working. I have tried to select the video stream with [0:v]/[0:1]/[0:v:0] and the audio stream with [0:a]/[0:0]/[0:0:0]; nothing worked.
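
    Before guessing at specifiers, it can help to list the streams each input actually contains. A hedged aside (ffprobe ships with ffmpeg; input.mp4 stands for any of the four inputs):

    ffprobe -v error -show_entries stream=index,codec_type input.mp4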

    Explaining the inputs:

    1. The 1st input is a video of varying resolution, onto which the filter adds padding to make it 600:480.

    2. The 2nd input is an overlay PNG already at a 5:4 ratio; it is just scaled to 600:480 before being overlaid in the filter.

    3. The 3rd and 4th inputs are also videos; I don't care if they get stretched, and they are stretched to 600:480.

    4. So finally there are 3 streams (1 overlaid video and 2 stretched videos) which need to be concatenated.

    here's the command :

    ffmpeg \
-i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612.mp4' \
-i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_overlay.png' \
-i '/home/vidinflux/public_html/assets/user/736/video/Lines1.mp4' \
-i '/home/vidinflux/public_html/assets/user/736/video/Lines11.mp4' \
-filter_complex \
"[0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a]" \
-map "[v]" \
-map "[a]" \
-c:v libx264 \
-shortest /home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_final.mp4

    This is the complete error I am getting:

    Stream specifier ':a' in filtergraph description  [0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a] matches no streams.

    Also, there are these warnings:

    [Parsed_setsar_9 @ 0x219fba0] num:den syntax is deprecated, please use num/den or named options instead
[Parsed_setsar_11 @ 0x21a4840] num:den syntax is deprecated, please use num/den or named options instead
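
    As the warning itself suggests, the num:den form is deprecated; a small sketch of the fix, applied to this filtergraph, is to write setsar=1 instead of setsar=1:1, e.g.:

    [2:v]scale=600:480,setsar=1[x0]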

    Complete log as requested:

    [root@cloud ~]# ffmpeg -i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612.mp4' -i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_overlay.png' -i '/home/vidinflux/public_html/assets/user/736/video/Lines1.mp4' -i '/home/vidinflux/public_html/assets/user/736/video/Lines11.mp4' -filter_complex \ "[0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a]" -map "[v]" -map "[a]" -c:v libx264 -shortest /home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_final.mp4
ffmpeg version 2.6.8 Copyright (c) 2000-2016 the FFmpeg developers
  built with gcc 4.4.7 (GCC) 20120313 (Red Hat 4.4.7-16)
  configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic' --enable-bzlib --disable-crystalhd --enable-gnutls --enable-ladspa --enable-libass --enable-libdc1394 --enable-libfaac --enable-nonfree --disable-indev=jack --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-openal --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libx264 --enable-libx265 --enable-libxvid --enable-x11grab --enable-avfilter --enable-avresample --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-runtime-cpudetect
  libavutil      54. 20.100 / 54. 20.100
  libavcodec     56. 26.100 / 56. 26.100
  libavformat    56. 25.101 / 56. 25.101
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 11.102 /  5. 11.102
  libavresample   2.  1.  0 /  2.  1.  0
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  1.100 /  1.  1.100
  libpostproc    53.  3.100 / 53.  3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 0
    compatible_brands: isommp42
    creation_time   : 2017-08-21 02:23:24
  Duration: 00:02:17.23, start: 0.000000, bitrate: 417 kb/s
    Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p(tv, bt709), 640x360 [SAR 1:1 DAR 16:9], 318 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
    Metadata:
      handler_name    : VideoHandler
    Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 96 kb/s (default)
    Metadata:
      creation_time   : 2017-08-21 02:23:24
      handler_name    : IsoMedia File Produced by Google, 5-11-2011
Input #1, png_pipe, from '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_overlay.png':
  Duration: N/A, bitrate: N/A
    Stream #1:0: Video: png, rgba, 600x479, 25 tbr, 25 tbn, 25 tbc
Input #2, mov,mp4,m4a,3gp,3g2,mj2, from '/home/vidinflux/public_html/assets/user/736/video/Lines1.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 1
    compatible_brands: mp41mp42isom
    creation_time   : 2018-01-31 22:40:09
  Duration: 00:00:04.90, start: 0.103811, bitrate: 846 kb/s
    Stream #2:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt470bg/bt709), 1920x1080, 827 kb/s, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 90k tbn, 180k tbc (default)
    Metadata:
      creation_time   : 2018-01-31 22:40:10
      handler_name    : Core Media Video
Input #3, mov,mp4,m4a,3gp,3g2,mj2, from '/home/vidinflux/public_html/assets/user/736/video/Lines11.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 1
    compatible_brands: mp41mp42isom
    creation_time   : 2018-01-31 22:40:09
  Duration: 00:00:04.90, start: 0.103811, bitrate: 846 kb/s
    Stream #3:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt470bg/bt709), 1920x1080, 827 kb/s, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 90k tbn, 180k tbc (default)
    Metadata:
      creation_time   : 2018-01-31 22:40:10
      handler_name    : Core Media Video
[Parsed_setsar_9 @ 0x17c8ba0] num:den syntax is deprecated, please use num/den or named options instead
[Parsed_setsar_11 @ 0x17cd840] num:den syntax is deprecated, please use num/den or named options instead
Stream specifier ':a' in filtergraph description  [0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a] matches no streams.
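
    A hedged reading of the log above: inputs #2 and #3 each expose only a single video stream (Stream #2:0 and Stream #3:0, both Video, no audio), so the labels [2:a] and [3:a] in the concat line match no streams, which is exactly what the error reports. One workaround sketch, untested against these exact files, is to add silent audio for those two clips from anullsrc (the extra -f lavfi inputs become inputs 4 and 5) and hand that to concat instead:

    ffmpeg -i main.mp4 -i overlay.png -i Lines1.mp4 -i Lines11.mp4 \
    -f lavfi -t 4.9 -i anullsrc=channel_layout=stereo:sample_rate=44100 \
    -f lavfi -t 4.9 -i anullsrc=channel_layout=stereo:sample_rate=44100 \
    -filter_complex "...;[x0][4:a][z][a0][x1][5:a]concat=n=3:v=1:a=1[v][a]" \
    -map "[v]" -map "[a]" -c:v libx264 out.mp4

    (main.mp4, overlay.png and out.mp4 are placeholders, the "..." elides the rest of the original filtergraph, and -t 4.9 matches the 4.9 s duration the log shows for both clips.)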

  • FFmpeg - feed raw frames via pipe - FFmpeg does not detect pipe closure

    8 September 2018, by Rumble

    I'm trying to follow these examples from C++ on Windows: Python Example, C# Example.

    I have an application that produces raw frames that shall be encoded with FFmpeg.
    The raw frames are transferred via an IPC pipe to FFmpeg's STDIN. That is working as expected; FFmpeg even displays the number of frames currently available.

    The problem occurs when we are done sending frames. When I close the write end of the pipe, I would expect FFmpeg to detect that, finish up and output the video. But that does not happen; FFmpeg stays open and seems to wait for more data.
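
    One assumption worth testing (nothing in the question confirms it): because the pipe is created with bInheritHandle = true, the child process may inherit the write end as well, so closing the parent's copy never fully closes the pipe and FFmpeg sees no EOF. A minimal Win32 sketch of the usual countermeasure, placed right after CreatePipe:

    // Keep the write end out of the child: with no inherited duplicate left open,
    // CloseHandle(PipeWriteEnd) really closes the pipe and FFmpeg gets EOF on stdin.
    SetHandleInformation(PipeWriteEnd, HANDLE_FLAG_INHERIT, 0);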

    I made a small test project in Visual Studio.

    #include "stdafx.h"
    //// stdafx.h
    //#include "targetver.h"
    //#include
    //#include
    //#include <iostream>

    #include "Windows.h"
    #include <iostream> // cout/endl are used below; the stdafx.h include is commented out in this listing
    #include <cstdlib>

    using namespace std;

    bool WritePipe(void* WritePipe, const UINT8 *const Buffer, const UINT32 Length)
    {
       if (WritePipe == nullptr || Buffer == nullptr || Length == 0)
       {
           cout << __FUNCTION__ << ": Some input is useless";
           return false;
       }

       // Write to pipe
       UINT32 BytesWritten = 0;
       UINT8 newline = '\n';
       bool bIsWritten = WriteFile(WritePipe, Buffer, Length, (::DWORD*)&BytesWritten, nullptr);
       cout << __FUNCTION__ << " Bytes written to pipe " << BytesWritten << endl;
       //bIsWritten = WriteFile(WritePipe, &newline, 1, (::DWORD*)&BytesWritten, nullptr); // Do we need this? Actually this should destroy the image.

       FlushFileBuffers(WritePipe); // Do we need this?

       return bIsWritten;
    }

    #define PIXEL 80 // must be multiple of 8. Otherwise we get warning: Bytes are not aligned

    int main()
    {
       HANDLE PipeWriteEnd = nullptr;
       HANDLE PipeReadEnd = nullptr;
       {
           // create us a pipe for inter process communication
           SECURITY_ATTRIBUTES Attr = { sizeof(SECURITY_ATTRIBUTES), NULL, true };
           if (!CreatePipe(&PipeReadEnd, &PipeWriteEnd, &Attr, 0))
           {
               cout << "Could not create pipes" << ::GetLastError() << endl;
               system("Pause");
               return 0;
           }
       }

       // Setup the variables needed for CreateProcess
       // initialize process attributes
       SECURITY_ATTRIBUTES Attr;
       Attr.nLength = sizeof(SECURITY_ATTRIBUTES);
       Attr.lpSecurityDescriptor = NULL;
       Attr.bInheritHandle = true;

       // initialize process creation flags
       UINT32 CreateFlags = NORMAL_PRIORITY_CLASS;
       CreateFlags |= CREATE_NEW_CONSOLE;

       // initialize window flags
       UINT32 dwFlags = 0;
       UINT16 ShowWindowFlags = SW_HIDE;

       if (PipeWriteEnd != nullptr || PipeReadEnd != nullptr)
       {
           dwFlags |= STARTF_USESTDHANDLES;
       }

       // initialize startup info
       STARTUPINFOA StartupInfo = {
           sizeof(STARTUPINFO),
           NULL, NULL, NULL,
           (::DWORD)CW_USEDEFAULT,
           (::DWORD)CW_USEDEFAULT,
           (::DWORD)CW_USEDEFAULT,
           (::DWORD)CW_USEDEFAULT,
           (::DWORD)0, (::DWORD)0, (::DWORD)0,
           (::DWORD)dwFlags,
           ShowWindowFlags,
           0, NULL,
           HANDLE(PipeReadEnd),
           HANDLE(nullptr),
           HANDLE(nullptr)
       };

       char ffmpegURL[] = "\"PATHTOFFMPEGEXE\" -y -loglevel verbose -f rawvideo -vcodec rawvideo -framerate 1 -video_size 80x80 -pixel_format rgb24 -i - -vcodec mjpeg -framerate 1/4 -an \"OUTPUTDIRECTORY\""; // writable buffer: CreateProcessA may modify lpCommandLine, so a string literal is not safe here

       // Finally create the process
       PROCESS_INFORMATION ProcInfo;
       if (!CreateProcessA(NULL, ffmpegURL, &Attr, &Attr, true, (::DWORD)CreateFlags, NULL, NULL, &StartupInfo, &ProcInfo))
       {
           cout << "CreateProcess failed " << ::GetLastError() << endl;
       }
       //CloseHandle(ProcInfo.hThread);


           // Create images and write to pipe
    #define MYARRAYSIZE (PIXEL*PIXEL*3) // each pixel has 3 bytes

       UINT8* Bitmap = new UINT8[MYARRAYSIZE];

       for (INT32 outerLoopIndex = 9; outerLoopIndex >= 0; --outerLoopIndex)   // frame loop
       {
           for (INT32 innerLoopIndex = MYARRAYSIZE - 1; innerLoopIndex >= 0; --innerLoopIndex) // create the pixels for each frame
           {
               Bitmap[innerLoopIndex] = (UINT8)(outerLoopIndex * 20); // some gray color
           }
           system("pause");
           if (!WritePipe(PipeWriteEnd, Bitmap, MYARRAYSIZE))
           {
               cout << "Failed writing to pipe" << endl;
           }
       }

       // Done sending images. Tell the other process. IS THIS NEEDED? HOW TO TELL FFmpeg WE ARE DONE?
       //UINT8 endOfFile = 0xFF; // EOF = -1 == 1111 1111 for uint8
       //if (!WritePipe(PipeWriteEnd, &endOfFile, 1))
       //{
       //  cout << "Failed writing to pipe" << endl;
       //}
       //FlushFileBuffers(PipeReadEnd); // Do we need this?

       delete[] Bitmap; // allocated with new[], so delete[] is required


       system("pause");

       // clean stuff up
       FlushFileBuffers(PipeWriteEnd); // Do we need this?

       if (PipeWriteEnd != NULL && PipeWriteEnd != INVALID_HANDLE_VALUE)
       {
           CloseHandle(PipeWriteEnd);
       }

       // We do not want to destroy the read end of the pipe? Should not as that belongs to FFmpeg
       //if (PipeReadEnd != NULL &amp;&amp; PipeReadEnd != INVALID_HANDLE_VALUE)
       //{
       //  ::CloseHandle(PipeReadEnd);
       //}


       return 0;

    }

    And here is the output of FFmpeg:

    ffmpeg version 3.4.1 Copyright (c) 2000-2017 the FFmpeg developers
     built with gcc 7.2.0 (GCC)
     configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
     libavutil      55. 78.100 / 55. 78.100
     libavcodec     57.107.100 / 57.107.100
     libavformat    57. 83.100 / 57. 83.100
     libavdevice    57. 10.100 / 57. 10.100
     libavfilter     6.107.100 /  6.107.100
     libswscale      4.  8.100 /  4.  8.100
     libswresample   2.  9.100 /  2.  9.100
     libpostproc    54.  7.100 / 54.  7.100
    [rawvideo @ 00000221ff992120] max_analyze_duration 5000000 reached at 5000000 microseconds st:0
    Input #0, rawvideo, from 'pipe:':
     Duration: N/A, start: 0.000000, bitrate: 153 kb/s
       Stream #0:0: Video: rawvideo, 1 reference frame (RGB[24] / 0x18424752), rgb24, 80x80, 153 kb/s, 1 fps, 1 tbr, 1 tbn, 1 tbc
    Stream mapping:
     Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
    [graph 0 input from stream 0:0 @ 00000221ff999c20] w:80 h:80 pixfmt:rgb24 tb:1/1 fr:1/1 sar:0/1 sws_param:flags=2
    [auto_scaler_0 @ 00000221ffa071a0] w:iw h:ih flags:'bicubic' interl:0
    [format @ 00000221ffa04e20] auto-inserting filter 'auto_scaler_0' between the filter 'Parsed_null_0' and the filter 'format'
    [swscaler @ 00000221ffa0a780] deprecated pixel format used, make sure you did set range correctly
    [auto_scaler_0 @ 00000221ffa071a0] w:80 h:80 fmt:rgb24 sar:0/1 -> w:80 h:80 fmt:yuvj444p sar:0/1 flags:0x4
    Output #0, mp4, to 'c:/users/vr3/Documents/Guenni/sometest.mp4':
     Metadata:
       encoder         : Lavf57.83.100
       Stream #0:0: Video: mjpeg, 1 reference frame (mp4v / 0x7634706D), yuvj444p(pc), 80x80, q=2-31, 200 kb/s, 1 fps, 16384 tbn, 1 tbc
       Metadata:
         encoder         : Lavc57.107.100 mjpeg
       Side data:
         cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
    frame=   10 fps=6.3 q=1.6 size=       0kB time=00:00:09.00 bitrate=   0.0kbits/s speed=5.63x

    As you can see in the last line of the FFmpeg output, the images got through; 10 frames are available. But after closing the pipe, FFmpeg does not close; it is still expecting input.

    As the linked examples show, this should be a valid method.

    Trying for a week now...