
Other articles (35)

  • Contributing to the documentation

    10 April 2011

    Documentation is one of the most important and most demanding tasks when building a technical tool.
    Any outside contribution on this subject is essential: critiquing what already exists; helping to write articles aimed at users (MediaSPIP administrators or simply content producers) or at developers; creating explanatory screencasts; translating the documentation into a new language;
    To do so, you can register on (...)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

  • Libraries and binaries specific to video and audio processing

    31 January 2010

    The following software and libraries are used by SPIPmotion in one way or another.
    Required binaries: FFMpeg: the main encoder; it can transcode almost any type of video or audio file into formats playable on the web. See this tutorial for its installation; Oggz-tools: tools for inspecting ogg files; Mediainfo: retrieves information from most video and audio formats;
    Additional, optional binaries: flvtool2: (...)
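    As an illustration of the kind of processing these binaries handle, a minimal sketch (file names and quality settings are assumptions, not taken from the article):

     # Transcode a source video to a web-playable Ogg Theora/Vorbis file with FFMpeg
     ffmpeg -i source.avi -c:v libtheora -q:v 6 -c:a libvorbis -q:a 4 output.ogv

     # Inspect the result with Oggz-tools and Mediainfo
     oggz-info output.ogv
     mediainfo output.ogv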

On other sites (7282)

  • How to concatenate two videos w. ffmpeg — documented code not working

    2 March 2014, by Jim Miller

    I'm trying to concatenate two videos with ffmpeg. Nothing fancy; I just want one video that consists of video A immediately followed by video B.

    I've tried the code from How to concatenate (join, merge) media files on a freshly built and otherwise-working-fine install of ffmpeg 1.2.1 on Fedora 17, but the following error message appears:

    $ ffmpeg -i video_a.mov -i video_b.mov -filter_complex '[0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a]' -map '[v]' -map '[a]' output.mp4

    ffmpeg version N-54271-g7f866c1 Copyright (c) 2000-2013 the FFmpeg developers
     built on Jun 29 2013 11:05:42 with gcc 4.7.2 (GCC) 20120921 (Red Hat 4.7.2-2)
     configuration: --enable-gpl --enable-nonfree --enable-pthreads --enable-libx264 --enable-libfaac --extra-cflags=-I/usr/local/include --extra-ldflags=-L/usr/local/lib
     libavutil      52. 37.101 / 52. 37.101
     libavcodec     55. 17.100 / 55. 17.100
     libavformat    55. 10.100 / 55. 10.100
     libavdevice    55.  2.100 / 55.  2.100
     libavfilter     3. 77.101 /  3. 77.101
     libswscale      2.  3.100 /  2.  3.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  3.100 / 52.  3.100

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'v1221-MTQxMzcyNTIxODU2.mov':
     Metadata:
       major_brand     : qt  
       minor_version   : 0
       compatible_brands: qt  
       creation_time   : 2013-03-28 20:34:59
       encoder         : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
       encoder-eng     : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
     Duration: 00:00:05.34, start: 0.000000, bitrate: 15837 kb/s
       Stream #0:0(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 221 kb/s
       Metadata:
         creation_time   : 2013-03-28 20:34:59
         handler_name    : Core Media Data Handler
       Stream #0:1(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 15512 kb/s, 29.81 fps, 30 tbr, 600 tbn, 1200 tbc
       Metadata:
         creation_time   : 2013-03-28 20:34:59
         handler_name    : Core Media Data Handler

    Input #1, mov,mp4,m4a,3gp,3g2,mj2, from 'v1224-MTQxMzcyNTIxODg5.mov':
     Metadata:
       major_brand     : qt  
       minor_version   : 0
       compatible_brands: qt  
       creation_time   : 2013-03-28 20:36:28
       encoder         : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
       encoder-eng     : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
     Duration: 00:00:04.13, start: 0.000000, bitrate: 15689 kb/s
       Stream #1:0(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 221 kb/s
       Metadata:
         creation_time   : 2013-03-28 20:36:28
         handler_name    : Core Media Data Handler
       Stream #1:1(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 15446 kb/s, 29.79 fps, 30 tbr, 600 tbn, 1200 tbc
       Metadata:
         creation_time   : 2013-03-28 20:36:28
         handler_name    : Core Media Data Handler

    Stream specifier ':0' in filtergraph description [0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a] matches no streams.

    A few other things to note:

    • The two videos I'm working with were shot with the same camera, so there shouldn't be any problems with aspect ratio or other gory video details.
    • I'm able to do other things with my ffmpeg installation, like converting one of those videos from .mov to .mp4 (yes, I had to recompile with faac...), which seems to vouch for both the ffmpeg install and the videos.
    • I've tried modifying the above invocation to produce a .mov file instead, but I get the same error as before.
    • I've tried some quick hacks on the invocation above, like concatenating two copies of the same video, as well as other filter_complex invocations from around the web. Even on ones that were cited as working, I get the same "matches no streams" message.
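    Two directions that are often suggested for this class of error, sketched here as assumptions rather than a confirmed fix: spell out the stream-type specifiers in the filtergraph, or bypass filter_complex with the concat demuxer (the list file mylist.txt is illustrative):

     # Variant 1: same concat filter, with explicit video/audio stream specifiers
     ffmpeg -i video_a.mov -i video_b.mov \
         -filter_complex '[0:v:0][0:a:0][1:v:0][1:a:0] concat=n=2:v=1:a=1 [v][a]' \
         -map '[v]' -map '[a]' output.mp4

     # Variant 2: concat demuxer with stream copy (requires matching codecs and parameters)
     printf "file 'video_a.mov'\nfile 'video_b.mov'\n" > mylist.txt
     ffmpeg -f concat -i mylist.txt -c copy output.mp4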

  • Why review compositing work in MJPEG videos rather than (say) H.264?

    6 June 2016, by d3vid

    I have received a request to encode DPX files to MOV/MJPEG rather than MOV/H.264 (which ffmpeg picks by default if you convert to output.mov). This is to review compositing renders (in motion), so color accuracy is critical.

    Comparing a sample "ideal" MOV to the current (H.264) output, I can see:

    • resolution: the same
    • ColorSpace/Primaries: Rec601 (SD) versus Rec709 (HD)
    • YUV: 4:2:0 versus 4:4:4
    • filesize: smaller

    The ffmpeg default seems to be better quality and to result in a smaller filesize. Is there something I’m missing?
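    For reference, a minimal sketch of the two encodes being compared, assuming a DPX image sequence named frame_%04d.dpx (the input pattern and quality settings are illustrative, not from the question):

     # MJPEG in MOV, keeping 4:4:4 chroma (mjpeg supports yuvj444p) and tagging Rec.709
     ffmpeg -i frame_%04d.dpx -c:v mjpeg -q:v 2 -pix_fmt yuvj444p \
         -color_primaries bt709 -color_trc bt709 -colorspace bt709 review_mjpeg.mov

     # H.264 in MOV at 4:4:4 for comparison (High 4:4:4 Predictive profile)
     ffmpeg -i frame_%04d.dpx -c:v libx264 -crf 18 -pix_fmt yuv444p \
         -color_primaries bt709 -color_trc bt709 -colorspace bt709 review_h264.mov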

  • FFMPEG not enough data (x < y), trying to decode anyway

    7 June 2016, by Forest J. Handford

    I’m trying to make videos of Direct3D games using a C# app. For non-Direct3D games I stream images from Graphics.CopyFromScreen, which works. When I copy the screen from Direct3D and stream it to FFMPEG, I get:

    [bmp @ 00000276b0b9c280] not enough data (5070 < 129654), trying to
    decode anyway

    An MP4 file is created, but it is always 0 bytes.

    To get screenshots from Direct3D, I am using Justin Stenning’s Direct3DHook. This produces images MUCH bigger than when I get images from Graphics.CopyFromScreen (8 MB vs 136 KB). I’ve tried increasing the buffer (-bufsize) but the number on the left of the error is not impacted.

    I’ve tried resizing the image to 1/6th of the original. That reduces the number on the right, but does not eliminate it. Even when the number on the right is close to what I get from Graphics.CopyFromScreen, I still get an error. Here is a sample of the current code:

    using System;
    using System.Diagnostics;
    using System.Threading;
    using System.Drawing;
    using Capture.Hook;
    using Capture.Interface;
    using Capture;
    using System.IO;

    namespace GameRecord
    {
       public class Video
       {
           private const int VID_FRAME_FPS = 8;
           private const int SIZE_MODIFIER = 6;
           private const double FRAMES_PER_MS = VID_FRAME_FPS * 0.001;
           private const int SLEEP_INTERVAL = 2;
           private const int CONSTANT_RATE_FACTOR = 18; // Lower crf = Higher Quality https://trac.ffmpeg.org/wiki/Encode/H.264
           private Image image;
           private Capture captureScreen;
           private int processId = 0;
           private Process process;
           private CaptureProcess captureProcess;
           private Process launchingFFMPEG;
           private string arg;
           private int frame = 0;
           private Size? resize = null;


           /// <summary>
           /// Generates the Videos by gathering frames and processing via FFMPEG.
           /// </summary>
           public void RecordScreenTillGameEnd(string exe, OutputDirectory outputDirectory, CustomMessageBox alertBox, Thread workerThread)
           {
               AttachProcess(exe);
               RequestD3DScreenShot();
               while (image == null) ; // busy-wait until the first D3D screenshot callback has delivered an image
               Logger.log.Info("Launching FFMPEG ....");
               resize = new Size(image.Width / SIZE_MODIFIER, image.Height / SIZE_MODIFIER);
               // H.264 can let us do 8 FPS in high res . . . but must be licensed for commercial use.
               arg = "-f image2pipe -framerate " + VID_FRAME_FPS + " -i pipe:.bmp -pix_fmt yuv420p -crf " +
                   CONSTANT_RATE_FACTOR + " -preset ultrafast -s " + resize.Value.Width + "x" +
                   resize.Value.Height + " -vcodec libx264 -bufsize 30000k -y \"" +
                   outputDirectory.pathToVideo + "\"";

               launchingFFMPEG = new Process
               {
                   StartInfo = new ProcessStartInfo
                   {
                       FileName = "ffmpeg",
                       Arguments = arg,
                       UseShellExecute = false,
                       CreateNoWindow = true,
                       RedirectStandardInput = true,
                       RedirectStandardError = true
                   }
               };
               launchingFFMPEG.Start();

               Stopwatch stopWatch = Stopwatch.StartNew(); // creates and starts the Stopwatch instance

               do
               {
                   Thread.Sleep(SLEEP_INTERVAL);
               } while (workerThread.IsAlive);

               Logger.log.Info("Total frames: " + frame + " Expected frames: " + (ExpectedFrames(stopWatch.ElapsedMilliseconds) - 1));

               launchingFFMPEG.StandardInput.Close();

    #if DEBUG
               string line;
               while ((line = launchingFFMPEG.StandardError.ReadLine()) != null)
               {
                   Logger.log.Debug(line);
               }
    #endif
               launchingFFMPEG.Close();
               alertBox.Show();
           }

           void RequestD3DScreenShot()
           {
               captureProcess.CaptureInterface.BeginGetScreenshot(new Rectangle(0, 0, 0, 0), new TimeSpan(0, 0, 2), Callback, resize, (ImageFormat)Enum.Parse(typeof(ImageFormat), "Bitmap"));
           }

           private void AttachProcess(string exe)
           {
               Thread.Sleep(300);
               Process[] processes = Process.GetProcessesByName(Path.GetFileNameWithoutExtension(exe));
               foreach (Process currProcess in processes)
               {
                   // Simply attach to the first one found.

                   // If the process doesn't have a mainwindowhandle yet, skip it (we need to be able to get the hwnd to set foreground etc)
                   if (currProcess.MainWindowHandle == IntPtr.Zero)
                   {
                       continue;
                   }

                   // Skip if the process is already hooked (and we want to hook multiple applications)
                   if (HookManager.IsHooked(currProcess.Id))
                   {
                       continue;
                   }

                   Direct3DVersion direct3DVersion = Direct3DVersion.AutoDetect;

                   CaptureConfig cc = new CaptureConfig()
                   {
                       Direct3DVersion = direct3DVersion,
                       ShowOverlay = false
                   };

                   processId = currProcess.Id;
                   process = currProcess;

                   var captureInterface = new CaptureInterface();
                   captureInterface.RemoteMessage += new MessageReceivedEvent(CaptureInterface_RemoteMessage);
                   captureProcess = new CaptureProcess(process, cc, captureInterface);

                   break;
               }
               Thread.Sleep(10);

               if (captureProcess == null)
               {
                   ShowUser.Exception("No executable found matching: '" + exe + "'");
               }
           }

           /// <summary>
           /// The callback for when the screenshot has been taken
           /// </summary>
           void Callback(IAsyncResult result)
           {
           using (Screenshot screenshot = captureProcess.CaptureInterface.EndGetScreenshot(result))
           if (screenshot != null && screenshot.Data != null && arg != null)
           {
                   if (image != null)
                   {
                       image.Dispose();
                   }

                   image = screenshot.ToBitmap();
                   // image.Save("D3DImageTest.bmp");
                   image.Save(launchingFFMPEG.StandardInput.BaseStream, System.Drawing.Imaging.ImageFormat.Bmp);
                   launchingFFMPEG.StandardInput.Flush();
                   frame++;
               }

               if (frame < 5)
               {
                   Thread t = new Thread(new ThreadStart(RequestD3DScreenShot));
                   t.Start();
               }
               else
               {
                   Logger.log.Info("Done getting shots from D3D.");
               }
           }

           /// <summary>
           /// Display messages from the target process
           /// </summary>
           private void CaptureInterface_RemoteMessage(MessageReceivedEventArgs message)
           {
               Logger.log.Info(message);
           }
       }
    }

    When I search the internet for the error, all I get is the FFMPEG source code, which has not proven to be illuminating. I have been able to save the image directly to disk, which makes me feel like it is not an issue with disposing of the data. I have also tried grabbing only one frame, but that produces the same error, which suggests to me it is not a threading issue.
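    One piping variant that sidesteps BMP headers entirely, sketched here as an assumption rather than a verified fix for this setup, is to write raw BGRA pixel data to stdin and declare the frame geometry up front (1920x1080 matches the screenshots above; the scale filter reproduces the 1/6 downsize):

     ffmpeg -f rawvideo -pixel_format bgra -video_size 1920x1080 -framerate 8 -i pipe:0 \
         -vf scale=320:180 -c:v libx264 -preset ultrafast -crf 18 -pix_fmt yuv420p -y GameplayVidOut.mp4

    On the C# side this would mean writing the bitmap's raw pixel bytes to StandardInput.BaseStream instead of calling image.Save(..., ImageFormat.Bmp).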

    Here is the full sample of stderr:

    2016-06-02 18:29:38,046 === ffmpeg version N-79143-g8ff0f6a Copyright (c) 2000-2016 the FFmpeg developers

    2016-06-02 18:29:38,047 ===   built with gcc 5.3.0 (GCC)

    2016-06-02 18:29:38,048 ===   configuration: --enable-gpl
    --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib

    2016-06-02 18:29:38,062 ===   libavutil      55. 19.100 / 55. 19.100

    2016-06-02 18:29:38,063 ===   libavcodec     57. 30.100 / 57. 30.100

    2016-06-02 18:29:38,064 ===   libavformat    57. 29.101 / 57. 29.101

    2016-06-02 18:29:38,064 ===   libavdevice    57.  0.101 / 57.  0.101

    2016-06-02 18:29:38,065 ===   libavfilter     6. 40.102 /  6. 40.102

    2016-06-02 18:29:38,066 ===   libswscale      4.  0.100 /  4.  0.100

    2016-06-02 18:29:38,067 ===   libswresample   2.  0.101 /  2.  0.101

    2016-06-02 18:29:38,068 ===   libpostproc    54.  0.100 / 54.  0.100

    2016-06-02 18:29:38,068 === [bmp @ 000002cd7e5cc280] not enough data (13070 < 8294454), trying to decode anyway

    2016-06-02 18:29:38,069 === [bmp @ 000002cd7e5cc280] not enough data (13016 < 8294400)

    2016-06-02 18:29:38,069 === Input #0, image2pipe, from 'pipe:.bmp':

    2016-06-02 18:29:38,262 ===   Duration: N/A, bitrate: N/A

    2016-06-02 18:29:38,262 ===     Stream #0:0: Video: bmp, bgra, 1920x1080, 8 tbr, 8 tbn, 8 tbc

    2016-06-02 18:29:38,263 === [libx264 @ 000002cd7e5d59a0] VBV bufsize set but maxrate unspecified, ignored

    2016-06-02 18:29:38,264 === [libx264 @ 000002cd7e5d59a0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 AVX2 LZCNT BMI2

    2016-06-02 18:29:38,265 === [libx264 @ 000002cd7e5d59a0] profile Constrained Baseline, level 1.1

    2016-06-02 18:29:38,266 === [libx264 @ 000002cd7e5d59a0] 264 - core 148 r2665 a01e339 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=8 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=18.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0

    2016-06-02 18:29:38,463 === Output #0, mp4, to 'C:\Users\fores\AppData\Roaming\Affectiva\n_Artifacts_20160602_182857\GameplayVidOut.mp4':

    2016-06-02 18:29:38,464 ===   Metadata:

    2016-06-02 18:29:38,465 ===     encoder         : Lavf57.29.101

    2016-06-02 18:29:38,469 ===     Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 320x180, q=-1--1, 8 fps, 16384 tbn, 8 tbc

    2016-06-02 18:29:38,470 ===     Metadata:

    2016-06-02 18:29:38,472 ===       encoder         : Lavc57.30.100 libx264

    2016-06-02 18:29:38,474 ===     Side data:

    2016-06-02 18:29:38,475 ===       cpb: bitrate max/min/avg: 0/0/0 buffer size: 30000000 vbv_delay: -1

    2016-06-02 18:29:38,476 === Stream mapping:

    2016-06-02 18:29:38,477 ===   Stream #0:0 -> #0:0 (bmp (native) -> h264 (libx264))

    2016-06-02 18:29:38,480 === [bmp @ 000002cd7e5cc9a0] not enough data (13070 < 8294454), trying to decode anyway

    2016-06-02 18:29:38,662 === [bmp @ 000002cd7e5cc9a0] not enough data (13016 < 8294400)

    2016-06-02 18:29:38,662 === Error while decoding stream #0:0: Invalid data found when processing input

    2016-06-02 18:29:38,663 === frame=    0 fps=0.0 q=0.0 Lsize=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    

    2016-06-02 18:29:38,663 === video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

    2016-06-02 18:29:38,664 === Conversion failed!

    In memory, the current image is 320 pixels wide and 180 pixels tall. The pixel format is Format32bppRgb. The horizontal and vertical resolutions seem odd; they are both 96.01199. When saved to disk, here is the ffprobe output for the file:

    ffprobe version N-79143-g8ff0f6a Copyright (c) 2007-2016 the FFmpeg developers
     built with gcc 5.3.0 (GCC)
     configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
     libavutil      55. 19.100 / 55. 19.100
     libavcodec     57. 30.100 / 57. 30.100
     libavformat    57. 29.101 / 57. 29.101
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 40.102 /  6. 40.102
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, png_pipe, from 'C:\Users\fores\git\game-playtest-tool\GamePlayTest\bin\x64\Debug\D3DFromCapture.bmp':
     Duration: N/A, bitrate: N/A
       Stream #0:0: Video: png, rgba(pc), 1920x1080 [SAR 3779:3779 DAR 16:9], 25 tbr, 25 tbn, 25 tbc

    Here is a PNG version of an example screenshot from the current code (playing Portal 2):
    [Screenshot: Portal 2]

    Any ideas would be greatly appreciated. My current workaround is to save the files to the HDD and compile the video after gameplay, but it’s a far less performant option. Thank you!
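    For completeness, a minimal sketch of that disk-based workaround, assuming the frames are written out as numbered BMP files (the frame_%05d.bmp pattern is illustrative, not from the question):

     ffmpeg -framerate 8 -i frame_%05d.bmp -c:v libx264 -preset ultrafast -crf 18 -pix_fmt yuv420p -y GameplayVidOut.mp4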