Other articles (48)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If necessary, contact the administrator of your MediaSPIP to find out.

  • Requesting the creation of a channel

    12 March 2010

    Depending on how the platform is configured, the user may have two different ways of requesting the creation of a channel. The first is at the time of registration; the second, after registration, by filling in a request form.
    Both methods ask for the same information and work in roughly the same way: the prospective user must fill in a series of form fields that first of all give the administrators information about (...)

  • Mediabox: opening images in the largest space available to the user

    8 February 2011

    Image viewing is constrained by the width allowed by the site's design (which depends on the theme in use), so images are displayed at a reduced size. To take advantage of all the space available on the user's screen, a feature can be added to display the image in a multimedia box that appears on top of the rest of the content.
    To do this, the "Mediabox" plugin must be installed.
    Configuring the multimedia box
    As soon as (...)

On other sites (4280)

  • ffmpeg concatenate two videos, unexpectedly changes the first second(s) of 2nd video

    29 June 2019, by Roy

    I used ffmpeg to concatenate two videos of my gameplay recordings. I wrote a list.txt file which lists the two files:

    list.txt:
    file 2019~06~28_~_Game_1_~_Part_2.mp4
    file 2019~06~28_~_Game_1_~_Part_3.mp4

    I then ran ffmpeg to concat them:

    ffmpeg -safe 0 -f concat -i list.txt -c copy "output.mp4"

    However, the resulting video seems to skip frames (or rush through them) during the first second(s) of the second video, giving the impression that the motion is suddenly fast-forwarded.

    The two videos were recorded with the same recorder, GeForce Experience, in a single game session, so they should join smoothly when concatenated.
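    Since the glitch appears right at the splice point, a useful first check (a diagnostic sketch, not part of the original post; the file name is one of the entries in list.txt) is whether the two parts report matching start times and durations, for example with ffprobe:

    ffprobe -v error -show_entries format=start_time,duration -of default=noprint_wrappers=1 "2019~06~28_~_Game_1_~_Part_3.mp4"

    If the second file starts at a non-zero timestamp, stream-copy concatenation can produce exactly this kind of jump at the boundary.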

    Here is the output of ffmpeg:

    ffmpeg version 3.4.1 Copyright (c) 2000-2017 the FFmpeg developers
     built with gcc 7.2.0 (GCC)
     configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
     libavutil      55. 78.100 / 55. 78.100
     libavcodec     57.107.100 / 57.107.100
     libavformat    57. 83.100 / 57. 83.100
     libavdevice    57. 10.100 / 57. 10.100
     libavfilter     6.107.100 /  6.107.100
     libswscale      4.  8.100 /  4.  8.100
     libswresample   2.  9.100 /  2.  9.100
     libpostproc    54.  7.100 / 54.  7.100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 000001600bbdb5e0] Auto-inserting h264_mp4toannexb bitstream filter
    Input #0, concat, from 'list.txt':
     Duration: N/A, start: 0.000000, bitrate: 24674 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, smpte170m/smpte170m/bt470m), 1920x1080 [SAR 1:1 DAR 16:9], 24479 kb/s, 59.69 fps, 60 tbr, 90k tbn, 120 tbc
       Metadata:
         creation_time   : 2019-06-29T04:43:18.000000Z
         handler_name    : VideoHandle
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 195 kb/s
       Metadata:
         creation_time   : 2019-06-29T04:43:18.000000Z
         handler_name    : SoundHandle
    File 'output.mp4' already exists. Overwrite ? [y/N] y
    Output #0, mp4, to 'output.mp4':
     Metadata:
       encoder         : Lavf57.83.100
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, smpte170m/smpte170m/bt470m), 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 24479 kb/s, 59.69 fps, 60 tbr, 90k tbn, 90k tbc
       Metadata:
         creation_time   : 2019-06-29T04:43:18.000000Z
         handler_name    : VideoHandle
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 195 kb/s
       Metadata:
         creation_time   : 2019-06-29T04:43:18.000000Z
         handler_name    : SoundHandle
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
     Stream #0:1 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    [mov,mp4,m4a,3gp,3g2,mj2 @ 000001600bbdb5e0] Auto-inserting h264_mp4toannexb bitstream filter
    frame= 7405 fps=0.0 q=-1.0 Lsize=  221175kB time=00:02:03.63 bitrate=14655.4kbits/s speed= 157x
    video:218137kB audio:2862kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.079741%

    In particular, I don’t know what "Auto-inserting h264_mp4toannexb bitstream filter" means. Did this cause the unexpected change?
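    For context, the filter named in that message can also be applied explicitly: it repackages H.264 from the length-prefixed form used in MP4 (avcC) into the Annex B byte-stream form, for example when stream-copying into an MPEG-TS container (an illustrative command, not from the original post):

    ffmpeg -i "2019~06~28_~_Game_1_~_Part_2.mp4" -c copy -bsf:v h264_mp4toannexb -f mpegts part2.ts

    In the log above it indicates that ffmpeg inserted this conversion automatically while reading the MP4 parts; by itself the message is generally informational rather than an error.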

  • FFmpeg C++ decoding in a separate thread

    12 June 2019, by Brigapes

    I’m trying to decode a video with FFmpeg, convert it to an OpenGL texture, and display it inside a cocos2d-x engine. I’ve managed to do that and it displays the video as I wanted; the problem now is performance. I get a Sprite update every frame (the game is fixed at 60 fps, the video is 30 fps), so at first I decoded and converted frames alternately, which didn’t work great. Now I have a separate thread where I decode in an infinite while loop, with a sleep() just so it doesn’t hog the CPU/program.
    What I currently have set up is 2 PBO framebuffers and a bool flag to tell my FFmpeg thread loop to decode another frame, since I don’t know how to manually wait for when another frame should be decoded. I’ve searched online for a solution to this kind of problem but didn’t manage to get any answers.

    I’ve looked at this: Decoding video directly into a texture in separate thread, but it didn’t solve my problem, since it only covered converting YUV to RGB inside OpenGL shaders, which I haven’t done yet and which is currently not an issue.

    Additional info that might be useful: I don’t need to end the thread until the application exits, and I’m open to using any video format, including lossless.

    OK, so the main decoding loop looks like this:

    //.. this is inside of a constructor / init
    //adding thread to array in order to save the thread    
    global::global_pending_futures.push_back(std::async(std::launch::async, [=] {
           while (true) {
               if (isPlaying) {
                   this->decodeLoop();
               }
               else {
                   std::this_thread::sleep_for(std::chrono::milliseconds(3));
               }
           }
       }));

    The reason I use a bool to check whether the frame was used is that the main decoding function takes about 5 ms to finish in debug, and then it should wait about 11 ms for the frame to be displayed; so I can’t know when the frame was displayed, and I also don’t know how long decoding took.
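    One common way to get that "wait until a frame is actually needed" behaviour without polling is a condition variable: the render thread marks a buffer as consumed and notifies, and the decode thread sleeps until there is work. A minimal, self-contained sketch, independent of the pastebin code (decodeAndConvert is a hypothetical stand-in for the FFmpeg work):

    #include <atomic>
    #include <chrono>
    #include <condition_variable>
    #include <cstdio>
    #include <mutex>
    #include <thread>

    struct Slot {
        bool needsRefill = true;              // protected by mtx
    };

    std::mutex mtx;
    std::condition_variable cv;
    std::atomic<bool> quit{false};
    Slot buf1, buf2;

    // Decode thread: block until a buffer needs refilling (or we are told to quit).
    void decodeThread() {
        while (!quit) {
            Slot* target = nullptr;
            {
                std::unique_lock<std::mutex> lock(mtx);
                cv.wait(lock, [] { return quit || buf1.needsRefill || buf2.needsRefill; });
                if (quit) return;
                target = buf1.needsRefill ? &buf1 : &buf2;
                target->needsRefill = false;  // claim the buffer while still holding the lock
            }
            // decodeAndConvert(target);      // hypothetical: the expensive FFmpeg/sws work,
                                              // done outside the lock
            std::printf("refilled a buffer\n");
        }
    }

    // Called by the render thread right after the PBO upload of that buffer is done.
    void markConsumed(Slot& s) {
        {
            std::lock_guard<std::mutex> lock(mtx);
            s.needsRefill = true;
        }
        cv.notify_one();                      // wake the decoder only when there is work
    }

    int main() {
        std::thread t(decodeThread);
        markConsumed(buf1);                   // simulate the renderer consuming two frames
        markConsumed(buf2);
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
        {
            std::lock_guard<std::mutex> lock(mtx);
            quit = true;                      // set under the lock so the waiter cannot miss it
        }
        cv.notify_one();
        t.join();
    }

    With this shape the decoder never spins, and the needsRefill flags are only ever touched under the mutex, which also avoids the unsynchronized sharing of needsRefill between the decode thread and actualDraw() in the code below.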

    Decode function:

    void video::decodeLoop() { //this should loop in a separate thread
       frameData* buff = nullptr;
       if (buf1.needsRefill) {
       /// buf1.bufferLock.lock();
           buff = &buf1;
           buf1.needsRefill = false;
           firstBuff = true;
       }
       else if (buf2.needsRefill) {
           ///buf2.bufferLock.lock();
           buff = &buf2;
           buf2.needsRefill = false;
           firstBuff = false;
       }

       if (buff == nullptr) {
           std::this_thread::sleep_for(std::chrono::milliseconds(1));
           return;//error? //wait?
       }

       //pack pixel buffer?

       if (getNextFrame(buff)) {
           getCurrentRBGConvertedFrame(buff);
       }
       else {
           loopedTimes++;
           if (loopedTimes >= repeatTimes) {
               stop();
           }
           else {
               restartVideoPlay(&buf1);//restart both
               restartVideoPlay(&buf2);
               if (getNextFrame(buff)) {
                   getCurrentRBGConvertedFrame(buff);
               }
           }
       }
    /// buff->bufferLock.unlock();

       return;
    }

    As you can tell, I first check whether the buffer was used, via the bool needsRefill, and only then decode another frame.

    frameData struct:

       struct frameData {
           frameData() {};
           ~frameData() {};

           AVFrame* frame;
           AVPacket* pkt;
           unsigned char* pdata;
           bool needsRefill = true;
           std::string name = "";

           std::mutex bufferLock;

           ///unsigned int crrFrame
           GLuint pboid = 0;
       };

    And this is called every frame:

    void video::actualDraw() { //meant for cocos implementation
       if (this->isVisible()) {
           if (this->getOpacity() > 0) {
               if (isPlaying) {
                   if (loopedTimes >= repeatTimes) { //ignore -1 because comparing unsigned to signed
                       this->stop();
                   }
               }

               if (isPlaying) {
                   this->setVisible(true);

                   if (!display) { //skip frame
                       ///this->getNextFrame();
                       display = true;
                   }
                   else if (display) {
                       display = false;
                       auto buff = this->getData();                    
                       width = this->getWidth();
                       height = this->getHeight();
                       if (buff) {
                           if (buff->pdata) {

                               glBindBuffer(GL_PIXEL_UNPACK_BUFFER, buff->pboid);
                               glBufferData(GL_PIXEL_UNPACK_BUFFER, 3 * (width*height), buff->pdata, GL_DYNAMIC_DRAW);


                               glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, 0); ///buff->pdata);
                               glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
                           }

                           buff->needsRefill = true;
                       }
                   }
               }
               else { this->setVisible(false); }
           }
       }
    }

    The getData function, which tells the caller which framebuffer to use:

    video::frameData* video::getData() {
       if (firstBuff) {
           if (buf1.needsRefill == false) {
               ///firstBuff = false;
               return &buf1;///.pdata;
           }
       }
       else { //if false
           if (buf2.needsRefill == false) {
               ///firstBuff = true;
               return &buf2;///.pdata;
           }
       }
       return nullptr;
    }

    I’m not sure what else to include; I pasted the whole code to pastebin.
    video.cpp: https://pastebin.com/cWGT6APn
    video.h: https://pastebin.com/DswAXwXV

    To summarize the problem:

    How do I properly implement decoding in a separate thread / how do I optimize the current code?

    Currently the video lags when some other thread or the main thread gets heavy, because it then does not decode fast enough.

  • Kernel32 not found when using FFmpeg.Autogen 4.1.0.2 in Mono/Linux

    5 December 2024, by Robert Russell

    I'm submitting a bug report; while I was posting this, I didn't realize I could see into FFmpeg.AutoGen from the stack trace. Anyway, I posted a bug report on GitHub:

    https://github.com/Ruslan-B/FFmpeg.AutoGen/issues/109

    I'm trying to run my code on Linux; it uses FFmpeg.AutoGen to interface with the ffmpeg libraries. I am getting a "kernel32 dll not found" error and cannot figure out why. The author asks that issues not be posted to GitHub for troubleshooting.
    Possible related issue: https://github.com/Ruslan-B/FFmpeg.AutoGen/issues/89

    The first thing I tried was to include the binary helper class from the example code; I tweaked it a little bit and added the exact path to the Linux files.
    The second thing I did was add FFmpeg.AutoGen.dll.config; if it is configured right and the code tries to reference a Windows DLL, it should point to the Linux one instead.
    Stack trace:

    System.DllNotFoundException: kernel32
  at at (wrapper managed-to-native) FFmpeg.AutoGen.Native.WindowsNativeMethods.GetProcAddress(intptr,string)
  at FFmpeg.AutoGen.Native.FunctionLoader.GetFunctionPointer (System.IntPtr nativeLibraryHandle, System.String functionName) [0x00000] in D:\FFmpeg.AutoGen\FFmpeg.AutoGen\Native\FunctionLoader.cs:55
  at FFmpeg.AutoGen.Native.FunctionLoader.GetFunctionDelegate[T] (System.IntPtr nativeLibraryHandle, System.String functionName, System.Boolean throwOnError) [0x00000] in D:\FFmpeg.AutoGen\FFmpeg.AutoGen\Native\FunctionLoader.cs:28
  at FFmpeg.AutoGen.ffmpeg.GetFunctionDelegate[T] (System.IntPtr libraryHandle, System.String functionName) [0x00000] in D:\FFmpeg.AutoGen\FFmpeg.AutoGen\FFmpeg.cs:50
  at FFmpeg.AutoGen.ffmpeg+<>c.<.cctor>b__4_318 () [0x00000] in D:\FFmpeg.AutoGen\FFmpeg.AutoGen\FFmpeg.functions.export.g.cs:7163
  at FFmpeg.AutoGen.ffmpeg.avformat_alloc_context () [0x00000] in D:\FFmpeg.AutoGen\FFmpeg.AutoGen\FFmpeg.functions.export.g.cs:7176
  at FF8.FfccVaribleGroup..ctor () [0x0009c] in /home/robert/OpenVIII/FF8/FfccVaribleGroup.cs:53
  at FF8.Ffcc..ctor (System.String filename, FFmpeg.AutoGen.AVMediaType mediatype, FF8.Ffcc+FfccMode mode) [0x00008] in /home/robert/OpenVIII/FF8/Ffcc.cs:31
  at FF8.Module_movie_test.InitMovie () [0x00001] in /home/robert/OpenVIII/FF8/module_movie_test.cs:160
  at FF8.Module_movie_test.Update () [0x000c5] in /home/robert/OpenVIII/FF8/module_movie_test.cs:88
  at FF8.ModuleHandler.Update (Microsoft.Xna.Framework.GameTime gameTime) [0x000ac] in /home/robert/OpenVIII/FF8/ModuleHandler.cs:43
  at FF8.Game1.Update (Microsoft.Xna.Framework.GameTime gameTime) [0x00030] in /home/robert/OpenVIII/FF8/Game1.cs:69
  at Microsoft.Xna.Framework.Game.DoUpdate (Microsoft.Xna.Framework.GameTime gameTime) [0x00019] in <4fc8466c27384bb19c7b81b2a6a71083>:0
  at Microsoft.Xna.Framework.Game.Tick () [0x00103] in <4fc8466c27384bb19c7b81b2a6a71083>:0
  at Microsoft.Xna.Framework.SdlGamePlatform.RunLoop () [0x00021] in <4fc8466c27384bb19c7b81b2a6a71083>:0
  at Microsoft.Xna.Framework.Game.Run (Microsoft.Xna.Framework.GameRunBehavior runBehavior) [0x0008b] in <4fc8466c27384bb19c7b81b2a6a71083>:0
  at Microsoft.Xna.Framework.Game.Run () [0x0000c] in <4fc8466c27384bb19c7b81b2a6a71083>:0
  at FF8.Program.Main () [0x00007] in /home/robert/OpenVIII/FF8/Program.cs:17

    My code that triggers this:

    Format = ffmpeg.avformat_alloc_context();

    The binary helper should set the path correctly for the file:

    internal static void RegisterFFmpegBinaries()
        {
            var libraryPath = "";
            switch (Environment.OSVersion.Platform)
            {
                case PlatformID.Win32NT:
                case PlatformID.Win32S:
                case PlatformID.Win32Windows:
                    var current = Environment.CurrentDirectory;
                    var probe = Path.Combine(Environment.Is64BitProcess ? "x64" : "x86");
                    while (current != null)
                    {
                        var ffmpegDirectory = Path.Combine(current, probe);
                        if (Directory.Exists(ffmpegDirectory))
                        {
                            Console.WriteLine($"FFmpeg binaries found in: {ffmpegDirectory}");
                            RegisterLibrariesSearchPath(ffmpegDirectory);
                            return;
                        }
                        current = Directory.GetParent(current)?.FullName;
                    }
                    break;
                case PlatformID.Unix:
                    libraryPath = "/usr/lib/x86_64-linux-gnu";
                    RegisterLibrariesSearchPath(libraryPath);
                    break;
                case PlatformID.MacOSX:
                    libraryPath = Environment.GetEnvironmentVariable(LD_LIBRARY_PATH);
                    RegisterLibrariesSearchPath(libraryPath);
                    break;
            }
        }

    The FFmpeg.AutoGen.dll.config:

    <configuration>
      <dllmap os="linux" dll="avutil-56.dll" target="/usr/lib/x86_64-linux-gnu/libavutil.so.56"></dllmap>
      <dllmap os="linux" dll="avcodec-58.dll" target="/usr/lib/x86_64-linux-gnu/libavcodec.so.58"></dllmap>
      <dllmap os="linux" dll="avformat-58.dll" target="/usr/lib/x86_64-linux-gnu/libavformat.so.58"></dllmap>
      <dllmap os="linux" dll="avdevice-58.dll" target="/usr/lib/x86_64-linux-gnu/libavdevice.so.58"></dllmap>
      <dllmap os="linux" dll="avfilter-7.dll" target="/usr/lib/x86_64-linux-gnu/libavfilter.so.7"></dllmap>
      <dllmap os="linux" dll="avresample-4.dll" target="/usr/lib/x86_64-linux-gnu/libavresample.so.4"></dllmap>
      <dllmap os="linux" dll="swscale-5.dll" target="/usr/lib/x86_64-linux-gnu/libswscale.so.5"></dllmap>
      <dllmap os="linux" dll="swresample-3.dll" target="/usr/lib/x86_64-linux-gnu/libswresample.so.3"></dllmap>
      <dllmap os="linux" dll="postproc-55.dll" target="/usr/lib/x86_64-linux-gnu/libpostproc.so.55"></dllmap>
    </configuration>
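    To see whether Mono actually picks up these dllmap entries (and which native library each DllImport resolves to), the loader can be traced with Mono's logging environment variables; for example (the executable name is hypothetical):

    MONO_LOG_LEVEL=debug MONO_LOG_MASK=dll mono OpenVIII.exe 2>&1 | grep -i dllimport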
