Advanced search

Media (0)

Keyword: - Tags -/signalement

No media matching your criteria is available on this site.

Other articles (44)

  • The SPIPmotion queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (7722)

  • (ffmpeg.autogen) How can I play RTSP stream audio with video

    1 October 2018, by newbie

    I am making an RTSP stream player using OpenCV.

    The example project is wonderful.

    But it has no sound.

    I added the code below and I think I found the audio stream index, but I don't know the next step.

       private readonly AVCodecContext* _aCodecContext;
       private readonly AVFormatContext* _aFormatContext;
       private readonly int _a_streamIndex;

       public VideoStreamDecoder(string url)
       {
           _vFormatContext = ffmpeg.avformat_alloc_context();

           var pFormatContext = _vFormatContext;
           ffmpeg.avformat_open_input(&pFormatContext, url, null, null).ThrowExceptionIfError();

           ffmpeg.avformat_find_stream_info(_vFormatContext, null).ThrowExceptionIfError();

           // find the first video stream
           AVStream* vStream = null;
           for (var i = 0; i < _vFormatContext->nb_streams; i++)
               if (_vFormatContext->streams[i]->codec->codec_type == AVMediaType.AVMEDIA_TYPE_VIDEO)
               {
                   vStream = _vFormatContext->streams[i];
                   break;
               }

           // find the first audio stream
           AVStream* aStream = null;
           for (var i = 0; i < _vFormatContext->nb_streams; i++)
               if (_vFormatContext->streams[i]->codec->codec_type == AVMediaType.AVMEDIA_TYPE_AUDIO)
               {
                   aStream = _vFormatContext->streams[i];
                   break;
               }

           if (vStream == null) throw new InvalidOperationException("Could not find video stream.");

           _v_streamIndex = vStream->index;
           _vCodecContext = vStream->codec;

           if (aStream == null) throw new InvalidOperationException("Could not find audio stream.");

           _a_streamIndex = aStream->index;
           _aCodecContext = aStream->codec;

           var vcodecId = _vCodecContext->codec_id;
           var vCodec = ffmpeg.avcodec_find_decoder(vcodecId);

           var acodecId = _aCodecContext->codec_id;
           var aCodec = ffmpeg.avcodec_find_decoder(acodecId);

           if (vCodec == null || aCodec == null) throw new InvalidOperationException("Unsupported codec.");

           ffmpeg.avcodec_open2(_vCodecContext, vCodec, null).ThrowExceptionIfError();
           ffmpeg.avcodec_open2(_aCodecContext, aCodec, null).ThrowExceptionIfError();

           vCodecName = ffmpeg.avcodec_get_name(vcodecId);
           aCodecName = ffmpeg.avcodec_get_name(acodecId);
           FrameSize = new Size(_vCodecContext->width, _vCodecContext->height);
           PixelFormat = _vCodecContext->pix_fmt;

           _pPacket = ffmpeg.av_packet_alloc();
           _pFrame = ffmpeg.av_frame_alloc();
       }
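
    A rough sketch of the missing next step follows, built from the same FFmpeg.AutoGen calls used in the constructor above: av_read_frame pulls packets from the shared format context, and each packet is routed to the video or audio decoder according to its stream_index. The method name TryDecodeNextFrame and the extra _pAudioFrame field are assumptions for illustration only, and actually rendering the decoded audio samples (resampling them and feeding an audio output library) is a further step that is not shown.

       // Sketch only (assumed names): demux one packet at a time and send it to the
       // decoder that owns its stream. Assumes the fields set up in the constructor
       // above plus an extra AVFrame* _pAudioFrame allocated with av_frame_alloc().
       public bool TryDecodeNextFrame(out bool isAudio)
       {
           isAudio = false;

           while (true)
           {
               ffmpeg.av_packet_unref(_pPacket);
               if (ffmpeg.av_read_frame(_vFormatContext, _pPacket) < 0)
                   return false; // end of stream or read error

               AVCodecContext* codecContext;
               AVFrame* frame;

               if (_pPacket->stream_index == _v_streamIndex)
               {
                   codecContext = _vCodecContext;
                   frame = _pFrame;
               }
               else if (_pPacket->stream_index == _a_streamIndex)
               {
                   codecContext = _aCodecContext;
                   frame = _pAudioFrame;
                   isAudio = true;
               }
               else
               {
                   continue; // packet belongs to a stream we are not decoding
               }

               ffmpeg.avcodec_send_packet(codecContext, _pPacket).ThrowExceptionIfError();

               ffmpeg.av_frame_unref(frame);
               var error = ffmpeg.avcodec_receive_frame(codecContext, frame);
               if (error == ffmpeg.AVERROR(ffmpeg.EAGAIN))
                   continue; // decoder needs more input before it can emit a frame
               error.ThrowExceptionIfError();

               // A decoded frame is now in _pFrame (video) or _pAudioFrame (audio).
               return true;
           }
       }
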
  • How do I properly enable ffmpeg for matplotlib.animation?

    9 November 2018, by spanishgum

    I have covered a lot of ground on Stack Overflow so far trying to get ffmpeg going so I can make a timelapse video.

    I am on a CentOS 7 machine, running python3.7.0a0.

    python3
    >>> import numpy as np
    >>> np.__version__
    '1.12.0'
    >>> import matplotlib as mpl
    >>> mpl.__version__
    '2.0.0'
    >>> import mpl_toolkits.basemap as base
    >>> base.__version__
    '1.0.7'

    I found this GitHub gist on installing ffmpeg. I used the Chromium source and installed without a prefix option (using the default).

    I have confirmed that ffmpeg is installed, although I don’t know anything about testing whether it works.

    which ffmpeg
    /usr/local/bin/ffmpeg

    ffmpeg -version
    ffmpeg version N-83533-gada281d Copyright (c) 2000-2017 the FFmpeg developers
    built with gcc 4.8.5 (GCC) 20150623 (Red Hat 4.8.5-11)
    configuration:
    libavutil      55. 47.100 / 55. 47.100
    libavcodec     57. 80.100 / 57. 80.100
    libavformat    57. 66.102 / 57. 66.102
    libavdevice    57.  2.100 / 57.  2.100
    libavfilter     6. 73.100 /  6. 73.100
    libswscale      4.  3.101 /  4.  3.101
    libswresample   2.  4.100 /  2.  4.100

    I tried to run a few sample examples I found online:

    [1] http://matplotlib.org/examples/animation/basic_example_writer.html

    [2] https://stackoverflow.com/a/23098090/3454650

    Everything works fine up until I try to save the animation file.

    [1]

    anim.save('basic_animation.mp4', writer = FFwriter, fps=30, extra_args=['-vcodec', 'libx264'])

    [2]

    im_ani.save('im.mp4', writer=writer)

    I found here that explicitly setting the path to ffmpeg might be necessary, so I added this to the top of the test scripts:

    plt.rcParams['animation.ffmpeg_path'] = '/usr/local/bin/ffmpeg'

    I tried a few more tweaks in the code but always get the same response, which I do not know how to begin deciphering:

    Traceback (most recent call last):
     File "testanim.py", line 27, in <module>
       writer.grab_frame()
     File "/usr/local/lib/python3.7/contextlib.py", line 100, in __exit__
       self.gen.throw(type, value, traceback)
     File "/usr/local/lib/python3.7/site-packages/matplotlib/animation.py", line 256, in saving
       self.finish()
     File "/usr/local/lib/python3.7/site-packages/matplotlib/animation.py", line 276, in finish
       self.cleanup()
     File "/usr/local/lib/python3.7/site-packages/matplotlib/animation.py", line 311, in cleanup
       out, err = self._proc.communicate()
     File "/usr/local/lib/python3.7/subprocess.py", line 836, in communicate
       stdout, stderr = self._communicate(input, endtime, timeout)
     File "/usr/local/lib/python3.7/subprocess.py", line 1474, in _communicate
       selector.register(self.stdout, selectors.EVENT_READ)
     File "/usr/local/lib/python3.7/selectors.py", line 351, in register
       key = super().register(fileobj, events, data)
     File "/usr/local/lib/python3.7/selectors.py", line 237, in register
       key = SelectorKey(fileobj, self._fileobj_lookup(fileobj), events, data)
     File "/usr/local/lib/python3.7/selectors.py", line 224, in _fileobj_lookup
       return _fileobj_to_fd(fileobj)
     File "/usr/local/lib/python3.7/selectors.py", line 39, in _fileobj_to_fd
       "{!r}".format(fileobj)) from None
    ValueError: Invalid file object: <_io.BufferedReader name=6>

    Is there something malformed in my configuration? I searched Google for this error for some time but never found anything relevant to animations or ffmpeg. Any help would be greatly appreciated.


    UPDATE:

    @LordNeckBeard pointed me here: https://trac.ffmpeg.org/wiki/CompilationGuide/Centos

    I ran into problems with installing the x264 encoding dependency. Some files in libavcodec/*.c (in the make output) were reporting undefined references to several functions. After a wild goose chase I found this: https://mailman.videolan.org/pipermail/x264-devel/2015-February/010971.html

    To fix the x264 installation, I simply added some configure flags:

    ./configure --enable-static --enable-shared --extra-ldflags="-lswresample -llzma"

    UPDATE:

    So everything installed fine after fixing the libx264 problems. I went ahead and copied the ffmpeg binary from the ffmpeg_build folder into /usr/local/bin/ffmpeg.

    After running the script I was getting problems where ffmpeg could not find the libx264 shared object. I think I will have to recompile everything using different prefixes. My intuition tells me there are old files lying around after I have messed with everything, using some configuration that is broken.

    So I decided maybe I should just try to use NUX: http://linoxide.com/linux-how-to/install-ffmpeg-centos-7/
    I installed ffmpeg using the new rpm, but to no avail. I still was not able to run ffmpeg because of a missing shared object.

    Finally, instead of using files copied into my /usr/local/bin folder, I ran ffmpeg directly from the build bin directory. It turns out that this does work properly!

    So in essence, if I want to install ffmpeg system wide, I need to manually compile from sources again but using a nonlocal prefix.

  • Video encoded by ffmpeg.exe giving me a garbage video in C# .NET

    10 September 2018, by Amit Yadav

    I want to record video from a webcam and save the file in .mp4 format. For recording I use AforgeNet.dll, and to get an mp4 I encode the raw video with ffmpeg.exe through a NamedPipeServerStream: as each frame arrives, I push it into the named pipe buffer. This process works fine (I have checked in the ProcessTheErrorData event), but when I stop recording, the output file plays back as garbage video, as shown in the image below.

    [screenshot of the corrupted output video]

    Here is the code for this:

       Process Process;
       NamedPipeServerStream _ffmpegIn;
       byte[] _videoBuffer;
       const string PipePrefix = @"\\.\pipe\";
       public void ffmpegWriter()
       {
           if(File.Exists("outputencoded.mp4"))
           {
               File.Delete("outputencoded.mp4");
           }

           _videoBuffer = new byte[width * height * 4];
           var audioPipeName = GetPipeName();
           var videoPipeName = GetPipeName();
           var videoInArgs = $@" -thread_queue_size 512 -use_wallclock_as_timestamps 1 -f rawvideo -pix_fmt rgb32 -video_size 640x480 -i \\.\pipe\{videoPipeName}";
           var videoOutArgs = $"-vcodec libx264 -crf 15 -pix_fmt yuv420p -preset ultrafast -r 10";
           _ffmpegIn = new NamedPipeServerStream(videoPipeName, PipeDirection.Out, 1, PipeTransmissionMode.Byte, PipeOptions.Asynchronous, 0, _videoBuffer.Length);
           Process=StartFFmpeg($"{videoInArgs} {videoOutArgs} \"{"outputencoded.mp4"}\"", "outputencoded.mp4");
       }

    bool WaitForConnection(NamedPipeServerStream ServerStream, int Timeout)
       {
           var asyncResult = ServerStream.BeginWaitForConnection(Ar => { }, null);

           if (asyncResult.AsyncWaitHandle.WaitOne(Timeout))
           {
               ServerStream.EndWaitForConnection(asyncResult);

               return ServerStream.IsConnected;
           }

           return false;
       }
     static string GetPipeName() => $"record-{Guid.NewGuid()}";

    public static Process StartFFmpeg(string Arguments, string OutputFileName)
           {
               var process = new Process
               {
                   StartInfo =
                   {
                       FileName = "ffmpeg.exe",
                       Arguments = Arguments,
                       UseShellExecute = false,
                       CreateNoWindow = true,
                       RedirectStandardError = true,
                       RedirectStandardInput = true,

                   },
                   EnableRaisingEvents = true
               };

               //  var logItem = ServiceProvider.Get<ffmpeglog>().CreateNew(Path.GetFileName(OutputFileName));

               process.ErrorDataReceived += (s, e) => ProcessTheErrorData(s,e);

               process.Start();

               process.BeginErrorReadLine();

               return process;
           }

    and I write each frame like this:

       private void video_NewFrame(object sender, NewFrameEventArgs eventArgs)
       {
           try
           {
               if (_recording)
               {
                   using (var bitmap = (Bitmap) eventArgs.Frame.Clone())
                   {
                       var _videoBuffers = ImageToByte(bitmap);

                       if (_firstFrameTime != null)
                       {
                           bitmap.Save("image/" + DateTime.Now.ToString("ddMMyyyyHHmmssfftt") + ".bmp");

                           _lastFrameTask?.Wait();
                           _lastFrameTask = _ffmpegIn.WriteAsync(_videoBuffers, 0, _videoBuffers.Length);
                       }
                       else
                       {
                           if (_firstFrame)
                           {
                               if (!WaitForConnection(_ffmpegIn, 5000))
                               {
                                   throw new Exception("Cannot connect Video pipe to FFmpeg");
                               }

                               _firstFrame = false;
                           }

                           _firstFrameTime = DateTime.Now;
                           _lastFrameTask?.Wait();
                           _lastFrameTask = _ffmpegIn.WriteAsync(_videoBuffers, 0, _videoBuffers.Length);
                       }
                   }
               }

               using (var bitmap = (Bitmap) eventArgs.Frame.Clone())
               {
                   var bi = bitmap.ToBitmapImage();
                   bi.Freeze();
                   Dispatcher.CurrentDispatcher.Invoke(() => Image = bi);
               }
           }
           catch (Exception exc)
           {
               MessageBox.Show("Error on _videoSource_NewFrame:\n" + exc.Message, "Error", MessageBoxButton.OK,
                   MessageBoxImage.Error);
               StopCamera();
           }
       }

    I have even written each frame to disk as a .bmp bitmap and the frames are correct, but I don't know what I am missing here. Please help; thanks in advance.
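
    Since the ImageToByte helper is not shown, one thing worth checking is what bytes it actually produces: with -f rawvideo -pix_fmt rgb32 -video_size 640x480, ffmpeg expects nothing but raw 32-bit pixel data for 640x480 frames on the pipe, so a helper that returns an encoded image (BMP or PNG bytes, headers included) would give exactly this kind of corrupted output. The following is a minimal sketch, not taken from the question, of converting a frame to raw pixels with System.Drawing's LockBits; the helper name ToRawRgb32 is hypothetical and would stand in for ImageToByte.

       // Sketch only: convert a frame to raw 32-bit pixels (no headers), which is
       // what "-f rawvideo -pix_fmt rgb32" expects on the input pipe.
       // Requires: using System.Drawing; using System.Drawing.Imaging;
       //           using System.Runtime.InteropServices;
       static byte[] ToRawRgb32(Bitmap bitmap)
       {
           var rect = new Rectangle(0, 0, bitmap.Width, bitmap.Height);
           var data = bitmap.LockBits(rect, ImageLockMode.ReadOnly, PixelFormat.Format32bppRgb);
           try
           {
               var bytesPerRow = bitmap.Width * 4; // 4 bytes per pixel
               var buffer = new byte[bytesPerRow * bitmap.Height];

               // Copy row by row so any stride padding added by GDI+ is not written to the pipe.
               for (var y = 0; y < bitmap.Height; y++)
                   Marshal.Copy(IntPtr.Add(data.Scan0, y * data.Stride), buffer, y * bytesPerRow, bytesPerRow);

               return buffer;
           }
           finally
           {
               bitmap.UnlockBits(data);
           }
       }

    If the captured frames are not exactly 640x480, the -video_size argument (or the bitmap itself) would also have to be adjusted to match, otherwise the rows read by ffmpeg will be misaligned in the same way.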