
Other articles (45)

  • Updating from version 0.1 to 0.2

    24 June 2013

    Explanation of the notable changes made when moving from MediaSPIP version 0.1 to version 0.3. What's new?
    Software dependencies: the latest versions of FFMpeg (>= v1.2.1) are now used; the dependencies for Smush are installed; MediaInfo and FFprobe are installed to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Customising by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present the changes in your MediaSPIP, or news about your projects, through the news section of your MediaSPIP.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorial items.
    You can customise the form used to create a news item.
    News item creation form: for a document of the "news item" type, the default fields are: publication date (customise the publication date) (...)

On other sites (6230)

  • real time video streaming in C#

    16 June 2016, by Nuwan

    I'm developing an application for real-time streaming. The streaming has two parts:
    I use a capture card to capture a live source and need to stream it in real time,
    and I also need to stream a local video file.

    To stream a local video file in real time, I use Emgu CV to capture the video frames as bitmaps.
    I create a list of bitmaps and save the captured bitmaps to that list from one thread,
    and I also display those frames in a picture box. The bitmap list can hold 1 second of video: if the
    frame rate is 30, it stores 30 video frames. Once the list is full, I start another thread to encode
    that 1-second chunk of video.

    For encoding I use an FFmpeg wrapper called NReco. I write the video frames to FFmpeg
    and start FFmpeg to encode. After stopping that task I can get the encoded data as a byte array.

    Then I send that data over the LAN using the UDP protocol.

    This works, but I cannot achieve smooth streaming. When I receive the stream in VLC there is a delay of a few milliseconds between packets, and I also noticed dropped frames.

    private Capture _capture = null;
    Image frame;

    // Capture handler: grab a frame, convert it to a bitmap and add it to the list.
    // Once one second of frames has been collected, start the encoding thread.
    private void ProcessFrame(object sender, EventArgs arg)
    {
        frame = _capture.QueryFrame();
        frameBmp = frame.ToBitmap();

        twoSecondVideoBitmapFramesForEncode.Add(frameBmp);

        if (twoSecondVideoBitmapFramesForEncode.Count == (int)FrameRate)
        {
            isInitiate = false;
            thread = new Thread(new ThreadStart(encodeTwoSecondVideo));
            thread.IsBackground = true;
            thread.Start();
        }
    }

    public void encodeTwoSecondVideo()
    {
        // Copy the collected frames and clear the shared list for the next chunk.
        List<Bitmap> copyOfTwoSecondVideo = twoSecondVideoBitmapFramesForEncode.ToList();
        twoSecondVideoBitmapFramesForEncode.Clear();

        int g = (int)FrameRate * 2;

        // Create the FFmpeg task. These are the parameters I use for H.264 encoding.
        string outPutFrameSize = frameWidth.ToString() + "x" + frameHeight.ToString();
        ms = new MemoryStream();

        ffMpegTask = ffmpegConverter.ConvertLiveMedia(
            Format.raw_video,
            ms,
            Format.h264,
            new ConvertSettings()
            {
                // Windows bitmaps come in as bgr24
                CustomInputArgs = " -pix_fmt bgr24 -video_size " + frameWidth + "x" + frameHeight + " -framerate " + FrameRate + " ",
                CustomOutputArgs = " -threads 7 -preset ultrafast -profile:v baseline -level 3.0 -tune zerolatency -qp 0 -pix_fmt yuv420p -g " + g + " -keyint_min " + g + " -flags -global_header -sc_threshold 40 -qscale:v 1 -crf 25 -b:v 10000k -bufsize 20000k -s " + outPutFrameSize + " -r " + FrameRate + " -pass 1 -coder 1 -movflags frag_keyframe -movflags +faststart -c:a libfdk_aac -b:a 128k "
                //VideoFrameSize = FrameSize.hd1080,
                //VideoFrameRate = 30
            });

        ffMpegTask.Start();

        // Take the chunk of video bitmaps from the copied list and write them to FFmpeg.
        foreach (var item in copyOfTwoSecondVideo)
        {
            id++;
            byte[] buf = null;
            BitmapData bd = null;

            // Pace the writes at roughly one frame interval.
            Thread.Sleep((int)(1000.5 / FrameRate));

            bd = item.LockBits(new Rectangle(0, 0, item.Width, item.Height), ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
            buf = new byte[bd.Stride * item.Height];
            Marshal.Copy(bd.Scan0, buf, 0, buf.Length);
            ffMpegTask.Write(buf, 0, buf.Length);
            item.UnlockBits(bd);
        }
    }

    This is the process I use to achieve the live streaming, but the stream is not smooth. I tried using a queue instead
    of a list to reduce the latency of filling the list, because I suspected the latency comes from the encoding thread
    encoding and sending each 2-second chunk very quickly: when it finishes, the bitmap list is not yet
    completely full, so the encoding thread has to stop until the next 2-second chunk of video is ready.
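
    To make that decoupling idea concrete, here is a minimal sketch (not from the original post, written in Python only for brevity; the same structure applies in C# with a blocking queue and a single long-lived ConvertLiveMedia task): one FFmpeg process is started once and fed raw frames continuously, so the encoder never stops between chunks. The frame size, frame rate and UDP target below are placeholders.

    # Sketch only: one long-lived ffmpeg process fed raw BGR frames continuously.
    # Assumes ffmpeg is on the PATH; width/height/fps and the UDP address are placeholders.
    import subprocess

    width, height, fps = 1280, 720, 30

    ffmpeg = subprocess.Popen(
        ["ffmpeg",
         "-f", "rawvideo", "-pix_fmt", "bgr24",
         "-video_size", "%dx%d" % (width, height), "-framerate", str(fps),
         "-i", "-",                                        # raw frames arrive on stdin
         "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
         "-f", "mpegts", "udp://192.168.1.10:5000"],       # placeholder receiver address
        stdin=subprocess.PIPE)

    def send_frame(bgr_bytes):
        # Called once per captured frame; no per-chunk restart and no Sleep pacing needed.
        ffmpeg.stdin.write(bgr_bytes)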

    If anyone can help me figure this out, I would be very grateful. If the way I'm doing this is wrong, please correct me.
    Thank you!

  • FFMPEG or FFPLAY, catch FFT signal in real time as floats

    25 April 2021, by NVRM

    Looking to extract, in real time, an FFT snapshot of waveform data with ffplay, with a view to creating animations.

    This is exactly what I'm looking to capture, but this demo uses JavaScript in a browser. (Source: own post)

    const audio = document.getElementById('music');
    audio.load();
    audio.play();

    const ctx = new AudioContext();
    const audioSrc = ctx.createMediaElementSource(audio);
    const analyser = ctx.createAnalyser();

    audioSrc.connect(analyser);
    analyser.connect(ctx.destination);

    analyser.fftSize = 256;
    const bufferLength = analyser.frequencyBinCount;
    const frequencyData = new Uint8Array(bufferLength);

    setInterval(() => {
        analyser.getByteFrequencyData(frequencyData);
        console.log(frequencyData);
    }, 1000);

    <audio src="http://strm112.1.fm/reggae_mobile_mp3" crossorigin="use-URL-credentials" controls="true"></audio>

    I tried many variations around the method posted on https://trac.ffmpeg.org/wiki/Waveform .

    The problem is that the output format for the FFT is PCM (Pulse Code Modulation), and it is not real time.

    More generally, is there a simple way to retrieve this data while the sound is playing?

    ffplay -fft file.mp3 > fft.json
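
    As far as I know ffplay has no such -fft option, but one workable pattern (a minimal sketch, not from the original post) is to have ffmpeg decode the audio to raw PCM on stdout and compute the FFT yourself as the samples stream in, here with Python and numpy. It assumes ffmpeg is on the PATH and that file.mp3 is readable; -re makes ffmpeg deliver the samples at playback speed.

    # Sketch only: stream raw PCM out of ffmpeg and FFT it window by window.
    import subprocess
    import numpy as np

    RATE, WINDOW = 44100, 2048                        # mono sample rate, samples per FFT

    proc = subprocess.Popen(
        ["ffmpeg", "-re", "-i", "file.mp3",           # -re: read at native (playback) rate
         "-f", "s16le", "-ac", "1", "-ar", str(RATE), "-"],
        stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)

    while True:
        raw = proc.stdout.read(WINDOW * 2)            # 2 bytes per 16-bit sample
        if len(raw) < WINDOW * 2:
            break                                     # end of stream
        samples = np.frombuffer(raw, dtype=np.int16)
        spectrum = np.abs(np.fft.rfft(samples))       # magnitude per frequency bin
        print(spectrum[:16])                          # feed these values to the animation

    This only exposes the FFT data; playing the audio at the same time would still need a separate player.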

    The same thing in C: Apply FFT on pcm data and convert to a spectrogram

    FFMPEG waveform filter documentation


  • Extracting each individual frame from an H264 stream for real-time analysis with OpenCV

    5 May 2017, by exclmtnpt

    Problem Outline

    I have an h264 real-time video stream (I’ll call this "the stream") being captured in Process1. My goal is to extract each frame from the stream as it comes through and use Process2 to analyze it with OpenCV. (Process1 is nodejs, Process2 is Python)

    Things I’ve tried, and their failure modes:

    • Send the stream directly from Process1 to Process2 over a named FIFO pipe:

    I succeeded in directing the stream from Process1 into the pipe. However, in Process2 (which is Python) I could not (a) extract individual frames from the stream, and (b) convert any extracted data from h264 into an OpenCV format (e.g. JPEG, numpy array).

    I had hoped to use OpenCV’s VideoCapture() method, but it does not allow you to pass a FIFO pipe as an input. I was able to use VideoCapture by saving the h264 stream to a .h264 file, and then passing that as the file path. This doesn’t help me, because I need to do my analysis in real time (i.e. I can’t save the stream to a file before reading it in to OpenCV).

    • Pipe the stream from Process1 to FFMPEG, use FFMPEG to change the stream format from h264 to MJPEG, then pipe the output to Process2:

    I attempted this using the command:

    cat pipeFromProcess1.fifo | ffmpeg -i pipe:0 -f h264 -f mjpeg pipe:1 | cat > pipeToProcess2.fifo

    The biggest issue with this approach is that FFMPEG takes inputs from Process1 until Process1 is killed, and only then does Process2 begin to receive the data.

    Additionally, on the Process2 side, I still don’t understand how to extract individual frames from the data coming over the pipe. I open the pipe for reading (as "f") and then execute data = f.readline(). The size of data varies drastically (some reads have length on the order of 100, others length on the order of 1,000). When I use f.read() instead of f.readline(), the length is much larger, on the order of 100,000.

    Even if I knew I was getting the correct-size chunk of data, I would still not know how to transform it into an OpenCV-compatible array, because I don’t understand the format it’s coming over in. It’s a string, but when I print it out it looks like this:

    ��_M 0A0����tQ,\%��e���f/�H�#Y�p�f#�Kus�} F����ʳa�G������+$x�%V�� }[����Wo �1’̶A���c����*�&=Z^�o’��Ͽ� SX-ԁ涶V&H|��$
     ��<�E�� ��>�����u���7�����cR� �f�=�9 ��fs�q�ڄߧ�9v�]�Ӷ���& gr]�n�IRܜ�檯����

    � ����+ �I��w�}� ��9�o��� �w��M�m���IJ ��� �m�=�Soՙ}S �>j �,�ƙ�’���tad =i ��WY�FeC֓z �2�g� ;EXX��S��Ҁ*, ���w� _|�&�y��H��=��)� ���Ɗ3@ �h���Ѻ�Ɋ��ZzR`��)�y�� c�ڋ.��v� !u���� �S�I#�$9R�Ԯ0py z ��8 #��A�q�� �͕� ijc �bp=��۹ c SqH

    Converting from base64 doesn’t seem to help. I also tried:

    array = np.fromstring(data, dtype=np.uint8)

    which does convert to an array, but not one of a size that makes sense given the 640x368x3 dimensions of the frames I’m trying to decode. (A sketch of reading exact-size raw frames follows this list.)

    • Using decoders such as Broadway.js to convert the h264 stream:

    These seem to be focused on streaming to a website, and I did not have success trying to re-purpose them for my goal.
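
    To make the fixed-size-read idea concrete, here is a minimal sketch (not from the original question): if ffmpeg is asked to emit rawvideo instead of MJPEG, every decoded frame is exactly width x height x 3 bytes, so Process2 can read exact frame-sized chunks and reshape them for OpenCV. It assumes the 640x368 size mentioned above, the FIFO name from the command above, and ffmpeg on the PATH.

    # Sketch only: decode the incoming H.264 FIFO with ffmpeg and read one
    # fully decoded BGR frame at a time (640x368 as in the question).
    import subprocess
    import numpy as np
    import cv2

    WIDTH, HEIGHT = 640, 368
    FRAME_BYTES = WIDTH * HEIGHT * 3                  # bgr24: 3 bytes per pixel

    proc = subprocess.Popen(
        ["ffmpeg", "-f", "h264", "-i", "pipeFromProcess1.fifo",
         "-f", "rawvideo", "-pix_fmt", "bgr24", "-"], # raw frames to stdout
        stdout=subprocess.PIPE)

    while True:
        raw = proc.stdout.read(FRAME_BYTES)           # blocks until a whole frame arrives
        if len(raw) < FRAME_BYTES:
            break                                     # stream ended
        frame = np.frombuffer(raw, dtype=np.uint8).reshape((HEIGHT, WIDTH, 3))
        edges = cv2.Canny(frame, 100, 200)            # frame is a normal OpenCV image now

    Because rawvideo frames are written out as soon as each picture is decoded, this avoids guessing chunk boundaries with readline(); if the output still seems to lag, input options such as -probesize and -fflags nobuffer are worth experimenting with.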

    Clarification about what I’m NOT trying to do:

    I’ve found many related questions about streaming h264 video to a website. This is a solved problem, but none of the solutions help me extract individual frames and put them in an OpenCV-compatible format.

    Also, I need to use the extracted frames in real time on a continual basis. So saving each frame as a .jpg is not helpful.

    System Specs

    Raspberry Pi 3 running Raspbian Jessie

    Additional Detail

    I’ve tried to generalize the problem I’m having in my question. If it’s useful to know, Process1 is using the node-bebop package to pull down the h264 stream (using drone.getVideoStream()) from a Parrot Bebop 2.0. I tried using the other video stream available through node-bebop (getMjpegStream()). This worked, but was not nearly real-time; I was getting very intermittent data streams. I’ve entered that specific problem as an Issue in the node-bebop repository.

    Thanks for reading; I really appreciate any help anyone can give!