
Media (91)

Other articles (35)

  • Support for all media types

    10 April 2011

    Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether of type: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded to Ogv and WebM (supported by HTML5) as well as MP4 (supported by both HTML5 and Flash).
    Audio files are encoded to Ogg (supported by HTML5) as well as MP3 (supported by both HTML5 and Flash).
    Where possible, text is analyzed in order to retrieve the data needed for search engine detection, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
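The conversion step described above can be sketched with plain ffmpeg commands. This is an illustrative sketch, not MediaSPIP's actual invocation: the input file names, codec choices and quality settings below are assumptions.

```shell
# Video: produce the web-friendly renditions named above
ffmpeg -i upload.avi -c:v libx264   -c:a aac        out.mp4   # MP4 (HTML5 and Flash)
ffmpeg -i upload.avi -c:v libtheora -c:a libvorbis  out.ogv   # Ogv (HTML5)
ffmpeg -i upload.avi -c:v libvpx    -c:a libvorbis  out.webm  # WebM (HTML5)

# Audio: MP3 and Ogg renditions
ffmpeg -i upload.wav -c:a libmp3lame -b:a 192k out.mp3
ffmpeg -i upload.wav -c:a libvorbis  -q:a 5    out.ogg
```

The originals are kept alongside these renditions, so the transcodes only serve in-browser playback.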

On other sites (4241)

  • How to translate BGR data to CVPixelBufferRef? Or how to play raw BGR data in iOS

    4 December 2016, by warlock

    I have some RGB data obtained using live555 & FFmpeg, but how can I play it as a video?

    I Googled a lot and found that AVPlayer might help. Other results pointed me to converting a UIImage to RGB... or the search returned no matches.

    This is one I found, but it is about "Create", not "Play":

    possible to create a video file from RGB frames using AV Foundation

    I tried to use AVSampleBufferDisplayLayer, but I don't know how to do the conversion.

    Can anyone help?

  • Revision 4c05a051ab : Move qcoeff, dqcoeff from BLOCKD to per-plane data Start grouping data per-plan

    2 April 2013, by John Koleszar

    Changed Paths: Modify /vp9/common/vp9_blockd.h Modify /vp9/common/vp9_invtrans.c Modify /vp9/common/vp9_mbpitch.c Modify /vp9/common/vp9_rtcd_defs.sh Modify /vp9/decoder/vp9_decodframe.c Modify /vp9/decoder/vp9_dequantize.c Modify /vp9/decoder/vp9_dequantize.h (...)

  • Real-time video streaming in C#

    16 June 2016, by Nuwan

    I'm developing an application for real-time streaming. The streaming has two parts:
    I use a capture card to capture a live source that needs to be streamed in real time,
    and I also need to stream a local video file.

    To stream a local video file in real time, I use Emgu CV to capture the video frames as bitmaps.
    I create a bitmap list and save the captured bitmaps to this list from one thread,
    and I also display those frames in a picture box. The bitmap list can store 1 second of video:
    if the frame rate is 30, it stores 30 video frames. After this list fills up, I start another
    thread to encode that 1-second chunk of video.

    For encoding I use an ffmpeg wrapper called NReco. I write the video frames to ffmpeg
    and start ffmpeg encoding. After stopping that task I can get the encoded data as a byte array.

    Then I send that data over the LAN using the UDP protocol.

    This works, but I cannot achieve smooth streaming. When I receive the stream in VLC there is a delay of some milliseconds between packets, and I also noticed that frames get lost.

    private Capture _capture = null;
    Image frame;

    // Capture handler: grab a frame and buffer it for encoding
    private void ProcessFrame(object sender, EventArgs arg)
    {
        frame = _capture.QueryFrame();
        frameBmp = frame.ToBitmap();

        twoSecondVideoBitmapFramesForEncode.Add(frameBmp);

        if (twoSecondVideoBitmapFramesForEncode.Count == (int)FrameRate)
        {
            isInitiate = false;
            thread = new Thread(new ThreadStart(encodeTwoSecondVideo));
            thread.IsBackground = true;
            thread.Start();
        }
    }

    public void encodeTwoSecondVideo()
    {
        // Copy the buffered frames so the capture thread can keep filling the list
        List<Bitmap> copyOfTwoSecondVideo = twoSecondVideoBitmapFramesForEncode.ToList();
        twoSecondVideoBitmapFramesForEncode.Clear();

        int g = (int)FrameRate * 2; // GOP length: one keyframe every two seconds

        // Create the ffmpeg task. These are the parameters I use for h264 encoding.
        string outPutFrameSize = frameWidth.ToString() + "x" + frameHeight.ToString();
        ms = new MemoryStream();

        // Create the video encoding task and set the main parameters for the encode
        ffMpegTask = ffmpegConverter.ConvertLiveMedia(
            Format.raw_video,
            ms,
            Format.h264,
            new ConvertSettings()
            {
                // Raw BGR24 frames: pixel format, size and rate must be declared explicitly
                CustomInputArgs = " -pix_fmt bgr24 -video_size " + frameWidth + "x" + frameHeight + " -framerate " + FrameRate + " ",
                CustomOutputArgs = " -threads 7 -preset ultrafast -profile:v baseline -level 3.0 -tune zerolatency -qp 0 -pix_fmt yuv420p -g " + g + " -keyint_min " + g + " -flags -global_header -sc_threshold 40 -qscale:v 1 -crf 25 -b:v 10000k -bufsize 20000k -s " + outPutFrameSize + " -r " + FrameRate + " -pass 1 -coder 1 -movflags frag_keyframe -movflags +faststart -c:a libfdk_aac -b:a 128k "
                //VideoFrameSize = FrameSize.hd1080,
                //VideoFrameRate = 30
            });

        ffMpegTask.Start();

        // Write the 2-second chunk of bitmaps to ffmpeg, one frame interval apart
        foreach (var item in copyOfTwoSecondVideo)
        {
            id++;
            Thread.Sleep((int)(1000.0 / FrameRate));

            // Lock the bitmap and copy its raw 24bpp pixel data into a byte buffer
            BitmapData bd = item.LockBits(new Rectangle(0, 0, item.Width, item.Height), ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
            byte[] buf = new byte[bd.Stride * item.Height];
            Marshal.Copy(bd.Scan0, buf, 0, buf.Length);
            ffMpegTask.Write(buf, 0, buf.Length);
            item.UnlockBits(bd);
        }
    }

    This is the process I use to achieve live streaming, but the stream is not smooth. I tried using a queue instead
    of a list to reduce the latency of filling the list, because I think the latency happens when the encoding thread
    encodes and sends the 2-second video very quickly: by the time it finishes, the bitmap list is not yet completely
    full, so the encoding thread has to stop until the next 2 seconds of video are ready.
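One way around that stall is to replace the per-chunk thread with a single long-lived encoder loop fed by a bounded producer/consumer queue, so the encoder is fed frame by frame instead of in 2-second bursts. A minimal sketch using .NET's `BlockingCollection<T>`; class and member names here are illustrative, not from the code above:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class FramePipeline
{
    // Bounded queue: the capture side blocks briefly if the encoder falls
    // behind, instead of the buffer growing (or the encoder starving).
    private readonly BlockingCollection<byte[]> frames =
        new BlockingCollection<byte[]>(boundedCapacity: 60);

    // Producer side: called once per captured frame with its raw BGR24 bytes.
    public void OnFrameCaptured(byte[] bgr) => frames.Add(bgr);

    // Consumer side: one long-lived task drains frames continuously.
    public Task RunEncoderLoop(Action<byte[]> writeToEncoder) =>
        Task.Run(() =>
        {
            foreach (var frame in frames.GetConsumingEnumerable())
                writeToEncoder(frame); // e.g. ffMpegTask.Write(frame, 0, frame.Length)
        });

    // Signal end of capture; the loop drains remaining frames and exits.
    public void Stop() => frames.CompleteAdding();
}
```

With this shape the encoder never waits for a chunk to fill, and pacing can be left to ffmpeg's `-framerate` input option rather than `Thread.Sleep`.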

    If anyone can help me figure this out, I would be very grateful. If the way I'm doing this is wrong, please correct me.
    Thank you!