
Other articles (97)

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded as Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded as Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed to retrieve the data needed by search engines, then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
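    The conversions described above can be sketched with plain ffmpeg commands (illustrative only; MediaSPIP's actual encoder settings are its own, and the input filenames here are placeholders):

```shell
# Illustrative stand-ins for the conversions MediaSPIP performs automatically
ffmpeg -i upload.avi -c:v libx264   -c:a aac        video.mp4   # MP4  (Flash)
ffmpeg -i upload.avi -c:v libtheora -c:a libvorbis  video.ogv   # Ogv  (HTML5)
ffmpeg -i upload.avi -c:v libvpx    -c:a libvorbis  video.webm  # WebM (HTML5)
ffmpeg -i upload.wav -vn -c:a libvorbis audio.ogg               # Ogg  (HTML5)
ffmpeg -i upload.wav -vn -c:a libmp3lame audio.mp3              # MP3  (Flash)
```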

  • Installation in farm mode

    4 February 2011

    Farm mode makes it possible to host several MediaSPIP sites while installing its functional core only once.
    This is the method we use on this very platform.
    Using farm mode requires some familiarity with SPIP's mechanics, unlike the standalone version, which does not really require any specific knowledge, since SPIP's usual private area is no longer used.
    First of all, you must have installed the same files as the installation (...)

  • Installation in standalone mode

    4 February 2011

    Installing the MediaSPIP distribution takes several steps: retrieving the necessary files (two methods are possible here: installing the ZIP archive containing the whole distribution, or retrieving the sources of each module separately via SVN); pre-configuration; and the final installation.
    [mediaspip_zip]Installing the MediaSPIP ZIP archive
    This installation mode is the simplest way to install the whole distribution (...)

On other sites (7332)

  • FFmpeg mp4 compression

    22 September 2016, by Nikita Pronchik

    I’m developing a photo and video social network (partly like Instagram), and I’m using ffmpeg for MP4-to-MP4 file-size compression. I achieved a threefold reduction in file size (from 13 MB to 2.5 MB), but rather clumsily, by converting the output files three times with the following command line:

    ./ffmpeg -i input.mp4 -vcodec h264 -acodec mp2 output.mp4

    I’m a newbie in video/audio codec theory, and I completely lost the audio during this compression. So my question is: where did the audio go? Which option should I use? Thanks in advance!
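    A likely explanation (a guess, not tested against the asker's files): MP2 audio inside an MP4 container is poorly supported by many players, so the track may be dropped or unplayable even when present. A more compatible variant re-encodes the audio to AAC, or copies the original track unchanged:

```shell
# Re-encode video with libx264 and audio with AAC (both standard in MP4);
# raise -crf toward 28 for smaller files, lower it toward 18 for higher quality.
ffmpeg -i input.mp4 -c:v libx264 -crf 23 -c:a aac -b:a 128k output.mp4

# Or recompress only the video and keep the original audio track as-is:
ffmpeg -i input.mp4 -c:v libx264 -crf 23 -c:a copy output.mp4
```

    A single pass at a chosen CRF should replace the triple conversion entirely, since each lossy re-encode degrades quality without a matching size benefit.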

  • Re-streaming (forwarding) a UDP live video stream (using Winsock) reduces video quality?

    18 August 2014, by vantrung -cuncon

    So, I used ffmpeg to stream the live webcam over UDP to port 1111:

    ffmpeg -f dshow -i video="Lenovo EasyCamera" -f mpegts udp://localhost:1111

    When I played it directly with ffplay from port 1111, everything worked properly:

    ffplay udp://localhost:1111

    I got video quality like this:

    [screenshot: good video quality]

    So I figured I could write some Winsock code to listen on port 1111 and forward every UDP packet it catches to port 2222. That way I could simulate streaming to port 2222. My code is something like this:

    ' Note: this is simplified code - it worked, so I've only posted the key lines
    Winsock1.Bind 1111
    Winsock2.RemotePort = 2222

    Winsock1.GetData myPacket
    Winsock2.SendData myPacket

    Then I tried playing the stream from port 2222 using ffplay:

    ffplay udp://localhost:2222

    Well, I don’t know why the video quality turned out this bad:

    [screenshot: degraded video quality]

    The point is, I’ve sent the same UDP packets in the same order as the streaming source. What could be wrong here?


    PS: I’ve tried a similar experiment with TCP, and the resulting video quality was as good as direct streaming. So could this be a problem with UDP?


    PS2: I tested for UDP packet loss and reordering by replacing ffplay with a socket that listens on port 2222 and prints every received packet. The result: all 10,000+ packets arrived in the correct order and nothing was lost. What a crazy phenomenon!
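    One hedged suggestion: MPEG-TS over UDP is sensitive to datagram size and pacing, not just order and loss. If the forwarder reads with a buffer size different from the sender's datagram size, datagrams can be truncated or merged, and bursty forwarding can overflow the player's receive buffer. ffmpeg's UDP protocol exposes URL options that may help (pkt_size, fifo_size and overrun_nonfatal are real ffmpeg UDP options; the ports match the question):

```shell
# Send fixed 1316-byte payloads (7 x 188-byte TS packets) per datagram
ffmpeg -f dshow -i video="Lenovo EasyCamera" -f mpegts "udp://localhost:1111?pkt_size=1316"

# Give ffplay a large receive FIFO and tolerate overruns on the forwarded stream
ffplay "udp://localhost:2222?fifo_size=1000000&overrun_nonfatal=1"
```

    With a fixed pkt_size, the forwarder can also read into a buffer of exactly that size, so each received datagram maps to exactly one sent datagram.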

  • real time video streaming in C#

    16 June 2016, by Nuwan

    I’m developing an application for real-time streaming. The streaming has two parts:
    I use a capture card to capture a live source, which I need to stream in real time,
    and I also need to stream a local video file.

    To stream a local video file in real time, I use Emgu CV to capture the video frames as bitmaps.
    To achieve this I create a bitmap list and save the captured bitmaps to that list from one thread,
    and I also display those frames in a picture box. The bitmap list can hold 1 second of video: if the frame rate is
    30, it stores 30 video frames. After filling this list I start another thread to encode that 1-second chunk of
    video.

    For encoding I use an ffmpeg wrapper called NReco. I write the video frames to ffmpeg
    and start ffmpeg to encode. After stopping that task I can get the encoded data as a byte array.

    Then I send that data over UDP through the LAN.

    This works, but I cannot achieve smooth streaming. When I receive the stream in VLC player there is a delay of some milliseconds between packets, and I also noticed frames being lost.

    private Capture _capture = null;
    Image frame;

    // Capture each frame and store it in the list
    private void ProcessFrame(object sender, EventArgs arg)
    {
        frame = _capture.QueryFrame();
        frameBmp = frame.ToBitmap();

        twoSecondVideoBitmapFramesForEncode.Add(frameBmp);
        if (twoSecondVideoBitmapFramesForEncode.Count == (int)FrameRate)
        {
            isInitiate = false;
            thread = new Thread(new ThreadStart(encodeTwoSecondVideo));
            thread.IsBackground = true;
            thread.Start();
        }
    }

    public void encodeTwoSecondVideo()
    {
        List<Bitmap> copyOfTwoSecondVideo = twoSecondVideoBitmapFramesForEncode.ToList();
        twoSecondVideoBitmapFramesForEncode.Clear();

        int g = (int)FrameRate * 2;
        string outPutFrameSize = frameWidth.ToString() + "x" + frameHeight.ToString();
        ms = new MemoryStream();

        // Create the ffmpeg encoding task; these are the parameters I use for H.264
        ffMpegTask = ffmpegConverter.ConvertLiveMedia(
            Format.raw_video,
            ms,
            Format.h264,
            new ConvertSettings()
            {
                // raw input: Windows bitmap pixel format
                CustomInputArgs = " -pix_fmt bgr24 -video_size " + frameWidth + "x" + frameHeight + " -framerate " + FrameRate + " ",
                CustomOutputArgs = " -threads 7 -preset ultrafast -profile:v baseline -level 3.0 -tune zerolatency -qp 0 -pix_fmt yuv420p -g " + g + " -keyint_min " + g + " -flags -global_header -sc_threshold 40 -qscale:v 1 -crf 25 -b:v 10000k -bufsize 20000k -s " + outPutFrameSize + " -r " + FrameRate + " -pass 1 -coder 1 -movflags frag_keyframe -movflags +faststart -c:a libfdk_aac -b:a 128k "
            });

        ffMpegTask.Start();

        // Take the 2-second chunk of video bitmaps from the list and write them to ffmpeg
        foreach (var item in copyOfTwoSecondVideo)
        {
            id++;
            Thread.Sleep((int)(1000.5 / FrameRate));

            BitmapData bd = item.LockBits(new Rectangle(0, 0, item.Width, item.Height), ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
            byte[] buf = new byte[bd.Stride * item.Height];
            Marshal.Copy(bd.Scan0, buf, 0, buf.Length);
            ffMpegTask.Write(buf, 0, buf.Length);
            item.UnlockBits(bd);
        }
    }

    This is the process I use to achieve live streaming, but the stream is not smooth. I tried using a queue instead
    of a list to reduce the latency of filling the list, because I suspected the encoding thread encodes
    and sends each 2-second chunk very quickly, but when it finishes, the bitmap list is not yet
    completely full, so the encoding thread stalls until the next 2 seconds of video are ready.

    If anyone can help me figure this out I would be very grateful. If the way I’m doing this is wrong, please correct me.
    Thank you!
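    One way to sidestep the per-chunk stalls (a sketch of an alternative pipeline, not the asker's NReco setup) is to keep a single ffmpeg process alive for the whole session: feed it raw BGR frames on stdin as they are captured, and let it push low-latency MPEG-TS straight to the receiver over UDP, so the encoder never stops and restarts between chunks:

```shell
# Long-lived encoder: raw BGR24 frames in on stdin, H.264 in MPEG-TS out over UDP.
# Frame size, frame rate and the destination address are placeholders.
ffmpeg -f rawvideo -pix_fmt bgr24 -video_size 1280x720 -framerate 30 -i - \
       -c:v libx264 -preset ultrafast -tune zerolatency -g 60 \
       -f mpegts "udp://192.168.1.50:1234?pkt_size=1316"
```

    In the C# code this would mean writing each frame's pixel buffer to the process's stdin as soon as it is captured, with no intermediate bitmap list and no per-chunk Start/Stop cycle.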