
Media (1)

Keyword: - Tags -/biomaping

Other articles (98)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, a preconfiguration is automatically applied by MediaSPIP init so that the new feature works straight away. It is therefore not necessary to go through a configuration step for this.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (8839)

  • How to extract a stream of images from a video file using ffmpeg

    13 December 2013, by user3032143

    I want to extract stream of images from a video file using ffmpeg.

    I know I can extract them straight to the hard drive using these arguments:

    -i - -qscale 1 h:\out\img-%05d.jpg

    But I would like to extract directly to a stream.

    This is my code so far:

    private void ExtractImagesFromVideo(byte[] data,string _args)
       {
           try
           {
               serverBuild = new Process();
               serverBuild.StartInfo.WorkingDirectory = Environment.CurrentDirectory;
               serverBuild.StartInfo.Arguments = _args;
               serverBuild.StartInfo.FileName = Environment.CurrentDirectory + @"\ffmpeg.exe";
               serverBuild.StartInfo.UseShellExecute = false;
               serverBuild.StartInfo.RedirectStandardOutput = true;
               serverBuild.StartInfo.RedirectStandardError = true;
               serverBuild.StartInfo.RedirectStandardInput = true;
               serverBuild.StartInfo.CreateNoWindow = true;
               serverBuild.StartInfo.LoadUserProfile = false;
               serverBuild.EnableRaisingEvents = true;
               serverBuild.Start();

               using (BinaryWriter bw = new BinaryWriter(serverBuild.StandardInput.BaseStream))
               {
                   bw.Write(data);
               }

               mStandardOutput = serverBuild.StandardOutput.BaseStream;
               mStandardOutput.BeginRead(mReadBuffer, 0, mReadBuffer.Length, StandardOutputReadCallback, null);
               serverBuild.WaitForExit();
               byte[] _data = mStandardOutputMs.ToArray();
               mStandardOutput.Close();

           }
           catch (Exception _ex)
           {

           }
           finally
           {              
               serverBuild.Dispose();
           }
       }

    and I call it like this:

        string _argsOut = @"-i pipe:0 -qscale 1 -f mjpeg pipe:1 ";
        ExtractImagesFromVideo(data, _argsOut);

    and it hangs on this line:

    bw.Write(data);

    thanks
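
    A likely culprit for a hang in this pattern is a pipe deadlock: the whole input is written to ffmpeg's stdin before anything reads stdout, so once ffmpeg's output buffer fills, both processes block. Not the original C#, but a minimal Python sketch of the same pipe:0/pipe:1 approach, feeding stdin from a background thread while the main thread drains the MJPEG output (the file name in the usage comment is hypothetical):

    import subprocess
    import threading

    def extract_jpeg_stream(video_bytes: bytes) -> bytes:
        """Pipe a video into ffmpeg and read the resulting MJPEG stream back from stdout."""
        proc = subprocess.Popen(
            ["ffmpeg", "-i", "pipe:0", "-qscale:v", "1", "-f", "mjpeg", "pipe:1"],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            stderr=subprocess.DEVNULL,
        )

        # Feed stdin from a separate thread so the main thread can keep draining
        # stdout; writing and reading on the same thread can deadlock once either
        # pipe buffer fills up.
        def feed():
            try:
                proc.stdin.write(video_bytes)
            finally:
                proc.stdin.close()  # EOF tells ffmpeg the input is finished

        threading.Thread(target=feed, daemon=True).start()

        jpeg_stream = proc.stdout.read()  # concatenated JPEG images
        proc.wait()
        return jpeg_stream

    # Usage (hypothetical file name):
    # with open("input.mp4", "rb") as f:
    #     data = extract_jpeg_stream(f.read())

    Note that some containers (for example MP4 files whose moov atom sits at the end) cannot be read from a non-seekable pipe, so depending on the source format ffmpeg may still reject piped input.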

  • Why aren't the videos in my S3 bucket buffering to the HTML video tag?

    2 June 2019, by Michael Cain

    I have so far successfully programmed a Node script on a Udoo x86 Advanced Plus that captures the RTSP stream of an Ethernet-connected IP cam. I use ffmpeg to transcode the stream into 5-second mp4 files. As soon as the files show up in the folder, they are uploaded/synced to my AWS S3 bucket. Next, I have a Node server that GETs the most recently created mp4 file from the S3 bucket and runs it through Media Source Extensions and finally into an HTML video tag.

    The videos play in the browser, but not in any kind of seamless manner. No buffering seems to be taking place: one video plays, then another, and so on. The video skips all over the place.

    I would really appreciate any guidance with this bug.

    export function startlivestream() {
      const videoElement = document.getElementById("my-video");
      const myMediaSource = new MediaSource();
      const url = URL.createObjectURL(myMediaSource);
      videoElement.src = url;
      myMediaSource.addEventListener("sourceopen", sourceOpen);
    }

    function sourceOpen() {
      if (window.MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E, mp4a.40.2"')) {
        console.log("YES");
      }

      // 1. add source buffers
      const mediaCodec = 'video/mp4; codecs="avc1.4D601F"';
      var mediasource = this;
      const videoSourceBuffer = mediasource.addSourceBuffer(mediaCodec);

      // 2. download and add our audio/video to the SourceBuffers
      function checkVideo(url) {
        var oReq = new XMLHttpRequest();
        oReq.open("GET", url, true);
        oReq.responseType = "arraybuffer";

        oReq.onload = function(oEvent) {
          var arrayBuffer = oReq.response; // Note: not oReq.responseText
          if (arrayBuffer) {
            videoSourceBuffer.addEventListener("updateend", function(_) {
              mediasource.endOfStream();
              document.getElementById("my-video").play();
            });
            videoSourceBuffer.appendBuffer(arrayBuffer);
          }
        };

        oReq.send(null);
      }

      setInterval(function() {
        checkVideo("http://localhost:8080");
      }, 5000);
    }

    My ffmpeg arguments:

    const startRecording = () => {
     const args = [
       "-rtsp_transport",
       "tcp",
       "-i",
       inputURL,
       "-f",
       "segment",
       "-segment_time",
       "5",
       "-segment_format",
       "mp4",
       "-segment_format_options",
       "movflags=frag_keyframe+empty_moov+default_base_moof",
       "-segment_time",
       "5",
       "-segment_list_type",
       "m3u8",
       "-c:v",
       "copy",
       "-strftime",
       "1",
       `${path.join(savePath, "test-%Y-%m-%dT%H-%M-%S.mp4")}`
     ];
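
    Not the author's Node code, but a minimal standalone Python sketch of what the argument list above amounts to, with a hypothetical RTSP URL and output directory; the movflags options produce fragmented MP4 segments, which is the form MediaSource source buffers expect:

    import subprocess
    from pathlib import Path

    # Hypothetical camera URL and output directory
    input_url = "rtsp://camera.local/stream"
    save_path = Path("segments")
    save_path.mkdir(exist_ok=True)

    # Copy the incoming video stream and cut it into 5-second fragmented
    # MP4 segments whose names are derived from their start time.
    cmd = [
        "ffmpeg",
        "-rtsp_transport", "tcp",
        "-i", input_url,
        "-f", "segment",
        "-segment_time", "5",
        "-segment_format", "mp4",
        "-segment_format_options", "movflags=frag_keyframe+empty_moov+default_base_moof",
        "-segment_list_type", "m3u8",
        "-c:v", "copy",
        "-strftime", "1",
        str(save_path / "test-%Y-%m-%dT%H-%M-%S.mp4"),
    ]

    subprocess.run(cmd, check=True)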

    From what I have learned about Media Source Extensions, they allow multiple videos to be taken in and buffered by the client so that, in simple terms, they play back as one longer video.

  • How to pass BytesIO image objects to ffmpeg?

    13 April 2023, by Mr.Slow

    I have a (nested) list of BytesIO objects (images) that I would like to pass to ffmpeg to make a video. I know that ffmpeg cannot take them directly. What should I convert them into first? There may be a better way using 'pipe:', which I have not managed to implement yet.
    (In this example code I ignore image duration and audio, too.)

    


    def merge_videos(file_id: float, audio_list: List[BinaryIO], duration_list: List[float], images_nested_list):
        # flatten the nested list of images
        images_list = [image for images_sublist in images_nested_list for image in images_sublist]

        additional_parameters = {'c:a': 'aac', 'c:v': 'libx264'}

        # Create a BytesIO object to hold the output video data
        output_data = io.BytesIO()

        # create the FFmpeg command with the specified parameters and pipe the output to the BytesIO object
        command = ffmpeg.output(*images_list, '-', vf='fps=10,format=yuv420p', preset='veryfast', shortest=None, r=10, max_muxing_queue_size=4000, **additional_parameters).pipe(output_data)

        try:
            # run the FFmpeg command with error and output capture
            subprocess.check_output(['ffmpeg', '-y', '-f', 'concat', '-safe', '0', '-i', 'audio.txt', '-i', '-', '-c:v', 'copy', '-c:a', 'aac', f"{PROJECT_PATH}/data/final-{file_id}.mp4"], input=output_data.getvalue())
            log.info("Final video with file_id %s has been converted successfully", file_id)


    


    ...this code returns:

    TypeError: Expected incoming stream(s) to be of one of the following types: ffmpeg.nodes.FilterableStream; got <class>


    How should I handle this, please? Thanks for the help.
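
    Not a fix for the ffmpeg-python call above, but a minimal sketch of one way to hand in-memory images to ffmpeg: write the raw bytes of each BytesIO object to ffmpeg's stdin as an image2pipe stream. This assumes the images are JPEG-encoded; the frame rate and the output name in the usage comment are illustrative, and audio and per-image durations are ignored, as in the question:

    import io
    import subprocess
    from typing import List

    def images_to_video(images: List[io.BytesIO], output_path: str, fps: int = 10) -> None:
        """Pipe a list of JPEG-encoded BytesIO images into ffmpeg and encode an MP4."""
        cmd = [
            "ffmpeg", "-y",
            "-f", "image2pipe",        # read a stream of images from stdin
            "-framerate", str(fps),
            "-c:v", "mjpeg",           # the piped images are assumed to be JPEGs
            "-i", "pipe:0",
            "-vf", "format=yuv420p",
            "-c:v", "libx264",
            "-preset", "veryfast",
            "-r", str(fps),
            output_path,
        ]
        proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
        for image in images:
            proc.stdin.write(image.getvalue())  # raw encoded bytes, not the BytesIO object
        proc.stdin.close()
        proc.wait()

    # Usage with the flattened list from the question (hypothetical output name):
    # images_list = [img for sub in images_nested_list for img in sub]
    # images_to_video(images_list, "final.mp4")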
