Advanced search

Media (0)

Keyword: - Tags -/clipboard

No media matching your criteria is available on this site.

Other articles (49)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

  • Keeping control of your media in your hands

    13 April 2011, by

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • MediaSPIP Core: Configuration

    9 November 2010, by

    MediaSPIP Core provides three configuration pages by default (these pages rely on the CFG configuration plugin): a page for the general configuration of the skeleton; a page for the configuration of the site's home page; and a page for the configuration of the site's sections.
    It also provides an additional page, shown only when certain plugins are enabled, for controlling their display and specific features (...)

On other sites (6304)

  • nodejs FFMPEG argument issues

    16 August 2016, by shaun

    When I run the following command on the command line, FFmpeg works correctly and mutes the sections of the video as expected.

    C:\>ffmpeg -y -i C:/Users/ADMINI~1/AppData/Local/Temp/2/0400028520160811144100001i100.mp4 -af "volume=enable='between(t,1,3)':volume=0, volume=enable='between(t,10,12)':volume=0, volume=enable='between(t,4,6)':volume=0, volume=enable='between(t,7,9)':volume=0" -c:v copy -movflags +faststart c:\temp\31e7ac4063d111e6bdc67f1f7f7b55d3.mp4
    ffmpeg version N-79651-ge1c2048 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.9.3 (GCC)
     configuration: --prefix=/usr/local/x86_64-w64-mingw32 --enable-gpl --enable-nonfree --enable-libx264 --enable-libfdk_aac --enable-static --enable-runtime-cpudetect --enable-w32threads --disable-shared --disable-ffplay --disable-ffserver --arch=x86_64 --extra-cflags=-I/local/x86_64-w64-mingw32/include --extra-ldflags='-L/local/x86_64-w64-mingw32/lib -static'
     libavutil      55. 22.101 / 55. 22.101
     libavcodec     57. 38.100 / 57. 38.100
     libavformat    57. 34.103 / 57. 34.103
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 44.100 /  6. 44.100
     libswscale      4.  1.100 /  4.  1.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'C:/Users/ADMINI~1/AppData/Local/Temp/2/0400028520160811144100001i100.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.34.103
     Duration: 00:00:12.78, start: 0.000000, bitrate: 4074 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1280x720, 4006 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 64 kb/s (default)
       Metadata:
         handler_name    : SoundHandler
    [mp4 @ 03a8e5e0] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
       Last message repeated 1 times
    Output #0, mp4, to 'c:\temp\31e7ac4063d111e6bdc67f1f7f7b55d3-AudRedact.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.34.103
       Stream #0:0(und): Video: h264 ([33][0][0][0] / 0x0021), yuv420p, 1280x720, q=2-31, 4006 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (LC) ([64][0][0][0] / 0x0040), 48000 Hz, stereo, fltp, 128 kb/s (default)
       Metadata:
         handler_name    : SoundHandler
         encoder         : Lavc57.38.100 aac
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
     Stream #0:1 -> #0:1 (aac (native) -> aac (native))
    Press [q] to stop, [?] for help
    [mp4 @ 03a8e5e0] Starting second pass: moving the moov atom to the beginning of the file
    frame=  383 fps=0.0 q=-1.0 Lsize=    6325kB time=00:00:12.73 bitrate=4068.3kbits/s speed=24.7x
    video:6244kB audio:69kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.191899%
    [aac @ 03776160] Qavg: 51080.756

    However, if I build the same in node programmatically:

    function redactAudio(tempFilePath, arrPairs, cb){
       // ffmpeg -loglevel fatal -y -i video.mp4 -af "volume=enable='between(t,5,10)':volume=0, volume=enable='between(t,15,20)':volume=0" -c:v copy -movflags +faststart output.mp4

       var tempBuff = Buffer.alloc(16);
       jsuuid.v1(null, tempBuff, 0);
       var outFileName = tempBuff.toString('hex') + path2.extname(tempFilePath);

       var volStr = '"';
       for (var i = 0; i < arrPairs.length; i++) {
           volStr += "volume=enable='between(t," + arrPairs[i].start + "," + arrPairs[i].end + ")':volume=0";
           if (i !== arrPairs.length - 1) {
               volStr += ", ";
           } else {
               volStr += '"';
           }
       }

       child_process.execFile(
           'ffmpeg',
           [
               /*'-loglevel', 'fatal',*/
               '-y', '-i', tempFilePath,
               '-af', volStr,
               '-c:v', 'copy',
               '-movflags', '+faststart', outFileName
           ],
           {
               cwd: tempDir,
               maxBuffer: Infinity
           },
           function(err, stdout, stderr) {
               if (err) {
                   console.error(clc.magentaBright(clc.whiteBright('FFMPEG - ERROR OCC: ', path2.basename(tempFilePath), ' : ', stderr, '\n')));
                   return cb(err, 'FFMpeg Failed: ' +  JSON.stringify({ stdout: stdout, stderr: stderr} ));
               } else {
                   console.log(clc.magentaBright(clc.whiteBright('FFMPEG - Finished:  ', path2.basename(tempFilePath), '\n')));
                   return cb(null, outFileName);
               }
           }
       );  
    }

    Please help me understand what is going on, because I get the following error every time it is run from node.

    FFMPEG - ERROR OCC:  0400028520160811144100001i100.mp4  :  ffmpeg version N-79651-ge1c2048 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.9.3 (GCC)
     configuration: --prefix=/usr/local/x86_64-w64-mingw32 --enable-gpl --enable-nonfree --enable-libx264 --enable-libfdk_aac --enable-static --enable-runtime-cpudetect --enable-w32threads --disable-shared --disable-ffplay --disable-ffserver --arch=x86_64 --extra-cflags=-I/local/x86_64-w64-mingw32/include --extra-ldflags='-L/local/x86_64-w64-mingw32/lib -static'
     libavutil      55. 22.101 / 55. 22.101
     libavcodec     57. 38.100 / 57. 38.100
     libavformat    57. 34.103 / 57. 34.103
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 44.100 /  6. 44.100
     libswscale      4.  1.100 /  4.  1.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'C:/Users/ADMINI~1/AppData/Local/Temp/2/0400028520160811144100001i100.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.34.103
     Duration: 00:00:12.78, start: 0.000000, bitrate: 4074 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1280x720, 4006 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 64 kb/s (default)
       Metadata:
         handler_name    : SoundHandler
    [AVFilterGraph @ 00381600] No such filter: '"volume'
    Error opening filters!

    0400028520160811144100001i100.mp4
    { Error: Command failed: ffmpeg -y -i C:/Users/ADMINI~1/AppData/Local/Temp/2/0400028520160811144100001i100.mp4 -af "volume=enable='between(t,1,3)':volume=0, volume=enable='between(t,10,12)':volume=0, volume=enable='between(t,4,6)':volume=0, volume=enable='between(t,7,9)':volume=0" -c:v copy -movflags +faststart ea1eae2063d211e6bbf2af16c2e9be57.mp4
    ffmpeg version N-79651-ge1c2048 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.9.3 (GCC)
     configuration: --prefix=/usr/local/x86_64-w64-mingw32 --enable-gpl --enable-nonfree --enable-libx264 --enable-libfdk_aac --enable-static --enable-runtime-cpudetect --enable-w32threads --disable-shared --disable-ffplay --disable-ffserver --arch=x86_64 --extra-cflags=-I/local/x86_64-w64-mingw32/include --extra-ldflags='-L/local/x86_64-w64-mingw32/lib -static'
     libavutil      55. 22.101 / 55. 22.101
     libavcodec     57. 38.100 / 57. 38.100
     libavformat    57. 34.103 / 57. 34.103
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 44.100 /  6. 44.100
     libswscale      4.  1.100 /  4.  1.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'C:/Users/ADMINI~1/AppData/Local/Temp/2/0400028520160811144100001i100.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.34.103
     Duration: 00:00:12.78, start: 0.000000, bitrate: 4074 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1280x720, 4006 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 64 kb/s (default)
       Metadata:
         handler_name    : SoundHandler
    [AVFilterGraph @ 00381600] No such filter: '"volume'
    Error opening filters!

       at ChildProcess.exithandler (child_process.js:202:12)
       at emitTwo (events.js:106:13)
       at ChildProcess.emit (events.js:191:7)
       at maybeClose (internal/child_process.js:850:16)
       at Process.ChildProcess._handle.onexit (internal/child_process.js:215:5)
     killed: false,
     code: 1,
     signal: null,
     cmd: 'ffmpeg -y -i C:/Users/ADMINI~1/AppData/Local/Temp/2/0400028520160811144100001i100.mp4 -af "volume=enable=\'between(t,1,3)\':volume=0, volume=enable=\'between(t,10,12)\':volume=0, volume=enable=\'between(t,4,6)\':volume=0, volume=enable=\'between(t,7,9)\':volume=0" -c:v copy -movflags +faststart ea1eae2063d211e6bbf2af16c2e9be57.mp4' }

    The best I can think is that it has something to do with the \' in volStr, but I don't know how to create it any other way.


    After @Mulvya's comment I changed my for loop to:

    var volStr = '';
    for (var i = 0; i < arrPairs.length; i++) {
       volStr += "volume=enable='between(t," + arrPairs[i].start + "," + arrPairs[i].end + ")':volume=0";
       if (i !== arrPairs.length - 1) {
           volStr += ",";
       }
    }

    getting rid of the wrapping double quotes and the space between each grouping, and it worked beautifully. (execFile passes its arguments to the binary directly, without a shell, so the double quotes that protect the filter string on a command line were reaching ffmpeg as literal characters.)
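
    To make the difference concrete, this is the -af value ffmpeg receives in each case (shown here for the first two pairs only; execFile hands each array element to the binary unchanged). The original loop produced

    "volume=enable='between(t,1,3)':volume=0, volume=enable='between(t,10,12)':volume=0"

    whose leading double quote is why the filter parser failed with No such filter: '"volume', while the corrected loop produces

    volume=enable='between(t,1,3)':volume=0,volume=enable='between(t,10,12)':volume=0

    which parses as a chain of volume filters.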

  • RTSP to HLS via FFMPEG, latency issues

    28 June 2024, by Pabl0

    The following are all the steps that I took to render an RTSP stream in my web app:

    


    How to display RTSP stream in browser using HLS

    


    Situation and Problem
You have an RTSP stream that you want to display in a browser using HLS (HTTP Live Streaming). However, when you try to play the RTSP stream in the browser using hls.js, you encounter the error "Unsupported HEVC in M2TS found." This error indicates that the HLS stream uses the HEVC (H.265) codec, which is not widely supported by many browsers and HLS players, including hls.js.
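
    To confirm what the camera is actually sending before transcoding, ffprobe (shipped with FFmpeg) can print the video stream's codec name; a quick check, using the same stream URL as in the command below:

    ffprobe -v error -select_streams v:0 -show_entries stream=codec_name -of default=nw=1 rtsp://192.168.144.25:8554/main.264

    If this prints codec_name=hevc, the transcoding step below is needed; if it prints codec_name=h264, the stream could be repackaged to HLS without re-encoding.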

    


    The most reliable solution is to transcode the stream from H.265 to H.264 using FFmpeg, which is more broadly supported. Here's how to transcode the stream:

    


    Step 1: Transcode the Stream Using FFmpeg

    


    Run the following FFmpeg command to transcode the RTSP stream from H.265 to H.264 and generate the HLS segments:

    


    ffmpeg -i rtsp://192.168.144.25:8554/main.264 -c:v libx264 -c:a aac -strict -2 -hls_time 10 -hls_list_size 0 -f hls C:\path\to\output\index.m3u8


    


    -c:v libx264 sets the video codec to H.264.

    -c:a aac sets the audio codec to AAC.

    -hls_time 10 sets the duration of each segment to 10 seconds.

    -hls_list_size 0 tells FFmpeg to include all segments in the playlist.

    -f hls specifies the output format as HLS.

    C:\path\to\output\ is the directory where the HLS files will be saved; make sure it points to where you want the HLS output.

    


    Step 2: Verify the HLS Files

    


    After running the FFmpeg command, verify that the following files are generated in the output directory:

    


    index.m3u8 (HLS playlist file)

    


    Multiple .ts segment files (e.g., index0.ts, index1.ts, etc.)
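
    For reference, a freshly generated index.m3u8 is a plain-text playlist that looks roughly like the sketch below (segment names and durations depend on the flags above; this assumes two finished 10-second segments). While FFmpeg is still running there is no closing #EXT-X-ENDLIST tag, which is how players know the stream is live:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.000000,
    index0.ts
    #EXTINF:10.000000,
    index1.ts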

    


    Step 3: Serve the HLS Files with an HTTP Server

    


    Navigate to the directory containing the HLS files and start the HTTP server:

    


    cd C:\path\to\output
    python -m http.server 8000

    Step 4: Update and Test the HTML File

    Ensure that the hls_test.html file is in the same directory as the HLS files and update it as needed:

    


    hls_test.html:

    


    <!DOCTYPE html>
    <html>
    <!-- The markup lost most of its tags in extraction; the page skeleton and
         the two id attributes are restored here to match the script below. -->
    <body>
        <h1>HLS Stream Test</h1>
        <button id="playButton">Play Stream</button>
        <video id="video" controls="controls" style="width: 100%; height: auto;"></video>
        <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
        <script>
            document
                .getElementById('playButton')
                .addEventListener('click', () => {
                    const video = document.getElementById('video');
                    if (Hls.isSupported()) {
                        const hls = new Hls();
                        hls.loadSource('http://localhost:8000/index.m3u8');
                        hls.attachMedia(video);
                        hls.on(Hls.Events.MANIFEST_PARSED, function () {
                            video.play().catch((error) => {
                                console.error('Error attempting to play:', error);
                            });
                        });
                        hls.on(Hls.Events.ERROR, function (event, data) {
                            console.error('HLS Error:', data);
                        });
                    } else if (
                        video.canPlayType('application/vnd.apple.mpegurl')
                    ) {
                        video.src = 'http://localhost:8000/index.m3u8';
                        video.addEventListener('canplay', function () {
                            video.play().catch((error) => {
                                console.error('Error attempting to play:', error);
                            });
                        });
                    } else {
                        console.error('HLS not supported in this browser.');
                    }
                });
        </script>
    </body>
    </html>

    Step 5: Open the HTML File in Your Browser

    Open your browser and navigate to:

    http://localhost:8000/hls_test.html


    Click the "Play Stream" button to start playing the HLS stream. If everything is set up correctly, you should see the video playing in the browser.


    Conclusion


    By transcoding the RTSP stream from H.265 to H.264 and serving it as an HLS stream, you can display the video in a browser using hls.js. This approach ensures broader compatibility with browsers and HLS players, allowing you to stream video content seamlessly.


    PART 2: Add this method to the React app


    We are assuming that the ffmpeg command is running in the background and generating the HLS stream. Now, we will create a React component that plays the HLS stream in the browser using the video.js library.


    If not, please refer to the previous steps (steps 1-3 of the previous section) to generate the HLS stream using FFmpeg.


    Step 1: Create the Camera Component


    import { useRef } from 'react';
    import videojs from 'video.js';
    import 'video.js/dist/video-js.css';

    const Camera = ({ streamUrl }) => {
        const videoRef = useRef(null);
        const playerRef = useRef(null);

        const handlePlayClick = () => {
            const videoElement = videoRef.current;
            if (videoElement) {
                playerRef.current = videojs(videoElement, {
                    controls: true,
                    autoplay: false,
                    preload: 'auto',
                    sources: [
                        {
                            src: streamUrl,
                            type: 'application/x-mpegURL',
                        },
                    ],
                });

                playerRef.current.on('error', () => {
                    const error = playerRef.current.error();
                    console.error('VideoJS Error:', error);
                });

                playerRef.current.play().catch((error) => {
                    console.error('Error attempting to play:', error);
                });
            }
        };

        return (
            // The JSX attributes were stripped in the original post; the ref and
            // the click handler are re-attached here based on the code above.
            <div>
                <button onClick={handlePlayClick}>Play Stream</button>
                <video ref={videoRef} className="video-js" />
            </div>
        );
    };

    export default Camera;


    Note: This component uses the video.js library to play the HLS stream. Make sure to install video.js using npm or yarn:


    npm install video.js


    Step 2: Use the Camera Component in Your App


    Now you can use the Camera component in your React app to display the HLS stream. For example:


    <Camera streamUrl="http://localhost:8000/index.m3u8" />


    Note: this points to the HLS stream URL generated by FFmpeg in the previous steps.


    Step 3: Create the CORS proxy server and place it where the HLS files are stored.


    from http.server import HTTPServer, SimpleHTTPRequestHandler
    import socketserver
    import os

    class CORSRequestHandler(SimpleHTTPRequestHandler):
        def end_headers(self):
            if self.path.endswith('.m3u8'):
                self.send_header('Content-Type', 'application/vnd.apple.mpegurl')
            elif self.path.endswith('.ts'):
                self.send_header('Content-Type', 'video/MP2T')
            super().end_headers()

    if __name__ == '__main__':
        port = 8000
        handler = CORSRequestHandler
        web_dir = r'C:\Video_cam_usv'
        os.chdir(web_dir)
        httpd = socketserver.TCPServer(('', port), handler)
        print(f"Serving HTTP on port {port}")
        httpd.serve_forever()


    Note: Change web_dir to the directory where the HLS files are stored.
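
    Note also that, despite its name, this handler only fixes the Content-Type of the responses; it does not actually add CORS headers. That is fine while the page and the playlist are served from the same origin (localhost:8000), but if the React app runs on a different origin, end_headers() would also need a line such as self.send_header('Access-Control-Allow-Origin', '*') (an illustrative value, not part of the original script).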


    Also, note that the server is sending the correct MIME types for .m3u8 and .ts files. For example:


    .m3u8 should be application/vnd.apple.mpegurl or application/x-mpegURL.
    .ts should be video/MP2T.


    Step 4: Start the CORS Proxy Server


    Open a terminal, navigate to the directory where the CORS proxy server script is located (the same directory where the HLS files are saved), and run the following command:


    python cors_proxy_server.py


    This will start the CORS proxy server on port 8000 and serve the HLS files with the correct MIME types.


    Step 5: Start the React App

    Start your React app using the following command:


    npm run dev


    I have tried everything above (it's my own doc to keep track of the steps I've taken so far) and I get the stream to render in my web app, but the latency is very high, at least 5-10 seconds. How can I make it real time, or close to that?

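    A note on the latency question: HLS latency is roughly the segment duration multiplied by the number of segments the player buffers, so 10-second segments make 5-10+ seconds of delay expected rather than a bug. A first step, sketched below with the same stream URL and not tested against this camera, is shorter segments, a short rolling playlist, and a low-latency encoder configuration:

    ffmpeg -rtsp_transport tcp -fflags nobuffer -i rtsp://192.168.144.25:8554/main.264 -c:v libx264 -preset ultrafast -tune zerolatency -g 30 -c:a aac -hls_time 1 -hls_list_size 3 -hls_flags delete_segments -f hls C:\path\to\output\index.m3u8

    Even tuned this way, plain HLS rarely gets below a few seconds; for genuinely real-time playback a different transport such as WebRTC or Low-Latency HLS is usually required.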

  • Issues with Video Recording Duration and Smooth Playback when Using v4l2 framework to MP4 (FFmpeg)

    9 December 2024, by Reena

    I'm trying to record a video from a USB device with v4l2 framework and save it in MP4 format using FFmpeg. My sample code successfully captures and saves the video, but I'm running into some issues :


    The recorded video duration is shorter than expected. For instance:


    When recording a 1-minute video at 1280x720, the output file only contains 58 or 59 seconds. For 1920x1080, the duration is even more off: only about 28 or 30 seconds instead of the expected 1 minute. Additionally, the video is not smooth. There are noticeable frame drops and playback inconsistencies.


    My setup:


    Using a USB device with the v4l2 framework
    Saving the video in MP4 format
    Tested with different resolutions (1280x720, 1920x1080)

    I've attached my sample code below. Could someone help me figure out why I'm experiencing these issues with video duration and smooth playback?


    /* Note: several #include names were stripped from the original post;
       they are restored below from the functions the code uses. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <linux/videodev2.h>
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    #include <libavutil/imgutils.h>
    #include <libavutil/opt.h>
    #include <libswscale/swscale.h>
    #include <errno.h>
    #include <sys/time.h>
    #include <time.h>

    #define WIDTH 1280
    #define HEIGHT 720
    #define FPS 30
    #define DURATION 10 // Recording duration in seconds
    #define BUFFER_COUNT 4 // Number of buffers

    struct buffer {
        void *start;
        size_t length;
    };

    struct buffer *buffers;

    void open_device(int *fd, const char *device) {
        *fd = open(device, O_RDWR | O_NONBLOCK);
        if (*fd < 0) {
            perror("Cannot open video device");
            exit(1);
        }
    }

    void init_mmap(int fd) {
        struct v4l2_requestbuffers req;
        memset(&req, 0, sizeof(req));
        req.count = BUFFER_COUNT;
        req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_MMAP;

        if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0) {
            perror("Requesting buffer");
            exit(1);
        }

        buffers = calloc(req.count, sizeof(*buffers));
        for (size_t i = 0; i < req.count; ++i) {
            struct v4l2_buffer buf;
            memset(&buf, 0, sizeof(buf));
            buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buf.memory = V4L2_MEMORY_MMAP;
            buf.index = i;

            if (ioctl(fd, VIDIOC_QUERYBUF, &buf) < 0) {
                perror("Querying buffer");
                exit(1);
            }

            buffers[i].length = buf.length;
            buffers[i].start = mmap(NULL, buf.length, PROT_READ | PROT_WRITE, MAP_SHARED, fd, buf.m.offset);

            if (MAP_FAILED == buffers[i].start) {
                perror("mmap");
                exit(1);
            }
        }
    }

    void start_capturing(int fd) {
        for (size_t i = 0; i < BUFFER_COUNT; ++i) {
            struct v4l2_buffer buf;
            memset(&buf, 0, sizeof(buf));
            buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buf.memory = V4L2_MEMORY_MMAP;
            buf.index = i;

            if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) {
                perror("Queue buffer");
                exit(1);
            }
        }

        enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        if (ioctl(fd, VIDIOC_STREAMON, &type) < 0) {
            perror("Start capture");
            exit(1);
        }
    }

    void stop_capturing(int fd) {
        enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

        if (ioctl(fd, VIDIOC_STREAMOFF, &type) < 0) {
            perror("Stop capture");
            exit(1);
        }

        printf("Video capture stopped.\n");
    }

    void unmap_buffers() {
        for (size_t i = 0; i < BUFFER_COUNT; ++i) {
            if (munmap(buffers[i].start, buffers[i].length) < 0) {
                perror("munmap");
                exit(1);
            }
        }

        free(buffers);
    }

    void initialize_ffmpeg(AVFormatContext **fmt_ctx, AVCodecContext **codec_ctx, AVStream **video_stream, const char *filename) {
        av_register_all();

        AVOutputFormat *fmt = av_guess_format(NULL, filename, NULL);
        if (!fmt) {
            fprintf(stderr, "Could not determine output format\n");
            exit(1);
        }

        if (avformat_alloc_output_context2(fmt_ctx, fmt, NULL, filename) < 0) {
            fprintf(stderr, "Could not allocate format context\n");
            exit(1);
        }

        AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_H264);
        if (!codec) {
            fprintf(stderr, "Codec not found\n");
            exit(1);
        }

        *video_stream = avformat_new_stream(*fmt_ctx, NULL);
        if (!*video_stream) {
            fprintf(stderr, "Could not create stream\n");
            exit(1);
        }

        *codec_ctx = avcodec_alloc_context3(codec);
        if (!*codec_ctx) {
            fprintf(stderr, "Could not allocate codec context\n");
            exit(1);
        }

        (*codec_ctx)->codec_type = AVMEDIA_TYPE_VIDEO;
        (*codec_ctx)->width = WIDTH;
        (*codec_ctx)->height = HEIGHT;
        (*codec_ctx)->time_base = (AVRational){1, FPS};
        (*codec_ctx)->framerate = (AVRational){FPS, 1};
        (*codec_ctx)->pix_fmt = AV_PIX_FMT_YUV420P;
        (*codec_ctx)->gop_size = 10;
        (*codec_ctx)->max_b_frames = 1;

        av_opt_set(*codec_ctx, "preset", "fast", 0);
        av_opt_set_int(*codec_ctx, "crf", 23, 0);

        (*video_stream)->time_base = (*codec_ctx)->time_base;
        (*video_stream)->codecpar->codec_id = fmt->video_codec;
        (*video_stream)->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
        (*video_stream)->codecpar->width = (*codec_ctx)->width;
        (*video_stream)->codecpar->height = (*codec_ctx)->height;
        (*video_stream)->codecpar->format = (*codec_ctx)->pix_fmt;

        if ((*fmt_ctx)->oformat->flags & AVFMT_GLOBALHEADER) {
            (*codec_ctx)->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
        }

        if (avcodec_open2(*codec_ctx, codec, NULL) < 0) {
            fprintf(stderr, "Could not open codec\n");
            exit(1);
        }

        if (avcodec_parameters_from_context((*video_stream)->codecpar, *codec_ctx) < 0) {
            fprintf(stderr, "Could not copy codec parameters\n");
            exit(1);
        }

        if (!(fmt->flags & AVFMT_NOFILE)) {
            if (avio_open(&(*fmt_ctx)->pb, filename, AVIO_FLAG_WRITE) < 0) {
                fprintf(stderr, "Could not open output file\n");
                exit(1);
            }
        }

        if (avformat_write_header(*fmt_ctx, NULL) < 0) {
            fprintf(stderr, "Could not write header\n");
            exit(1);
        }
    }

    void capture_and_encode(int fd, AVFormatContext *fmt_ctx, AVCodecContext *codec_ctx, struct SwsContext *sws_ctx, AVStream *video_stream, int duration) {
        struct v4l2_buffer buffer;
        AVFrame *frame = av_frame_alloc();
        AVPacket packet;
        av_init_packet(&packet);

        frame->format = codec_ctx->pix_fmt;
        frame->width = codec_ctx->width;
        frame->height = codec_ctx->height;
        av_image_alloc(frame->data, frame->linesize, codec_ctx->width, codec_ctx->height, codec_ctx->pix_fmt, 32);

        struct timespec start_time;
        clock_gettime(CLOCK_MONOTONIC, &start_time);
        double elapsed_time = 0;
        int64_t pts_counter = 0;
        int frame_count = 0;

        while (elapsed_time < duration) {
            fd_set fds;
            struct timeval tv;
            int r;

            FD_ZERO(&fds);
            FD_SET(fd, &fds);

            tv.tv_sec = 2;
            tv.tv_usec = 0;

            r = select(fd + 1, &fds, NULL, NULL, &tv);
            if (r == -1) {
                perror("select");
                exit(1);
            }

            if (r == 0) {
                fprintf(stderr, "select timeout\n");
                exit(1);
            }

            memset(&buffer, 0, sizeof(buffer));
            buffer.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buffer.memory = V4L2_MEMORY_MMAP;

            if (ioctl(fd, VIDIOC_DQBUF, &buffer) < 0) {
                if (errno == EAGAIN) continue;
                perror("Could not dequeue buffer");
                exit(1);
            }

            uint8_t *src_slices[1] = {buffers[buffer.index].start};
            int src_stride[1] = {WIDTH * 2}; // UYVY is 2 bytes per pixel

            sws_scale(sws_ctx, src_slices, src_stride, 0, HEIGHT, frame->data, frame->linesize);

            frame->pts = pts_counter;
            pts_counter += av_rescale_q(1, (AVRational){1, FPS}, codec_ctx->time_base);

            if (avcodec_send_frame(codec_ctx, frame) < 0) {
                fprintf(stderr, "Error sending frame\n");
                exit(1);
            }

            while (avcodec_receive_packet(codec_ctx, &packet) == 0) {
                av_packet_rescale_ts(&packet, codec_ctx->time_base, video_stream->time_base);
                packet.stream_index = video_stream->index;

                if (av_interleaved_write_frame(fmt_ctx, &packet) < 0) {
                    fprintf(stderr, "Error writing frame\n");
                    exit(1);
                }

                av_packet_unref(&packet);
            }
            printf("Processed frame %d\n", frame_count);

            if (ioctl(fd, VIDIOC_QBUF, &buffer) < 0) {
                perror("Could not requeue buffer");
                exit(1);
            }
            frame_count++;
            struct timespec current_time;
            clock_gettime(CLOCK_MONOTONIC, &current_time);
            elapsed_time = (current_time.tv_sec - start_time.tv_sec) + (current_time.tv_nsec - start_time.tv_nsec) / 1e9;
            printf("Elapsed time: %f seconds\n", elapsed_time);
        }

        av_freep(&frame->data[0]);
        av_frame_free(&frame);
        printf("Total frames processed: %d\n", frame_count);
    }

    int main(int argc, char *argv[]) {
        if (argc != 2) {
            fprintf(stderr, "Usage: %s <output file>\n", argv[0]);
            exit(1);
        }

        const char *output_file = argv[1];
        int fd;
        open_device(&fd, "/dev/video2");

        struct v4l2_format fmt;
        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = WIDTH;
        fmt.fmt.pix.height = HEIGHT;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_UYVY;
        fmt.fmt.pix.field = V4L2_FIELD_NONE;

        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {
            perror("Setting pixel format");
            exit(1);
        }

        if (fmt.fmt.pix.pixelformat != V4L2_PIX_FMT_UYVY) {
            fprintf(stderr, "Device does not support UYVY format\n");
            exit(1);
        }

        init_mmap(fd);
        start_capturing(fd);

        AVFormatContext *fmt_ctx = NULL;
        AVCodecContext *codec_ctx = NULL;
        AVStream *video_stream = NULL;

        initialize_ffmpeg(&fmt_ctx, &codec_ctx, &video_stream, output_file);

        struct SwsContext *sws_ctx = sws_getContext(WIDTH, HEIGHT, AV_PIX_FMT_UYVY422,
                                                    WIDTH, HEIGHT, AV_PIX_FMT_YUV420P,
                                                    SWS_BICUBIC, NULL, NULL, NULL);

        if (!sws_ctx) {
            fprintf(stderr, "Could not initialize SwsContext\n");
            exit(1);
        }

        capture_and_encode(fd, fmt_ctx, codec_ctx, sws_ctx, video_stream, DURATION);

        sws_freeContext(sws_ctx);
        av_write_trailer(fmt_ctx);
        avcodec_free_context(&codec_ctx);
        avformat_free_context(fmt_ctx);
        stop_capturing(fd);
        unmap_buffers();
        close(fd);

        return 0;
    }
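
    One likely contributor to the short durations (an inference from the code above, not something stated in the post): frame->pts advances by a fixed 1/FPS per dequeued frame, so every frame the device drops shortens the output timeline instead of leaving a gap, which matches a 1-minute recording coming out at 58-59 seconds (or 28-30 seconds at 1920x1080, where more frames are dropped). A minimal sketch of timestamp-based PTS, using the capture time that VIDIOC_DQBUF returns in buffer.timestamp, in place of the two pts lines in capture_and_encode:

    /* Sketch: derive PTS from the driver's capture timestamp so dropped
     * frames leave gaps instead of compressing the timeline.
     * Assumes an int64_t first_us initialized to -1 before the loop. */
    int64_t buf_us = (int64_t)buffer.timestamp.tv_sec * 1000000
                   + buffer.timestamp.tv_usec;
    if (first_us < 0)
        first_us = buf_us;                 /* first captured frame is t = 0 */

    /* Elapsed microseconds -> encoder time base ((AVRational){1, FPS}). */
    frame->pts = av_rescale_q(buf_us - first_us,
                              (AVRational){1, 1000000},
                              codec_ctx->time_base);

    With FPS = 30, a frame captured 2.0 seconds after the first one gets pts 60 no matter how many frames were dropped in between, so the container duration tracks wall-clock time.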
