Advanced search

Media (1)

Word: - Tags -/belgique

Other articles (104)

  • Customising by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present changes to your MédiaSPIP, or news from your projects, using the news section.
    In MédiaSPIP's default spipeo theme, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the form used to create a news item.
    News item creation form: for a document of the news type, the default fields are: publication date (customise the publication date) (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.

On other sites (7878)

  • Prevent ffmpeg from changing the intensity of colors while downscaling the resolution of the video

    29 August 2022, by dravit

    I have a use case where I need to downscale a 716x1280 mp4 video to 358x640 (half of the original). The command I used is:

    


    ffmpeg -i ./input.mp4 -vf "scale=640:640:force_original_aspect_ratio=decrease,pad=ceil(iw/2)*2:ceil(ih/2)*2" ./output.mp4


    


    Out of 10 sample videos, 2 of them suffered an impact on colors. Below I have attached a comparison from the one that was impacted the most.

    


    Comparison of frames from the most impacted video

    


    NOTE: The frame on the right is from the original video and the frame on the left is from the processed (downscaled) video. Notice the colors red and green in the image (even the skin color and hair color changed).

    


    What I am looking for:

    


      

    • Is there any way I can prevent changes like these from happening? Perhaps some flag for saturation, brightness, contrast or another parameter.
    • I am assuming that ffmpeg uses some default settings while downscaling a video. What made ffmpeg change colors only for these two videos? If it made similar changes for the rest of the videos as well, how can this behaviour be predicted beforehand?
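On the "predict beforehand" part: inputs carrying wide-gamut color tags can be flagged up front by probing their color metadata before processing. A sketch that only assembles an ffprobe invocation (the helper name is mine; ffprobe is not executed here):

```python
# Hypothetical helper: build an ffprobe call that reports a video
# stream's color metadata, one way to spot BT.2020/HLG inputs that
# may need special handling before scaling.
def build_probe_command(src):
    return [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries",
        "stream=color_range,color_space,color_primaries,color_transfer",
        "-of", "default=noprint_wrappers=1",
        src,
    ]

cmd = build_probe_command("./input.mp4")
print(" ".join(cmd))
```

Inputs reporting bt2020/arib-std-b67 here are the ones most likely to shift when their tags are dropped on re-encode.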


    


    EDIT:

    What have I already tried?

    


      

    • -crf with values 0 and 18.
    • -preset veryslow, as mentioned here.

    Neither helped.

    


    Mediainfo: input vs output

    param                      input                  output
    color range                Limited                NA (attribute not in description)
    color primaries            BT.2020                NA (attribute not in description)
    transfer characteristics   HLG                    NA (attribute not in description)
    matrix coefficients        BT.2020 non-constant   NA (attribute not in description)
    bit depth                  8                      8
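The mediainfo comparison shows the BT.2020/HLG color attributes missing from the output, so players fall back to interpreting it as ordinary BT.709, which matches the kind of shift described. One hedged workaround is to carry the source color tags through explicitly; this sketch only assembles the argument list (the flag values mirror the input stream info above, and this is an assumption about the cause, not a verified fix):

```python
# Sketch: build an ffmpeg command that re-asserts the input's HDR
# color metadata (BT.2020 primaries, HLG transfer) on the output,
# so players interpret both files the same way. ffmpeg itself is not
# invoked here; we only assemble the argument list.
def build_scale_command(src, dst):
    vf = ("scale=640:640:force_original_aspect_ratio=decrease,"
          "pad=ceil(iw/2)*2:ceil(ih/2)*2")
    return [
        "ffmpeg", "-i", src,
        "-vf", vf,
        # color tags copied from the input stream description
        "-colorspace", "bt2020nc",
        "-color_primaries", "bt2020",
        "-color_trc", "arib-std-b67",
        "-color_range", "tv",
        dst,
    ]

cmd = build_scale_command("./input.mp4", "./output.mp4")
print(" ".join(cmd))
```

These flags only tag the stream; an actual conversion to BT.709 (e.g. via the zscale filter) would be the alternative if the target players cannot handle HLG at all.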

    


    


    Logs of the ffmpeg command:

    


    ffmpeg -i ./input.mp4 -vf "scale=640:640:force_original_aspect_ratio=decrease,pad=ceil(iw/2)*2:ceil(ih/2)*2" -movflags +faststart ./output.mp4
ffmpeg version 4.3.1 Copyright (c) 2000-2020 the FFmpeg developers
  built with Apple clang version 12.0.0 (clang-1200.0.32.28)
  configuration: --prefix=/usr/local/Cellar/ffmpeg/4.3.1_9 --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack
  libavutil      56. 51.100 / 56. 51.100
  libavcodec     58. 91.100 / 58. 91.100
  libavformat    58. 45.100 / 58. 45.100
  libavdevice    58. 10.100 / 58. 10.100
  libavfilter     7. 85.100 /  7. 85.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  7.100 /  5.  7.100
  libswresample   3.  7.100 /  3.  7.100
  libpostproc    55.  7.100 / 55.  7.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from './input.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf58.45.100
  Duration: 00:00:30.05, start: 0.000000, bitrate: 10366 kb/s
    Stream #0:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, bt2020nc/bt2020/arib-std-b67), 716x1280, 10116 kb/s, 30 fps, 30 tbr, 19200 tbn, 38400 tbc (default)
    Metadata:
      handler_name    : Core Media Video
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 245 kb/s (default)
    Metadata:
      handler_name    : Core Media Audio
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
  Stream #0:1 -> #0:1 (aac (native) -> aac (native))
Press [q] to stop, [?] for help
[libx264 @ 0x7faab4808800] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x7faab4808800] profile High, level 3.0, 4:2:0, 8-bit
[libx264 @ 0x7faab4808800] 264 - core 161 r3027 4121277 - H.264/MPEG-4 AVC codec - Copyleft 2003-2020 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to './output.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf58.45.100
    Stream #0:0(und): Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 358x640, q=-1--1, 30 fps, 15360 tbn, 30 tbc (default)
    Metadata:
      handler_name    : Core Media Video
      encoder         : Lavc58.91.100 libx264
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
    Metadata:
      handler_name    : Core Media Audio
      encoder         : Lavc58.91.100 aac
[mp4 @ 0x7faab5808800] Starting second pass: moving the moov atom to the beginning of the file
frame=  901 fps=210 q=-1.0 Lsize=    3438kB time=00:00:30.02 bitrate= 938.0kbits/s speed=7.01x
video:2933kB audio:472kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.974633%
[libx264 @ 0x7faab4808800] frame I:6     Avg QP:22.60  size: 20769
[libx264 @ 0x7faab4808800] frame P:228   Avg QP:24.84  size:  7657
[libx264 @ 0x7faab4808800] frame B:667   Avg QP:27.59  size:  1697
[libx264 @ 0x7faab4808800] consecutive B-frames:  0.9%  0.9%  1.0% 97.2%
[libx264 @ 0x7faab4808800] mb I  I16..4:  9.5% 64.6% 26.0%
[libx264 @ 0x7faab4808800] mb P  I16..4:  2.5% 12.2%  2.5%  P16..4: 37.2% 20.6% 11.2%  0.0%  0.0%    skip:13.7%
[libx264 @ 0x7faab4808800] mb B  I16..4:  0.4%  2.1%  0.2%  B16..8: 42.2%  7.1%  1.2%  direct: 1.8%  skip:44.9%  L0:39.4% L1:52.8% BI: 7.8%
[libx264 @ 0x7faab4808800] 8x8 transform intra:72.2% inter:74.2%
[libx264 @ 0x7faab4808800] coded y,uvDC,uvAC intra: 61.8% 67.2% 20.2% inter: 16.7% 13.9% 1.3%
[libx264 @ 0x7faab4808800] i16 v,h,dc,p: 24% 19%  7% 50%
[libx264 @ 0x7faab4808800] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 21% 16% 15%  6%  9% 11%  7% 10%  6%
[libx264 @ 0x7faab4808800] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 25% 16% 13%  7%  9% 10%  7%  9%  4%
[libx264 @ 0x7faab4808800] i8c dc,h,v,p: 53% 16% 26%  5%
[libx264 @ 0x7faab4808800] Weighted P-Frames: Y:3.9% UV:1.8%
[libx264 @ 0x7faab4808800] ref P L0: 57.8% 19.5% 14.8%  7.8%  0.1%
[libx264 @ 0x7faab4808800] ref B L0: 90.7%  7.2%  2.1%
[libx264 @ 0x7faab4808800] ref B L1: 95.3%  4.7%
[libx264 @ 0x7faab4808800] kb/s:799.80
[aac @ 0x7faab2036a00] Qavg: 189.523


    


  • RTSP to HLS via FFMPEG, latency issues

    28 June 2024, by Pabl0

    The following are all the steps that I took to render an RTSP stream in my web app:

    


    How to display RTSP stream in browser using HLS

    


    Situation and Problem

    You have an RTSP stream that you want to display in a browser using HLS (HTTP Live Streaming). However, when you try to play the stream in the browser using hls.js, you encounter the error "Unsupported HEVC in M2TS found." This error indicates that the HLS stream uses the HEVC (H.265) codec, which is not widely supported by browsers and HLS players, including hls.js.

    


    The most reliable solution is to transcode the stream from H.265 to H.264 using FFmpeg, which is more broadly supported. Here's how:

    


    Step 1: Transcode the Stream Using FFmpeg

    Run the following FFmpeg command to transcode the RTSP stream from H.265 to H.264 and generate the HLS segments:

    


    ffmpeg -i rtsp://192.168.144.25:8554/main.264 -c:v libx264 -c:a aac -strict -2 -hls_time 10 -hls_list_size 0 -f hls C:\path\to\output\index.m3u8


    


    • -c:v libx264 sets the video codec to H.264.
    • -c:a aac sets the audio codec to AAC.
    • -hls_time 10 sets the duration of each segment to 10 seconds.
    • -hls_list_size 0 tells FFmpeg to keep all segments in the playlist.
    • -f hls specifies the output format as HLS.
    • C:\path\to\output\ is the directory where the HLS files will be saved; make sure it is the directory where you want them.

    


    Step 2: Verify the HLS Files

    After running the FFmpeg command, verify that the following files are generated in the output directory:

    • index.m3u8 (HLS playlist file)
    • Multiple .ts segment files (e.g., index0.ts, index1.ts, etc.)
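This verification step can be scripted; a small sketch (the directory layout is assumed from the steps above, with an `index` playlist and `index*.ts` segments):

```python
# Sketch: check that an HLS output directory contains the playlist
# and at least one .ts segment, mirroring the manual check in step 2.
from pathlib import Path

def hls_outputs(directory):
    out = Path(directory)
    playlist = out / "index.m3u8"
    segments = sorted(out.glob("index*.ts"))
    # returns (playlist present?, segment filenames found)
    return playlist.exists(), [s.name for s in segments]
```

For the example above this would be called as `hls_outputs(r"C:\path\to\output")`.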

    


    Step 4: Serve the HLS Files with an HTTP Server

    Navigate to the directory containing the HLS files and start an HTTP server:

    cd C:\path\to\output
    python -m http.server 8000

    Step 5: Update and Test the HTML File

    Ensure that the hls_test.html file is in the same directory as the HLS files and update it as needed:

    


    hls_test.html:

<!DOCTYPE html>
<html>
<head>
    <title>HLS Stream Test</title>
</head>
<body>
    <h1>HLS Stream Test</h1>
    <button id="playButton">Play Stream</button>
    <video id="video" controls style="width: 100%; height: auto;"></video>
    <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
    <script>
        document
            .getElementById('playButton')
            .addEventListener('click', () => {
                const video = document.getElementById('video');
                if (Hls.isSupported()) {
                    const hls = new Hls();
                    hls.loadSource('http://localhost:8000/index.m3u8');
                    hls.attachMedia(video);
                    hls.on(Hls.Events.MANIFEST_PARSED, function () {
                        video.play().catch((error) => {
                            console.error('Error attempting to play:', error);
                        });
                    });
                    hls.on(Hls.Events.ERROR, function (event, data) {
                        console.error('HLS Error:', data);
                    });
                } else if (
                    video.canPlayType('application/vnd.apple.mpegurl')
                ) {
                    video.src = 'http://localhost:8000/index.m3u8';
                    video.addEventListener('canplay', function () {
                        video.play().catch((error) => {
                            console.error('Error attempting to play:', error);
                        });
                    });
                } else {
                    console.error('HLS not supported in this browser.');
                }
            });
    </script>
</body>
</html>

    Step 6: Open the HTML File in Your Browser

    Open your browser and navigate to:

    http://localhost:8000/hls_test.html

    Click the "Play Stream" button to start playing the HLS stream. If everything is set up correctly, you should see the video playing in the browser.

    Conclusion

    By transcoding the RTSP stream from H.265 to H.264 and serving it as an HLS stream, you can display the video in a browser using hls.js. This approach ensures broader compatibility with browsers and HLS players, allowing you to stream video content seamlessly.

    PART 2: Add this method to the React app

    We are assuming that the ffmpeg command is running in the background and generating the HLS stream; if not, please refer to steps 1-3 of the previous section. Now we will create a React component that plays the HLS stream in the browser using the video.js library.

    Step 1: Create the Camera Component

import { useRef } from 'react';
import videojs from 'video.js';
import 'video.js/dist/video-js.css';

const Camera = ({ streamUrl }) => {
    const videoRef = useRef(null);
    const playerRef = useRef(null);

    const handlePlayClick = () => {
        const videoElement = videoRef.current;
        if (videoElement) {
            playerRef.current = videojs(videoElement, {
                controls: true,
                autoplay: false,
                preload: 'auto',
                sources: [
                    {
                        src: streamUrl,
                        type: 'application/x-mpegURL',
                    },
                ],
            });

            playerRef.current.on('error', () => {
                const error = playerRef.current.error();
                console.error('VideoJS Error:', error);
            });

            playerRef.current.play().catch((error) => {
                console.error('Error attempting to play:', error);
            });
        }
    };

    return (
        <div>
            <button onClick={handlePlayClick}>Play Stream</button>
            <video ref={videoRef} className="video-js" />
        </div>
    );
};

export default Camera;

    Note: This component uses the video.js library to play the HLS stream. Make sure to install video.js using npm or yarn:

    npm install video.js

    Step 2: Use the Camera Component in Your App

    Now you can use the Camera component in your React app to display the HLS stream. For example:

    <Camera streamUrl="http://localhost:8000/index.m3u8" />

    Note: we are pointing to the HLS stream URL generated by FFmpeg in the previous steps.

    Step 3: Create the CORS Proxy Server and place it where the HLS files are stored.

from http.server import SimpleHTTPRequestHandler
import socketserver
import os

class CORSRequestHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # allow cross-origin requests from the React dev server
        self.send_header('Access-Control-Allow-Origin', '*')
        if self.path.endswith('.m3u8'):
            self.send_header('Content-Type', 'application/vnd.apple.mpegurl')
        elif self.path.endswith('.ts'):
            self.send_header('Content-Type', 'video/MP2T')
        super().end_headers()

if __name__ == '__main__':
    port = 8000
    handler = CORSRequestHandler
    web_dir = r'C:\Video_cam_usv'
    os.chdir(web_dir)
    httpd = socketserver.TCPServer(('', port), handler)
    print(f"Serving HTTP on port {port}")
    httpd.serve_forever()

    Note: Change web_dir to the directory where the HLS files are stored.

    Also note that the server sends the correct MIME types for .m3u8 and .ts files:

    .m3u8 should be application/vnd.apple.mpegurl or application/x-mpegURL.
    .ts should be video/MP2T.

    Step 4: Start the CORS Proxy Server

    Open a terminal, navigate to the directory where the CORS proxy server script is located (the same one where the HLS files are saved), and run:

    python cors_proxy_server.py

    This will start the CORS proxy server on port 8000 and serve the HLS files with the correct MIME types.

    Step 5: Start the React App

    Start your React app using the following command:

    npm run dev

    I have tried everything above (it's my own doc to keep track of the steps I've taken so far) and I get the stream to render in my web app, but the latency is very high, at least 5-10 seconds. How can I make it real time, or close to that?
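On the latency question: plain HLS latency is typically a few segment durations, so 10-second segments alone put playback well behind live. A hedged sketch assembling a shorter-segment variant of the earlier command (the values are illustrative starting points, not settings tested against this camera):

```python
# Sketch: build a lower-latency variant of the HLS transcode command.
# Shorter segments plus a bounded, self-pruning playlist reduce how
# much video a player must buffer before it can start.
def build_hls_command(rtsp_url, out_playlist):
    return [
        "ffmpeg", "-i", rtsp_url,
        "-c:v", "libx264",
        "-preset", "veryfast",      # faster encode, less queuing
        "-tune", "zerolatency",     # x264 low-latency tuning
        "-g", "30",                 # keyframe every second at 30 fps
        "-c:a", "aac",
        "-hls_time", "1",           # 1-second segments instead of 10
        "-hls_list_size", "3",      # keep only a few live segments
        "-hls_flags", "delete_segments",
        "-f", "hls", out_playlist,
    ]

cmd = build_hls_command("rtsp://192.168.144.25:8554/main.264", "index.m3u8")
print(" ".join(cmd))
```

On the player side, hls.js settings such as liveSyncDurationCount are also worth experimenting with; for genuinely sub-second latency, WebRTC is usually a better fit than HLS.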

  • fluent-ffmpeg error: ffmpeg exited with code 1: At least one output file must be specified

    18 September 2023, by 김동환

    I'm trying to create a simple video from several images in Node.js.

const ffmpeg = require('fluent-ffmpeg');

ffmpeg().input('input.txt')
    .inputOption(["-f concat"])
    .outputOptions("-c:v libx264 -r 30 -pix_fmt yuv420p")
    .output('output.mp4')
    .on('start', (commandLine) => {
        console.log(`FFmpeg command: ${commandLine}`);
    })
    .on('end', () => {
        console.log('completed');
    })
    .on('error', (err) => {
        console.error('error occurred: ' + err.message);
    })
    .run();

    and input.txt is:

file 'test1.png'
duration 2
file 'test2.png'
duration 30
file 'test3.png'
duration 4

    I don't know why, but this code doesn't work. Yet if I copy and paste the command printed as $commandLine into a terminal, it works!?

PS C:\Users\donghwan\Documents\GitHub\create_video\src> node test.js
FFmpeg command: ffmpeg -f concat -i input.txt -y -c:v libx264 -r 30 -pix_fmt yuv420p output.mp4
error occurred: ffmpeg exited with code 1: At least one output file must be specified

PS C:\Users\donghwan\Documents\GitHub\create_video\src> ffmpeg -f concat -i input.txt -y -c:v libx264 -r 30 -pix_fmt yuv420p output.mp4
ffmpeg version 2023-08-20-git-f0b1cab538-full_build-www.gyan.dev Copyright (c) 2000-2023 the FFmpeg developers

and so on...

    I'm so confused; I tried to find another explanation for this, but couldn't find anything.
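For what it's worth, the symptom here (the printed command line works when pasted, but the library call fails) is consistent with option strings being passed to ffmpeg as single argv entries: a shell splits "-f concat" into two tokens, while an exec-style spawn passes the string through verbatim, so ffmpeg sees one unknown option and never reaches the output file. A sketch of the distinction (fluent-ffmpeg itself is not involved; whether its option handling is the actual culprit in this version is an assumption worth checking against its docs, which accept arrays of individual flags):

```python
import shlex

# What the shell hands ffmpeg when the printed command line is pasted:
# every flag and value becomes its own argv token.
pasted = shlex.split(
    "ffmpeg -f concat -i input.txt -y "
    "-c:v libx264 -r 30 -pix_fmt yuv420p output.mp4"
)

# What an exec-style call sees if "-f concat" stays one string:
# ffmpeg receives the single unknown token "-f concat".
unsplit = ["ffmpeg", "-f concat", "-i", "input.txt", "output.mp4"]

print(pasted[1:3])   # two separate tokens
print(unsplit[1])    # one fused token
```

The equivalent fluent-ffmpeg-side experiment would be passing each flag and value as its own array element, e.g. `.inputOptions(["-f", "concat"])` and `.outputOptions(["-c:v", "libx264", "-r", "30", "-pix_fmt", "yuv420p"])`.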