Advanced search

Media (0)

Keyword: - Tags - /xmlrpc

No media matching your criteria is available on the site.

Other articles (78)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors can modify their information on the authors page

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP means:
    Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use within the Semantic Web.
    XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)
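
    For illustration only (the article itself does not mention any tool), this kind of XMP metadata can be read and written from the command line with exiftool; a minimal sketch, where photo.jpg is a hypothetical file:

    # Read all XMP tags embedded in a file
    exiftool -XMP:All photo.jpg
    # Write the kind of information mentioned above (title, author)
    exiftool -XMP-dc:Title="Sample title" -XMP-dc:Creator="Sample author" photo.jpg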

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in MP4, Ogv and WebM (all supported by HTML5), with MP4 also supported by Flash.
    Audio files are encoded in MP3 and Ogg (both supported by HTML5), with MP3 also supported by Flash.
    Where possible, text is analyzed in order to retrieve the data needed for search-engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
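
    As an illustration of the conversions described above (these are not MediaSPIP’s actual encoding settings, and the file names are hypothetical), equivalent stand-alone ffmpeg commands would look roughly like this:

    # Video: HTML5-friendly formats plus MP4
    ffmpeg -i upload.mov -c:v libx264 -c:a aac -movflags +faststart upload.mp4
    ffmpeg -i upload.mov -c:v libvpx -c:a libvorbis upload.webm
    ffmpeg -i upload.mov -c:v libtheora -c:a libvorbis upload.ogv
    # Audio: MP3 and Ogg
    ffmpeg -i upload.wav -c:a libmp3lame upload.mp3
    ffmpeg -i upload.wav -c:a libvorbis upload.ogg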

On other sites (7677)

  • Gstreamer video increases latency with decreased FPS

    19 November 2024, by Ri Di

    I am using an RPi 5 to stream the video:

    


    rpicam-vid -t 0 --camera 0 --nopreview --mode 2304:1296:10:P --codec yuv420 \
        --width 640 --height 360 --framerate 10 --rotation 0 \
        --autofocus-mode manual --inline --listen -o - |
    ffmpeg -f rawvideo -pix_fmt yuv420p -s:v 640x360 -r 10 -i /dev/stdin \
        -c:v libx264 -preset ultrafast -tune zerolatency -maxrate 300k \
        -bufsize 50k -g 30000 -f mpegts tcp://192.168.0.147:1234


    


    View it with:

    


    gst-launch-1.0 -v tcpserversrc host=0.0.0.0 port=1234 ! queue ! tsdemux ! h264parse ! avdec_h264 ! videorate ! video/x-raw,framerate=10/1 ! videoconvert ! autovideosink sync=false


    


    The problem is that with 10 FPS I get around 2 s of latency, while 56 or 120 FPS results in below 300 ms of latency.

    


    Is the problem on the sender side or the receiver side? Or both?

    


    I am not planning to use 10 FPS; it is only to demonstrate the problem. But I would like to get lower latency at 56 FPS, just like at 120 FPS (around an 80-100 ms difference), or maybe even better, as latency seems to drop with higher FPS.

    


    Maybe there is some kind of buffering parameter that holds frames back?

    


    (Of course, when testing with higher FPS I change both framerate values in the sender command and the one in the reader command. The camera is the official Raspberry Pi Camera Module 3.)

    


    I'd also like to mention that the same thing happens with ffplay:

    


    ffplay -probesize 3000 -i tcp://0.0.0.0:1234/?listen
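
    One possible direction to test, offered here only as a hedged sketch and not confirmed by the post: the player's input probing buffers a fixed amount of data before playback starts, and at 10 FPS (lower bitrate) that buffer takes longer to fill. Shrinking the probing on the reader side with standard ffplay/FFmpeg input options might look like this (values are guesses to experiment with, not tuned settings):

    # Sketch only: reduce input probing/buffering on the player side
    ffplay -fflags nobuffer -flags low_delay -probesize 32 -analyzeduration 0 -i tcp://0.0.0.0:1234/?listen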


    


  • Capturing actual video stream quality using FFmpeg and FFplay – smooth recording but stuttering stream

    3 December 2024, by Karlitos Bemowo

    I’m trying to analyze the quality of a video stream over a local network using FFmpeg. My goal is to capture the actual streamed video (with all artifacts, stuttering, or buffering) as experienced in real-time, rather than a reconstructed, smooth output. Currently, the recorded video is perfectly smooth, but the live stream played using ffplay exhibits noticeable stuttering. Additionally, while the duration of the recording matches the original video, the live stream lasts longer, likely due to buffering caused by network instability.

    


    Here are the commands I’m using:

    Sender:

    ffmpeg -re -i file.mp4 -c:v libx265 -preset veryfast -f mpegts udp://192.168.1.26:1234

    Receiver:

    ffmpeg -i udp://192.168.1.10:1234 -c copy received_fhd.mp4 -fflags nobuffer

    


    The recorded file received_fhd.mp4 plays back perfectly, with no stuttering or frame drops, even though the network stream has visible issues when viewed in real time with ffplay. The recorded file has the same duration as the original video (fhd.mp4), while the real-time stream playback with ffplay takes longer, suggesting delays caused by buffering or network issues. The same issue occurs with other codecs, for example VP9.

    


    GOAL: I want to capture the actual streamed video as experienced in real time, with all stuttering, buffering, and artifacts included. This would allow me to analyze the network's impact on the video quality effectively.
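
    One way such a recording might be approached, offered purely as a sketch (the post does not use these options): timestamp incoming frames with the wall clock on the receiving side and re-encode at a constant frame rate, so that network stalls show up as frozen or duplicated frames in the file instead of being smoothed out:

    # Sketch: stamp packets with arrival time, then force constant frame rate on output.
    # The address follows the post's receiver command; the output name and the 30 fps
    # target are assumptions. -fps_mode cfr needs FFmpeg 5.1+ (older builds: -vsync cfr).
    ffmpeg -use_wallclock_as_timestamps 1 -i udp://192.168.1.10:1234 \
           -c:v libx264 -preset veryfast -r 30 -fps_mode cfr as_received.mp4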

    


    Questions:

    1. How can I modify my commands to record the video stream exactly as it is experienced in real time, including any network-induced issues?

    2. Is there a way to configure FFmpeg to behave more like ffplay in terms of handling and recording the stream?


    


    Any guidance or alternative approaches would be greatly appreciated. Thank you!

    


    I experimented with different codecs (e.g., H.265, VP9) and output formats. Older codecs showed no delay in transmission, so I didn't notice the issue at first.

    


    Explored various ffmpeg options like -fflags nobuffer and -c copy (as documentation says " For example, -codec copy or -codec : copy would copy all the streams without reencoding."), but it still doesn’t replicate the actual stream behavior.