Other articles (73)

  • List of compatible distributions

    26 April 2011

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.
    Distribution name    Version name            Version number
    Debian               Squeeze                 6.x.x
    Debian               Wheezy                  7.x.x
    Debian               Jessie                  8.x.x
    Ubuntu               The Precise Pangolin    12.04 LTS
    Ubuntu               The Trusty Tahr         14.04
    If you want to help us improve this list, you can provide us with access to a machine whose distribution is not mentioned above, or send us the necessary fixes to add (...)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • Possible deployments

    31 January 2010

    Two types of deployment are possible, depending on two aspects: the intended installation method (standalone or as a farm); the expected number of daily encodings and the expected traffic.
    Video encoding is a heavy process that consumes a great deal of system resources (CPU and RAM), and all of this has to be taken into account. The system is therefore only feasible on one or more dedicated servers.
    Single-server version
    The single-server version consists in using only one (...)

On other sites (6976)

  • Conversion failed. 2 frames left in the queue on closing ffmpeg

    28 August 2020, by AlexZheda

    My simplified ffmpeg command (the longer one has over 300 files) is the following.

    



    ffmpeg -i "v1.mp4" -i "v2.mp4" -i "v3.mp4"
    -filter_complex "[0:v:0][1:v:0][2:v:0]concat=n=3:v=1:a=0,fps=fps=30[cv1]; 
        [0:a:0][1:a:0][2:a:0]concat=n=3:v=0:a=1,asetpts=N/SR/TB[ca1]; 
        [cv1]setpts=0.25*PTS[v4]; 
        [ca1]atempo=4,asetpts=N/SR/TB[a4]" 
    -c:v h264_nvenc -map "[v4]" -map "[a4]" x4_output_0.mp4


    



    The video encoding works at first but then breaks, and the output file seems to be truncated. The output files are nearly the size they should be, but they can't be read.

    



    Video encoding failed
[aac @ 00000248a7856840] Qavg: 325.600
[aac @ 00000248a7856840] 2 frames left in the queue on closing
[aac @ 00000248a78595c0] Qavg: 236.279
[aac @ 00000248a78595c0] 2 frames left in the queue on closing
[aac @ 00000248a7855140] Qavg: 2729.299
[aac @ 00000248a7855140] 2 frames left in the queue on closing
[aac @ 00000248a785bec0] Qavg: 1158.664
[aac @ 00000248a785bec0] 2 frames left in the queue on closing
Conversion failed!


    



      

    1. Does the error have anything to do with the audio part of the .mp4, since the messages mention aac @ ...?
    2. What does Qavg mean in the error message?
    3. What is the difference in the video stream between codec_time_base and time_base (see the differences in the video attribute frequencies below)?


    



    Below are the frequencies of the video attributes that have more than one distinct value across all the videos. Each entry is of the form [(value, frequency), (value, frequency), ...].

    



    codec_time_base --- [('1/60', 384), ('1001/60000', 7), ('50/2997', 1)]
has_b_frames --- [(0, 336), (2, 56)]
level --- [(31, 336), (30, 56)]
r_frame_rate --- [('30/1', 384), ('30000/1001', 7), ('2997/100', 1)]
avg_frame_rate --- [('30/1', 384), ('30000/1001', 7), ('2997/100', 1)]
time_base --- [('1/30', 383), ('1/30000', 7), ('1/2997', 1), ('1/15360', 1)]


    



    The same for the audio attributes in all those video files.

    



    codec_time_base --- [('1/48000', 386), ('1/44100', 6)]
sample_rate --- [('48000', 386), ('44100', 6)]
time_base --- [('1/48000', 386), ('1/44100', 6)]
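
    For reference, per-stream attributes like these can be dumped with ffprobe. A minimal sketch for a single input (assuming ffprobe is on the PATH and an FFmpeg 4.x-era build, where codec_time_base is still reported):

    ffprobe -v error -select_streams v:0 -show_entries stream=codec_time_base,has_b_frames,level,r_frame_rate,avg_frame_rate,time_base -of default=noprint_wrappers=1 "v1.mp4"

    ffprobe -v error -select_streams a:0 -show_entries stream=codec_time_base,sample_rate,time_base -of default=noprint_wrappers=1 "v1.mp4"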


    



      

    Is it possible that something is not right with some of the video files here, causing the encoding to break down?


    


  • ffmpeg commands execution failed when run from Java. Executed successfully when the command is run in the terminal

    28 September 2022, by Kaung Myat Khaing

    I'm working on a Java project that deals with audio file manipulation (I'm a newbie to Java). I want to get the start and end times of silence in the audio with the ffmpeg "silencedetect" audio filter and write the results to a text file.
I run the filter command within the following function, but it always results in failure (prints "Silence Detection failed"). Then I tried running the command in the terminal and it was successful. I don't know what's wrong with my Java code. I have other functions that execute other ffmpeg commands with the same flow as this one, and they work fine.
Here's the function.

    


    public static void getSilence(String filePath) throws InterruptedException, IOException {
        //                detect silence
        System.out.println("Silence Detection cmd: ffmpeg -i " + filePath
                + " -af silencedetect=d=1:noise=-60dB -f null - |& awk '/silencedetect/ {print $4,$5}' > "
                + SOURCE_PATH + "/silence_detections/silenceInfo.txt");
        Process silenceDetect = Runtime.getRuntime().exec("ffmpeg -i " + filePath
                + " -af silencedetect=d=1:noise=-60dB -f null - |& awk '/silencedetect/ {print $4,$5}' > "
                + SOURCE_PATH + "/silence_detections/silenceInfo.txt");

        if (silenceDetect.waitFor() == 0) {
            System.out.println("Silence Detection successful");
        } else System.out.println("Silence Detection failed");
    }
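
    One thing worth noting (not part of the original post): Runtime.exec does not start a shell, so |&, the pipe to awk and the > redirection are passed to ffmpeg as literal arguments rather than being interpreted. A minimal sketch of the same function run through bash instead, assuming a Unix-like system with bash available and the same SOURCE_PATH constant:

    public static void getSilence(String filePath) throws InterruptedException, IOException {
        // Build the whole pipeline as one shell command string;
        // 2>&1 replaces |& so it also works outside bash 4+.
        String cmd = "ffmpeg -i " + filePath
                + " -af silencedetect=d=1:noise=-60dB -f null - 2>&1"
                + " | awk '/silencedetect/ {print $4,$5}' > "
                + SOURCE_PATH + "/silence_detections/silenceInfo.txt";

        // Let bash interpret the pipe and the redirection instead of ffmpeg
        Process silenceDetect = new ProcessBuilder("bash", "-c", cmd).start();

        if (silenceDetect.waitFor() == 0) {
            System.out.println("Silence Detection successful");
        } else {
            System.out.println("Silence Detection failed");
        }
    }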


    


  • Ffmpeg: How to capture audio and metadata simultaneously

    12 January 2020, by ChrisDoernen

    I am developing a multi-platform app for live audio streaming written in JS. The goal is to get the meters/volumes per channel while capturing audio from the sound card. This has to be done in one command, since I get the error "device or resource busy" when firing multiple commands with the same input.

    Capturing audio works fine using this command:

    ffmpeg -y -f alsa -i hw:CARD=PCH,DEV=0 -ac 2 -b:a 192k -acodec libmp3lame -f mp3 -probesize 64 -rtbufsize 64 -reservoir 0 -fflags +nobuffer -hide_banner pipe:1

    Getting the volume for the right channel works with this command (the left channel is analogous, providing 0.0.1 to -map_channel):

    ffmpeg -f alsa -i hw:CARD=PCH,DEV=0 -map_channel 0.0.0 -af ebur128=metadata=1,ametadata=print:key=lavfi.r128.M -f null pipe:1

    The question is how to combine these, providing a way to pipe the outputs correctly.

    As a first step, my current approach is to use the file argument of the ametadata filter (documentation here) and write to a socket opened with the following JS code

    var net = require('net');

    var server = net.createServer(function (stream) {
        // Log whatever arrives on the socket
        stream.on('data', function (c) { console.log('data:', c.toString()); });
    });

    server.listen('/tmp/test.sock');

    like this:

    ffmpeg -f alsa -i hw:CARD=PCH,DEV=0 -map_channel 0.0.1 -af ebur128=metadata=1,ametadata=mode=print:key=lavfi.r128.M:file=unix\:/tmp/test.sock:direct -f null -

    but the socket receives no data and there is no error in ffmpeg.

    Redirecting the output of the streaming command to the socket, however, works:

    ffmpeg -y -f alsa -i hw:CARD=PCH,DEV=0 -ac 2 -b:a 192k -acodec libmp3lame -f mp3 -probesize 64 -rtbufsize 64 -reservoir 0 -fflags +nobuffer -hide_banner unix:/tmp/test.sock

    I am wondering what I am missing and whether I am on the right track altogether.
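
    For what it's worth, one possible direction (a sketch only, not taken from the original post, assuming the same ALSA device and socket path) is to split the captured audio inside a single filter_complex so that one ffmpeg process owns the device: one branch is encoded to MP3 as before, the other goes through ebur128/ametadata and is discarded with anullsink:

    ffmpeg -f alsa -i hw:CARD=PCH,DEV=0 -filter_complex "[0:a]asplit=2[enc][meter];[meter]ebur128=metadata=1,ametadata=mode=print:key=lavfi.r128.M:file=unix\:/tmp/test.sock:direct,anullsink" -map "[enc]" -ac 2 -b:a 192k -acodec libmp3lame -f mp3 pipe:1

    Since only one process opens hw:CARD=PCH,DEV=0, this also sidesteps the "device or resource busy" error mentioned above.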