Other articles (70)

  • Taking part in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You just need to subscribe to the translators' mailing list to ask for more information.
    At the moment, MediaSPIP is only available in French and (...)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the MediaSPIP sources, in standalone form.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make other changes (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources, in standalone form.
    To get a working installation, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make other changes (...)

On other sites (11794)

  • FFMPEG cannot find H264 encoder

    31 July 2020, by Serban Stoenescu

    I am running ffmpeg from a program, and we built ffmpeg ourselves (we did not install it from a package manager or pre-built binaries).
This is the command that is built:

    


    2020-07-31 12:14:11.942 INFO ffmpeg::FFTranscoder Executing: "../deps/ffmpeg/bin/ffmpeg" -y -r 26.062 -i ""/home/sstoenescu/Work/myproject/sample/myvideo.h264"" -c:v openh264 -vcodec h264 -movflags +faststart ""/home/sstoenescu/Work/myproject/sample/myvideo.mp4""


    


    The file myvideo.h264 is valid: I can play it with ffplay and run ffprobe on it without problems.
However, the command above gives this error:

    


    ffmpeg version N-98068-g16bdc2b541 Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 7 (Ubuntu 7.5.0-3ubuntu1~18.04)
  configuration: --arch=x86_64 --cpu=nehalem --disable-ffplay --enable-shared --enable-network --enable-gpl --enable-libx265 --enable-encoder=libx265 --disable-hwaccels --enable-hwaccel=h264_qsv --enable-hwaccel=h264_vaapi
  libavutil      56. 50.100 / 56. 50.100
  libavcodec     58. 90.100 / 58. 90.100
  libavformat    58. 44.100 / 58. 44.100
  libavdevice    58.  9.103 / 58.  9.103
  libavfilter     7. 84.100 /  7. 84.100
  libswscale      5.  6.101 /  5.  6.101
  libswresample   3.  6.100 /  3.  6.100
  libpostproc    55.  6.100 / 55.  6.100
[h264 @ 0x5597c38defc0] error while decoding MB 86 45, bytestream -30
[h264 @ 0x5597c38defc0] concealing 2723 DC, 2723 AC, 2723 MV errors in I frame
Input #0, h264, from '/home/sstoenescu/Work/edvr/laba/gaoazar.h264':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 25 fps, 25 tbr, 1200k tbn, 50 tbc
Multiple -c, -codec, -acodec, -vcodec, -scodec or -dcodec options specified for stream 0, only the last option '-c:v h264' will be used.
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> h264 (h264_v4l2m2m))
Press [q] to stop, [?] for help
[h264 @ 0x5597c38e30c0] error while decoding MB 86 45, bytestream -30
[h264 @ 0x5597c38e30c0] concealing 2723 DC, 2723 AC, 2723 MV errors in I frame
[h264_v4l2m2m @ 0x5597c3905500] Could not find a valid device
[h264_v4l2m2m @ 0x5597c3905500] can't configure encoder
Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height


    


    It used to work and I did not change anything in the command, so I'm assuming it's something in the way we build ffmpeg. We are using these flags: --enable-decoder=h264 --enable-encoder=h264_qsv --enable-encoder=h264_vaapi --enable-decoder=h264 --enable-decoder=h264_qsv
Here's the full configuration:

    


    ./configure \
        --arch=x86_64 --cpu=nehalem \
        --disable-ffplay --enable-shared \
        --enable-network \
        --enable-gpl --enable-libx265 \
        --enable-encoder=libx265 \
        --disable-hwaccels --enable-hwaccel=h264_qsv --enable-hwaccel=h264_vaapi \
        #--enable-encoder=libopenh264 \
        --enable-encoder=h264_qsv \
        --enable-encoder=h264_vaapi --enable-encoder=mjpeg --enable-encoder=gif \
        --enable-decoder=h264 --enable-decoder=h264_qsv \
        #--enable-decoder=libopenh264
        --enable-libx264 \
        --enable-decoder=png \
        --enable-protocol=tcp \
        --enable-demuxer=rtsp --enable-demuxer=mov --enable-demuxer=h264 --enable-demuxer=image2 \
        --enable-muxer=mp4 --enable-muxer=h264 --enable-muxer=image2 \
        --prefix="./build" \
        --disable-doc


    


    Any ideas what could be wrong?
Thanks.

    


    Regards,
Serban
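
    A minimal sketch, assuming the same ../deps/ffmpeg/bin/ffmpeg binary as in the command above, of how the calling program could first ask the build which H.264 encoders it actually exposes (via ffmpeg's -encoders listing) before constructing the transcode command; available_h264_encoders is a hypothetical helper name:

    import subprocess

    # Hypothetical path, taken from the command shown above
    FFMPEG = "../deps/ffmpeg/bin/ffmpeg"

    def available_h264_encoders():
        # "ffmpeg -encoders" lists every encoder compiled into this particular build;
        # keep only the lines that mention h264
        result = subprocess.run(
            [FFMPEG, "-hide_banner", "-encoders"],
            capture_output=True, text=True,
        )
        return [line.strip() for line in result.stdout.splitlines()
                if "h264" in line.lower()]

    if __name__ == "__main__":
        print("\n".join(available_h264_encoders()))

    If neither libx264 nor libopenh264 appears in that list, a plain -c:v h264 can only fall back to whatever hardware encoder was built in, which matches the h264_v4l2m2m failure in the log above.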

    


  • Interpreting ffmpeg output in Python

    11 June 2020, by Luka Milivojevic

    I started working with FFmpeg and I want to create a list containing the start and end timestamps of silence intervals. I can print these intervals using FFmpeg, but I need to format that output so it is a bit more readable, which is why I want to build a list out of it and then print it using a custom function. I know that I should go with a regex here, but I am not sure how to write it, nor how to read the FFmpeg console output. My function for silence detection looks like this:

    



    import subprocess

    def detect_silence_ffmpeg():
        # silencedetect (threshold -40 dB, minimum silence 0.5 s); its results are printed to stderr
        command = r"ffmpeg -i audio.wav -af silencedetect=n=-40dB:d=0.5 -f null - "
        subprocess.call(command, shell=True)


    



    And the output of this function on a 7-second-long sample video is:

    



    ffmpeg version git-2020-06-03-b6d7c4c Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9.3.1 (GCC) 20200523
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
  libavutil      56. 49.100 / 56. 49.100
  libavcodec     58. 90.100 / 58. 90.100
  libavformat    58. 44.100 / 58. 44.100
  libavdevice    58.  9.103 / 58.  9.103
  libavfilter     7. 84.100 /  7. 84.100
  libswscale      5.  6.101 /  5.  6.101
  libswresample   3.  6.100 /  3.  6.100
  libpostproc    55.  6.100 / 55.  6.100
Guessed Channel Layout for Input Stream #0.0 : stereo
Input #0, wav, from 'audio.wav':
  Metadata:
    encoder         : Lavf58.44.100
  Duration: 00:00:07.34, bitrate: 1411 kb/s
    Stream #0:0: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 44100 Hz, stereo, s16, 1411 kb/s
Stream mapping:
  Stream #0:0 -> #0:0 (pcm_s16le (native) -> pcm_s16le (native))
Press [q] to stop, [?] for help
Output #0, null, to 'pipe:':
  Metadata:
    encoder         : Lavf58.44.100
    Stream #0:0: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s
    Metadata:
      encoder         : Lavc58.90.100 pcm_s16le
[silencedetect @ 00000202fc71e680] silence_start: 0
[silencedetect @ 00000202fc71e680] silence_end: 1.16374 | silence_duration: 1.16374
[silencedetect @ 00000202fc71e680] silence_start: 1.94558
[silencedetect @ 00000202fc71e680] silence_end: 3.41345 | silence_duration: 1.46787
[silencedetect @ 00000202fc71e680] silence_start: 3.8578
[silencedetect @ 00000202fc71e680] silence_end: 5.84844 | silence_duration: 1.99063
[silencedetect @ 00000202fc71e680] silence_start: 6.43653
size=N/A time=00:00:07.33 bitrate=N/A speed= 308x    
video:0kB audio:1264kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
[silencedetect @ 00000202fc71e680] silence_end: 7.33868 | silence_duration: 0.902154


    



    And this code should eventually run on videos that are an hour or so long, so I really need to find a way to format this output better. That would be it; any help would be much appreciated :)

    



    P.S.: the idea is that this should mainly work on Windows, but if it can be cross-platform too, that would be great.
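
    A minimal sketch of the kind of parsing described above, assuming the same audio.wav and silencedetect settings; silence_intervals is a hypothetical helper name, and the regexes simply pick the silence_start / silence_end values out of the stderr lines shown in the log:

    import re
    import subprocess

    def silence_intervals(path="audio.wav", noise="-40dB", duration=0.5):
        # silencedetect writes its results to stderr, so capture that stream as text
        command = [
            "ffmpeg", "-i", path,
            "-af", f"silencedetect=n={noise}:d={duration}",
            "-f", "null", "-",
        ]
        result = subprocess.run(command, capture_output=True, text=True)

        # Lines look like:
        #   [silencedetect @ ...] silence_start: 1.94558
        #   [silencedetect @ ...] silence_end: 3.41345 | silence_duration: 1.46787
        starts = [float(m) for m in re.findall(r"silence_start:\s*(-?[\d.]+)", result.stderr)]
        ends = [float(m) for m in re.findall(r"silence_end:\s*(-?[\d.]+)", result.stderr)]

        # Pair each start with its matching end; zip drops a trailing start that has
        # no end (i.e. the file was still silent when it ended)
        return list(zip(starts, ends))

    if __name__ == "__main__":
        for start, end in silence_intervals():
            print(f"silence from {start:.3f}s to {end:.3f}s")

    Passing the command as an argument list (instead of shell=True) also keeps it behaving the same way on Windows and elsewhere.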

    


  • ffmpeg running in cloudfunction silently fails/never finishes

    19 June 2020, by Vojtěch

    I am trying to implement a Cloud Function that runs ffmpeg on a Google bucket upload. I have been playing with a script based on https://kpetrovi.ch/2017/11/02/transcoding-videos-with-ffmpeg-in-google-cloud-functions.html

    



    The original script needs a little tuning, as the library has evolved a bit. My current version is here:

    



    const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const ffmpeg = require('fluent-ffmpeg');
const ffmpeg_static = require('ffmpeg-static');

console.log("Linking ffmpeg path to:", ffmpeg_static)
ffmpeg.setFfmpegPath(ffmpeg_static);

exports.transcodeVideo = (event, callback) => {
    const bucket = storage.bucket(event.bucket);
    console.log(event);
    if (event.name.indexOf('uploads/') === -1) {
        console.log("File " + event.name + " is not to be processed.")
        return;
    }

    // ensure that you only proceed if the file is newly created
    if (event.metageneration !== '1') {
        callback();
        return;
    }

    // Open write stream to new bucket, modify the filename as needed.
    const targetName = event.name.replace("uploads/", "").replace(/[.][a-z0-9]+$/, "");
    console.log("Target name will be: " + targetName);

    const remoteWriteStream = bucket.file("processed/" + targetName + ".mp4")
        .createWriteStream({
            metadata: {
                //metadata: event.metadata, // You may not need this, my uploads have associated metadata
                contentType: 'video/mp4', // This could be whatever else you are transcoding to
            },
        });

    // Open read stream to our uploaded file
    const remoteReadStream = bucket.file(event.name).createReadStream();

    // Transcode
    ffmpeg()
        .input(remoteReadStream)
        .outputOptions('-c:v copy') // Change these options to whatever suits your needs
        .outputOptions('-c:a aac')
        .outputOptions('-b:a 160k')
        .outputOptions('-f mp4')
        .outputOptions('-preset fast')
        .outputOptions('-movflags frag_keyframe+empty_moov')
        // https://github.com/fluent-ffmpeg/node-fluent-ffmpeg/issues/346#issuecomment-67299526
        .on('start', (cmdLine) => {
            console.log('Started ffmpeg with command:', cmdLine);
        })
        .on('end', () => {
            console.log('Successfully re-encoded video.');
            callback();
        })
        .on('error', (err, stdout, stderr) => {
            console.error('An error occurred during encoding', err.message);
            console.error('stdout:', stdout);
            console.error('stderr:', stderr);
            callback(err);
        })
        .pipe(remoteWriteStream, { end: true }); // end: true, emit end event when readable stream ends
};



    



    This version runs correctly and I can see this in the logs:

    



    2020-06-16 21:24:22.606  Function execution took 912 ms, finished with status: 'ok'
2020-06-16 21:24:52.902  Started ffmpeg with command: ffmpeg -i pipe:0 -c:v copy -c:a aac -b:a 160k -f mp4 -preset fast -movflags frag_keyframe+empty_moov pipe:1


    



    It seems the function execution ends before the actual ffmpeg command does, and the command then never finishes.

    



    Is there a way to make the ffmpeg call "synchronous" or "blocking", so that it finishes before the function execution ends?