
Other articles (50)

  • Accepted formats

    28 January 2010, by

    The following commands give information about the formats and codecs handled by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)
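
    As an aside, the same check can be scripted. Below is a minimal Node.js/TypeScript sketch (assuming an ffmpeg binary on the PATH; the helper name is illustrative) that looks a container format up in the ffmpeg -formats listing:

    // Sketch only: query the local ffmpeg install for its supported formats
    // and check whether a given one (e.g. "flv") appears in the listing.
    // Assumes Node.js and an ffmpeg binary on the PATH.
    import { execFile } from "child_process";

    function supportsFormat(name: string): Promise<boolean> {
      return new Promise((resolve, reject) => {
        execFile("ffmpeg", ["-hide_banner", "-formats"], (err, stdout) => {
          if (err) return reject(err);
          // Each listing line looks like " DE flv    FLV (Flash Video)".
          resolve(stdout.split("\n").some(line => line.split(/\s+/).includes(name)));
        });
      });
    }

    supportsFormat("flv").then(ok => console.log("flv supported:", ok));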

  • The SPIPmotion queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database called spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document should automatically be attached; objet, the type of object to which (...)
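
    Purely as an illustration (a TypeScript shape, not the actual SPIP schema definition), the fields named so far could be pictured like this; the excerpt is truncated, so the remaining columns are omitted:

    // Illustrative only: a row of spip_spipmotion_attentes, limited to the fields
    // described in the excerpt above; the real table declares further columns.
    interface SpipmotionAttente {
      id_spipmotion_attente: number; // unique numeric id of the task to process
      id_document: number;           // numeric id of the original document to encode
      id_objet: number;              // id of the object the encoded document will be attached to
      objet: string;                 // type of that object (description truncated in the excerpt)
    }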

  • Using and configuring the script

    19 January 2011, by

    Information specific to the Debian distribution
    If you use this distribution, you will need to enable the "debian-multimedia" repositories as explained here:
    Since version 0.3.1 of the script, the repository can be enabled automatically in response to a prompt.
    Retrieving the script
    The installation script can be retrieved in two different ways.
    Via svn, using the command to retrieve the up-to-date source code:
    svn co (...)

On other sites (5902)

  • Is ffmpeg able to read ArrayBuffer input from stream

    7 July 2017, by jAndy

    I want to accomplish the following tasks:

    • Record Video+Audio from any HTML5 (MediaStream) capable browser
    • Send that data via WebSocket as Blob / ArrayBuffer chunks to a server
    • Broadcast that input stream-data to multiple clients

    As it turns out, this brought me into a world of pain. The first task is fairly simple using the HTML5 MediaStream objects alongside WebSockets.

    // ... for simplicity...
    navigator.mediaDevices.getUserMedia({ audio: true, video: true }).then(stream => {
       let mediaRecorder = new MediaRecorder( stream );
       // ...
       mediaRecorder.ondataavailable = e => {
           webSocket.send( 'newVideoData', e.data ); // configured for binary data
       };
    });

    Now, I want to receive those data fragments and stream them via the nginx vod module, because I guess I want the output stream in HLS or DASH.
    I could write a little nodejs script as a backend, which just receives the binary chunks and writes them to a file or stream, and then reference that so the nginx vod module could possibly read it and create the m3u8 manifest on the fly?
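
    Such a middle-man script could be very small. Here is a sketch of the "just write the chunks to a file" idea (assuming Node.js with the ws package; port and file path are illustrative):

    // Sketch: receive the MediaRecorder chunks over a WebSocket and append them
    // to a file, so something else (nginx vod module, ffmpeg, ...) can pick it up.
    // Assumes Node.js with the ws package; port and path are illustrative.
    import { createWriteStream } from "fs";
    import { WebSocketServer } from "ws";

    const wss = new WebSocketServer({ port: 8080 });

    wss.on("connection", socket => {
      const out = createWriteStream("/tmp/incoming.webm", { flags: "a" });
      socket.on("message", data => out.write(data as Buffer)); // each blob arrives as binary
      socket.on("close", () => out.end());
    });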

    I am wondering now,

    • if ffmpeg is able to read that binary data directly (it should be webm format), without a man-in-the-middle script, "somehow"?
    • If not, do I have to write the data down into a file and pass that as input to ffmpeg, or can I (should I) pipe the data to a self-spawned ffmpeg instance? (if so, how? A rough sketch of that piped approach follows below.)
    • Do I actually need the nginx server (probably alongside the rtmp module) to deliver the output stream as HLS, or could I just use ffmpeg to also create a dynamic manifest?
    • Is the nginx vod module capable of creating a dynamic hls/dash manifest, or must the input data be complete beforehand?
    • Ultimately, am I on the totally wrong track here? :P

    Actually, I just want to create a little video-live-chat demo, without any plugins or 3rd-party encoding software, purely in the browser.
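
    For reference, the piped approach from the questions above might look roughly like the sketch below (only a sketch: it assumes Node.js with the ws package, an ffmpeg binary on the PATH, and an illustrative HLS output path; MediaRecorder typically emits VP8/VP9 + Opus in WebM, hence the transcode to H.264/AAC for HLS):

    // Sketch: one ffmpeg process per incoming stream, fed through stdin,
    // producing HLS segments plus a continuously updated m3u8 playlist on disk.
    // Assumes Node.js with the ws package and ffmpeg on the PATH; paths/ports are illustrative.
    import { spawn } from "child_process";
    import { WebSocketServer } from "ws";

    const wss = new WebSocketServer({ port: 8081 });

    wss.on("connection", socket => {
      const ffmpeg = spawn("ffmpeg", [
        "-i", "pipe:0",                        // read the WebM chunks from stdin
        "-c:v", "libx264", "-c:a", "aac",      // transcode VP8/VP9 + Opus to H.264/AAC
        "-f", "hls",
        "-hls_time", "2",
        "-hls_list_size", "5",
        "-hls_flags", "delete_segments",       // keep a rolling window of segments
        "/var/www/live/stream.m3u8",
      ]);

      socket.on("message", data => ffmpeg.stdin.write(data as Buffer));
      socket.on("close", () => ffmpeg.stdin.end());
    });

    The playlist and segments written this way can then be served statically by nginx or any other web server.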

  • Cannot compile FFmpeg with libfreetype on Windows/MSYS2

    10 October 2024, by Devin Dixon

    I am having issues compiling ffmpeg with libfreetype using this command on Windows/MSYS2:

    ./configure     --pkg-config-flags="--static" --enable-libvpl  --enable-libopenh264     --enable-version3  --enable-libfreetype    --enable-libopus     --enable-libvpx     --enable-libvorbis     --enable-libaom     --enable-libdav1d     --disable-gpl     --disable-w32threads     --enable-pthreads     --disable-shared     --enable-static     --extra-cflags='--static'   --extra-cflags="-I/mingw64/include -static"   --extra-ldflags="-L/mingw64/lib -static" --prefix="/home/compiled"

    I keep getting this error:

    ERROR: freetype2 not found using pkg-config

    If you think configure made a mistake, make sure you are using the latest
    version from Git.  If the latest version fails, report the problem to the
    ffmpeg-user@ffmpeg.org mailing list or IRC #ffmpeg on irc.libera.chat.
    Include the log file "ffbuild/config.log" produced by configure as this will help
    solve the problem.

    I've installed freetype with this command:

     pacman -Ss mingw-w64-x86_64-freetype

    I've also tried compiling freetype2 from source:

    git clone https://git.savannah.gnu.org/git/freetype/freetype2.git
    cd freetype2
    mkdir build && cd build
    /mingw64/bin/cmake .. -G "MSYS Makefiles" -DBUILD_SHARED_LIBS=OFF -DCMAKE_INSTALL_PREFIX=/mingw64
    make
    make install

    And pkg-config gives this:

    pkg-config freetype2 --cflags --libs
-IC:/msys64/mingw64/include/freetype2 -IC:/msys64/mingw64/include/harfbuzz -IC:/msys64/mingw64/include/glib-2.0 -IC:/msys64/mingw64/lib/glib-2.0/include -IC:/msys64/mingw64/include/libpng16 -lfreetype

    I can confirm the package is there:

    ls /mingw64/lib/pkgconfig/freetype2.pc
/mingw64/lib/pkgconfig/freetype2.pc

    Is there anything else I should be doing to compile a static version of ffmpeg with this package?

  • How to convert a Stream on the fly with FFMpegCore?

    18 October 2023, by Adrian

    For a school project, I need to stream videos that I get from torrents while they are still downloading on the server.
    When the video is a .mp4 file there is no problem, but I must also be able to stream .mkv files, and for that I need to convert them to .mp4 before sending them to the client. I can't find a way to use FFMpegCore to convert the Stream I get from MonoTorrents into a Stream that I can send to my client.

    Here is the code I wrote to simply download and stream my torrent:

    var cEngine = new ClientEngine();

    var manager = await cEngine.AddStreamingAsync(GenerateMagnet(torrent), ) ?? throw new Exception("An error occurred while creating the torrent manager");

    await manager.StartAsync();
    await manager.WaitForMetadataAsync();

    var videoFile = manager.Files.OrderByDescending(f => f.Length).FirstOrDefault();
    if (videoFile == null)
        return Results.NotFound();

    var stream = await manager.StreamProvider!.CreateStreamAsync(videoFile, true);
    return Results.File(stream, contentType: "video/mp4", fileDownloadName: manager.Name, enableRangeProcessing: true);

    I saw that the most common way to convert videos is by using ffmpeg. .NET has a package called FFMpegCore that is a wrapper for ffmpeg.

    To my previous code, I would add the following right before the return:

    if (!videoFile.Path.EndsWith(".mp4"))
    {
        var outputStream = new MemoryStream();
        FFMpegArguments
            .FromPipeInput(new StreamPipeSource(stream), options =>
            {
                options.ForceFormat("mp4");
            })
            .OutputToPipe(new StreamPipeSink(outputStream))
            .ProcessAsynchronously();
        return Results.File(outputStream, contentType: "video/mp4", fileDownloadName: manager.Name, enableRangeProcessing: true);
    }

    I unfortunately can't get a "live" Stream to send to my client.
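
    For comparison, outside of FFMpegCore the usual way to get a pipeable "live" MP4 out of ffmpeg is fragmented MP4 (-movflags frag_keyframe+empty_moov). Below is a Node.js/TypeScript sketch of that idea (illustrative only; it shows the underlying ffmpeg invocation, not the FFMpegCore API):

    // Sketch: convert an incoming stream (e.g. .mkv) to fragmented MP4 on the fly
    // by piping it through a spawned ffmpeg process. Fragmented MP4 is what makes
    // the output usable by the client before the input is complete.
    // Assumes Node.js and an ffmpeg binary on the PATH; purely illustrative.
    import { spawn } from "child_process";
    import { Readable, Writable } from "stream";

    function streamAsFragmentedMp4(input: Readable, output: Writable): void {
      const ffmpeg = spawn("ffmpeg", [
        "-i", "pipe:0",                           // source container read from stdin
        "-c:v", "libx264", "-c:a", "aac",         // or "-c", "copy" when the codecs already fit MP4
        "-movflags", "frag_keyframe+empty_moov",  // streamable, "live" MP4
        "-f", "mp4",
        "pipe:1",                                 // converted bytes written to stdout
      ]);
      input.pipe(ffmpeg.stdin);
      ffmpeg.stdout.pipe(output);                 // e.g. pipe straight into the HTTP response
    }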