
Media (91)

Other articles (106)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed Médiaspip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded to MP4, OGV and WebM (supported by HTML5), with MP4 also used for Flash.
    Audio files are encoded to MP3 and Ogg (supported by HTML5), with MP3 also used for Flash.
    Where possible, text is analyzed in order to retrieve the data needed for search engine detection, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
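
    A rough illustration of this kind of conversion (a sketch only, assuming an ffmpeg build with libvpx and libvorbis; input.mov and output.webm are placeholder names):

    ffmpeg -i input.mov -c:v libvpx -c:a libvorbis output.webm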

  • What is a form mask?

    13 June 2013

    A form mask is the customization of the form used to publish media, sections, news items, editorials and links to other sites.
    Each object publication form can therefore be customized.
    To access the customization of form fields, go to the administration area of your MediaSPIP and select "Configuration des masques de formulaires".
    Then select the form to modify by clicking on its object type. (...)

On other sites (11434)

  • FFMPEG : Remove packets based on PTS/DTS

    9 May 2018, by stevendesu

    I have a video which contains some audio packets beyond the end of the video data:

    $> ffprobe -show_packets video.mp4
    ...
    ...
    ...
    [PACKET]
    codec_type=video
    stream_index=0
    pts=5653648
    pts_time=235.568667
    dts=5653648
    dts_time=235.568667
    duration=1001
    duration_time=0.041708
    convergence_duration=N/A
    convergence_duration_time=N/A
    size=1030
    pos=25233684
    flags=__
    [/PACKET]
    [PACKET]
    codec_type=audio
    stream_index=1
    pts=11310080
    pts_time=235.626667
    dts=11310080
    dts_time=235.626667
    duration=1024
    duration_time=0.021333
    convergence_duration=N/A
    convergence_duration_time=N/A
    size=284
    pos=25234714
    flags=K_
    [/PACKET]
    [PACKET]
    codec_type=audio
    stream_index=1
    pts=11311104
    pts_time=235.648000
    dts=11311104
    dts_time=235.648000
    duration=1024
    duration_time=0.021333
    convergence_duration=N/A
    convergence_duration_time=N/A
    size=285
    pos=25234998
    flags=K_
    [/PACKET]
    [PACKET]
    codec_type=audio
    stream_index=1
    pts=11312128
    pts_time=235.669333
    dts=11312128
    dts_time=235.669333
    duration=992
    duration_time=0.020667
    convergence_duration=N/A
    convergence_duration_time=N/A
    size=290
    pos=25235283
    flags=K_
    [/PACKET]
    $>

    The last video packet in the video has a PTS time of 235.568667 and a duration of 0.041708, meaning all video data ends at 235.610375. However, there are audio packets beginning at 235.626667 and later.

    Is there an easy way to strip these audio packets from the file so that the audio and video end simultaneously?
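
    A minimal sketch of one possible approach (untested): remux with stream copy and cut the output where the video data ends, so the trailing audio packets are dropped. The 235.610375 end time comes from the packet dump above; trimmed.mp4 is a placeholder output name.

    ffmpeg -i video.mp4 -c copy -t 235.610375 trimmed.mp4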

  • How to read raw audio data using FFmpeg ?

    6 June 2020, by Yousef Alaqra

    I'm trying to use this command to get the audio stream over UDP:

    ffmpeg -i udp://192.168.1.1:6980 -acodec copy

    I got an error when I execute it, which says:

    [udp @ 00000157a76b9a40] bind failed: Error number -10048 occurred
    udp://192.168.1.1:6980: I/O error

    What's the meaning of this error?

    Update:

    I was able to read the raw audio data using FFmpeg and write it to a WAV file using the following command:

    ffmpeg -f u16be -ar 44100 -ac 2 -i 'udp://127.0.0.1:1223' output.wav

    The problem now is that there is surrounding metadata in the network packets being received; it needs to be stripped out or it will result in noise.

    In C# I used Skip() to trim the first 28 bytes of each received packet; how would I achieve this using FFmpeg?

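    A hypothetical, untested sketch of such a workaround in Node.js: receive the UDP datagrams directly, drop the first 28 bytes of each packet, and feed the remaining raw samples to ffmpeg over stdin. The address, port, sample format and the 28-byte offset are taken from the question.

var dgram = require("dgram");
var spawn = require("child_process").spawn;

// ffmpeg reads raw PCM from stdin instead of opening the UDP socket itself.
var ffm = spawn("ffmpeg", [
  "-f", "s16le", "-ar", "48000", "-ac", "2",
  "-i", "pipe:0",
  "-b:a", "128k", "-f", "webm", "pipe:1",
]);

var socket = dgram.createSocket("udp4");
socket.on("message", function (packet) {
  // Drop the assumed 28-byte header and keep only the audio payload.
  ffm.stdin.write(packet.slice(28));
});
socket.bind(65535, "192.168.1.230");
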
    Update:

    I was able to read the raw bytes from the UDP packets by executing a child process in Node.js:

var http = require("http");
var port = 8888;
var host = "localhost";
var children = require("child_process");

http
  .createServer(function (req, res) {
    //ffmpeg -f s16le -ar 48000 -ac 2 -i 'udp://192.168.1.230:65535' -b:a 128k -f webm -
    var ffm = children.spawn(
      "ffmpeg",
      "-f s16le -ar 48000 -ac 2 -i udp://192.168.1.230:65535 -b:a 128k -f webm -".split(
        " "
      )
    );

    res.writeHead(200, { "Content-Type": "audio/webm" });
    ffm.stdout.on("data", (data) => {
      console.log(data);
      res.write(data);
    });
  })
  .listen(port, host);

console.log("Server running at http://" + host + ":" + port + "/");

    As you can see in the code sample above, I'm trying to pipe the output of the child process into the response, so that I can hear the audio in the browser.

    I'm receiving the data after executing the child process, but the browser is unable to play the audio for some reason that I still need to figure out.

    Do you have an idea of what I am missing?

  • lavc : Use hardware config information in ff_get_format()

    26 October 2017, by Mark Thompson
    lavc : Use hardware config information in ff_get_format()
    

    This removes the dependency that hardware pixel formats previously had on
    AVHWAccel instances, meaning only those which actually do something need
    exist after this patch.

    Also updates avcodec_default_get_format() to be able to choose hardware
    formats if either a matching device has been supplied or no additional
    external configuration is required, and avcodec_get_hw_frames_parameters()
    to use the hardware config rather than searching the old hwaccel list.

    The FF_CODEC_CAP_HWACCEL_REQUIRE_CLASS mechanism is deleted because it
    no longer does anything (the codec already contains the pointers to the
    matching hwaccels).

    • [DH] libavcodec/avcodec.h
    • [DH] libavcodec/cuviddec.c
    • [DH] libavcodec/decode.c
    • [DH] libavcodec/internal.h