Other articles (87)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash fallback is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and audio both to conventional computers (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information from the source video
    First of all, you need to create a SPIP article and attach the "source" video document to it.
    When this document is attached to the article, two actions beyond the normal behaviour are executed: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)

On other sites (10113)

  • ffmpeg in:h264 out:yuv to stdout - data format?

    27 February 2019, by Petr

    I am (like many) trying to get a continuous series of still images out of the camera attached to a Raspberry Pi. I want to do this in Java for all the usual reasons, and am using a Runtime exec command to pipe the output of raspivid to the following ffmpeg command, then collecting the result via stdout. Note that xxx.h264 is a test file generated by the camera; it does not play because there is no container, but I am getting images out, so half good.

    ffmpeg -i xxx.h264 -vcodec rawvideo -r 2 -pix_fmt yuv420p -f nut -

    I have some code displaying the frames, but they "march" across the display area from left to right, and there appears to be a growing amount of rubbish across the top of the images. I have looked at the bytes the command outputs by redirecting them into a file and inspecting it with vi/xxd, and find that there is header material ("nut/multimedia container ...").

    I am guessing that there is more metadata inserted by my ffmpeg command that I am failing to remove when processing the raw yuv420p data as described here: https://en.wikipedia.org/wiki/YUV#Y%E2%80%B2UV420sp_%28NV21%29_to_RGB_conversion_%28Android%29
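
    One detail worth noting: the linked Wikipedia section covers NV21, a semi-planar layout with interleaved V/U bytes, whereas ffmpeg's yuv420p is fully planar (the whole Y plane, then the U plane, then the V plane). A minimal sketch of a per-pixel conversion for the planar layout, using the common BT.601 approximations:

    class Yuv420p {
        // Returns the packed 0xRRGGBB value for pixel (x, y) of a planar
        // yuv420p frame: a Y plane of width*height bytes, followed by U and
        // V planes of (width/2)*(height/2) bytes each.
        static int toRgb(byte[] frame, int width, int height, int x, int y) {
            int ySize = width * height;
            int uvWidth = width / 2;
            int yVal = frame[y * width + x] & 0xFF;
            int uVal = frame[ySize + (y / 2) * uvWidth + (x / 2)] & 0xFF;
            int vVal = frame[ySize + ySize / 4 + (y / 2) * uvWidth + (x / 2)] & 0xFF;
            int r = clamp((int) (yVal + 1.402 * (vVal - 128)));
            int g = clamp((int) (yVal - 0.344 * (uVal - 128) - 0.714 * (vVal - 128)));
            int b = clamp((int) (yVal + 1.772 * (uVal - 128)));
            return (r << 16) | (g << 8) | b;
        }

        static int clamp(int v) { return v < 0 ? 0 : Math.min(v, 255); }
    }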

    For the life of me I cannot find the NUT documentation anywhere in a readable format, and anyway it seems that is not what I should be looking for. Any pointers as to how I can recognise the frame boundaries in my byte stream?
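
    A minimal sketch of one way to sidestep the container question entirely: switching -f nut to -f rawvideo makes ffmpeg emit bare concatenated frames with no headers, so every width*height*3/2 bytes on stdout is exactly one yuv420p frame. The 640x480 dimensions below are placeholders for the camera's actual output:

    import java.io.DataInputStream;
    import java.io.EOFException;

    public class RawFrameReader {
        public static void main(String[] args) throws Exception {
            int width = 640, height = 480;          // must match the source video
            int frameSize = width * height * 3 / 2; // yuv420p: Y + quarter-size U and V

            // -f rawvideo writes nothing but pixel data, so fixed-size reads
            // line up with frame boundaries.
            Process p = new ProcessBuilder(
                    "ffmpeg", "-i", "xxx.h264", "-vcodec", "rawvideo",
                    "-r", "2", "-pix_fmt", "yuv420p", "-f", "rawvideo", "-")
                    .start();

            DataInputStream in = new DataInputStream(p.getInputStream());
            byte[] frame = new byte[frameSize];
            try {
                while (true) {
                    in.readFully(frame); // blocks until one whole frame arrives
                    // ... hand `frame` to the display code ...
                }
            } catch (EOFException end) {
                // stream finished
            }
            p.waitFor();
        }
    }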

  • Concatenating multiple remote files using ffmpeg?

    8 December 2018, by May Rest in Peace

    I am trying to concatenate multiple remote files using ffmpeg, but some files get skipped in the output.

    I use the command:

    ffmpeg -f concat -safe 0 -protocol_whitelist "file,http,https,tcp,tls" -i mylist.txt -c copy output.m4a

    mylist.txt looks like:

    file 'http://remoteurl?fileName=20.m4a'
    file 'http://remoteurl?fileName=21.m4a'
    file 'http://remoteurl?fileName=22.m4a'
    file 'http://remoteurl?fileName=23.m4a'

    On running this command, the output contains audio from only some of the files.

    I downloaded the files individually from the same URLs and did a local concatenation using the same command, and it worked perfectly.

    Is this because concat will not work if files are not available immediately, as mentioned in the ffmpeg wiki (https://trac.ffmpeg.org/wiki/Concatenate#Automaticallyappendingtothelistfile)?

    If that’s the case, how should I proceed? There’s a terminal script provided at the above link, but I am on a Windows machine and, to be honest, not that good at bash scripting.

    All files are audio files with the same bitrate and are in .m4a format.

    This is the error message I receive:

    [mov,mp4,m4a,3gp,3g2,mj2 @ 00000278b64d4f40] stream 0, offset 0xc9b: partial file
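
    One way around both the skipping and the bash script is to move the download step into a small Java program: fetch each remote file completely to disk, then run the local concat that already works. A minimal sketch, reusing the placeholder URLs from mylist.txt:

    import java.io.InputStream;
    import java.io.PrintWriter;
    import java.net.URL;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;
    import java.util.List;

    public class RemoteConcat {
        public static void main(String[] args) throws Exception {
            List<String> urls = List.of(
                    "http://remoteurl?fileName=20.m4a",
                    "http://remoteurl?fileName=21.m4a");

            try (PrintWriter list = new PrintWriter(Files.newBufferedWriter(Paths.get("mylist.txt")))) {
                int i = 0;
                for (String u : urls) {
                    Path local = Paths.get("part" + i++ + ".m4a");
                    // Download each file completely before concatenation so
                    // ffmpeg never reads a partially transferred file.
                    try (InputStream in = new URL(u).openStream()) {
                        Files.copy(in, local, StandardCopyOption.REPLACE_EXISTING);
                    }
                    list.println("file '" + local + "'");
                }
            }

            // The same local concat command the question already verified.
            new ProcessBuilder("ffmpeg", "-f", "concat", "-safe", "0",
                    "-i", "mylist.txt", "-c", "copy", "output.m4a")
                    .inheritIO().start().waitFor();
        }
    }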

  • Adding Gapless Playback information to AAC

    13 July 2018, by StaticBR

    I’m currently trying to develop a video/audio encoding pipeline.
    My goal is to encode mp4 files containing an h264 video track and an AAC audio track. These files should be played one after another without any gaps in between.

    Currently I’m converting the videos with ffmpeg.
    Unfortunately my input files are missing the gapless-playback metadata, which is needed for gapless playback of the AAC track.

    In fact, I’m looking for a way to add the iTunSMPB udta comment, as it is needed by ExoPlayer (see the parser for details: GaplessInfoHolder.java).
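
    For reference, the value that GaplessInfoHolder parses is a plain text string of space-separated hex fields: a reserved field, the encoder delay in samples, the end padding in samples, and the original PCM length. A minimal sketch of building that string (the 2112-sample delay is a typical AAC priming value, an assumption; writing the result into the file’s udta/meta/ilst box still needs a separate MP4 editing tool):

    public class ItunSmpb {
        // Formats the fields GaplessInfoHolder's regex captures: an
        // 8-hex-digit reserved field, encoder delay, end padding, then a
        // 16-hex-digit original length; trailing fields are ignored.
        static String build(int encoderDelay, int endPadding, long originalLength) {
            return String.format(
                    " 00000000 %08X %08X %016X 00000000 00000000 00000000 00000000 00000000 00000000 00000000",
                    encoderDelay, endPadding, originalLength);
        }

        public static void main(String[] args) {
            System.out.println(build(2112, 576, 441000L));
        }
    }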

    I could not find a way to add this via ffmpeg (ffmpeg AAC encoder doc); did I maybe miss something?

    Even Wikipedia lists only two converters that should be able to do this: Nero Digital and iTunes. But this information could be outdated.

    Does anyone know a Java library or (Linux) command that can add this metadata to an mp4 file?

    I hope some of you might be able to help me.
    Thank you.