Other articles (24)

  • Installation in farm mode

    4 February 2011, by

    Farm mode makes it possible to host several MediaSPIP sites while installing their functional core only once.
    This is the method we use on this very platform.
    Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge since SPIP's usual private area is no longer used.
    As a first step, you must have installed the same files as the installation (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed Médiaspip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.

  • Support for all types of media

    10 April 2011

    Unlike many modern document-sharing applications and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and others (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

On other sites (5435)

  • ffmpeg in:h264 out:yuv to stdout - data format?

    27 February 2019, by Petr

    I am (like many) trying to get a continuous series of still images out of the camera attached to a Raspberry Pi. I want to do this in Java for all the usual reasons, and I am using a Runtime exec command to pipe the output of raspivid into the following ffmpeg command, then collecting the result via stdout. Note that xxx.h264 is a test file generated by the camera; it does not play because there is no container, but I am getting images out, so it is half working.

    ffmpeg -i xxx.h264 -vcodec rawvideo -r 2 -pix_fmt yuv420p -f nut -

    I have some code displaying the frames, but they "march" across the display area from left to right, and there appears to be a growing amount of rubbish across the top of the images. I looked at the bytes the command outputs by redirecting them into a file and inspecting it with vi/xxd, and found that there is header material ("nut/multimedia container ...").

    I am guessing that there is more metadata inserted by my ffmpeg command that I am failing to remove when processing the raw yuv420p data as described here: https://en.wikipedia.org/wiki/YUV#Y%E2%80%B2UV420sp_%28NV21%29_to_RGB_conversion_%28Android%29

    For the life of me I cannot find the NUT documentation anywhere in a readable format, and in any case it seems that is not what I should be looking for. Any pointers as to how I can recognise the frame boundaries in my byte stream?
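
    One way to sidestep the container framing entirely (a sketch, not the original poster's code) is to ask ffmpeg for bare planar output with -f rawvideo instead of -f nut: each frame is then exactly width * height * 3/2 bytes of yuv420p data, so frame boundaries fall at fixed offsets. The resolution below (640x480) is an assumption; the file name is the test file from the question.

    import java.io.DataInputStream;
    import java.io.EOFException;

    // Minimal sketch: read fixed-size yuv420p frames from ffmpeg's stdout.
    // The 640x480 resolution is an assumption; substitute the real camera size.
    public class RawYuvReader {
        public static void main(String[] args) throws Exception {
            int width = 640, height = 480;
            int frameSize = width * height * 3 / 2;   // Y plane + U/4 + V/4

            // -f rawvideo writes the bare planes with no container headers,
            // so a new frame starts every frameSize bytes. ffmpeg's own log
            // goes to the console so its stderr buffer cannot fill up.
            Process p = new ProcessBuilder(
                    "ffmpeg", "-i", "xxx.h264",
                    "-r", "2", "-pix_fmt", "yuv420p",
                    "-f", "rawvideo", "-")
                    .redirectError(ProcessBuilder.Redirect.INHERIT)
                    .start();

            byte[] frame = new byte[frameSize];
            try (DataInputStream in = new DataInputStream(p.getInputStream())) {
                while (true) {
                    in.readFully(frame);              // one complete frame
                    // frame[0 .. width*height)       -> Y plane
                    // next width*height/4 bytes      -> U plane
                    // last width*height/4 bytes      -> V plane
                    // hand the planes to the YUV -> RGB conversion here
                }
            } catch (EOFException endOfStream) {
                // ffmpeg closed its stdout; all frames have been read
            }
            p.waitFor();
        }
    }

    If the NUT output is kept instead, each frame payload is wrapped in NUT packets, so the header material found with xxd is expected and the stream cannot simply be sliced at fixed offsets.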

  • Concatenating multiple remote files using ffmpeg?

    8 December 2018, by May Rest in Peace

    I am trying to concatenate multiple remote files using ffmpeg but some files get skipped in the output.

    I use the command

    ffmpeg -f concat -safe 0 -protocol_whitelist "file,http,https,tcp,tls" -i mylist.txt -c copy output.m4a

    mylist.txt looks like:

    file 'http://remoteurl?fileName=20.m4a'
    file 'http://remoteurl?fileName=21.m4a'
    file 'http://remoteurl?fileName=22.m4a'
    file 'http://remoteurl?fileName=23.m4a'

    On running this command, the output will contain audio from only some files.

    I downloaded the files individually from the same URLs and did a local concatenation using the same command, and it worked perfectly.

    Is this because concat will not work if the files are not present immediately (as mentioned in https://trac.ffmpeg.org/wiki/Concatenate#Automaticallyappendingtothelistfile)?

    If that’s the case, then how should I proceed? There’s a terminal script provided in the above link, but I am on a Windows machine and, to be honest, I am not that good at bash scripting.

    All files are audio files with the same bitrate and are in .m4a format.

    This is the error message I receive:

    [mov,mp4,m4a,3gp,3g2,mj2 @ 00000278b64d4f40] stream 0, offset 0xc9b: partial file
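
    As a workaround sketch (not from the question itself, and reusing the placeholder URLs above), the part that already works, downloading the files and then concatenating them locally, can be automated from Java instead of a bash script, which avoids the Windows scripting issue:

    import java.io.IOException;
    import java.net.URI;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardCopyOption;
    import java.util.ArrayList;
    import java.util.List;

    // Sketch: fetch every remote part completely, then run the same local
    // concat command that already worked. URLs and names are placeholders.
    public class ConcatRemote {
        public static void main(String[] args) throws IOException, InterruptedException {
            List<String> urls = List.of(
                    "http://remoteurl?fileName=20.m4a",
                    "http://remoteurl?fileName=21.m4a",
                    "http://remoteurl?fileName=22.m4a",
                    "http://remoteurl?fileName=23.m4a");

            Path workDir = Files.createTempDirectory("concat");
            List<String> listLines = new ArrayList<>();

            int i = 0;
            for (String url : urls) {
                Path part = workDir.resolve("part" + (i++) + ".m4a");
                try (var in = URI.create(url).toURL().openStream()) {
                    Files.copy(in, part, StandardCopyOption.REPLACE_EXISTING);
                }
                // Forward slashes keep the concat list unambiguous on Windows.
                listLines.add("file '" + part.toAbsolutePath().toString().replace('\\', '/') + "'");
            }

            Path listFile = workDir.resolve("mylist.txt");
            Files.write(listFile, listLines, StandardCharsets.UTF_8);

            // Same concat command as before, now over fully downloaded local files.
            new ProcessBuilder("ffmpeg", "-f", "concat", "-safe", "0",
                    "-i", listFile.toString(), "-c", "copy", "output.m4a")
                    .inheritIO()
                    .start()
                    .waitFor();
        }
    }

    The "partial file" error seems consistent with a part not being fully readable at the moment the concat demuxer reaches it, which finishing every download first avoids.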

  • Adding Gapless Playback information to AAC

    13 July 2018, by StaticBR

    I am currently trying to develop a video/audio encoding pipeline.
    My goal is to encode mp4 files containing an h264 video track and an AAC audio track. These files should be played one after another without any gaps in between.

    Currently I am converting the videos with ffmpeg.
    Unfortunately, my input files are missing the gapless playback metadata, which is needed for gapless playback of the AAC track.

    In fact, I am looking for a way to add the iTunSMPB udta comment, as it is needed by ExoPlayer (see the parser for details: GaplessInfoHolder.java).

    I could not find a way to add this via ffmpeg (see the ffmpeg AAC encoder documentation); did I maybe miss something?

    Even Wikipedia lists only two converters that should be able to do this: Nero Digital and iTunes. But this information could be outdated.

    Does anyone know of a Java library or (Linux) command that can add this metadata to an mp4 file?

    I hope some of you might be able to help me.
    Thank you.
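
    One piece of this that can be sketched (an illustration, not a confirmed ffmpeg feature) is the iTunSMPB value itself. As parsed by ExoPlayer's GaplessInfoHolder, it is a string of space-separated hex fields in which the second field is the encoder delay in samples, the third is the end padding, and the fourth is the original length in samples; the remaining fields are zero in the strings iTunes typically writes. Building that string is straightforward, but writing it into the ----:com.apple.iTunes:iTunSMPB freeform atom inside udta still needs an MP4 tagging tool or library.

    // Sketch: format an iTunSMPB gapless-info string in the layout that
    // ExoPlayer's GaplessInfoHolder parses (field 2 = encoder delay,
    // field 3 = end padding, field 4 = original length in samples).
    // The example values are illustrative; take the real delay/padding
    // from the AAC encoder that produced the track.
    public class ITunSmpb {
        static String build(long encoderDelay, long endPadding, long originalLength) {
            StringBuilder sb = new StringBuilder();
            sb.append(String.format(" %08X", 0));                 // reserved
            sb.append(String.format(" %08X", encoderDelay));      // priming samples
            sb.append(String.format(" %08X", endPadding));        // trailing padding
            sb.append(String.format(" %016X", originalLength));   // valid samples
            for (int i = 0; i < 8; i++) {
                sb.append(String.format(" %08X", 0));             // unused fields
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            // e.g. the common 2112-sample AAC priming delay, 576 samples of
            // padding, and ten seconds of audio at 44.1 kHz
            System.out.println(build(2112, 576, 441_000));
        }
    }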