
Other articles (102)
-
Multilang: improving the interface for multilingual blocks
18 February 2011
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
Once it is activated, a preconfiguration is applied automatically by MediaSPIP init so that the new feature works right away. No separate configuration step is therefore required. -
Enhancing it visually
10 April 2011
MediaSPIP is based on a system of themes and templates ("squelettes"). The templates define where information is placed on the page, defining a specific use of the platform, while the themes provide the overall graphic design.
Anyone can propose a new graphic theme or template and make it available to the community. -
Possibility of deployment as a farm
12 April 2011
MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by many different sites.
This makes it possible, for example: to share the setup costs between several projects or individuals; to deploy a large number of distinct sites quickly; and to avoid having to dump all creations into a digital catch-all, as is the case on the big general-purpose platforms scattered across the (...)
On other sites (8016)
-
Adding Gapless Playback information to AAC
13 July 2018, by StaticBR
I'm currently trying to develop a video/audio encoding pipeline.
My goal is to encode MP4 files containing an H.264 video track and an AAC audio track. These files should be played one after another without any gaps in between. Currently I'm converting the videos with ffmpeg.
Unfortunately my input files are missing the gapless playback metadata, which is needed for gapless playback of the AAC track. In fact I'm looking for a way to add the iTunSMPB udta comment, as it is needed by ExoPlayer (see the parser for details: GaplessInfoHolder.java). I could not find a way to add this via ffmpeg (ffmpeg AAC encoder doc); did I maybe miss something?
Even Wikipedia only lists two converters that should be able to do that: Nero Digital and iTunes. But this information could be outdated.
Does anyone know a Java library or (Linux) command that can add this metadata to an MP4 file?
I hope some of you might be able to help me.
Thank you.
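For illustration only, a minimal Java sketch of what such an iTunSMPB value can look like, assuming the commonly described layout (a leading space, one reserved 32-bit field, the encoder delay, the end padding, a 64-bit count of valid samples, then eight reserved 32-bit fields, all zero-padded hexadecimal separated by spaces). The class and method names are made up, and the resulting string would still have to be written into an iTunes-style freeform metadata atom by an MP4 tagging tool or library.

/**
 * Sketch of an iTunes-style gapless playback comment (iTunSMPB).
 * The field layout below is an assumption based on common descriptions
 * of the tag, not taken from an official specification.
 */
public final class GaplessComment {

    public static String buildITunSMPB(long encoderDelay, long endPadding, long validSamples) {
        StringBuilder sb = new StringBuilder();
        sb.append(String.format(" %08X", 0));               // reserved
        sb.append(String.format(" %08X", encoderDelay));    // priming samples at the start
        sb.append(String.format(" %08X", endPadding));      // padding samples at the end
        sb.append(String.format(" %016X", validSamples));   // total valid samples (64-bit)
        for (int i = 0; i < 8; i++) {
            sb.append(String.format(" %08X", 0));           // reserved trailing fields
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Example: a typical AAC encoder delay of 2112 samples, 576 samples of
        // end padding, and 10 seconds of audio at 44.1 kHz (441,000 samples).
        System.out.println(buildITunSMPB(2112, 576, 441_000));
    }
}

-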
Concatenating multiple remote files using ffmpeg ?
8 December 2018, by May Rest in Peace
I am trying to concatenate multiple remote files using ffmpeg, but some files get skipped in the output.
I use the command
ffmpeg -f concat -safe 0 -protocol_whitelist "file,http,https,tcp,tls" -i mylist.txt -c copy output.m4a
mylist.txt looks like:
file 'http://remoteurl?fileName=20.m4a'
file 'http://remoteurl?fileName=21.m4a'
file 'http://remoteurl?fileName=22.m4a'
file 'http://remoteurl?fileName=23.m4a'
On running this command, the output will contain audio from only some of the files.
I downloaded the files individually from the same URLs and did a local concatenation using the same command, and it worked perfectly.
Is this because concat will not work if the files are not immediately available, as mentioned in https://trac.ffmpeg.org/wiki/Concatenate#Automaticallyappendingtothelistfile ?
If that's the case, how should I proceed? There's a terminal script provided in the above link, but I am on a Windows machine and, to be honest, I am not that good at bash scripting.
All files are audio files with the same bitrate and are in .m4a format.
This is the error message I receive:
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000278b64d4f40] stream 0, offset 0xc9b: partial file
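In case it helps, a minimal Java sketch of the obvious Windows-friendly workaround: download each remote file to a local temporary directory first, then run the same concat command against local files only. The URLs are the ones from the question, the class and variable names are made up, ffmpeg is assumed to be on the PATH, and error handling is kept to a minimum.

import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.List;

public class RemoteConcat {

    public static void main(String[] args) throws IOException, InterruptedException {
        List<String> urls = List.of(
                "http://remoteurl?fileName=20.m4a",
                "http://remoteurl?fileName=21.m4a",
                "http://remoteurl?fileName=22.m4a",
                "http://remoteurl?fileName=23.m4a");

        Path workDir = Files.createTempDirectory("concat");
        Path listFile = workDir.resolve("mylist.txt");
        StringBuilder list = new StringBuilder();

        int index = 0;
        for (String url : urls) {
            Path local = workDir.resolve("part" + index++ + ".m4a");
            try (InputStream in = URI.create(url).toURL().openStream()) {
                Files.copy(in, local, StandardCopyOption.REPLACE_EXISTING); // download the whole segment
            }
            list.append("file '").append(local).append("'\n");
        }
        Files.writeString(listFile, list.toString());

        // Same concat invocation as in the question, but with local files only.
        Process ffmpeg = new ProcessBuilder(
                "ffmpeg", "-f", "concat", "-safe", "0",
                "-i", listFile.toString(), "-c", "copy", "output.m4a")
                .inheritIO()
                .start();
        System.exit(ffmpeg.waitFor());
    }
}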
-
ffmpeg in:h264 out:yuv to stdout - data format?
27 February 2019, by Petr
I am (like many) trying to get a continuous series of still images out of the camera attached to a Raspberry Pi. I want to do this in Java for all the usual reasons, and am using a Runtime exec command to pipe the output of raspivid to the following ffmpeg command, then collecting the result via stdout. Note: xxx.h264 is a test file generated by the camera that does not play because there is no container, but I am getting images out, so half good.
ffmpeg -i xxx.h264 -vcodec rawvideo -r 2 -pix_fmt yuv420p -f nut -
I have some code displaying the frames, but they "march" across the display area from left to right, and there appears to be a growing amount of rubbish across the top of the images. I looked at the bytes it outputs by running the same command and redirecting it into a file, then using vi/xxd, and found that there is header material ("nut/multimedia container ...").
I am guessing that there is more metadata inserted by my ffmpeg command that I am failing to remove when processing the raw yuv420p data as described here: https://en.wikipedia.org/wiki/YUV#Y%E2%80%B2UV420sp_%28NV21%29_to_RGB_conversion_%28Android%29
For the life of me I cannot find the NUT documentation anywhere in a readable format and, anyway, it seems that is not what I should be looking for. Any pointers as to how I can recognise the frame boundaries in my byte stream?
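One way to make the frame boundaries trivial is to drop the container entirely: with -f rawvideo instead of -f nut, ffmpeg writes nothing to stdout but the concatenated frames, and for yuv420p each frame is exactly width * height * 3 / 2 bytes (a full-resolution Y plane followed by quarter-resolution U and V planes). A minimal Java sketch under those assumptions follows; the resolution is a guess and has to match the actual raspivid/ffmpeg output, and the class name is made up.

import java.io.IOException;
import java.io.InputStream;

public class RawFrameReader {

    public static void main(String[] args) throws IOException, InterruptedException {
        int width = 1280, height = 720;              // assumed resolution; must match the real stream
        int frameSize = width * height * 3 / 2;      // yuv420p: Y plane + U plane + V plane

        Process ffmpeg = new ProcessBuilder(
                "ffmpeg", "-i", "xxx.h264",
                "-r", "2", "-pix_fmt", "yuv420p",
                "-f", "rawvideo", "pipe:1")
                .redirectError(ProcessBuilder.Redirect.INHERIT)
                .start();

        byte[] frame = new byte[frameSize];
        int frames = 0;
        try (InputStream in = ffmpeg.getInputStream()) {
            while (readFully(in, frame)) {
                frames++;                            // hand "frame" to the YUV -> RGB conversion here
            }
        }
        System.out.println("Read " + frames + " frames");
        ffmpeg.waitFor();
    }

    // Reads exactly buf.length bytes; returns false on a clean end of stream.
    private static boolean readFully(InputStream in, byte[] buf) throws IOException {
        int off = 0;
        while (off < buf.length) {
            int n = in.read(buf, off, buf.length - off);
            if (n < 0) {
                if (off == 0) return false;          // no more frames
                throw new IOException("Truncated frame at end of stream");
            }
            off += n;
        }
        return true;
    }
}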