
Other articles (46)
-
The statuses of mutualisation instances
13 March 2010
For reasons of general compatibility between the mutualisation-management plugin and SPIP's original functions, instance statuses are the same as for any other object (articles, etc.); only their names in the interface differ slightly.
The possible statuses are: prepa (requested), which corresponds to an instance requested by a user (if the site had already been created in the past, it is switched to disabled mode); publie (validated), which corresponds to an instance validated by a (...)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player was created specifically for MediaSPIP: its look can be fully customised to match a chosen theme.
These technologies make it possible to deliver video and audio both to conventional computers (...)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The player has been created specifically for MediaSPIP and can easily be adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (8591)
-
How to set a background for subtitles in ffmpeg?
10 April 2017, by supermario
It is described here how to burn an SRT file into a video.
However, I want to put a semi-transparent background behind the subtitles so that the text can be read more easily. How can I do that?
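Something like the following is what I have in mind: a sketch (untested) that passes ASS style overrides through the subtitles filter's force_style option, where the BorderStyle and BackColour values are guesses on my part:
# Draw each subtitle line over a box; BorderStyle=4 is libass's box style
# and &H80000000 is half-transparent black in ASS &HAABBGGRR notation.
ffmpeg -i input.mp4 \
       -vf "subtitles=subs.srt:force_style='BorderStyle=4,BackColour=&H80000000'" \
       -c:a copy output.mp4
-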
timelapse images into a movie, 500 at a time
2 March 2017, by molly78
I am trying to make a script to turn a bunch of timelapse images into a movie using ffmpeg.
The latest problem is how to loop through the images in batches of, say, 500.
There could be 100 images from the day, or there could be 5000 images.
The reason for breaking this apart is that the encode otherwise runs out of memory.
Afterwards I would need to cat the parts with MP4Box to join them all together...
I am entirely new to bash, but not entirely new to programming.
What I think needs to happen is this:
1) read in the folder's contents, as the images may not be consecutively named
2) send ffmpeg a list of 500 at a time to process (https://trac.ffmpeg.org/wiki/Concatenate)
2b) while you're looping through this, keep a counter of how many loops you've done
3) use the number of loops to create the MP4Box cat command line to join them all at the end.
The basic script that works if there are only, say, 500 images is:
#!/bin/bash
dy=$(date '+%Y-%m-%d')
ffmpeg -framerate 24 -s hd1080 -pattern_type glob -i "/mnt/cams/Camera1/$dy/*.jpg" -vcodec libx264 -pix_fmt yuv420p Cam1-"$dy".mp4
MP4Box's cat command looks like:
MP4Box -cat Cam1-$dy7.mp4 -cat Cam1-$dy6.mp4 -cat Cam1-$dy5.mp4 -cat Cam1-$dy4.mp4 -cat Cam1-$dy3.mp4 -cat Cam1-$dy2.mp4 -cat Cam1-$dy1.mp4 "Cam1 - $dy1 to $dy7.mp4"
Needless to say, help is immensely appreciated for my project.
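Here is a rough sketch of what I imagine the batching loop could look like (untested; piping each batch through ffmpeg's image2pipe input and the part-file names are my assumptions):
#!/bin/bash
# Sketch: encode the day's JPEGs 500 at a time by piping each batch into
# ffmpeg's image2pipe input, then join the parts with MP4Box.
dy=$(date '+%Y-%m-%d')
batch=500
part=0
files=(/mnt/cams/Camera1/"$dy"/*.jpg)   # the glob sorts lexically

for ((start = 0; start < ${#files[@]}; start += batch)); do
    chunk=("${files[@]:start:batch}")
    # Pipe the raw JPEG bytes in; image2pipe decodes them as frames.
    cat "${chunk[@]}" | ffmpeg -framerate 24 -f image2pipe -vcodec mjpeg -i - \
        -s hd1080 -vcodec libx264 -pix_fmt yuv420p "Cam1-$dy-part$part.mp4"
    part=$((part + 1))
done

# Build one MP4Box command from however many parts were produced.
args=()
for ((p = 0; p < part; p++)); do
    args+=(-cat "Cam1-$dy-part$p.mp4")
done
MP4Box "${args[@]}" "Cam1-$dy-full.mp4"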
-
ffmpeg and gnu parallel
16 August 2013, by souvik
My work requires me to encode a few thousand movies in a few days. Each movie needs to be encoded in 3 different formats. I use ffmpeg to output these formats in parallel with a single read of the input source, as detailed here: http://ffmpeg.org/trac/ffmpeg/wiki/Creating%20multiple%20outputs
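That page's single-input, multiple-output pattern looks roughly like this (the codecs and file names below are illustrative, not my exact commands):
# One read of the source feeding three encodes; each output file takes
# the options that precede it (the three formats shown are assumptions).
ffmpeg -i source.mpg \
    -vcodec libx264 -b:v 500K -acodec libfdk_aac -ab 128k out_hi.mp4 \
    -vcodec libx264 -b:v 250K -acodec libfdk_aac -ab 96k out_lo.mp4 \
    -vcodec libvpx -b:v 500K -acodec libvorbis -ab 128k out.webm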
In addition, I am using GNU Parallel to encode multiple video files in parallel. We have four blade servers of different configurations (48, 32, 16 and 16 cores) encoding videos in parallel. Ideally, we should be able to encode 112 videos in parallel.
However, encoding seems to complete faster on the machines with fewer cores: 16 encodes finish on the 16-core servers in around 4 hours, while 48 encodes take close to 10 hours on the 48-core system. What could be the bottleneck? A typical encode command is as follows:
ffmpeg -i sample.mpg -y -vcodec libx264 -vprofile baseline -level 30 -acodec libfdk_aac -ab 128k -ac 2 -b:v 500K -threads 1 encoded/sample_enc.mp4
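The jobs are dispatched roughly like this (the hostnames and the exact GNU Parallel invocation here are a sketch, not my production command):
# One job slot per core on each blade; --trc transfers the input file,
# returns the output, and cleans up (hostnames are placeholders).
parallel --sshlogin 48/blade1,32/blade2,16/blade3,16/blade4 \
    --trc {.}_enc.mp4 \
    'ffmpeg -i {} -y -vcodec libx264 -vprofile baseline -level 30 \
     -acodec libfdk_aac -ab 128k -ac 2 -b:v 500K -threads 1 {.}_enc.mp4' \
    ::: *.mpg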
Any pointers highly appreciated. Thanks!