
Other articles (67)
- The farm's regular Cron tasks
1 December 2010. Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of all the instances of the mutualised farm on a regular basis. Coupled with a system Cron on the farm's central site, this makes it possible to generate regular visits to the various sites and to prevent the tasks of rarely visited sites from being too (...)
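As a rough sketch of the system Cron mentioned above (the domain name is hypothetical, and the exact URL to call depends on how the farm is configured), a crontab entry on the central host could simply generate a visit to the central site every minute:
# Hypothetical crontab entry: visit the central mutualisation site every minute
# so that its Cron, and therefore gestion_mutu_super_cron, gets a chance to run.
* * * * * curl -fsS -o /dev/null "https://ferme.example.org/"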
- HTML5 audio and video support
13 April 2011. MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The player used by MediaSPIP was created specifically for it and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
- From upload to final video [standalone version]
31 January 2010. The path an audio or video document follows through SPIPMotion is divided into three distinct steps.
Upload and retrieval of information about the source video
First of all, you need to create a SPIP article and attach the "source" video document to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieving the technical information about the file's audio and video streams; generating a thumbnail: extracting a (...)
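For illustration only (this is not SPIPMotion's actual code, and the file names are assumptions), the two extra actions described above roughly correspond to commands such as:
# Retrieve the technical information about the file's audio and video streams
ffprobe -v error -show_format -show_streams source.mp4
# Generate a thumbnail by extracting a single frame (here taken 5 seconds in)
ffmpeg -ss 5 -i source.mp4 -frames:v 1 -q:v 2 thumbnail.jpg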
On other sites (8539)
- Looking for video libraries for editing video on Windows/iOS/Azure service
15 July 2015, by sanyam. Which library would be a good bet in terms of ...
1. Richness in features (I want to create a video from short clips and images with effects and filters). Think of the auto-generated videos created by Google Photos.
2. Extensibility - I would prefer using similar tech in the iOS and Windows apps. I might have to offload some processing to a web service on Azure (Windows or Linux).
3. Programmability - an API is preferred over a command line, as it gives more flexibility and better error handling.
Given all these criteria, what is my best bet? I've only looked at ffmpeg so far; is there a better alternative?
- ffmpeg video rendering command adding empty audio to end of video
13 January 2020, by Martin. I have an ffmpeg command which is supposed to take an mp3 file and an image file as inputs and render a music video with the image as the background, where the video's length is exactly that of the mp3 audio input.
I was using this command:
ffmpeg -loop 1 -framerate 2 -i "front.jpg" -i "audioInput.mp3" -vf "scale=2*trunc(iw/2):2*trunc(ih/2),setsar=1,format=yuv420p" -c:v libx264 -preset medium -tune stillimage -crf 18 -c:a aac -b:a 320k -shortest -vf scale=1920:1080 -pix_fmt yuv420p "videoOutput.mp4"
But the problem was that my audioInput.mp3 file is 1 minute 12 seconds long, and my output video was 1 minute 39 seconds, with an added couple seconds of silence at the end of the video.
I asked a question on Stack Overflow and it was closed and linked to this question. The accepted answer says that this command would not add extra seconds onto the end of the video:
ffmpeg -y -loop 1 -framerate 2 -i "front.jpg" -i "maintheme.mp3" -c:v libx264 -tune stillimage -c:a aac -b:a 320k -pix_fmt yuv420p -shortest -fflags +shortest -max_interleave_delta 100M "result2.mp4"
I tried using this command, and my output video was still 1 minute 14 seconds long.
I tried the command again with a different audio file (FLAC) and the video output again had 2 seconds of blank audio at the end. Is one of my flags causing extra space to be appended? Can I cut the video's length to be that of the audio file?
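One possible workaround, sketched here rather than taken from the accepted answer, is to stop relying on -shortest and instead cap the output with -t to the exact audio duration as reported by ffprobe (file names are those from the question):
# Read the audio duration in seconds, then limit the output to exactly that length
dur=$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 "audioInput.mp3")
ffmpeg -loop 1 -framerate 2 -i "front.jpg" -i "audioInput.mp3" -c:v libx264 -tune stillimage -c:a aac -b:a 320k -pix_fmt yuv420p -t "$dur" "videoOutput.mp4"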
- How to generate a waveform from a video & show it with the video
25 November 2016, by Salil. We are using Rails as a backend and AngularJS on the front end in my app, where we need to show a video and the audio waveform of that video.
We are using ’wavesurfer.js’ to show the waveform on the front-end side and ’node-pcm’ to generate PCM data from the video file on the back-end side.
This works as expected, but for some of the videos, when creating the waveform from the PCM data, we get a flat line instead of small sine waves.
Also, it takes too much time to show the waveform on every page reload. To overcome this issue we are planning to create the waveform image using ffmpeg:
ffmpeg -i 'https://s3.amazonaws.com/aadasdsadsadasdas/xyz.mp4' -filter_complex showwavespic -frames:v 1 output.png
This works fine, but it also takes too much time (of course, only once) to generate the image for a remote video (i.e. we store our videos on S3).
The problem with this is that I can't find any library to integrate the waveform image with the video.
Can someone suggest a better approach for this?
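One possible direction, sketched here under the assumption that the goal is to burn the waveform into the video itself rather than keep a separate image in sync: ffmpeg's showwaves filter can render the audio as a moving waveform and overlay it on the video in a single pass (xyz.mp4 is the file from the question):
# Draw a scrolling waveform from the audio track and overlay it along the bottom of the video
ffmpeg -i xyz.mp4 -filter_complex "[0:a]showwaves=s=1280x120:mode=line[wave];[0:v][wave]overlay=0:main_h-overlay_h[out]" -map "[out]" -map 0:a -c:a copy waveform_overlay.mp4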