
Media (1)
-
Géodiversité
9 September 2011
Updated: August 2018
Language: French
Type: Text
Other articles (97)
-
Videos
21 April 2011
Like "audio" documents, MediaSPIP displays videos whenever possible using the HTML5 video tag.
One drawback of this tag is that it is not correctly recognized by some browsers (Internet Explorer, to name one) and that each browser only natively supports certain video formats.
Its main advantage is that videos are handled natively by the browser, which removes the need for Flash and (...) -
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
Emballe médias: what is it for?
4 February 2011
This plugin is designed to manage sites for publishing documents of all types.
It creates "media" items, namely: a "media" item is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image, or text; only one document can be linked to a given "media" article;
On other sites (9855)
-
ffmpeg video rendering command adding empty audio to end of video
13 January 2020, by Martin
I have an ffmpeg command which is supposed to take an mp3 file and an image file as inputs and render a music video with the image as the background, where the video length exactly matches the length of the mp3 audio input.
I was using this command:
ffmpeg -loop 1 -framerate 2 -i "front.jpg" -i "audioInput.mp3" -vf "scale=2*trunc(iw/2):2*trunc(ih/2),setsar=1,format=yuv420p" -c:v libx264 -preset medium -tune stillimage -crf 18 -c:a aac -b:a 320k -shortest -vf scale=1920:1080 -pix_fmt yuv420p "videoOutput.mp4"
But the problem was that my audioInput.mp3 file is 1 minute 12 seconds long, while my output video was 1 minute 39 seconds, with a couple of extra seconds of silence at the end of the video.
I asked a question on Stack Overflow and it was closed/linked to this question. The accepted answer says that this command would not add extra seconds onto the end of the video:
ffmpeg -y -loop 1 -framerate 2 -i "front.jpg" -i "maintheme.mp3" -c:v libx264 -tune stillimage -c:a aac -b:a 320k -pix_fmt yuv420p -shortest -fflags +shortest -max_interleave_delta 100M "result2.mp4"
I tried using this command, and my output video was still 1 minute 14 seconds long.
I tried the command again with a different audio file (flac) and the video output again had 2 seconds of blank audio at the end. Is one of my flags causing extra space to be appended? Can I cut the video's length so that it matches the audio file?
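One possible workaround, not from the original thread: since the mp3 duration is known (1 minute 12 seconds, roughly 72 seconds), the output can be capped explicitly with -t instead of relying on -shortest alone. A hedged sketch reusing the file names from the question; the exact duration should come from ffprobe rather than being hard-coded:
ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 "audioInput.mp3"
ffmpeg -y -loop 1 -framerate 2 -i "front.jpg" -i "audioInput.mp3" -c:v libx264 -tune stillimage -c:a aac -b:a 320k -vf "scale=1920:1080,setsar=1" -pix_fmt yuv420p -t 72 "videoOutput.mp4"
The first command prints the decoded audio duration in seconds; passing that value to -t limits the output to that duration regardless of any trailing encoder padding.
-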
How to generate a waveform from a video & show it with the video
25 November 2016, by Salil
We are using Rails as a backend and AngularJS on the front end in an app where we need to show a video and the audio waveform for that video.
We are using 'wavesurfer.js' to show the waveform on the front end and 'node-pcm' to generate PCM from the video file on the back end.
This works as expected, but for some videos, when creating the waveform from the PCM data, we get a flat line instead of small waves.
It also takes too long to show the waveform on every page reload. To overcome this, we are planning to create a waveform image using ffmpeg:
ffmpeg -i 'https://s3.amazonaws.com/aadasdsadsadasdas/xyz.mp4' -filter_complex showwavespic -frames:v 1 output.png
This works fine, but it also takes a long time (of course only once) to generate the image for a remote video (we are storing the videos on S3).
The problem is that I can't find any library to integrate the waveform image with the video.
Can someone suggest a better approach for this?
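A hedged sketch of one possible approach, not from the original thread: generate the waveform image once, at upload time, and store it on S3 next to the video, so page loads only fetch a static PNG instead of re-running ffmpeg or node-pcm. The size and color values below are illustrative, and xyz_waveform.png is a hypothetical output name:
ffmpeg -i "https://s3.amazonaws.com/aadasdsadsadasdas/xyz.mp4" -filter_complex "showwavespic=s=1920x240:colors=0x3399ff" -frames:v 1 "xyz_waveform.png"
The front end can then overlay the cached image on or under the player; whether wavesurfer.js is still needed depends on whether interactive seeking on the waveform is required.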
-
What's the ideal encoding for rendering an immediate video (e.g. preview) while waiting for the high-quality video to render? [on hold]
20 April 2018, by allanbreyes
I'm working on an application that renders an MLT (OSS non-linear video-editing engine) project on a web server, using ffmpeg under the hood. While waiting for the high-quality render to finish (several minutes), I need to produce a "best-effort" render in parallel so that users can see the results immediately (i.e. in real time). What encoding works best for this?
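A hedged sketch of a typical fast preview pass, assuming the same source can be rendered twice in parallel; the resolution, CRF, and file names are illustrative choices, not values from the question:
ffmpeg -i "project_source.mp4" -c:v libx264 -preset ultrafast -crf 30 -vf scale=-2:360 -c:a aac -b:a 96k -movflags +faststart "preview.mp4"
The ultrafast preset trades compression efficiency for encoding speed, and the reduced resolution and higher CRF keep the preview cheap, while the high-quality render (slower preset, lower CRF, full resolution) continues in parallel; segmenting the preview (e.g. HLS) instead of writing a single MP4 would additionally let playback start before the preview encode finishes.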