
Other articles (66)
-
Sites built with MediaSPIP
2 May 2011. This page presents some of the sites running MediaSPIP.
You can of course add yours using the form at the bottom of the page.
-
HTML5 audio and video support
10 April 2011. MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer flash player is used as a fallback.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and audio both on conventional computers (...)
-
HTML5 audio and video support
13 April 2011. MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (6193)
-
FFMPEG images to video + overlay video
7 November 2018, by AK47. I am trying to make a 15-second video where the background layer is a video made up of 2 images; the first command below creates a 15-second video from the 2 images.
I chose a small framerate so it renders an mp4 quickly. I then overlay a webm video (which has transparency) over the images. The final video seems to keep the framerate of 2, but I would rather keep the 24 fps framerate of the webm video.
Is this possible? And is it also possible to turn the commands below into a single statement?
ffmpeg -loop 1 -framerate 2 -t 11 -i image1.png -loop 1 -framerate 2 -t 4 -i image2.png -filter_complex "[0][1]concat=n=2" backgroundvideo.mp4;
ffmpeg -i backgroundvideo.mp4 -c:v libvpx-vp9 -i overlayvideo.webm -filter_complex overlay newvid.mp4
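A rough sketch of a single invocation (untested here, reusing the file names from the question): loop both images at 24 fps, concatenate them, and overlay the webm in the same filtergraph, which should keep the overlay's 24 fps; forcing the libvpx-vp9 decoder on the webm input preserves its alpha channel.
# build the background from the two looped images and overlay the transparent webm in one pass
ffmpeg -loop 1 -framerate 24 -t 11 -i image1.png -loop 1 -framerate 24 -t 4 -i image2.png \
  -c:v libvpx-vp9 -i overlayvideo.webm \
  -filter_complex "[0][1]concat=n=2[bg];[bg][2]overlay" newvid.mp4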
-
How to find the audio channel layout of a video and then apply it to another video
3 January 2023, by someCO_OLguy. We have a video with an audio channel layout, let's say BL. We don't know the format. We need to find the layout of that video and apply it to a different video, assuming that each video only has 1 channel.


I have tried a lot, and I still can't find an answer that works.
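A hedged sketch of one way to do this with the ffprobe and ffmpeg command-line tools; the file names are placeholders, and the BL layout string is only the example from the question (whether a single channel name is accepted as a layout depends on the ffmpeg version).
# print the channel layout of the first audio stream (e.g. "mono", "stereo" or a single channel name)
ffprobe -v error -select_streams a:0 -show_entries stream=channel_layout -of default=noprint_wrappers=1:nokey=1 source_video.mp4
# apply that layout to the single audio channel of the other video, copying the video stream unchanged
ffmpeg -i other_video.mp4 -af "channelmap=map=0:channel_layout=BL" -c:v copy output.mp4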


-
How to add dynamic captions with animations on a video in a cross-platform Android & iOS app and export it as a final video? [closed]
29 April, by Danial Qsk. I'm building a cross-platform mobile app using Flutter (or React Native) where users can:


- Add dynamic text captions with animations over a video, similar to TikTok
- See these changes in real time during video playback
- Export the video with the animated captions burned into the output (e.g., MP4)


What I’ve implemented so far:
- Video playback using video_player in Flutter (or react-native-video).
- Captions rendered on a Canvas/UI layer during playback.


Problem: While I can overlay animated text in the UI, I can't figure out how to include it in the exported video. I'm considering using FFmpeg to burn in the captions, but I'm unclear how to integrate it with the animated UI overlays or how to render them into a final video on both Android and iOS.


What I want to achieve:
- An efficient way to render animated text overlays onto a video and export it as a final video file on both Android and iOS.
- Good performance for real-time playback and a smooth export.


What approaches or processing pipelines could achieve this?
(For example, rendering the UI layer to an offscreen video, integrating with FFmpeg, or using native rendering techniques.)
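One commonly suggested pipeline, sketched below rather than a verified implementation, is to export the animated caption layer to something FFmpeg can read and burn it in as a post-processing step; input.mp4, captions.webm and captions.ass are placeholder names, not files from the question.
# overlay a pre-rendered transparent caption track (e.g. the UI layer rendered offscreen to a VP9 webm with alpha)
ffmpeg -i input.mp4 -c:v libvpx-vp9 -i captions.webm -filter_complex "[0][1]overlay" -c:a copy output.mp4
# alternatively, burn an ASS subtitle file (timed fades and movement via libass) into the video
ffmpeg -i input.mp4 -vf "ass=captions.ass" -c:a copy output.mp4
On Android and iOS these commands would typically run through an FFmpeg wrapper such as FFmpegKit for Flutter or React Native.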