
Media (2)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
-
Carte de Schillerkiez
13 May 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (63)
-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound both on conventional computers (...)
On other sites (8908)
-
How to stack several videos using ffmpeg?
13 July 2019, by John
I have four cameras that are recording videos and want to stack all videos into one video file. The final video must be a 2x2 grid.
THE HARD PART is that sometimes cameras are not recording simultaneously.
Here you can see one situation of how the cameras are recording:
Camera 1 timeline: |recording....................................|
Camera 2 timeline: |stopped....|recording........................|
Camera 3 timeline: |recording.........|stopped....|recording.....|
Camera 4 timeline: |recording..................|stopped..........|
Explanation:
- Camera 1 is recording the whole time.
- Camera 2 starts recording later.
- Camera 3 records, then stops for a period of time, then records again.
- Camera 4 records for a period of time, then stops.
In the final video I want to synchronize each video by timestamps.
If a camera is stopped for a period of time, then its slot in the grid must show black for that time.
How can I do this?
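One possible approach, not from the original thread: if each camera's start and stop offsets relative to a common clock are known, pad the late-starting and early-stopping streams with black using ffmpeg's tpad filter, then tile the four streams with xstack. The file names and the 10 s / 15 s offsets below are placeholders, and the sketch assumes all four inputs share the same resolution.
# Minimal sketch: camera 2 starts 10 s late, camera 4 stops 15 s early (use your real offsets)
ffmpeg -y \
  -i cam1.mp4 -i cam2.mp4 -i cam3.mp4 -i cam4.mp4 \
  -filter_complex "[1:v]tpad=start_duration=10:color=black[c2];[3:v]tpad=stop_duration=15:color=black[c4];[0:v][c2][2:v][c4]xstack=inputs=4:layout=0_0|w0_0|0_h0|w0_h0[grid]" \
  -map "[grid]" -c:v libx264 -crf 23 grid.mp4
A gap in the middle of a recording (camera 3) would need its two segments concatenated with a generated black clip of the gap's length before stacking, and audio would need analogous handling with adelay/apad and amix.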
-
ffmpeg_kit_flutter operation not permitted for audio operations
25 August 2023, by Black Eyed Beans
I'm trying to trim an audio file using ffmpeg_kit_flutter but I keep getting the error:

audio/path/output.mp3: Operation not permitted.



This is the ffmpeg command that I'm using:


final cmd="-y -i \"$audioPath\" -ss $audioStartTime -to $audioEndTime -c:a libmp3lame $outPutName";



And I've also tried:


final cmd="-y -i \"$audioPath\" -ss $audioStartTime -to $audioEndTime -c copy $outPutName";



But the error is still the same.
I'm using the ffmpeg_kit_flutter_full_gpl package.
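The flags themselves look like a standard trim, so one way to narrow the problem down, a suggestion rather than a confirmed fix, is to try the equivalent plain ffmpeg invocation outside the app (paths and timestamps below are placeholders). If that works, the "Operation not permitted" error most likely comes from where output.mp3 is being written, for example a directory outside the app's writable storage, rather than from the command itself.
# Hypothetical desktop equivalent of the Dart-built command above (placeholder paths and timestamps)
ffmpeg -y -i input.mp3 -ss 00:00:05 -to 00:00:15 -c:a libmp3lame output.mp3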

-
Combining multiple image files into a video while using filter_complex to apply a watermark
14 December 2017, by Geuis
I'm trying to combine two ffmpeg operations into a single one.
Currently I have two sets of ffmpeg commands: the first generates a video from existing images, and the second runs that video through ffmpeg again to apply a watermark.
I'd like to see if it's possible to combine these into a single operation.
# Create the source video
ffmpeg -y \
-framerate 1/1 \
-i layer-%d.png \
-r 30 -vcodec libx264 -preset ultrafast -crf 23 -pix_fmt yuv420p \
output.mp4
# Apply the watermark and render the final output
ffmpeg -y \
-i output.mp4 \
-i logo.png \
-filter_complex "[1:v][0:v]scale2ref=40:40[a][b];[b][a]overlay=(80):(main_h-200-80)" \
final.mp4
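Since the intermediate output.mp4 is only consumed by the second command, the two steps can be merged into a single invocation by feeding both the image sequence and the logo to one filter graph. Below is a minimal sketch that reuses the filter and encoder settings above; only the [out] label and the explicit -map are new.
# Build the video from the image sequence and apply the watermark in one pass
ffmpeg -y \
  -framerate 1/1 \
  -i layer-%d.png \
  -i logo.png \
  -filter_complex "[1:v][0:v]scale2ref=40:40[logo][base];[base][logo]overlay=(80):(main_h-200-80)[out]" \
  -map "[out]" \
  -r 30 -vcodec libx264 -preset ultrafast -crf 23 -pix_fmt yuv420p \
  final.mp4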