
Media (2)
-
Example of action buttons for a collaborative collection
27 February 2013, by
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013, by
Updated: February 2013
Language: English
Type: Image
Other articles (57)
-
MediaSPIP v0.2
21 June 2013, by
MediaSPIP 0.2 is the first stable version of MediaSPIP. Its official release date is 21 June 2013, as announced here.
The zip file provided here contains only the MediaSPIP sources in the standalone version.
As with the previous version, all software dependencies must be installed manually on the server.
If you wish to use this archive for a farm-mode installation, you will also need to make further modifications (...)
-
Making files available
14 April 2011, by
By default, on initialisation, MediaSPIP does not let visitors download files, whether originals or the results of their transformation or encoding; it only allows them to be viewed.
However, it is possible and easy to give visitors access to these documents in various forms.
All of this is handled on the template configuration page: go to the channel's administration area and choose in the navigation (...)
-
MediaSPIP version 0.1 Beta
16 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the MediaSPIP sources in the standalone version.
For a working installation, all software dependencies must be installed manually on the server.
If you wish to use this archive for a farm-mode installation, you will also need to make further modifications (...)
On other sites (8507)
-
How do video editors show a real-time preview of videos? [closed]
3 June 2024, by SWIK
I am trying to create a simple video editor that combines two videos by layering one over the other. I can do this easily with ffmpeg, but I am not sure how to show a preview before producing the final video. How do video editors display a preview without rendering the output? I am planning to build a React application.
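One possible approach (a sketch, not from the original question): editors decode and composite frames in memory and only encode on export; for a quick look at the composited result, ffplay can render the same overlay filter graph live instead of writing a file. This assumes ffplay is installed and on PATH, and main.mp4/overlay.mp4 are hypothetical file names; a browser-based React editor would instead composite decoded frames onto a canvas.

import java.io.IOException;

public class OverlayPreview {
    public static void main(String[] args) throws IOException {
        // Play the composited result live via the lavfi device: both clips are
        // opened as movie sources and layered with the overlay filter.
        new ProcessBuilder(
                "ffplay", "-f", "lavfi",
                "movie=main.mp4[bg];movie=overlay.mp4[fg];[bg][fg]overlay=10:10[out0]")
                .inheritIO()   // show ffplay's window and log output directly
                .start();
    }
}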


-
How to use FFmpeg to send images via RTMP using ProcessBuilder
13 May 2022, by ljnoah
I have a callback function that hands me frames as bytes, which I would like to pass to ffmpeg so they are written to an RTMP URL, but I don't have much experience with ffmpeg and so far I have not found an example of how to do this. Essentially, I would like to know whether I can take the byte array FrameData, which holds the images I am receiving, and pass it to ffmpeg via ProcessBuilder so the frames are streamed to a server.


private byte[] FrameData = new byte[384 * 288 * 4];
private final IFrameCallback mIFrameCallback = new IFrameCallback() {
    @Override
    public void onFrame(final ByteBuffer frameData) {
        frameData.clear();
        frameData.get(FrameData, 0, frameData.capacity());
        // "-vcodec" was missing its dash, and the rawvideo demuxer needs the
        // frame size ("-s") to interpret the bytes arriving on stdin.
        ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y", "-f", "rawvideo", "-vcodec", "rawvideo", "-pix_fmt", "bgr24",
                "-s", "384x288",
                "-r", "25",
                "-i", "-",
                "-c:v", "libx264",
                "-pix_fmt", "yuv420p",
                "-preset", "ultrafast",
                "-f", "flv",
                "rtmp://192.168.0.13:1935/live/test");
        Log.e(TAG, "mIFrameCallback: onFrame------");
        try {
            // inheritIO() ties stdin to the console, so no frame bytes are ever
            // written to ffmpeg, and waitFor() blocks the camera callback.
            pb.inheritIO().start().waitFor();
        } catch (InterruptedException | IOException e) {
            e.printStackTrace();
        }
    }
};



This callback gives me the frames from my camera on the fly and writes them to FrameData, which I can compress to a bitmap if needed. The current attempt isn't working, as I have no idea how to pass my byte array to ffmpeg so the camera frames stored in the FrameData buffer are pushed via RTMP/RTSP to my server IP. In Python I would use a similar approach, like this:


import subprocess
import numpy as np

fps = 25
width = 224
height = 224
command = ['ffmpeg', '-y', '-f', 'rawvideo', '-vcodec', 'rawvideo', '-pix_fmt', 'bgr24',
           '-s', "{}x{}".format(width, height),
           '-r', str(fps),
           '-i', '-',
           '-c:v', 'libx264',
           '-pix_fmt', 'yuv420p',
           '-preset', 'ultrafast',
           '-f', 'flv',
           'rtmp://192.168.0.13:1935/live/test']
# ffmpeg is started once; every frame is written to the same stdin pipe.
p = subprocess.Popen(command, stdin=subprocess.PIPE)
while True:
    frame = np.random.randint(0, 255, size=(224, 224, 3))
    frame = frame.astype(np.uint8)
    p.stdin.write(frame.tobytes())



I really don't understand how to write my byte arrays to ffmpeg as I would in the Python example above.
What I tried was this:


private byte[] FrameData = new byte[384 * 288 * 4];
String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
private final IFrameCallback mIFrameCallback = new IFrameCallback() {
    @RequiresApi(api = Build.VERSION_CODES.O)
    @Override
    public void onFrame(final ByteBuffer frameData) {
        frameData.clear();
        frameData.get(FrameData, 0, frameData.capacity());
        ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y", "-f", "rawvideo", "-vcodec", "rawvideo", "-pix_fmt", "bgr24",
                "-s", "384x288",
                "-r", "25",
                "-i", "-",
                "-c:v", "libx264",
                "-pix_fmt", "yuv420p",
                "-preset", "ultrafast",
                "-f", "flv",
                "rtmp://192.168.0.13:1935/live/test");
        try {
            Log.e(TAG, "mIFrameCallback: onFrame------");
            // Note: the no-argument redirectInput()/redirectError() are getters
            // and do not change where the process streams go.
            pb.redirectInput();
            pb.redirectError();
            Log.e(TAG, "frame data check 1");
            Process p = pb.start();
            Log.e(TAG, "frame data check 2");
            // A new ffmpeg process is spawned for every frame, and the output
            // stream is neither flushed nor closed.
            p.getOutputStream().write(FrameData);
            Log.e(TAG, "frame data check 3");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
};



Unfortunately, there are no results whatsoever. It appears that the try/catch block is not executed, and I am not even sure whether this is the right way to send bytes over an RTMP stream.


Edit: I have fixed the indicated issue with the ProcessBuilder being called twice and logged the calls made inside the code; it only gets as far as:
Log.e(TAG, "frame data check 1");
I am still not sure whether this is the right way to write an array of bytes to ffmpeg for RTMP streaming.
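For contrast, here is a minimal sketch of the pattern the Python snippet uses, ported to Java: start ffmpeg once, keep its stdin open, and write every frame to the same pipe. The class and method names are hypothetical, and "-pix_fmt", "bgra" is an assumption based on the 4-bytes-per-pixel buffer above (a 3-byte buffer would be bgr24):

import java.io.IOException;
import java.io.OutputStream;

public class RtmpPusher {
    private Process ffmpegProcess;
    private OutputStream ffmpegStdin;

    // ffmpegPath would be the binary returned by Loader.load(...) above.
    public void start(String ffmpegPath) throws IOException {
        ProcessBuilder pb = new ProcessBuilder(ffmpegPath, "-y",
                "-f", "rawvideo", "-pix_fmt", "bgra",      // assumption: 4 bytes per pixel
                "-s", "384x288", "-r", "25",
                "-i", "-",                                 // raw frames arrive on stdin
                "-c:v", "libx264", "-pix_fmt", "yuv420p", "-preset", "ultrafast",
                "-f", "flv", "rtmp://192.168.0.13:1935/live/test");
        pb.redirectError(ProcessBuilder.Redirect.INHERIT); // drain ffmpeg's log so the pipe never blocks
        ffmpegProcess = pb.start();
        ffmpegStdin = ffmpegProcess.getOutputStream();
    }

    // Called from onFrame(): writes one complete raw frame per call.
    public void pushFrame(byte[] frame) throws IOException {
        ffmpegStdin.write(frame);
        ffmpegStdin.flush();
    }

    public void stop() throws IOException, InterruptedException {
        ffmpegStdin.close();     // EOF lets ffmpeg flush and end the stream
        ffmpegProcess.waitFor();
    }
}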

-
ffmpeg: md5 hashes of m3u8 playlists generated from the same input video with different segment durations (after applying a video filter) don't match
15 July 2020, by Saurabh P Bhandari
Here are a few commands I am using to convert and resize a video in mp4 format into an m3u8 playlist.


For a given input video (mp4 format), generate multiple video-only segments with a segment duration of 30s (the audio is extracted separately):


ffmpeg -loglevel error -i input.mp4 -dn -sn -an -c:v copy -bsf:v h264_mp4toannexb -copyts -start_at_zero -f segment -segment_time 30 30%03d.mp4 -dn -sn -vn -c:a copy audio.aac



Apply a video filter (in this case scaling) to each segment and convert it to m3u8 format:


ls 30*.mp4 | parallel 'ffmpeg -loglevel error -i {} -vf scale=-2:144 -hls_list_size 0 {}.m3u8'



Store the list of generated m3u8 files in list.txt, one file 'segment-name.m3u8' entry per line:


for f in 30*.m3u8; do echo "file '$f'" >> list.txt; done



Using the concat demuxer, combine all the segment files (which are in m3u8 format) and the audio to get one final m3u8 playlist pointing to segments of 10s duration.


ffmpeg -loglevel error -f concat -i list.txt -i audio.aac -c copy -hls_list_size 0 -hls_time 10 output_30.m3u8




I can change the segment duration in the first step from 30s to 60s and compare the md5 of the final m3u8 playlists generated in the two cases using this command:


ffmpeg -loglevel error -i <input m3u8 playlist> -f md5 -



The md5 hashes of the output files differ, i.e. the video streams of output_30.m3u8 and output_60.m3u8 are not the same.

Can anyone elaborate on this?


(I expected the md5 to be the same)
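A plausible explanation (my reading, not confirmed in the post): the scaling step re-encodes every segment with libx264 independently, so different segment boundaries change keyframe placement and rate-control decisions, and the two pipelines legitimately produce different bitstreams; a bit-exact md5 match should not be expected after lossy re-encoding. One way to see where the decoded streams diverge is to compare per-frame hashes with ffmpeg's framemd5 muxer, sketched here in Java to match the other examples on this page (the equivalent shell command is ffmpeg -i output_30.m3u8 -map 0:v -f framemd5 -):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class FrameHashes {
    // Prints one MD5 line per decoded video frame of the given playlist,
    // so output_30.m3u8 and output_60.m3u8 can be compared frame by frame.
    static void printFrameHashes(String playlist) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(
                "ffmpeg", "-loglevel", "error", "-i", playlist,
                "-map", "0:v", "-f", "framemd5", "-")
                .redirectErrorStream(true)
                .start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }
        p.waitFor();
    }

    public static void main(String[] args) throws Exception {
        printFrameHashes("output_30.m3u8");
        printFrameHashes("output_60.m3u8");
    }
}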