
Other articles (63)
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first stable version of MediaSPIP.
Its official release date is 21 June 2013 and it is announced here.
The zip file provided here contains only the MediaSPIP sources, in the standalone version.
As with the previous version, all of the software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)
-
MediaSPIP version 0.1 Beta
16 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the MediaSPIP sources, in the standalone version.
For a working installation, all of the software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.
On other sites (9923)
-
FFMPEG best options for playing several files over a network
7 July 2019, by Manity
I'm using a WPF ffmpeg control wrapper that uses the ffmpeg library. With this wrapper I load 4 videos at the same time, stored on network locations. The files are opened 4 at a time and played; the user can then move to the next 4 files, which are loaded and played at the same time, and so on.
Ignoring the wrapper, my question is specifically about which properties or protocols I can experiment with or adjust in FFmpeg to help with playing 4 video files at a time, repeatedly.
The files are mp4 files stored on a network server that can be busy, so I need to deal with latency on load and playback and stop the loading of videos from timing out. The videos are not streamed; they are mp4 files stored on a network that can be busy.
For example, I looked at the async option in FFmpeg ("Asynchronous data filling wrapper for input stream"). I can't really find a lot of info on this and have no idea whether it would improve things or not. So what other features or protocols can I try with ffmpeg to improve reliability for this use case?
Any advice appreciated.
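For what it's worth, here is a sketch of ffmpeg-level input options that could be worth experimenting with; the values and the file path are placeholders, and whether any of them helps will depend on how the wrapper opens its inputs:

ffmpeg -probesize 10M -analyzeduration 10M -rw_timeout 15000000 -i "async:video1.mp4" -f null -

probesize and analyzeduration make the initial probe more tolerant of a slow share, rw_timeout (in microseconds) bounds how long a single read may block, and the async: prefix is the "asynchronous data filling wrapper" mentioned above, which prefetches input data on a background thread. This is something to benchmark rather than a known fix.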
-
Does a track run in a fragmented MP4 have to start with a key frame?
18 January 2021, by stevendesu
I'm ingesting an RTMP stream and converting it to a fragmented MP4 file in JavaScript. It took a week of work but I'm almost finished with this task. I'm generating a valid ftyp atom, moov atom, and moof atom, and the first frame of the video actually plays (with audio) before it goes into infinite buffering, with no errors listed in chrome://media-internals.
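For readers unfamiliar with the container, each of these atoms (boxes) is just a 32-bit big-endian size, a four-character type, and a payload. A minimal sketch of serializing an ftyp box in JavaScript, with an illustrative brand list rather than the question's actual values, might look like this:

// Minimal ftyp serializer: [size][type][major brand][minor version][compatible brands]
function buildFtyp() {
  const brands = ['isom', 'iso5', 'dash', 'mp42'];       // illustrative brand list
  const size = 8 + 4 + 4 + 4 * brands.length;            // box header + major brand + minor version + brands
  const buf = new Uint8Array(size);
  const view = new DataView(buf.buffer);
  const writeAscii = (s, off) => { for (let i = 0; i < 4; i++) buf[off + i] = s.charCodeAt(i); };
  view.setUint32(0, size);                               // box size, big-endian
  writeAscii('ftyp', 4);                                 // box type
  writeAscii(brands[0], 8);                              // major brand
  view.setUint32(12, 0);                                 // minor version
  brands.forEach((b, i) => writeAscii(b, 16 + 4 * i));   // compatible brands
  return buf;
}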



Plugging the video into ffprobe, I get an error similar to:

[mov,mp4,m4a,3gp,3g2,mj2 @ 0x558559198080] Failed to add index entry
 Last message repeated 368 times
[h264 @ 0x55855919b300] Invalid NAL unit size (-619501801 > 966).
[h264 @ 0x55855919b300] Error splitting the input into NAL units.




This led me on a massive hunt for data alignment issues or invalid byte offsets in my tfhd and trun atoms; however, no matter where I looked or how I sliced the data, I couldn't find any problems in the moof atom.


I then took the original FLV file and converted it to an MP4 in ffmpeg with the following command:


ffmpeg -i ~/Videos/rtmp/big_buck_bunny.flv -c copy -ss 5 -t 10 -movflags frag_keyframe+empty_moov+faststart test.mp4




I opened both the MP4 I was creating and the MP4 output by ffmpeg in an atom parsing tool and compared the two:





The first thing that jumped out at me was that the ffmpeg-generated file has multiple video samples per moof. Specifically, every moof started with 1 key frame, then contained all difference frames until the next key frame (which was used as the start of the following moof atom).


Contrast this with how I'm generating my MP4. I create a moof atom every time an FLV VIDEODATA packet arrives. This means my moof may not contain a key frame (and usually doesn't).


Could this be why I'm having trouble? Or is there something else I'm missing?
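If the moof-per-VIDEODATA-packet approach does turn out to be the culprit, a minimal sketch of the keyframe-aligned grouping that ffmpeg's output exhibits might look like this (emitFragment and the frame shape are hypothetical, standing in for whatever code writes the moof + mdat pair):

// Buffer incoming samples and only cut a new fragment when the next key frame arrives,
// so every moof starts with a key frame.
let pendingFrames = [];

function onVideoData(frame) {          // frame: { isKeyFrame, data, timestamp } (hypothetical shape)
  if (frame.isKeyFrame && pendingFrames.length > 0) {
    emitFragment(pendingFrames);       // write one moof + mdat for the buffered group
    pendingFrames = [];
  }
  pendingFrames.push(frame);
}

function onStreamEnd() {
  if (pendingFrames.length > 0) emitFragment(pendingFrames);  // flush the final partial group
}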



The video files in question can be downloaded here:






Another issue I noticed was ffmpeg's prolific use of base_data_offset in the tfhd atom. However, when I tried tracking the total number of bytes appended and setting the base_data_offset myself, I got an error in Chrome along the lines of "MSE doesn't support base_data_offset". Per the ISO/IEC 14496-12 spec:




If not provided, the base-data-offset for the first track in the movie fragment is the position of the first byte of the enclosing Movie Fragment Box, and for second and subsequent track fragments, the default is the end of the data defined by the preceding fragment.





This wording leads me to believe that the data_offset in the first trun atom should be equal to the size of the moof atom, and the data_offset in the second trun atom should be 0 (0 bytes from the end of the data defined by the preceding fragment). However, when I tried this I got an error that the video data couldn't be parsed. What did lead to data that could be parsed was the length of the moof atom plus the total length of the first track (as if the base offset were the first byte of the enclosing moof box, same as the first track).
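To make the arithmetic concrete, here is a sketch of the two offset interpretations described above; the sizes are placeholders, and depending on where a writer measures from, the 8-byte mdat header may also need to be added:

// Placeholder sizes for one fragment containing two tracks.
const moofSize = 1000;          // total byte length of the moof box
const track1DataLength = 4000;  // bytes of track 1 sample data in the mdat

// Reading of the spec quoted above: second trun measured from the end of track 1's data.
const firstTrunOffsetExpected = moofSize;
const secondTrunOffsetExpected = 0;

// What actually parsed, per the observation above: both truns measured from the start of the moof box.
const firstTrunOffsetWorking = moofSize;
const secondTrunOffsetWorking = moofSize + track1DataLength;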

-
passing options in execFileSync fails for ffmpeg
17 October 2018, by Amin Baig
I am trying to write a small script to automate the creation of video files. I have it almost working but am stuck at one part. Following is my Node.js code:
const { execFileSync } = require('child_process');
let str1 = "-c:v libvpx -i sourceVideos/a1.mkv -c:v libvpx -i sourceVideos/a2.mkv -c:v libvpx -i sourceVideos/a3.mkv";
let str2 = "[1]setpts=PTS+5.00/TB[a2];[2]setpts=PTS+10.00/TB[a3];[0][a2]overlay[o2];[o2][a3]overlay";
let outFile = 'validout.mp4';
const masterStream = execFileSync('ffmpeg', [str1, '-filter_complex', str2, outFile]);
console.log('All processing completed');
The above code represents this ffmpeg command to create a video from multiple videos:
ffmpeg -c:v libvpx -i sourceVideos/a1.mkv -c:v libvpx -i sourceVideos/a2.mkv -c:v libvpx -i sourceVideos/a3.mkv -filter_complex "[1]setpts=PTS+5.00/TB[a2];[2]setpts=PTS+10.00/TB[a3];[0][a2]overlay[o2];[o2][a3]overlay" validationout.mp4
So I have placed the args in str2 and the options/inputs in str1. The problem is that when I pass the inputs with their options in str1 and place it in my execFileSync command, it's not parsed by the command. I have also tested for confirmation: if I pass the options in the following format, it works in the Node.js script:
//version 1 with separated arguments
const masterStream = execFileSync('ffmpeg', ['-c:v', 'libvpx', '-i', 'sourceVideos/a1.mkv', '-c:v', 'libvpx', '-i', 'sourceVideos/a2.mkv', '-c:v', 'libvpx', '-i', 'sourceVideos/a3.mkv', '-filter_complex', str2, outFile]);
My question is: how can I pass the options/inputs to execFileSync in str1 so that they can be executed?
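A likely explanation, offered as a sketch rather than a verified answer: execFileSync does not invoke a shell, so each element of the argument array reaches ffmpeg as one literal argument, and the whole of str1 therefore arrives as a single unrecognized option. Splitting str1 into tokens first reproduces the working "version 1" form (a naive whitespace split, which assumes no path contains spaces):

// execFileSync passes each array element to ffmpeg verbatim, with no shell word-splitting.
const { execFileSync } = require('child_process');

const str1 = "-c:v libvpx -i sourceVideos/a1.mkv -c:v libvpx -i sourceVideos/a2.mkv -c:v libvpx -i sourceVideos/a3.mkv";
const str2 = "[1]setpts=PTS+5.00/TB[a2];[2]setpts=PTS+10.00/TB[a3];[0][a2]overlay[o2];[o2][a3]overlay";
const outFile = 'validout.mp4';

const args = [...str1.split(/\s+/), '-filter_complex', str2, outFile];  // tokenize the option string
const masterStream = execFileSync('ffmpeg', args);
console.log('All processing completed');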