
Other articles (62)
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first stable version of MediaSPIP.
Its official release date is 21 June 2013, announced here.
The zip file provided here contains only the MediaSPIP sources, in standalone form.
As with the previous version, all of the software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make further changes (...) -
MediaSPIP version 0.1 Beta
16 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP deemed "usable".
The zip file provided here contains only the MediaSPIP sources, in standalone form.
For a working installation, all of the software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make further changes (...) -
Writing a news item
21 June 2013
Present changes to your MediaSPIP, or news about your projects, through the news section.
In spipeo, the default MediaSPIP theme, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News item creation form: for a document of type "news item", the default fields are: publication date (customize the publication date) (...)
On other sites (3239)
-
Trying to concatenate 11 seconds of black video onto the end of a 5-minute video; getting "Non-monotonous DTS in output stream" errors
14 January 2020, by TidusWulf
I have a video shot on my smartphone at 3840x2160 @ 57.74 fps. Originally, it was exactly 5 minutes long. I replaced the audio with a music file that is 5 minutes 11 seconds long using
ffmpeg -i video.mp4 -i audio.mp3 -c copy -map 0:v:0 -map 1:a:0 output.mp4
I expected 11 seconds of black at the end, but instead the video output freezes on the last frame for 11 seconds. There's definitely something funky going on, because when I try to upload to YouTube it only sees the first 5 minutes; the last 10 seconds of audio get dropped. I tried making a second black clip with
ffmpeg -f lavfi -i color=c=black:s=uhd2160:r=57.74 -t 11 -pix_fmt yuv420p blk2.mp4
. When I try to concat the two files with
ffmpeg -f concat -safe 0 -i list.txt -c copy teacup6.mp4
I get a huge list of errors such as
[mp4 @ 00000166fda994c0] Non-monotonous DTS in output stream 0:0; previous: 27002920, current: 3728954; changing to 27002921. This may result in incorrect timestamps in the output file.
for what appears to be about 11 seconds' worth of frames, so basically the entire 11-second black clip. When I play it in VLC it goes black, but I suspect it's not actually processing/playing 11 seconds of good black video, because if I click along the track timeline, visual glitches start appearing in the canvas. Here is what ffmpeg tells me about my inputs and outputs (framerate, pixel format, etc.):
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000239e6cb2880] Auto-inserting h264_mp4toannexb bitstream filter
Input #0, concat, from 'list.txt':
Duration: N/A, start: -0.023021, bitrate: 72244 kb/s
Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 3840x2160, 71994 kb/s, SAR 1:1 DAR 16:9, 57.74 fps, 59 tbr, 90k tbn, 180k tbc
Metadata:
handler_name : VideoHandle
Stream #0:1(und): Audio: mp3 (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 250 kb/s
Metadata:
handler_name : SoundHandler
Output #0, mp4, to 'teacup5.mp4':
Metadata:
encoder : Lavf58.35.102
Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 3840x2160 [SAR 1:1 DAR 16:9], q=2-31, 71994 kb/s, 57.74 fps, 59 tbr, 90k tbn, 90k tbc
Metadata:
handler_name : VideoHandle
Stream #0:1(und): Audio: mp3 (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 250 kb/s
Metadata:
handler_name : SoundHandler
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Stream #0:1 -> #0:1 (copy)
Press [q] to stop, [?] for help
Alternatively, I tried to make a black 3840x2160 PNG file and create an 11-second slideshow. It didn't turn out any better. I tried a single image for 11 seconds at 57.74 fps; I also tried a looping slideshow of that one image, looping on every single frame, at 57.74 fps. I should note that whether I use a slideshow or a color generator, the file size for the 11 seconds of black comes out to about 160 KB. Strangely small in my opinion, but I was chalking it up to a good compression algorithm.
I tried to do this in DaVinci Resolve 16.1 instead, and the video wouldn't play. I also couldn't find a way to keep the same unusual 57.74 framerate with the free version, so it was undesirable anyway. I tried to re-process using HandBrake, but it threw an error. The HandBrake-processed output also gets stuck on the last frame of the video instead of going to black.
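One approach that sometimes sidesteps the concat/DTS problem entirely is to skip the separate black clip and pad the video stream in a single re-encoding pass with ffmpeg's tpad filter, so the muxer generates fresh, monotonic timestamps. This is a sketch only: the file names and x264 settings are assumptions, and re-encoding 4K video is slow.

```shell
# Pad the video with 11 s of appended black frames (tpad stop_mode=add),
# re-encoding the video once so all timestamps are regenerated cleanly;
# the 5:11 audio track is copied as-is.
ffmpeg -i video.mp4 -i audio.mp3 \
  -filter_complex "[0:v]tpad=stop_mode=add:stop_duration=11:color=black[v]" \
  -map "[v]" -map 1:a:0 -c:v libx264 -pix_fmt yuv420p -c:a copy output.mp4
```

If the concat demuxer must be used instead, dropping -c copy and re-encoding during the concat step usually avoids the non-monotonous DTS warnings for the same reason: stream copy stitches the original timestamps together, while a re-encode rebuilds them.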
-
FFmpeg extracts black image from H264 stream
8 June 2022, by massivemoisture
I have a C# application that receives an H264 stream through a socket. I want to continuously get the latest image from that stream.


Here's what I did with FFmpeg 5.0.1, just a rough sample to get ONE latest image; this is how I start FFmpeg:


var ffmpegInfo = new ProcessStartInfo(FFMPEG_PATH);
ffmpegInfo.RedirectStandardInput = true;
ffmpegInfo.RedirectStandardOutput = true;
ffmpegInfo.RedirectStandardError = true;
ffmpegInfo.UseShellExecute = false;

ffmpegInfo.Arguments = "-i pipe: -f h264 -pix_fmt bgr24 -an -sn pipe:";

ffmpegInfo.CreateNoWindow = true;

Process myFFmpeg = new Process();
myFFmpeg.StartInfo = ffmpegInfo;
myFFmpeg.EnableRaisingEvents = true;
myFFmpeg.Start();

var inStream = myFFmpeg.StandardInput.BaseStream;
FileStream baseStream = myFFmpeg.StandardOutput.BaseStream as FileStream;
myFFmpeg.BeginErrorReadLine();



Then I start a new thread to receive the stream through socket :


// inStream is "myFFmpeg.StandardInput.BaseStream" from the code block above
var t = Task.Run(() => ReceiveStream(inStream));



Next I read the output from FFmpeg :


byte[] decoded = new byte[Width * Height * 3];
int numBytesToRead = Width * Height * 3;
int numBytesRead = 0;

while (numBytesToRead > 0)
{
    // Read at the current offset so previously read bytes are not overwritten
    int n = baseStream.Read(decoded, numBytesRead, numBytesToRead);
    Console.WriteLine($"Read {n} bytes");
    if (n == 0)
    {
        break;
    }
    numBytesRead += n;
    numBytesToRead -= n;
}



Lastly, I use the ImageSharp library to save the decoded byte array as a JPEG file:

image.Save("test.jpeg", encoder);



However, test.jpeg always comes out as a black image. What did I do wrong?
Here's the stderr log that I got from ffmpeg :


 Duration: N/A, bitrate: N/A
 Stream #0:0: Video: h264 (High), yuv420p(tv, smpte170m/bt470bg/smpte170m, progressive), 1080x2256, 25 fps, 25 tbr, 1200k tbn
Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Incompatible pixel format 'bgr24' for codec 'libx264', auto-selecting format 'yuv444p'
[libx264 @ 0x11b9068f0] using cpu capabilities: ARMv8 NEON
[libx264 @ 0x11b9068f0] profile High 4:4:4 Predictive, level 5.0, 4:4:4, 8-bit
Output #0, h264, to 'pipe:':
 Metadata:
 encoder : Lavf59.16.100
 Stream #0:0: Video: h264, yuv444p(tv, smpte170m/bt470bg/smpte170m, progressive), 1080x2256, q=2-31, 25 fps, 25 tbn
 Metadata:
 encoder : Lavc59.18.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
frame= 1 fps=0.0 q=0.0 size= 0kB time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 56 fps=0.0 q=0.0 size= 0kB time=00:00:00.00 bitrate=N/A speed= 0x
frame= 87 fps= 83 q=28.0 size= 370kB time=00:00:01.16 bitrate=2610.6kbits/s speed=1.11x
frame= 118 fps= 75 q=28.0 size= 698kB time=00:00:02.40 bitrate=2381.4kbits/s speed=1.54x
frame= 154 fps= 75 q=28.0 size= 1083kB time=00:00:03.84 bitrate=2311.1kbits/s speed=1.86x
...



Thank you !


Edit: as suggested by @kesh, I have changed -f h264 to -f rawvideo; the arguments are now: -i pipe: -f rawvideo -pix_fmt bgr24 -an -sn pipe:


Here's the output of ffmpeg :


Input #0, h264, from 'pipe:':
 Duration: N/A, bitrate: N/A
 Stream #0:0: Video: h264 (High), yuv420p(tv, smpte170m/bt470bg/smpte170m, progressive), 1080x2256, 25 fps, 25 tbr, 1200k tbn
Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> rawvideo (native))
// About 9 of these "No accelerated colorspace..." message
[swscaler @ 0x128690000] [swscaler @ 0x1286a0000] No accelerated colorspace conversion found from yuv420p to bgr24.
Output #0, rawvideo, to 'pipe:':
 Metadata:
 encoder : Lavf59.16.100
 Stream #0:0: Video: rawvideo (BGR[24] / 0x18524742), bgr24(pc, gbr/bt470bg/smpte170m, progressive), 1080x2256, q=2-31, 1461888 kb/s, 25 fps, 25 tbn
 Metadata:
 encoder : Lavc59.18.100 rawvideo
frame= 1 fps=0.0 q=0.0 size= 0kB time=00:00:00.00 bitrate=N/A speed= 0x
// FFmpeg outputs no more log after this
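For comparison, when only one most-recent frame is needed, it can be simpler to let ffmpeg encode that frame to JPEG itself rather than hand-assembling raw BGR bytes in C#. A minimal sketch (the shell redirection and the -q:v value are assumptions, not part of the original setup):

```shell
# Decode the H.264 stream arriving on stdin, take a single video frame,
# and write it to stdout as one JPEG image.
ffmpeg -i pipe:0 -frames:v 1 -q:v 2 -f mjpeg pipe:1 > latest.jpg
```

In the C# program this maps onto the same pipe: arguments already used above, except StandardOutput is read to the end and saved directly as the .jpg, so no pixel-format or buffer-size bookkeeping is needed.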



-
ffmpeg command stops executing in the background after the application is killed
26 March 2018, by Amjad Khan
The FFmpeg commands execute and work well; this is implemented on Android.
But I am facing a problem: when the user kills the application, the command that is executing gets terminated.
I have created a background service, which is running in the background, but the command stops in the middle. Is there any way to handle this?
Code here:
ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
    @Override
    public void onFailure(String s) {
        Log.e(TAG, "FAILED with output : " + s);
    }

    @Override
    public void onSuccess(String s) {
        Log.e(TAG, "SUCCESS with output : " + s);
    }

    @Override
    public void onProgress(String s) {
        // This method stops getting called
    }

    @Override
    public void onStart() {
    }

    @Override
    public void onFinish() {
    }
});