
Other articles (9)
-
Possible deployments
31 January 2010 — Two types of deployment are possible, depending on two aspects: the installation method chosen (standalone or as a farm); the expected number of daily encodings and the expected level of traffic.
Video encoding is a heavy process that consumes a great deal of system resources (CPU and RAM), so all of this must be taken into account. This system is therefore only feasible on one or more dedicated servers.
Single-server version
The single-server version consists of using only one (...) -
Submit bugs and patches
13 April 2011 — Unfortunately, no software is ever perfect.
If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps taken that resulted in the problem; a link to the site/page in question.
If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
You may also (...) -
Encoding and processing into web-friendly formats
13 April 2011 — MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
Where possible, text is analyzed in order to retrieve the data needed for search engine indexing, and then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
On other sites (3985)
-
FFMPEG generated h264/aac mp4 video first frame doesn't report a zero mediaTime using requestVideoFrameCallback html api [closed]
12 May 2023, by Daniel Robinson — I have noticed that MP4 files generated using FFmpeg do not report a mediaTime of 0 via requestVideoFrameCallback for the first frame, whereas files encoded as WebM, or via another encoder such as MainConcept, report 0.


You can see a demo of the issue here https://randomstuffdr.s3.eu-west-1.amazonaws.com/TEST-v3.HTML
With video 1, as you increment the frames, the video does not progress until the third button press.


With video 2-4, the mediaTime is 0 and a single press of the button increments the video by one frame.


Browser: Chrome 113.0.5672.92 (Windows 10)


FFmpeg version: 6.0-essentials_build-www.gyan.dev or N-109421-g9adf02247c-20221216


There was an interesting note about the mediaTime implementation in Chrome on this page: https://web.dev/requestvideoframecallback-rvfc/




Of special interest in this list is mediaTime. In Chromium's
implementation, we use the audio clock as the time source that backs
video.currentTime, whereas the mediaTime is directly populated by the
presentationTimestamp of the frame. The mediaTime is what you should
use if you want to exactly identify frames in a reproducible way,
including to identify exactly which frames you missed.




This led me to compare the PTS values of the video and audio streams between the FFmpeg and MainConcept files using ffprobe; however, these are all zero for the first frames. One interesting note: the FFmpeg file seems to interleave video and audio packets in the ffprobe output, while the MainConcept file has 8 video frames before any audio. I have no idea if this is significant.
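For reference, the quoted note says mediaTime is populated directly from the frame's presentation timestamp, so the value the callback reports is just pts × time_base. A minimal sketch of that mapping (the time_base values below are illustrative, not taken from these files):

```python
from fractions import Fraction

def pts_to_media_time(pts: int, time_base: Fraction) -> float:
    """Convert a packet PTS (in stream time_base units) to the seconds
    value that requestVideoFrameCallback would report as mediaTime."""
    return float(pts * time_base)

# A first video frame at pts=0 maps to mediaTime 0.0 ...
assert pts_to_media_time(0, Fraction(1, 12800)) == 0.0
# ... while any positive first PTS yields a non-zero mediaTime
print(pts_to_media_time(1024, Fraction(1, 48000)))  # roughly 0.0213 s
```

If ffprobe shows pts=0 for the first packets, a container-level offset applied by the player's demuxer (for example an MP4 edit list) is one remaining place the discrepancy could hide, though that is speculation on my part.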


What I would like to know is whether there is any way I can make FFmpeg generate a file in a similar manner to MainConcept. So far I have tried: disabling B-frames, using the baseline profile, and avoid_negative_ts.


Video 1 – (Video: H264 / Audio: AAC / Wrapper: MP4 / Transcoder: FFmpeg)
.\ffmpeg.exe -i '720p50 Flash and Beep.mxf' -map 0:0 -map 0:1 -map_metadata -1 -c:v libx264 -crf 28 -b:v 118k -maxrate 118k -bufsize 236k -vf scale="480:-1" -preset veryfast -c:a aac "..\low-res\720p50FlashandBeep_V_A.mp4"


Video 2 – (Video: H264 / Audio: No Audio / Wrapper: MP4 / Transcoder: FFmpeg)
.\ffmpeg.exe -i '720p50 Flash and Beep.mxf' -map 0:0 -map 0:1 -map_metadata -1 -c:v libx264 -crf 28 -b:v 118k -maxrate 118k -bufsize 236k -vf scale="480:-1" -preset veryfast -an "..\low-res\720p50FlashandBeep_V.mp4"


Video 3 – (Video: VP8 / Audio: Ogg / Wrapper: WebM / Transcoder: FFmpeg)
.\ffmpeg.exe -i '720p50 Flash and Beep.mxf' -map 0:0 -map 0:1 -map_metadata -1 -c:v vp8 -crf 28 -b:v 118k -maxrate 118k -bufsize 236k -vf scale="480:-1" -preset veryfast "..\low-res\720p50FlashandBeep_V_A.webm"


Video 4 – (Video: H264 / Audio: AAC / Wrapper: MP4 / Transcoder: MainConcept)


-
Why is the last frame not showing in an MP4 generated by libav?
31 October 2023, by Ben — Note: I have a working example of the problem here.


I'm using the libav/ffmpeg API to generate an MP4 with the h264 codec. In my specific situation I'm generating the files with a max number of 2 "B" frames. I'm able to generate an Mp4 with the right number of frames so that a single, lone "B" frame is the very last frame being written. When this happens, the encoder sets that frame's packet to be discarded (I've verified this with ffprobe). The net result is that some players (say, when dropping the MP4 into Edge or Chrome) will display only n-1 total frames (ignoring the discarded packet). Other players, such as VLC, will play the full n frames (not ignoring the discarded packet). So, the result is inconsistent.


ffmpeg.exe itself doesn't appear to have this problem. Instead, it will set what would be the lone "B" frame to a "P" frame. This means the file will play the same regardless of what player is used.


The problem is: I don't know how to mimic ffmpeg's behavior using the SDK so that the last frame will play regardless of the player. As far as I can tell, I'm closing out the file properly by flushing the encoder's buffers. I must be doing something wrong somewhere.


I provided a link to the full source above, but at a high level I'm initializing the codec context and stream like this:


newStream->codecpar->codec_id = AV_CODEC_ID_H264;
newStream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
newStream->codecpar->width = Width;
newStream->codecpar->height = Height;
newStream->codecpar->format = AV_PIX_FMT_YUV420P;
newStream->time_base = { 1, 75 };
avcodec_parameters_to_context(codecContext, newStream->codecpar);


codecContext->time_base = { 1, 75 };
codecContext->gop_size = 30;



I then sit in a loop and use OpenCV to generate frames (each frame has its frame number drawn on it):


auto matrix = cv::Mat(Height, Width, CV_8UC3, cv::Scalar(0, 0, 0));

std::stringstream ss;
ss << f; 

cv::putText(matrix, ss.str().c_str(), textOrg, fontFace, fontScale, cv::Scalar(255, 255, 255), thickness, 8);



I then write out the frame like this (looping if more data is needed):


if ((ret = avcodec_send_frame(codecContext, frame)) == 0) {

    ret = avcodec_receive_packet(codecContext, &pkt);

    if (ret == AVERROR(EAGAIN))
    {
        continue;
    }
    else
    {
        av_interleaved_write_frame(pFormat, &pkt);
    }
    av_packet_unref(&pkt);
}



And finally I flush out the file at the end like this:


if ((ret = avcodec_send_frame(codecContext, NULL)) == 0)
{
    for (;;)
    {
        if ((ret = avcodec_receive_packet(codecContext, &pkt)) == AVERROR_EOF)
        {
            break;
        }
        else
        {
            ret = av_interleaved_write_frame(pFormat, &pkt);
            av_packet_unref(&pkt);
        }
    }

    av_write_trailer(pFormat);
    avio_close(pFormat->pb);
}
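For what it's worth, the send/receive contract the two loops above rely on can be modeled with a toy encoder (plain Python, not libav; the fixed delay stands in for B-frame reordering): frames go in, packets come out `delay` frames late, and the drain pass after send(NULL) releases whatever is still buffered.

```python
EAGAIN, EOF = "EAGAIN", "EOF"

class ToyEncoder:
    """Toy model of an encoder with a fixed reorder delay (e.g. B-frames):
    frames are buffered and emitted `delay` frames late."""
    def __init__(self, delay=2):
        self.delay = delay
        self.queue = []
        self.draining = False

    def send_frame(self, frame):
        if frame is None:
            self.draining = True      # like avcodec_send_frame(ctx, NULL)
        else:
            self.queue.append(frame)

    def receive_packet(self):
        if self.draining:
            return self.queue.pop(0) if self.queue else EOF
        if len(self.queue) > self.delay:
            return self.queue.pop(0)
        return EAGAIN                 # encoder is still buffering frames

enc = ToyEncoder(delay=2)
packets = []
for f in range(5):
    enc.send_frame(f)
    pkt = enc.receive_packet()
    if pkt != EAGAIN:
        packets.append(pkt)

enc.send_frame(None)                  # enter draining mode
while (pkt := enc.receive_packet()) != EOF:
    packets.append(pkt)               # the last `delay` frames come out here

assert packets == [0, 1, 2, 3, 4]     # no frame is lost if you drain fully
```

The model suggests the drain loop itself is structurally sound — every frame is emitted — so the discrepancy more likely lies in how the trailing lone B-frame is flagged by the encoder/muxer than in the flushing code.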



Yet when I play in Chrome, the player ends on frame 6758.




And in VLC, the player ends on frame 6759.




What am I doing wrong?


-
ffprobe cannot read a file generated by Node.js createWriteStream() on AWS
30 September 2016, by matthiasdv — I have a setup running on AWS that triggers an AWS Lambda function when a video file is uploaded to an S3 bucket. The Lambda function is written in Node, downloads the file, streams it to a temporary working file, and then executes ffprobe on it, followed by an ffmpeg command.
ffprobe throws the following error:
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x4ea8e20] error reading header download: Invalid data found when processing input
The bug is hard to reproduce and occurs only half the time, which I believe to be because of the async nature of the program.
My main function is as follows:
downloadFile(library.getDownloadStream, logger, sourceLocation, localFilePath)
    .then(() => ffprobe(logger))
    .then(() => ffmpeg(logger, keyPrefix))
    .then(() => removeDownload(logger, localFilePath))
    .then(() => uploadFiles(library.uploadToBucket, logger, keyPrefix))
    .then(data => invocation.callback())
    .catch(error => invocation.callback(error));

The download function:

function downloadFile(downloadFunc, logger, sourceLocation, download) {
    return new Promise((resolve, reject) => {
        logger.log(`Starting download: ${sourceLocation.bucket} / ${sourceLocation.key}`);
        var downloadFile = createWriteStream(download);
        downloadFunc(sourceLocation.bucket, sourceLocation.key)
            .on('end', () => {
                logger.log("closing writing stream");
                downloadFile.end();
                logger.log('Download finished');
                resolve();
            })
            .on('error', reject)
            .pipe(downloadFile);
    });
}

Each build/update uploads the latest ffmpeg version to AWS. I cannot reproduce this error locally.
Why is ffprobe throwing this error regarding the header ?
Update
Logging the downloaded file's size prints exactly the same number of bytes, regardless of whether ffprobe is successful or not.
However, when I set a timeout before resolving the promise, the bug no longer occurs and ffprobe runs successfully each time:

downloadFunc(sourceLocation.bucket, sourceLocation.key)
    .on('end', () => {
        logger.log('Download finished');
        // File size
        var meta = fs.statSync(download);
        var fileSizeInBytes = meta["size"];
        logger.log(fileSizeInBytes);
        // resolve();
        setTimeout(resolve, 1000);
    })
    .on('error', reject)
    .pipe(downloadFile);

Why is this happening?