
android chromecast mp4 by ffmpeg - shorter files play, some longer ones buffer but don't play
20 October 2014, by Robert Rowntree
Using the Android SDK for CC, CCL, and a 'styled receiver' CC app.
Video files created by a heroku/ffmpeg binary will play on every other client I tried (VLC on Ubuntu, ...) and will even play in a Chrome tab being cast to the Chromecast, but they will NOT play natively in Chromecast apps. They appear to load normally in the app's UI, and the CCL logs show them buffering OK, but they never move from the buffering state to a play state. Debugging the receiver shows nothing unusual (Network tab or Console). They just buffer and don't play.
If a slightly shorter version of the same two ffmpeg input files is prepared and hosted, it plays fine at lengths up to about 14 seconds, but a 16-second version will not play.
I tested with a number of Chromecast apps, including AllCast, and always got the same result.
Two Dropbox mp4 links created with the ffmpeg CLI are below. A 14-second version of the output mp4 plays fine in several Chromecast/Android apps; the 16-second version fails.
The 16-second file won't play in CC apps.
ffprobe output on the two files:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'tst-nonplay-shorter_14s.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
encoder : Lavf54.29.105
Duration: 00:00:14.07, start: 0.046440, bitrate: 311 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p, 1080x614 [SAR 1535:1539 DAR 100:57], 199 kb/s, 1 fps, 1 tbr, 16384 tbn, 2 tbc
Metadata:
handler_name : VideoHandler
Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 22050 Hz, mono, s16, 110 kb/s
Metadata:
handler_name : SoundHandler

and the file two seconds longer that won't play:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'tst-nonplay-shorter_16s.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
encoder : Lavf54.29.105
Duration: 00:00:16.06, start: 0.046440, bitrate: 286 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p, 1080x614 [SAR 1535:1539 DAR 100:57], 174 kb/s, 1 fps, 1 tbr, 16384 tbn, 2 tbc
Metadata:
handler_name : VideoHandler
Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 22050 Hz, mono, s16, 110 kb/s
Metadata:
handler_name : SoundHandler

I used 'qtfaststart' and checked for MOOV atom issues; I don't think that's relevant.
I think I can limit recordings in my app to 14 seconds and avoid the problem that way, but I would like to understand what is wrong with the Chromecast GETs on MP4s generated by ffmpeg.
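For anyone hitting the same wall, two things may be worth trying (the filenames below are placeholders, and this is a suggestion rather than a confirmed fix): remuxing with the moov atom up front, which has the same effect as qtfaststart, and re-encoding to a more conventional stream, since the ffprobe output above shows 1 fps, full-range yuvj420p video that some device decoders handle poorly.

```shell
# Remux only (no re-encode), moving the moov atom to the front of the file:
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4

# Re-encode to a more conventional stream: standard-range yuv420p at 30 fps,
# AAC audio, moov atom up front.
ffmpeg -i input.mp4 -c:v libx264 -pix_fmt yuv420p -r 30 -c:a aac \
       -movflags +faststart output_reencoded.mp4
```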
-
Memory and Frame Rate Issues in FFmpeg Video Streaming
15 January 2024, by user18490
I'm currently working with the FFmpeg libraries to generate a video stream for the first time. As a newcomer to this field, I'm still exploring and experimenting with the code. For testing purposes, I've set up Nginx as the server and VLC as the client to view the stream. During these initial tests, I load an image at the start and process it for each frame, which is then sent for encoding and streaming.


I've encountered a couple of interconnected issues:

1. When I process frames as quickly as possible, the stream plays seemingly in real time. However, the computer's memory usage skyrockets, rapidly consuming gigabytes within seconds.

2. I suspect this memory issue comes from generating frames at a much higher rate than FFmpeg can encode and stream, so data is being buffered somewhere. To address this, I introduced a 'sleep_for' delay to throttle the frame rate to approximately 40 frames per second, instead of the ~4000 fps I get without throttling. While this does contain the memory growth, VLC no longer plays the stream in real time. Instead, it appears to wait, play for a couple of seconds, go idle, and then resume playing intermittently.








I've tried adjusting the frame rate settings, specifically:


(*codec_context)->time_base = (AVRational){1, 30};



and


frame_yuv->pts = (1.0 / 30) * 90 * frame_count;



Based on my research and information from Stack Overflow, these settings seem correct. They indicate that the stream is meant to be played at 30 frames per second and that I've set the timing accordingly (where 90 represents the sample rate used by FFmpeg).


I would greatly appreciate any assistance or insights into resolving these issues.


extern "C" {
 #include <libavcodec/avcodec.h>
 #include <libavformat/avformat.h>
 #include <libavutil/imgutils.h>
 #include <libavutil/opt.h>
 #include <libswscale/swscale.h>
}

#include <iostream>
#include <chrono>
#include <thread>
#include <fstream>
#include <cmath>   // std::abs, std::cos in ProcessFrame
#include <cstring> // memcpy, memset

void InitializeFFMPG(AVCodecContext** codec_context, AVFormatContext** format_context, int width, int height) {
 avformat_network_init();
 //av_register_all(); // no longer needed
 
 avformat_alloc_output_context2(format_context, nullptr, "flv", "rtmp://localhost/live/stream"); // RTMP URL
 
 const AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_H264);
 if (!codec) {
 std::cerr << "avcodec_find_encoder err." << std::endl;
 }
 
 *codec_context = avcodec_alloc_context3(codec);
 if (!*codec_context) {
 std::cerr << "codec_context err." << std::endl;
 }
 
 (*codec_context)->width = width;
 (*codec_context)->height = height;
 (*codec_context)->pix_fmt = AV_PIX_FMT_YUV420P;
 (*codec_context)->time_base = (AVRational){1, 30};
 if (avcodec_open2(*codec_context, codec, nullptr) < 0) {
 std::cerr << "Could not open codec" << std::endl;
 }
 
 AVStream* stream = avformat_new_stream(*format_context, codec);
 if (!stream) {
 std::cerr << "Could not create stream" << std::endl;
 }
 
 avcodec_parameters_from_context(stream->codecpar, *codec_context);
 
 if (avio_open(&(*format_context)->pb, "rtmp://localhost/live/stream", AVIO_FLAG_WRITE) < 0) {
 std::cerr << "Could not open output URL" << std::endl;
 }
 
 if (avformat_write_header(*format_context, nullptr) < 0) {
 std::cerr << "Could not write header" << std::endl;
 }
}

int frame_count = 0;

void ConvertRGBToYUV(AVFrame* frame_rgb, AVFrame* frame_yuv, struct SwsContext* sws_ctx) {
 sws_scale(sws_ctx, frame_rgb->data, frame_rgb->linesize, 0, frame_rgb->height, frame_yuv->data, frame_yuv->linesize);
}

void EncodeAndStreamFrame(AVCodecContext* codec_context, AVFormatContext* format_context, 
 unsigned char** framebuffer, int width, int height, struct SwsContext* sws_ctx) {
 AVFrame* frame_rgb = av_frame_alloc();
 if (!frame_rgb) {
 std::cerr << "Failed to allocate frame_rgb" << std::endl;
 return;
 }

 frame_rgb->format = AV_PIX_FMT_RGB24;
 frame_rgb->width = width;
 frame_rgb->height = height;

 if (av_image_alloc(frame_rgb->data, frame_rgb->linesize, width, height, AV_PIX_FMT_RGB24, 32) < 0) {
 std::cerr << "Failed to allocate RGB image buffer" << std::endl;
 av_frame_free(&frame_rgb);
 return;
 }
 
 memcpy(frame_rgb->data[0], *framebuffer, width * height * 3);
 
 AVFrame* frame_yuv = av_frame_alloc();
 if (!frame_yuv) {
 std::cerr << "Failed to allocate frame_yuv" << std::endl;
 av_frame_free(&frame_rgb);
 return;
 }

 frame_yuv->format = AV_PIX_FMT_YUV420P;
 frame_yuv->width = width;
 frame_yuv->height = height;

 if (av_frame_get_buffer(frame_yuv, 32) < 0) {
 std::cerr << "Failed to allocate YUV frame buffer" << std::endl;
 av_frame_free(&frame_rgb);
 av_frame_free(&frame_yuv);
 return;
 }
 
 frame_yuv->pts = (1.0 / 30) * 90 * frame_count;
 
 ConvertRGBToYUV(frame_rgb, frame_yuv, sws_ctx);

 if (avcodec_send_frame(codec_context, frame_yuv) < 0) {
 std::cerr << "Error sending YUV frame for encoding" << std::endl;
 av_frame_free(&frame_rgb);
 av_frame_free(&frame_yuv);
 return;
 }

 while (1) {
 AVPacket pkt = { 0 };
 av_packet_unref(&pkt);
 pkt.data = NULL;
 pkt.size = 0;

 int ret = avcodec_receive_packet(codec_context, &pkt);
 if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) {
 break;
 } else if (ret < 0) {
 break;
 }

 if (av_interleaved_write_frame(format_context, &pkt) < 0) {
 av_packet_unref(&pkt);
 break;
 }
 av_packet_unref(&pkt);
 }

 av_freep(&frame_rgb->data[0]); // av_image_alloc() buffers are not owned by the frame; av_frame_free() alone leaks them
 av_frame_free(&frame_rgb);
 av_frame_free(&frame_yuv);
}

void ProcessFrame(const unsigned char* in, unsigned char*& framebuffer, int width, int height) {
 float cosine = std::abs(std::cos(frame_count / 300.f));
 for (uint32_t i = 0; i < width * height * 3; ++i)
 framebuffer[i] = (unsigned char)((float)in[i] * cosine);
}

int main() {
 AVCodecContext *codec_context = nullptr;
 AVFormatContext *format_context = nullptr;
 int width = 320, height = 160;
 InitializeFFMPG(&codec_context, &format_context, width, height);
 
 struct SwsContext* sws_ctx = sws_getContext(width, height, AV_PIX_FMT_RGB24,
 width, height, AV_PIX_FMT_YUV420P,
 0, nullptr, nullptr, nullptr);
 if (!sws_ctx) {
 return 0;
 }

 unsigned char* framebuffer_in = new unsigned char[width*height*3];
 unsigned char* framebuffer = new unsigned char[width*height*3];
 memset(framebuffer, 0x0, width * height * 3);
 std::ifstream ifs("C:/Users/xxx/Downloads/ocean.ppm", std::ios::binary);
 std::string header;
 ifs >> header;
 uint32_t w, h, bpc;
 ifs >> w >> h >> bpc;
 ifs.ignore();
 ifs.read((char*)framebuffer_in, w * h * 3);
 ifs.close();
 
 auto start_time = std::chrono::high_resolution_clock::now();
 while (1) {
 ProcessFrame(framebuffer_in, framebuffer, width, height);
 EncodeAndStreamFrame(codec_context, format_context, &framebuffer, width, height, sws_ctx);
 frame_count++;
 std::chrono::milliseconds frameDelay(20); 
 std::this_thread::sleep_for(frameDelay);
 auto end_time = std::chrono::high_resolution_clock::now();
 auto duration = std::chrono::duration_cast<std::chrono::duration<double>>(end_time - start_time);
 double fps = static_cast<double>(frame_count) / duration.count();
 fprintf(stderr, "\r%f fps", fps);
 fflush(stderr);
 }

 // Cleanup (unreachable while the loop above runs forever)
 av_write_trailer(format_context);
 delete[] framebuffer;
 delete[] framebuffer_in;
 sws_freeContext(sws_ctx);
 avio_closep(&format_context->pb);
 avformat_free_context(format_context);
 avcodec_free_context(&codec_context);

 return 0;
}


-
Split video with ffmpeg segment option is missing frame
9 February 2024, by Dan
I'm trying to get the ffmpeg "segment" option to split my video into segments at the I-frames. I'm using ffmpeg v6.1.1.


First I added time stamps to each frame of my video so that when it plays, I can see exactly which frame is being displayed. I used this command:


ffmpeg -i In.mp4 -vf "drawtext=fontfile='C:\Windows\Fonts\Arial.ttf': text='%frame_num :~ %pts':fontsize=200: r=25: x=(w-tw)/2: y=h-(2*lh): fontcolor=white: box=1: boxcolor=0x00000099" -y Out.mp4


Then I used ffprobe to confirm that the video is 30 fps and that the I-frames are at the following times:


0.000000
4.933333
10.000000
11.533333
18.866667
24.966667


Based on these I-frame times, I'd expect the following segments:







Start Frame   Start Time   End Frame   End Time
0             0            147         4.900000
148           4.933333     299         9.966667
300           10.000000    345         11.500000
346           11.533333    565         18.833334
566           18.866667    748         24.933334
749           24.966667    867         28.906667









When I use ffmpeg to split the video into segments with the following command, I get six files as expected :


ffmpeg -i Out.mp4 -f segment -c copy -reset_timestamps 1 -map 0 "Out %d.mp4"


When I play the segments, they are all correct except the first segment file (Out 0.mp4), which seems to be missing its last frame. It contains frames 0 to 146 (4.866667 s) but should also include frame 147 (4.9 s). All the other segment files are as expected.


I've tried this on several different mp4 videos, and they are all missing the last frame of the first segment.


Any idea why my first segment file is missing the last frame of the segment?


Could this be an ffmpeg bug?


Thanks for the help !
Dan


Here is my console session with all output :


C:\> ffprobe Out.mp4
ffprobe version 2023-12-21-git-1e42a48e37-full_build-www.gyan.dev Copyright (c) 2007-2023 the FFmpeg developers
 built with gcc 12.2.0 (Rev10, Built by MSYS2 project)
 configuration: --enable-gpl --enable-version3 --enable-static --pkg-config=pkgconf --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-dxva2 --enable-d3d11va --enable-libvpl --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
 libavutil 58. 36.100 / 58. 36.100
 libavcodec 60. 36.100 / 60. 36.100
 libavformat 60. 20.100 / 60. 20.100
 libavdevice 60. 4.100 / 60. 4.100
 libavfilter 9. 14.100 / 9. 14.100
 libswscale 7. 6.100 / 7. 6.100
 libswresample 4. 13.100 / 4. 13.100
 libpostproc 57. 4.100 / 57. 4.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'Out.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 title : Short 4k video sample - 4K Ultra HD (3840x2160)
 date : 2014:05:24 19:00:00
 encoder : Lavf60.20.100
 Duration: 00:00:28.96, start: 0.000000, bitrate: 3181 kb/s
 Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 3045 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
 Metadata:
 handler_name : VideoHandler
 vendor_id : [0][0][0][0]
 encoder : Lavc60.36.100 libx264
 Stream #0:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
 Metadata:
 handler_name : SoundHandler
 vendor_id : [0][0][0][0]

C:\> ffprobe -loglevel error -skip_frame nokey -select_streams v:0 -show_entries frame=pts_time -of csv=print_section=0 Out.mp4
0.000000
4.933333
10.000000
11.533333
18.866667
24.966667

C:\> ffmpeg -i Out.mp4 -f segment -c copy -reset_timestamps 1 -map 0 "Out %1d.mp4"
ffmpeg version 2023-12-21-git-1e42a48e37-full_build-www.gyan.dev Copyright (c) 2000-2023 the FFmpeg developers
 built with gcc 12.2.0 (Rev10, Built by MSYS2 project)
 configuration: --enable-gpl --enable-version3 --enable-static --pkg-config=pkgconf --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-dxva2 --enable-d3d11va --enable-libvpl --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
 libavutil 58. 36.100 / 58. 36.100
 libavcodec 60. 36.100 / 60. 36.100
 libavformat 60. 20.100 / 60. 20.100
 libavdevice 60. 4.100 / 60. 4.100
 libavfilter 9. 14.100 / 9. 14.100
 libswscale 7. 6.100 / 7. 6.100
 libswresample 4. 13.100 / 4. 13.100
 libpostproc 57. 4.100 / 57. 4.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'Out.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 title : Short 4k video sample - 4K Ultra HD (3840x2160)
 date : 2014:05:24 19:00:00
 encoder : Lavf60.20.100
 Duration: 00:00:28.96, start: 0.000000, bitrate: 3181 kb/s
 Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 3045 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
 Metadata:
 handler_name : VideoHandler
 vendor_id : [0][0][0][0]
 encoder : Lavc60.36.100 libx264
 Stream #0:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
 Metadata:
 handler_name : SoundHandler
 vendor_id : [0][0][0][0]
Stream mapping:
 Stream #0:0 -> #0:0 (copy)
 Stream #0:1 -> #0:1 (copy)
[segment @ 00000195bbc52940] Opening 'Out 0.mp4' for writing
Output #0, segment, to 'Out %1d.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 title : Short 4k video sample - 4K Ultra HD (3840x2160)
 date : 2014:05:24 19:00:00
 encoder : Lavf60.20.100
 Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 3045 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
 Metadata:
 handler_name : VideoHandler
 vendor_id : [0][0][0][0]
 encoder : Lavc60.36.100 libx264
 Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
 Metadata:
 handler_name : SoundHandler
 vendor_id : [0][0][0][0]
Press [q] to stop, [?] for help
[segment @ 00000195bbc52940] Opening 'Out 1.mp4' for writing
[segment @ 00000195bbc52940] Opening 'Out 2.mp4' for writing
[segment @ 00000195bbc52940] Opening 'Out 3.mp4' for writing
[segment @ 00000195bbc52940] Opening 'Out 4.mp4' for writing
[segment @ 00000195bbc52940] Opening 'Out 5.mp4' for writing
[out#0/segment @ 00000195bc3e8cc0] video:10757kB audio:456kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
size=N/A time=00:00:28.86 bitrate=N/A speed= 322x
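To narrow down where the frame goes, the decoded frames in each segment can be counted directly (filenames as in the session above); if 'Out 0.mp4' reports 147 rather than 148, the packet for frame 147 was either dropped at the split point or written to the next segment.

```shell
# Count the decoded video frames in the first two segments:
ffprobe -v error -count_frames -select_streams v:0 \
        -show_entries stream=nb_read_frames -of csv=p=0 "Out 0.mp4"
ffprobe -v error -count_frames -select_streams v:0 \
        -show_entries stream=nb_read_frames -of csv=p=0 "Out 1.mp4"
```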