
Media (91)
-
MediaSPIP Simple: future default graphic theme?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
-
with chosen
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
without chosen
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
chosen config
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
SPIP - plugins - embed code - Example
2 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
Other articles (51)
-
Sites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP.
You can of course add yours using the form at the bottom of the page. -
Keeping control of your media in your hands
13 April 2011, by
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media-sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...) -
Participating in its translation
10 April 2011
You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You simply need to subscribe to the translators' mailing list to ask for more information.
Currently, MediaSPIP is only available in French and (...)
Sur d’autres sites (9569)
-
Can't play rtp stream from ffmpeg/avconv, no data received
6 March 2014, by Foo Barazz
I started avserver on my Raspberry Pi with a webcam attached. I read from /dev/video0 with
pi@raspberrypi $ avconv -f video4linux2 -i /dev/video0 -vcodec mpeg2video -r 25 -pix_fmt yuv420p -me_method epzs -b 2600k -bt 256k -f rtp rtp://192.168.0.124:8090
avconv version 0.8.6-6:0.8.6-1+rpi1, Copyright (c) 2000-2013 the Libav developers
built on Mar 31 2013 13:58:10 with gcc 4.6.3
[video4linux2 @ 0x17c1720] Estimating duration from bitrate, this may be inaccurate
Input #0, video4linux2, from '/dev/video0':
Duration: N/A, start: 615.594215, bitrate: 36864 kb/s
Stream #0.0: Video: rawvideo, yuyv422, 320x240, 36864 kb/s, 30 tbr, 1000k tbn, 30 tbc
[buffer @ 0x17c16e0] w:320 h:240 pixfmt:yuyv422
[avsink @ 0x17c2f00] auto-inserting filter 'auto-inserted scaler 0' between the filter 'src' and the filter 'out'
[scale @ 0x17c34c0] w:320 h:240 fmt:yuyv422 -> w:320 h:240 fmt:yuv420p flags:0x4
Output #0, rtp, to 'rtp://192.168.0.124:8090':
Metadata:
encoder : Lavf53.21.1
Stream #0.0: Video: mpeg2video, yuv420p, 320x240, q=2-31, 2600 kb/s, 90k tbn, 25 tbc
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo -> mpeg2video)
SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 192.168.0.124
t=0 0
a=tool:libavformat 53.21.1
m=video 8090 RTP/AVP 32
b=AS:2600
Press ctrl-c to stop encoding
frame= 576 fps= 25 q=2.0 size= 2133kB time=23.00 bitrate= 759.8kbits/s dup=390 drop=0
frame= 590 fps= 25 q=2.0 size= 2191kB time=23.56 bitrate= 762.0kbits/s dup=400 drop=0
frame= 1320 fps= 25 q=2.0 size= 4932kB time=52.76 bitrate= 765.8kbits/s dup=908 drop=0
...
Seems to work fine; it reads data from the webcam.
Now I'm trying to simply play with ffplay from my Mac with
$ ffplay rtp://192.168.0.124:8090
ffplay version 1.2.4 Copyright (c) 2003-2013 the FFmpeg developers
built on Mar 1 2014 15:18:21 with Apple LLVM version 5.0 (clang-500.2.79) (based on LLVM 3.3svn)
configuration: --prefix=/usr/local/Cellar/ffmpeg/1.2.4 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-nonfree --enable-hardcoded-tables --enable-avresample --enable-vda --cc=clang --host-cflags= --host-ldflags= --enable-libx264 --enable-libfaac --enable-libmp3lame --enable-libxvid --enable-libfreetype --enable-libtheora --enable-libvorbis --enable-libvpx --enable-librtmp --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-aacenc --enable-libass --enable-ffplay --enable-libspeex --enable-libschroedinger --enable-libfdk-aac --enable-libopus --enable-frei0r --enable-libopenjpeg --extra-cflags='-I/usr/local/Cellar/openjpeg/1.5.1/include/openjpeg-1.5 '
libavutil 52. 18.100 / 52. 18.100
libavcodec 54. 92.100 / 54. 92.100
libavformat 54. 63.104 / 54. 63.104
libavdevice 54. 3.103 / 54. 3.103
libavfilter 3. 42.103 / 3. 42.103
libswscale 2. 2.100 / 2. 2.100
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 2.100 / 52. 2.100
nan A-V: 0.000 fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0
nan A-V: 0.000 fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0
nan A-V: 0.000 fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0
...
The video doesn't open, and it doesn't seem to be reading any data from the Raspberry Pi.
I use the default configuration for avserver.
The webcam is definitely working, as I managed to write out images from it with avconv. What did I miss?
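As a side note: a bare rtp:// URL is often not enough on its own, because the codec and payload-type description only travels in the SDP block that avconv prints above. A minimal sketch, assuming that SDP block is saved to a file named stream.sdp, would be to open the file instead:
$ ffplay stream.sdp
On newer FFmpeg builds, the file, udp and rtp protocols may additionally need to be whitelisted:
$ ffplay -protocol_whitelist file,udp,rtp stream.sdp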
-
FFmpeg player backporting to Android 2.1 - one more problem
22 April 2024, by tretdm
I looked for a lot of information about how to build and use FFmpeg on early versions of Android, looked at the source code of players from 2011-2014, and was able to easily build FFmpeg 4.0.4 and 3.1.4 on the NDKv5 platform. I have highlighted the main points for this purpose:


- <android/bitmap.h> and <android/native_window.h> - before Android 2.2 (API Level 8) such a thing did not exist
- this requires some effort to implement buffer management for A/V streams, since in practice, when playing video, the application silently crashed after a few seconds due to overflow (code example in C++ and Java below)
- FFmpeg - imho, the only way to support a sufficient number of codecs that are not officially included in Android 2.1 and above








void decodeVideoFromPacket(JNIEnv *env, jobject instance,
 jclass mplayer_class, AVPacket avpkt, 
 int total_frames, int length) {
 AVFrame *pFrame = NULL;
 AVFrame *pFrameRGB = NULL;
 pFrame = avcodec_alloc_frame();
 pFrameRGB = avcodec_alloc_frame();
 int frame_size = avpicture_get_size(PIX_FMT_RGB32, gVideoCodecCtx->width, gVideoCodecCtx->height);
 unsigned char* buffer = (unsigned char*)av_malloc((size_t)frame_size * 3);
 if (!buffer) {
 av_free(pFrame);
 av_free(pFrameRGB);
 return;
 }
 jbyteArray buffer2;
 jmethodID renderVideoFrames = env->GetMethodID(mplayer_class, "renderVideoFrames", "([BI)V");
 int frameDecoded;
 avpicture_fill((AVPicture*) pFrame,
 buffer,
 gVideoCodecCtx->pix_fmt,
 gVideoCodecCtx->width,
 gVideoCodecCtx->height
 );

 if (avpkt.stream_index == gVideoStreamIndex) { // If video stream found
 int size = avpkt.size;
 total_frames++;
 struct SwsContext *img_convert_ctx = NULL;
 avcodec_decode_video2(gVideoCodecCtx, pFrame, &frameDecoded, &avpkt);
 if (!frameDecoded || pFrame == NULL) {
 return;
 }

 try {
 PixelFormat pxf;
 // RGB565 by default for Android Canvas in pre-Gingerbread devices.
 if(android::get_android_api_version(env) >= ANDROID_API_CODENAME_GINGERBREAD) {
 pxf = PIX_FMT_BGR32;
 } else {
 pxf = PIX_FMT_RGB565;
 }

 int rgbBytes = avpicture_get_size(pxf, gVideoCodecCtx->width,
 gVideoCodecCtx->height);

 // Converting YUV to RGB frame & RGB frame to char* buffer 
 
 buffer = convertYuv2Rgb(pxf, pFrame, rgbBytes); // result of av_image_copy_to_buffer()

 if(buffer == NULL) {
 return;
 }

 buffer2 = env->NewByteArray((jsize) rgbBytes);
 env->SetByteArrayRegion(buffer2, 0, (jsize) rgbBytes,
 (jbyte *) buffer);
 env->CallVoidMethod(instance, renderVideoFrames, buffer2, rgbBytes);
 env->DeleteLocalRef(buffer2);
 free(buffer);
 } catch (...) {
 if (debug_mode) {
 LOGE(10, "[ERROR] Render video frames failed");
 return;
 }
 }
 }
}



private void renderVideoFrames(final byte[] buffer, final int length) {
 new Thread(new Runnable() {
 @Override
 public void run() {
 Canvas c;
 VideoTrack track = null;
 for (int tracks_index = 0; tracks_index < tracks.size(); tracks_index++) {
 if (tracks.get(tracks_index) instanceof VideoTrack) {
 track = (VideoTrack) tracks.get(tracks_index);
 }
 }
 if (track != null) {
 int frame_width = track.frame_size[0];
 int frame_height = track.frame_size[1];
 if (frame_width > 0 && frame_height > 0) {
 try {
 // RGB_565 == 65K colours (16 bit)
 // RGB_8888 == 16.7M colours (24 bit w/ alpha ch.)
 int bpp = Build.VERSION.SDK_INT > 9 ? 16 : 24;
 Bitmap.Config bmp_config =
 bpp == 24 ? Bitmap.Config.RGB_565 : Bitmap.Config.ARGB_8888;
 Paint paint = new Paint();
 if(buffer != null && holder != null) {
 holder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
 if((c = holder.lockCanvas()) == null) {
 Log.d(MPLAY_TAG, "Lock canvas failed");
 return;
 }
 ByteBuffer bbuf =
 ByteBuffer.allocateDirect(minVideoBufferSize);
 bbuf.rewind();
 for(int i = 0; i < buffer.length; i++) {
 bbuf.put(i, buffer[i]);
 }
 bbuf.rewind();

 // The approximate location where the application crashed.
 Bitmap bmp = Bitmap.createBitmap(frame_width, frame_height, bmp_config);
 bmp.copyPixelsFromBuffer(bbuf);
 
 float aspect_ratio = (float) frame_width / (float) frame_height;
 int scaled_width = (int)(aspect_ratio * (c.getHeight()));
 c.drawBitmap(bmp,
 null,
 new RectF(
 ((c.getWidth() - scaled_width) / 2), 0,
 ((c.getWidth() - scaled_width) / 2) + scaled_width,
 c.getHeight()),
 null);
 holder.unlockCanvasAndPost(c);
 bmp.recycle();
 bbuf.clear();
 } else {
 Log.d(MPLAY_TAG, "Video frame buffer is null");
 }
 } catch (Exception ex) {
 ex.printStackTrace();
 } catch (OutOfMemoryError oom) {
 oom.printStackTrace();
 stop();
 }
 }
 }
 }
 }).start();
 }



Exception (tested in an Android 4.1.2 emulator):


E/dalvikvm-heap: Out of memory on a 1228812-byte allocation
I/dalvikvm: "Thread-495" prio=5 tid=21 RUNNABLE
 ................................................
 at android.graphics.Bitmap.nativeCreate(Native Method)
 at android.graphics.Bitmap.createBitmap(Bitmap.java:640)
 at android.graphics.Bitmap.createBitmap(Bitmap.java:620)
 at [app_package_name].MediaPlayer$5.run(MediaPlayer.java:406)
 at java.lang.Thread.run(Thread.java:856)



For clarification: I first compiled FFmpeg 0.11.x on a virtual machine running Ubuntu 12.04 LTS using a build script I wrote, looked for player examples suitable for Android below 2.2 (there is little information about them, unfortunately), and opened a file in the player; after showing the first frames it crashed with a stack or buffer overflow, so I put off developing the player for some time.


Is there anything ready-made that, as a rule, fits into one C++ file and takes into account all the nuances of backporting? Thanks in advance.


-
Split video with ffmpeg segment option is missing frame
9 February 2024, by Dan
I’m trying to get the ffmpeg “segment” option to split my video into segments at the I-frames. I’m using ffmpeg V6.1.1.


First I added timestamps to each frame of my video so that when it plays, I can see exactly which frame is being displayed. I used this command:


ffmpeg -i In.mp4 -vf "drawtext=fontfile='C:\Windows\Fonts\Arial.ttf': text='%frame_num:~ %pts':fontsize=200: r=25: x=(w-tw)/2: y=h-(2*lh): fontcolor=white: box=1: boxcolor=0x00000099" -y Out.mp4


Then I used ffprobe to confirm that the video is 30 FPS and the I-frames are as follows:


0.000000
4.933333
10.000000
11.533333
18.866667
24.966667


Based on these I-frame times, I’d expect the following segments:

Start Frame   Start Time   End Frame   End Time
0             0            147         4.900000
148           4.933333     299         9.966667
300           10.000000    345         11.500000
346           11.533333    565         18.833334
566           18.866667    748         24.933334
749           24.966667    867         28.906667

When I use ffmpeg to split the video into segments with the following command, I get six files as expected:


ffmpeg -i Out.mp4 -f segment -c copy -reset_timestamps 1 -map 0 "Out %d.mp4"


When I play the segments, they are all correct except the first segment file (Out 0.mp4). It seems to be missing the last frame. It contains frames 0 to 146 (4.866667 sec) but should also include frame 147 (4.9 sec). All the other segment files are as expected.


I’ve tried this on several different mp4 videos, and they are all missing the last frame of the first segment.


Any idea why my first segment file is missing the last frame of the segment?


Could this be an ffmpeg bug?


Thanks for the help!
Dan


Here is my console session with all output:


C:\> ffprobe Out.mp4
ffprobe version 2023-12-21-git-1e42a48e37-full_build-www.gyan.dev Copyright (c) 2007-2023 the FFmpeg developers
 built with gcc 12.2.0 (Rev10, Built by MSYS2 project)
 configuration: --enable-gpl --enable-version3 --enable-static --pkg-config=pkgconf --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-dxva2 --enable-d3d11va --enable-libvpl --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
 libavutil 58. 36.100 / 58. 36.100
 libavcodec 60. 36.100 / 60. 36.100
 libavformat 60. 20.100 / 60. 20.100
 libavdevice 60. 4.100 / 60. 4.100
 libavfilter 9. 14.100 / 9. 14.100
 libswscale 7. 6.100 / 7. 6.100
 libswresample 4. 13.100 / 4. 13.100
 libpostproc 57. 4.100 / 57. 4.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'Out.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 title : Short 4k video sample - 4K Ultra HD (3840x2160)
 date : 2014:05:24 19:00:00
 encoder : Lavf60.20.100
 Duration: 00:00:28.96, start: 0.000000, bitrate: 3181 kb/s
 Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 3045 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
 Metadata:
 handler_name : VideoHandler
 vendor_id : [0][0][0][0]
 encoder : Lavc60.36.100 libx264
 Stream #0:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
 Metadata:
 handler_name : SoundHandler
 vendor_id : [0][0][0][0]

C:\> ffprobe -loglevel error -skip_frame nokey -select_streams v:0 -show_entries frame=pts_time -of csv=print_section=0 Out.mp4
0.000000,
4.933333
10.000000
11.533333
18.866667
24.966667

C:\> ffmpeg -i Out.mp4 -f segment -c copy -reset_timestamps 1 -map 0 "Out %1d.mp4"
ffmpeg version 2023-12-21-git-1e42a48e37-full_build-www.gyan.dev Copyright (c) 2000-2023 the FFmpeg developers
 built with gcc 12.2.0 (Rev10, Built by MSYS2 project)
 configuration: --enable-gpl --enable-version3 --enable-static --pkg-config=pkgconf --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-dxva2 --enable-d3d11va --enable-libvpl --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
 libavutil 58. 36.100 / 58. 36.100
 libavcodec 60. 36.100 / 60. 36.100
 libavformat 60. 20.100 / 60. 20.100
 libavdevice 60. 4.100 / 60. 4.100
 libavfilter 9. 14.100 / 9. 14.100
 libswscale 7. 6.100 / 7. 6.100
 libswresample 4. 13.100 / 4. 13.100
 libpostproc 57. 4.100 / 57. 4.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'Out.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 title : Short 4k video sample - 4K Ultra HD (3840x2160)
 date : 2014:05:24 19:00:00
 encoder : Lavf60.20.100
 Duration: 00:00:28.96, start: 0.000000, bitrate: 3181 kb/s
 Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 3045 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
 Metadata:
 handler_name : VideoHandler
 vendor_id : [0][0][0][0]
 encoder : Lavc60.36.100 libx264
 Stream #0:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
 Metadata:
 handler_name : SoundHandler
 vendor_id : [0][0][0][0]
Stream mapping:
 Stream #0:0 -> #0:0 (copy)
 Stream #0:1 -> #0:1 (copy)
[segment @ 00000195bbc52940] Opening 'Out 0.mp4' for writing
Output #0, segment, to 'Out %1d.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 title : Short 4k video sample - 4K Ultra HD (3840x2160)
 date : 2014:05:24 19:00:00
 encoder : Lavf60.20.100
 Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 3045 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
 Metadata:
 handler_name : VideoHandler
 vendor_id : [0][0][0][0]
 encoder : Lavc60.36.100 libx264
 Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
 Metadata:
 handler_name : SoundHandler
 vendor_id : [0][0][0][0]
Press [q] to stop, [?] for help
[segment @ 00000195bbc52940] Opening 'Out 1.mp4' for writing
[segment @ 00000195bbc52940] Opening 'Out 2.mp4' for writing
[segment @ 00000195bbc52940] Opening 'Out 3.mp4' for writing
[segment @ 00000195bbc52940] Opening 'Out 4.mp4' for writing
[segment @ 00000195bbc52940] Opening 'Out 5.mp4' for writing
[out#0/segment @ 00000195bc3e8cc0] video:10757kB audio:456kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
size=N/A time=00:00:28.86 bitrate=N/A speed= 322x
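
As a quick cross-check, ffprobe can decode a segment and report the exact number of video frames it contains; a minimal sketch, assuming the segment names produced above (e.g. "Out 0.mp4"), would be:
C:\> ffprobe -v error -count_frames -select_streams v:0 -show_entries stream=nb_read_frames -of default=nokey=1:noprint_wrappers=1 "Out 0.mp4"
A first segment covering frames 0 to 147 should report 148 frames here; a value of 147 would confirm the missing final frame.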