
Other articles (105)
-
Improving the base version
13 September 2013: Nice multiple selection
The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
To use it, simply enable the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...) -
Publishing on MediaSPIP
13 June 2013: Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If in doubt, contact your MediaSPIP administrator to find out -
Adding user-specific information and other changes to author-related behaviour
12 April 2011: The simplest way to add information to authors is to install the Inscription3 plugin. It also makes it possible to modify certain user-related behaviours (refer to its documentation for more information).
It is also possible to add fields to authors by installing the plugins Champs Extras 2 and Interface pour Champs Extras.
On other sites (12359)
-
Composition on ffmpeg does not match sizes
24 May 2016, by Dbugger: I am trying to take 3 videos and an image and make a composition of roughly 10 seconds, like this:
This is the command I have so far (I formatted it a little so that it is more readable):
ffmpeg
-i /home/dbugger/projects/mediabooth/public/uploads/5fa87d68edd8190351c42f02c2ebbaeea0e786fe/media0
-i /home/dbugger/projects/mediabooth/public/uploads/5fa87d68edd8190351c42f02c2ebbaeea0e786fe/media1
-i /home/dbugger/projects/mediabooth/public/uploads/5fa87d68edd8190351c42f02c2ebbaeea0e786fe/media2
-i /home/dbugger/projects/mediabooth/public/uploads/5fa87d68edd8190351c42f02c2ebbaeea0e786fe/media3
-filter_complex "
[0:v]scale='if(gt(a,512/288),-1,512)':'if(gt(a,512/288),288,-1)',setsar=1,crop=512:288[v0c];
[1:v]scale='if(gt(a,512/288),-1,512)':'if(gt(a,512/288),288,-1)',setsar=1,crop=512:288[v1c];
[2:v]scale='if(gt(a,512/288),-1,512)':'if(gt(a,512/288),288,-1)',setsar=1,crop=512:288[v2c];
[3:v]scale='if(gt(a,640/874),-1,640)':'if(gt(a,640/874),874,-1)',setsar=1,crop=640:874[p0c];
[v0c]pad=iw+0:ih+5:0:0:color=black[v0cp];
[v1c]pad=iw+0:ih+5:0:0:color=black[v1cp];
[v0cp][v1cp][v2c]vstack=inputs=3[col0];
[col0][p0c]hstack=inputs=2[videoout]
"
-map '[videoout]' -c:v libx264 -b:v 3000k -t 00:00:10.0 /home/dbugger/projects/mediabooth/public/uploads/5fa87d68edd8190351c42f02c2ebbaeea0e786fe/output.mp4
In the first 3 filters, I try to fill/crop the videos to match 512x288.
The next filter crops/fills the image to 640x874.
In the next 2 filters, I add 5 pixels of padding to the top and middle videos. The total height of the left column should therefore be
288*3 + 5*2 = 874
But when I run this command I get this error:
Input 1 height 874 does not match input 0 height 872.
Where did those 2 pixels go? If I use images instead of videos on the left, it works fine. Only with some videos does it seem to lose those extra 2 pixels somehow.
What is going on? How can I fix it?
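As an aside, a common way to get exact tile sizes without the if() expressions is to let scale overshoot the target box with its force_original_aspect_ratio=increase option and then crop to the exact size. Below is a sketch of that variant; the input paths are abbreviated, the rest of the graph is unchanged, and this is not confirmed as the fix for the 872 vs 874 mismatch (it assumes a build whose scale filter supports force_original_aspect_ratio, which a 3.0.x build should):
ffmpeg -i media0 -i media1 -i media2 -i media3
-filter_complex "
[0:v]scale=512:288:force_original_aspect_ratio=increase,crop=512:288,setsar=1[v0c];
[1:v]scale=512:288:force_original_aspect_ratio=increase,crop=512:288,setsar=1[v1c];
[2:v]scale=512:288:force_original_aspect_ratio=increase,crop=512:288,setsar=1[v2c];
[3:v]scale=640:874:force_original_aspect_ratio=increase,crop=640:874,setsar=1[p0c];
[v0c]pad=iw:ih+5:0:0:color=black[v0cp];
[v1c]pad=iw:ih+5:0:0:color=black[v1cp];
[v0cp][v1cp][v2c]vstack=inputs=3[col0];
[col0][p0c]hstack=inputs=2[videoout]
"
-map '[videoout]' -c:v libx264 -b:v 3000k -t 00:00:10.0 output.mp4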
UPDATE
Full paste:
ffmpeg version 3.0.2-1~xenial2 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.3.1 (Ubuntu 5.3.1-14ubuntu2) 20160413
configuration: --prefix=/usr --extra-version='1~xenial2' --libdir=/usr/lib/ffmpeg --shlibdir=/usr/lib/ffmpeg --disable-static --disable-debug --toolchain=hardened --enable-pthreads --enable-runtime-cpudetect --enable-gpl --enable-shared --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-amrwbenc --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-libx264 --enable-libopencv --enable-libkvazaar --enable-libopenh264 --enable-nonfree --enable-libfdk-aac --enable-libfaac
libavutil 55. 17.103 / 55. 17.103
libavcodec 57. 24.102 / 57. 24.102
libavformat 57. 25.100 / 57. 25.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 31.100 / 6. 31.100
libavresample 3. 0. 0 / 3. 0. 0
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
Input #0, matroska,webm, from '/home/dbugger/projects/mediabooth/public/uploads/1199c37e0b76e2e322b6473e0c61b7a87fe4b06e/media0':
Metadata:
encoder : libwebm-0.2.1.0
creation_time : 2015-03-08 22:30:14
Duration: 00:00:09.98, start: 0.000000, bitrate: 2254 kb/s
Stream #0:0(eng): Video: vp8, yuv420p, 960x720, SAR 1:1 DAR 4:3, 29.97 fps, 29.97 tbr, 1k tbn, 1k tbc (default)
Input #1, matroska,webm, from '/home/dbugger/projects/mediabooth/public/uploads/1199c37e0b76e2e322b6473e0c61b7a87fe4b06e/media1':
Metadata:
encoder : libwebm-0.2.1.0
creation_time : 2015-03-12 16:22:27
Duration: 00:00:09.97, start: 0.000000, bitrate: 1648 kb/s
Stream #1:0(eng): Video: vp8, yuv420p, 960x720, SAR 1:1 DAR 4:3, 23.98 fps, 23.98 tbr, 1k tbn, 1k tbc (default)
Input #2, matroska,webm, from '/home/dbugger/projects/mediabooth/public/uploads/1199c37e0b76e2e322b6473e0c61b7a87fe4b06e/media2':
Metadata:
encoder : libwebm-0.2.1.0
creation_time : 2015-03-11 04:14:51
Duration: 00:00:09.98, start: 0.000000, bitrate: 2058 kb/s
Stream #2:0(eng): Video: vp8, yuv420p, 960x720, SAR 1:1 DAR 4:3, 29.97 fps, 29.97 tbr, 1k tbn, 1k tbc (default)
[mjpeg @ 0x18d6840] Changing bps to 8
Input #3, jpeg_pipe, from '/home/dbugger/projects/mediabooth/public/uploads/1199c37e0b76e2e322b6473e0c61b7a87fe4b06e/media3':
Duration: N/A, bitrate: N/A
Stream #3:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1032x1032 [SAR 1:1 DAR 1:1], 25 tbr, 25 tbn, 25 tbc
[swscaler @ 0x19d0080] deprecated pixel format used, make sure you did set range correctly
[Parsed_hstack_15 @ 0x1914f40] Input 1 height 874 does not match input 0 height 872.
[Parsed_hstack_15 @ 0x1914f40] Failed to configure output pad on Parsed_hstack_15
Error configuring complex filters.
-
ffmpeg, v4l, snd_aloop ... sound asynchronous (alsa buffer xrun)
28 January 2019, by Tobias: I'm trying to create a stream that automatically reloads random inputs. I would like to extend this with a database later.
Each time ffmpeg finishes and starts again, i.e. the input changes, the connection to the RTMP server is briefly interrupted, which causes the whole connection to break down. I therefore tried to separate audio and video, send them to virtual devices, and read them back from there: split the stream onto virtual devices, reassemble it directly, and send it to RTMP. When the input is swapped, only the writing to the devices is interrupted, and that does not bother the second ffmpeg. As soon as I stop sending to the devices, the fps drops very slowly (over 10-20 seconds) from 25 to 0; only then does the sending ffmpeg drop the connection to the RTMP server. The script that swaps the inputs needs only one second. A practical test showed that everything works as desired.
I can quite comfortably change the input while the second ffmpeg maintains the stream ...
The joy did not last long, though. The sound is delayed by a good second, but only sporadically: sometimes everything works great, sometimes the sound is offset.
I wrote several scripts for this.
Background:
- The file is selected at random
- The media file is split and written to /dev/video0 (v4l loopback) and to ALSA default (snd_aloop loopback); see the module-loading sketch just after this list
- The two parts are put back together again and streamed to an RTMP server
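For completeness, a minimal sketch of the prerequisite this setup relies on (an assumption about the environment, not taken from the question): the virtual devices come from the v4l2loopback and snd-aloop kernel modules, which have to be loaded before the scripts run.
# Assumed prerequisite, run once before starting the scripts
sudo modprobe v4l2loopback   # provides /dev/videoN, e.g. /dev/video0
sudo modprobe snd-aloop      # provides the ALSA "Loopback" card
# "default" then has to point at the loopback card, e.g. via ~/.asoundrc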
The code that selects the input and sends it to /dev/video0 and ALSA default:
#!/bin/bash
cat /dev/null > log
while true;
do
    WATERMARK="watermark.png";
    dir='/homeXXXXXXXXXX/mix'
    # Pick a random file from the media directory
    file=`/bin/ls -1 "$dir" | sort --random-sort | head -1`
    DATEI=`readlink --canonicalize "$dir/$file"` # Converts to full path
    if [ -z "$DATEI" ]
    then
        echo "Keine Datei gefunden" >> log; # "No file found"
    else
        START=$(date +%s);
        echo "Sende $DATEI" >> log; # "Sending $DATEI"
        # Decode the file: raw video goes to /dev/video0, PCM audio to ALSA "default"
        ffmpeg -re -y -i "$DATEI" -c:v libx264 -vf "fps=25,scale=640:480,setdar=4:3" -async 1 -pix_fmt yuv420p -preset ultrafast -map 0:0 -f v4l2 -vcodec rawvideo /dev/video0 -f alsa default
    fi
    # An external "kill" file can stop the loop
    DOKILL=`cat kill`;
    if [ "$DOKILL" = "1" ]
    then
        break;
    fi
done
The output:
./run.sh
ffmpeg version 3.2.12-1~deb9u1 Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 6.3.0 (Debian 6.3.0-18+deb9u1) 20170516
configuration: --prefix=/usr --extra-version='1~deb9u1' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil 55. 34.101 / 55. 34.101
libavcodec 57. 64.101 / 57. 64.101
libavformat 57. 56.101 / 57. 56.101
libavdevice 57. 1.100 / 57. 1.100
libavfilter 6. 65.100 / 6. 65.100
libavresample 3. 1. 0 / 3. 1. 0
libswscale 4. 2.100 / 4. 2.100
libswresample 2. 3.100 / 2. 3.100
libpostproc 54. 1.100 / 54. 1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/home/mix/XXXXXXXXXXXXX.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
title : XXXXXXXXXXXXXXX
encoder : Lavf57.41.100
Duration: 00:03:53.48, start: 0.000000, bitrate: 2705 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 2573 kb/s, 23.98 fps, 23.98 tbr, 24k tbn, 47.95 tbc (default)
Metadata:
handler_name : VideoHandler
Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 127 kb/s (default)
Metadata:
handler_name : SoundHandler
Codec AVOption preset (Configuration preset) specified for output file #0 (/dev/video0) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
[Parsed_setdar_2 @ 0x5571234fe020] num:den syntax is deprecated, please use num/den or named options instead
-async is forwarded to lavfi similarly to -af aresample=async=1:min_hard_comp=0.100000:first_pts=0.
Output #0, v4l2, to '/dev/video0':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
title : XXXXXXXXXXX
encoder : Lavf57.56.101
Stream #0:0(und): Video: rawvideo (I420 / 0x30323449), yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 200 kb/s, 25 fps, 25 tbn, 25 tbc (default)
Metadata:
handler_name : VideoHandler
encoder : Lavc57.64.101 rawvideo
Output #1, alsa, to 'default':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
title : XXXXXXXXXX
encoder : Lavf57.56.101
Stream #1:0(und): Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s (default)
Metadata:
handler_name : SoundHandler
encoder : Lavc57.64.101 pcm_s16le
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> rawvideo (native))
Stream #0:1 -> #1:0 (aac (native) -> pcm_s16le (native))
Press [q] to stop, [?] for help
frame= 736 fps= 24 q=-0.0 Lsize=N/A time=00:00:29.67 bitrate=N/A speed=0.979x
video:331200kB audio:5112kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Exiting normally, received signal 2.
The send script:
#!/bin/bash
IP="XXXXXXXXX";
ffmpeg -f video4linux2 -i /dev/video0 -f alsa -acodec pcm_s16le -i default -f flv -async 1 -pix_fmt yuv420p -preset ultrafast -vcodec libx264 -r 25 -s 640x260 -acodec aac rtmp://$IP:1935/live/test
The output:
./send_stream.sh
ffmpeg version 3.2.12-1~deb9u1 Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 6.3.0 (Debian 6.3.0-18+deb9u1) 20170516
configuration: --prefix=/usr --extra-version='1~deb9u1' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil 55. 34.101 / 55. 34.101
libavcodec 57. 64.101 / 57. 64.101
libavformat 57. 56.101 / 57. 56.101
libavdevice 57. 1.100 / 57. 1.100
libavfilter 6. 65.100 / 6. 65.100
libavresample 3. 1. 0 / 3. 1. 0
libswscale 4. 2.100 / 4. 2.100
libswresample 2. 3.100 / 2. 3.100
libpostproc 54. 1.100 / 54. 1.100
Input #0, video4linux2,v4l2, from '/dev/video0':
Duration: N/A, start: 1548393682.674066, bitrate: 110592 kb/s
Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 640x480, 110592 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Guessed Channel Layout for Input Stream #1.0 : stereo
Input #1, alsa, from 'default':
Duration: N/A, start: 1548393682.677901, bitrate: 1536 kb/s
Stream #1:0: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s
-async is forwarded to lavfi similarly to -af aresample=async=1:min_hard_comp=0.100000:first_pts=0.
[libx264 @ 0x55e22cfa4f00] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x55e22cfa4f00] profile Constrained Baseline, level 2.1
[libx264 @ 0x55e22cfa4f00] 264 - core 148 r2748 97eaef2 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
Output #0, flv, to 'rtmp://XXXXXXXXXXX:1935/live/test':
Metadata:
encoder : Lavf57.56.101
Stream #0:0: Video: h264 (libx264) ([7][0][0][0] / 0x0007), yuv420p, 640x260, q=-1--1, 25 fps, 1k tbn, 25 tbc
Metadata:
encoder : Lavc57.64.101 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
Stream #0:1: Audio: aac (LC) ([10][0][0][0] / 0x000A), 48000 Hz, stereo, fltp, 128 kb/s
Metadata:
encoder : Lavc57.64.101 aac
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
Stream #1:0 -> #0:1 (pcm_s16le (native) -> aac (native))
Press [q] to stop, [?] for help
[alsa @ 0x55e22cf87300] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8)
[video4linux2,v4l2 @ 0x55e22cf84fe0] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8)
Past duration 0.613319 too large 7344kB time=00:01:05.85 bitrate= 913.5kbits/s speed=1.04x
Past duration 0.614372 too large 7644kB time=00:01:08.39 bitrate= 915.6kbits/s speed=1.04x
Past duration 0.609749 too large 7834kB time=00:01:10.91 bitrate= 905.0kbits/s speed=1.04x
Past duration 0.604362 too large 8038kB time=00:01:12.92 bitrate= 903.0kbits/s speed=1.04x
Past duration 0.609489 too large 8070kB time=00:01:13.45 bitrate= 900.1kbits/s speed=1.04x
Past duration 0.615013 too large 8094kB time=00:01:13.94 bitrate= 896.8kbits/s speed=1.04x
Past duration 0.610893 too large 8179kB time=00:01:14.94 bitrate= 894.0kbits/s speed=1.04x
Past duration 0.664711 too large
Past duration 0.639565 too large 8263kB time=00:01:15.47 bitrate= 896.8kbits/s speed=1.04x
Past duration 0.668999 too large 8339kB time=00:01:15.94 bitrate= 899.5kbits/s speed=1.04x
Past duration 0.605766 too large
Past duration 0.633049 too large 8399kB time=00:01:16.48 bitrate= 899.6kbits/s speed=1.04x
Past duration 0.674599 too large
Past duration 0.616035 too large 8451kB time=00:01:16.95 bitrate= 899.7kbits/s speed=1.04x
Past duration 0.656136 too large
Past duration 0.604195 too large
Past duration 0.601387 too large 8512kB time=00:01:17.46 bitrate= 900.2kbits/s speed=1.04x
Past duration 0.621895 too large 8565kB time=00:01:17.95 bitrate= 900.1kbits/s speed=1.04x
Past duration 0.670937 too large 8605kB time=00:01:18.46 bitrate= 898.4kbits/s speed=1.04x
Past duration 0.604500 too large 8642kB time=00:01:18.99 bitrate= 896.2kbits/s speed=1.04x
frame= 1913 fps= 25 q=-1.0 Lsize= 8670kB time=00:01:19.48 bitrate= 893.6kbits/s speed=1.04x
video:7290kB audio:1280kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.160292%
[libx264 @ 0x55e22cfa4f00] frame I:8 Avg QP:18.25 size: 15502
[libx264 @ 0x55e22cfa4f00] frame P:1905 Avg QP:20.95 size: 3853
[libx264 @ 0x55e22cfa4f00] mb I I16..4: 100.0% 0.0% 0.0%
[libx264 @ 0x55e22cfa4f00] mb P I16..4: 6.4% 0.0% 0.0% P16..4: 38.1% 0.0% 0.0% 0.0% 0.0% skip:55.5%
[libx264 @ 0x55e22cfa4f00] coded y,uvDC,uvAC intra: 46.0% 30.3% 13.4% inter: 20.1% 9.8% 1.1%
[libx264 @ 0x55e22cfa4f00] i16 v,h,dc,p: 47% 34% 10% 9%
[libx264 @ 0x55e22cfa4f00] i8c dc,h,v,p: 45% 28% 22% 5%
[libx264 @ 0x55e22cfa4f00] kb/s:750.98
[aac @ 0x55e22cfa62a0] Qavg: 579.067
Exiting normally, received signal 2.
At first everything is fine, and then this appears:
Past duration 0.616035 too large 8451kB time=00:01:16.95 bitrate= 899.7kbits/s speed=1.04x
Past duration 0.656136 too large
Past duration 0.604195 too large
Past duration 0.601387 too large 8512kB time=00:01:17.46 bitrate= 900.2kbits/s speed=1.04x
And when that shows up, this appears in the first window, i.e. in the ffmpeg that sends the input:
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> rawvideo (native))
Stream #0:1 -> #1:0 (aac (native) -> pcm_s16le (native))
Press [q] to stop, [?] for help
frame= 9 fps=0.0 q=-0.0 size=N/A time=00:00:00.36 bitrate=N/A dup=1 drop=0 spframe= 21 fps= 21 q=-0.0 size=N/A time=00:00:00.84 bitrate=N/A dup=1 drop=0 sp[alsa @ 0x5643b3293160] ALSA buffer xrun.
Last message repeated 1 times
frame= 33 fps= 22 q=-0.0 size=N/A time=00:00:01.32 bitrate=N/A dup=1 drop=0 sp[alsa @ 0x5643b3293160] ALSA buffer xrun.
Last message repeated 1 times
frame= 46 fps= 23 q=-0.0 size=N/A time=00:00:01.84 bitrate=N/A dup=1 drop=0 spframe= 58 fps= 23 q=-0.0 size=N/A time=00:00:02.32 bitrate=N/A dup=1 drop=0 spframe= 71 fps= 24 q=-0.0 size=N/A time=00:00:02.84 bitrate=N/A dup=1 drop=0 spframe= 83 fps= 24 q=-0.0 size=N/A time=00:00:03.32 bitrate=N/A dup=1 drop=0 sp[alsa @ 0x5643b3293160] ALSA buffer xrun.
frame= 96 fps= 24 q=-0.0 size=N/A time=00:00:03.84 bitrate=N/A dup=1 drop=0 sp[alsa @ 0x5643b3293160] ALSA buffer xrun.
The sound is then completely out of sync ...
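Not a confirmed fix, but the logs above suggest two things to make explicit on the sending side: both device inputs warn "consider raising the thread_queue_size option (current value: 8)", and -async is only a shorthand for an aresample filter. A sketch of the send command with those options spelled out (all other parameters unchanged; the queue size of 1024 is an arbitrary value to experiment with):
ffmpeg -f video4linux2 -thread_queue_size 1024 -i /dev/video0 -f alsa -thread_queue_size 1024 -acodec pcm_s16le -i default -af "aresample=async=1:min_hard_comp=0.100000:first_pts=0" -f flv -pix_fmt yuv420p -preset ultrafast -vcodec libx264 -r 25 -s 640x260 -acodec aac rtmp://$IP:1935/live/test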
Does anyone have any advice and can help me?
-
Play video using mse (media source extension) in google chrome
23 August 2019, by liyuqihxc: I'm working on a project that converts an RTSP stream (ffmpeg) and plays it on a web page (SignalR + MSE).
So far it works pretty much as I expected on the latest versions of Edge and Firefox, but not Chrome.
Here's the code:
public class WebmMediaStreamContext
{
private Process _ffProcess;
private readonly string _cmd;
private byte[] _initSegment;
private Task _readMediaStreamTask;
private CancellationTokenSource _cancellationTokenSource;
private const string _CmdTemplate = "-i {0} -c:v libvpx -tile-columns 4 -frame-parallel 1 -keyint_min 90 -g 90 -f webm -dash 1 pipe:";
public static readonly byte[] ClusterStart = { 0x1F, 0x43, 0xB6, 0x75, 0x01, 0x00, 0x00, 0x00 };
public event EventHandler<ClusterReadyEventArgs> ClusterReadyEvent;
public WebmMediaStreamContext(string rtspFeed)
{
_cmd = string.Format(_CmdTemplate, rtspFeed);
}
public async Task StartConverting()
{
if (_ffProcess != null)
throw new InvalidOperationException();
_ffProcess = new Process();
_ffProcess.StartInfo = new ProcessStartInfo
{
FileName = "ffmpeg/ffmpeg.exe",
Arguments = _cmd,
UseShellExecute = false,
CreateNoWindow = true,
RedirectStandardOutput = true
};
_ffProcess.Start();
_initSegment = await ParseInitSegmentAndStartReadMediaStream();
}
public byte[] GetInitSegment()
{
return _initSegment;
}
// Find the first cluster, and everything before it is the InitSegment
private async Task ParseInitSegmentAndStartReadMediaStream()
{
Memory<byte> buffer = new byte[10 * 1024];
int length = 0;
while (length != buffer.Length)
{
length += await _ffProcess.StandardOutput.BaseStream.ReadAsync(buffer.Slice(length));
int cluster = buffer.Span.IndexOf(ClusterStart);
if (cluster >= 0)
{
_cancellationTokenSource = new CancellationTokenSource();
_readMediaStreamTask = new Task(() => ReadMediaStreamProc(buffer.Slice(cluster, length - cluster).ToArray(), _cancellationTokenSource.Token), _cancellationTokenSource.Token, TaskCreationOptions.LongRunning);
_readMediaStreamTask.Start();
return buffer.Slice(0, cluster).ToArray();
}
}
throw new InvalidOperationException();
}
private void ReadMoreBytes(Span<byte> buffer)
{
int size = buffer.Length;
while (size > 0)
{
int len = _ffProcess.StandardOutput.BaseStream.Read(buffer.Slice(buffer.Length - size));
size -= len;
}
}
// Parse every single cluster and fire ClusterReadyEvent
private void ReadMediaStreamProc(byte[] bytesRead, CancellationToken cancel)
{
Span<byte> buffer = new byte[5 * 1024 * 1024];
bytesRead.CopyTo(buffer);
int bufferEmptyIndex = bytesRead.Length;
do
{
if (bufferEmptyIndex < ClusterStart.Length + 4)
{
ReadMoreBytes(buffer.Slice(bufferEmptyIndex, 1024));
bufferEmptyIndex += 1024;
}
int clusterDataSize = BitConverter.ToInt32(
buffer.Slice(ClusterStart.Length, 4)
.ToArray()
.Reverse()
.ToArray()
);
int clusterSize = ClusterStart.Length + 4 + clusterDataSize;
if (clusterSize > buffer.Length)
{
byte[] newBuffer = new byte[clusterSize];
buffer.Slice(0, bufferEmptyIndex).CopyTo(newBuffer);
buffer = newBuffer;
}
if (bufferEmptyIndex < clusterSize)
{
ReadMoreBytes(buffer.Slice(bufferEmptyIndex, clusterSize - bufferEmptyIndex));
bufferEmptyIndex = clusterSize;
}
ClusterReadyEvent?.Invoke(this, new ClusterReadyEventArgs(buffer.Slice(0, bufferEmptyIndex).ToArray()));
bufferEmptyIndex = 0;
} while (!cancel.IsCancellationRequested);
}
}
I use ffmpeg to convert the RTSP stream to a VP8 WebM byte stream and parse it into the "Init Segment" (EBML head, info, tracks ...) and "Media Segments" (clusters), then send them to the browser via SignalR.
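For reference, the conversion step driven by _CmdTemplate corresponds to a command line like the one below; a sketch for testing that step on its own, where rtsp://example.com/stream and out.webm are placeholders rather than values from the project:
# Same options as _CmdTemplate above; the URL and output file are placeholders
ffmpeg -i rtsp://example.com/stream -c:v libvpx -tile-columns 4 -frame-parallel 1 -keyint_min 90 -g 90 -f webm -dash 1 pipe: > out.webm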
$(function () {
var mediaSource = new MediaSource();
var mimeCodec = 'video/webm; codecs="vp8"';
var video = document.getElementById('video');
mediaSource.addEventListener('sourceopen', callback, false);
function callback(e) {
var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
var queue = [];
sourceBuffer.addEventListener('updateend', function () {
if (queue.length === 0) {
return;
}
var base64 = queue[0];
if (base64.length === 0) {
mediaSource.endOfStream();
queue.shift();
return;
} else {
var buffer = new Uint8Array(atob(base64).split("").map(function (c) {
return c.charCodeAt(0);
}));
sourceBuffer.appendBuffer(buffer);
queue.shift();
}
}, false);
var connection = new signalR.HubConnectionBuilder()
.withUrl("/signalr-video")
.configureLogging(signalR.LogLevel.Information)
.build();
connection.start().then(function () {
connection.stream("InitVideoReceive")
.subscribe({
next: function(item) {
if (queue.length === 0 && !!!sourceBuffer.updating) {
var buffer = new Uint8Array(atob(item).split("").map(function (c) {
return c.charCodeAt(0);
}));
sourceBuffer.appendBuffer(buffer);
console.log(blockindex++ + " : " + buffer.byteLength);
} else {
queue.push(item);
}
},
complete: function () {
queue.push('');
},
error: function (err) {
console.error(err);
}
});
});
}
video.src = window.URL.createObjectURL(mediaSource);
})chrome just play the video for 3 5 seconds and then stop for buffering, even though there are plenty of cluster transfered and inserted into SourceBuffer.
Here's the information from chrome://media-internals/:
Player Properties:
render_id: 217
player_id: 1
origin_url: http://localhost:52531/
frame_url: http://localhost:52531/
frame_title: Home Page
url: blob:http://localhost:52531/dcb25d89-9830-40a5-ba88-33c13b5c03eb
info: Selected FFmpegVideoDecoder for video decoding, config: codec: vp8 format: 1 profile: vp8 coded size: [1280,720] visible rect: [0,0,1280,720] natural size: [1280,720] has extra data? false encryption scheme: Unencrypted rotation: 0°
pipeline_state: kSuspended
found_video_stream: true
video_codec_name: vp8
video_dds: false
video_decoder: FFmpegVideoDecoder
duration: unknown
height: 720
width: 1280
video_buffering_state: BUFFERING_HAVE_NOTHING
for_suspended_start: false
pipeline_buffering_state: BUFFERING_HAVE_NOTHING
event: PAUSE
Log
Timestamp Property Value
00:00:00 00 origin_url http://localhost:52531/
00:00:00 00 frame_url http://localhost:52531/
00:00:00 00 frame_title Home Page
00:00:00 00 url blob:http://localhost:52531/dcb25d89-9830-40a5-ba88-33c13b5c03eb
00:00:00 00 info ChunkDemuxer: buffering by DTS
00:00:00 35 pipeline_state kStarting
00:00:15 213 found_video_stream true
00:00:15 213 video_codec_name vp8
00:00:15 216 video_dds false
00:00:15 216 video_decoder FFmpegVideoDecoder
00:00:15 216 info Selected FFmpegVideoDecoder for video decoding, config: codec: vp8 format: 1 profile: vp8 coded size: [1280,720] visible rect: [0,0,1280,720] natural size: [1280,720] has extra data? false encryption scheme: Unencrypted rotation: 0°
00:00:15 216 pipeline_state kPlaying
00:00:15 213 duration unknown
00:00:16 661 height 720
00:00:16 661 width 1280
00:00:16 665 video_buffering_state BUFFERING_HAVE_ENOUGH
00:00:16 665 for_suspended_start false
00:00:16 665 pipeline_buffering_state BUFFERING_HAVE_ENOUGH
00:00:16 667 pipeline_state kSuspending
00:00:16 670 pipeline_state kSuspended
00:00:52 759 info Effective playback rate changed from 0 to 1
00:00:52 759 event PLAY
00:00:52 759 pipeline_state kResuming
00:00:52 760 video_dds false
00:00:52 760 video_decoder FFmpegVideoDecoder
00:00:52 760 info Selected FFmpegVideoDecoder for video decoding, config: codec: vp8 format: 1 profile: vp8 coded size: [1280,720] visible rect: [0,0,1280,720] natural size: [1280,720] has extra data? false encryption scheme: Unencrypted rotation: 0°
00:00:52 760 pipeline_state kPlaying
00:00:52 793 height 720
00:00:52 793 width 1280
00:00:52 798 video_buffering_state BUFFERING_HAVE_ENOUGH
00:00:52 798 for_suspended_start false
00:00:52 798 pipeline_buffering_state BUFFERING_HAVE_ENOUGH
00:00:56 278 video_buffering_state BUFFERING_HAVE_NOTHING
00:00:56 295 for_suspended_start false
00:00:56 295 pipeline_buffering_state BUFFERING_HAVE_NOTHING
00:01:20 717 event PAUSE
00:01:33 538 event PLAY
00:01:35 94 event PAUSE
00:01:55 561 pipeline_state kSuspending
00:01:55 563 pipeline_state kSuspended
Can someone tell me what's wrong with my code, or does Chrome require some magic configuration to work?
Thanks
Please excuse my English :)