
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (66)
-
Other interesting software
13 April 2011, by
We don’t claim to be the only ones doing what we do ... and above all we don’t claim to be the best either ... What we do, we just try to do it well, and to keep getting better ...
The following list covers software that more or less does what MediaSPIP does, or that MediaSPIP more or less tries to do the same as, whatever ...
We don’t know them, we haven’t tried them, but you can take a peek.
Videopress
Website: http://videopress.com/
License: GNU/GPL v2
Source code: (...) -
Other interesting software (French)
12 April 2011, by
We don’t claim to be the only ones doing what we do ... and above all we don’t claim to be the best either ... What we do, we just try to do it well, and better and better ...
The following list covers software that more or less does what MediaSPIP does, or that MediaSPIP more or less tries to do the same as, whatever ...
We don’t know them, we haven’t tried them, but you can perhaps take a look.
Videopress
Website: (...) -
Possibility of farm deployment
12 April 2011, by
MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites.
This makes it possible, for example: to share the setup costs between several projects / individuals; to deploy a multitude of unique sites quickly; to avoid having to put all the creations into a digital catch-all, as is the case with the big general-public platforms scattered across the (...)
On other sites (7650)
-
ffmpeg, v4l, snd_aloop ... sound asynchronous (ALSA buffer xrun)
28 January 2019, by Tobias
I’m trying to create a stream that automatically reloads random inputs. I would like to extend this to use a database later.
Each time ffmpeg finishes and restarts (so the input changes), the connection to the RTMP server is briefly interrupted, causing the whole connection to break down. I then tried to separate audio and video, send them to virtual devices, and read from there again: split the stream onto virtual devices, reassemble it directly, and send it to RTMP. If the input is then exchanged, only the sending to the devices is interrupted, which does not bother the second ffmpeg. As soon as I stop sending to the devices, the fps drop slowly (over 10 - 20 seconds) from 25 to 0; only then does the transmitting ffmpeg break the connection to the RTMP server. The script which exchanges the inputs needs only one second. A practical test showed that everything works as desired.
I can quite comfortably change the input while the second ffmpeg maintains the stream ...
The joy did not last long. The sound is delayed by a good second, but only sporadically. Sometimes everything works great; sometimes the sound is offset.
I wrote several scripts for this.
Background:
- The file is selected at random
- The media file is split and written to /dev/video0 (v4l2 loopback) and the ALSA default device (snd_aloop loopback)
- The split streams are put back together and streamed to an RTMP server
Code that selects the input and sends it to /dev/video0 and the ALSA default device:
#!/bin/bash
cat /dev/null > log
while true;
do
    WATERMARK="watermark.png";
    dir='/homeXXXXXXXXXX/mix'
    # Pick a random file from the directory
    file=`/bin/ls -1 "$dir" | sort --random-sort | head -1`
    DATEI=`readlink --canonicalize "$dir/$file"` # Converts to full path
    if [ -z "$DATEI" ]
    then
        echo "Keine Datei gefunden" >> log;   # "No file found"
    else
        START=$(date +%s);
        echo "Sende $DATEI" >> log;           # "Sending $DATEI"
        ffmpeg -re -y -i "$DATEI" -c:v libx264 -vf "fps=25,scale=640:480,setdar=4:3" -async 1 -pix_fmt yuv420p -preset ultrafast -map 0:0 -f v4l2 -vcodec rawvideo /dev/video0 -f alsa default
    fi
    DOKILL=`cat kill`;
    if [ "$DOKILL" = "1" ]
    then
        break;
    fi
done
The Output:
./run.sh
ffmpeg version 3.2.12-1~deb9u1 Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 6.3.0 (Debian 6.3.0-18+deb9u1) 20170516
configuration: --prefix=/usr --extra-version='1~deb9u1' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil 55. 34.101 / 55. 34.101
libavcodec 57. 64.101 / 57. 64.101
libavformat 57. 56.101 / 57. 56.101
libavdevice 57. 1.100 / 57. 1.100
libavfilter 6. 65.100 / 6. 65.100
libavresample 3. 1. 0 / 3. 1. 0
libswscale 4. 2.100 / 4. 2.100
libswresample 2. 3.100 / 2. 3.100
libpostproc 54. 1.100 / 54. 1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/home/mix/XXXXXXXXXXXXX.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
title : XXXXXXXXXXXXXXX
encoder : Lavf57.41.100
Duration: 00:03:53.48, start: 0.000000, bitrate: 2705 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 2573 kb/s, 23.98 fps, 23.98 tbr, 24k tbn, 47.95 tbc (default)
Metadata:
handler_name : VideoHandler
Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 127 kb/s (default)
Metadata:
handler_name : SoundHandler
Codec AVOption preset (Configuration preset) specified for output file #0 (/dev/video0) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
[Parsed_setdar_2 @ 0x5571234fe020] num:den syntax is deprecated, please use num/den or named options instead
-async is forwarded to lavfi similarly to -af aresample=async=1:min_hard_comp=0.100000:first_pts=0.
Output #0, v4l2, to '/dev/video0':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
title : XXXXXXXXXXX
encoder : Lavf57.56.101
Stream #0:0(und): Video: rawvideo (I420 / 0x30323449), yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 200 kb/s, 25 fps, 25 tbn, 25 tbc (default)
Metadata:
handler_name : VideoHandler
encoder : Lavc57.64.101 rawvideo
Output #1, alsa, to 'default':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
title : XXXXXXXXXX
encoder : Lavf57.56.101
Stream #1:0(und): Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s (default)
Metadata:
handler_name : SoundHandler
encoder : Lavc57.64.101 pcm_s16le
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> rawvideo (native))
Stream #0:1 -> #1:0 (aac (native) -> pcm_s16le (native))
Press [q] to stop, [?] for help
frame= 736 fps= 24 q=-0.0 Lsize=N/A time=00:00:29.67 bitrate=N/A speed=0.979x
video:331200kB audio:5112kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Exiting normally, received signal 2.
The send script:
#!/bin/bash
IP="XXXXXXXXX";
ffmpeg -f video4linux2 -i /dev/video0 -f alsa -acodec pcm_s16le -i default -f flv -async 1 -pix_fmt yuv420p -preset ultrafast -vcodec libx264 -r 25 -s 640x260 -acodec aac rtmp://$IP:1935/live/test
The Output:
./send_stream.sh
ffmpeg version 3.2.12-1~deb9u1 Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 6.3.0 (Debian 6.3.0-18+deb9u1) 20170516
configuration: --prefix=/usr --extra-version='1~deb9u1' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil 55. 34.101 / 55. 34.101
libavcodec 57. 64.101 / 57. 64.101
libavformat 57. 56.101 / 57. 56.101
libavdevice 57. 1.100 / 57. 1.100
libavfilter 6. 65.100 / 6. 65.100
libavresample 3. 1. 0 / 3. 1. 0
libswscale 4. 2.100 / 4. 2.100
libswresample 2. 3.100 / 2. 3.100
libpostproc 54. 1.100 / 54. 1.100
Input #0, video4linux2,v4l2, from '/dev/video0':
Duration: N/A, start: 1548393682.674066, bitrate: 110592 kb/s
Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 640x480, 110592 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Guessed Channel Layout for Input Stream #1.0 : stereo
Input #1, alsa, from 'default':
Duration: N/A, start: 1548393682.677901, bitrate: 1536 kb/s
Stream #1:0: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s
-async is forwarded to lavfi similarly to -af aresample=async=1:min_hard_comp=0.100000:first_pts=0.
[libx264 @ 0x55e22cfa4f00] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x55e22cfa4f00] profile Constrained Baseline, level 2.1
[libx264 @ 0x55e22cfa4f00] 264 - core 148 r2748 97eaef2 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
Output #0, flv, to 'rtmp://XXXXXXXXXXX:1935/live/test':
Metadata:
encoder : Lavf57.56.101
Stream #0:0: Video: h264 (libx264) ([7][0][0][0] / 0x0007), yuv420p, 640x260, q=-1--1, 25 fps, 1k tbn, 25 tbc
Metadata:
encoder : Lavc57.64.101 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
Stream #0:1: Audio: aac (LC) ([10][0][0][0] / 0x000A), 48000 Hz, stereo, fltp, 128 kb/s
Metadata:
encoder : Lavc57.64.101 aac
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
Stream #1:0 -> #0:1 (pcm_s16le (native) -> aac (native))
Press [q] to stop, [?] for help
[alsa @ 0x55e22cf87300] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8)
[video4linux2,v4l2 @ 0x55e22cf84fe0] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8)
Past duration 0.613319 too large 7344kB time=00:01:05.85 bitrate= 913.5kbits/s speed=1.04x
Past duration 0.614372 too large 7644kB time=00:01:08.39 bitrate= 915.6kbits/s speed=1.04x
Past duration 0.609749 too large 7834kB time=00:01:10.91 bitrate= 905.0kbits/s speed=1.04x
Past duration 0.604362 too large 8038kB time=00:01:12.92 bitrate= 903.0kbits/s speed=1.04x
Past duration 0.609489 too large 8070kB time=00:01:13.45 bitrate= 900.1kbits/s speed=1.04x
Past duration 0.615013 too large 8094kB time=00:01:13.94 bitrate= 896.8kbits/s speed=1.04x
Past duration 0.610893 too large 8179kB time=00:01:14.94 bitrate= 894.0kbits/s speed=1.04x
Past duration 0.664711 too large
Past duration 0.639565 too large 8263kB time=00:01:15.47 bitrate= 896.8kbits/s speed=1.04x
Past duration 0.668999 too large 8339kB time=00:01:15.94 bitrate= 899.5kbits/s speed=1.04x
Past duration 0.605766 too large
Past duration 0.633049 too large 8399kB time=00:01:16.48 bitrate= 899.6kbits/s speed=1.04x
Past duration 0.674599 too large
Past duration 0.616035 too large 8451kB time=00:01:16.95 bitrate= 899.7kbits/s speed=1.04x
Past duration 0.656136 too large
Past duration 0.604195 too large
Past duration 0.601387 too large 8512kB time=00:01:17.46 bitrate= 900.2kbits/s speed=1.04x
Past duration 0.621895 too large 8565kB time=00:01:17.95 bitrate= 900.1kbits/s speed=1.04x
Past duration 0.670937 too large 8605kB time=00:01:18.46 bitrate= 898.4kbits/s speed=1.04x
Past duration 0.604500 too large 8642kB time=00:01:18.99 bitrate= 896.2kbits/s speed=1.04x
frame= 1913 fps= 25 q=-1.0 Lsize= 8670kB time=00:01:19.48 bitrate= 893.6kbits/s speed=1.04x
video:7290kB audio:1280kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.160292%
[libx264 @ 0x55e22cfa4f00] frame I:8 Avg QP:18.25 size: 15502
[libx264 @ 0x55e22cfa4f00] frame P:1905 Avg QP:20.95 size: 3853
[libx264 @ 0x55e22cfa4f00] mb I I16..4: 100.0% 0.0% 0.0%
[libx264 @ 0x55e22cfa4f00] mb P I16..4: 6.4% 0.0% 0.0% P16..4: 38.1% 0.0% 0.0% 0.0% 0.0% skip:55.5%
[libx264 @ 0x55e22cfa4f00] coded y,uvDC,uvAC intra: 46.0% 30.3% 13.4% inter: 20.1% 9.8% 1.1%
[libx264 @ 0x55e22cfa4f00] i16 v,h,dc,p: 47% 34% 10% 9%
[libx264 @ 0x55e22cfa4f00] i8c dc,h,v,p: 45% 28% 22% 5%
[libx264 @ 0x55e22cfa4f00] kb/s:750.98
[aac @ 0x55e22cfa62a0] Qavg: 579.067
Exiting normally, received signal 2.
At first everything is fine, and then this appears:
Past duration 0.616035 too large 8451kB time=00:01:16.95 bitrate= 899.7kbits/s speed=1.04x
Past duration 0.656136 too large
Past duration 0.604195 too large
Past duration 0.601387 too large 8512kB time=00:01:17.46 bitrate= 900.2kbits/s speed=1.04x
And when that appears, the following shows up in the first window, i.e. in the ffmpeg that sends the input:
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> rawvideo (native))
Stream #0:1 -> #1:0 (aac (native) -> pcm_s16le (native))
Press [q] to stop, [?] for help
frame= 9 fps=0.0 q=-0.0 size=N/A time=00:00:00.36 bitrate=N/A dup=1 drop=0 spframe= 21 fps= 21 q=-0.0 size=N/A time=00:00:00.84 bitrate=N/A dup=1 drop=0 sp[alsa @ 0x5643b3293160] ALSA buffer xrun.
Last message repeated 1 times
frame= 33 fps= 22 q=-0.0 size=N/A time=00:00:01.32 bitrate=N/A dup=1 drop=0 sp[alsa @ 0x5643b3293160] ALSA buffer xrun.
Last message repeated 1 times
frame= 46 fps= 23 q=-0.0 size=N/A time=00:00:01.84 bitrate=N/A dup=1 drop=0 spframe= 58 fps= 23 q=-0.0 size=N/A time=00:00:02.32 bitrate=N/A dup=1 drop=0 spframe= 71 fps= 24 q=-0.0 size=N/A time=00:00:02.84 bitrate=N/A dup=1 drop=0 spframe= 83 fps= 24 q=-0.0 size=N/A time=00:00:03.32 bitrate=N/A dup=1 drop=0 sp[alsa @ 0x5643b3293160] ALSA buffer xrun.
frame= 96 fps= 24 q=-0.0 size=N/A time=00:00:03.84 bitrate=N/A dup=1 drop=0 sp[alsa @ 0x5643b3293160] ALSA buffer xrun.
The sound is then completely out of sync ...
Does anyone have any advice and can help me?
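For reference, the warnings in the send log above ("consider raising the thread_queue_size option") point at the input queues. A sketch only, not the original poster's command, of how the send command could be started with larger input queues and explicit audio resampling; the queue size of 512, the 640x480 output size and the $IP variable from the script above are assumptions:
# Untested sketch: larger input queues plus aresample, as hinted by the log above.
ffmpeg -f video4linux2 -thread_queue_size 512 -i /dev/video0 \
       -f alsa -thread_queue_size 512 -acodec pcm_s16le -i default \
       -af "aresample=async=1:min_hard_comp=0.100000:first_pts=0" \
       -vcodec libx264 -preset ultrafast -pix_fmt yuv420p -r 25 -s 640x480 \
       -acodec aac -f flv rtmp://$IP:1935/live/test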
-
The Method of Making Video Special Effect
19 January 2019, by szhan
I’ve been trying to use the javacv libraries to make video effects, like in this screenshot. I’m trying to use ffmpeg’s watermarking method to cover the head, but I don’t know how to control the transparency and shape of the watermark.
watermark = "movie=./girl.png[watermark];[in][watermark]overlay=W-w-300:10:format=rgb[out]";
FFmpegFrameFilter frameFilter = new FFmpegFrameFilter(watermark, width, height); -
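The opacity can be adjusted by lowering the watermark's alpha channel inside the filtergraph, and the shape comes from the transparent pixels of the PNG itself. A sketch of that idea, tried here as a plain ffmpeg command rather than through FFmpegFrameFilter; input.mp4, output.mp4 and the 0.5 opacity value are placeholders:
# format=rgba guarantees an alpha plane; colorchannelmixer=aa=0.5 makes girl.png 50% opaque.
# Fully transparent pixels in girl.png stay invisible, which defines the watermark's shape.
ffmpeg -i input.mp4 -vf "movie=./girl.png,format=rgba,colorchannelmixer=aa=0.5[watermark];[in][watermark]overlay=W-w-300:10[out]" -c:a copy output.mp4
The same filter string could then be passed to the FFmpegFrameFilter constructor shown above.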
Fragmented MP4 - problem playing in browser
12 June 2019, by PookyFan
I’m trying to create fragmented MP4 from raw H264 video data so I can play it in a web browser’s player. My goal is to create a live streaming system in which a media server sends fragmented MP4 pieces to the browser. The server would buffer input data from a Raspberry Pi camera, which sends video as H264 frames, then mux that video data and make it available to the client. The browser would play the media data (muxed by the server and sent e.g. through a websocket) using Media Source Extensions.
For test purposes I wrote the following pieces of code (using many examples I found on the internet):
A C++ application using avcodec which muxes raw H264 video to fragmented MP4 and saves it to a file:
#define READBUFSIZE 4096
#define IOBUFSIZE 4096
#define ERRMSGSIZE 128
#include <cstdint>
#include <iostream>
#include <fstream>
#include <string>
#include <vector>
extern "C"
{
#include <libavformat/avformat.h>
#include <libavutil/error.h>
#include <libavutil/opt.h>
}
enum NalType : uint8_t
{
//NALs containing stream metadata
SEQ_PARAM_SET = 0x7,
PIC_PARAM_SET = 0x8
};
std::vector<uint8_t> outputData;
int mediaMuxCallback(void *opaque, uint8_t *buf, int bufSize)
{
outputData.insert(outputData.end(), buf, buf + bufSize);
return bufSize;
}
std::string getAvErrorString(int errNr)
{
char errMsg[ERRMSGSIZE];
av_strerror(errNr, errMsg, ERRMSGSIZE);
return std::string(errMsg);
}
int main(int argc, char **argv)
{
if(argc < 2)
{
std::cout << "Missing file name" << std::endl;
return 1;
}
std::fstream file(argv[1], std::ios::in | std::ios::binary);
if(!file.is_open())
{
std::cout << "Couldn't open file " << argv[1] << std::endl;
return 2;
}
std::vector<uint8_t> inputMediaData;
do
{
char buf[READBUFSIZE];
file.read(buf, READBUFSIZE);
int size = file.gcount();
if(size > 0)
inputMediaData.insert(inputMediaData.end(), buf, buf + size);
} while(!file.eof());
file.close();
//Initialize avcodec
av_register_all();
uint8_t *ioBuffer;
AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
AVCodecContext *codecCtxt = avcodec_alloc_context3(codec);
AVCodecParserContext *parserCtxt = av_parser_init(AV_CODEC_ID_H264);
AVOutputFormat *outputFormat = av_guess_format("mp4", nullptr, nullptr);
AVFormatContext *formatCtxt;
AVIOContext *ioCtxt;
AVStream *videoStream;
int res = avformat_alloc_output_context2(&formatCtxt, outputFormat, nullptr, nullptr);
if(res < 0)
{
std::cout << "Couldn't initialize format context; the error was: " << getAvErrorString(res) << std::endl;
return 3;
}
if((videoStream = avformat_new_stream( formatCtxt, avcodec_find_encoder(formatCtxt->oformat->video_codec) )) == nullptr)
{
std::cout << "Couldn't initialize video stream" << std::endl;
return 4;
}
else if(!codec)
{
std::cout << "Couldn't initialize codec" << std::endl;
return 5;
}
else if(codecCtxt == nullptr)
{
std::cout << "Couldn't initialize codec context" << std::endl;
return 6;
}
else if(parserCtxt == nullptr)
{
std::cout << "Couldn't initialize parser context" << std::endl;
return 7;
}
else if((ioBuffer = (uint8_t*)av_malloc(IOBUFSIZE)) == nullptr)
{
std::cout << "Couldn't allocate I/O buffer" << std::endl;
return 8;
}
else if((ioCtxt = avio_alloc_context(ioBuffer, IOBUFSIZE, 1, nullptr, nullptr, mediaMuxCallback, nullptr)) == nullptr)
{
std::cout << "Couldn't initialize I/O context" << std::endl;
return 9;
}
//Set video stream data
videoStream->id = formatCtxt->nb_streams - 1;
videoStream->codec->width = 1280;
videoStream->codec->height = 720;
videoStream->time_base.den = 60; //FPS
videoStream->time_base.num = 1;
videoStream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
formatCtxt->pb = ioCtxt;
//Retrieve SPS and PPS for codec extdata
const uint32_t synchMarker = 0x01000000;
unsigned int i = 0;
int spsStart = -1, ppsStart = -1;
uint16_t spsSize = 0, ppsSize = 0;
while(spsSize == 0 || ppsSize == 0)
{
uint32_t *curr = (uint32_t*)(inputMediaData.data() + i);
if(*curr == synchMarker)
{
unsigned int currentNalStart = i;
i += sizeof(uint32_t);
uint8_t nalType = inputMediaData.data()[i] & 0x1F;
if(nalType == SEQ_PARAM_SET)
spsStart = currentNalStart;
else if(nalType == PIC_PARAM_SET)
ppsStart = currentNalStart;
if(spsStart >= 0 && spsSize == 0 && spsStart != i)
spsSize = currentNalStart - spsStart;
else if(ppsStart >= 0 && ppsSize == 0 && ppsStart != i)
ppsSize = currentNalStart - ppsStart;
}
++i;
}
videoStream->codec->extradata = inputMediaData.data() + spsStart;
videoStream->codec->extradata_size = ppsStart + ppsSize;
//Write main header
AVDictionary *options = nullptr;
av_dict_set(&options, "movflags", "frag_custom+empty_moov", 0);
res = avformat_write_header(formatCtxt, &options);
if(res < 0)
{
std::cout << "Couldn't write container main header; the error was: " << getAvErrorString(res) << std::endl;
return 10;
}
//Retrieve frames from input video and wrap them in container
int currentInputIndex = 0;
int framesInSecond = 0;
while(currentInputIndex < inputMediaData.size())
{
uint8_t *frameBuffer;
int frameSize;
res = av_parser_parse2(parserCtxt, codecCtxt, &frameBuffer, &frameSize, inputMediaData.data() + currentInputIndex,
inputMediaData.size() - currentInputIndex, AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
if(frameSize == 0) //No more frames while some data still remains (is that even possible?)
{
std::cout << "Some data left unparsed: " << std::to_string(inputMediaData.size() - currentInputIndex) << std::endl;
break;
}
//Prepare packet with video frame to be dumped into container
AVPacket packet;
av_init_packet(&packet);
packet.data = frameBuffer;
packet.size = frameSize;
packet.stream_index = videoStream->index;
currentInputIndex += frameSize;
//Write packet to the video stream
res = av_write_frame(formatCtxt, &packet);
if(res < 0)
{
std::cout << "Couldn't write packet with video frame; the error was: " << getAvErrorString(res) << std::endl;
return 11;
}
if(++framesInSecond == 60) //We want 1 segment per second
{
framesInSecond = 0;
res = av_write_frame(formatCtxt, nullptr); //Flush segment
}
}
res = av_write_frame(formatCtxt, nullptr); //Flush if something has been left
//Write media data in container to file
file.open("my_mp4.mp4", std::ios::out | std::ios::binary);
if(!file.is_open())
{
std::cout << "Couldn't open output file " << std::endl;
return 12;
}
file.write((char*)outputData.data(), outputData.size());
if(file.fail())
{
std::cout << "Couldn't write to file" << std::endl;
return 13;
}
std::cout << "Media file muxed successfully" << std::endl;
return 0;
}
(I hardcoded a few values, such as the video dimensions and framerate, but as I said, this is just test code.)
A simple HTML webpage using MSE to play my fragmented MP4:
<video width="1280" height="720" controls="controls">
</video>
<script>
var vidElement = document.querySelector('video');

if (window.MediaSource) {
  var mediaSource = new MediaSource();
  vidElement.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', sourceOpen);
} else {
  console.log("The Media Source Extensions API is not supported.")
}

function sourceOpen(e) {
  URL.revokeObjectURL(vidElement.src);
  var mime = 'video/mp4; codecs="avc1.640028"';
  var mediaSource = e.target;
  var sourceBuffer = mediaSource.addSourceBuffer(mime);
  var videoUrl = 'my_mp4.mp4';
  fetch(videoUrl)
    .then(function(response) {
      return response.arrayBuffer();
    })
    .then(function(arrayBuffer) {
      sourceBuffer.addEventListener('updateend', function(e) {
        if (!sourceBuffer.updating && mediaSource.readyState === 'open') {
          mediaSource.endOfStream();
        }
      });
      sourceBuffer.appendBuffer(arrayBuffer);
    });
}
</script>
The output MP4 file generated by my C++ application can be played e.g. in MPC, but it doesn’t play in any web browser I tested it with. It also doesn’t have any duration (MPC keeps showing 00:00).
To compare with the output MP4 file I got from my C++ application described above, I also used FFmpeg to create a fragmented MP4 file from the same source file with the raw H264 stream. I used the following command:
ffmpeg -r 60 -i input.h264 -c:v copy -f mp4 -movflags empty_moov+default_base_moof+frag_keyframe test.mp4
This file generated by FFmpeg plays correctly in every web browser I used for tests. It also has a correct duration (but it also has a trailing atom, which wouldn’t be present in my live stream anyway, and as I need a live stream, it won’t have any fixed duration in the first place).
The MP4 atoms of both files look very similar (they have an identical avcC section for sure). What’s interesting (though I’m not sure whether it’s of any importance) is that both files have a different NAL format than the input file (the RPi camera produces a video stream in Annex-B format, while the output MP4 files contain NALs in AVCC format... or at least it looks that way when I compare the mdat atoms with the input H264 data).
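For reference, one way such an atom comparison can be reproduced is to run ffprobe at trace log level, which prints every box it parses; a sketch, assuming the two files are named my_mp4.mp4 and test.mp4 as above:
# Print the MP4 boxes (atoms) each file contains, in parsing order.
ffprobe -v trace my_mp4.mp4 2>&1 | grep -oE "type:'[^']+'"
ffprobe -v trace test.mp4   2>&1 | grep -oE "type:'[^']+'"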
I assume there is some field (or a few fields) I need to set for avcodec to make it produce a video stream that would be properly decoded and played by browsers’ players. But what field(s) do I need to set? Or maybe the problem lies somewhere else? I’ve run out of ideas.
EDIT 1:
As suggested, I investigated the binary content of both MP4 files (generated by my app and by the FFmpeg tool) with a hex editor. What I can confirm:
- both files have an identical avcC section (they match perfectly and are in AVCC format; I analyzed it byte after byte and there’s no mistake about it)
- both files have NALs in AVCC format (I looked closely at the mdat atoms and they don’t differ between the two MP4 files)
So I guess there’s nothing wrong with the extradata creation in my code - avcodec takes care of it properly, even if I just feed it the SPS and PPS NALs. It converts them by itself, so there’s no need for me to do it by hand. Still, my original problem remains.
EDIT 2: I achieved partial success - the MP4 generated by my app now plays in Firefox. I added this line to the code (along with the rest of the stream initialization):
videoStream->codec->time_base = videoStream->time_base;
So now this section of my code looks like this :
//Set video stream data
videoStream->id = formatCtxt->nb_streams - 1;
videoStream->codec->width = 1280;
videoStream->codec->height = 720;
videoStream->time_base.den = 60; //FPS
videoStream->time_base.num = 1;
videoStream->codec->time_base = videoStream->time_base;
videoStream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
formatCtxt->pb = ioCtxt;