
Other articles (40)
-
HTML5 audio and video support
13 April 2011 — MediaSPIP uses the HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a particular theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows media playback on major mobile platforms with the above (...) -
From upload to the final video [standalone version]
31 January 2010 — The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information about the file's audio and video streams; generation of a thumbnail: extraction of a (...) -
HTML5 audio and video support
10 April 2011 — MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
On other sites (6371)
-
How to modify the bit_rate of AVFormatContext?
27 March 2023, by Zion Liu — Hello, I would like to know how to modify the bit_rate of an AVFormatContext.


Here is a minimal reproducible example. It is a simple program that pushes an RTMP video stream; now I want to control the bitrate of the video it pushes.


I tried modifying the bitrate like this, but it didn't work.


AVFormatContext *ifmt_ctx = NULL, *ofmt_ctx = NULL;
int64_t bit_rate = 400*1000;
....
ofmt_ctx->bit_rate = bit_rate;



It can be compiled with:
g++ -Wall -g -I/usr/include/glib-2.0 -I/usr/lib/x86_64-linux-gnu/glib-2.0/include -I/usr/local/include -L/usr/local/lib simplt_push_streaming.cpp -o test.out -lavformat -lavcodec -lavutil -lgobject-2.0 -lglib-2.0 -lpthread


And my ffmpeg version:
ffmpeg version 4.4.2-0ubuntu0.22.04.1


#include <stdio.h>
extern "C"
{
#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>
#include <libavutil/time.h>
};

int main(int argc, char* argv[])
{
 AVOutputFormat *ofmt = NULL;
 //Input AVFormatContext and Output AVFormatContext
 AVFormatContext *ifmt_ctx = NULL, *ofmt_ctx = NULL;
 AVPacket pkt;
 const char *in_filename, *out_filename;
 int ret, i;
 int videoindex=-1;
 int frame_index=0;
 int64_t start_time=0;
 int64_t bit_rate = 400*1000;
 in_filename = "/home/zion/video/mecha.flv";// Input file URL
 out_filename = "rtmp://localhost:1935/live/test";//Output URL [RTMP]
 av_register_all();
 avformat_network_init();
 avformat_open_input(&ifmt_ctx, in_filename, 0, 0);
 avformat_find_stream_info(ifmt_ctx, 0);
 for(i=0; i < ifmt_ctx->nb_streams; i++) 
 if(ifmt_ctx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO){
 videoindex=i;
 break;
 }
 //Output
 avformat_alloc_output_context2(&ofmt_ctx, NULL, "flv", out_filename); //RTMP
 ofmt = ofmt_ctx->oformat;
 for (i = 0; i < ifmt_ctx->nb_streams; i++) {
 //Create output AVStream according to input AVStream
 AVStream *in_stream = ifmt_ctx->streams[i];
 AVStream *out_stream = avformat_new_stream(ofmt_ctx, in_stream->codec->codec);
 //Copy the settings of AVCodecContext
 ret = avcodec_copy_context(out_stream->codec, in_stream->codec);
 out_stream->codec->codec_tag = 0;
 if (ofmt_ctx->oformat->flags & AVFMT_GLOBALHEADER)
 out_stream->codec->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
 }
 //Open output URL
 if (!(ofmt->flags & AVFMT_NOFILE)) {
 ret = avio_open(&ofmt_ctx->pb, out_filename, AVIO_FLAG_WRITE);
 }
 //Write file header
 avformat_write_header(ofmt_ctx, NULL);
 start_time=av_gettime();
 while (1) {
 AVStream *in_stream, *out_stream;
 //Get an AVPacket
 ret = av_read_frame(ifmt_ctx, &pkt);
 if (ret < 0)
 break;
 //Important:Delay
 if(pkt.stream_index==videoindex){
 AVRational time_base=ifmt_ctx->streams[videoindex]->time_base;
 AVRational time_base_q={1,AV_TIME_BASE};
 int64_t pts_time = av_rescale_q(pkt.dts, time_base, time_base_q);
 int64_t now_time = av_gettime() - start_time;
 if (pts_time > now_time)
 av_usleep(pts_time - now_time);
 }
 in_stream = ifmt_ctx->streams[pkt.stream_index];
 out_stream = ofmt_ctx->streams[pkt.stream_index];
 if(pkt.stream_index==videoindex){
 printf("Send %8d video frames to output URL\n",frame_index);
 frame_index++;
 }
/*I tried modifying the bitrate here and I'm not sure if this is the correct usage.*/
 ofmt_ctx->bit_rate = bit_rate;
 av_interleaved_write_frame(ofmt_ctx, &pkt);
 av_free_packet(&pkt);
 }
 //Write file trailer
 av_write_trailer(ofmt_ctx);
end:
 avformat_close_input(&ifmt_ctx);
 /* close output */
 if (ofmt_ctx && !(ofmt->flags & AVFMT_NOFILE))
 avio_close(ofmt_ctx->pb);
 avformat_free_context(ofmt_ctx);
 return 0;
}
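One note on the approach above (my reading, not a confirmed diagnosis): the program only remuxes packets, so writing to ofmt_ctx->bit_rate merely updates an informational field on the muxer and cannot change the encoded bitrate; changing the actual bitrate requires re-encoding the video. As a sketch of the same push with re-encoding, using the ffmpeg CLI with the file and URL from the question (assuming an ffmpeg build with libx264 is installed and an RTMP server is listening):

```shell
# Sketch: re-encode while pushing to RTMP so the bitrate can be set.
# -re reads the input at its native frame rate, mimicking the delay loop above.
ffmpeg -re -i /home/zion/video/mecha.flv \
    -c:v libx264 -b:v 400k -maxrate 400k -bufsize 800k \
    -c:a copy \
    -f flv rtmp://localhost:1935/live/test
```

The same idea applies in code: set bit_rate on the encoder's AVCodecContext before opening it, not on the AVFormatContext.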



-
Streaming H.264 over UDP using FFmpeg, and "dimensions not set" error
3 September 2015, by Baris Demiray — I'm trying to stream H.264 over UDP, with no luck so far. Here is minimal code with which you can reproduce the problem.
To compile:
g++ -o test test.cpp -lavcodec -lavformat -lavutil
Extra information: I start ffplay as follows (currently it's of no use):
ffplay -i udp://127.0.0.1:8554/live.sdp
Output of my code (see the avio_open() call):
[libx264 @ 0x6a26c0] using mv_range_thread = 24
[libx264 @ 0x6a26c0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.1 Cache64
[libx264 @ 0x6a26c0] profile High, level 3.1
Output #0, h264, to 'udp://127.0.0.1:8554/live.sdp':
Stream #0:0, 0, 0/0: Video: h264 (libx264), -1 reference frame, none, q=-1--1
[h264 @ 0x6a2020] dimensions not set
Cannot write header to stream: Success
And the code:
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/avutil.h>
}
#include <iostream>
using namespace std;
int main() {
AVCodecContext* m_codecContext;
AVCodec* m_codec;
AVFormatContext* m_formatContext;
AVStream* m_stream;
unsigned m_outWidth = 768;
unsigned m_outHeight = 608;
av_register_all();
avcodec_register_all();
avformat_network_init();
int errorStatus = 0;
char errorLog[128] = { 0 };
av_log_set_level(AV_LOG_TRACE);
string m_output("udp://127.0.0.1:8554/live.sdp");
if (avformat_alloc_output_context2(&m_formatContext, NULL, "h264", m_output.c_str()) < 0) {
cerr << "Cannot allocate output context: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
AVOutputFormat *m_outputFormat = m_formatContext->oformat;
m_codec = avcodec_find_encoder(AV_CODEC_ID_H264);
if (!m_codec) {
cerr << "Cannot find an encoder: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
m_codecContext = avcodec_alloc_context3(m_codec);
if (!m_codecContext) {
cerr << "Cannot allocate a codec context: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
m_codecContext->pix_fmt = AV_PIX_FMT_YUV420P;
m_codecContext->width = m_outWidth;
m_codecContext->height = m_outHeight;
if (avcodec_open2(m_codecContext, m_codec, NULL) < 0) {
cerr << "Cannot open codec: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
m_stream = avformat_new_stream(m_formatContext, m_codec);
if (!m_stream) {
cerr << "Cannot create a new stream: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
av_dump_format(m_formatContext, 0, m_output.c_str(), 1);
if ((errorStatus = avio_open(&m_formatContext->pb, m_output.c_str(), AVIO_FLAG_WRITE)) < 0) {
cerr << "Cannot open output: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
if (avformat_write_header(m_formatContext, NULL) < 0) {
cerr << "Cannot write header to stream: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
cout << "All done." << endl;
return 0;
}
For those who have even more time to spare on my problem: when I change m_output to rtsp://127.0.0.1:8554/live.sdp, and the ffplay command to
ffplay -rtsp_flags listen -i rtsp://127.0.0.1:8554/live.sdp
I get the error:
[libx264 @ 0x1e056c0] using mv_range_thread = 24
[libx264 @ 0x1e056c0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.1 Cache64
[libx264 @ 0x1e056c0] profile High, level 3.1
Output #0, h264, to 'rtsp://127.0.0.1:8554/live.sdp':
Stream #0:0, 0, 0/0: Video: h264 (libx264), -1 reference frame, none, q=-1--1
Cannot open output: Protocol not found
Am I naive to expect that the streaming protocol can be changed like this?
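One possible reading of the "dimensions not set" failure (an assumption, not a verified diagnosis): the width and height are set on m_codecContext, but they are never copied into the parameters of the stream created by avformat_new_stream(), so the muxer sees no dimensions when avformat_write_header() runs. For comparison, an equivalent stream can be sketched with the ffmpeg CLI (assuming ffmpeg is installed; input.mp4 is a placeholder file). Raw H.264 over UDP carries no container timing, so wrapping it in MPEG-TS is the usual choice:

```shell
# Sketch: stream H.264 in an MPEG-TS container over UDP (placeholder input).
ffmpeg -re -i input.mp4 -c:v libx264 -an -f mpegts udp://127.0.0.1:8554
# Receiver side:
# ffplay -i udp://127.0.0.1:8554
```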
-
Puzzled with file descriptor in Bash (ffmpeg video capture)
3 May 2020, by ChrisAga — I am trying to use file descriptors in Bash and have found a problem I cannot solve.
I have to read a video stream coming from the standard output of a command executed in a coproc. This piece of code works as expected:


ffmpeg \
 -i <(exec cat <&${COPROC[0]}) \
 -c:v $ENCODE_VIDEO_FORMAT_LOSSLESS $ENCODE_VIDEO_OPTIONS_LOSSLESS \
 -c:a copy \
 -progress /dev/fd/1 \
 "${capfile}"




But the cat process is not really useful, since ffmpeg -i pipe:<file descriptor> seems to do the same thing. So I tried the following code, which fails with a pipe:63: Bad file descriptor error.

ffmpeg \
 -i pipe:"${COPROC[0]}" \
 -c:v $ENCODE_VIDEO_FORMAT_LOSSLESS $ENCODE_VIDEO_OPTIONS_LOSSLESS \
 -c:a copy \
 -progress /dev/fd/1 \
 "${capfile}"




The actual script is a bit more complicated, but here is minimal test code for this issue:



#!/bin/bash
#

ENCODE_VIDEO_FORMAT_LOSSLESS=ffv1
ENCODE_VIDEO_OPTIONS_LOSSLESS="-level 3 -threads 7 -coder 1 -context 1 -g 1 -slices 30 -slicecrc 1"

capfile=capture.mkv

coproc ffmpeg -i file:'Camomille.mkv' -c:v copy -c:a copy -f matroska pipe:1

capture_fd=${COPROC[0]}
echo "hk_capture_pid=${COPROC_PID}"

ffmpeg \
 -i pipe:${COPROC[0]} \
 -c:v $ENCODE_VIDEO_FORMAT_LOSSLESS $ENCODE_VIDEO_OPTIONS_LOSSLESS \
 -c:a copy \
 -progress /dev/fd/1 \
 "${capfile}"




This is the output of the second ffmpeg command:


ffmpeg version 4.1.4-1build2 Copyright (c) 2000-2019 the FFmpeg developers
 built with gcc 9 (Ubuntu 9.2.1-4ubuntu1)
 configuration: --prefix=/usr --extra-version=1build2 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
 libavutil 56. 22.100 / 56. 22.100
 libavcodec 58. 35.100 / 58. 35.100
 libavformat 58. 20.100 / 58. 20.100
 libavdevice 58. 5.100 / 58. 5.100
 libavfilter 7. 40.101 / 7. 40.101
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 3.100 / 5. 3.100
 libswresample 3. 3.100 / 3. 3.100
 libpostproc 55. 3.100 / 55. 3.100
pipe:63: Bad file descriptor
av_interleaved_write_frame(): Broken pipe 
Error writing trailer of pipe:1: Broken pipe 
frame= 4 fps=0.0 q=-1.0 Lsize= 48kB time=00:00:00.03 bitrate=10051.1kbits/s speed=3.44x 
video:86kB audio:1kB subtitle:0kB other streams:0kB global headers:2kB muxing overhead: unknown
Conversion failed!




This one fails; if you replace -i pipe:${COPROC[0]} with -i <(exec cat <&${COPROC[0]}), a capture.mkv file is created.


I run Ubuntu eoan, and my bash version is GNU bash, version 5.0.3(1)-release (x86_64-pc-linux-gnu). I have upgraded several times since I first hit this issue, so it shouldn't be tied too closely to the bash and ffmpeg versions.


If someone can point me to what I do wrong with bash file descriptors I would be grateful.