
Media (1)
-
Bee video in portrait format
14 May 2011
Updated: February 2012
Language: French
Type: Video
Other articles (91)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)
-
Improvements to the base version
13 September 2013
Nicer multiple selection
The Chosen plugin improves the ergonomics of multiple-selection fields. See the two images below to compare.
To do so, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)
-
Customising by adding a logo, a banner or a background image
5 September 2013
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
On other sites (11054)
-
Why does ffmpeg report different durations?
18 May 2012, by kenitech
Source videos: http://www.artworknotavailable.com/tmp/ffmpegtest
Quicktime Pro 7.7.1 Inspector (Win 7) reports the following for the file 2398.mov:
4.19 MB
H.264
Movie FPS: 23.98
Data Rate: 2.35 mbits/sec
Duration: 14:97
ffmpeg reports the following (see full ffmpeg version info at the bottom of this post):
ffmpeg -i 2398.mov
Seems stream 1 codec frame rate differs from container frame rate: 47952.00 (47952/1) -> 23.98 (2997/125)
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '2398.mov':
Metadata:
major_brand : qt
minor_version : 537199360
compatible_brands : qt
Duration: 00:00:15.97, start: -0.963005, bitrate: 2210 kb/s
Stream #0.0(eng): Audio: aac, 48000 Hz, stereo, s16, 152 kb/s
Stream #0.1(eng): Video: h264, yuv420p, 848x480, 2060 kb/s, 23.98 fps, 23.98 tbr, 23976 tbn, 47952 tbc
One second longer than what Quicktime reports.
As an experiment I exported this file from Quicktime Pro using the following settings:
Frame Rate : Current
Key Frames : Every 24 frames
Frame Reordering On
Quality : High
Encoding Best
Data Rate : Automatic
Optimized for Download
Output file: qtime-export-2398.mov
Quicktime Inspector reports:
5.62 MB
H.264
Movie FPS : 23.98
Data Rate : 3.15 mbits/Sec
Duration: 14:97
ffmpeg now reports:
ffmpeg -i qtime-export-2398.mov
Seems stream 1 codec frame rate differs from container frame rate : 1200.00 (1200/1) -> 23.98 (24000/1001)
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'qtime-export-2398.mov' :
Metadata :
major_brand : qt
minor_version : 537199360
compatible_brands : qt
Duration : 00:00:14.96, start : 0.000000, bitrate : 3153 kb/s
Stream #0.0(eng) : Audio : pcm_s16le, 44100 Hz, 2 channels, s16, 1411 kb/s
Stream #0.1(eng): Video: h264, yuv420p, 678x384, 1738 kb/s, 23.98 fps, 23.98 tbr, 600 tbn, 1200 tbc
ffmpeg's reported duration went from 15.97 to 14.96 (I can live with the 0.1 difference).
Is this duration calculated from the bitrate?
I need to accurately report the duration of uploaded videos as well as convert them to FLV. Can somebody tell me what is going on here and how I might get around this?
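For reference, on installs that ship ffprobe the container-level duration can also be queried directly instead of parsing ffmpeg's banner; a minimal sketch, assuming a reasonably recent ffprobe:

ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 2398.mov

This should print the format-level duration in seconds, which is the same value ffmpeg shows on its Duration line.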
The ffmpeg version info is below. I've tried this on two completely different installs/versions of ffmpeg, with the same result.
FFmpeg version 0.6.5, Copyright (c) 2000-2010 the FFmpeg developers
built on Jan 29 2012 23:55:02 with gcc 4.1.2 20080704 (Red Hat 4.1.2-51)
configuration : --prefix=/usr --libdir=/usr/lib64 --shlibdir=/usr/lib64 --mandir=/usr/share/man --incdir=/usr/include --disable-avisynth --extra-cflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -fPIC' --enable-avfilter --enable-avfilter-lavf --enable-libdirac --enable-libfaac --enable-libfaad --enable-libfaadbin --enable-libgsm --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libx264 --enable-gpl --enable-nonfree --enable-postproc --enable-pthreads --enable-shared --enable-swscale --enable-vdpau --enable-version3 --enable-x11grab
libavutil 50.15. 1 / 50.15. 1
libavcodec 52.72. 2 / 52.72. 2
libavformat 52.64. 2 / 52.64. 2
libavdevice 52. 2. 0 / 52. 2. 0
libavfilter 1.19. 0 / 1.19. 0
libswscale 0.11. 0 / 0.11. 0
libpostproc 51. 2. 0 / 51. 2. 0
-
ffmpeg version 2.6.8: Stream specifier ':a' in filtergraph description matches no streams
16 December 2020, by Ashitaka
I am not getting why this isn't working. I have tried to address the video streams with [0:v]/[0:1]/[0:v:0] and the audio streams with [0:a]/[0:0]/[0:0:0]; nothing worked.


Explaining the inputs:


1. The 1st input is a video that can be of varying resolution, on which the filter adds padding to make it 600x480 (a simpler equivalent expression is sketched just after this list).


2. The 2nd input is an overlay PNG which is already at a 5:4 ratio; it is just scaled to 600x480 before it gets overlaid in the filter.


3. The 3rd and 4th inputs are also videos; I don't care if they get stretched, and they are being stretched to 600x480.


4. So finally there are 3 streams (1 overlaid video and 2 stretched videos) which need to be concatenated.
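For reference, the long gte/lte expression in the filtergraph below implements exactly that fit-and-pad to 600x480; on ffmpeg builds whose scale filter supports force_original_aspect_ratio, the same idea can be written much more compactly (a sketch against a hypothetical input.mp4, not a drop-in replacement for the full command):

ffmpeg -i input.mp4 -vf "scale=600:480:force_original_aspect_ratio=decrease,pad=600:480:(ow-iw)/2:(oh-ih)/2" padded.mp4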


Here's the command:


ffmpeg 
-i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612.mp4' 
-i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_overlay.png' 
-i '/home/vidinflux/public_html/assets/user/736/video/Lines1.mp4' 
-i '/home/vidinflux/public_html/assets/user/736/video/Lines11.mp4' 
-filter_complex
"[0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a]" 
-map "[v]" 
-map "[a]" 
-c:v libx264 
-shortest /home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_final.mp4



This is the complete error I am getting:


Stream specifier ':a' in filtergraph description [0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a] matches no streams.
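Since the message complains that the ':a' specifiers match nothing, one quick sanity check is whether each input actually exposes an audio stream; a hedged sketch using ffprobe (same path as in the command above) that lists only audio streams, if any:

ffprobe -v error -select_streams a -show_entries stream=index,codec_name -of csv=p=0 /home/vidinflux/public_html/assets/user/736/video/Lines1.mp4

If this prints nothing for Lines1.mp4 or Lines11.mp4, then [2:a] and [3:a] have nothing to bind to.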



There are also these warnings:


[Parsed_setsar_9 @ 0x219fba0] num:den syntax is deprecated, please use num/den or named options instead
[Parsed_setsar_11 @ 0x21a4840] num:den syntax is deprecated, please use num/den or named options instead



Complete log, as requested:


[root@cloud ~]# ffmpeg -i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612.mp4' -i '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_overlay.png' -i '/home/vidinflux/public_html/assets/user/736/video/Lines1.mp4' -i '/home/vidinflux/public_html/assets/user/736/video/Lines11.mp4' -filter_complex \ "[0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a]" -map "[v]" -map "[a]" -c:v libx264 -shortest /home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_final.mp4
ffmpeg version 2.6.8 Copyright (c) 2000-2016 the FFmpeg developers
 built with gcc 4.4.7 (GCC) 20120313 (Red Hat 4.4.7-16)
 configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic' --enable-bzlib --disable-crystalhd --enable-gnutls --enable-ladspa --enable-libass --enable-libdc1394 --enable-libfaac --enable-nonfree --disable-indev=jack --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-openal --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libx264 --enable-libx265 --enable-libxvid --enable-x11grab --enable-avfilter --enable-avresample --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-runtime-cpudetect
 libavutil 54. 20.100 / 54. 20.100
 libavcodec 56. 26.100 / 56. 26.100
 libavformat 56. 25.101 / 56. 25.101
 libavdevice 56. 4.100 / 56. 4.100
 libavfilter 5. 11.102 / 5. 11.102
 libavresample 2. 1. 0 / 2. 1. 0
 libswscale 3. 1.101 / 3. 1.101
 libswresample 1. 1.100 / 1. 1.100
 libpostproc 53. 3.100 / 53. 3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: isommp42
 creation_time : 2017-08-21 02:23:24
 Duration: 00:02:17.23, start: 0.000000, bitrate: 417 kb/s
 Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p(tv, bt709), 640x360 [SAR 1:1 DAR 16:9], 318 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
 Metadata:
 handler_name : VideoHandler
 Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 96 kb/s (default)
 Metadata:
 creation_time : 2017-08-21 02:23:24
 handler_name : IsoMedia File Produced by Google, 5-11-2011
Input #1, png_pipe, from '/home/vidinflux/public_html/assets/temp/2018020116464612/2018020116464612_overlay.png':
 Duration: N/A, bitrate: N/A
 Stream #1:0: Video: png, rgba, 600x479, 25 tbr, 25 tbn, 25 tbc
Input #2, mov,mp4,m4a,3gp,3g2,mj2, from '/home/vidinflux/public_html/assets/user/736/video/Lines1.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 1
 compatible_brands: mp41mp42isom
 creation_time : 2018-01-31 22:40:09
 Duration: 00:00:04.90, start: 0.103811, bitrate: 846 kb/s
 Stream #2:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt470bg/bt709), 1920x1080, 827 kb/s, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 90k tbn, 180k tbc (default)
 Metadata:
 creation_time : 2018-01-31 22:40:10
 handler_name : Core Media Video
Input #3, mov,mp4,m4a,3gp,3g2,mj2, from '/home/vidinflux/public_html/assets/user/736/video/Lines11.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 1
 compatible_brands: mp41mp42isom
 creation_time : 2018-01-31 22:40:09
 Duration: 00:00:04.90, start: 0.103811, bitrate: 846 kb/s
 Stream #3:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt470bg/bt709), 1920x1080, 827 kb/s, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 90k tbn, 180k tbc (default)
 Metadata:
 creation_time : 2018-01-31 22:40:10
 handler_name : Core Media Video
[Parsed_setsar_9 @ 0x17c8ba0] num:den syntax is deprecated, please use num/den or named options instead
[Parsed_setsar_11 @ 0x17cd840] num:den syntax is deprecated, please use num/den or named options instead
Stream specifier ':a' in filtergraph description [0:v]trim=0:138,setpts=PTS-STARTPTS[v0];[0:a]atrim=0:138,asetpts=PTS-STARTPTS[a0];[v0]scale='gte(iw/ih\,600/480)*600+lt(iw/ih\,600/480)*((480*iw)/ih):lte(iw/ih\,600/480)*480+gt(iw/ih\,600/480)*((600*ih)/iw)',pad='600:480:(600-gte(iw/ih\,600/480)*600-lt(iw/ih\,600/480)*((480*iw)/ih))/2:(480-lte(iw/ih\,600/480)*480-gt(iw/ih\,600/480)*((600*ih)/iw))/2:black'[x];[1:v]scale=600:480[y];[x][y]overlay=0:0[z];[2:v]scale=600:480,setsar=1:1[x0];[3:v]scale=600:480,setsar=1:1[x1];[x0][2:a][z][a0][x1][3:a]concat=n=3:v=1:a=1[v][a] matches no streams.



-
c++ - using FFmpeg encode and UDP with a Webcam
14 March, by Rendres
I'm trying to get frames from a webcam using OpenCV, encode them with FFmpeg, and send them over UDP.



I did a similar project before that, instead of sending the packets over UDP, saved them in a video file.



My code is:



#include <iostream> // std::cout, std::endl

extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/opt.h>
#include <libavutil/imgutils.h>
#include <libavutil/mathematics.h>
#include <libswscale/swscale.h>
#include <libswresample/swresample.h>
}

#include <opencv2/opencv.hpp>

using namespace std;
using namespace cv;

#define WIDTH 640
#define HEIGHT 480
#define CODEC_ID AV_CODEC_ID_H264
#define STREAM_PIX_FMT AV_PIX_FMT_YUV420P

static AVFrame *frame, *pFrameBGR;

int main(int argc, char **argv)
{
VideoCapture cap(0);
const char *url = "udp://127.0.0.1:8080";

AVFormatContext *formatContext;
AVStream *stream;
AVCodec *codec;
AVCodecContext *c;
AVDictionary *opts = NULL;

int ret, got_packet;

if (!cap.isOpened())
{
 return -1;
}

av_log_set_level(AV_LOG_TRACE);

av_register_all();
avformat_network_init();

avformat_alloc_output_context2(&formatContext, NULL, "h264", url);
if (!formatContext)
{
 av_log(NULL, AV_LOG_FATAL, "Could not allocate an output context for '%s'.\n", url);
}

codec = avcodec_find_encoder(CODEC_ID);
if (!codec)
{
 av_log(NULL, AV_LOG_ERROR, "Could not find encoder.\n");
}

stream = avformat_new_stream(formatContext, codec);

c = avcodec_alloc_context3(codec);

stream->id = formatContext->nb_streams - 1;
stream->time_base = (AVRational){1, 25};

c->codec_id = CODEC_ID;
c->bit_rate = 400000;
c->width = WIDTH;
c->height = HEIGHT;
c->time_base = stream->time_base;
c->gop_size = 12;
c->pix_fmt = STREAM_PIX_FMT;

if (formatContext->flags & AVFMT_GLOBALHEADER)
 c->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;

av_dict_set(&opts, "preset", "fast", 0);

av_dict_set(&opts, "tune", "zerolatency", 0);

ret = avcodec_open2(c, codec, NULL);
if (ret < 0)
{
 av_log(NULL, AV_LOG_ERROR, "Could not open video codec.\n");
}

pFrameBGR = av_frame_alloc();
if (!pFrameBGR)
{
 av_log(NULL, AV_LOG_ERROR, "Could not allocate video frame.\n");
}

frame = av_frame_alloc();
if (!frame)
{
 av_log(NULL, AV_LOG_ERROR, "Could not allocate video frame.\n");
}

frame->format = c->pix_fmt;
frame->width = c->width;
frame->height = c->height;

ret = avcodec_parameters_from_context(stream->codecpar, c);
if (ret < 0)
{
 av_log(NULL, AV_LOG_ERROR, "Could not open video codec.\n");
}

av_dump_format(formatContext, 0, url, 1);

ret = avformat_write_header(formatContext, NULL);
if (ret != 0)
{
 av_log(NULL, AV_LOG_ERROR, "Failed to connect to '%s'.\n", url);
}

Mat image(Size(HEIGHT, WIDTH), CV_8UC3);
SwsContext *swsctx = sws_getContext(WIDTH, HEIGHT, AV_PIX_FMT_BGR24, WIDTH, HEIGHT, AV_PIX_FMT_YUV420P, SWS_BILINEAR, NULL, NULL, NULL);
int frame_pts = 0;

while (1)
{
 cap >> image;

 int numBytesYUV = av_image_get_buffer_size(STREAM_PIX_FMT, WIDTH, HEIGHT, 1);
 uint8_t *bufferYUV = (uint8_t *)av_malloc(numBytesYUV * sizeof(uint8_t));

 avpicture_fill((AVPicture *)pFrameBGR, image.data, AV_PIX_FMT_BGR24, WIDTH, HEIGHT);
 avpicture_fill((AVPicture *)frame, bufferYUV, STREAM_PIX_FMT, WIDTH, HEIGHT);

 sws_scale(swsctx, (uint8_t const *const *)pFrameBGR->data, pFrameBGR->linesize, 0, HEIGHT, frame->data, frame->linesize);

 AVPacket pkt = {0};
 av_init_packet(&pkt);

 frame->pts = frame_pts;

 ret = avcodec_encode_video2(c, &pkt, frame, &got_packet);
 if (ret < 0)
 {
 av_log(NULL, AV_LOG_ERROR, "Error encoding frame\n");
 }

 if (got_packet)
 {
 pkt.pts = av_rescale_q_rnd(pkt.pts, c->time_base, stream->time_base, AVRounding(AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
 pkt.dts = av_rescale_q_rnd(pkt.dts, c->time_base, stream->time_base, AVRounding(AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
 pkt.duration = av_rescale_q(pkt.duration, c->time_base, stream->time_base);
 pkt.stream_index = stream->index;
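 // note: the return below exits main() after the first packet is written, so the cout line is never reached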

 return av_interleaved_write_frame(formatContext, &pkt);

 cout << "Seguro que si" << endl;
 }
 frame_pts++;
}

avcodec_free_context(&c);
av_frame_free(&frame);
avformat_free_context(formatContext);

return 0;
}




The code compiles, but it crashes with a segmentation fault in av_interleaved_write_frame(). I've tried several implementations and several codecs (in this case I'm using libopenh264, but mpeg2video gives the same segmentation fault). I also tried av_write_frame(), but it returns the same error.
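One thing the listing never does is open the output AVIOContext: for muxers that are not flagged AVFMT_NOFILE, avio_open() has to succeed before avformat_write_header(), otherwise formatContext->pb stays NULL. A sketch of what that check would look like, offered as a guess rather than a confirmed fix for this particular crash:

if (!(formatContext->oformat->flags & AVFMT_NOFILE))
{
 // open the UDP target so formatContext->pb is valid before the header is written
 ret = avio_open(&formatContext->pb, url, AVIO_FLAG_WRITE);
 if (ret < 0)
 {
  av_log(NULL, AV_LOG_ERROR, "Could not open output URL '%s'.\n", url);
  return -1;
 }
}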



As I said before, I only want to grab frames from a webcam connected via USB, encode them to H264 and send the packets over UDP to another PC.



My console log when I run the executable is:



[100%] Built target display
[OpenH264] this = 0x0x244b4f0, Info:CWelsH264SVCEncoder::SetOption():ENCODER_OPTION_TRACE_CALLBACK callback = 0x7f0c302a87c0.
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Info:CWelsH264SVCEncoder::InitEncoder(), openh264 codec version = 5a5c4f1
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Info:iUsageType = 0,iPicWidth= 640;iPicHeight= 480;iTargetBitrate= 400000;iMaxBitrate= 400000;iRCMode= 0;iPaddingFlag= 0;iTemporalLayerNum= 1;iSpatialLayerNum= 1;fFrameRate= 25.000000f;uiIntraPeriod= 12;eSpsPpsIdStrategy = 0;bPrefixNalAddingCtrl = 0;bSimulcastAVC=0;bEnableDenoise= 0;bEnableBackgroundDetection= 1;bEnableSceneChangeDetect = 1;bEnableAdaptiveQuant= 1;bEnableFrameSkip= 0;bEnableLongTermReference= 0;iLtrMarkPeriod= 30, bIsLosslessLink=0;iComplexityMode = 0;iNumRefFrame = 1;iEntropyCodingModeFlag = 0;uiMaxNalSize = 0;iLTRRefNum = 0;iMultipleThreadIdc = 1;iLoopFilterDisableIdc = 0 (offset(alpha/beta): 0,0;iComplexityMode = 0,iMaxQp = 51;iMinQp = 0)
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Info:sSpatialLayers[0]: .iVideoWidth= 640; .iVideoHeight= 480; .fFrameRate= 25.000000f; .iSpatialBitrate= 400000; .iMaxSpatialBitrate= 400000; .sSliceArgument.uiSliceMode= 1; .sSliceArgument.iSliceNum= 0; .sSliceArgument.uiSliceSizeConstraint= 1500;uiProfileIdc = 66;uiLevelIdc = 41
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Warning:SliceArgumentValidationFixedSliceMode(), unsupported setting with Resolution and uiSliceNum combination under RC on! So uiSliceNum is changed to 6!
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Info:Setting MaxSpatialBitrate (400000) the same at SpatialBitrate (400000) will make the actual bit rate lower than SpatialBitrate
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Warning:bEnableFrameSkip = 0,bitrate can't be controlled for RC_QUALITY_MODE,RC_BITRATE_MODE and RC_TIMESTAMP_MODE without enabling skip frame.
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Warning:Change QP Range from(0,51) to (12,42)
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Info:WELS CPU features/capacities (0x4007fe3f) detected: HTT: Y, MMX: Y, MMXEX: Y, SSE: Y, SSE2: Y, SSE3: Y, SSSE3: Y, SSE4.1: Y, SSE4.2: Y, AVX: Y, FMA: Y, X87-FPU: Y, 3DNOW: N, 3DNOWEX: N, ALTIVEC: N, CMOV: Y, MOVBE: Y, AES: Y, NUMBER OF LOGIC PROCESSORS ON CHIP: 8, CPU CACHE LINE SIZE (BYTES): 64
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Info:WelsInitEncoderExt() exit, overall memory usage: 4542878 bytes
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Info:WelsInitEncoderExt(), pCtx= 0x0x245a400.
Output #0, h264, to 'udp://192.168.100.39:8080':
Stream #0:0, 0, 1/25: Video: h264 (libopenh264), 1 reference frame, yuv420p, 640x480 (0x0), 0/1, q=2-31, 400 kb/s, 25 tbn
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Debug:RcUpdateIntraComplexity iFrameDqBits = 385808,iQStep= 2016,iIntraCmplx = 777788928
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Debug:[Rc]Layer 0: Frame timestamp = 0, Frame type = 2, encoding_qp = 30, average qp = 30, max qp = 33, min qp = 27, index = 0, iTid = 0, used = 385808, bitsperframe = 16000, target = 64000, remainingbits = -257808, skipbuffersize = 200000
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Debug:WelsEncoderEncodeExt() OutputInfo iLayerNum = 2,iFrameSize = 48252
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Debug:WelsEncoderEncodeExt() OutputInfo iLayerId = 0,iNalType = 0,iNalCount = 2, first Nal Length=18,uiSpatialId = 0,uiTemporalId = 0,iSubSeqId = 0
[libopenh264 @ 0x244aa00] [OpenH264] this = 0x0x244b4f0, Debug:WelsEncoderEncodeExt() OutputInfo iLayerId = 1,iNalType = 1,iNalCount = 6, first Nal Length=6057,uiSpatialId = 0,uiTemporalId = 0,iSubSeqId = 0
[libopenh264 @ 0x244aa00] 6 slices
./scriptBuild.sh: line 20: 10625 Segmentation fault (core dumped) ./display




As you can see, FFmpeg uses libopenh264 and configures it correctly. However, no matter what I do, it always ends with the same segmentation fault...



I've used commands like this:



ffmpeg -s 640x480 -f video4linux2 -i /dev/video0 -r 30 -vcodec libopenh264 -an -f h264 udp://127.0.0.1:8080




And it works perfectly, but I need to process the frames before sending them. That's why I'm trying to use the libraries directly.



My FFmpeg version is:



ffmpeg version 3.3.6 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
configuration: --disable-yasm --enable-shared --enable-libopenh264 --cc='gcc -fPIC'
libavutil 55. 58.100 / 55. 58.100
libavcodec 57. 89.100 / 57. 89.100
libavformat 57. 71.100 / 57. 71.100
libavdevice 57. 6.100 / 57. 6.100
libavfilter 6. 82.100 / 6. 82.100
libswscale 4. 6.100 / 4. 6.100
libswresample 2. 7.100 / 2. 7.100




I tried to get more information about the error using gdb, but it didn't give me any debugging info.
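For a usable backtrace, the executable generally has to be built with debug symbols and run under gdb; a minimal sketch, assuming the source file is display.cpp and that pkg-config knows the OpenCV and FFmpeg libraries (both names are placeholders for whatever the real build script uses):

g++ -g -O0 -o display display.cpp $(pkg-config --cflags --libs opencv libavformat libavcodec libavutil libswscale)
gdb ./display
(gdb) run
(gdb) bt full

The bt full at the crash point should show which pointer av_interleaved_write_frame() is dereferencing when it faults.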



How can I solve this problem? I don't know what else I can try...



Thank you!