
Other articles (38)
-
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to modify their own information on the authors page
-
MediaSPIP Player: potential problems
22 February 2011
The player does not work on Internet Explorer
On Internet Explorer (8 and 7 at least), the plugin uses the Flash player Flowplayer to play video and audio. If the player does not seem to work, the cause may be the configuration of Apache's mod_deflate module.
If the configuration of this Apache module contains a line resembling the following, try removing or commenting it out to see whether the player then works correctly: (...)
-
Keeping control of your media in your hands
13 April 2011
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)
On other sites (6377)
-
fileapi::WriteFile() doesn't send input if processthreadsapi::STARTUPINFO::hStdError is set (ffmpeg)
23 April 2021, by Lidekys
I'm trying to capture my screen using ffmpeg in a separate process (which I create using processthreadsapi::CreateProcess()) so that I can do something else in the main thread, and to redirect the ffmpeg output so it doesn't pop up in the console for the user to see. To stop recording, I send a 'q' input using WriteFile(), and afterwards I want to save the accumulated ffmpeg output using ReadFile().

However, if I set STARTUPINFO::hStdError (note that ffmpeg writes its output to stderr) to a pipe from which I could read the accumulated data, the inputs I send using WriteFile() are no longer registered and ffmpeg.exe keeps running.

I've tried redirecting the ffmpeg output in a plain command line, but there I can still stop the process by pressing the q key.

Also, if I record for less than 8 seconds, the input is registered and ffmpeg.exe closes.

Is there something wrong with my code, or is it a processthreadsapi issue? Any hints will be kindly appreciated!


Here's minimal code showing how I am trying to do it:



#include <iostream>

// The two headers below were stripped by the page formatting; windows.h and
// conio.h are what the code needs (CreateProcess, pipes, getch).
#include <windows.h>
#include <conio.h>

using namespace std;

HANDLE g_hChildStd_IN_Rd = NULL;
HANDLE g_hChildStd_IN_Wr = NULL;
HANDLE g_hChildStd_OUT_Rd = NULL;
HANDLE g_hChildStd_OUT_Wr = NULL;

int main()
{
 //Create IN and OUT pipes
 SECURITY_ATTRIBUTES saAttr;
 saAttr.nLength = sizeof(SECURITY_ATTRIBUTES);
 saAttr.bInheritHandle = TRUE;
 saAttr.lpSecurityDescriptor = NULL;

 if (! CreatePipe(&g_hChildStd_OUT_Rd, &g_hChildStd_OUT_Wr, &saAttr, 0) )
  cout<<"StdoutRd CreatePipe error"<<endl;
 if (! CreatePipe(&g_hChildStd_IN_Rd, &g_hChildStd_IN_Wr, &saAttr, 0) )
  cout<<"Stdin CreatePipe error"<<endl;

 // The STARTUPINFO/PROCESS_INFORMATION setup was lost in the page formatting;
 // it is reconstructed here as described in the question: hStdError points to
 // the OUT pipe, hStdInput to the IN pipe.
 STARTUPINFO siStartInfo;
 ZeroMemory(&siStartInfo, sizeof(STARTUPINFO));
 siStartInfo.cb = sizeof(STARTUPINFO);
 siStartInfo.hStdError = g_hChildStd_OUT_Wr;
 siStartInfo.hStdInput = g_hChildStd_IN_Rd;
 siStartInfo.dwFlags |= STARTF_USESTDHANDLES;

 PROCESS_INFORMATION piProcInfo;
 ZeroMemory(&piProcInfo, sizeof(PROCESS_INFORMATION));

 //Start recording
 char cmdLine[] = "ffmpeg -y -f gdigrab -framerate 2 -i desktop record.avi";
 if(!CreateProcess(NULL,
  cmdLine, // command line (CreateProcess needs a writable buffer)
  NULL, // process security attributes
  NULL, // primary thread security attributes
  TRUE, // handles are inherited
  0, // creation flags
  NULL, // use parent's environment
  NULL, // use parent's current directory
  &siStartInfo, // STARTUPINFO pointer
  &piProcInfo)) // receives PROCESS_INFORMATION
 {
  cout<<"Error create process"<<endl;
 }

 //Record for a while
 while(getch() != 'k'){
  cout<<"While press k"<<endl;
 }

 //Stop recording by emulating a Q button push
 DWORD dwWritten;
 CHAR chBufW[1] = {'q'};

 if ( ! WriteFile(g_hChildStd_IN_Wr, chBufW, 1, &dwWritten, NULL) )
  cout<<"Error write file"<<endl;

 //Save stdError (ffmpeg) data
 DWORD dwRead;
 char stdErrorData[4096];
 bool bSuccess;

 bSuccess = ReadFile( g_hChildStd_OUT_Rd, stdErrorData, 4096, &dwRead, NULL);

 if(!bSuccess || dwRead == 0)
  cout<<"Read failed"<<endl;

 return 0;
}
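
One explanation consistent with the "less than 8 seconds" observation (not confirmed in the post) is that the anonymous pipe attached to hStdError has a fixed-size buffer: once ffmpeg's console output fills it, ffmpeg blocks while writing to stderr and stops servicing its stdin, so the 'q' never takes effect. A minimal sketch of a non-blocking drain loop under that assumption, reusing the handle names from the code above (DrainChildStderr is a hypothetical helper, not part of the post):

#include <windows.h>
#include <string>

// Hypothetical helper (editor sketch): empty the child's stderr pipe without
// blocking, so its buffer can never fill up, and keep the output for later.
std::string DrainChildStderr(HANDLE hOutRead)
{
 std::string accumulated;
 char buf[4096];
 DWORD avail = 0, bytesRead = 0;

 // PeekNamedPipe reports how many bytes are waiting without blocking; it also
 // works on anonymous pipes created with CreatePipe().
 while (PeekNamedPipe(hOutRead, NULL, 0, NULL, &avail, NULL) && avail > 0)
 {
  DWORD toRead = avail < sizeof(buf) ? avail : (DWORD)sizeof(buf);
  if (!ReadFile(hOutRead, buf, toRead, &bytesRead, NULL) || bytesRead == 0)
   break;
  accumulated.append(buf, bytesRead);
 }
 return accumulated;
}

Calling something like DrainChildStderr(g_hChildStd_OUT_Rd) regularly from the waiting loop (for example by polling _kbhit() instead of blocking in getch()) would both collect the log and, if the full-pipe hypothesis is right, keep ffmpeg responsive to the 'q' sent over g_hChildStd_IN_Wr.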


-
Puzzled with file descriptors in Bash (ffmpeg video capture)
3 May 2020, by ChrisAga
I am trying to use file descriptors in Bash and have run into a problem I cannot solve. I have to read a video stream coming from the standard output of a command executed in a coproc. This piece of code works as expected:


ffmpeg \
 -i <(exec cat <&${COPROC[0]}) \
 -c:v $ENCODE_VIDEO_FORMAT_LOSSLESS $ENCODE_VIDEO_OPTIONS_LOSSLESS \
 -c:a copy \
 -progress /dev/fd/1 \
 "${capfile}"




But the cat process is not really useful, since ffmpeg -i pipe:<file descriptor> seems to do the same thing. So I tried the following code, which fails with a "pipe:63: Bad file descriptor" error.


ffmpeg \
 -i pipe:"${COPROC[0]}" \
 -c:v $ENCODE_VIDEO_FORMAT_LOSSLESS $ENCODE_VIDEO_OPTIONS_LOSSLESS \
 -c:a copy \
 -progress /dev/fd/1 \
 "${capfile}"




The actual script is a bit more complicated, but here is minimal test code for this issue:



#!/bin/bash
#

ENCODE_VIDEO_FORMAT_LOSSLESS=ffv1
ENCODE_VIDEO_OPTIONS_LOSSLESS="-level 3 -threads 7 -coder 1 -context 1 -g 1 -slices 30 -slicecrc 1"

capfile=capture.mkv

coproc ffmpeg -i file:'Camomille.mkv' -c:v copy -c:a copy -f matroska pipe:1

capture_fd=${COPROC[0]}
echo "hk_capture_pid=${COPROC_PID}"

ffmpeg \
 -i pipe:${COPROC[0]} \
 -c:v $ENCODE_VIDEO_FORMAT_LOSSLESS $ENCODE_VIDEO_OPTIONS_LOSSLESS \
 -c:a copy \
 -progress /dev/fd/1 \
 "${capfile}"




This is the output of the second ffmpeg command:


ffmpeg version 4.1.4-1build2 Copyright (c) 2000-2019 the FFmpeg developers
 built with gcc 9 (Ubuntu 9.2.1-4ubuntu1)
 configuration: --prefix=/usr --extra-version=1build2 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
 libavutil 56. 22.100 / 56. 22.100
 libavcodec 58. 35.100 / 58. 35.100
 libavformat 58. 20.100 / 58. 20.100
 libavdevice 58. 5.100 / 58. 5.100
 libavfilter 7. 40.101 / 7. 40.101
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 3.100 / 5. 3.100
 libswresample 3. 3.100 / 3. 3.100
 libpostproc 55. 3.100 / 55. 3.100
pipe:63: Bad file descriptor
av_interleaved_write_frame(): Broken pipe 
Error writing trailer of pipe:1: Broken pipe 
frame= 4 fps=0.0 q=-1.0 Lsize= 48kB time=00:00:00.03 bitrate=10051.1kbits/s speed=3.44x 
video:86kB audio:1kB subtitle:0kB other streams:0kB global headers:2kB muxing overhead: unknown
Conversion failed!




This one fails, and if you replace -i pipe:${COPROC[0]} with -i <(exec cat <&${COPROC[0]}), a capture.mkv file is created.


I run Ubuntu eoan, and the bash version is GNU bash, version 5.0.3(1)-release (x86_64-pc-linux-gnu). I have upgraded several times since I started looking into this issue, so it should not depend too much on the bash and ffmpeg versions.


If someone can point me to what I am doing wrong with bash file descriptors, I would be grateful.
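
For what it's worth, ffmpeg's pipe: protocol does not open any file: pipe:N simply makes ffmpeg read file descriptor N of its own process, so that descriptor has to still be open after bash exec()s ffmpeg. A likely explanation for the error is that bash keeps its coproc descriptors for the shell itself (the manual notes they are not available in subshells), so an external command such as ffmpeg does not inherit fd 63, whereas the <(exec cat <&${COPROC[0]}) form works because the explicit redirection gives cat its own copy of the descriptor. A small, self-contained POSIX sketch of the same mechanism (an editor's illustration with a hypothetical close_on_exec toggle, not taken from the post):

#include <cstdio>
#include <cstring>
#include <fcntl.h>
#include <sys/wait.h>
#include <unistd.h>

int main()
{
 int fds[2];
 if (pipe(fds) != 0) { perror("pipe"); return 1; }

 // Toggle this to mimic a descriptor that is not handed to executed commands.
 bool close_on_exec = true;
 if (close_on_exec)
  fcntl(fds[0], F_SETFD, FD_CLOEXEC);

 pid_t pid = fork();
 if (pid == 0) {
  // Child: read the descriptor by number, the way ffmpeg's pipe:N does.
  char cmd[64];
  snprintf(cmd, sizeof(cmd), "cat <&%d", fds[0]);
  execlp("sh", "sh", "-c", cmd, (char*)NULL);
  perror("execlp"); // only reached if exec fails
  _exit(127);
 }

 close(fds[0]);
 const char msg[] = "hello through the pipe\n";
 write(fds[1], msg, strlen(msg));
 close(fds[1]);
 waitpid(pid, NULL, 0);
 return 0;
}

With close_on_exec left at true, the shell in the child complains about a bad file descriptor, much like the pipe:63 error above; set to false, the message comes out of cat. If that is indeed the cause here, a commonly suggested workaround is to duplicate the coproc descriptor onto an ordinary one first (for example exec {capture_fd}<&"${COPROC[0]}" and then -i pipe:$capture_fd), but that is an assumption rather than something tested for this post.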


-
Streaming H.264 over UDP using FFmpeg, and "dimensions not set" error
3 September 2015, by Baris Demiray
I'm trying to stream H.264 over UDP with no luck so far. Here is minimal code with which you can reproduce the problem.
To compile:
g++ -o test -lavcodec -lavformat -lavutil test.cpp
Extra information: I start ffplay as follows (currently it's of no use):
ffplay -i udp://127.0.0.1:8554/live.sdp
Output of my code (see the avio_open() call):
[libx264 @ 0x6a26c0] using mv_range_thread = 24
[libx264 @ 0x6a26c0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.1 Cache64
[libx264 @ 0x6a26c0] profile High, level 3.1
Output #0, h264, to 'udp://127.0.0.1:8554/live.sdp':
Stream #0:0, 0, 0/0: Video: h264 (libx264), -1 reference frame, none, q=-1--1
[h264 @ 0x6a2020] dimensions not set
Cannot write header to stream: Success
And the code:
extern "C" {
#include <libavcodec></libavcodec>avcodec.h>
#include <libavformat></libavformat>avformat.h>
#include <libavutil></libavutil>avutil.h>
}
#include <iostream>
using namespace std;
int main() {
AVCodecContext* m_codecContext;
AVCodec* m_codec;
AVFormatContext* m_formatContext;
AVStream* m_stream;
unsigned m_outWidth = 768;
unsigned m_outHeight = 608;
av_register_all();
avcodec_register_all();
avformat_network_init();
int errorStatus = 0;
char errorLog[128] = { 0 };
av_log_set_level(AV_LOG_TRACE);
string m_output("udp://127.0.0.1:8554/live.sdp");
if (avformat_alloc_output_context2(&m_formatContext, NULL, "h264", m_output.c_str()) < 0) {
cerr << "Cannot allocate output context: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
AVOutputFormat *m_outputFormat = m_formatContext->oformat;
m_codec = avcodec_find_encoder(AV_CODEC_ID_H264);
if (!m_codec) {
cerr << "Cannot find an encoder: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
m_codecContext = avcodec_alloc_context3(m_codec);
if (!m_codecContext) {
cerr << "Cannot allocate a codec context: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
m_codecContext->pix_fmt = AV_PIX_FMT_YUV420P;
m_codecContext->width = m_outWidth;
m_codecContext->height = m_outHeight;
if (avcodec_open2(m_codecContext, m_codec, NULL) < 0) {
cerr << "Cannot open codec: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
m_stream = avformat_new_stream(m_formatContext, m_codec);
if (!m_stream) {
cerr << "Cannot create a new stream: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
av_dump_format(m_formatContext, 0, m_output.c_str(), 1);
if ((errorStatus = avio_open(&m_formatContext->pb, m_output.c_str(), AVIO_FLAG_WRITE)) < 0) {
cerr << "Cannot open output: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
if (avformat_write_header(m_formatContext, NULL) < 0) {
cerr << "Cannot write header to stream: "
<< av_make_error_string(errorLog, 128, errorStatus) << endl;
return -1;
}
cout << "All done." << endl;
return 0;
}
For those who have even more time to spare on my problem: when I change m_output to rtsp://127.0.0.1:8554/live.sdp, and the ffplay command to ffplay -rtsp_flags listen -i rtsp://127.0.0.1:8554/live.sdp, I get the error:
[libx264 @ 0x1e056c0] using mv_range_thread = 24
[libx264 @ 0x1e056c0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.1 Cache64
[libx264 @ 0x1e056c0] profile High, level 3.1
Output #0, h264, to 'rtsp://127.0.0.1:8554/live.sdp':
Stream #0:0, 0, 0/0: Video: h264 (libx264), -1 reference frame, none, q=-1--1
Cannot open output: Protocol not found
Am I naive to expect that the streaming protocol can be changed like this?
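
For reference, here is a sketch of how these two errors are usually dealt with in the libavformat muxing workflow. It is an editor's illustration, not the poster's confirmed solution, and it assumes an FFmpeg 3.1 to 4.x era build (the AVCodecParameters API); variable names follow the post. The "dimensions not set" message comes from the muxer looking at the stream's own codec parameters, which the code above never fills in: width and height are set only on the separate encoder context, so the stream still advertises 0x0. Copying the encoder settings into the stream with avcodec_parameters_from_context() addresses that. The RTSP attempt most likely fails in avio_open() because rtsp is implemented as a muxer that opens its own network connection (its output format sets AVFMT_NOFILE), not as an avio protocol, so avio_open() is normally called only when that flag is absent:

extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/avutil.h>
}
#include <iostream>
using namespace std;

int main() {
    av_register_all();          // still needed on pre-4.0 builds
    avformat_network_init();

    // For the raw-H.264-over-UDP case; with an rtsp:// URL one would pass NULL
    // as the format name and let the URL select the rtsp muxer.
    const char* url = "udp://127.0.0.1:8554/live.sdp";

    AVFormatContext* m_formatContext = NULL;
    if (avformat_alloc_output_context2(&m_formatContext, NULL, "h264", url) < 0)
        return -1;

    AVCodec* m_codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVCodecContext* m_codecContext = avcodec_alloc_context3(m_codec);
    m_codecContext->pix_fmt = AV_PIX_FMT_YUV420P;
    m_codecContext->width = 768;
    m_codecContext->height = 608;
    m_codecContext->time_base = AVRational{1, 25};
    if (avcodec_open2(m_codecContext, m_codec, NULL) < 0)
        return -1;

    AVStream* m_stream = avformat_new_stream(m_formatContext, NULL);
    if (!m_stream)
        return -1;

    // The step the code in the post is missing: publish the encoder's
    // parameters (including width/height) on the stream, otherwise the muxer
    // reports "dimensions not set".
    if (avcodec_parameters_from_context(m_stream->codecpar, m_codecContext) < 0)
        return -1;
    m_stream->time_base = m_codecContext->time_base;

    av_dump_format(m_formatContext, 0, url, 1);

    // Formats flagged AVFMT_NOFILE (rtsp among them) manage their own I/O in
    // avformat_write_header(); only open an AVIO context for the others.
    if (!(m_formatContext->oformat->flags & AVFMT_NOFILE)) {
        if (avio_open(&m_formatContext->pb, url, AVIO_FLAG_WRITE) < 0)
            return -1;
    }

    if (avformat_write_header(m_formatContext, NULL) < 0)
        return -1;

    cout << "Header written." << endl;
    return 0;
}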