
Other articles (45)
-
Configuring language support
15 November 2010 — Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administer" section of the site.
From there, in the navigation menu, you can reach a "Language management" section that lets you enable support for new languages.
Each newly added language can still be deactivated as long as no object has been created in that language. Otherwise, it becomes greyed out in the configuration and (...) -
Emballe médias: what is it for?
4 February 2011 — This plugin is intended to manage sites for publishing documents of all types.
It creates "media" items; that is, a "media" item is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a given "media" article; -
Supporting all media types
13 April 2011 — Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (HTML, CSS), LaTeX, Google Earth and (...)
On other sites (4967)
-
Faster Real-Time AR Drone Video Streaming
5 September 2017, by mike — I tried
ffplay tcp://192.168.1.1:5555
to stream video from the AR Drone 2.0; however, the delay is far too high. My second attempt was the following:
var arDrone = require('ar-drone');
var http = require('http');
console.log('Connecting png stream ...');
var pngStream = arDrone.createClient().getPngStream();
var lastPng;
pngStream
.on('error', console.log)
.on('data', function(pngBuffer) {
lastPng = pngBuffer;
});
var server = http.createServer(function(req, res) {
if (!lastPng) {
res.writeHead(503);
res.end('Did not receive any png data yet.');
return;
}
res.writeHead(200, {'Content-Type': 'image/png'});
res.end(lastPng);
});
server.listen(8080, function() {
console.log('Serving latest png on port 8080 ...');
});

This only streamed images; I had to refresh the browser every second.
My third attempt used the following approach:
var arDrone = require('ar-drone');
var client = arDrone.createClient();
require('ar-drone-png-stream')(client, { port: 8000 });

It streamed a lot of images in a short amount of time, but the delay is still significant and I am looking for actual video, not still images.
Are there other approaches that will significantly lower the delay of the video stream?
-
How to give real timestamp information to encoded frames inside an MPEG-1 container
24 August 2021, by jackey balwani — I referred to the following link for my implementation: https://ffmpeg.org/doxygen/trunk/muxing_8c_source.html
I am converting raw RGB data to YUV420 format through the scaling and conversion APIs available in FFmpeg and then passing the frames to the MPEG-1 encoder.
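For reference, the conversion step is along these lines (a simplified sketch, not my exact code; the function name and the assumption of a packed RGB24 input buffer are illustrative):

#include <libswscale/swscale.h>
#include <libavutil/frame.h>

/* Illustrative only: convert one packed RGB24 buffer into a freshly
 * allocated YUV420P AVFrame. Error checks omitted for brevity. */
static AVFrame *rgb_to_yuv420(const uint8_t *src_rgb, int width, int height)
{
    struct SwsContext *sws = sws_getContext(width, height, AV_PIX_FMT_RGB24,
                                            width, height, AV_PIX_FMT_YUV420P,
                                            SWS_BILINEAR, NULL, NULL, NULL);
    AVFrame *dst = av_frame_alloc();
    dst->format = AV_PIX_FMT_YUV420P;
    dst->width  = width;
    dst->height = height;
    av_frame_get_buffer(dst, 0);

    const uint8_t *const src_data[4] = { src_rgb, NULL, NULL, NULL };
    const int src_linesize[4]        = { 3 * width, 0, 0, 0 };
    sws_scale(sws, src_data, src_linesize, 0, height,
              dst->data, dst->linesize);

    sws_freeContext(sws);
    return dst; /* the caller still has to set dst->pts before encoding */
}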
I observe that the encoded video plays too fast. Below is the code that encodes a frame and then writes it to the output file.


static int write_frame(AVFormatContext *fmt_ctx, AVCodecContext *c,
                       AVStream *st, AVFrame *frame)
{
    int ret;

    // send the frame to the encoder
    ret = avcodec_send_frame(c, frame);
    if (ret < 0) {
        fprintf(stderr, "Error sending a frame to the encoder: %s\n",
                av_err2str(ret));
        exit(1);
    }

    while (ret >= 0) {
        AVPacket pkt = { 0 };

        ret = avcodec_receive_packet(c, &pkt);
        if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
            break;
        else if (ret < 0) {
            fprintf(stderr, "Error encoding a frame: %s\n", av_err2str(ret));
            exit(1);
        }

        /* rescale output packet timestamp values from codec to stream timebase */
        av_packet_rescale_ts(&pkt, c->time_base, st->time_base);
        pkt.stream_index = st->index;

        /* Write the compressed frame to the media file. */
        log_packet(fmt_ctx, &pkt);
        ret = av_interleaved_write_frame(fmt_ctx, &pkt);
        av_packet_unref(&pkt);
        if (ret < 0) {
            fprintf(stderr, "Error while writing output packet: %s\n", av_err2str(ret));
            exit(1);
        }
    }

    return ret == AVERROR_EOF ? 1 : 0;
}



The resulting MPEG video's total playback time comes out far too short, so the video plays too fast.
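For comparison, the muxing example I referred to keeps the codec time base at 1/frame_rate and derives each frame's PTS from a simple counter, roughly like this (a paraphrased sketch, not copied verbatim from that file):

#include <libavutil/frame.h>

/* Paraphrased sketch of the fixed-rate PTS scheme in that example: with
 * c->time_base = 1/STREAM_FRAME_RATE, every submitted frame simply gets the
 * next tick, so N frames always span N/frame_rate seconds once
 * av_packet_rescale_ts() converts the packets to the stream time base. */
static int64_t next_pts = 0;                /* counted in c->time_base units */

static void set_fixed_rate_pts(AVFrame *frame)
{
    frame->pts = next_pts++;                /* one tick = one frame interval */
}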


To match the output video duration with the input video coming from the source, I am trying to pass the following real-time information to the AVFrame structure before calling avcodec_send_frame():


- A real-time PTS value (current time + duration, obtained through av_gettime() in microseconds) assigned to the AVFrame structure before calling avcodec_send_frame().
- Populating pkt_duration with the time difference between frames (current_PTS - previous_PTS).
- Removing the call to av_packet_rescale_ts(&pkt, c->time_base, st->time_base) that was used after avcodec_receive_packet().

Below is the modified code, with the changes made for the real-time info marked with comments:


static int64_t m_currVideopts = 0;
static int64_t m_prevVideopts = 0;

static int write_frame(AVFormatContext *fmt_ctx, AVCodecContext *c, AVStream *st, AVFrame *frame)
{
    int ret;

    /* --- start of changes for real-time timestamps --- */
    int64_t pts = av_gettime();                    // returns current time in microseconds
    int64_t duration = 90000 / STREAM_FRAME_RATE;  /* duration for the first frame taken as a
                                                      default of 3600, since there is no
                                                      reference frame to compute a diff against */
    pts = av_rescale_q(pts, (AVRational){1, 1000000}, st->time_base); // st->time_base is the stream time base (1:90000)
    if ((m_prevVideopts > 0LL) && (pts > m_prevVideopts))
    {
        duration = pts - m_prevVideopts;
    }
    else if (pts < m_prevVideopts)
    {
        pts = m_prevVideopts + duration;
    }
    m_prevVideopts = pts;
    /* Going with the approach of pts value is equal to pts value for every packet */
    frame->pts = m_currVideopts; /* AV_NOPTS_VALUE; */
    m_currVideopts += duration;
    //pFfmpegVidCtx->next_pts = m_currVideopts;
    frame->pkt_duration = duration;
    /* --- end of changes --- */

    // send the frame to the encoder
    ret = avcodec_send_frame(c, frame);
    if (ret < 0) {
        fprintf(stderr, "Error sending a frame to the encoder: %s\n",
                av_err2str(ret));
        exit(1);
    }
    ....
    // receive the packet with avcodec_receive_packet()
    ...

    // removed or commented out: av_packet_rescale_ts
    /* av_packet_rescale_ts(&pkt, c->time_base, st->time_base); */
    ret = av_interleaved_write_frame(fmt_ctx, &pkt);

}


With the above changes, the video does not play properly.
There are a couple of issues with the total duration (sometimes it is not shown correctly by the player), and some frames get dropped or lost when playing in VLC or Media Player.

I am unable to find the cause of this frame loss during playback. Is this the correct way of passing real-time PTS information to the encoder, or is there a mistake in the above code?
Any suggestion would help me proceed further. Thanks.
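One alternative I am considering (an untested sketch; variable and function names are illustrative) is to keep the av_packet_rescale_ts() call and instead express the wall-clock PTS in the codec time base, so it still gets converted to the stream time base at write time:

#include <libavcodec/avcodec.h>
#include <libavutil/time.h>
#include <libavutil/mathematics.h>

/* Sketch only: derive the frame PTS from elapsed wall-clock time, expressed
 * in the codec time base, so the unchanged
 * av_packet_rescale_ts(&pkt, c->time_base, st->time_base) call converts it
 * to the stream time base when the packet is written. */
static int64_t start_us = -1;               /* wall-clock time of the first frame */

static void set_realtime_pts(AVCodecContext *c, AVFrame *frame)
{
    int64_t now_us = av_gettime();          /* current time in microseconds */
    if (start_us < 0)
        start_us = now_us;

    /* elapsed wall-clock time since the first frame, rescaled to codec ticks */
    frame->pts = av_rescale_q(now_us - start_us,
                              (AVRational){1, 1000000}, c->time_base);
}

With that, write_frame itself could stay exactly as in the reference example; the remaining concern is two frames landing on the same codec tick, in which case the new PTS would have to be bumped by at least one to stay monotonic.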


-
Revision 18d323606d: Fix test to call clear system state in convolve_test. Assembly tests should clear (...)
12 December 2014, by Jim Bankoski — Changed paths:
Modify /test/convolve_test.cc
Fix test to call clear system state in convolve_test. Assembly tests should clear system state, as we have no expectation of proper system state in between test runs.
Change-Id: I0f591996c1f17ef2a5a8572a6b445f757223a144