
Advanced search
Other articles (112)
-
Customize by adding your logo, banner or background image
5 September 2013. Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013. Present changes to your MediaSPIP, or news about your projects, on your MediaSPIP via the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form for creating a news item.
News item creation form. For a document of the news type, the default fields are: Publication date (customize the publication date) (...)
-
Publishing on MédiaSpip
13 June 2013. Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.
On other sites (5999)
-
ffmpeg: how to calculate a complete frame (an 'I' frame) from a 'P' frame. Is this conceptually correct?
15 February 2014, by Whoami
I am trying to understand how ffmpeg handles streaming video.
What I understood:
I get frames from the IP camera like 'IPPPPPPPPPPPPPPPPIPPPPPPP'.
Frame 'I' is a complete frame, whereas frame 'P' depends on the previous 'P' or 'I' frame, whichever is closest.
I get the frame by using avcodec_decode_video2:
while (av_read_frame(context, &packet) >= 0)
{
    // LOGD("Received PACKET... DTS and PTS %ld and %ld", packet.dts, packet.pts);
    if (packet.stream_index == videoStreamIndex) {
        avcodec_decode_video2(pCodecCtx, pFrame, &finished, &packet);
        if (finished) {
            // Here is my frame; I get its type via av_get_picture_type_char(pFrame->pict_type).
        }
    }
}
Now, when I display just the frames that I have received, whenever an 'I' frame is received it displays properly, but when 'P' frames are received the image goes for a toss.
1) Do we need to do any calculation manually to convert a 'P' frame into an 'I' frame so that it can be rendered?
2) If not, what do I have to take care of? Does DTS/PTS calculation do the magic here?
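As an aside, the GOP pattern described above ('IPPP...IPPP...') can be checked without writing any decoder code by asking ffprobe for each frame's pict_type. A minimal sketch, assuming ffprobe is on the PATH; the helper function names are illustrative, not part of any API:

```python
import subprocess

def ffprobe_pict_types(path):
    """Return the sequence of frame types (e.g. 'IPPPIPPP') for a video file.

    Invokes ffprobe with -show_frames, printing only the pict_type field as
    bare CSV (one letter per line); requires ffprobe on the PATH.
    """
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "frame=pict_type", "-of", "csv=p=0", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_pict_types(out)

def parse_pict_types(csv_text):
    """Collapse ffprobe's one-type-per-line CSV output into a single string."""
    return "".join(
        line.strip().strip(",") for line in csv_text.splitlines()
        if line.strip().strip(",")
    )
```

For a typical IP-camera stream, `ffprobe_pict_types(...)` would return a string such as 'IPPPPPPPIPPPPPPP', matching the pattern the question describes.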
-
Stream camera/video from python to client using ffmpeg
31 January 2018, by Rahul
So I'm making an application where I have connected my camera to my Python code through OpenCV.
Now I can process each frame inside my Python code, but I want to stream this live feed to users.
I found that FFmpeg can be used to achieve this, but I'm wondering how.
Please also suggest any solution that achieves this with another library. The only condition is that I should be able to stream from my Python code, perhaps via a subprocess.
Thanks in advance :)
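One common way to do what the question above describes is to pipe raw frames from OpenCV into an ffmpeg subprocess that handles encoding and delivery. A rough sketch, assuming ffmpeg is on the PATH; the RTMP URL and function names are placeholders, not from the original post:

```python
import subprocess

def ffmpeg_stream_cmd(width, height, fps, url):
    """Build an ffmpeg command that reads raw BGR frames from stdin
    and pushes an H.264 FLV stream to an RTMP endpoint."""
    return [
        "ffmpeg", "-f", "rawvideo", "-pix_fmt", "bgr24",
        "-s", f"{width}x{height}", "-r", str(fps), "-i", "-",  # raw frames on stdin
        "-c:v", "libx264", "-preset", "veryfast", "-pix_fmt", "yuv420p",
        "-f", "flv", url,
    ]

def stream_camera(url="rtmp://example.com/live/stream"):  # placeholder URL
    import cv2  # OpenCV, imported lazily so the command builder stays dependency-free
    cap = cv2.VideoCapture(0)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    proc = subprocess.Popen(ffmpeg_stream_cmd(w, h, 30, url), stdin=subprocess.PIPE)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # ... per-frame processing happens here ...
            proc.stdin.write(frame.tobytes())  # hand the raw frame to ffmpeg
    finally:
        cap.release()
        proc.stdin.close()
        proc.wait()
```

The design point is that ffmpeg, not Python, does the heavy lifting: Python only writes raw pixels to the pipe, and ffmpeg encodes and streams them.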
-
Extended client ownership of MediaCodec encoder output buffers for RTMP streaming
13 February 2014, by dbro
Background:
I've connected Android's MediaCodec to FFmpeg for muxing a variety of formats not supported by MediaMuxer, including rtmp:// output via an .flv container. Such streaming muxers require longer, unpredictable ownership of MediaCodec's output buffers, as they may perform network I/O at any packet-processing step. For my video stream, I'm using MediaCodec configured for Surface input. To decouple muxing from encoding, I queue MediaCodec's ByteBuffer output buffers to my muxer via a Handler.
All works splendidly if I mux the .flv output to a file rather than to the rtmp endpoint.
Problem:
When muxing to an rtmp://... endpoint, I notice my streaming application begins to block on calls to eglSwapBuffers(mEGLDisplay, mEncodingEGLSurface) at dequeueOutputBuffer() once I'm retaining even a few MediaCodec output buffers in my muxing queue, as MediaCodec seems to be locked to only 4 output buffers.
Are there any tricks to avoid copying all encoder output returned by MediaCodec#dequeueOutputBuffer and immediately calling releaseOutputBuffer(...)?
The full source of my project is available on GitHub. Specifically, see:
- AndroidEncoder.java: abstract Encoder class with shared behavior between audio and video encoders, mainly drainEncoder(). Writes data to a Muxer instance.
- FFmpegMuxer.java: implements Muxer.
- CameraEncoder.java: sends camera frames to an AndroidEncoder subclass configured for video encoding.
Systrace
Here's some systrace output streaming 720p @ 2 Mbps video to Zencoder.
Solved
Copying then releasing the MediaCodec encoder output ByteBuffers as soon as they're available solves the issue without significantly affecting performance. I recycle the ByteBuffer copies in an ArrayDeque<ByteBuffer> for each muxer track, which limits the number of allocations.