
Other articles (9)
-
Supporting all media types
13 April 2011
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (HTML, CSS), LaTeX, Google Earth and (...)
-
General document management
13 May 2011
MediaSPIP never modifies the original document that is uploaded.
For each uploaded document it performs two successive operations: it creates an additional version that can easily be viewed online, while keeping the original available for download in case it cannot be read in a web browser; and it retrieves the original document's metadata in order to describe the file textually.
The tables below explain what MediaSPIP can do (...)
-
Videos
21 April 2011
As with "audio" documents, MediaSPIP displays videos whenever possible using the HTML5 video tag.
One drawback of this tag is that it is not handled correctly by some browsers (Internet Explorer, to name one) and that each browser natively supports only certain video formats.
Its main advantage, on the other hand, is that video playback is handled natively by the browser, which removes the need for Flash and (...)
On other sites (3684)
-
FFMpeg Muxer Internal Buffering
7 March 2016, by rkrishnan2012
I am using libFFmpeg to mux H.264 and AAC packets into an FLV container and stream that via RTMP.
I want to be able to detect when frames are being sent to the server, or the size of the internal buffer, in order to detect network lag and apply a variable bit rate. However, when I clock the time it takes for av_write_frame, it is much too fast to be synchronous (so we can't use this to detect network speed). I already tried calling av_write_frame with a null parameter (to flush data) every frame, and also directly calling avio_flush, but the bitrate is still too high. So I am wondering if there is another buffer that I am missing (such as a socket-level buffer). I also tried printing the AVIOContext->buf_end, buffer_size, and buf_ptr values, but none of those are indicative of my network speed getting throttled (when I limit the bandwidth on my router, I would expect the buffer size to increase).
tl;dr - I would like to know how to properly detect packets being sent to the server, or the network speed lagging compared to my av_write_frame calls.
If it makes a difference, this is being run on an arm-v7a Android device.
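As an illustrative aside (not part of the original question): one way to see when bytes actually reach the network layer is to give the muxer a custom AVIOContext and time its write callback, which forwards to a separately opened, blocking RTMP connection. The helper names below (measure_write, attach_measuring_io) are invented for this sketch.

extern "C" {
#include <libavformat/avformat.h>
}
#include <chrono>
#include <cstdio>

// The real, blocking RTMP connection, opened separately from the muxer.
static AVIOContext *rtmp_io = nullptr;

// Called by libavformat whenever it flushes the muxer's I/O buffer. Because
// RTMP protocol writes block until the data is handed to the socket, timing
// this call approximates how fast the network is accepting the stream.
static int measure_write(void *opaque, uint8_t *buf, int buf_size)
{
    auto t0 = std::chrono::steady_clock::now();
    avio_write(rtmp_io, buf, buf_size);
    avio_flush(rtmp_io);                      // push it out of the inner buffer
    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                  std::chrono::steady_clock::now() - t0).count();
    std::printf("sent %d bytes in %lld ms\n", buf_size, (long long)ms);
    return buf_size;
}

// Attach the wrapper to the FLV muxer context before avformat_write_header();
// a small buffer keeps the callback frequency high enough to be useful.
static int attach_measuring_io(AVFormatContext *oc, const char *rtmp_url)
{
    if (avio_open2(&rtmp_io, rtmp_url, AVIO_FLAG_WRITE, nullptr, nullptr) < 0)
        return -1;
    const int io_size = 4096;
    unsigned char *io_buf = (unsigned char *)av_malloc(io_size);
    oc->pb = avio_alloc_context(io_buf, io_size, 1, nullptr,
                                nullptr, measure_write, nullptr);
    oc->flags |= AVFMT_FLAG_CUSTOM_IO;        // we manage this I/O ourselves
    return oc->pb ? 0 : -1;
}

Note that the OS socket buffer can still absorb some data, so this measures throughput trends rather than exact per-frame latency.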
-
ffmpeg get frame time stamp
19 November 2018, by Anshul G.
I am trying to record a webcam video using ffmpeg. I have a Logitech C922 Pro Stream Webcam. This is the command that I use:
ffmpeg -f v4l2 -framerate 60 -video_size 1280x720 -input_format mjpeg -i /dev/video1 out.mp4
My application requires me to get the exact timestamp for every frame. While I could use my knowledge of the framerate and frame number to add the required interval to the start time, I am afraid that this might not be completely accurate.
Firstly, I have noticed that while recording, the console seems to initially display a far higher fps than the one I have set:
Press [q] to stop, [?] for help
frame= 177 fps= 85 q=-1.0 Lsize= 502kB time=00:00:02.91 bitrate=1410.8kbits/s dup=144 drop=0
Also, I think that ffmpeg sometimes drops frames in between.
However, my videos seem to have the correct number of frames, so I think the fps value displayed could instead refer to the encoding/decoding speed. I am not sure about the frame dropping.
I would be happy if you could let me know what you think, or suggest an alternative so that I can timestamp my frames accurately. Thanks!
Edit:
I have understood that frame rate is correlated with ambient light, which can lead to heavy frame duplication. I am currently recording on Windows and have set frame rate as the priority in the Logitech Gaming Software. However, there is still the occasional dropped or duplicated frame. Does this affect the timestamps of the frames? Or can I extrapolate from the start time?
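As a hedged aside (not from the original post): one way to check the timestamps ffmpeg actually assigns is to add the showinfo filter, which logs the pts and pts_time of every frame it processes, for example:

ffmpeg -f v4l2 -framerate 60 -video_size 1280x720 -input_format mjpeg -i /dev/video1 -vf showinfo out.mp4

Comparing those pts_time values with the frame numbers would show whether duplicated or dropped frames shift the timestamps, or whether extrapolating from the start time is safe.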
-
Create RTP packets with audio or video for a particular codec in FFmpeg
15 May 2020, by easy_breezy
I'm new to FFmpeg and I want to stream audio and video data from my Android device to another device (PC, Android, iOS, Mac, it doesn't matter) in real time (RTP packets over UDP) via Wi-Fi or LTE. My audio is encoded with the G.711 or Opus codec and my video is H.264, so my question is: is there a way to create RTP packets for encoded audio or video using FFmpeg in C++ code?

I'd like to have something like this:
- I encode my audio and video using FFmpeg or a third-party library.
- I pass my encoded data and its metadata (such as a keyframe flag and some other codec-specific details) to some FFmpeg method, and this method returns an RTP packet filled with my encoded data and the codec-specific data.
- I take the RTP packet and send it over UDP using a Java/Kotlin socket.
It's very important for me to get the RTP packets and send them from the Java/Kotlin layer, not from C++.
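As an illustrative sketch (not from the original question; the names RtpPacketSink, on_rtp_packet and make_rtp_muxer are made up): libavformat's "rtp" muxer can be combined with a custom AVIOContext so that each finished RTP packet is handed back to the caller, for example to be passed up to Java/Kotlin via JNI and sent over a plain UDP socket.

extern "C" {
#include <libavformat/avformat.h>
}
#include <functional>

// Callback the application provides; each call receives one buffer of RTP
// data ready to be sent over UDP (e.g. forwarded to Kotlin through JNI).
using RtpPacketSink = std::function<void(const uint8_t *, int)>;

// libavformat invokes this when the RTP muxer flushes its I/O; with a buffer
// of roughly one MTU this is normally one complete RTP packet per call.
static int on_rtp_packet(void *opaque, uint8_t *buf, int buf_size)
{
    auto *sink = static_cast<RtpPacketSink *>(opaque);
    (*sink)(buf, buf_size);
    return buf_size;
}

// Build an RTP muxer for a single encoded stream (the RTP muxer carries one
// media type per context, so audio and video each need their own).
static AVFormatContext *make_rtp_muxer(const AVCodecParameters *codecpar,
                                       RtpPacketSink *sink)
{
    AVFormatContext *oc = nullptr;
    if (avformat_alloc_output_context2(&oc, nullptr, "rtp", nullptr) < 0)
        return nullptr;

    const int io_size = 1500;                        // roughly one MTU
    unsigned char *io_buf = (unsigned char *)av_malloc(io_size);
    oc->pb = avio_alloc_context(io_buf, io_size, 1, sink,
                                nullptr, on_rtp_packet, nullptr);
    oc->flags |= AVFMT_FLAG_CUSTOM_IO;               // we own the I/O context

    AVStream *st = avformat_new_stream(oc, nullptr);
    avcodec_parameters_copy(st->codecpar, codecpar); // H.264/Opus/G.711 params

    if (avformat_write_header(oc, nullptr) < 0)      // RTP has no file header;
        return nullptr;                              // this just sets up the muxer
    return oc;
}

// For every encoded AVPacket (pts/dts in the stream's time base):
//     av_write_frame(oc, &pkt);
// which triggers one or more on_rtp_packet() calls with packetized RTP data.

The receiver still needs to know the payload format; av_sdp_create() can generate an SDP description for such contexts, which would have to be delivered to the other side out of band.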