
Other articles (96)
-
Publishing on MédiaSpip
13 June 2013. Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or later. If needed, contact your MédiaSpip administrator to find out.
-
HTML5 audio and video support
13 April 2011. MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
Emballe médias: what is it for?
4 February 2011. This plugin is designed to manage sites for publishing documents of all types.
It creates "media": a "media" is an article, in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a given "media" article;
On other sites (8944)
-
Why does ffmpeg have higher latency on dark images?
19 November 2017, by doodoroma. I have a C# application that streams real camera images using ffmpeg. The input images are in raw, 8-bit grayscale format. I created an ffmpeg stream that feeds the images through standard input and sends the output packets to websocket clients.
I start an external ffmpeg process with this configuration:
-f rawvideo -pixel_format gray -video_size " + camera.Width.ToString() + "x" + camera.Height.ToString() + " -framerate 25 -i - -f mpeg1video -b:v 512k -s 320x240 -
Typical image size is 1040×1392 pixels.
I display the stream in the browser using the jsmpeg library.
This works with reasonable latency (500 ms on localhost), but when the camera image is really dark (a black image) the latency becomes extremely large (2-3 seconds on localhost). When something bright appears in the image again after a black period, it takes 2-3 seconds to "synchronize".
My theory is that black images are very easy to compress and generate very small packets, so jsmpeg has almost no information to display and waits until a complete data package arrives, but I couldn't prove it.
I played with ffmpeg parameters like bitrate and fps, but nothing changed.
Are there any settings I could try?
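One plausible direction (standard low-latency knobs, not a verified fix for this exact setup) is to trim input-side buffering and pin the encoder to a near-constant bitrate, so that almost-black frames still emit enough data to keep jsmpeg's buffer moving. A minimal Python sketch of launching such a process (the original application is in C#; the flag values below are assumptions to experiment with):

import subprocess

WIDTH, HEIGHT = 1040, 1392  # camera frame size from the question

# -fflags nobuffer / -probesize / -analyzeduration trim input buffering;
# -minrate/-maxrate/-bufsize push the encoder toward constant bitrate so dark
# frames still produce data; -g 25 and -bf 0 keep the GOP short with no reordering.
args = [
    "ffmpeg",
    "-fflags", "nobuffer", "-probesize", "32", "-analyzeduration", "0",
    "-f", "rawvideo", "-pixel_format", "gray",
    "-video_size", f"{WIDTH}x{HEIGHT}", "-framerate", "25", "-i", "-",
    "-f", "mpeg1video",
    "-b:v", "512k", "-minrate", "512k", "-maxrate", "512k", "-bufsize", "64k",
    "-g", "25", "-bf", "0",
    "-s", "320x240", "-",
]

proc = subprocess.Popen(args, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
# proc.stdin.write(frame)  # one grayscale frame = WIDTH * HEIGHT bytes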
-
FFmpeg drawtext and live coordinates with sendcmd/zmq
9 May 2019, by DavidK. I'd like to put a marker on a live video source, with external live data setting its x,y coordinates. sendcmd can read the text file, but it won't re-read it, so later updates are not executed. Is this possible only with zmq? And if so, can I use zmq as a single filter (with drawtext), not with filter_complex?
I have a Python script that exports live coordinates in the appropriate format to a cmd.txt file. I use Unix time for the coordinates, and I also copy the input timestamps from the live loopback device so they have almost the same time. There is a small delay, so I have compensated the exported timestamps by +1.5 s. This means the marker moves during this extra period (while the timestamps in cmd.txt are a bit ahead of the live source), but after that it does not update any more. I assume FFmpeg reads cmd.txt once and never re-reads it, even though my Python script keeps writing to it.
Example line of cmd.txt:
1557402120.3119707 drawtext reinit 'x=752:y=480';
This is the actual ffmpeg pipeline:
ffmpeg -fflags nobuffer -vaapi_device /dev/dri/renderD128 -f v4l2 -i /dev/video0 -vf "sendcmd=f=cmd.txt,drawtext=fontfile=font.ttf:fontsize=30:fontcolor=white:r=25:text='o',format=nv12,hwupload" -copyts -c:v h264_vaapi -qp 24 -y 0.mp4
The source is a loopback device with Unix time as the input timestamp.
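If sendcmd won't re-read the file, one alternative worth sketching is FFmpeg's zmq filter, which accepts filter commands at runtime over a ZeroMQ socket. zmq is an ordinary pass-through video filter, so it can sit in a plain -vf chain in front of drawtext rather than requiring filter_complex; giving the drawtext instance a name (drawtext@osd below, chosen only for illustration) makes it easy to address. A minimal Python sender using pyzmq, assuming the filter listens on its default bind address tcp://*:5555:

import zmq  # pyzmq

# Assumes ffmpeg was started with a chain such as:
#   -vf "zmq,drawtext@osd=fontfile=font.ttf:fontsize=30:fontcolor=white:text='o',format=nv12,hwupload"
ctx = zmq.Context()
sock = ctx.socket(zmq.REQ)
sock.connect("tcp://127.0.0.1:5555")

def move_marker(x: int, y: int) -> str:
    """Send a runtime 'reinit' command to the drawtext instance named 'osd'."""
    sock.send_string(f"drawtext@osd reinit x={x}:y={y}")
    return sock.recv_string()  # ffmpeg replies with a success or error message

print(move_marker(752, 480))  # coordinates taken from the cmd.txt example above

Unlike the sendcmd file, each message takes effect as soon as the filtergraph processes it, so the Python script can keep sending updates for as long as the stream runs.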
-
Swapping FFMPEG input source
21 November 2017, by stevendesu. I'm using FFMPEG to stream RTMP to a server. I want to change what I'm streaming without breaking the connection to this server.
My current FFMPEG command looks like this:
ffmpeg -f v4l2 -s 1280x720 -r 10 -i /dev/video0 -c:v libx264 -f flv -r 30 -pix_fmt yuv420p "rtmp://server live=true pubUser=user pubPasswd=pass playpath=stream"
If I want to change from /dev/video0 to /dev/video1, I have to stop this program and re-run the command, swapping out the -i bit.
Since FFMPEG can read from stdin as well as from files, I believe it should be possible to switch the input source on the fly, either by piping in the output of a different program or by using UNIX sockets. There may also be a solution built into FFMPEG that I'm not aware of.
My question now is: what's the simplest / least-code / most recommended way of switching these inputs? Is there a third-party tool that's recommended? Does FFMPEG have an alternate command-line parameter I've never heard of? If I do use a UNIX socket or stdin, is there a recommended way to change what's being written to it?
One of my concerns is that if I have FFMPEG reading from a UNIX socket while, in a separate shell, a different instance of FFMPEG writes to that socket, then during the brief period when I'm switching sources the first instance of FFMPEG (the one doing the broadcasting) would die, as it couldn't find any video data to stream.
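One way to avoid that, sketched below, is to make a single long-lived ffmpeg the broadcaster, feed it raw frames on stdin, and swap which capture process those frames come from; the broadcaster only exits when its stdin reaches end-of-file, so a brief stall while switching does not kill it. A rough Python sketch of the relay idea (the frame geometry, the placeholder RTMP URL, and the helper names are illustrative assumptions, not a tested recipe):

import subprocess

WIDTH, HEIGHT, FPS = 1280, 720, 10
FRAME_SIZE = WIDTH * HEIGHT * 3 // 2  # one yuv420p frame in bytes

# The broadcasting ffmpeg reads raw frames from stdin, so its RTMP connection
# never notices when the source of those frames changes.
broadcaster = subprocess.Popen(
    ["ffmpeg", "-f", "rawvideo", "-pixel_format", "yuv420p",
     "-video_size", f"{WIDTH}x{HEIGHT}", "-framerate", str(FPS), "-i", "-",
     "-c:v", "libx264", "-pix_fmt", "yuv420p", "-f", "flv",
     "rtmp://server/playpath"],  # placeholder URL
    stdin=subprocess.PIPE)

def capture(device: str) -> subprocess.Popen:
    """Start a capture ffmpeg that decodes a V4L2 device to raw frames on stdout."""
    return subprocess.Popen(
        ["ffmpeg", "-f", "v4l2", "-i", device,
         "-f", "rawvideo", "-pix_fmt", "yuv420p",
         "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS), "-"],
        stdout=subprocess.PIPE)

def relay(source: subprocess.Popen, frames: int) -> None:
    """Copy a fixed number of frames from the capture process into the broadcaster."""
    for _ in range(frames):
        data = source.stdout.read(FRAME_SIZE)
        if len(data) < FRAME_SIZE:
            break  # capture ended early
        broadcaster.stdin.write(data)
    broadcaster.stdin.flush()

cam0 = capture("/dev/video0")
relay(cam0, frames=100)   # stream from the first camera for a while
cam0.terminate()

cam1 = capture("/dev/video1")
relay(cam1, frames=100)   # then switch sources without touching the broadcaster
cam1.terminate()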