
Media (3)
-
Elephants Dream - Cover of the soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
Valkaama DVD Label
4 October 2011, by
Updated: February 2013
Language: English
Type: Image
-
Publish an image simply
13 April 2011, by ,
Updated: February 2012
Language: French
Type: Video
Other articles (53)
-
General document management
13 May 2011, by
MediaSPIP never modifies the original document that is put online.
For each document put online, it performs two successive operations: it creates an additional version that can easily be viewed online, while leaving the original downloadable in case it cannot be read in a web browser; and it retrieves the original document's metadata to describe the file textually.
The tables below explain what MediaSPIP can do (...) -
Sites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page. -
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (5935)
-
ffmpeg removes dead space in video but not audio
10 July 2015, by Peter Becich
I'm transferring a VHS tape on Ubuntu with ffmpeg, somagic-capture, the line-in jack and an EasyCap dongle.
There is a big issue with audio drift that, I believe, is caused by ffmpeg removing dead space from the video stream but not the audio stream.
The capture utility creates a stream:
somagic-capture --ntsc -c --luminance=2 --lum-aperture=3 \
2> $SOMAGIC_LOG |
Which is piped into ffmpeg. ffmpeg also captures from alsa:
ffmpeg -pixel_format uyvy422 -s:v 720x480 \
-framerate 29.97 -f rawvideo -i - -f alsa -thread_queue_size 1024 \
-i hw:0,0 -vf scale=w=640:h=480 -vcodec libx264 -preset ultrafast \
-shortest -c:a libfdk_aac -b:a 256k $OUTFILE
Is it true that ffmpeg by default removes dead space from a stream?
If so, would ffmpeg trim both streams even if only one is "dead" — suppose the video is momentarily blank while the audio is noisy.
If not, are these two streams incompatible with each other? Is the ALSA input (line-in jack) on a different clock than the EasyCap video dongle? (See the sketch after the full script below.)
Input #0, rawvideo, from 'pipe:':
Duration: N/A, start: 0.000000, bitrate: 165722 kb/s
Stream #0:0: Video: rawvideo (UYVY / 0x59565955), uyvy422, 720x480,
165722 kb/s, 29.97 tbr, 29.97 tbn, 29.97 tbc
Guessed Channel Layout for Input Stream #1.0 : stereo
Input #1, alsa, from 'hw:0,0':
Duration: N/A, start: 1436492353.282796, bitrate: 1536 kb/s
Stream #1:0: Audio: pcm_s16le, 48000 Hz, 2 channels, s16, 1536 kb/s
I think the -shortest flag only concerns killing the ffmpeg process.
My prior, more general question didn't solve the problem: http://video.stackexchange.com/questions/14809/sync-up-ffmpeg-rawvideo-recording-and-audacity-alsa-recording
Thanks for any advice!
The full script:
#!/bin/sh
PIPE=/tmp/somagic-pipe
OUTFILEDIR=~/easycap/Videos/
LOGDIR=~/.somagic-log/
NOW=`date +"%m_%d_%Y_%H_%M_%S"`
OUTFILE=${OUTFILEDIR}fpv_${NOW}_video.mp4
mkdir $LOGDIR
FFMPEG_LOG=${LOGDIR}ffmpeg_video.log
SOMAGIC_LOG=${LOGDIR}somagic_video.log
MPLAYER_LOG=${LOGDIR}mplayer_video.log
rm $PIPE >/dev/null 2>&1
rm $OUTFILE >/dev/null 2>&1
rm $FFMPEG_LOG
rm $SOMAGIC_LOG
rm $MPLAYER_LOG
mkfifo $PIPE >/dev/null 2>&1
somagic-capture --ntsc -c --luminance=2 --lum-aperture=3 \
2> $SOMAGIC_LOG | ffmpeg -pixel_format uyvy422 -s:v 720x480 \
-framerate 29.97 -f rawvideo -i - -f alsa -thread_queue_size 1024 \
-i hw:0,0 -vf scale=w=640:h=480 -vcodec libx264 -preset ultrafast \
-shortest -c:a libfdk_aac -b:a 256k $OUTFILE
rm $PIPE >/dev/null 2>&1
Modified from here: https://gist.github.com/Brick85/0b327ac2d3d45e23ed33
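If the ALSA line-in and the EasyCap dongle really are on independent clocks, a common mitigation (my suggestion, not something the original post confirms) is to let ffmpeg stretch or squeeze the audio toward its timestamps with the aresample audio filter. A minimal sketch, reusing the capture command above with one added option:
ffmpeg -pixel_format uyvy422 -s:v 720x480 \
-framerate 29.97 -f rawvideo -i - -f alsa -thread_queue_size 1024 \
-i hw:0,0 -vf scale=w=640:h=480 -vcodec libx264 -preset ultrafast \
-af aresample=async=1000 \
-shortest -c:a libfdk_aac -b:a 256k $OUTFILE
aresample=async=1000 allows up to 1000 samples per second of correction (async=1 only compensates an initial offset); the value is a starting point to tune, not a verified fix, and it only helps if the drift really is clock skew rather than dropped frames.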
-
Low latency video streaming on android
17 May 2021, by Louis Blenner
I'd like to be able to stream the video from my webcam to an Android app with a latency below 500 ms, on my local network.


To capture and send the video over the network, I use ffmpeg.


ffmpeg -f v4l2 -i /dev/video0 -preset ultrafast -tune zerolatency -vcodec libx264 -an -vf format=yuv420p -f mpegts udp://192.168.1.155:5000



This command takes the webcam as input, converts it, and sends it to the device as an MPEG-TS stream over UDP.
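As an aside (my addition, not part of the original question), start-up delay on the sender side can also be reduced by shortening the GOP so that a joining player receives a keyframe sooner, for example:

ffmpeg -f v4l2 -i /dev/video0 -preset ultrafast -tune zerolatency -vcodec libx264 -g 15 -bf 0 -an -vf format=yuv420p -f mpegts udp://192.168.1.155:5000

-g 15 requests a keyframe every 15 frames and -bf 0 disables B-frames (zerolatency already implies the latter); both are standard libx264 options, and 15 is an arbitrary starting value.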


I am able to play the video on another PC with a latency below 500 ms, using commands like


gst-launch-1.0 -v udpsrc port=5000 ! video/mpegts ! tsdemux ! h264parse ! avdec_h264 ! fpsdisplaysink sync=false



or


mpv udp://0.0.0.0:5000 --no-cache --untimed --no-demuxer-thread --video-sync=audio --vd-lavc-threads=1 



So it is possible to have this range of latency.

I'd like to have the same thing on Android.

Here is what I tried.


ExoPlayer

After looking at the different players available for Android, ExoPlayer seems like the go-to choice.

I tried the different options indicated in the live-streaming documentation, but I always end up with a stream that takes seconds to start and has a latency of several seconds.

I tried adding a Button that seeks to the default position of the window, but it results in several seconds of loading.

DefaultExtractorsFactory extractorsFactory =
        new DefaultExtractorsFactory()
                .setTsExtractorFlags(DefaultTsPayloadReaderFactory.FLAG_IGNORE_AAC_STREAM);

player = new SimpleExoPlayer.Builder(this)
        .setMediaSourceFactory(new DefaultMediaSourceFactory(this, extractorsFactory))
        .setLoadControl(new DefaultLoadControl.Builder()
                // minBufferMs, maxBufferMs, bufferForPlaybackMs, bufferForPlaybackAfterRebufferMs
                .setBufferDurationsMs(
                        DefaultLoadControl.DEFAULT_MIN_BUFFER_MS,
                        DefaultLoadControl.DEFAULT_MAX_BUFFER_MS,
                        200,
                        200)
                .build())
        .build();
MyPlayerView playerView = findViewById(R.id.player_view);
// Bind the player to the view.
playerView.setPlayer(player);
// Build the media item.
MediaItem mediaItem = new MediaItem.Builder()
        .setUri(Uri.parse("udp://0.0.0.0:5000"))
        .setLiveMaxOffsetMs(500)
        .setLiveTargetOffsetMs(0)
        .setLiveMinOffsetMs(0)
        .build();
// Set the media item to be played.
player.setMediaItem(mediaItem);
// Prepare the player.
player.setPlayWhenReady(true);
player.prepare();
//player.seekToDefaultPosition();



This issue describes the same problem, and the conclusion was that ExoPlayer was not fit for this use case:

"I'll be honest, ultra low-latency like this isn't ExoPlayer's main use-case"

VLC

Another try was to use the VLC library.

But I was unable to get the same low-latency stream with VLC as with the two previous examples.

I tried changing VLC's preferences to stream as fast as possible:

Input/Codecs -> x264 preset: ultrafast - zerolatency
Input/Codecs -> Access Module: UDP input
Input/Codecs -> Clock Jitter: 500
Audio: disable audio



I also tried reducing the different buffers.

However, I still have a latency of more than a second with that.
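One way to narrow this down (my suggestion, not from the original post) is to first find a working option set with desktop VLC on the command line, then mirror those values in the Android preferences. Network caching is the dominant buffer:

vlc --network-caching=150 --clock-jitter=0 udp://@:5000

--network-caching is in milliseconds, and udp://@:5000 is VLC's syntax for listening on UDP port 5000; 150 ms is an untested starting point.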

GStreamer

Another try was to create a react-native project to use the different players available here.

One player that seemed promising was react-native-gstreamer, because it uses GStreamer, which can stream with low latency (the gst-launch command above).

But the library is now outdated.

Question


There were other tries, but none were successful.

Is there a problem with one of my approaches?

And if not, is there a player on Android (that I missed) which can achieve a low-latency stream like GStreamer or mpv on Linux?

-
ffplay 461 Unsupported transport
28 October 2017, by Cerato
Please help me understand what I am doing wrong.
I am executing the following command to create a live stream:
Executing: powershell gst-launch-1.0 filesrc location=D:/testVODs/sample.mkv ! matroskademux name=dmx dmx. ! h264parse config-interval=1 ! queue ! rtspclientsink profiles=avpf protocols=udp location="rtsp://192.168.1.1:8554/8c83bea4-b49b-4776-9c0f-d11e9e8898f0/1" name=rtsp_out dmx. ! opusparse ! queue ! rtsp_out.
In the result, I see that the execution ended after 00:01:21, which means the stream ended successfully when my video finished:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.1.1:8554/8c83bea4-b49b-4776-9c0f-d11e9e8898f0/1
Progress: (open) Retrieving server options
Progress: (open) Opened Stream
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending RECORD request
Progress: (record) Sending server stream info
Progress: (request) Sending RECORD request
Progress: (request) SETUP stream 1
Progress: (request) SETUP stream 0
Progress: (record) Starting recording
Got EOS from element "pipeline0".
Execution ended after 0:01:21.318315052
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
At the same time, I am trying to play the stream using the following request:
Executing: powershell ffplay -rtsp_transport http "rtsp://192.168.1.1:8554/live/8c83bea4-b49b-4776-9c0f-d11e9e8898f0/1"
but I get the following response:
ffplay started on 2017-10-27 at 18:52:13
Report written to "D:/testVODs/Stream1to.log"
ffplay version 3.4 Copyright (c) 2003-2017 the FFmpeg developers
built with gcc 7.2.0 (GCC)
configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
libavutil 55. 78.100 / 55. 78.100
libavcodec 57.107.100 / 57.107.100
libavformat 57. 83.100 / 57. 83.100
libavdevice 57. 10.100 / 57. 10.100
libavfilter 6.107.100 / 6.107.100
libswscale 4. 8.100 / 4. 8.100
libswresample 2. 9.100 / 2. 9.100
libpostproc 54. 7.100 / 54. 7.100
nan : 0.000 fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0
(status line repeated while waiting for the stream)
[rtsp @ 000001930a05e500] method SETUP failed: 461 Unsupported transport
rtsp://192.168.1.1:8554/live/8c83bea4-b49b-4776-9c0f-d11e9e8898f0/1: Unknown error
Why does the SETUP method fail with 461 Unsupported transport, and how can it be solved?
I tried different ffplay and ffmpeg commands, but all of them led to the same result.
If I do not use the -rtsp_transport option, I get the error [udp @ 000001d1ab52c300] 'circular_buffer_size' option was set but it is not supported on this build (pthread support is required)
and the same 461 Unsupported transport.
Before I added profiles=avpf to my stream creation request, I managed to play the stream successfully with -rtsp_transport tcp.
But now profiles=avpf is required, and I need to understand how to deal with it.
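One thing worth checking (my reading of the logs, not a confirmed diagnosis): the publishing pipeline uses protocols=udp, so the server may only offer RTP over UDP, and a SETUP request asking for HTTP tunnelling or TCP interleaving would then be refused with 461. Requesting UDP explicitly from ffplay would match what the server advertises:

Executing: powershell ffplay -rtsp_transport udp "rtsp://192.168.1.1:8554/live/8c83bea4-b49b-4776-9c0f-d11e9e8898f0/1"

-rtsp_transport udp is a standard ffplay option; whether this server accepts it together with the avpf profile is an assumption to verify. If it still fails, a GStreamer receiver can at least confirm the published stream is playable, since rtspsrc exposes both a protocols and a profiles setting (again an assumption worth testing, not a known fix):

gst-launch-1.0 rtspsrc location="rtsp://192.168.1.1:8554/live/8c83bea4-b49b-4776-9c0f-d11e9e8898f0/1" protocols=udp ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink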