
Other articles (48)
-
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
-
Contribute to a better visual interface
13 April 2011. MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.
-
Submit enhancements and plugins
13 April 2011. If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know and its integration into the core MediaSPIP functionality will be considered.
You can use the development discussion list to ask for help with creating a plugin. As MediaSPIP is based on SPIP, you can also use the SPIP discussion list, SPIP-Zone.
On other sites (9156)
-
On-the-fly transcode and stream with VP9 and MPEG-DASH
18 July 2016, by Calvin W. In my VOD system I want to add a unique watermark to the video with FFmpeg, so I need an on-the-fly transcoding and streaming method.
This article describes how to create non-muxed, chunked WebM files for live streaming. I tried replacing the input with my MP4 file, and it does generate chunks of video/audio files.
Is there any way to generate a VOD-oriented MPD file, one that contains the video duration, rather than a live-stream manifest? Thanks.
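
A rough sketch of one way to do this (not from the question; input.mp4, the watermark text and the bitrates are placeholders, and drawtext needs an ffmpeg build with libfreetype): burn the per-user watermark with the drawtext filter and package with the dash muxer in plain file mode, without any of the live options.

# Hypothetical example: per-user watermark plus a static, VOD-style DASH package.
# Filenames, watermark text and bitrates are placeholders.
ffmpeg -i input.mp4 \
  -vf "drawtext=text='user-1234':x=10:y=10:fontsize=24:fontcolor=white" \
  -c:v libvpx-vp9 -b:v 1M \
  -c:a libopus -b:a 128k \
  -f dash \
  -seg_duration 4 \
  -use_template 1 -use_timeline 1 \
  vod.mpd

Because none of the live options (-streaming, -window_size) are used, the dash muxer should finalize the manifest as a static MPD that includes mediaPresentationDuration, i.e. a VOD-oriented manifest rather than a dynamic live one.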
-
How to generate valid live DASH for YouTube?
24 September 2019, by Matt Hensley. I am attempting to implement YouTube live video ingestion via DASH as documented at:
https://developers.google.com/youtube/v3/live/guides/encoding-with-dash
To start, I am exercising the YouTube API manually and running ffmpeg to verify the required video parameters before implementing this in my app.
Created a new live stream with liveStreams.insert and these values for the cdn field:
"cdn": {
  "frameRate": "variable",
  "ingestionType": "dash",
  "resolution": "variable"
}
Created a broadcast via liveBroadcasts.insert, then used liveBroadcasts.bind to bind the stream to the broadcast.
Then I grabbed the ingestionInfo from the stream and ran this ffmpeg command, copying in the ingestionAddress with the streamName:
ffmpeg -stream_loop -1 -re -i mov_bbb.mp4 \
-loglevel warning \
-r 30 \
-g 60 \
-keyint_min 60 \
-force_key_frames "expr:eq(mod(n,60),0)" \
-quality realtime \
-map v:0 \
-c:v libx264 \
-b:v:0 800k \
-map a:0 \
-c:a aac \
-b:a 128k \
-strict -2 \
-f dash \
-streaming 1 \
-seg_duration 2 \
-use_timeline 0 \
-use_template 1 \
-window_size 5 \
-extra_window_size 10 \
-index_correction 1 \
-adaptation_sets "id=0,streams=v id=1,streams=a" \
-dash_segment_type mp4 \
-method PUT \
-http_persistent 1 \
-init_seg_name "dash_upload?cid=${streamName}&copy=0&file=init$RepresentationID$.mp4" \
-media_seg_name "dash_upload?cid=${streamName}&copy=0&file=media$RepresentationID$$Number$.mp4" \
'https://a.upload.youtube.com/dash_upload?cid=${streamName}&copy=0&file=dash.mpd'
It appears all the playlist updates and video segments upload fine to YouTube; ffmpeg does not report any errors. However, the liveStream status always shows noData, and the YouTube Live Control Room doesn’t show the stream as receiving data.
The DASH output, when written to files, plays back fine in this test player. The playlist output doesn’t match the samples exactly, but it does have the required tags per the "MPD Contents" section in the documentation.
Are my ffmpeg arguments incorrect, or does YouTube have additional playlist format requirements that are not documented?
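
One way to see what YouTube itself reports about the ingest (a sketch, not part of the original question; ACCESS_TOKEN and STREAM_ID are placeholders) is to poll liveStreams.list with part=status while ffmpeg is pushing segments, since the noData indication comes from the stream's health status.

# Hypothetical status check against the YouTube Data API v3.
# ACCESS_TOKEN and STREAM_ID are placeholders.
curl -s \
  -H "Authorization: Bearer ${ACCESS_TOKEN}" \
  "https://www.googleapis.com/youtube/v3/liveStreams?part=status&id=${STREAM_ID}"

If the health status stays at noData even though the PUT requests succeed, the problem is more likely in the manifest or segment format than in connectivity.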
-
How to synchronize HLS and/or MPEG-DASH videos on multiple clients using ExoPlayer?
22 May 2019, by G C. I’m trying to guarantee synchronization between multiple clients using DASH and/or HLS. Synchronization between clients must fall within 40 milliseconds.
Live streaming seems to be an obvious choice. However, the only way to really get within a small synchronization window would be to lower the segment durations. Is this the only viable solution? Are there any tags that would help me keep clients within 40 milliseconds of the live time?
Currently, I’m using FFmpeg to encode the video and audio into live content.
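
A minimal sketch of the "shorter segments" idea (placeholder file names, assuming a 30 fps source and a reasonably recent ffmpeg dash muxer): one-second segments with a keyframe forced at every segment boundary.

# Hypothetical short-segment DASH packaging: 1-second segments, and a keyframe
# every 30 frames so segment boundaries can actually fall once per second.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -b:v 1M -g 30 -keyint_min 30 \
  -c:a aac -b:a 128k \
  -f dash \
  -streaming 1 \
  -seg_duration 1 \
  -use_template 1 -use_timeline 0 \
  -window_size 5 \
  live.mpd

Shorter segments let players sit closer to the live edge, but segment duration alone will not pin clients to within 40 ms of one another; that generally also needs a shared clock reference (for DASH, a UTCTiming element in the MPD) and playback-position or playback-speed adjustment on the client side.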