
Other articles (64)
-
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011. MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
-
Supporting all media types
13 April 2011. Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
On other sites (6579)
-
checkasm: x86: post commit review fixes
22 December 2015, by Janne Grunau
-
Unsplitting two video files with ffmpeg [duplicate]
2 September 2016, by exebook. This question is an exact duplicate of:
I cannot find a better word for this: I searched for "combine", "join", "merge" and "concat", and each time I got information on how to get two videos into one file so that the result plays the first video and then continues with the second. I, on the other hand, need them to play simultaneously in the resulting video, with the first file on the left and the second on the right. It is a video chat recording of two people's web cameras that needs to end up in a single file playing both at once. How can I do that with ffmpeg?
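One common approach (not part of the original question) is ffmpeg's hstack filter. A minimal sketch, assuming both recordings have the same height and frame rate, and using left.mp4 and right.mp4 as placeholder names for the two webcam files:
ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v][1:v]hstack=inputs=2[v]" -map "[v]" -map 0:a -c:v libx264 -c:a aac output.mp4
If the heights differ, one input can be scaled first, for example with -filter_complex "[1:v]scale=-2:480[r];[0:v][r]hstack=inputs=2[v]"; mixing both audio tracks would additionally need the amix filter.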
-
Tips for encoding a live stream to IIS Media Services (or Azure Live Media Services) with FFMPEG?
6 March 2014, by user3104748. We're trying to encode assets, either live or static, into a live stream to IIS Media Services using ffmpeg. Can anyone provide pointers on exactly which parameters we should be using and setting?
As part of our test, just to see if we can get things to work, we have a standard plain-old MP4 video static asset that we're trying to stream to the server. It seems to work on the client side, but when we try to view the video on the receiving end, we get nothing.
Here's an example of the command we're using, where gg.mp4 is the static MP4 video (obviously (hostname) is the name of our host, not the literal word in parentheses):
ffmpeg -y -re -i gg.mp4 -movflags isml+frag_keyframe -f ismv -threads 0 -c:a libvo_aacenc -ac 2 -b:a 64k -c:v libx264 -preset fast -profile:v baseline -g 48 -keyint_min 48 -map 0:v -b:v:0 477k -s:v:0 368x152 -map 0:v -b:v:1 331k -s:v:1 288x120 -map 0:v -b:v:2 230k -s:v:2 224x92 -map 0:a:0 http://(hostname)/ingest.isml/Streams(video)
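To make the flags easier to scan, here is the same invocation restated as an annotated bash array; the line breaks and comments are added here, with flag meanings paraphrased from ffmpeg's documentation rather than taken from the question:
args=(
  -y -re -i gg.mp4                                # overwrite output; read gg.mp4 at its native frame rate (simulates a live source)
  -movflags isml+frag_keyframe -f ismv            # fragmented MP4 (ISMV / Smooth Streaming) output, fragments cut at keyframes
  -threads 0                                      # let ffmpeg choose the thread count
  -c:a libvo_aacenc -ac 2 -b:a 64k                # AAC audio, stereo, 64 kbit/s
  -c:v libx264 -preset fast -profile:v baseline   # H.264 baseline-profile video
  -g 48 -keyint_min 48                            # fixed GOP of 48 frames so fragments align across renditions
  -map 0:v -b:v:0 477k -s:v:0 368x152             # video rendition 0
  -map 0:v -b:v:1 331k -s:v:1 288x120             # video rendition 1
  -map 0:v -b:v:2 230k -s:v:2 224x92              # video rendition 2
  -map 0:a:0                                      # the single audio rendition
  "http://(hostname)/ingest.isml/Streams(video)"  # the Smooth Streaming publishing point being pushed to
)
ffmpeg "${args[@]}"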