
Media (1)
-
MediaSPIP Simple: the future default graphical theme?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
Other articles (100)
-
MediaSPIP 0.1 Beta version
25 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here contains only the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011, by
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things):
- implementation costs to be shared between several different projects / individuals
- rapid deployment of multiple unique sites
- creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (14486)
-
Need explanation of details of ffmpeg and pipes command
6 December 2011, by Don Nummer Jr
Got the following from the FFmpeg FAQ:
mkfifo intermediate1.mpg
mkfifo intermediate2.mpg
ffmpeg -i input1.avi -sameq -y intermediate1.mpg < /dev/null &
ffmpeg -i input2.avi -sameq -y intermediate2.mpg < /dev/null &
cat intermediate1.mpg intermediate2.mpg |\
ffmpeg -f mpeg -i - -sameq -vcodec mpeg4 -acodec libmp3lame output.avi
Before I use or modify it I would like to understand it completely.
What does
< /dev/null &
do? I understand | is a pipe, but why |\ ?
What is the -f mpeg after ffmpeg? (It seems to tell ffmpeg to accept the piped output from cat?)
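The shell constructs asked about can be demonstrated without ffmpeg at all, since they are plain POSIX shell mechanics. A minimal sketch, using cat and printf as stand-ins for the ffmpeg processes:

```shell
# `< /dev/null` redirects stdin from /dev/null, so the command sees
# immediate end-of-file and can never block on (or steal) terminal
# input; `&` runs it in the background. In the FAQ snippet both matter
# because the two backgrounded ffmpeg processes block writing into
# their fifos until the final command starts reading them.
cat < /dev/null > first.txt &
wait                          # wait for the background job to finish

# `|` pipes stdout of the left command into stdin of the right one;
# the trailing `\` is only a line continuation, so `|\` is simply `|`
# with the pipeline split across two lines for readability.
printf 'hello\n' |\
cat > second.txt

# `-f mpeg` forces ffmpeg's input format: with `-i -` ffmpeg reads the
# concatenated stream from stdin (the output of cat) and has no file
# extension to probe, so the container format must be stated explicitly.
cat first.txt second.txt      # first.txt is empty, so this prints "hello"
```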
-
ffmpeg from erlang using open_port
19 October 2011, by user1002473
I have a problem with ffmpeg when I try to use it in pipe-to-pipe mode from Erlang. This is my code listing:
fun(Data) ->
Port = open_port(
{spawn, "ffmpeg -i - -acodec copy -vcodec copy -f flv - "},
[binary,stream,use_stdio,exit_status]
),
Port ! {self(), {command, <<Data/binary>>}},
receive_data(Port).
and I get this error on stderr:
av_interleaved_write_frame(): Broken pipe
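As context for that error (a sketch, not specific to the Erlang port): "Broken pipe" from av_interleaved_write_frame() means ffmpeg tried to write output into a pipe whose reading end had already closed. The same condition is easy to reproduce with plain shell tools:

```shell
# `head` exits after reading one line; when `yes` next writes to the
# pipe it receives SIGPIPE ("Broken pipe"). ffmpeg hits the same
# condition if whatever sits on the other side of its stdout (here,
# the Erlang port) stops reading before ffmpeg has finished writing.
yes | head -n 1
```

Under that assumption, a likely suspect in the snippet above is that receive_data/1 stops reading (or the port closes) before ffmpeg has flushed its FLV output to stdout.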
-
Decode android's hardware encoded H264 camera feed using ffmpeg in real time
31 October 2012, by user971871
I'm trying to use the hardware H264 encoder on Android to create video from the camera, and use FFmpeg to mux in audio (all on the Android phone itself).
What I've accomplished so far is packetizing the H264 video into rtsp packets, and decoding it using VLC (over UDP), so I know the video is at least correctly formatted. However, I'm having trouble getting the video data to ffmpeg in a format it can understand.
I've tried sending the same rtsp packets to port 5006 on localhost (over UDP), then providing ffmpeg with the sdp file that tells it which local port the video stream is coming in on and how to decode the video, if I understand rtsp streaming correctly. However, this doesn't work and I'm having trouble diagnosing why, as ffmpeg just sits there waiting for input.
For reasons of latency and scalability I can't just send the video and audio to the server and mux it there; it has to be done on the phone, in as lightweight a manner as possible.
What I guess I'm looking for are suggestions as to how this can be accomplished. The optimal solution would be sending the packetized H264 video to ffmpeg over a pipe, but then I can't send ffmpeg the sdp file parameters it needs to decode the video.
I can provide more information on request, like how ffmpeg is compiled for Android, but I doubt that's necessary.
Oh, and the way I start ffmpeg is through the command line; I would really rather avoid mucking about with JNI if at all possible.
Any help would be much appreciated, thanks.