
Other articles (30)
-
Publishing on MédiaSpip
13 June 2013: Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is running version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.
-
Supported formats
28 January 2010: The following commands provide information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
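To check support for one particular codec, the listing can be piped through grep; a minimal sketch (the codec name h264 is just an example):

ffmpeg -codecs 2>/dev/null | grep h264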
Supported input video formats
This list is not exhaustive; it highlights the main formats used:
h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
m4v: raw MPEG-4 video format
flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263
Theora
wmv:
Possible output video formats
To begin with, we (...)
-
Adding notes and captions to images
7 February 2011: To add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, editing and deleting notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...)
On other sites (6799)
-
Low-latency video shared on a local gigabit network using Linux [on hold]
6 May 2017, by user3387542: For a robotics task we need to share the live video (webcam) with about 6 or 7 users in the same room. OpenCV will be used on the clients to read the situation and send new tasks to the robots. Latency should not be much more than one second; the lower the better. What commands would you recommend for this?
We have one camera on a Linux host that needs to share its video with about 6 other machines just a few meters away.
I have already experimented with different setups. While raw video looks perfectly latency-free over the local loopback (there, the issue is the sheer amount of data), any compression suddenly adds about a second of delay.
And how should we share this over the network? Is broadcasting the right approach? How can it be so hard when the machines are right next to each other? It works locally, but there are issues over the network.
#server
ffmpeg -f video4linux2 -r 10 -s 1280x720 -i /dev/video0 -c:v libx264 -preset veryfast -tune zerolatency -pix_fmt yuv420p -f mpegts - | socat - udp-sendto:192.168.0.255:12345,broadcast
#client
socat -u udp-recv:12345,reuseaddr - | vlc --live-caching=0 --network-caching=0 --file-caching=0 -

#raw video - perfectly fine like this, video with many artefacts if sent over the network
ffmpeg -f video4linux2 -r 10 -s 1280x720 -i /dev/video0 -c:v rawvideo -f rawvideo -pix_fmt yuv420p - | vlc --demux rawvideo --rawvid-fps 10 --rawvid-width 1280 --rawvid-height 720 --rawvid-chroma I420 -
The technology used doesn't matter, and we don't care about network load either. We just want to use OpenCV on the different clients with live data.
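One possible direction, offered as a sketch rather than a tested answer: have ffmpeg send the MPEG-TS stream over UDP multicast itself, instead of piping through socat and broadcasting, and open the stream on each client with ffplay with its buffering turned down. The multicast address 239.0.0.1 and the port are placeholder choices:

#server: multicast the encoded stream directly from ffmpeg
ffmpeg -f video4linux2 -r 10 -s 1280x720 -i /dev/video0 -c:v libx264 -preset ultrafast -tune zerolatency -pix_fmt yuv420p -f mpegts 'udp://239.0.0.1:12345?pkt_size=1316'

#client: shrink probing and buffering to keep latency low
ffplay -fflags nobuffer -flags low_delay -probesize 32 -analyzeduration 0 udp://239.0.0.1:12345

Multicast puts a single copy of the stream on the wire while letting any number of clients in the room subscribe, which matches the 6 or 7 receivers better than per-client unicast.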
-
Cannot copy RTSP-streamed video on Android using FFmpeg [on hold]
28 March 2017, by user711457: I am developing an Android application in which I want to record live streamed video using the ffmpeg library. I wrote some code to record the live streamed video, but it doesn't work and shows an error. If anyone knows about this, please help me. This is the code I used to record the video:
try {
    fFmpeg.execute(new String[]{"ffmpeg -i rtsp://192.168.1.1:6667/streamhd -acodec copy -vcode c copy" + String.valueOf(getCacheDir()) + "/MyVideo.mp4"}, new ExecuteBinaryResponseHandler() {
        @Override
        public void onSuccess(String message) {
            Log.d("fffffff", "FFmpeg cmd success");
        }

        @Override
        public void onFailure(String message) {
            Log.d("ffffffffffff", message.toString());
        }
    });
} catch (FFmpegCommandAlreadyRunningException e) {
    // Handle if FFmpeg is already running
    e.printStackTrace();
    Log.w(null, e.toString());
}

When I execute this block I get the following error message:
03-27 12:48:47.109 2042-2042/com.steelmanpro.wifivideoscope D/ffffffffffff: ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.8 (GCC)
configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
libavutil 55. 17.103 / 55. 17.103
libavcodec 57. 24.102 / 57. 24.102
libavformat 57. 25.100 / 57. 25.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 31.100 / 6. 31.100
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
Output #0, mp4, to 'ffmpeg -i rtsp://192.168.1.1:6667/streamhd -acodec copy -vcode c copy/data/data/com.steelmanpro.wifivideoscope/cache/MyVideo.mp4':
Output file #0 does not contain any stream
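The log shows what went wrong: the whole string, including the word ffmpeg itself, was passed as a single array element, so ffmpeg took the entire text as the output file name, and -vcode c is a typo for -vcodec. As a sketch of the intended command (the URL and cache path are taken from the question), the arguments would need to go into the array as separate elements, without the leading ffmpeg, so that the equivalent command line becomes:

ffmpeg -i rtsp://192.168.1.1:6667/streamhd -acodec copy -vcodec copy /data/data/com.steelmanpro.wifivideoscope/cache/MyVideo.mp4

Note the space between copy and the output path, which the original string concatenation was missing.
-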
Is it possible for an Nginx RTMP server to receive RTSP and convert it to an RTMP stream using ffmpeg? [on hold]
11 February 2017, by Jun Kim: I want to stream live video and audio from the Android built-in camera to an nginx RTMP server, in order to publish it to multiple clients.
From the open-source API https://github.com/fyhertz/libstreaming-examples, I successfully implemented example 3, which streams live video and audio over RTSP to a Wowza media server. It works fine.
In order to implement it with the nginx RTMP server instead of the Wowza media server, I read some comments saying that I could convert the RTSP live stream to RTMP by using ffmpeg in the nginx.conf file. However, it doesn't seem to work.
The ffmpeg command below is what I put in the nginx.conf file:
"ffmpeg -rtsp_transport rtp -i "rtsp url" -vcodec copy -f flv -r 25 -s 1920x1080 "rtmp url" (since libstreaming streams camera using rtp over udp)Does anybody know if it is possible for nginx media server to receive RTSP from android and convert it to RTMP stream and publish it to multiple clients ?
If it is impossible, should I use a different API that helps stream live content over RTMP from an Android phone to the nginx RTMP server, such as JavaCV?
(https://github.com/bytedeco/javacv)
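One way this is commonly wired up, offered as a sketch rather than a verified configuration: the nginx-rtmp-module has an exec_static directive that can launch ffmpeg when the server starts, pulling the RTSP feed and republishing it as RTMP into an application block. The RTSP URL, port and stream name below are placeholders, and note that -rtsp_transport accepts udp or tcp as values, not rtp:

rtmp {
    server {
        listen 1935;
        application live {
            live on;
            # pull the RTSP feed from the phone and republish it as RTMP
            exec_static ffmpeg -rtsp_transport udp -i rtsp://phone-ip:8086/
                -vcodec copy -acodec copy -f flv rtmp://localhost/live/stream;
        }
    }
}

Clients would then play rtmp://<server>/live/stream. If the copied codecs turn out not to be FLV/RTMP-compatible, the copy options would have to be replaced with an actual re-encode.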