
Media (2)
-
SPIP - plugins - embed code - Example
2 September 2013
Updated: September 2013
Language: French
Type: Image
-
Publish an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (39)
-
Sites built with MediaSPIP
2 May 2011. This page presents some of the sites running MediaSPIP. You can of course add your own using the form at the bottom of the page.
-
List of compatible distributions
26 April 2011. The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.
Distribution name | Version name | Version number
Debian | Squeeze | 6.x.x
Debian | Wheezy | 7.x.x
Debian | Jessie | 8.x.x
Ubuntu | The Precise Pangolin | 12.04 LTS
Ubuntu | The Trusty Tahr | 14.04
If you want to help us improve this list, you can give us access to a machine whose distribution is not mentioned above, or send the necessary fixes to add (...)
-
Submit enhancements and plugins
13 April 2011. If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know and its integration into the core MediaSPIP functionality will be considered.
You can use the development discussion list to ask for help with creating a plugin. As MediaSPIP is based on SPIP, you can also use the SPIP discussion list, SPIP-Zone.
On other sites (7014)
-
How to stream live video from DJI Professional 3 camera?
29 April 2017, by raullalves
I have to get the live stream video from the DJI Phantom 3 camera into my C++ application, in order to do Computer Vision processing with OpenCV.
First I tried sending the raw H264 data through a UDP socket, inside this callback:

mReceivedVideoDataCallBack = new CameraReceivedVideoDataCallback() {
    @Override
    public void onResult(byte[] videoBuffer, int size) {
        // Here, I call a method from a class I created, that sends the buffer through UDP
        if (gravar_trigger) controleVideo.enviarFrame(videoBuffer, size);
        if (mCodecManager != null) mCodecManager.sendDataToDecoder(videoBuffer, size);
    }
};

The communication above works well. However, I haven't been able to decode that UDP H264 data in my C++ desktop application. I have tested with the FFmpeg lib, but couldn't manage to allocate an AVPacket with my UDP data in order to decode it using avcodec_send_packet and avcodec_receive_frame. I also had problems with AVCodecContext, since my UDP communication isn't a stream like RTSP, where the decoder could get information about its source. Therefore, I had to change how I was trying to solve the problem.
Then I found libstreaming, which can be used to stream the Android video camera to a Wowza server, creating something like an RTSP stream connection, where the data could be obtained easily in my final C++ application using OpenCV videoCapture. However, libstreaming uses its own surfaceView. In other words, I would have to link the libstreaming surfaceView with the DJI drone's videoSurface. I'm really new to Android, so I don't have any clue how to do that.
To sum up, is that the correct approach? Does someone have a better idea? Thanks in advance.
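One way the FFmpeg side could be approached (a minimal sketch, not the poster's code): since the DJI callback delivers raw Annex-B H264, each UDP payload can be run through FFmpeg's parser so that complete access units land in an AVPacket before calling avcodec_send_packet / avcodec_receive_frame. The decode_datagram helper, the H264Decoder struct and the (buf, len) parameters below are hypothetical names; only the libavcodec calls are real API.

// Sketch: decode raw Annex-B H264 arriving in UDP datagrams with FFmpeg.
// decode_datagram(), H264Decoder and (buf, len) are hypothetical names.
extern "C" {
#include <libavcodec/avcodec.h>
}

struct H264Decoder {
    AVCodecParserContext *parser;
    AVCodecContext *ctx;
    AVPacket *pkt;
    AVFrame *frame;
};

bool init_decoder(H264Decoder &d) {
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    if (!codec) return false;
    d.parser = av_parser_init(codec->id);
    d.ctx = avcodec_alloc_context3(codec);
    d.pkt = av_packet_alloc();
    d.frame = av_frame_alloc();
    // No extradata is set: the SPS/PPS arrive in-band in the Annex-B stream.
    return d.parser && d.ctx && d.pkt && d.frame &&
           avcodec_open2(d.ctx, codec, nullptr) == 0;
}

// Feed one UDP payload; the parser buffers bytes until a full frame is ready.
void decode_datagram(H264Decoder &d, const uint8_t *buf, int len) {
    while (len > 0) {
        int used = av_parser_parse2(d.parser, d.ctx, &d.pkt->data, &d.pkt->size,
                                    buf, len, AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
        buf += used;
        len -= used;
        if (d.pkt->size == 0) continue;              // parser needs more data
        if (avcodec_send_packet(d.ctx, d.pkt) == 0) {
            while (avcodec_receive_frame(d.ctx, d.frame) == 0) {
                // d.frame now holds a decoded YUV picture; convert it with
                // sws_scale (or wrap and convert it) before handing to OpenCV.
            }
        }
    }
}

With this pattern the AVCodecContext gets its stream parameters from the in-band SPS/PPS assembled by the parser rather than from a container, which is exactly the information that is missing when the bytes come from a bare UDP socket instead of an RTSP source.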
-
Ffmpeg 4 audio and 1 video input to 4 video outputs performance issue
22 February 2021, by Yusufu
I have created 4 null-sink Pulse monitors and they work fine.
When I publish 4 outputs to RTMP, the audio has noise and keeps stopping. With 2 outputs it works fine.
If I decrease the resolution from 1920x1080 to 1280x720 it also works fine.
I am using 4 different audio inputs and 1 video input:



SCREEN_WIDTH=1920
SCREEN_HEIGHT=1080
SCREEN_RESOLUTION=${SCREEN_WIDTH}x${SCREEN_HEIGHT}
COLOR_DEPTH=24
X_SERVER_NUM=2
VIDEO_BITRATE=3000
VIDEO_FRAMERATE=30
VIDEO_GOP=$((VIDEO_FRAMERATE))
AUDIO_BITRATE=160k
AUDIO_SAMPLERATE=44100
AUDIO_CHANNELS=1
#some codes here

ffmpeg -y \
 -hide_banner -loglevel error \
 -nostdin \
 -s ${SCREEN_RESOLUTION} \
 -r ${VIDEO_FRAMERATE} \
 -draw_mouse 0 \
 -f x11grab \
 -i ${DISPLAY} \
 -f pulse -i MySink1.monitor \
 -f pulse -i MySink2.monitor \
 -f pulse -i MySink3.monitor \
 -f pulse -i MySink4.monitor \
 -c:v libx264 \
 -pix_fmt yuv420p \
 -profile:v main \
 -preset veryfast \
 -minrate ${VIDEO_BITRATE} \
 -maxrate ${VIDEO_BITRATE} \
 -g ${VIDEO_GOP} \
 -map 0 -f flv -map 1 ${RTMP_URL1} \
 -c:v libx264 \
 -pix_fmt yuv420p \
 -profile:v main \
 -preset veryfast \
 -minrate ${VIDEO_BITRATE} \
 -maxrate ${VIDEO_BITRATE} \
 -g ${VIDEO_GOP} \
 -map 0 -f flv -map 2 ${RTMP_URL2} \
 -c:v libx264 \
 -pix_fmt yuv420p \
 -profile:v main \
 -preset veryfast \
 -minrate ${VIDEO_BITRATE} \
 -maxrate ${VIDEO_BITRATE} \
 -g ${VIDEO_GOP} \
 -map 0 -f flv -map 3 ${RTMP_URL3} \
 -c:v libx264 \
 -pix_fmt yuv420p \
 -profile:v main \
 -preset veryfast \
 -minrate ${VIDEO_BITRATE} \
 -maxrate ${VIDEO_BITRATE} \
 -g ${VIDEO_GOP} \
 -map 0 -f flv -map 4 ${RTMP_URL4}
 
I guess I have a performance issue. How can I add tee, or encode the video once and use it for all outputs?
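A minimal sketch of the tee approach, under a few assumptions not taken from the script above: VIDEO_BITRATE is treated as kbit/s (the original passes it without a unit), the audio is encoded to AAC, and -flags +global_header is added as the tee muxer documentation recommends for FLV/RTMP slaves. The variables are the ones defined in the script above.

# Sketch: encode the x11grab video once with libx264, fan it out with the
# tee muxer, and pair each RTMP output with one Pulse monitor (a:0..a:3).
ffmpeg -y \
 -hide_banner -loglevel error \
 -nostdin \
 -s ${SCREEN_RESOLUTION} \
 -r ${VIDEO_FRAMERATE} \
 -draw_mouse 0 \
 -f x11grab \
 -i ${DISPLAY} \
 -f pulse -i MySink1.monitor \
 -f pulse -i MySink2.monitor \
 -f pulse -i MySink3.monitor \
 -f pulse -i MySink4.monitor \
 -map 0:v -map 1:a -map 2:a -map 3:a -map 4:a \
 -c:v libx264 \
 -pix_fmt yuv420p \
 -profile:v main \
 -preset veryfast \
 -b:v ${VIDEO_BITRATE}k -maxrate ${VIDEO_BITRATE}k -bufsize $((VIDEO_BITRATE * 2))k \
 -g ${VIDEO_GOP} \
 -c:a aac -b:a ${AUDIO_BITRATE} -ar ${AUDIO_SAMPLERATE} -ac ${AUDIO_CHANNELS} \
 -flags +global_header \
 -f tee \
 "[f=flv:select=\'v:0,a:0\']${RTMP_URL1}|[f=flv:select=\'v:0,a:1\']${RTMP_URL2}|[f=flv:select=\'v:0,a:2\']${RTMP_URL3}|[f=flv:select=\'v:0,a:3\']${RTMP_URL4}"

The important change is that there is a single libx264 encode: the tee muxer duplicates the compressed stream to the four RTMP URLs, so the CPU cost no longer grows with the number of outputs, and each slave picks its own audio track via the select option.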