
Other articles (59)
-
The farm's recurring Cron tasks
1 December 2010 — Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance in the farm on a regular basis. Combined with a system Cron on the central site of the farm, this generates regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)
-
Publishing on MédiaSpip
13 June 2013 — Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.
-
List of compatible distributions
26 April 2011 — The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

Distribution name    Version name            Version number
Debian               Squeeze                 6.x.x
Debian               Wheezy                  7.x.x
Debian               Jessie                  8.x.x
Ubuntu               The Precise Pangolin    12.04 LTS
Ubuntu               The Trusty Tahr         14.04

If you want to help us improve this list, you can give us access to a machine whose distribution is not mentioned above, or send us the necessary fixes to add (...)
On other sites (8002)
-
Combine three videos between specific times using ffmpeg
9 December 2014, by Krisi — I have a situation where I want to combine 3 videos.
The first video (guide.mp4) is the guide, and the other two are small pieces that need to go inside the "guide".
Graphically, I have the following:
|       .---1.mp4---.             .------2.mp4----.    |
|-------+-----------+--guide.mp4--+---------------+----|
0s      4s          9s            18s             25s 28s

1.mp4 and 2.mp4 will cover the guide between these times.
The videos are the same size and all are .mp4 files. I am pretty sure this has been covered before, and I have looked a lot, but since I am not good at English myself, I don't know how I could look it up successfully...
Thanks in advance.
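One way to do this, sketched below, is ffmpeg's overlay filter: delay each inset with setpts so its frames start at the right point on the guide's timeline, then enable the overlay only during its window. This is an untested sketch against the times in the diagram; the output name output.mp4 is an assumption.

    # Sketch: overlay 1.mp4 over 4s-9s and 2.mp4 over 18s-25s of guide.mp4.
    # output.mp4 is a made-up output name; the guide's audio is kept if present.
    ffmpeg -i guide.mp4 -i 1.mp4 -i 2.mp4 -filter_complex \
      "[1:v]setpts=PTS+4/TB[v1]; \
       [2:v]setpts=PTS+18/TB[v2]; \
       [0:v][v1]overlay=enable='between(t,4,9)'[tmp]; \
       [tmp][v2]overlay=enable='between(t,18,25)'[out]" \
      -map "[out]" -map 0:a? -c:a copy output.mp4

Since the videos are the same size, the default overlay position 0:0 covers the guide completely during each window.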
-
How to stream live video from DJI Professional 3 camera?
29 April 2017, by raullalves — I have to get the live video stream from the DJI Phantom 3 camera into my C++ application, in order to do computer vision processing in OpenCV.
First, I tried sending the raw H264 data through a UDP socket, inside this callback:
mReceivedVideoDataCallBack = new CameraReceivedVideoDataCallback() {
    @Override
    public void onResult(byte[] videoBuffer, int size) {
        // Here, I call a method from a class I created that sends the buffer through UDP
        if (gravar_trigger) controleVideo.enviarFrame(videoBuffer, size);
        if (mCodecManager != null) mCodecManager.sendDataToDecoder(videoBuffer, size);
    }
};

The communication above works well. However, I haven't been able to decode that UDP H264 data in my C++ desktop application. I have tested with the FFmpeg libraries, but couldn't manage to allocate an AVPacket with my UDP data in order to decode it using avcodec_send_packet and avcodec_receive_frame. I also had problems with AVCodecContext, since my UDP communication wasn't a stream like RTSP, where the decoder could get information about its source. Therefore, I had to change how I was trying to solve the problem.
Then I found libstreaming, which can be used to stream the Android camera to a Wowza server, creating something like an RTSP stream connection, from which the data could be obtained easily in my final C++ application using OpenCV's videoCapture. However, libstreaming uses its own surfaceView. In other words, I would have to link the libstreaming surfaceView with the DJI drone's videoSurface. I'm really new to Android, so I don't have any clue how to do that.
To sum up, is that the correct approach? Does someone have a better idea? Thanks in advance.
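A minimal C++ sketch of the decoding path described above, assuming the UDP payload is a raw Annex-B H264 byte stream: an AVCodecParserContext reassembles complete packets from arbitrary buffer boundaries, and each packet then goes through avcodec_send_packet / avcodec_receive_frame. The H264Decoder class and the feed_udp_buffer entry point are hypothetical names; error handling is abbreviated.

    // Sketch only: decode raw Annex-B H264 chunks arriving from UDP.
    // feed_udp_buffer() is a hypothetical name, called once per datagram.
    extern "C" {
    #include <libavcodec/avcodec.h>
    }
    #include <cstdint>
    #include <stdexcept>

    struct H264Decoder {
        const AVCodec* codec = avcodec_find_decoder(AV_CODEC_ID_H264);
        AVCodecContext* ctx = avcodec_alloc_context3(codec);
        AVCodecParserContext* parser = av_parser_init(AV_CODEC_ID_H264);
        AVPacket* pkt = av_packet_alloc();
        AVFrame* frame = av_frame_alloc();

        H264Decoder() {
            if (!codec || !ctx || !parser || !pkt || !frame ||
                avcodec_open2(ctx, codec, nullptr) < 0)
                throw std::runtime_error("H264 decoder setup failed");
        }

        // Feed one UDP payload; the parser buffers partial NAL units
        // internally, so packets split across datagrams are handled.
        void feed_udp_buffer(const uint8_t* data, int size) {
            while (size > 0) {
                uint8_t* out = nullptr;
                int out_size = 0;
                int used = av_parser_parse2(parser, ctx, &out, &out_size,
                                            data, size,
                                            AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
                data += used;
                size -= used;
                if (out_size > 0) {
                    pkt->data = out;      // parser output is one complete packet
                    pkt->size = out_size;
                    if (avcodec_send_packet(ctx, pkt) == 0) {
                        // Drain every frame produced by this packet.
                        while (avcodec_receive_frame(ctx, frame) == 0) {
                            // frame->data / frame->linesize now hold YUV planes;
                            // convert with sws_scale and wrap in a cv::Mat here.
                        }
                    }
                }
            }
        }
    };

Each decoded AVFrame holds YUV planes that can be converted with sws_scale and wrapped in a cv::Mat for the OpenCV processing mentioned above, which would avoid the Wowza/libstreaming detour entirely.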