
Media (1)
-
The Great Big Beautiful Tomorrow
28 October 2011, by
Updated: October 2011
Language: English
Type: Text
Other articles (59)
-
Definable image and logo sizes
9 February 2011, by
In many places on the site, logos and images are resized to fit the slots defined by the themes. Since all of these sizes can vary from one theme to another, they can be defined directly in the theme, which saves the user from having to configure them manually after changing the appearance of their site.
These image sizes are also available in the MediaSPIP Core specific configuration. The maximum size of the site logo in pixels, one can (...)
-
Authorizations overridden by plugins
27 April 2010, by
MediaSPIP core
autoriser_auteur_modifier() so that visitors are able to modify their own information on the authors page
On other sites (9313)
-
How to stream live video from DJI Professional 3 camera?
29 April 2017, by raullalves
I have to get the live video stream from the DJI Phantom 3 camera into my C++ application, in order to do Computer Vision processing in OpenCV.
First, I tried sending the raw H264 data through a UDP socket, inside this callback:
mReceivedVideoDataCallBack = new CameraReceivedVideoDataCallback() {
    @Override
    public void onResult(byte[] videoBuffer, int size) {
        // Here, I call a method from a class I created, that sends the buffer through UDP
        if (gravar_trigger) controleVideo.enviarFrame(videoBuffer, size);
        if (mCodecManager != null) mCodecManager.sendDataToDecoder(videoBuffer, size);
    }
};
That communication works well. However, I haven't been able to decode that UDP H264 data in my C++ desktop application. I have tested with the FFmpeg lib, but couldn't manage to allocate an AVPacket with my UDP data in order to decode it using avcodec_send_packet and avcodec_receive_frame. I also had problems with the AVCodecContext, since my UDP communication isn't a stream like RTSP, where it could get information about its source. Therefore, I had to change how I was trying to solve the problem.
Then I found libstreaming, which can be used to stream the Android video camera to a Wowza Server, creating something like an RTSP stream connection, where the data could easily be obtained in my final C++ application using OpenCV's VideoCapture. However, libstreaming uses its own SurfaceView. In other words, I would have to link the libstreaming SurfaceView with the DJI drone's videoSurface. I'm really new to Android, so I don't have any clue how to do that.
To sum up, is that the correct approach? Does someone have a better idea? Thanks in advance.
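For reference, here is a minimal sketch of the FFmpeg decoding step described above, assuming the raw H264 bytes come straight out of a UDP socket. The piece usually missing in this kind of attempt is an AVCodecParserContext: it turns an arbitrary byte stream into complete AVPackets that avcodec_send_packet will accept. The names decode_udp_h264, udp_buffer and udp_size below are illustrative placeholders, not part of the DJI SDK or of the original post.

// Sketch only: decode raw H264 bytes (e.g. received over UDP) with FFmpeg.
// Setup (error handling omitted):
//   const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
//   AVCodecContext *ctx = avcodec_alloc_context3(codec);
//   avcodec_open2(ctx, codec, nullptr);
//   AVCodecParserContext *parser = av_parser_init(AV_CODEC_ID_H264);
//   AVPacket *pkt = av_packet_alloc();
//   AVFrame *frame = av_frame_alloc();

#include <cstdint>
extern "C" {
#include <libavcodec/avcodec.h>
}

// Feed one received buffer's worth of raw H264 bytes to the decoder.
void decode_udp_h264(const uint8_t *udp_buffer, int udp_size,
                     AVCodecContext *ctx, AVCodecParserContext *parser,
                     AVPacket *pkt, AVFrame *frame) {
    const uint8_t *data = udp_buffer;
    int remaining = udp_size;
    while (remaining > 0) {
        // The parser cuts the byte stream into complete packets (NAL units).
        int used = av_parser_parse2(parser, ctx, &pkt->data, &pkt->size,
                                    data, remaining,
                                    AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
        data += used;
        remaining -= used;
        if (pkt->size == 0) continue;  // parser needs more input before emitting a packet

        if (avcodec_send_packet(ctx, pkt) < 0) return;
        while (avcodec_receive_frame(ctx, frame) == 0) {
            // frame now holds a decoded YUV picture; convert it (e.g. with
            // sws_scale) to BGR before handing it to OpenCV.
        }
    }
}

The parser is needed because UDP datagrams do not necessarily align with NAL unit boundaries, and the SPS/PPS it picks up from the stream also give the AVCodecContext the source information that an RTSP session would otherwise provide.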
-
Is it possible to play rtmpdump on a web player? [on hold]
2 May 2014, by David
I would like to know which is the best way to play rtmpdump links in a web player. I did some research and I can't find clear information about it.
Thanks
David