Media (1)
-
SWFUpload Process
6 September 2011
Updated: September 2011
Language: French
Type: Text
Other articles (70)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and audio both to conventional computers (...) -
Libraries and binaries specific to video and audio processing
31 January 2010
The following software and libraries are used by SPIPmotion in one way or another.
Required binaries: FFMpeg: the main encoder; transcodes almost all video and audio file types into formats playable on the Internet. See this tutorial for its installation; Oggz-tools: tools for inspecting ogg files; Mediainfo: retrieves information from most video and audio formats;
Complementary, optional binaries: flvtool2: (...)
On other sites (4469)
-
Trying to grab video stream from a 802W device
1 June 2015, by brentil
A group of us in the RC hobby forums started trying to use a device called the 802W: it takes RCA video in and rebroadcasts it over its own WiFi network, which you connect to from an Android or iOS device. These devices are typically used as add-on backup-camera systems for vehicles. We want to use it for FPV (First Person Video/View) with smartphones instead of buying more expensive FPV goggles.
802W device example (plenty of clones online)
http://www.amazon.com/Wireless-Backup-Camera-Transmitter-Android/dp/B00LJPTJSY
The problem is that you can only connect to it with their WIFI_AVIN or WIFI_AVIN2 apps from the app stores, because they don't publish any information about how to grab the stream data. We want to write our own apps that use the stream to present the information better. We've tried using VLC to grab the stream from an Android phone and from a Windows PC, with no success so far. I was hoping someone could look at the Wireshark outputs and understand them better than I do. I think it's a UDP multicast broadcast, but I just don't know enough to be sure. We've tried pointing VLC at network streams on the device directly and at udp://@ style addresses, but part of the issue may also be that we're missing the file path of the stream.
Attempting to reverse engineer their code for learning purposes showed that ffmpeg is embedded in a compiled .so library, which also seems to be where the actual connection code lives; we were unable to dig into it.
In the images 192.168.72.33 is my phone and 192.168.72.173 is the 802W device.
Image of what I believe is a UDP broadcast of the video information.
This is what the stream turns into when the device connects using the WIFI_AVIN application.
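Since we haven't cracked the real protocol yet, here is a minimal sketch of the kind of raw UDP listener we've been experimenting with. The loopback self-test stands in for the 802W, and both the fake payload and the port are placeholders; on the real device you would bind to the destination port seen in the Wireshark capture.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpProbe {
    // Receive one datagram on the given socket and return its payload.
    static byte[] receiveOne(DatagramSocket socket) throws Exception {
        byte[] buf = new byte[65535];
        DatagramPacket packet = new DatagramPacket(buf, buf.length);
        socket.receive(packet); // blocks until a packet arrives
        byte[] payload = new byte[packet.getLength()];
        System.arraycopy(buf, 0, payload, 0, packet.getLength());
        return payload;
    }

    public static void main(String[] args) throws Exception {
        // Bind an ephemeral port; on the real device, bind the port from Wireshark.
        try (DatagramSocket socket = new DatagramSocket(0)) {
            // Loopback self-test: stand in for the 802W by sending ourselves
            // a payload that begins with an H.264 Annex B start code.
            byte[] fake = {0, 0, 0, 1, 0x67};
            socket.send(new DatagramPacket(fake, fake.length,
                    InetAddress.getLoopbackAddress(), socket.getLocalPort()));
            byte[] got = receiveOne(socket);
            System.out.println("received " + got.length + " bytes");
        }
    }
}
```

Once packets arrive from the actual device, dumping the first few bytes of each payload should show whether they carry raw H.264 NAL units or some proprietary framing.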
-
Anomaly #3418: Plugin tables do not get installed
9 May 2015, by Maïeul Rouquette
My local config
PHP Version 5.2.17
System Darwin FTSR22998 12.6.0 Darwin Kernel Version 12.6.0 : Wed Mar 18 16:23:48 PDT 2015 ; root:xnu-2050.48.19 1/RELEASE_X86_64 x86_64
Build Date Sep 18 2013 13:58:06
Configure Command './configure' '--with-mysql=/Applications/MAMP/Library' '--with-apxs2=/Applications/MAMP/Library/bin/apxs' '--with-gd' '--with-jpeg-dir=/Applications/MAMP/Library' '--with-png-dir=/Applications/MAMP/Library' '--with-zlib' '--with-freetype-dir=/Applications/MAMP/Library' '--prefix=/Applications/MAMP/bin/php/php5.2.17' '--exec-prefix=/Applications/MAMP/bin/php/php5.2.17' '--sysconfdir=/Applications/MAMP/bin/php/php5.2.17/conf' '--with-soap' '--with-config-file-path=/Applications/MAMP/bin/php/php5.2.17/conf' '--enable-track-vars' '--enable-bcmath' '--enable-ftp' '--enable-gd-native-ttf' '--with-bz2=/usr' '--with-ldap' '--with-mysqli=/Applications/MAMP/Library/bin/mysql_config' '--with-sqlite' '--with-ttf' '--with-t1lib=/Applications/MAMP/Library' '--enable-mbstring=all' '--with-curl=/Applications/MAMP/Library' '--enable-dbx' '--enable-sockets' '--enable-bcmath' '--with-imap=shared,/Applications/MAMP/Library/lib/imap-2007f' '--enable-soap' '--with-kerberos' '--enable-calendar' '--with-pgsql=shared,/Applications/MAMP/Library/pg' '--enable-dbase' '--enable-exif' '--with-libxml-dir=/Applications/MAMP/Library' '--with-gettext=shared,/Applications/MAMP/Library' '--with-xsl=/Applications/MAMP/Library' '--with-pdo-mysql=shared,/Applications/MAMP/Library' '--with-pdo-pgsql=shared,/Applications/MAMP/Library/pg' '--with-mcrypt=shared,/Applications/MAMP/Library' '--with-openssl' '--enable-zip' '--with-iconv=/Applications/MAMP/Library'
Server API Apache 2.0 Handler
Virtual Directory Support disabled
Configuration File (php.ini) Path /Applications/MAMP/bin/php/php5.2.17/conf
Loaded Configuration File /Applications/MAMP/bin/php/php5.2.17/conf/php.ini
Scan this dir for additional .ini files (none)
additional .ini files parsed (none)
PHP API 20041225
PHP Extension 20060613
Zend Extension 220060519
Debug Build no
Thread Safety disabled
Zend Memory Manager enabled
IPv6 Support enabled
Registered PHP Streams https, ftps, compress.zlib, compress.bzip2, php, file, data, http, ftp, zip
Registered Stream Socket Transports tcp, udp, unix, udg, ssl, sslv3, sslv2, tls
Registered Stream Filters zlib.*, bzip2.*, convert.iconv.*, string.rot13, string.toupper, string.tolower, string.strip_tags, convert.*, consumed
My test
1) SPIP 3.1 svn 22109
2) no plugins/auto, but the update feed declared
3) the tickets plugin placed manually in plugins
4) I activate the plugin. It "preactivates" itself while waiting for its dependencies
5) once I install saisies, the tickets plugin can indeed be activated
6) but its tables were not installed -
How to Read DJI FPV Feed as OpenCV Object ?
24 May 2019, by Walter Morawa
I've spent a lot of time looking for a way to read DJI's FPV feed as an OpenCV Mat object. I am probably overlooking something simple, since I am not too familiar with image encoding/decoding.
I apologize if I am missing something basic, but I know I'm not the first person to have trouble getting DJI's FPV feed, and an answer to this question, especially if option 1 is possible, would be valuable to many developers. Future developers who come across it will likely run into many of the same issues I did.
I'm willing to use FFMPEG or JavaCV if necessary, but that's quite a hurdle for most Android developers, since it means dealing with C++, the NDK, testing from the terminal, and so on. That seems like overkill.
I believe the issue lies in the fact that we need to decode both the 6-byte array (the info array) and the byte array with the current frame data simultaneously. Thanks in advance for your time.
Basically, DJI’s FPV feed comes in a number of formats.
- Raw H.264 (MPEG-4 AVC) in VideoFeeder.VideoDataListener
// The callback for receiving the raw H264 video data for camera live view
mReceivedVideoDataListener = new VideoFeeder.VideoDataListener() {
    @Override
    public void onReceive(byte[] videoBuffer, int size) {
        //Log.d("BytesReceived", Integer.toString(videoStreamFrameNumber));
        if (videoStreamFrameNumber++ % 30 == 0) {
            // convert video buffer to an OpenCV array
            OpenCvAndModelAsync openCvAndModelAsync = new OpenCvAndModelAsync();
            openCvAndModelAsync.execute(videoBuffer);
        }
        if (mCodecManager != null) {
            mCodecManager.sendDataToDecoder(videoBuffer, size);
        }
    }
};
- DJI also has its own Android decoder sample that uses FFMPEG to convert to YUV format.
@Override
public void onYuvDataReceived(final ByteBuffer yuvFrame, int dataSize, final int width, final int height) {
    // In this demo, we test the YUV data by saving it into JPG files.
    //DJILog.d(TAG, "onYuvDataReceived " + dataSize);
    if (count++ % 30 == 0 && yuvFrame != null) {
        final byte[] bytes = new byte[dataSize];
        yuvFrame.get(bytes);
        AsyncTask.execute(new Runnable() {
            @Override
            public void run() {
                if (bytes.length >= width * height) {
                    Log.d("MatWidth", "Made it");
                    YuvImage yuvImage = saveYuvDataToJPEG(bytes, width, height);
                    Bitmap rgbYuvConvert = convertYuvImageToRgb(yuvImage, width, height);
                    Mat yuvMat = new Mat(height, width, CvType.CV_8UC1);
                    yuvMat.put(0, 0, bytes);
                    //OpenCv Stuff
                }
            }
        });
    }
}
- DJI also appears to have a "getRgbaData" function, but there is not a single example online or from DJI. Google "DJI getRgbaData" and you'll find only the API reference, which explains the self-explanatory parameters and return values but nothing else. I couldn't figure out where to call it, and there doesn't appear to be a callback function as there is with YUV. You can't call it on the H.264 byte array directly, but perhaps you can get it from the YUV data.
Option 1 is much preferable to option 2, since the YUV format has quality issues. Option 3 would likely also involve a decoder.
Here’s a screenshot that DJI’s own YUV conversion produces.
I've looked at a bunch of suggestions for improving the YUV output, removing the green and yellow tints and so on, but at this point, if DJI can't do it right, I don't want to invest resources there.
Regarding option 1, I know FFMPEG and JavaCV seem like good options if I have to go the video-decoding route. However, both seem quite time consuming. This JavaCV H264 conversion seems unnecessarily complex. I found it from this related question.
Moreover, from what I understand, OpenCV can't read or write video files without FFMPEG, but I'm not trying to read a video file; I am trying to read an H.264 byte[] array. The following code seems to get positive results.
/* Async OpenCV Code */
private class OpenCvAndModelAsync extends AsyncTask<byte[], Void, double[]> {
    @Override
    protected double[] doInBackground(byte[]... params) { // background thread: don't touch any UI components
        // get fpv feed and convert bytes to a Mat
        Mat videoBufMat = new Mat(4, params[0].length, CvType.CV_8UC4);
        videoBufMat.put(0, 0, params[0]);
        // if I add this in, it says the bytes are empty:
        //Mat videoBufMat = Imgcodecs.imdecode(encodeVideoBuf, Imgcodecs.IMREAD_ANYCOLOR);
        //encodeVideoBuf.release();
        Log.d("MatRgba", videoBufMat.toString());
        for (int i = 0; i < videoBufMat.rows(); i++) {
            for (int j = 0; j < videoBufMat.cols(); j++) {
                double[] rgb = videoBufMat.get(i, j);
                Log.i("Matrix", "red: " + rgb[0] + " green: " + rgb[1] + " blue: " + rgb[2] + " alpha: "
                        + rgb[3] + " Length: " + rgb.length + " Rows: "
                        + videoBufMat.rows() + " Columns: " + videoBufMat.cols());
            }
        }
        double[] center = openCVThingy(videoBufMat);
        return center;
    }

    @Override
    protected void onPostExecute(double[] center) {
        // handle UI or another async task if necessary
    }
}
Rows = 4, columns > 30k. I get lots of RGB values that seem plausible, such as red=113, green=75, blue=90, alpha=220 (a made-up example); however, I also get a ton of 0,0,0,0 values. That might be somewhat okay, since black is 0,0,0 (although I would have expected a higher alpha), and I do have a black object in my image.
However, when I try to compute the contours from this image, I almost always find that the moments (center x, y) are exactly in the center of the image. This error has nothing to do with my color filter or contour algorithm: I wrote a script in Python and verified that my Android implementation matches it by reading a still image and getting exactly the same number of contours, positions, etc. in both.
I noticed it has something to do with the videoBuffer byte size (bonus points if you can explain why every other length is 6!)
2019-05-23 21:14:29.601 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 2425
2019-05-23 21:14:29.802 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 2659
2019-05-23 21:14:30.004 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 6
2019-05-23 21:14:30.263 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 6015
2019-05-23 21:14:30.507 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 6
2019-05-23 21:14:30.766 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 4682
2019-05-23 21:14:31.005 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 6
2019-05-23 21:14:31.234 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 2840
2019-05-23 21:14:31.433 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 4482
2019-05-23 21:14:31.664 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 6
2019-05-23 21:14:31.927 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 4768
2019-05-23 21:14:32.174 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 6
2019-05-23 21:14:32.433 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 4700
2019-05-23 21:14:32.668 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 6
2019-05-23 21:14:32.864 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 4740
2019-05-23 21:14:33.102 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 6
2019-05-23 21:14:33.365 21431-22086/com.dji.simulatorDemo D/VideoBufferSize: 4640
My questions:
I. Is this the correct way to read an H.264 byte[] as a Mat?
Assuming the format is RGBA, that means rows = 4 and columns = byte[].length, with CvType.CV_8UC4. Do I have the height and width correct? Something tells me the YUV height and width are off. I was getting some meaningful results there, but the contours were exactly in the center, just like with the H.264.
II. Does OpenCV handle MP4 in Android like this? If not, do I need to use FFMPEG or JavaCV?
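On the height/width doubt above: one sanity check is to compare the byte count against the expected size of a YUV 4:2:0 frame. This sketch assumes the NV21 layout standard on Android cameras, which I'm assuming (not certain) DJI's YUV callback also uses: width*height luma bytes plus width*height/2 interleaved chroma bytes.

```java
public class YuvSize {
    // Expected byte count for an NV21 (YUV 4:2:0 semi-planar) frame:
    // width*height Y bytes + width*height/2 interleaved VU bytes.
    static int expectedNv21Bytes(int width, int height) {
        return width * height * 3 / 2;
    }

    // True if the buffer is at least one full frame for these dimensions.
    static boolean looksLikeFullFrame(byte[] yuvBytes, int width, int height) {
        return yuvBytes.length >= expectedNv21Bytes(width, height);
    }
}
```

If dataSize from onYuvDataReceived consistently misses this expected value, the width/height passed in (or my assumption about the pixel format) is wrong.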
III. Does the int size have something to do with it? Why is the int size occasionally 6, and other times 2400 to 6000? I've heard about the difference between this frame's information and information about the next frame, but I'm simply not knowledgeable enough to know how to apply that here.
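As a concrete illustration of the 6-versus-thousands pattern in the log above: one experiment is to route the recurring 6-byte buffers separately from the large compressed-frame buffers before any decoding. The idea that the 6-byte buffers are per-frame metadata is my guess, not documented DJI behavior.

```java
import java.util.ArrayList;
import java.util.List;

public class FeedSplitter {
    // Guess: the listener interleaves 6-byte "info" buffers with
    // multi-kilobyte compressed-frame buffers (see the log above).
    static final int INFO_LENGTH = 6;

    static boolean isInfoPacket(byte[] buffer) {
        return buffer.length == INFO_LENGTH;
    }

    // Keep only the buffers that look like actual frame data.
    static List<byte[]> framesOnly(List<byte[]> received) {
        List<byte[]> frames = new ArrayList<>();
        for (byte[] b : received) {
            if (!isInfoPacket(b)) {
                frames.add(b);
            }
        }
        return frames;
    }
}
```

With the feed split this way, the frame buffers could be fed to the decoder on their own, and the 6-byte buffers inspected separately to see what they actually encode.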
I'm starting to think this is where the issue lies. Since I need to get the 6-byte array for info about the next frame, perhaps my modulo 30 is incorrect. Should I pass the 29th or 31st frame as a format byte for each frame? How is that done in OpenCV, or are we doomed to use the complicated FFMPEG route?
IV. Can I fix this using Imgcodecs? I was hoping OpenCV would natively handle whether a buffer was color data from this frame or info about the next frame. I added the code below, but I am getting an empty array:
Mat videoBufMat = Imgcodecs.imdecode(new MatOfByte(params[0]), Imgcodecs.IMREAD_UNCHANGED);
This also is empty :
Mat encodeVideoBuf = new Mat(4, params[0].length, CvType.CV_8UC4);
encodeVideoBuf.put(0,0, params[0]);
Mat videoBufMat = Imgcodecs.imdecode(encodeVideoBuf, Imgcodecs.IMREAD_UNCHANGED);
V. Should I try converting the bytes into an Android JPEG and then importing it? Why does DJI's YUV decoder look so complicated? It makes me cautious about trying FFMPEG or JavaCV, and inclined to stick with the Android decoder or an OpenCV decoder.
VI. At what stage should I resize the frames to speed up calculations ?