
Media (2)
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
-
GetID3 - Additional buttons
9 April 2013, by
Updated: April 2013
Language: French
Type: Image
Other articles (52)
-
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier(), so that visitors can edit their information on the authors page
-
Libraries and binaries specific to video and audio processing
31 January 2010, by
The following software and libraries are used by SPIPmotion in one way or another.
Required binaries
FFMpeg: the main encoder; it transcodes almost every type of video and audio file into formats playable on the Internet. See this tutorial for its installation;
Oggz-tools: tools for inspecting ogg files;
Mediainfo: retrieves information from most video and audio formats;
Complementary, optional binaries
flvtool2: (...)
-
Contribute to a better visual interface
13 April 2011
MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.
On other sites (9513)
-
FFMpeg Android Stagefright SIGSEGV error (h264 decode)
19 April 2013, by Sergey Ochkur
I need to decode an h264 file to YUV on Android 2.3+. As I understand it, I need to communicate with Stagefright, as it's the only way now that access via OpenMAX IL implementations has been closed. I have used FFmpeg 0.10 (and tried 0.9/0.9.1) for this, compiled with NDK7 (and also tried NDK6b, with the same result):
ffmpeg version 0.10 Copyright (c) 2000-2012 the FFmpeg developers
built on Jan 28 2012 14:42:37 with gcc 4.4.3
configuration: --target-os=linux --cross-prefix=arm-linux-androideabi- --arch=arm --cpu=armv7-a --sysroot=/home/grid/Android/Android_NDK/platforms/android-9/arch-arm --disable-avdevice --disable-decoder=h264 --disable-decoder=h264_vdpau --enable-libstagefright-h264 --prefix=build/stagefright/armeabi-v7a --extra-cflags='-Iandroid-source/frameworks/base/include -Iandroid-source/system/core/include -Iandroid-source/frameworks/base/media/libstagefright -Iandroid-source/frameworks/base/include/media/stagefright/openmax -I/home/grid/Android/Android_NDK/sources/cxx-stl/system/include -march=armv7-a -mfloat-abi=softfp -mfpu=neon' --extra-ldflags='-Wl,--fix-cortex-a8 -Landroid-libs -Wl,-rpath-link,android-libs' --extra-cxxflags='-Wno-multichar -fno-exceptions -fno-rtti'
libavutil 51. 34.101 / 51. 34.101
libavcodec 53. 60.100 / 53. 60.100
libavformat 53. 31.100 / 53. 31.100
libavfilter 2. 60.100 / 2. 60.100
libswscale 2. 1.100 / 2. 1.100
libswresample 0. 6.100 / 0. 6.100
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...
Hardware: Beagleboard-Xm + TI Android 2.3 (official)
So, entering the next command gives me an error with 480p:
ffmpeg -i /sdcard/Video/480p.mp4
Stopped (signal) ffmpeg -i /sdcard/Video/480p.mp4
Full Android "answer" from ADB Logcat:
http://pastebin.com/76JLgtXX
Android developers, does anybody know what this error means and how to deal with it?
I tried to make the DSP window bigger, but with no luck.
Commands like "stagefright /sdcard/Video/480p.mp4" work fine.
P.S. Additionally, I found that on some bigger files (720p) Android answers with:
[libstagefright_h264 @ 0xd479b0] Decode failed : 80000000
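For reference, a decode-to-YUV loop for such a build would follow the standard libavcodec pattern of that era. A minimal sketch in C, selecting the libstagefright_h264 decoder seen in the log and using avcodec_decode_video2() from the libavcodec 53 API (error handling trimmed; the file path comes from the question):

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(void)
{
    AVFormatContext *fmt = NULL;
    AVPacket pkt;
    int stream, got;

    av_register_all();
    if (avformat_open_input(&fmt, "/sdcard/Video/480p.mp4", NULL, NULL) < 0)
        return 1;
    avformat_find_stream_info(fmt, NULL);

    /* Pick the video stream and open the Stagefright-backed decoder. */
    stream = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    if (stream < 0)
        return 1;
    AVCodecContext *ctx = fmt->streams[stream]->codec;
    AVCodec *dec = avcodec_find_decoder_by_name("libstagefright_h264");
    if (!dec || avcodec_open2(ctx, dec, NULL) < 0)
        return 1;

    AVFrame *frame = avcodec_alloc_frame();  /* av_frame_alloc() in later releases */
    while (av_read_frame(fmt, &pkt) >= 0) {
        /* On success with got != 0, frame->data[0..2] hold the YUV planes. */
        if (pkt.stream_index == stream &&
            avcodec_decode_video2(ctx, frame, &got, &pkt) < 0)
            break;
        av_free_packet(&pkt);
    }
    avformat_close_input(&fmt);
    return 0;
}

If the same loop works with the stock software h264 decoder (a build without --disable-decoder=h264), that would point the SIGSEGV at the Stagefright glue rather than the calling code.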
-
How to use ffmpeg in JavaScript to decode H.264 frames into RGB frames
17 June 2020, by noel
I'm trying to compile ffmpeg into JavaScript so that I can decode H.264 video streams using node. The streams are H.264 frames packed into RTP NALUs, so any solution has to be able to accept H.264 frames rather than a whole file name. These frames can't be in a container like MP4 or AVI, because then the demuxer needs the timestamp of every frame before demuxing can occur, but I'm dealing with a real-time stream, no containers.



Streaming H.264 over RTP



Below is the basic code I'm using to listen on a UDP socket. Inside the 'message' callback the data packet is an RTP datagram. The data portion of the datagram is an H.264 frame (P-frames and I-frames).



var PORT = 33333;
var HOST = '127.0.0.1';

var dgram = require('dgram');
var server = dgram.createSocket('udp4');

server.on('listening', function () {
 var address = server.address();
 console.log('UDP Server listening on ' + address.address + ":" + address.port);
});

server.on('message', function (message, remote) {
 console.log(remote.address + ':' + remote.port + ' - ' + message);
 var frame = parse_rtp(message); // strip RTP framing, keep the H.264 payload

 var rgb_frame = some_library.decode_h264(frame); // This is what I need.

});

server.bind(PORT, HOST); 
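For reference, whatever parse_rtp() ends up doing, it must at least strip the fixed 12-byte RTP header defined by RFC 3550 before the H.264 payload can reach a decoder. A sketch of that fixed-header handling in C (the function name is made up; real code also has to deal with padding, header extensions, and FU-A/STAP-A NAL-unit packetization per RFC 6184):

#include <stdint.h>
#include <stddef.h>

/* Returns a pointer to the RTP payload inside pkt, or NULL if malformed.
 * Only the fixed header and the CSRC list are handled here. */
static const uint8_t *rtp_payload(const uint8_t *pkt, size_t len, size_t *payload_len)
{
    if (len < 12 || (pkt[0] >> 6) != 2)   /* RTP version must be 2 */
        return NULL;
    size_t csrc_count = pkt[0] & 0x0f;    /* up to 15 CSRC entries, 4 bytes each */
    size_t header = 12 + 4 * csrc_count;
    if (len < header)
        return NULL;
    *payload_len = len - header;
    return pkt + header;                  /* H.264 NAL unit or fragment */
}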




I found the Broadway.js library, but I couldn't get it working and it doesn't handle P-frames, which I need. I also found ffmpeg.js, but couldn't get that to work, and it needs a whole file, not a stream. Likewise, fluent-ffmpeg doesn't appear to support file streams; all of the examples show a filename being passed to the constructor. So I decided to write my own API.



My current solution attempt



I have been able to compile ffmpeg into one big js file, but I can't use it like that. I want to write an API around ffmpeg and then expose those functions to JS. So it seems to me like I need to do the following:



1. Compile the ffmpeg components (avcodec, avutil, etc.) into LLVM bitcode.
2. Write a C wrapper that exposes the decoding functionality and uses EMSCRIPTEN_KEEPALIVE (a sketch of such a wrapper follows this list).
3. Use emcc to compile the wrapper and link it to the bitcode created in step 1.
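A rough sketch of what the step-2 wrapper could look like, assuming the send/receive decoding API of FFmpeg 4.x; the exported names decoder_init and decode_frame are illustrative, not an existing API:

#include <emscripten.h>
#include <libavcodec/avcodec.h>

static AVCodecContext *ctx;
static AVFrame *frame;

/* Called once from JS before any frames are fed in. */
EMSCRIPTEN_KEEPALIVE
int decoder_init(void)
{
    const AVCodec *dec = avcodec_find_decoder(AV_CODEC_ID_H264);
    ctx = avcodec_alloc_context3(dec);
    frame = av_frame_alloc();
    return avcodec_open2(ctx, dec, NULL);
}

/* Feed one Annex-B H.264 access unit from the RTP depacketizer.
 * Returns 1 when a decoded picture is available, 0 if more input is
 * needed, negative on error. frame->data then holds YUV420P planes;
 * sws_scale() from libswscale would convert them to RGB. */
EMSCRIPTEN_KEEPALIVE
int decode_frame(uint8_t *data, int size)
{
    AVPacket *pkt = av_packet_alloc();
    pkt->data = data;
    pkt->size = size;
    int ret = avcodec_send_packet(ctx, pkt);
    av_packet_free(&pkt);
    if (ret < 0)
        return ret;
    return avcodec_receive_frame(ctx, frame) == 0 ? 1 : 0;
}

EMSCRIPTEN_KEEPALIVE keeps both symbols out of dead-code elimination, so node can reach them via Module.ccall()/Module.cwrap() after the emcc link.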









I found WASM+ffmpeg, but it's in Chinese and some of the steps aren't clear. In particular there is this step:



emcc web.c process.c ../lib/libavformat.bc ../lib/libavcodec.bc ../lib/libswscale.bc ../lib/libswresample.bc ../lib/libavutil.bc \




:( Where I think I'm stuck



I don't understand how all the ffmpeg components get compiled into separate *.bc files. I followed the emmake commands in that article and I end up with one big .bc file.



2 questions



1. Does anyone know the steps to compile ffmpeg using emscripten so that I can expose some API to JavaScript?

2. Is there a better way (with decent documentation/examples) to decode h264 video streams using node?

-
ffmpeg, live MPEG-TS demux & decode
8 May 2017, by NadavRub
Environment
- Ubuntu-14
- C++
- ffmpeg
Use-case
- Live SPTS is received via UDP by a 3rd party module
- TS Packets are received iteratively
- The TS video (ES) should be decoded with minimal latency
Considered Implementation
- Upon TS packet reception, immediately push it to the TS demuxer
- Once enough packets have been received, the video format becomes resolvable: create the video codec
- Push each video packet into the video decoder
- Once enough video packets have been processed, the video codec produces a valid output frame
Problem at-hand
Can this be done with ffmpeg?! Using "avformat_open_input" mandates a file to read from… I need a way to iteratively push packets to the TS demuxer (with minimal latency)…
Does ffmpeg support the above-mentioned use-case? How?
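For what it's worth, libavformat does support this via custom I/O: avio_alloc_context() lets a read callback feed the TS demuxer from memory instead of a file. A minimal sketch, assuming a hypothetical blocking fill_buffer() that hands over the next TS bytes received by the 3rd-party module:

#include <libavformat/avformat.h>

/* Hypothetical: blocks until TS bytes arrive, copies at most buf_size
 * of them into buf, and returns the number of bytes copied. */
extern int fill_buffer(void *opaque, uint8_t *buf, int buf_size);

int open_ts_demuxer(AVFormatContext **out)
{
    const int BUF_SIZE = 32 * 1024;
    uint8_t *buf = av_malloc(BUF_SIZE);

    /* Read-only custom I/O: the demuxer pulls TS data through fill_buffer. */
    AVIOContext *io = avio_alloc_context(buf, BUF_SIZE, 0, NULL,
                                         fill_buffer, NULL, NULL);

    AVFormatContext *fmt = avformat_alloc_context();
    fmt->pb = io;
    fmt->flags |= AVFMT_FLAG_CUSTOM_IO;

    /* NULL filename: the format is probed from bytes the callback returns. */
    if (avformat_open_input(&fmt, NULL, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(fmt, NULL) < 0)
        return -1;

    *out = fmt;  /* av_read_frame(fmt, &pkt) now yields demuxed ES packets */
    return 0;
}

Each UDP reception then only has to enqueue its payload wherever fill_buffer drains from, which matches the push-packets-iteratively requirement; latency is bounded by the probing avformat_open_input() performs up front.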