
Other articles (42)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out. -
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first MediaSPIP stable release.
Its official release date is June 21, 2013, and it is announced here.
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...) -
Libraries and binaries specific to video and audio processing
31 January 2010
The following software and libraries are used by SPIPmotion in one way or another.
Required binaries: FFMpeg: the main encoder, used to transcode almost all video and audio file types into formats readable on the Internet. See this tutorial for its installation; Oggz-tools: inspection tools for ogg files; MediaInfo: retrieves information from most video and audio formats;
Complementary and optional binaries: flvtool2: (...)
On other sites (7664)
-
lavc: Implement Dolby Vision RPU parsing
3 January 2022, by Niklas Haas

lavc: Implement Dolby Vision RPU parsing

Based on a mixture of guesswork, partial documentation in patents, and reverse engineering of real-world samples. Confirmed working for all the samples I've thrown at it.

Contains some annoying machinery to persist these values in between frames, which is needed in theory even though I've never actually seen a sample that relies on it in practice. May or may not work.

Since the distinction matters greatly for parsing the color matrix values, this includes a small helper function to guess the right profile from the RPU itself in case the user has forgotten to forward the dovi configuration record to the decoder. (Which in practice, only ffmpeg.c and ffplay do.)

Notable omissions / deviations:

CRC32 verification. This is based on the MPEG2 CRC32 type, which is similar to IEEE CRC32 but apparently different in subtle enough ways that I could not get it to pass verification no matter what parameters I fed to av_crc. It's possible the code needs some changes.

Linear interpolation support. Nothing documents this (beyond its existence) and no samples use it, so it is impossible to implement.

All of the extension metadata blocks, but these contain values that seem largely congruent with ST2094, HDR10, or other existing forms of side data, so I will defer parsing/attaching them to a future commit.

The patent describes a mechanism for predicting coefficients from previous RPUs, but the bit for the flag whether to use the prediction deltas or signal entirely new coefficients does not seem to be present in actual RPUs, so we ignore this subsystem entirely.

In the patent's spec, the NLQ subsystem also loops over num_nlq_pivots, but even in the patent the number is hard-coded to one iteration rather than signalled. So we only store one set of coefs.

Heavily influenced by https://github.com/quietvoid/dovi_tool
Documentation drawn from US Patent 10,701,399 B2 and ETSI GS CCM 001

Signed-off-by: Niklas Haas <git@haasn.dev>
Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@outlook.com>
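For reference on the CRC32 point above: CRC-32/MPEG-2 uses the same 0x04C11DB7 polynomial as the common IEEE/zlib CRC-32 but with no input/output bit reflection and no final XOR, which is exactly the kind of subtle parameter difference described in the commit. A minimal bitwise sketch of that reference algorithm (not FFmpeg code, shown in JavaScript to match the rest of this page):

// Reference CRC-32/MPEG-2: poly 0x04C11DB7, init 0xFFFFFFFF,
// no bit reflection, no final XOR (the IEEE/zlib variant reflects
// input/output bits and XORs the result with 0xFFFFFFFF).
function crc32Mpeg2(bytes) {
 var crc = 0xffffffff;
 for (var i = 0; i < bytes.length; i++) {
   crc ^= bytes[i] << 24;
   for (var bit = 0; bit < 8; bit++) {
     crc = (crc & 0x80000000) ? ((crc << 1) ^ 0x04c11db7) : (crc << 1);
     crc >>>= 0; // keep the running value as an unsigned 32-bit integer
   }
 }
 return crc >>> 0;
}
-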
How to use ffmpeg in JavaScript to decode H.264 frames into RGB frames
17 June 2020, by noel
I'm trying to compile ffmpeg into JavaScript so that I can decode H.264 video streams using node. The streams are H.264 frames packed into RTP NALUs, so any solution has to be able to accept H.264 frames rather than a whole file name. These frames can't be in a container like MP4 or AVI because then the demuxer needs the timestamp of every frame before demuxing can occur, but I'm dealing with a real-time stream, no containers.



Streaming H.264 over RTP



Below is the basic code I'm using to listen on a UDP socket. Inside the 'message' callback the data packet is an RTP datagram. The data portion of the datagram is an H.264 frame (P-frames and I-frames).



var PORT = 33333;
var HOST = '127.0.0.1';

var dgram = require('dgram');
var server = dgram.createSocket('udp4');

server.on('listening', function () {
 var address = server.address();
 console.log('UDP Server listening on ' + address.address + ":" + address.port);
});

server.on('message', function (message, remote) {
 console.log(remote.address + ':' + remote.port +' - ' + message);
 var frame = parse_rtp(message); // strip the RTP header to get the raw H.264 NALU

 var rgb_frame = some_library.decode_h264(frame); // This is what I need.

});

server.bind(PORT, HOST); 
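As a side note, here is a minimal sketch of what the undefined parse_rtp helper used above could look like, assuming plain RFC 3550 RTP over UDP and a single NAL unit per packet; fragmented NALUs (FU-A) and padding would still need extra handling, which this does not do:

// Hypothetical helper: strips the fixed 12-byte RTP header, any CSRC entries
// and any header extension, and returns the payload (here: one H.264 NALU).
function parse_rtp(packet) {
 var firstByte = packet[0];
 var csrcCount = firstByte & 0x0f;            // CC field: number of CSRC identifiers
 var hasExtension = (firstByte & 0x10) !== 0; // X bit
 var offset = 12 + csrcCount * 4;             // fixed header + CSRC list

 if (hasExtension) {
   // extension header: 2 bytes profile id + 2 bytes length (in 32-bit words)
   var extWords = packet.readUInt16BE(offset + 2);
   offset += 4 + extWords * 4;
 }

 return packet.slice(offset);                 // raw RTP payload
}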




I found the Broadway.js library, but I couldn't get it working and it doesn't handle P-frames, which I need. I also found ffmpeg.js, but couldn't get that to work and it needs a whole file, not a stream. Likewise, fluent-ffmpeg doesn't appear to support file streams; all of the examples show a filename being passed to the constructor. So I decided to write my own API.



My current solution attempt



I have been able to compile ffmpeg into one big js file, but I can't use it like that. I want to write an API around ffmpeg and then expose those functions to JS. So it seems to me like I need to do the following:



1. Compile the ffmpeg components (avcodec, avutil, etc.) into LLVM bitcode.
2. Write a C wrapper that exposes the decoding functionality and uses EMSCRIPTEN_KEEPALIVE.
3. Use emcc to compile the wrapper and link it to the bitcode created in step 1.
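
If the build from step 3 works, the exported wrapper could then be called from node through Emscripten's runtime. Below is a rough JS-side sketch: the module filename, the decode_frame name, and its (input buffer, input size, output buffer) signature are all assumptions about what the C wrapper from step 2 might expose, not an existing API. Module.cwrap, Module._malloc/_free and Module.HEAPU8 are standard Emscripten runtime helpers, provided they are kept/exported in the emcc build.

// Sketch only: assumes the emcc output ("ffmpeg_wrapper.js", hypothetical) was built
// for node, that the runtime has already initialised, and that the C wrapper exports
//   int decode_frame(uint8_t *in, int in_size, uint8_t *out);
// returning the number of RGB bytes written.
var Module = require('./ffmpeg_wrapper.js');

var decodeFrame = Module.cwrap('decode_frame', 'number', ['number', 'number', 'number']);

function decode_h264(frame) {             // frame: Buffer holding one H.264 NALU
 var inPtr = Module._malloc(frame.length);
 Module.HEAPU8.set(frame, inPtr);         // copy the NALU into the wasm heap

 var outSize = 1920 * 1080 * 3;           // assumed worst-case RGB24 size for this stream
 var outPtr = Module._malloc(outSize);

 var written = decodeFrame(inPtr, frame.length, outPtr);
 var rgb = written > 0
   ? Buffer.from(Module.HEAPU8.subarray(outPtr, outPtr + written))
   : null;                                // no frame yet (e.g. decoder waiting for an I-frame)

 Module._free(inPtr);
 Module._free(outPtr);
 return rgb;
}

A module along these lines would slot into the rgb_frame = some_library.decode_h264(frame) line in the UDP listener above.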









I found WASM+ffmpeg, but it's in Chinese and some of the steps aren't clear. In particular there is this step:



emcc web.c process.c ../lib/libavformat.bc ../lib/libavcodec.bc ../lib/libswscale.bc ../lib/libswresample.bc ../lib/libavutil.bc \




Where I think I'm stuck :(



I don't understand how all the ffmpeg components get compiled into separate *.bc files. I followed the emmake commands in that article and I end up with one big .bc file.



2 questions



1. Does anyone know the steps to compile ffmpeg using emscripten so that I can expose some API to JavaScript?

2. Is there a better way (with decent documentation/examples) to decode h264 video streams using node?

-
Anomalie #4438: Missing message string :message:lien_reponse_message:
22 March 2020
This puzzles me...
The language strings are in 'forum', here: https://git.spip.net/spip/forum/src/branch/master/lang/forum_fr.php#L129
So calling `_T('message:lien_reponse_message')` will return nothing, whatever the SPIP version.
This string (forum:lien_reponse_message) is used when the message has an `id_parent`. The question rather seems to be:
- either `#OBJET`, whose value is 'message', is wrong (it was supposed to be something else, such as the parent's object, but a bug filled in 'message')?
- or we had simply never hit this case before?