
Media (1)
-
Video of a bee in portrait orientation
14 May 2011, by
Updated: February 2012
Language: French
Type: Video
Other articles (108)
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash fallback is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
-
Authorizations overridden by plugins
27 April 2010, by
Mediaspip core
autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page
On other sites (12062)
-
how to write an image with yuv420 format data with PIL or something like that
16 April 2021, by nathan wu
I have a video with the yuv420p pixel format.
At first I tried to read each frame's bytes through a pipe using the rgb24 pixel format, and used PIL to make an image from them.
However, the frames read as rgb24 seem to lose a little bit of quality.



Here is the command for reading frames with the rgb24 pixel format:



ffmpeg -y -i input.mp4 -vcodec rawvideo -pix_fmt rgb24 -an -r 25 -f rawvideo pipe:1
 frame_data = self.process.stdout.read(1920*1080*3)
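
The question does not show how self.process is created; below is a minimal sketch of the kind of subprocess pipe this assumes (the input file name is a placeholder, not from the question):

import subprocess

# ffmpeg decodes input.mp4 and writes raw rgb24 frames to stdout
process = subprocess.Popen(
    ['ffmpeg', '-y', '-i', 'input.mp4', '-vcodec', 'rawvideo',
     '-pix_fmt', 'rgb24', '-an', '-r', '25', '-f', 'rawvideo', 'pipe:1'],
    stdout=subprocess.PIPE)

frame_data = process.stdout.read(1920*1080*3)  # exactly one rgb24 frame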




Then I tried to read it with the yuv420p pixel format:



ffmpeg -y -i input.mp4 -vcodec rawvideo -pix_fmt yuv420p -an -r 25 -f rawvideo pipe:1
 frame_data = self.process.stdout.read(1920*1080*3//2)  # integer division: yuv420p is 1.5 bytes per pixel




A single yuv420p frame contains half as many bytes as an rgb24 frame: 3110400 bytes for a 1920*1080 frame. I tossed these data into PIL:



Image.frombytes('YCbCr', (1920, 1080), frame_data)




but PIL raises a "not enough image data" error.
I looked up the modes that PIL supports for building images from bytes; none of them uses 12 bits per pixel.
I also tried converting the yuv data to rgb data, but that takes a lot more time, which matters when there is a long video to process.



Am I doing something wrong? Is there any way to write an image from raw yuv data without any conversion?
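
One way this could be done (a sketch only, not from the question): PIL's 'YCbCr' mode expects three full-resolution channels, so the subsampled U and V planes of a yuv420p frame must be upsampled before frombytes will accept the buffer. NumPy keeps that fast enough for long videos; the frame size matches the question, everything else (variable names, saving to PNG) is illustrative.

import numpy as np
from PIL import Image

width, height = 1920, 1080
y_size = width * height
uv_size = y_size // 4                     # each chroma plane is subsampled 2x2 in yuv420p

buf = np.frombuffer(frame_data, dtype=np.uint8)   # the 3110400 bytes read from the pipe
y = buf[:y_size].reshape(height, width)
u = buf[y_size:y_size + uv_size].reshape(height // 2, width // 2)
v = buf[y_size + uv_size:].reshape(height // 2, width // 2)

# nearest-neighbour upsampling of U and V back to full resolution
u = u.repeat(2, axis=0).repeat(2, axis=1)
v = v.repeat(2, axis=0).repeat(2, axis=1)

ycbcr = np.dstack((y, u, v))              # shape (height, width, 3), dtype uint8
img = Image.frombytes('YCbCr', (width, height), ycbcr.tobytes()).convert('RGB')
img.save('frame.png')

Note that PIL's YCbCr-to-RGB conversion assumes full-range (JPEG-style) values, while most video is limited range, so colours may come out slightly off; if exact colours matter, letting ffmpeg output rgb24 (or applying the BT.601/709 matrix yourself) is safer.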


-
how to make a video from timeline position data?
5 December 2013, by Brad
So I have timeline position data available to me as a JSON object, where an object is being moved around a screen. I want to be able to make a video using that information. How would you suggest going about doing that? (Ubuntu or OSX environment.)
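
One possible approach (a sketch only; the JSON field names t/x/y, the frame size, and the file names are assumptions): render one frame per timeline entry with Pillow and pipe the raw frames into ffmpeg, which encodes them to MP4. This works the same on Ubuntu and OS X as long as ffmpeg is installed.

import json
import subprocess
from PIL import Image, ImageDraw

width, height, fps = 640, 480, 25

with open('timeline.json') as f:
    positions = json.load(f)   # assumed shape: [{"t": 0.0, "x": 10, "y": 20}, ...]

# ffmpeg reads raw rgb24 frames from stdin and encodes them to out.mp4
encoder = subprocess.Popen(
    ['ffmpeg', '-y', '-f', 'rawvideo', '-pix_fmt', 'rgb24',
     '-s', f'{width}x{height}', '-r', str(fps), '-i', 'pipe:0',
     '-pix_fmt', 'yuv420p', 'out.mp4'],
    stdin=subprocess.PIPE)

for p in positions:
    frame = Image.new('RGB', (width, height), 'black')
    draw = ImageDraw.Draw(frame)
    # draw the moving object as a 10x10 square at its recorded position
    draw.rectangle([p['x'], p['y'], p['x'] + 10, p['y'] + 10], fill='white')
    encoder.stdin.write(frame.tobytes())

encoder.stdin.close()
encoder.wait()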
-
Trying to decode/convert raw AAC data from a recording that stopped abruptly [closed]
30 August 2023, by Gonzalo Leon
I made a very long recording with an application on my cellphone that used AAC, but its battery died after 13 hours (although it was plugged in). The file can't be played since the file structure is incorrect, but I know the information is there. The bitrate of the recorder is 100 kbps, the sample rate 16000 Hz, and the file format is QuickTime.


Is there a way to easily play or decode the data in the 'mdat' atom?


Please help, this recording is very important to me. Thanks.


I read the documentation on atoms on the QuickTime page and tried to fix the file structure, without success so far. I also extracted the raw data from the 'mdat' atom with Python and tried to convert it to WAV using ffmpeg, but it didn't work.
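
For reference, a sketch of how the 'mdat' payload can be located and dumped (this is an assumption about what the extraction step looks like, not the asker's script; the file names are placeholders):

import struct

with open('recording.m4a', 'rb') as src, open('mdat.bin', 'wb') as out:
    while True:
        header = src.read(8)
        if len(header) < 8:
            break
        size, atom_type = struct.unpack('>I4s', header)   # 32-bit size, 4-char type
        header_len = 8
        if size == 1:                      # 64-bit extended size follows the type field
            size = struct.unpack('>Q', src.read(8))[0]
            header_len = 16
        if atom_type == b'mdat':
            # size 0 means "to end of file", which is typical of a truncated recording
            out.write(src.read() if size == 0 else src.read(size - header_len))
            break
        if size == 0:
            break
        src.seek(size - header_len, 1)     # skip this atom's payload

The harder part is decoding: raw AAC stored in an MP4/QuickTime 'mdat' carries no ADTS headers or frame-length markers, so ffmpeg cannot decode the dump by itself. The usual recovery route is to rebuild the missing 'moov' metadata from a healthy reference recording made with the same app and settings; tools such as untrunc automate exactly this for truncated MP4/MOV files.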