
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (59)
-
Supported formats
28 January 2010 — The following commands show which formats and codecs are supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Supported input video formats
This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 Part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
To begin with, we (...) -
Automated installation script of MediaSPIP
25 April 2011 — To overcome the difficulties, mainly due to the installation of server-side software dependencies, an "all-in-one" installation script written in Bash was created to facilitate this step on a server running a compatible Linux distribution.
To use it you must have SSH access to your server and a root account, which the script uses to install the dependencies. Contact your provider if you do not have these.
The documentation of the use of this installation script is available here.
The code of this (...) -
Creating farms of unique websites
13 April 2011 — MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; and creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (6107)
-
washed out colors when converting h264 into vp9 using ffmpeg [closed]
31 July 2024, by apes-together-strong — I'm trying to convert an H.264 video to a VP9/Opus WebM.
Every attempt so far has had washed-out colors.
Is there a fix for this issue, or is it just the way H.264-to-VP9 conversion is?


ffprobe of the source file:


Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test1.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: isommp42
 creation_time : 2024-07-30T17:03:10.000000Z
 com.android.version: 10
 Duration: 00:01:04.07, start: 0.000000, bitrate: 20198 kb/s
 Stream #0:0[0x1](eng): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, bt470bg/bt470bg/smpte170m, progressive), 1920x1080, 20001 kb/s, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 90k tbn (default)
 Metadata:
 creation_time : 2024-07-30T17:03:10.000000Z
 handler_name : VideoHandle
 vendor_id : [0][0][0][0]
 Side data:
 displaymatrix: rotation of -90.00 degrees
 Stream #0:1[0x2](eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 96 kb/s (default)
 Metadata:
 creation_time : 2024-07-30T17:03:10.000000Z
 handler_name : SoundHandle
 vendor_id : [0][0][0][0]
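The ffprobe output above shows the source is flagged as full-range (yuvj420p, pc) with bt470bg/smpte170m colour metadata. If those flags are lost during the conversion, players assume limited-range BT.709 and the picture looks washed out. One possible fix, sketched here, is to tag the VP9 output with the same colour properties explicitly (the CRF value is a placeholder, not from the question):

```shell
# Re-tag the output with the source's colour properties so players
# interpret the VP9 stream the same way they interpreted the H.264 one.
ffmpeg -i test1.mp4 \
       -c:v libvpx-vp9 -crf 30 -b:v 0 \
       -color_range pc \
       -color_primaries bt470bg \
       -color_trc smpte170m \
       -colorspace smpte170m \
       -c:a libopus \
       output.webm
```

Whether this is sufficient depends on the player; some ignore container-level tags, in which case converting the pixels to limited range with a scale filter (e.g. `-vf scale=in_range=pc:out_range=tv`) is an alternative.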



-
Yet another ffmpeg concat audio sync issue [closed]
15 March 2024, by Demiurg — I've read through dozens of posts and tried many suggestions; nothing seems to work for me. The funny part is that the video is fine in some players (e.g. QuickTime) but not in others (e.g. Chrome).


This is what I currently use:


ffmpeg -i segment.mp4 -q 0 -c copy segment.ts
ffmpeg -f concat -i videos.txt -c copy -y final.mp4



This is what ffmpeg shows for the originals:


Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '52M35S_1710280355.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: mp42isom
 creation_time : 2024-03-12T21:53:35.000000Z
 Duration: 00:00:59.86, start: 0.000000, bitrate: 851 kb/s
 Stream #0:0[0x1](und): Audio: opus (Opus / 0x7375704F), 48000 Hz, mono, fltp, 10 kb/s (default)
 Metadata:
 creation_time : 2024-03-12T21:53:35.000000Z
 vendor_id : [0][0][0][0]
 Stream #0:1[0x2](und): Video: hevc (Main) (hvc1 / 0x31637668), yuvj420p(pc, bt709), 1920x1080, 836 kb/s, 10.02 fps, 10 tbr, 90k tbn (default)
 Metadata:
 creation_time : 2024-03-12T21:53:35.000000Z
 vendor_id : [0][0][0][0]
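Since both streams are being stream-copied, a common suspect is timestamp discontinuities at the segment joins, which strict players such as Chrome tolerate worse than QuickTime. One hedged sketch (not a confirmed fix) is to keep copying the video but re-encode the audio, letting the aresample filter keep the audio timeline continuous across joins; the filenames in videos.txt are placeholders:

```shell
# videos.txt lists the segments for the concat demuxer, one per line:
#   file 'segment1.ts'
#   file 'segment2.ts'

# Copy the video stream but re-encode the audio, letting aresample
# insert or drop samples so the audio timestamps stay continuous.
ffmpeg -f concat -safe 0 -i videos.txt \
       -c:v copy \
       -c:a aac -af aresample=async=1 \
       -y final.mp4
```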



-
Last bytes of AVPacket
3 April 2018, by João Gueifão — I have been experimenting with the FFmpeg libav C libraries to open, read and demux a video file with both video and KLV (key-length-value) streams. The data stream is built according to the UAS Datalink Local Metadata Set as per the MISB ST 0601.11 standard.
At the moment I am able to play the video in a window and dump the KLV metadata to the console just fine. I came to realise that whenever I dump the content of an AVPacket to the console, the last 14 bytes are constant across different video files. I was provided a KLV decoder for that MISB standard, which works just fine, but ONLY WHEN I REMOVE those last 14 bytes from every AVPacket data array given by FFmpeg.
My question is: what are those 14 bytes in the first place? I could not find them in the video file itself: I inspected the raw binary stream of one of the files and could not find those bytes anywhere. That makes me hypothesise that FFmpeg is computing them itself.
Further details
I discovered the following:
- for the same video file, the value of those 14 bytes never changes;
- when switching to a different video file, only the first 2 bytes of those 14 final bytes change;
- when I dump the content of an AVPacket corresponding to a video frame, those 14 bytes are also very similar.
Here are two examples of the different 14-byte strings that I have got so far:
- FC00 0000 01CE 8C4D 9D10 8E25 E9FE
- BD00 0000 01CE 8C4D 9D10 8E25 E9FE
As you may see, they are all very similar.
Below is an example dump of an AVPacket::data array on the console. First we can see the 16-byte Universal Key for this UAS Datalink Local Data Set, followed by the rest of the packet, finishing with the mysterious 14-byte footer. I added newlines just for readability.
06 0E2B 3402 0B01 010E 0103 0101 0000 00
81 F102 0800 04CA 140D 4323 0B03 1545 5352 495F 4D65 7461 6461 7461 5F43 6F6C 6C65 6374 0406 4E39 3738 3236 0502 F86E 0602 119A 0702 ED0B 0A05 4332 3038 420B 000C 000D 043A 841D A40E 04B5 80F4 A10F 0231 C710 0201 8B11 0200 DE12 04CD 0444 4513 04F1 2666 6614 0400 0000 0015 0400 2037 BB16 0200 0017 043A 8562 8718 04B5 7C46 AC19 0223 811A 02FF BA1B 02FF 551C 0200 6F1D 02FF 801E 0200 451F 0200 A820 02FF 9421 0200 7E2F 0100 302A 0101 0102 0101 0304 2F2F 4341 0400 0500 0602 4341 1510 0000 0000 0000 0000 0000 0000 0000 0000 1602 0005 3801 003B 0846 6972 6562 6972 6441 0101 4808 0000 0000 0000 0000 0102 DA78
FC00 0000 01CE 8C4D 9D10 8E25 E9FE

I tried to follow the metadata.c example in the FFmpeg source examples, but it was unhelpful: it only shows how to read metadata from streams for which FFmpeg has an appropriate metadata codec. Again, in my case, the data stream is structured according to the UAS Datalink Local Metadata Set, and FFmpeg does not provide an appropriate codec.
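One way to test the hypothesis that the 14 bytes are appended on the demux path rather than stored in the file is to dump the raw data stream with the ffmpeg CLI and compare its tail with the AVPacket contents. A sketch, where the input name and stream index are placeholders to adjust for your file:

```shell
# Extract the KLV data stream without decoding; -map 0:d:0 selects the
# first data stream, and the "data" muxer writes the raw payload bytes.
ffmpeg -i input.ts -map 0:d:0 -c copy -f data klv.bin

# Inspect the last bytes of the raw dump. If the 14-byte footer seen in
# the AVPacket is absent here, libav is adding it during demuxing.
xxd klv.bin | tail -n 2
```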
Thank you for your help.