
Media (1)
-
SPIP - plugins - embed code - Example
2 September 2013, by
Updated: September 2013
Language: French
Type: Image
Other articles (58)
-
Updating from version 0.1 to 0.2
24 June 2013, by
Explanation of the notable changes made when moving from MediaSPIP version 0.1 to version 0.3. What is new?
Software dependencies: use of the latest FFMpeg versions (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
-
Customising a site by adding a logo, a banner or a background image
5 September 2013, by
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
On other sites (7703)
-
Revision 6ce718eb18: Merge "End of orientation zero group experiment" into experimental
22 April 2013, by Deb Mukherjee
Changed paths:
Modify /vp9/decoder/vp9_decodframe.c
Modify /vp9/encoder/vp9_encodemb.c
Modify /vp9/encoder/vp9_rdopt.c
Merge "End of orientation zero group experiment" into experimental
-
How to dump ALL metadata from a media file, including cover image title? [closed]
9 April, by Unideal
I have an MP3 song:


# ffprobe -hide_banner -i filename.mp3
Input #0, mp3, from 'filename.mp3':
 Metadata:
 composer : Music Author
 title : Song Name
 artist : Singer
 encoder : Lavf61.7.100
 genre : Rock
 date : 2025
 Duration: 00:03:14.04, start: 0.023021, bitrate: 208 kb/s
 Stream #0:0: Audio: mp3 (mp3float), 48000 Hz, stereo, fltp, 192 kb/s
 Metadata:
 encoder : Lavc61.19
 Stream #0:1: Video: png, rgb24(pc, gbr/unknown/unknown), 600x600 [SAR 1:1 DAR 1:1], 90k tbr, 90k tbn (attached pic)
 Metadata:
 title : Cover
 comment : Cover (front)



The task is to save its metadata to a text file and restore from that file later. Both goals should be accomplished with ffmpeg.
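For the restore half, my plan is to feed the dump back in later with the ffmetadata demuxer, roughly like this (restored.mp3 is just a placeholder output name):


# ffmpeg -i filename.mp3 -i metadata.txt -map_metadata 1 -c copy restored.mp3
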


For the dump half, the simplest method is to run:


# ffmpeg -i filename.mp3 -f ffmetadata metadata.txt



After that, metadata.txt contains:

;FFMETADATA1
composer=Music Author
title=Song Name
artist=Singer
date=2025
genre=Rock
encoder=Lavf61.7.100



I got the global metadata only; the stream-specific info (in my case the cover image title and comment) is missing.
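For reference, the kind of dump I was hoping for would presumably look something like this; the [STREAM] sections are my assumption, based on the ffmetadata format, which does allow per-stream sections:

;FFMETADATA1
composer=Music Author
title=Song Name
artist=Singer
date=2025
genre=Rock
encoder=Lavf61.7.100
[STREAM]
encoder=Lavc61.19
[STREAM]
title=Cover
comment=Cover (front)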


Google suggested a more complex form of the command above to extract all metadata fields without any exclusions:


# ffmpeg -y -i filename.mp3 -c copy -map_metadata 0 -map_metadata:s:v 0:s:v -map_metadata:s:a 0:s:a -f ffmetadata metadata.txt



But the output is exactly the same:


;FFMETADATA1
composer=Music Author
title=Song Name
artist=Singer
date=2025
genre=Rock
encoder=Lavf61.7.100



Again, no info about the attached image.
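As a sanity check, ffprobe can dump both the global and the per-stream tags, so the data is definitely present; it just never reaches the ffmetadata output (JSON chosen here only for readability):


# ffprobe -hide_banner -loglevel error -show_entries format_tags:stream_tags -of json filename.mp3
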


Please explain what I am doing wrong.


-
How to set pts, dts and duration in the ffmpeg library?
24 March, by hslee
I want to pack some compressed video packets (h.264) into an ".mp4" container.
In one word: muxing, with no decoding and no encoding.
And I have no idea how to set pts, dts and duration.



1. I get the packets with the "pcap" library.
2. I removed the headers that come before the compressed video data (e.g. Ethernet, VLAN).
3. I collected data until I had one complete frame and decoded it to get information about the data (e.g. width, height). I am not sure this step is necessary.
4. I initialized the output context, the stream and the codec context.
5. I started to receive packets with the "pcap" library again (now for muxing).
6. I made one frame and put its data into an AVPacket structure.
7. I try to set PTS, DTS and duration. (I think this is the wrong part, though I am not sure.)

7-1. At the first frame, I saved the time (in msec) taken from the packet header structure.

7-2. Whenever I completed one frame, I set the parameters like this (sketched below): PTS = (current time - start time), DTS = the same value as PTS, duration = (current PTS - previous PTS).
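Expressed as code, what I am doing in step 7-2 is essentially the following (only a simplified sketch of my own logic; set_timestamps, now_ms, start_ms and prev_pts are names I made up for this illustration):

#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>

// Sketch only: convert a capture-time offset in milliseconds into the
// output stream's time base and fill in the packet timestamps.
// prev_pts holds the previous packet's pts (AV_NOPTS_VALUE at the start).
static void set_timestamps(AVPacket *pkt, const AVStream *ost,
                           int64_t now_ms, int64_t start_ms, int64_t *prev_pts)
{
    const AVRational ms_tb = { 1, 1000 };              // milliseconds
    pkt->pts = av_rescale_q(now_ms - start_ms, ms_tb, ost->time_base);
    pkt->dts = pkt->pts;                               // I-frame-only stream
    pkt->duration = (*prev_pts == AV_NOPTS_VALUE) ? 0 : pkt->pts - *prev_pts;
    *prev_pts = pkt->pts;
}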



I think there is an error in it, because:



- I don't know how large a gap between dts and pts is suitable.
- I think duration means how long this frame is shown, from now until the next frame, so it should be (next PTS - current PTS); but I cannot know the next PTS at that time (see the buffering sketch below).







The stream has I-frames only.
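One idea I am considering for the duration problem (not sure it is correct): hold each packet back until the next one arrives, so the duration can be set to (next PTS - current PTS) before writing, and keep dts equal to pts since the stream is I-frame only. A rough sketch; queue_and_mux, gPendingPacket and gHavePending are names I invented here:

#include <libavformat/avformat.h>

// Sketch only: delay writing by one packet so the pending packet's duration
// can be computed from the next packet's pts. The last pending packet still
// has to be flushed once capturing stops (e.g. reusing the previous duration).
static AVPacket gPendingPacket;
static int      gHavePending = 0;

static int queue_and_mux(AVFormatContext *oc, AVPacket *pkt)
{
    int ret = 0;
    if (gHavePending) {
        // The "next pts" is now known: duration = next pts - current pts.
        gPendingPacket.duration = pkt->pts - gPendingPacket.pts;
        ret = av_interleaved_write_frame(oc, &gPendingPacket);
        av_packet_unref(&gPendingPacket);
    }
    av_packet_ref(&gPendingPacket, pkt);   // keep a copy until the next call
    gHavePending = 1;
    return ret;
}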



// make the input context for decoding
AVFormatContext *&ic = gInputContext;
ic = avformat_alloc_context();
AVCodec *cd = avcodec_find_decoder(AV_CODEC_ID_H264);
AVStream *st = avformat_new_stream(ic, cd);
AVCodecContext *cc = st->codec;
avcodec_open2(cc, cd, NULL);

// make a packet and decode it once the collected packets form one frame
gPacket.stream_index = 0;
gPacket.size = gPacketLength[0];
gPacket.data = gPacketData[0];
gPacket.pts = AV_NOPTS_VALUE;
gPacket.dts = AV_NOPTS_VALUE;
gPacket.flags = AV_PKT_FLAG_KEY;
avcodec_decode_video2(cc, gFrame, &got_picture, &gPacket);

// I checked that the codec context is initialized automatically by "avcodec_decode_video2";
// fill in the fields I know are not initialized
cc->time_base.den = 90000;
cc->time_base.num = 1;
cc->bit_rate = 2500000;
cc->gop_size = 1;

// make the output context from the input context
AVFormatContext *&oc = gOutputContext;
avformat_alloc_output_context2(&oc, NULL, NULL, filename);
AVFormatContext *&ic = gInputContext;
AVStream *ist = ic->streams[0];
AVCodecContext *&icc = ist->codec;
AVStream *ost = avformat_new_stream(oc, icc->codec);
AVCodecContext *occ = ost->codec;
avcodec_copy_context(occ, icc);
occ->flags |= CODEC_FLAG_GLOBAL_HEADER;
avio_open(&(oc->pb), filename, AVIO_FLAG_WRITE);

// repeated part for muxing
AVRational Millisecond = { 1, 1000 };
gPacket.stream_index = 0;
gPacket.data = gPacketData[0];
gPacket.size = gPacketLength[0];
// pts from the pcap capture timestamp (msec since the first frame)
gPacket.pts = av_rescale_rnd(pkthdr->ts.tv_sec * 1000
                             + pkthdr->ts.tv_usec / 1000
                             - gStartTime,
                             Millisecond.den, ost->time_base.den,
                             (AVRounding)(AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
gPacket.dts = gPacket.pts;
gPacket.duration = gPacket.pts - gPrev;
gPacket.flags = AV_PKT_FLAG_KEY;
gPrev = gPacket.pts;
av_interleaved_write_frame(gOutputContext, &gPacket);




The expected result is a .mp4 video file that can play.