
Media (1)
-
Ogg detection bug
22 March 2013, by
Updated: April 2013
Language: French
Type: video
Other articles (111)
-
XMP PHP
13 May 2011, by — As Wikipedia puts it, XMP means:
Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
Being based on XML, it manages a set of dynamic tags for use within the Semantic Web.
XMP makes it possible to record, in the form of an XML document, information relating to a file: title, author, history (...) -
Definable image and logo sizes
9 February 2011, by — In many places on the site, logos and images are resized to fit the slots defined by the themes. Since all of these sizes can vary from one theme to another, they can be defined directly in the theme, saving the user from having to configure them manually after changing the appearance of the site.
These image sizes are also available in the MediaSPIP Core specific configuration. The maximum size of the site logo in pixels, it allows (...) -
Apache-specific configuration
4 February 2011, by — Specific modules
For the Apache configuration, it is advisable to enable certain modules that are not specific to MediaSPIP but improve performance: mod_deflate and mod_headers, to have Apache compress pages automatically (see this tutorial); mod_expires, to handle the expiry of hits correctly (see this tutorial).
It is also advisable to add Apache support for the MIME type of WebM files, as described in this tutorial.
Creating a (...)
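The modules mentioned above can be enabled with a configuration along these lines (a minimal sketch; directives are standard Apache, but the MIME types and expiry values chosen here are illustrative, not taken from MediaSPIP's own documentation):

```apache
# Compress text responses on the fly (mod_deflate)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let clients cache static assets (mod_expires)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
</IfModule>

# Register the WebM MIME type so Apache serves .webm files correctly
AddType video/webm .webm
```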
On other sites (2266)
-
Meaning of "ffprobe -v trace demo.mp4" output
17 May 2019, by bluesky — The 'mdat' box of an MP4 file may be at the end of the file. I want to know the position of the 'mdat' box using 'ffmpeg' or 'ffprobe'.
An MP4 file consists of 'ftyp', 'moov' and 'mdat' boxes. Each box consists of a "BoxHeader" and "BoxData". The "BoxHeader" consists of "BoxSize" (4 bytes), "BoxType" (4 bytes) and "BoxLargesize" (8 bytes, present only when the box size exceeds what a 4-byte value can express, in which case BoxSize is set to 1).
In a program, you could first read 8 bytes to learn the size of the 'ftyp' box, then seek past that size and read another 8 bytes to see whether the next box is the 'moov' box. If it is not 'moov', it should be the 'mdat' box, so you seek across the 'mdat' box, and so on.
But I want to use 'ffprobe' to find the position of 'moov'. I ran 'ffprobe -v trace demo.mp4', and the output looks like this:
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc8fd000e00] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc8fd000e00] type:'ftyp' parent:'root' sz: 28 8 41044500
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc8fd000e00] ISO: File Type Major Brand: mp42
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc8fd000e00] type:'moov' parent:'root' sz: 17943 36 41044500
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc8fd000e00] type:'mvhd' parent:'moov' sz: 108 8 17935

I want to know the meaning of:

type:'ftyp' parent:'root' sz: 28 8 41044500

The part type:'ftyp' parent:'root' is easy to understand, but sz: 28 8 41044500 really confuses me. I guess 28 is the size of the ftyp box, but what do 8 and 41044500 mean? Could you explain the meaning of sz: 28 8 41044500, and where can I find the documentation? -
Meaning of Timestamp Retrieved by AudioQueueGetCurrentTime() in AudioQueue Callback
30 July 2024, by White0930 — I'm working on an audio project and have a question regarding the meaning of the timestamp retrieved by AudioQueueGetCurrentTime().


According to the Apple Developer Documentation, the following calculation gives the audio time currently being played (since AudioQueueStart):


- (Float64) GetCurrentTime {
    AudioTimeStamp c;
    AudioQueueGetCurrentTime(playState.queue, NULL, &c, NULL);
    return c.mSampleTime / _av->audio.sample_rate;
}



However, in a project I'm working on, I noticed the following code inside the fillAudioBuffer callback function of AudioQueue :



static void fillAudioBuffer(AudioQueueRef queue, AudioQueueBufferRef buffer) {
    int lengthCopied = INT32_MAX;
    int dts = 0;
    int isDone = 0;

    buffer->mAudioDataByteSize = 0;
    buffer->mPacketDescriptionCount = 0;

    OSStatus err = 0;
    AudioTimeStamp bufferStartTime;

    AudioQueueGetCurrentTime(queue, NULL, &bufferStartTime, NULL);

    while (buffer->mPacketDescriptionCount < numPacketsToRead && lengthCopied > 0) {
        if (buffer->mAudioDataByteSize) {
            break;
        }

        lengthCopied = getNextAudio(_av, buffer->mAudioDataBytesCapacity - buffer->mAudioDataByteSize,
                                    (uint8_t *)buffer->mAudioData + buffer->mAudioDataByteSize, &dts, &isDone);
        if (!lengthCopied || isDone) break;

        if (aqStartDts < 0) aqStartDts = dts;
        if (dts > 0) currentDts = dts;
        if (buffer->mPacketDescriptionCount == 0) {
            bufferStartTime.mFlags = kAudioTimeStampSampleTimeValid;
            bufferStartTime.mSampleTime = (Float64)(dts - aqStartDts) * _av->audio.frame_size;

            if (bufferStartTime.mSampleTime < 0) bufferStartTime.mSampleTime = 0;
            PMSG2("AQHandler.m fillAudioBuffer: DTS for %x: %lf time base: %lf StartDTS: %d\n",
                  (unsigned int)buffer, bufferStartTime.mSampleTime, _av->audio.time_base, aqStartDts);
        }
        buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mStartOffset = buffer->mAudioDataByteSize;
        buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mDataByteSize = lengthCopied;
        buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mVariableFramesInPacket = _av->audio.frame_size;

        buffer->mPacketDescriptionCount++;
        buffer->mAudioDataByteSize += lengthCopied;
    }

#ifdef DEBUG
    int audioBufferCount, audioBufferTotal, videoBufferCount, videoBufferTotal;
    bufferCheck(_av, &videoBufferCount, &videoBufferTotal, &audioBufferCount, &audioBufferTotal);

    PMSG2("AQHandler.m fillAudioBuffer: Video Buffer: %d/%d Audio Buffer: %d/%d\n",
          videoBufferCount, videoBufferTotal, audioBufferCount, audioBufferTotal);

    PMSG2("AQHandler.m fillAudioBuffer: Bytes copied for buffer 0x%x: %d\n",
          (unsigned int)buffer, (int)buffer->mAudioDataByteSize);
#endif
    if (buffer->mAudioDataByteSize) {
        if ((err = AudioQueueEnqueueBufferWithParameters(queue, buffer, 0, NULL, 0, 0, 0, NULL, &bufferStartTime, NULL)))
        {
#ifdef DEBUG
            char sErr[10];
            PMSG2(@"AQHandler.m fillAudioBuffer: Could not enqueue buffer 0x%x: %d %s.",
                  buffer, err, FormatError(sErr, err));
#endif
        }
    }
}



Based on the documentation for AudioQueueEnqueueBufferWithParameters and the variable naming used by the author, bufferStartTime seems to represent the time when the newly filled audio buffer will start playing, i.e., the time when all the audio currently in the queue has finished playing and the new audio starts. This interpretation suggests bufferStartTime is not the same as the time of the audio currently being played.

I have browsed through many related questions, but I still have some doubts. I'm currently fixing an audio-video synchronization issue in my project, and there isn't much detailed information in the Apple Developer Documentation (or maybe my search skills are lacking).


Can someone clarify the exact meaning of the timestamp returned by AudioQueueGetCurrentTime() in this context? Is it the time when the current audio will finish playing, or is it the time when the new audio will start playing? Any additional resources or documentation that explain this in detail would also be appreciated.


-
avutil/pixfmt: Clarify the meaning of the "alpha" bit in rgb555/bgr555
6 February 2015, by Michael Niedermayer