
Media (91)

Other articles (75)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, no software is ever perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information:
    - the browser you are using, including the exact version
    - as precise an explanation of the problem as possible
    - if possible, the steps taken that led to the problem
    - a link to the site / page in question
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • Images

    15 May 2013
  • Definable image and logo sizes

    9 February 2011, by

    In many places on the site, logos and images are resized to fit the slots defined by the themes. Since all of these sizes can vary from one theme to another, they can be defined directly in the theme, which saves the user from having to configure them manually after changing the appearance of their site.
    These image sizes are also available in the MediaSPIP Core specific configuration. The maximum size of the site logo in pixels, we allow (...)

On other sites (13215)

  • What does the summary output of encoding with ffmpeg mean

    12 November 2015, by Jai

    I am working on video comparison using ffmpeg. Using an ffmpeg command I can find the difference between two videos, but I want to find the percentage difference between them.
    From the ffmpeg output below, how can I find the percentage difference between the two videos? Which attribute denotes the difference?

    TaskList: video:530kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.266679%
    TaskList: [libx264 @ 0000000002750b00] frame I:2     Avg QP:23.92  size: 29796
    TaskList: [libx264 @ 0000000002750b00] frame P:97    Avg QP:22.97  size:  4477
    TaskList: [libx264 @ 0000000002750b00] frame B:9     Avg QP:28.16  size:  5338
    TaskList: [libx264 @ 0000000002750b00] consecutive B-frames: 83.3% 16.7%  0.0%  0.0%
    TaskList: [libx264 @ 0000000002750b00] mb I  I16..4: 25.7% 37.8% 36.5%
    TaskList: [libx264 @ 0000000002750b00] mb P  I16..4:  1.9%  4.5%  1.0%  P16..4: 26.7%  8.8%  3.8%  0.0%  0.0%    skip:53.3%
    TaskList: [libx264 @ 0000000002750b00] mb B  I16..4:  0.7%  2.4%  2.7%  B16..8: 19.9%  8.8%  2.6%  direct: 4.7%  skip:58.2%  L0:32.3% L1:53.2% BI:14.4%
    TaskList: [libx264 @ 0000000002750b00] 8x8 transform intra:55.1% inter:69.5%
    TaskList: [libx264 @ 0000000002750b00] coded y,uvDC,uvAC intra: 55.6% 70.0% 24.2% inter: 19.8% 26.7% 2.5%
    TaskList: [libx264 @ 0000000002750b00] i16 v,h,dc,p: 25% 44%  5% 27%
    TaskList: [libx264 @ 0000000002750b00] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 24% 26% 17%  5%  5%  6%  5%  6%  6%
    TaskList: [libx264 @ 0000000002750b00] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 26% 29% 13%  5%  5%  6%  5%  6%  5%
    TaskList: [libx264 @ 0000000002750b00] i8c dc,h,v,p: 44% 30% 20%  5%
    TaskList: [libx264 @ 0000000002750b00] Weighted P-Frames: Y:6.2% UV:4.1%
    TaskList: [libx264 @ 0000000002750b00] ref P L0: 64.2% 28.5%  5.8%  1.3%  0.1%
    TaskList: [libx264 @ 0000000002750b00] ref B L0: 88.4% 11.6%
    TaskList: [libx264 @ 0000000002750b00] kb/s:1204.25
  • Meaning of Timestamp Retrieved by AudioQueueGetCurrentTime() in AudioQueue Callback

    30 July 2024, by White0930

    I'm working on an audio project and have a question regarding the meaning of the timestamp retrieved by AudioQueueGetCurrentTime().

    According to the Apple Developer Documentation, the following calculation gives the audio time being played (since AudioQueueStart):

    - (Float64) GetCurrentTime {
        AudioTimeStamp c;
        AudioQueueGetCurrentTime(playState.queue, NULL, &c, NULL);
        return c.mSampleTime / _av->audio.sample_rate;
    }


    However, in a project I'm working on, I noticed the following code inside the fillAudioBuffer callback function of AudioQueue:

    
static void fillAudioBuffer(AudioQueueRef queue, AudioQueueBufferRef buffer){
    
    int lengthCopied = INT32_MAX;
    int dts= 0;
    int isDone = 0;

    buffer->mAudioDataByteSize = 0;
    buffer->mPacketDescriptionCount = 0;
    
    OSStatus err = 0;
    AudioTimeStamp bufferStartTime;

    AudioQueueGetCurrentTime(queue, NULL, &bufferStartTime, NULL);
    

    
    while(buffer->mPacketDescriptionCount < numPacketsToRead && lengthCopied > 0){
        if (buffer->mAudioDataByteSize) {
            break;
        }
        
        lengthCopied = getNextAudio(_av,buffer->mAudioDataBytesCapacity-buffer->mAudioDataByteSize, (uint8_t*)buffer->mAudioData+buffer->mAudioDataByteSize,&dts,&isDone);
        if(!lengthCopied || isDone) break;
      
        if(aqStartDts < 0) aqStartDts = dts;
        if (dts>0) currentDts = dts;
        if(buffer->mPacketDescriptionCount ==0){
            bufferStartTime.mFlags = kAudioTimeStampSampleTimeValid;
            bufferStartTime.mSampleTime = (Float64)(dts-aqStartDts) * _av->audio.frame_size;
            
            if (bufferStartTime.mSampleTime <0 ) bufferStartTime.mSampleTime = 0;
            PMSG2("AQHandler.m fillAudioBuffer: DTS for %x: %lf time base: %lf StartDTS: %d\n", (unsigned int)buffer, bufferStartTime.mSampleTime, _av->audio.time_base, aqStartDts);
            
        }
        buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mStartOffset = buffer->mAudioDataByteSize;
        buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mDataByteSize = lengthCopied;
        buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mVariableFramesInPacket = _av->audio.frame_size;
        
        buffer->mPacketDescriptionCount++;
        buffer->mAudioDataByteSize += lengthCopied;
        
    }
    
#ifdef DEBUG
    int audioBufferCount, audioBufferTotal,  videoBufferCount, videoBufferTotal;
    bufferCheck(_av,&videoBufferCount, &videoBufferTotal, &audioBufferCount, &audioBufferTotal);
    
    PMSG2("AQHandler.m fillAudioBuffer: Video Buffer: %d/%d Audio Buffer: %d/%d\n", videoBufferCount, videoBufferTotal, audioBufferCount, audioBufferTotal);
    
    PMSG2("AQHandler.m fillAudioBuffer: Bytes copied for buffer 0x%x: %d\n",(unsigned int)buffer, (int)buffer->mAudioDataByteSize );
#endif  
    if(buffer->mAudioDataByteSize){
        
        if(err=AudioQueueEnqueueBufferWithParameters(queue, buffer, 0, NULL, 0, 0, 0, NULL, &bufferStartTime, NULL))
        {
#ifdef DEBUG
            char sErr[10];

            PMSG2(@"AQHandler.m fillAudioBuffer: Could not enqueue buffer 0x%x: %d %s.", buffer, err, FormatError(sErr, err));
#endif
        }
    }

}

    Based on the documentation for AudioQueueEnqueueBufferWithParameters and the variable naming used by the author, bufferStartTime seems to represent the time when the newly filled audio buffer will start playing, i.e., the time when all current audio in the queue has finished playing and the new audio starts. This interpretation suggests bufferStartTime is not the same as the time of the audio currently being played.

    I have browsed through many related questions, but I still have some doubts. I'm currently fixing an audio-video synchronization issue in my project, and there isn't much detailed information in the Apple Developer Documentation (or maybe my search skills are lacking).

    Can someone clarify the exact meaning of the timestamp returned by AudioQueueGetCurrentTime() in this context? Is it the time when the current audio will finish playing, or is it the time when the new audio will start playing? Any additional resources or documentation that explain this in detail would also be appreciated.

  • Revision 75178: Systematically pass the content of selection entries through ...

    31 August 2013, by maieul@… — Log

    Systematically pass the content of selection entries through _T_ou_typo() (Rastapopoulos)