Advanced search

Media (1)

Keyword: - Tags -/intégration

Other articles (89)

  • Customizing by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present changes to your MédiaSPIP, or news about your projects on your MédiaSPIP, using the news section.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News item creation form: for a document of the news item type, the fields offered by default are: Publication date (customize the publication date) (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out.

On other sites (11822)

  • MediaCodec AV Sync when decoding

    12 June 2020, by ClassA

    All of the questions regarding syncing audio and video when decoding with MediaCodec suggest that we should use an "AV Sync" mechanism to sync the video and audio using their timestamps.

    Here is what I do to achieve this:

    I have two threads, one for decoding video and one for audio. To sync the video and audio I'm using Extractor.getSampleTime() to determine whether I should release the audio or the video buffers; please see below:

//This is called after configuring MediaCodec (both audio and video)
private void startPlaybackThreads(){
    //Audio playback thread
    mAudioWorkerThread = new Thread("AudioThread") {
        @Override
        public void run() {
            if (!Thread.interrupted()) {
                try {
                    //Check info below
                    if (shouldPushAudio()) {
                        workLoopAudio();
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    };
    mAudioWorkerThread.start();

    //Video playback thread
    mVideoWorkerThread = new Thread("VideoThread") {
        @Override
        public void run() {
            if (!Thread.interrupted()) {
                try {
                    //Check info below
                    if (shouldPushVideo()) {
                        workLoopVideo();
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    };
    mVideoWorkerThread.start();
}

//Check if more buffers should be sent to the audio decoder
private boolean shouldPushAudio(){
    int audioTime = (int) mAudioExtractor.getSampleTime();
    int videoTime = (int) mExtractor.getSampleTime();
    return audioTime <= videoTime;
}
//Check if more buffers should be sent to the video decoder
private boolean shouldPushVideo(){
    int audioTime = (int) mAudioExtractor.getSampleTime();
    int videoTime = (int) mExtractor.getSampleTime();
    return audioTime > videoTime;
}

    Inside workLoopAudio() and workLoopVideo() is all my MediaCodec logic (I decided not to post it because it's not relevant).

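    For context, such a work loop typically just feeds coded samples from the MediaExtractor into the codec and drains decoded output buffers back out. Below is a minimal, generic sketch of that shape (with assumed names mVideoDecoder, mExtractor, mBufferInfo and TIMEOUT_US, not the actual fields used here):

//Hypothetical sketch of a typical MediaCodec decode loop (assumed names, not the post's code)
private void workLoopVideoSketch() {
    //Feed one coded sample from the extractor into the decoder, if an input buffer is free.
    int inIndex = mVideoDecoder.dequeueInputBuffer(TIMEOUT_US);
    if (inIndex >= 0) {
        ByteBuffer inBuf = mVideoDecoder.getInputBuffer(inIndex);
        int size = mExtractor.readSampleData(inBuf, 0);
        if (size >= 0) {
            mVideoDecoder.queueInputBuffer(inIndex, 0, size, mExtractor.getSampleTime(), 0);
            mExtractor.advance();
        } else {
            //No more samples: signal end of stream to the codec.
            mVideoDecoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        }
    }

    //Drain one decoded buffer and render it to the Surface the codec was configured with.
    int outIndex = mVideoDecoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_US);
    if (outIndex >= 0) {
        mVideoDecoder.releaseOutputBuffer(outIndex, /* render = */ true);
    }
}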

    So what I do is: I get the sample time of the video and the audio tracks, then check which one is bigger (further ahead). If the video is "ahead", I pass more buffers to my audio decoder, and vice versa.

    This seems to be working fine: the video and audio are playing in sync.

    My question:

    I would like to know if my approach is correct (is this how we should be doing it, or is there another/better way)? I could not find any working examples of this (written in Java/Kotlin), hence the question.

    EDIT 1:

    I've found that the audio trails behind the video (very slightly) when I decode/play a video that was encoded using FFmpeg. If I use a video that was not encoded using FFmpeg, the video and audio sync perfectly.

    The FFmpeg command is nothing out of the ordinary:

    -i inputPath -crf 18 -c:v libx264 -preset ultrafast OutputPath

    I will be providing additional information below:

    I initialize/create AudioTrack like this:

    //Audio
mAudioExtractor = new MediaExtractor();
mAudioExtractor.setDataSource(mSource);
int audioTrackIndex = selectAudioTrack(mAudioExtractor);
if (audioTrackIndex < 0){
    throw new IOException("Can't find Audio info!");
}
mAudioExtractor.selectTrack(audioTrackIndex);
mAudioFormat = mAudioExtractor.getTrackFormat(audioTrackIndex);
mAudioMime = mAudioFormat.getString(MediaFormat.KEY_MIME);

mAudioChannels = mAudioFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
mAudioSampleRate = mAudioFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE);

final int min_buf_size = AudioTrack.getMinBufferSize(mAudioSampleRate, (mAudioChannels == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO), AudioFormat.ENCODING_PCM_16BIT);
final int max_input_size = mAudioFormat.getInteger(MediaFormat.KEY_MAX_INPUT_SIZE);
mAudioInputBufSize =  min_buf_size > 0 ? min_buf_size * 4 : max_input_size;
if (mAudioInputBufSize > max_input_size) mAudioInputBufSize = max_input_size;
final int frameSizeInBytes = mAudioChannels * 2;
mAudioInputBufSize = (mAudioInputBufSize / frameSizeInBytes) * frameSizeInBytes;

mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
    mAudioSampleRate,
    (mAudioChannels == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO),
    AudioFormat.ENCODING_PCM_16BIT,
    AudioTrack.getMinBufferSize(mAudioSampleRate, mAudioChannels == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT),
    AudioTrack.MODE_STREAM);

try {
    mAudioTrack.play();
} catch (final Exception e) {
    Log.e(TAG, "failed to start audio track playing", e);
    mAudioTrack.release();
    mAudioTrack = null;
}

    And I write to the AudioTrack like this:

    //Called from within workLoopAudio, when releasing audio buffers
if (bufferAudioIndex >= 0) {
    if (mAudioBufferInfo.size > 0) {
        internalWriteAudio(mAudioOutputBuffers[bufferAudioIndex], mAudioBufferInfo.size);
    }
    mAudioDecoder.releaseOutputBuffer(bufferAudioIndex, false);
}

private boolean internalWriteAudio(final ByteBuffer buffer, final int size) {
    if (mAudioOutTempBuf.length < size) {
        mAudioOutTempBuf = new byte[size];
    }
    buffer.position(0);
    buffer.get(mAudioOutTempBuf, 0, size);
    buffer.clear();
    if (mAudioTrack != null)
        mAudioTrack.write(mAudioOutTempBuf, 0, size);
    return true;
}

    "NEW" Question:

    The audio trails about 200ms behind the video if I use a video that was encoded using FFmpeg. Is there a reason why this could be happening?

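    A minimal sketch of one possible adjustment, assuming the gap is simply a constant start-time difference between the two streams (which the post does not confirm): capture the initial difference between the extractors' sample times once and subtract it in the comparison. mAvStartOffsetUs and shouldPushAudioCompensated() are hypothetical names:

//Hypothetical variant of shouldPushAudio() that subtracts the initial audio/video
//start-time difference before comparing (assumes a constant offset is the cause).
private long mAvStartOffsetUs = Long.MIN_VALUE;

private boolean shouldPushAudioCompensated() {
    long audioTime = mAudioExtractor.getSampleTime();
    long videoTime = mExtractor.getSampleTime();
    if (mAvStartOffsetUs == Long.MIN_VALUE) {
        //Remember, once, how far apart the two streams start.
        mAvStartOffsetUs = audioTime - videoTime;
    }
    return (audioTime - mAvStartOffsetUs) <= videoTime;
}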

  • Can a VideoView be detached and reattached without stopping it?

    31 October 2013, by Thierry-Dimitri Roy

    I'm building an app where the user clicks on a button to show a video full screen. Initially the video is attached to a view inside a ViewPager. To be able to show it fullscreen I detach it from its parent and reattach it to the root view. This works fine, except when the video is switched to fullscreen while playing: when I detach a playing VideoView it just stops, and I need to restart it. This is not acceptable, since the video starts buffering again before resuming. Here is the part of the code where the detach is done:

       final ViewGroup parent = (ViewGroup) findViewById(R.id.parent);

       final ViewGroup root = (ViewGroup) findViewById(R.id.root);

       Button b = (Button) findViewById(R.id.button);
       b.setOnClickListener(new OnClickListener() {

           @Override
           public void onClick(View v) {
               parent.removeView(mVideoView);

               LayoutParams lp = new FrameLayout.LayoutParams(LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT);
               root.addView(mVideoView, lp);
           }
       });

    Depending on the device, I get a different log error, probably because the actual video player is provided by the manufacturer and not the Android SDK. Here are the error logs for a Nexus 7:

    10-30 20:26:18.618: D/NvOsDebugPrintf(124): NvMMDecTVMRDestroyParser Begin
    10-30 20:26:18.618: D/NvOsDebugPrintf(124): --------- Closing TVMR Frame Delivery Thread -------------
    10-30 20:26:18.678: D/NvOsDebugPrintf(124): ------- NvAvpClose -------
    10-30 20:26:18.678: D/NvOsDebugPrintf(124): NvMMDecTVMRDestroyParser Done
    10-30 20:26:18.678: D/NvOsDebugPrintf(124): NvMMLiteTVMRDecPrivateClose Done

    I haven't been able to detach the video without stopping it. I tried using SurfaceView or TextureView without success.
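
    A TextureView-based attempt usually hinges on keeping the SurfaceTexture alive while the view is re-parented, so the player never loses its output surface. The following is only a hypothetical sketch of that idea using a plain MediaPlayer rather than a VideoView; mTextureView, mMediaPlayer and mSavedTexture are assumed names:

//Hypothetical sketch: a TextureView whose SurfaceTexture is kept alive across re-parenting,
//so a MediaPlayer attached via setSurface() is never interrupted. Not the author's code.
private TextureView mTextureView;
private MediaPlayer mMediaPlayer;
private SurfaceTexture mSavedTexture;

//Somewhere during setup: mTextureView.setSurfaceTextureListener(mListener);
private final TextureView.SurfaceTextureListener mListener = new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        if (mSavedTexture != null) {
            //Re-attached after a move: reuse the original texture instead of the new one.
            mTextureView.setSurfaceTexture(mSavedTexture);
        } else {
            mSavedTexture = surface;
            mMediaPlayer.setSurface(new Surface(surface));
        }
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        //Returning false keeps the SurfaceTexture alive while the view is detached.
        return false;
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {}
    @Override public void onSurfaceTextureUpdated(SurfaceTexture surface) {}
};

//Moving to fullscreen then only re-parents the view, much like the VideoView code above.
private void moveToFullscreen(ViewGroup parent, ViewGroup root) {
    parent.removeView(mTextureView);
    root.addView(mTextureView, new FrameLayout.LayoutParams(
            FrameLayout.LayoutParams.MATCH_PARENT, FrameLayout.LayoutParams.MATCH_PARENT));
}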

    I also tried finding a third-party video player. I found a commercial one (http://www.vitamio.org/) that I can't really use for business reasons, and an open-source one that hasn't been updated in the last year (https://code.google.com/p/dolphin-player/).

    I'm currently targeting Android 4.2 or higher, on tablets only.

  • Unable to scrub video after remuxing multiple mpeg-ts to mp4

    21 July 2022, by Jona

    I have written custom code to concat multiple mpeg-ts files into an mp4 video file, using the remuxing code sample as a reference.

    I'm having issues with the final output: I'm unable to scrub the video in QuickTime. The final output seems to be missing something.

    Using ffprobe, I compared my custom remuxer's results to those of the following terminal command:

    ffmpeg -i "concat:input1.ts|input2.ts|input3.ts" -c copy output.mp4

    To my surprise, the videos look almost identical, but I'm noticing some differences in the reported distance and keyframe values. I also did a hex comparison and see minor differences at the end of the file, which tells me I've muxed the frames correctly. What could I be missing? Any tips or ideas would be really helpful!

    FFmpeg terminal remuxed video [WORKING VIDEO]

    Custom remuxer code video [SCRUBBING BROKEN]

    App.mp4 vs ffmpeg.mp4 remuxed video: comparison of the custom-coded remuxer output vs the ffmpeg terminal concat-to-mp4 output.