Advanced search

Media (1)

Word: - Tags -/ogg

Other articles (111)

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out.

  • Submitting bugs and patches

    10 April 2011

    Unfortunately, no software is ever perfect...
    If you think you have found a bug, report it in our ticket system, taking care to include the relevant information: the browser type and exact version with which you encountered the anomaly; as precise a description of the problem as possible; if possible, the steps to reproduce it; and a link to the site/page in question.
    If you think you have fixed the bug yourself (...)

On other sites (11560)

  • Android : FFMpeg video creation crashes with no exception in lower APIs (18 in my case)

    28 April 2019, by Diego Perez

    I have an app that uses FFmpeg for video creation (the following are the relevant build.gradle dependencies):

    //writingminds
    api 'com.writingminds:FFmpegAndroid:0.3.2'
    //JavaCV video
    api group: 'org.bytedeco', name: 'javacv', version: '1.4.4'
    api group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '4.0.1-1.4.4', classifier: 'android-arm'
    api group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '4.1-1.4.4', classifier: 'android-arm'

    The app works (and creates video) just fine on my phone running a newer Android 8, but I'm having odd problems on my old API 18 tablet (where, as a note, I had to enable multidex).

    The following is the main part of the FFmpeg video-creation code, where the binaries are loaded; in fact, loading the binaries is where the app crashes on my old tablet, at the line "ffmpeg.loadBinary(new LoadBinaryResponseHandler()...".

    As you can see, I have a try/catch where the app crashes, but it crashes with no exception, as catch blocks are never hit.

    public static String recordVideo(JSONObject objJSON) {

       String strReturn = Enum.Result.OK;

       try {
           fileName = objJSON.has("file_name") ? String.valueOf(objJSON.getString("file_name")) : "";
           videoPath = objJSON.has("video_path") ? String.valueOf(objJSON.getString("video_path")) : "";
       } catch (JSONException e) {
           ExceptionHandler.logException(e);
       }

       FFmpeg ffmpeg = FFmpeg.getInstance(ApplicationContext.get());
       try {
           ffmpeg.loadBinary(new LoadBinaryResponseHandler() {

               @Override
               public void onStart() {}

               @Override
               public void onFailure() {}

               @Override
               public void onSuccess() {}

               @Override
               public void onFinish() {}
           });
       } catch (FFmpegNotSupportedException e) {
           // Handle if FFmpeg is not supported by device
       } catch (Exception e) {

       }
    ...

    The following is the relevant part of the LogCat output, but I cannot figure out where the problem lies; maybe an out-of-memory problem?

    Any help will be much appreciated.

    04-28 21:44:45.873 13743-13964/com.artandwords.thoughtoftheday A/libc: Fatal signal 11 (SIGSEGV) at 0x00000000 (code=1), thread 13964 (AsyncTask #4)
    04-28 21:44:45.973 144-144/? I/DEBUG: *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
    04-28 21:44:45.983 144-144/? I/DEBUG: Build fingerprint: 'asus/WW_epad/ME302C:4.3/JSS15Q/WW_epad-V5.0.21-20140701:user/release-keys'
    04-28 21:44:45.983 144-144/? I/DEBUG: Revision: '0'
    04-28 21:44:45.983 144-144/? I/DEBUG: pid: 13743, tid: 13964, name: AsyncTask #4  >>> com.artandwords.thoughtoftheday <<<
    04-28 21:44:45.983 144-144/? I/DEBUG: signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 00000000
    04-28 21:44:46.003 144-144/? I/DEBUG:     eax 00000000  ebx 000000c6  ecx 00000000  edx 00000000
    04-28 21:44:46.003 144-144/? I/DEBUG:     esi 00000e59  edi 00000000
    04-28 21:44:46.003 144-144/? I/DEBUG:     xcs 00000073  xds 0000007b  xes 0000007b  xfs 00000043  xss 0000007b
    04-28 21:44:46.003 144-144/? I/DEBUG:     eip 784ed378  ebp 2200ff0c  esp 2200fec4  flags 00210246
    04-28 21:44:46.003 144-144/? I/DEBUG: backtrace:
    04-28 21:44:46.003 144-144/? I/DEBUG:     #00  pc 00087378  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #01  pc 00085d0e  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #02  pc 00073328  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #03  pc 0006f7ff  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #04  pc 0006f3bf  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #05  pc 000b92de  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #06  pc ffffffff  <unknown>
    04-28 21:44:46.003 144-144/? I/DEBUG:     #07  pc 001445aa  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.013 144-144/? I/DEBUG: stack:
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436850  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436854  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436858  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b43685c  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436860  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436864  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436868  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b43686c  3822676c  /system/lib/arm/libc.so
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436870  7b436a98  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436874  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436878  7b4368c8  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b43687c  7b436a98  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436880  383003a0  /system/lib/arm/libdl.so
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436884  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436888  7b4368c8  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b43688c  785aa5ab  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.013 144-144/? I/DEBUG:     #07  7b436890  7b4368a0  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436894  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436898  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b43689c  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368a0  7b4368c8  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368a4  7b436890  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368a8  785aa59d  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368ac  7b436a98  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368b0  7b437930  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368b4  220001d0  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368b8  7b436a70  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368bc  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368c0  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368c4  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368c8  7b436a88  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368cc  785f3141  /system/lib/libhoudini.so.3.4.7.44914


       --------- beginning of /dev/log/system
    04-28 21:44:46.063 450-470/? I/BootReceiver: Copying /data/tombstones/tombstone_03 to DropBox (SYSTEM_TOMBSTONE)
    04-28 21:44:46.063 450-13973/? W/ActivityManager:   Force finishing activity com.artandwords.thoughtoftheday/.activities.DisplayThoughtActivity
    04-28 21:44:46.073 145-862/? E/IMGSRV: :0: PVRDRMOpen: TP3, ret = 75
    04-28 21:44:46.093 450-13973/? E/JavaBinder: !!! FAILED BINDER TRANSACTION !!!
    04-28 21:44:46.093 450-483/? W/InputDispatcher: channel '21edd9e8 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.main.MainActivity (server)' ~ Consumer closed input channel or an error occurred.  events=0x9
    04-28 21:44:46.093 450-483/? E/InputDispatcher: channel '21edd9e8 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.main.MainActivity (server)' ~ Channel is unrecoverably broken and will be disposed!
    04-28 21:44:46.093 155-13945/? W/TimedEventQueue: Event 25 was not found in the queue, already cancelled?
    04-28 21:44:46.093 155-3134/? W/AudioFlinger: session id 324 not found for pid 155
    04-28 21:44:46.103 450-450/? W/InputDispatcher: Attempted to unregister already unregistered input channel '21edd9e8 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.main.MainActivity (server)'
    04-28 21:44:46.103 450-755/? I/WindowState: WIN DEATH: Window{21b33a28 u0 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.DisplayThoughtActivity}
    04-28 21:44:46.103 450-13973/? W/ActivityManager: Exception thrown during pause
       android.os.TransactionTooLargeException
           at android.os.BinderProxy.transact(Native Method)
           at android.app.ApplicationThreadProxy.schedulePauseActivity(ApplicationThreadNative.java:642)
           at com.android.server.am.ActivityStack.startPausingLocked(ActivityStack.java:1007)
           at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:3905)
           at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:3837)
           at com.android.server.am.ActivityManagerService.handleAppCrashLocked(ActivityManagerService.java:8588)
           at com.android.server.am.ActivityManagerService.makeAppCrashingLocked(ActivityManagerService.java:8465)
           at com.android.server.am.ActivityManagerService.crashApplication(ActivityManagerService.java:9170)
           at com.android.server.am.ActivityManagerService.handleApplicationCrashInner(ActivityManagerService.java:8699)
           at com.android.server.am.NativeCrashListener$NativeCrashReporter.run(NativeCrashListener.java:86)
    04-28 21:44:46.103 450-450/? I/WindowState: WIN DEATH: Window{21edd9e8 u0 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.main.MainActivity}
    04-28 21:44:46.103 450-636/? I/WindowState: WIN DEATH: Window{21be0690 u0 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.DisplayThoughtActivity}
    04-28 21:44:46.113 450-13973/? I/WindowManager: computeScreenConfigurationLocked() Enter {1.15 ?mcc?mnc ?locale ?layoutDir ?swdp ?wdp ?hdp ?density ?lsize ?long ?orien ?uimode ?night ?touch ?keyb/?/? ?nav/?}
    04-28 21:44:46.113 450-13973/? I/WindowManager: dw=1200, dh=1920
    04-28 21:44:46.113 450-13973/? I/WindowManager: appWidth=1200, appHeight=1848
    04-28 21:44:46.113 450-13973/? I/WindowManager: tempdm=DisplayMetrics{density=1.5, width=1200, height=1848, scaledDensity=1.5, xdpi=221.201, ydpi=220.591}
    04-28 21:44:46.113 450-13973/? I/WindowManager: dm=DisplayMetrics{density=1.5, width=1200, height=1848, scaledDensity=1.5, xdpi=221.201, ydpi=220.591}, ro.product.device=ME302C
    04-28 21:44:46.113 450-13973/? I/WindowManager: getConfigDisplayWidth=1200, getConfigDisplayHeight=1810
    04-28 21:44:46.113 450-13973/? I/WindowManager: screenWidthDp=800, screenHeightDp=1206
    04-28 21:44:46.113 450-13973/? I/WindowManager: computeScreenConfigurationLocked() Leave {1.15 ?mcc?mnc ?locale ?layoutDir sw800dp w800dp h1206dp 240dpi xlrg port ?uimode ?night finger -keyb/v/h -nav/h}
    04-28 21:44:46.113 450-13973/? I/ActivityManager: Restarting because process died: ActivityRecord{21ab1f80 u0 com.artandwords.thoughtoftheday/.activities.main.MainActivity}
    04-28 21:44:46.113 450-13973/? W/ActivityManager: Exception when starting activity com.artandwords.thoughtoftheday/.activities.main.MainActivity
       android.os.DeadObjectException
           at android.os.BinderProxy.transact(Native Method)
           at android.app.ApplicationThreadProxy.scheduleLaunchActivity(ApplicationThreadNative.java:730)
           at com.android.server.am.ActivityStack.realStartActivityLocked(ActivityStack.java:733)
           at com.android.server.am.ActivityStack.startSpecificActivityLocked(ActivityStack.java:840)
           at com.android.server.am.ActivityStack.resumeTopActivityLocked(ActivityStack.java:1790)
           at com.android.server.am.ActivityStack.resumeTopActivityLocked(ActivityStack.java:1449)
           at com.android.server.am.ActivityStack.startPausingLocked(ActivityStack.java:1058)
           at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:3905)
           at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:3837)
           at com.android.server.am.ActivityManagerService.handleAppCrashLocked(ActivityManagerService.java:8588)
           at com.android.server.am.ActivityManagerService.makeAppCrashingLocked(ActivityManagerService.java:8465)
           at com.android.server.am.ActivityManagerService.crashApplication(ActivityManagerService.java:9170)
           at com.android.server.am.ActivityManagerService.handleApplicationCrashInner(ActivityManagerService.java:8699)
           at com.android.server.am.NativeCrashListener$NativeCrashReporter.run(NativeCrashListener.java:86)
    04-28 21:44:46.123 450-13973/? W/ContextImpl: Calling a method in the system process without a qualified user: android.app.ContextImpl.startService:1396 com.android.server.am.ActivityStack.sendActivityBroadcastLocked:4923 com.android.server.am.ActivityStack.removeActivityFromHistoryLocked:4089 com.android.server.am.ActivityStack.removeHistoryRecordsForAppLocked:4346 com.android.server.am.ActivityManagerService.handleAppDiedLocked:3163
    04-28 21:44:46.123 450-13973/? I/ActivityManager: Start proc com.artandwords.thoughtoftheday for activity com.artandwords.thoughtoftheday/.activities.main.MainActivity: pid=13975

    Edit 1:

    Still investigating: while debugging I stepped into FFmpeg.java's loadBinary method, whose code I'll paste below; the line triggering the crash is switch (CpuArchHelper.getCpuArch()).

    @Override
    public void loadBinary(FFmpegLoadBinaryResponseHandler ffmpegLoadBinaryResponseHandler) throws FFmpegNotSupportedException {
        String cpuArchNameFromAssets = null;
        switch (CpuArchHelper.getCpuArch()) {
            case x86:
                Log.i("Loading FFmpeg for x86 CPU");
                cpuArchNameFromAssets = "x86";
                break;
            case ARMv7:
                Log.i("Loading FFmpeg for armv7 CPU");
                cpuArchNameFromAssets = "armeabi-v7a";
                break;
            case NONE:
                throw new FFmpegNotSupportedException("Device not supported");
        }

        if (!TextUtils.isEmpty(cpuArchNameFromAssets)) {
            ffmpegLoadLibraryAsyncTask = new FFmpegLoadLibraryAsyncTask(context, cpuArchNameFromAssets, ffmpegLoadBinaryResponseHandler);
            ffmpegLoadLibraryAsyncTask.execute();
        } else {
            throw new FFmpegNotSupportedException("Device not supported");
        }
    }

    I’ll keep on investigating...

    Edit 2:

    Further debugging has just led me to the exact line where the app crashes, and it's in CpuArchHelper.java from the FFmpeg library.

    The line causing the crash is this:

    String archInfo = cpuNativeArchHelper.cpuArchFromJNI();

    and I cannot even step into cpuArchFromJNI() with F7, as it just crashes.

    package com.github.hiteshsondhi88.libffmpeg;

    import android.os.Build;

    class CpuArchHelper {

       static CpuArch getCpuArch() {
           Log.d("Build.CPU_ABI : " + Build.CPU_ABI);
           // check if device is x86 or x86_64
           if (Build.CPU_ABI.equals(getx86CpuAbi()) || Build.CPU_ABI.equals(getx86_64CpuAbi())) {
               return CpuArch.x86;
           } else {
               // check if device is armeabi
               if (Build.CPU_ABI.equals(getArmeabiv7CpuAbi())) {
                   ArmArchHelper cpuNativeArchHelper = new ArmArchHelper();
                   String archInfo = cpuNativeArchHelper.cpuArchFromJNI();
                   // check if device is arm v7
                   if (cpuNativeArchHelper.isARM_v7_CPU(archInfo)) {
                       // check if device is neon
                       return CpuArch.ARMv7;
                   }
                   // check if device is arm64 which is supported by ARMV7
               } else if (Build.CPU_ABI.equals(getArm64CpuAbi())) {
                   return CpuArch.ARMv7;
               }
           }
           return CpuArch.NONE;
       }

       static String getx86CpuAbi() {
           return "x86";
       }

       static String getx86_64CpuAbi() {
           return "x86_64";
       }

       static String getArm64CpuAbi() {
           return "arm64-v8a";
       }

       static String getArmeabiv7CpuAbi() {
           return "armeabi-v7a";
       }
    }

    This is ArmArchHelper.java class :

    package com.github.hiteshsondhi88.libffmpeg;

    class ArmArchHelper {
       static {
           System.loadLibrary("ARM_ARCH");
       }

       native String cpuArchFromJNI();

       boolean isARM_v7_CPU(String cpuInfoString) {
           return cpuInfoString.contains("v7");
       }

       boolean isNeonSupported(String cpuInfoString) {
           // check cpu arch for loading correct ffmpeg lib
           return cpuInfoString.contains("-neon");
       }

    }
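    Since the backtrace runs entirely through libhoudini (Intel's ARM-translation layer; the register dump shows x86 registers on this Atom-based ME302C), the fault apparently occurs while the ARM-only "ARM_ARCH" native helper is being emulated. A hedged workaround, a sketch only, is to resolve the asset folder from the ABI string alone and skip the JNI probe. cpuArchNameForAbi below is a hypothetical helper, not part of the library, and unlike cpuArchFromJNI() it cannot distinguish older ARM variants from ARMv7:

```java
// Sketch only: map Build.CPU_ABI to the FFmpeg asset folder without JNI.
// cpuArchNameForAbi is a hypothetical helper, not part of the library;
// unlike cpuArchFromJNI(), it drops the extra "v7" check the JNI probe performs.
public final class AbiSketch {
    static String cpuArchNameForAbi(String abi) {
        if ("x86".equals(abi) || "x86_64".equals(abi)) {
            return "x86";
        }
        if ("armeabi-v7a".equals(abi) || "arm64-v8a".equals(abi)) {
            return "armeabi-v7a"; // arm64 devices can run the ARMv7 binaries
        }
        return null; // unsupported device
    }

    public static void main(String[] args) {
        System.out.println(cpuArchNameForAbi("x86"));       // x86
        System.out.println(cpuArchNameForAbi("arm64-v8a")); // armeabi-v7a
    }
}
```

    On the real device this would be fed Build.CPU_ABI (or the first entry of Build.SUPPORTED_ABIS on API 21+) before ever calling loadBinary.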
  • HLS script has been lost to time, previous content was made in specific format, attempting to recreate using FFMPEG primitives

    28 February, by Wungo

    Looking to add this video to a stitched playlist. The variants, encoding, and everything must match exactly. We have no access to how things were done previously, so I am literally vibing through this as best I can.


    I recommend using a 30-second clip of Big Buck Bunny, or the original Big Buck Bunny 1080p video.


    #!/bin/bash
    ffmpeg -i bbb_30s.mp4 -filter_complex "
    [0:v]split=7[v1][v2][v3][v4][v5][v6][v7];
    [v1]scale=416:234[v1out];
    [v2]scale=416:234[v2out];
    [v3]scale=640:360[v3out];
    [v4]scale=768:432[v4out];
    [v5]scale=960:540[v5out];
    [v6]scale=1280:720[v6out];
    [v7]scale=1920:1080[v7out]
    " \
    -map "[v1out]" -c:v:0 libx264 -b:v:0 200k -maxrate 361k -bufsize 400k -r 29.97 -g 60 -keyint_min 60 -sc_threshold 0 -preset veryfast -profile:v baseline -level 3.0 \
    -map "[v2out]" -c:v:1 libx264 -b:v:1 500k -maxrate 677k -bufsize 700k -r 29.97 -g 60 -keyint_min 60 -sc_threshold 0 -preset veryfast -profile:v baseline -level 3.0 \
    -map "[v3out]" -c:v:2 libx264 -b:v:2 1000k -maxrate 1203k -bufsize 1300k -r 29.97 -g 60 -keyint_min 60 -sc_threshold 0 -preset veryfast -profile:v main -level 3.1 \
    -map "[v4out]" -c:v:3 libx264 -b:v:3 1800k -maxrate 2057k -bufsize 2200k -r 29.97 -g 60 -keyint_min 60 -sc_threshold 0 -preset veryfast -profile:v main -level 3.1 \
    -map "[v5out]" -c:v:4 libx264 -b:v:4 2500k -maxrate 2825k -bufsize 3000k -r 29.97 -g 60 -keyint_min 60 -sc_threshold 0 -preset veryfast -profile:v main -level 4.0 \
    -map "[v6out]" -c:v:5 libx264 -b:v:5 5000k -maxrate 5525k -bufsize 6000k -r 29.97 -g 60 -keyint_min 60 -sc_threshold 0 -preset veryfast -profile:v high -level 4.1 \
    -map "[v7out]" -c:v:6 libx264 -b:v:6 8000k -maxrate 9052k -bufsize 10000k -r 29.97 -g 60 -keyint_min 60 -sc_threshold 0 -preset veryfast -profile:v high -level 4.2 \
    -map a:0 -c:a:0 aac -b:a:0 128k -ar 48000 -ac 2 \
    -f hls -hls_time 6 -hls_playlist_type vod -hls_flags independent_segments \
    -hls_segment_type fmp4 \
    -hls_segment_filename "output_%v_%03d.mp4" \
    -master_pl_name master.m3u8 \
    -var_stream_map "v:0,name:layer-416x234-200k v:1,name:layer-416x234-500k v:2,name:layer-640x360-1000k v:3,name:layer-768x432-1800k v:4,name:layer-960x540-2500k v:5,name:layer-1280x720-5000k v:6,name:layer-1920x1080-8000k a:0,name:layer-audio-128k" \
    output_%v.m3u8


    Above is what I've put together over the past few days.

    I consistently run into the same issues:

      1. My variants must match identically; the bit rate etc. must match exactly, no excuses. No variance allowed.

      2. When I did it a different way previously, it became impossible to sync the variants' timing, making the project not stitchable and the asset useless: the variants are encoded to last longer than the master.m3u8 says they will, so the asset is rejected downstream.

      3. I end up either with variants mismatched in timing, or with no audio / audio channels not synced properly.

    Here is what the master.m3u8 should look like:

    #EXTM3U
    #EXT-X-VERSION:7

    #EXT-X-STREAM-INF:AUDIO="aac",AVERAGE-BANDWIDTH=333000,BANDWIDTH=361000,CLOSED-CAPTIONS="cc1",CODECS="avc1.4d400d,mp4a.40.2",FRAME-RATE=29.97,RESOLUTION=416x234
    placeholder.m3u8
    #EXT-X-STREAM-INF:AUDIO="aac",AVERAGE-BANDWIDTH=632000,BANDWIDTH=677000,CLOSED-CAPTIONS="cc1",CODECS="avc1.4d400d,mp4a.40.2",FRAME-RATE=29.97,RESOLUTION=416x234
    placeholder2.m3u8
    #EXT-X-STREAM-INF:AUDIO="aac",AVERAGE-BANDWIDTH=1133000,BANDWIDTH=1203000,CLOSED-CAPTIONS="cc1",CODECS="avc1.4d401e,mp4a.40.2",FRAME-RATE=29.97,RESOLUTION=640x360
    placeholder3.m3u8

    #EXT-X-STREAM-INF:AUDIO="aac",AVERAGE-BANDWIDTH=1933000,BANDWIDTH=2057000,CLOSED-CAPTIONS="cc1",CODECS="avc1.4d401f,mp4a.40.2",FRAME-RATE=29.97,RESOLUTION=768x432
    placeholder4.m3u8

    #EXT-X-STREAM-INF:AUDIO="aac",AVERAGE-BANDWIDTH=2633000,BANDWIDTH=2825000,CLOSED-CAPTIONS="cc1",CODECS="avc1.4d401f,mp4a.40.2",FRAME-RATE=29.97,RESOLUTION=960x540
    placeholder5.m3u8

    #EXT-X-STREAM-INF:AUDIO="aac",AVERAGE-BANDWIDTH=5134000,BANDWIDTH=5525000,CLOSED-CAPTIONS="cc1",CODECS="avc1.4d401f,mp4a.40.2",FRAME-RATE=29.97,RESOLUTION=1280x720
    placeholder6.m3u8

    #EXT-X-STREAM-INF:AUDIO="aac",AVERAGE-BANDWIDTH=8135000,BANDWIDTH=9052000,CLOSED-CAPTIONS="cc1",CODECS="avc1.640028,mp4a.40.2",FRAME-RATE=29.97,RESOLUTION=1920x1080
    placeholder7.m3u8

    #EXT-X-STREAM-INF:AUDIO="aac",AVERAGE-BANDWIDTH=129000,BANDWIDTH=130000,CLOSED-CAPTIONS="cc1",CODECS="mp4a.40.2"
    placeholder8.m3u8

    #EXT-X-MEDIA:AUTOSELECT=YES,CHANNELS="2",DEFAULT=YES,GROUP-ID="aac",LANGUAGE="en",NAME="English",TYPE=AUDIO,URI="placeholder8.m3u8"
    #EXT-X-MEDIA:AUTOSELECT=YES,DEFAULT=YES,GROUP-ID="cc1",INSTREAM-ID="CC1",LANGUAGE="en",NAME="English",TYPE=CLOSED-CAPTIONS


    Underlying playlist clips should be *.mp4, not *.m4s or anything like that. Audio must be on a single channel by itself; closed captions are handled by a remote server and aren't a concern.

    As mentioned above, I have tried transcoding separately and then combining manually later. Here is an example of that:

    #!/bin/bash
    set -e

    # Input file
    INPUT_FILE="bbb_30.mp4"

    # Output directory
    OUTPUT_DIR="hls_output"
    mkdir -p "$OUTPUT_DIR"

    # First, extract exact duration from master.m3u8 (if it exists)
    MASTER_M3U8="master.m3u8"  # Change if needed

    echo "Extracting exact duration from the source MP4..."
    EXACT_DURATION=$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 "$INPUT_FILE")
    echo "Using exact duration: $EXACT_DURATION seconds"
    # Create a reference file with exact duration from the source
    echo "Creating reference file with exact duration..."
    ffmpeg -y -i "$INPUT_FILE" -c copy -t "$EXACT_DURATION" "$OUTPUT_DIR/exact_reference.mp4"

    # Calculate exact GOP size for segment alignment (for 6-second segments at 29.97fps)
    FPS=29.97
    SEGMENT_DURATION=6
    GOP_SIZE=$(echo "$FPS * $SEGMENT_DURATION" | bc | awk '{print int($1)}')
    echo "Using GOP size of $GOP_SIZE frames for $SEGMENT_DURATION-second segments at $FPS fps"

    # Function to encode a variant with exact duration
    encode_variant() {
      local resolution="$1"
      local bitrate="$2"
      local maxrate="$3"
      local bufsize="$4"
      local profile="$5"
      local level="$6"
      local audiorate="$7"
      local name_suffix="$8"

      echo "Encoding $resolution variant with video bitrate $bitrate kbps and audio bitrate ${audiorate}k..."

      # Step 1: Create an intermediate file with exact duration and GOP alignment
      ffmpeg -y -i "$OUTPUT_DIR/exact_reference.mp4" \
        -c:v libx264 -profile:v "$profile" -level "$level" \
        -x264-params "bitrate=$bitrate:vbv-maxrate=$maxrate:vbv-bufsize=$bufsize:keyint=$GOP_SIZE:min-keyint=$GOP_SIZE:no-scenecut=1" \
        -s "$resolution" -r "$FPS" \
        -c:a aac -b:a "${audiorate}k" \
        -vsync cfr -start_at_zero -reset_timestamps 1 \
        -map 0:v:0 -map 0:a:0 \
        -t "$EXACT_DURATION" \
        -force_key_frames "expr:gte(t,n_forced*6)" \
        "$OUTPUT_DIR/temp_${name_suffix}.mp4"

      # Step 2: Create HLS segments with exact boundaries from the intermediate file.
      ffmpeg -y -i "$OUTPUT_DIR/temp_${name_suffix}.mp4" \
        -c copy \
        -f hls \
        -hls_time "$SEGMENT_DURATION" \
        -hls_playlist_type vod \
        -hls_segment_filename "$OUTPUT_DIR/layer-${name_suffix}-segment-%03d.mp4" \
        -hls_flags independent_segments+program_date_time+round_durations \
        -hls_list_size 0 \
        "$OUTPUT_DIR/layer-${name_suffix}.m3u8"

      # Verify duration
      VARIANT_DURATION=$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 "$OUTPUT_DIR/temp_${name_suffix}.mp4")
      echo "Variant $name_suffix duration: $VARIANT_DURATION (target: $EXACT_DURATION, diff: $(echo "$VARIANT_DURATION - $EXACT_DURATION" | bc))"

      # Clean up temporary file
      rm "$OUTPUT_DIR/temp_${name_suffix}.mp4"
    }

    # Process each variant with exact duration matching
    # Format: resolution, bitrate, maxrate, bufsize, profile, level, audio bitrate, name suffix
    encode_variant "416x234" "333" "361" "722" "baseline" "3.0" "64" "416x234-200k"
    encode_variant "416x234" "632" "677" "1354" "baseline" "3.0" "64" "416x234-500k"
    encode_variant "640x360" "1133" "1203" "2406" "main" "3.0" "96" "640x360-1000k"
    encode_variant "768x432" "1933" "2057" "4114" "main" "3.1" "96" "768x432-1800k"
    encode_variant "960x540" "2633" "2825" "5650" "main" "3.1" "128" "960x540-2500k"
    encode_variant "1280x720" "5134" "5525" "11050" "main" "3.1" "128" "1280x720-5000k"
    encode_variant "1920x1080" "8135" "9052" "18104" "high" "4.0" "128" "1920x1080-8000k"

    # 8. Audio-only variant
    echo "Creating audio-only variant..."

    # ffmpeg -y -i "$INPUT_FILE" \
    #   -vn -map 0:a \
    #   -c:a aac -b:a 128k -ac 2 \
    #   -t "$EXACT_DURATION" \
    #   -f hls \
    #   -hls_time "$SEGMENT_DURATION" \
    #   -hls_playlist_type vod \
    #   -hls_flags independent_segments+program_date_time+round_durations \
    #   -hls_segment_filename "$OUTPUT_DIR/layer-audio-128k-segment-%03d.ts" \
    #   -hls_list_size 0 \
    #   "$OUTPUT_DIR/layer-audio-128k.m3u8"

    ffmpeg -y -i "$INPUT_FILE" \
      -vn \
      -map 0:a \
      -c:a aac -b:a 128k \
      -t "$EXACT_DURATION" \
      -f hls \
      -hls_time "$SEGMENT_DURATION" \
      -hls_playlist_type vod \
      -hls_segment_type fmp4 \
      -hls_flags independent_segments+program_date_time+round_durations \
      -hls_list_size 0 \
      -hls_segment_filename "$OUTPUT_DIR/layer-audio-128k-segment-%03d.m4s" \
      "$OUTPUT_DIR/layer-audio-128k.m3u8"

    # Create master playlist
    cat > "$OUTPUT_DIR/master.m3u8" << EOF
    #EXTM3U
    #EXT-X-VERSION:7
    #EXT-X-INDEPENDENT-SEGMENTS

    #EXT-X-STREAM-INF:BANDWIDTH=361000,AVERAGE-BANDWIDTH=333000,CODECS="avc1.4d400d,mp4a.40.2",RESOLUTION=416x234,FRAME-RATE=29.97
    layer-416x234-200k.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=677000,AVERAGE-BANDWIDTH=632000,CODECS="avc1.4d400d,mp4a.40.2",RESOLUTION=416x234,FRAME-RATE=29.97
    layer-416x234-500k.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=1203000,AVERAGE-BANDWIDTH=1133000,CODECS="avc1.4d401e,mp4a.40.2",RESOLUTION=640x360,FRAME-RATE=29.97
    layer-640x360-1000k.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=2057000,AVERAGE-BANDWIDTH=1933000,CODECS="avc1.4d401f,mp4a.40.2",RESOLUTION=768x432,FRAME-RATE=29.97
    layer-768x432-1800k.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=2825000,AVERAGE-BANDWIDTH=2633000,CODECS="avc1.4d401f,mp4a.40.2",RESOLUTION=960x540,FRAME-RATE=29.97
    layer-960x540-2500k.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=5525000,AVERAGE-BANDWIDTH=5134000,CODECS="avc1.4d401f,mp4a.40.2",RESOLUTION=1280x720,FRAME-RATE=29.97
    layer-1280x720-5000k.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=9052000,AVERAGE-BANDWIDTH=8135000,CODECS="avc1.640028,mp4a.40.2",RESOLUTION=1920x1080,FRAME-RATE=29.97
    layer-1920x1080-8000k.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=130000,AVERAGE-BANDWIDTH=129000,CODECS="mp4a.40.2"
    layer-audio-128k.m3u8
    EOF

    # Verify all durations match
    cat > "$OUTPUT_DIR/verify_all.sh" << 'EOF'
    #!/bin/bash

    # Get exact reference duration from the exact reference file
    REFERENCE_DURATION=$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 "exact_reference.mp4")
    echo "Reference duration: $REFERENCE_DURATION seconds"

    # Check each segment's duration
    echo -e "\nChecking individual segments..."
    for seg in layer-*-segment-*.mp4 layer-audio-128k-segment-*.ts; do
      dur=$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 "$seg")
      echo "$seg: $dur seconds"
    done

    # Get total duration for each variant by summing segment EXTINF durations from each playlist
    echo -e "\nChecking combined variant durations..."
    for variant in layer-*.m3u8; do
      total=0
      while read -r line; do
        if [[ $line == "#EXTINF:"* ]]; then
          dur=$(echo "$line" | sed 's/#EXTINF:\([0-9.]*\).*/\1/')
          total=$(echo "$total + $dur" | bc)
        fi
      done < "$variant"
      echo "$variant: $total seconds (reference: $REFERENCE_DURATION, diff: $(echo "$total - $REFERENCE_DURATION" | bc))"
    done
    EOF

    chmod +x "$OUTPUT_DIR/verify_all.sh"

    echo "HLS packaging complete with exact duration matching."
    echo "Master playlist available at: $OUTPUT_DIR/master.m3u8"
    echo "Run $OUTPUT_DIR/verify_all.sh to verify durations."
    rm "$OUTPUT_DIR/exact_reference.mp4"


    I end up with weird audio in VLC, which can't be right, and I also end up with variants running longer than the master.m3u8 playlist says they should, which is wonky.
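    Since the symptom is variant playlists whose summed segment durations exceed what master.m3u8 implies, it helps to total each playlist's EXTINF tags directly and compare variants against each other. This is a minimal Python sketch of the same check verify_all.sh performs; the sample playlist below is hypothetical:

```python
import re

def playlist_duration(m3u8_text: str) -> float:
    """Sum the EXTINF segment durations in an HLS media playlist."""
    return sum(float(m.group(1))
               for m in re.finditer(r"#EXTINF:([0-9.]+)", m3u8_text))

# Hypothetical playlist: two 6.006 s segments and one 5.839 s tail segment.
sample = """#EXTM3U
#EXT-X-TARGETDURATION:6
#EXTINF:6.006,
seg0.mp4
#EXTINF:6.006,
seg1.mp4
#EXTINF:5.839,
seg2.mp4
#EXT-X-ENDLIST
"""
print(round(playlist_duration(sample), 3))  # 17.851
```

    Running this over every layer-*.m3u8 quickly shows which rendition carries the extra duration relative to the others.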

    &#xA;

    I tried using AI to fix the audio sync issue, and honestly I'm more confused than when I started.


  • Libav (ffmpeg) copying decoded video timestamps to encoder

    31 October 2016, by Jason C

    I am writing an application that decodes a single video stream from an input file (any codec, any container), does a bunch of image processing, and encodes the results to an output file (single video stream, Quicktime RLE, MOV). I am using ffmpeg’s libav 3.1.5 (Windows build for now, but the application will be cross-platform).

    There is a 1:1 correspondence between input and output frames, and I want the frame timing in the output to be identical to the input. I am having a really, really hard time accomplishing this. So my general question is: how do I reliably (as in, for all inputs) set the output frame timing identical to the input?

    It took me a very long time to slog through the API and get to the point I am at now. I put together a minimal test program to work with :

    #include <cstdio>

    extern "C" {
    #include <libavcodec></libavcodec>avcodec.h>
    #include <libavformat></libavformat>avformat.h>
    #include <libavutil></libavutil>avutil.h>
    #include <libavutil></libavutil>imgutils.h>
    #include <libswscale></libswscale>swscale.h>
    }

    using namespace std;


    struct DecoderStuff {
       AVFormatContext *formatx;
       int nstream;
       AVCodec *codec;
       AVStream *stream;
       AVCodecContext *codecx;
       AVFrame *rawframe;
       AVFrame *rgbframe;
       SwsContext *swsx;
    };


    struct EncoderStuff {
       AVFormatContext *formatx;
       AVCodec *codec;
       AVStream *stream;
       AVCodecContext *codecx;
    };


    template <typename T>
    static void dump_timebase (const char *what, const T *o) {
       if (o)
           printf("%s timebase: %d/%d\n", what, o->time_base.num, o->time_base.den);
       else
           printf("%s timebase: null object\n", what);
    }


    // reads next frame into d.rawframe and d.rgbframe. returns false on error/eof.
    static bool read_frame (DecoderStuff &d) {

       AVPacket packet;
       int err = 0, haveframe = 0;

       // read
       while (!haveframe && err >= 0 && ((err = av_read_frame(d.formatx, &packet)) >= 0)) {
          if (packet.stream_index == d.nstream) {
              err = avcodec_decode_video2(d.codecx, d.rawframe, &haveframe, &packet);
          }
          av_packet_unref(&packet);
       }

       // error output
       if (!haveframe &amp;&amp; err != AVERROR_EOF) {
           char buf[500];
           av_strerror(err, buf, sizeof(buf) - 1);
           buf[499] = 0;
           printf("read_frame: %s\n", buf);
       }

       // convert to rgb
       if (haveframe) {
           sws_scale(d.swsx, d.rawframe->data, d.rawframe->linesize, 0, d.rawframe->height,
                     d.rgbframe->data, d.rgbframe->linesize);
       }

       return haveframe;

    }


    // writes an output frame, returns false on error.
    static bool write_frame (EncoderStuff &e, AVFrame *inframe) {

       // see note in so post about outframe here
       AVFrame *outframe = av_frame_alloc();
       outframe->format = inframe->format;
       outframe->width = inframe->width;
       outframe->height = inframe->height;
       av_image_alloc(outframe->data, outframe->linesize, outframe->width, outframe->height,
                      AV_PIX_FMT_RGB24, 1);
       //av_frame_copy(outframe, inframe);
       static int count = 0;
        for (int n = 0; n < outframe->width * outframe->height; ++ n) {
           outframe->data[0][n*3+0] = ((n+count) % 100) ? 0 : 255;
           outframe->data[0][n*3+1] = ((n+count) % 100) ? 0 : 255;
           outframe->data[0][n*3+2] = ((n+count) % 100) ? 0 : 255;
       }
       ++ count;

       AVPacket packet;
       av_init_packet(&packet);
       packet.size = 0;
       packet.data = NULL;

       int err, havepacket = 0;
       if ((err = avcodec_encode_video2(e.codecx, &packet, outframe, &havepacket)) >= 0 && havepacket) {
           packet.stream_index = e.stream->index;
           err = av_interleaved_write_frame(e.formatx, &packet);
       }

       if (err < 0) {
           char buf[500];
           av_strerror(err, buf, sizeof(buf) - 1);
           buf[499] = 0;
           printf("write_frame: %s\n", buf);
       }

       av_packet_unref(&packet);
       av_freep(&outframe->data[0]);
       av_frame_free(&outframe);

       return err >= 0;

    }


    int main (int argc, char *argv[]) {

       const char *infile = "wildlife.wmv";
       const char *outfile = "test.mov";
       DecoderStuff d = {};
       EncoderStuff e = {};

       av_register_all();

       // decoder
       avformat_open_input(&d.formatx, infile, NULL, NULL);
       avformat_find_stream_info(d.formatx, NULL);
       d.nstream = av_find_best_stream(d.formatx, AVMEDIA_TYPE_VIDEO, -1, -1, &d.codec, 0);
       d.stream = d.formatx->streams[d.nstream];
       d.codecx = avcodec_alloc_context3(d.codec);
       avcodec_parameters_to_context(d.codecx, d.stream->codecpar);
       avcodec_open2(d.codecx, NULL, NULL);
       d.rawframe = av_frame_alloc();
       d.rgbframe = av_frame_alloc();
       d.rgbframe->format = AV_PIX_FMT_RGB24;
       d.rgbframe->width = d.codecx->width;
       d.rgbframe->height = d.codecx->height;
       av_frame_get_buffer(d.rgbframe, 1);
       d.swsx = sws_getContext(d.codecx->width, d.codecx->height, d.codecx->pix_fmt,
                               d.codecx->width, d.codecx->height, AV_PIX_FMT_RGB24,
                               SWS_POINT, NULL, NULL, NULL);
       //av_dump_format(d.formatx, 0, infile, 0);
       dump_timebase("in stream", d.stream);
       dump_timebase("in stream:codec", d.stream->codec); // note: deprecated
       dump_timebase("in codec", d.codecx);

       // encoder
       avformat_alloc_output_context2(&e.formatx, NULL, NULL, outfile);
       e.codec = avcodec_find_encoder(AV_CODEC_ID_QTRLE);
       e.stream = avformat_new_stream(e.formatx, e.codec);
       e.codecx = avcodec_alloc_context3(e.codec);
       e.codecx->bit_rate = 4000000; // arbitrary for qtrle
       e.codecx->width = d.codecx->width;
       e.codecx->height = d.codecx->height;
       e.codecx->gop_size = 30; // 99% sure this is arbitrary for qtrle
       e.codecx->pix_fmt = AV_PIX_FMT_RGB24;
       e.codecx->time_base = d.stream->time_base; // ???
       e.codecx->flags |= (e.formatx->oformat->flags & AVFMT_GLOBALHEADER) ? AV_CODEC_FLAG_GLOBAL_HEADER : 0;
       avcodec_open2(e.codecx, NULL, NULL);
       avcodec_parameters_from_context(e.stream->codecpar, e.codecx);
       //av_dump_format(e.formatx, 0, outfile, 1);
       dump_timebase("out stream", e.stream);
       dump_timebase("out stream:codec", e.stream->codec); // note: deprecated
       dump_timebase("out codec", e.codecx);

       // open file and write header
       avio_open(&e.formatx->pb, outfile, AVIO_FLAG_WRITE);
       avformat_write_header(e.formatx, NULL);

       // frames
       while (read_frame(d) && write_frame(e, d.rgbframe))
           ;

       // write trailer and close file
       av_write_trailer(e.formatx);
       avio_closep(&amp;e.formatx->pb);

    }

    A few notes about that :

    • Since all of my attempts at frame timing so far have failed, I’ve removed almost all timing-related stuff from this code to start with a clean slate.
    • Almost all error checking and cleanup omitted for brevity.
    • The reason I allocate a new output frame with a new buffer in write_frame, rather than using inframe directly, is because this is more representative of what my real application is doing. My real app also uses RGB24 internally, hence the conversions here.
    • The reason I generate a weird pattern in outframe, rather than using e.g. av_copy_frame, is because I just wanted a test pattern that compressed well with Quicktime RLE (my test input ends up generating a 1.7GB output file otherwise).
    • The input video I am using, "wildlife.wmv", can be found here. I’ve hard-coded the filenames.
    • I am aware that avcodec_decode_video2 and avcodec_encode_video2 are deprecated, but don’t care. They work fine, I’ve already struggled too much getting my head around the latest version of the API, ffmpeg changes their API with nearly every release, and I really don’t feel like dealing with avcodec_send_* and avcodec_receive_* right now.
    • I think I’m supposed to be finishing off by passing a NULL frame to avcodec_encode_video2 to flush some buffers or something, but I’m a bit confused about that. Unless somebody feels like explaining it, let’s ignore that for now; it’s a separate question. The docs are as vague about this point as they are about everything else.
    • My test input file’s frame rate is 29.97.
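
    One of the notes above mentions flushing with a NULL frame. With the deprecated avcodec_encode_video2 API, this is conventionally a drain loop run after the main read/write loop and before av_write_trailer. A minimal sketch, assuming the EncoderStuff e from the test program below (for an intra-only codec like QTRLE it will typically produce no extra packets, but this is the general shape):

```cpp
// Drain loop sketch for the deprecated encode API: keep handing the
// encoder a NULL frame until it reports no more buffered packets.
int havepacket = 0;
do {
    AVPacket packet;
    av_init_packet(&packet);
    packet.data = NULL;
    packet.size = 0;
    havepacket = 0;
    // A NULL frame asks the encoder to emit any buffered packets.
    if (avcodec_encode_video2(e.codecx, &packet, NULL, &havepacket) < 0)
        break;
    if (havepacket) {
        packet.stream_index = e.stream->index;
        av_interleaved_write_frame(e.formatx, &packet);
    }
    av_packet_unref(&packet);
} while (havepacket);
```

    This fragment assumes the surrounding program links against libavcodec/libavformat; it is not standalone.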

    Now, as for my current attempts. The following timing related fields are present in the above code, with details/confusion in bold. There’s a lot of them, because the API is mind-bogglingly convoluted :

    • main: d.stream->time_base : Input video stream time base. For my test input file this is 1/1000.
    • main: d.stream->codec->time_base : Not sure what this is (I never could make sense of why AVStream has an AVCodecContext field when you always use your own new context anyways) and also the codec field is deprecated. For my test input file this is 1/1000.
    • main: d.codecx->time_base : Input codec context time-base. For my test input file this is 0/1. Am I supposed to set it ?
    • main: e.stream->time_base : Time base of the output stream I create. What do I set this to ?
    • main: e.stream->codec->time_base : Time base of the deprecated and mysterious codec field of the output stream I create. Do I set this to anything ?
    • main: e.codecx->time_base : Time base of the encoder context I create. What do I set this to ?
    • read_frame: packet.dts : Decoding timestamp of packet read.
    • read_frame: packet.pts : Presentation timestamp of packet read.
    • read_frame: packet.duration : Duration of packet read.
    • read_frame: d.rawframe->pts : Presentation timestamp of raw frame decoded. This is always 0. Why isn’t it read by the decoder...?
    • read_frame: d.rgbframe->pts / write_frame: inframe->pts : Presentation timestamp of decoded frame converted to RGB. Not set to anything currently.
    • read_frame: d.rawframe->pkt_* : Fields copied from packet, discovered after reading this post. They are set correctly but I don’t know if they are useful.
    • write_frame: outframe->pts : Presentation timestamp of frame being encoded. Should I set this to something ?
    • write_frame: outframe->pkt_* : Timing fields from a packet. Should I set these ? They seem to be ignored by the encoder.
    • write_frame: packet.dts : Decoding timestamp of packet being encoded. What do I set it to ?
    • write_frame: packet.pts : Presentation timestamp of packet being encoded. What do I set it to ?
    • write_frame: packet.duration : Duration of packet being encoded. What do I set it to ?

    I have tried the following, with the described results. Note that inframe is d.rgbframe :

    1.  
      • Init e.stream->time_base = d.stream->time_base
      • Init e.codecx->time_base = d.codecx->time_base
      • Set d.rgbframe->pts = packet.dts in read_frame
      • Set outframe->pts = inframe->pts in write_frame
      • Result : Warning that encoder time base is not set (since d.codecx->time_base was 0/1), seg fault.
    2.  
      • Init e.stream->time_base = d.stream->time_base
      • Init e.codecx->time_base = d.stream->time_base
      • Set d.rgbframe->pts = packet.dts in read_frame
      • Set outframe->pts = inframe->pts in write_frame
      • Result : No warnings, but VLC reports frame rate as 480.048 (no idea where this number came from) and file plays too fast. Also the encoder sets all the timing fields in packet to 0, which was not what I expected. (Edit : Turns out this is because av_interleaved_write_frame, unlike av_write_frame, takes ownership of the packet and swaps it with a blank one, and I was printing the values after that call. So they are not ignored.)
    3.  
      • Init e.stream->time_base = d.stream->time_base
      • Init e.codecx->time_base = d.stream->time_base
      • Set d.rgbframe->pts = packet.dts in read_frame
      • Set any of pts/dts/duration in packet in write_frame to anything.
      • Result : Warnings about packet timestamps not set. Encoder seems to reset all packet timing fields to 0, so none of this has any effect.
    4.  
      • Init e.stream->time_base = d.stream->time_base
      • Init e.codecx->time_base = d.stream->time_base
      • I found these fields, pkt_pts, pkt_dts, and pkt_duration in AVFrame after reading this post, so I tried copying those all the way through to outframe.
      • Result : Really had my hopes up, but ended up with same results as attempt 3 (packet timestamp not set warning, incorrect results).

    I tried various other hand-wavey permutations of the above and nothing worked. What I want to do is create an output file that plays back with the same timing and frame rate as the input (29.97 constant frame rate in this case).

    So how do I do this ? Of the zillions of timing related fields here, what do I do to make the output be the same as the input ? And how do I do it in such a way that handles arbitrary video input formats that may store their time stamps and time bases in different places ? I need this to always work.
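
    For context, the pattern usually recommended for this (deprecated) API can be sketched against the test program above, assuming its names; this is a sketch of the conventional plumbing, not a verified fix:

```cpp
// 1. In main(), before avcodec_open2(e.codecx, ...): give the encoder
//    the input stream's time base instead of guessing one.
e.codecx->time_base = d.stream->time_base;

// 2. In read_frame, after decoding: carry the decoder's best-effort
//    timestamp (derived from the packet pts/dts fields) on the frame.
d.rgbframe->pts = av_frame_get_best_effort_timestamp(d.rawframe);

// 3. In write_frame, before encoding: copy the timestamp through.
outframe->pts = inframe->pts;

// 4. In write_frame, after a packet is produced and before
//    av_interleaved_write_frame: rescale from the codec time base to
//    the stream time base the muxer actually chose.
av_packet_rescale_ts(&packet, e.codecx->time_base, e.stream->time_base);
```

    The key point is step 4: avformat_write_header is free to change e.stream->time_base, so packet timestamps have to be rescaled from codec units to stream units rather than assumed equal.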


    For reference, here is a table of all the packet and frame timestamps read from the video stream of my test input file, to give a sense of what my test file looks like. None of the input packet pts’ are set, same with frame pts, and for some reason the duration of the first 108 frames is 0. VLC plays the file fine and reports the frame rate as 29.9700089 :