Advanced search

Media (0)

Word: - Tags -/diogene

No media matching your criteria is available on the site.

Other articles (66)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Possibility of deployment as a farm

    12 April 2011, by

    MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites.
    This makes it possible, for example: to share the set-up costs between several projects/individuals; to deploy a large number of unique sites quickly; and to avoid having to put all creations into a digital catch-all, as is the case with the big general-public platforms scattered across the (...)

  • Adding user-specific information and other author-related behaviour changes

    12 April 2011, by

    The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you modify certain user-related behaviours (refer to its documentation for more information).
    It is also possible to add fields to authors by installing the plugins Champs Extras 2 and Interface pour champs extras.

On other sites (9785)

  • Flutter (Dart): Merge two videos and view the new output in the device's gallery (photos)

    17 June 2020, by Ittai Barkai

    I am aware that there already exists a solution to a very similar question, which can be found at the following link: Flutter/Dart: Find two video segments and merge them into a single valid video file? However, being relatively new to Flutter (and programming in general), I cannot seem to replicate the desired result.

    



    My app is very simple and currently looks like this:

    (screenshot of the app omitted)
    I click on the "Record Video" button to record two videos, both of which are successfully stored in the device's gallery using the Flutter image_picker and gallery_saver packages and the following piece of code:

    



    void _recordVideo() async {
      ImagePicker.pickVideo(source: ImageSource.camera)
          .then((File recordedVideo) {
        if (recordedVideo != null && recordedVideo.path != null) {
          setState(() {
            _buttonText = 'Saving in Progress...';
          });
          GallerySaver.saveVideo(recordedVideo.path).then((_) {
            setState(() {
              _buttonText = 'Video Saved!\n\nClick to Record New Video';
              if (_storedVideoOne == null) {
                _storedVideoOne = recordedVideo;
                print('video 1 stored');
              } else {
                _storedVideoTwo = recordedVideo;
                print('video 2 stored');
                _videoMerger();
              }
            });
          });
        }
      });
    }


    



    I can view these videos when I click on the "View Video From Gallery" button at the bottom.

    



    Next I try to merge these two stored video files using the flutter_ffmpeg package, following the solution provided in the Stack Overflow question mentioned above. I try to do this with the following function I wrote:

    



    void _videoMerger() async {
      final appDir = await syspaths.getApplicationDocumentsDirectory();
      String rawDocumentPath = appDir.path;
      final outputPath = '$rawDocumentPath/output.mp4';

      final FlutterFFmpeg _flutterFFmpeg = new FlutterFFmpeg();

      String commandToExecute = '-i ${_storedVideoOne.path} -i ${_storedVideoTwo.path} -filter_complex \'[0:0][1:0]concat=n=2:v=1:a=0[out]\' -map \'[out]\' outputPath';
      _flutterFFmpeg.execute(commandToExecute).then((rc) => print("FFmpeg process exited with rc $rc"));
    }


    



    But after running the function I do not seem to get a new combined video, which should be stored at outputPath and ideally also be viewable in the gallery. I have uploaded the Flutter project to GitHub here:

    



    https://github.com/IttaiBarkai/Flutter-Video-Merger

    



    Any help would be greatly appreciated :)

    



    Updated:

    



    Below is the output displayed in my debug console when FFmpeg is executed (a possible fix suggested by this output is sketched after the log):

    



    D/flutter-ffmpeg( 4146): Running FFmpeg with arguments: [-i, /storage/emulated/0/Android/data/com.example.video_merger_two/files/Pictures/d2b7a612-7c6d-48fe-8d06-85ceeb10e2f584195978113840656.mp4, -i, /storage/emulated/0/Android/data/com.example.video_merger_two/files/Pictures/b6cb83a3-10ac-49c7-80f3-3447bebe93ac5245748251872788895.mp4, -filter_complex, [0:0][1:0]concat=n=2:v=1:a=0[out], -map, [out], outputPath.mp4].
I/mobile-ffmpeg( 4146): ffmpeg version git-2020-01-25-fd11dd500
I/mobile-ffmpeg( 4146):  Copyright (c) 2000-2020 the FFmpeg developers
I/mobile-ffmpeg( 4146):
I/mobile-ffmpeg( 4146):   built with Android (5220042 based on r346389c) clang version 8.0.7 (https://android.googlesource.com/toolchain/clang b55f2d4ebfd35bf643d27dbca1bb228957008617) (https://android.googlesource.com/toolchain/llvm 3c393fe7a7e13b0fba4ac75a01aa683d7a5b11cd) (based on LLVM 8.0.7svn)
I/mobile-ffmpeg( 4146):   configuration: --cross-prefix=i686-linux-android- --sysroot=/files/android-sdk/ndk-bundle/toolchains/llvm/prebuilt/linux-x86_64/sysroot --prefix=/home/taner/Projects/mobile-ffmpeg/prebuilt/android-x86/ffmpeg --pkg-config=/usr/bin/pkg-config --enable-version3 --arch=i686 --cpu=i686 --cc=i686-linux-android24-clang --cxx=i686-linux-android24-clang++ --target-os=android --disable-neon --disable-asm --disable-inline-asm --enable-cross-compile --enable-pic --enable-jni --enable-optimizations --enable-swscale --enable-shared --disable-v4l2-m2m --disable-outdev=v4l2 --disable-outdev=fbdev --disable-indev=v4l2 --disable-indev=fbdev --enable-small --disable-openssl --disable-xmm-clobber-test --disable-debug --enable-lto --disable-neon-clobber-test --disable-programs --disable-postproc --disable-doc --disable-htmlpages --disable-manpages --disable-podpages --disable-txtpages --disable-static --disable-sndio --disable-schannel --disable-securetransport --disable-xlib --disable-cuda --disable-cuvid --disable-nvenc --di
I/mobile-ffmpeg( 4146):   libavutil      56. 38.100 / 56. 38.100
I/mobile-ffmpeg( 4146):   libavcodec     58. 65.102 / 58. 65.102
I/mobile-ffmpeg( 4146):   libavformat    58. 35.101 / 58. 35.101
I/mobile-ffmpeg( 4146):   libavdevice    58.  9.103 / 58.  9.103
I/mobile-ffmpeg( 4146):   libavfilter     7. 70.101 /  7. 70.101
I/mobile-ffmpeg( 4146):   libswscale      5.  6.100 /  5.  6.100
I/mobile-ffmpeg( 4146):   libswresample   3.  6.100 /  3.  6.100
I/mobile-ffmpeg( 4146): Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Android/data/com.example.video_merger_two/files/Pictures/d2b7a612-7c6d-48fe-8d06-85ceeb10e2f584195978113840656.mp4':
I/mobile-ffmpeg( 4146):   Metadata:
I/mobile-ffmpeg( 4146):     major_brand     : mp42
I/mobile-ffmpeg( 4146):     minor_version   : 0
I/mobile-ffmpeg( 4146):     compatible_brands: isommp42
I/mobile-ffmpeg( 4146):     creation_time   : 2020-06-17T12:07:20.000000Z
I/mobile-ffmpeg( 4146):     com.android.version: 10
I/mobile-ffmpeg( 4146):   Duration: 27:34:19.40, start: 0.000000, bitrate: 0 kb/s
I/mobile-ffmpeg( 4146):     Stream #0:0(eng): Video: h264 (avc1 / 0x31637661), yuv420p(tv, GBR), 1280x720, 3536 kb/s, SAR 1:1 DAR 16:9, 28.75 fps, 29.08 tbr, 90k tbn, 180k tbc (default)
I/mobile-ffmpeg( 4146):     Metadata:
I/mobile-ffmpeg( 4146):       rotate          : 90
I/mobile-ffmpeg( 4146):       creation_time   : 2020-06-17T12:07:20.000000Z
I/mobile-ffmpeg( 4146):       handler_name    : VideoHandle
I/mobile-ffmpeg( 4146):     Side data:
I/mobile-ffmpeg( 4146):       displaymatrix: rotation of -90.00 degrees
I/mobile-ffmpeg( 4146):     Stream #0:1(eng): Audio: amr_nb (samr / 0x726D6173), 8000 Hz, mono, flt, 12 kb/s (default)
I/mobile-ffmpeg( 4146):     Metadata:
I/mobile-ffmpeg( 4146):       creation_time   : 2020-06-17T12:07:20.000000Z
I/mobile-ffmpeg( 4146):       handler_name    : SoundHandle
I/mobile-ffmpeg( 4146): Input #1, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Android/data/com.example.video_merger_two/files/Pictures/b6cb83a3-10ac-49c7-80f3-3447bebe93ac5245748251872788895.mp4':
I/mobile-ffmpeg( 4146):   Metadata:
I/mobile-ffmpeg( 4146):     major_brand     : mp42
I/mobile-ffmpeg( 4146):     minor_version   : 0
I/mobile-ffmpeg( 4146):     compatible_brands: isommp42
I/mobile-ffmpeg( 4146):     creation_time   : 2020-06-17T12:07:32.000000Z
I/mobile-ffmpeg( 4146):     com.android.version: 10
I/mobile-ffmpeg( 4146):   Duration: 27:34:19.35, start: 0.000000, bitrate: 0 kb/s
I/mobile-ffmpeg( 4146):     Stream #1:0(eng): Video: h264 (avc1 / 0x31637661), yuv420p(tv, GBR), 1280x720, 3561 kb/s, SAR 1:1 DAR 16:9, 28.95 fps, 29 tbr, 90k tbn, 180k tbc (default)
I/mobile-ffmpeg( 4146):     Metadata:
I/mobile-ffmpeg( 4146):       rotate          : 90
I/mobile-ffmpeg( 4146):       creation_time   : 2020-06-17T12:07:32.000000Z
I/mobile-ffmpeg( 4146):       handler_name    : VideoHandle
I/mobile-ffmpeg( 4146):     Side data:
I/mobile-ffmpeg( 4146):       displaymatrix: rotation of -90.00 degrees
I/mobile-ffmpeg( 4146):     Stream #1:1(eng): Audio: amr_nb (samr / 0x726D6173), 8000 Hz, mono, flt, 12 kb/s (default)
I/mobile-ffmpeg( 4146):     Metadata:
I/mobile-ffmpeg( 4146):       creation_time   : 2020-06-17T12:07:32.000000Z
I/mobile-ffmpeg( 4146):       handler_name    : SoundHandle
E/mobile-ffmpeg( 4146): outputPath.mp4: Read-only file system
D/flutter-ffmpeg( 4146): FFmpeg exited with rc: 1
I/flutter ( 4146): FFmpeg process exited with rc 1
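
    A hint from the log above: FFmpeg tried to write to a file literally named "outputPath.mp4" ("outputPath.mp4: Read-only file system"), which suggests the outputPath variable is never interpolated into commandToExecute. A minimal sketch of the corrected command string, assuming the rest of _videoMerger() stays as posted:

    // Hypothetical fix: interpolate the variable with $outputPath instead of
    // passing the literal word "outputPath".
    String commandToExecute =
        '-i ${_storedVideoOne.path} -i ${_storedVideoTwo.path} '
        '-filter_complex \'[0:0][1:0]concat=n=2:v=1:a=0[out]\' '
        '-map \'[out]\' $outputPath';
    _flutterFFmpeg.execute(commandToExecute)
        .then((rc) => print("FFmpeg process exited with rc $rc"));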


    


  • FFMPEG Video to Audio Conversion Results in Different Durations

    10 June 2020, by Eric J

    I am trying to convert an MP4 file into a mono WAV file sampled at 16,000 Hz.

    



    When I run the command below, the duration goes from 00:09:59.99 (MP4) to 00:09:57.64 (WAV). Its original, longer version goes from 00:48:37.46 (MP4) to 00:48:23.38 (WAV). (A quick ffprobe check of both durations is sketched after the log.)

    



    ffmpeg -i short.mp4 -ac 1 -ar 16000 short.wav


    



    I've also tried the command below. The result is much worse, going from 00:09:59.99 (MP4) to 00:12:56.29 (AAC). (See the note on raw AAC after the log.)

    



    ffmpeg -i .mp4 -vn -acodec copy .aac


    



    Attaching the log:

    



    Report written to "ffmpeg-20200610-093115.log"
Command line:
ffmpeg -i short.mp4 -ac 1 -ar 16000 short.wav -report
ffmpeg version 4.1.1 Copyright (c) 2000-2019 the FFmpeg developers
  built with Apple LLVM version 10.0.0 (clang-1000.11.45.5)
  configuration: --prefix=/usr/local/Cellar/ffmpeg/4.1.1 --enable-shared --enable-pthreads --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags='-I/Library/Java/JavaVirtualMachines/openjdk-11.0.2.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/openjdk-11.0.2.jdk/Contents/Home/include/darwin' --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libmp3lame --enable-libopus --enable-librubberband --enable-libsnappy --enable-libtesseract --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-videotoolbox --disable-libjack --disable-indev=jack --enable-libaom --enable-libsoxr
  libavutil      56. 22.100 / 56. 22.100
  libavcodec     58. 35.100 / 58. 35.100
  libavformat    58. 20.100 / 58. 20.100
  libavdevice    58.  5.100 / 58.  5.100
  libavfilter     7. 40.101 /  7. 40.101
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  3.100 /  5.  3.100
  libswresample   3.  3.100 /  3.  3.100
  libpostproc    55.  3.100 / 55.  3.100
Splitting the commandline.
Reading option '-i' ... matched as input url with argument 'short.mp4'.
Reading option '-ac' ... matched as option 'ac' (set number of audio channels) with argument '1'.
Reading option '-ar' ... matched as option 'ar' (set audio sampling rate (in Hz)) with argument '16000'.
Reading option 'short.wav' ... matched as output url.
Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option report (generate a report) with argument 1.
Successfully parsed a group of options.
Parsing a group of options: input url short.mp4.
Successfully parsed a group of options.
Opening an input file: short.mp4.
[NULL @ 0x7f98a3008200] Opening 'short.mp4' for reading
[file @ 0x7f98a2904440] Setting default whitelist 'file,crypto'
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] ISO: File Type Major Brand: mp42
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] Unknown dref type 0x206c7275 size 12
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] Processing st: 0, edit list 0 - media time: 0, duration: 7679872
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] Unknown dref type 0x206c7275 size 12
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] Processing st: 1, edit list 0 - media time: 1024, duration: 26459559
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] drop a frame at curr_cts: 0 @ 0
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] Before avformat_find_stream_info() pos: 11213917 bytes read:318782 seeks:1 nb_streams:2
[h264 @ 0x7f98a3808800] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 0x7f98a3808800] nal_unit_type: 8(PPS), nal_ref_idc: 3
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] demuxer injecting skip 1024 / discard 0
[aac @ 0x7f98a1008c00] skip 1024 / discard 0 samples due to side data
[h264 @ 0x7f98a3808800] nal_unit_type: 6(SEI), nal_ref_idc: 0
[h264 @ 0x7f98a3808800] nal_unit_type: 5(IDR), nal_ref_idc: 3
[h264 @ 0x7f98a3808800] Format yuv420p chosen by get_format().
[h264 @ 0x7f98a3808800] Reinit context to 640x368, pix_fmt: yuv420p
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] All info found
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f98a3008200] After avformat_find_stream_info() pos: 21961 bytes read:351550 seeks:2 frames:46
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'short.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 1
    compatible_brands: isommp41mp42
    creation_time   : 2020-06-10T16:12:17.000000Z
  Duration: 00:09:59.99, start: 0.000000, bitrate: 149 kb/s
    Stream #0:0(eng), 1, 1/12800: Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 640x360 [SAR 1:1 DAR 16:9], 47 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
    Metadata:
      creation_time   : 2020-06-10T16:12:17.000000Z
      handler_name    : Core Media Video
    Stream #0:1(eng), 45, 1/44100: Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 98 kb/s (default)
    Metadata:
      creation_time   : 2020-06-10T16:12:17.000000Z
      handler_name    : Core Media Audio
Successfully opened the file.
Parsing a group of options: output url short.wav.
Applying option ac (set number of audio channels) with argument 1.
Applying option ar (set audio sampling rate (in Hz)) with argument 16000.
Successfully parsed a group of options.
Opening an output file: short.wav.
[file @ 0x7f98a0c1db40] Setting default whitelist 'file,crypto'
Successfully opened the file.
Stream mapping:
  Stream #0:1 -> #0:0 (aac (native) -> pcm_s16le (native))
Press [q] to stop, [?] for help
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[aac @ 0x7f98a100de00] skip 1024 / discard 0 samples due to side data
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
detected 12 logical cores
[graph_0_in_0_1 @ 0x7f98a0e2c4c0] Setting 'time_base' to value '1/44100'
[graph_0_in_0_1 @ 0x7f98a0e2c4c0] Setting 'sample_rate' to value '44100'
[graph_0_in_0_1 @ 0x7f98a0e2c4c0] Setting 'sample_fmt' to value 'fltp'
[graph_0_in_0_1 @ 0x7f98a0e2c4c0] Setting 'channel_layout' to value '0x4'
[graph_0_in_0_1 @ 0x7f98a0e2c4c0] tb:1/44100 samplefmt:fltp samplerate:44100 chlayout:0x4
[format_out_0_0 @ 0x7f98a0e2cb80] Setting 'sample_fmts' to value 's16'
[format_out_0_0 @ 0x7f98a0e2cb80] Setting 'sample_rates' to value '16000'
[format_out_0_0 @ 0x7f98a0e2cb80] Setting 'channel_layouts' to value '0x4'
[format_out_0_0 @ 0x7f98a0e2cb80] auto-inserting filter 'auto_resampler_0' between the filter 'Parsed_anull_0' and the filter 'format_out_0_0'
[AVFilterGraph @ 0x7f98a0c16ac0] query_formats: 4 queried, 6 merged, 3 already done, 0 delayed
[auto_resampler_0 @ 0x7f98a0e2d540] [SWR @ 0x7f98a28e1000] Using fltp internally between filters
[auto_resampler_0 @ 0x7f98a0e2d540] ch:1 chl:mono fmt:fltp r:44100Hz -> ch:1 chl:mono fmt:s16 r:16000Hz
Output #0, wav, to 'short.wav':
  Metadata:
    major_brand     : mp42
    minor_version   : 1
    compatible_brands: isommp41mp42
    ISFT            : Lavf58.20.100
    Stream #0:0(eng), 0, 1/16000: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 16000 Hz, mono, s16, 256 kb/s (default)
    Metadata:
      creation_time   : 2020-06-10T16:12:17.000000Z
      handler_name    : Core Media Audio
      encoder         : Lavc58.35.100 pcm_s16le
size=   17152kB time=00:09:16.63 bitrate= 252.4kbits/s speed=1.11e+03x    
[out_0_0 @ 0x7f98a0e2c700] EOF on sink link out_0_0:default.
No more output streams to write to, finishing.
size=   18676kB time=00:09:59.99 bitrate= 255.0kbits/s speed=1.11e+03x    
video:0kB audio:18676kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.000408%
Input file #0 (short.mp4):
  Input stream #0:0 (video): 1 packets read (3689 bytes); 
  Input stream #0:1 (audio): 25739 packets read (7375414 bytes); 25738 frames decoded (26355712 samples); 
  Total: 25740 packets (7379103 bytes) demuxed
Output file #0 (short.wav):
  Output stream #0:0 (audio): 25739 frames encoded (9562163 samples); 25739 packets muxed (19124326 bytes); 
  Total: 25739 packets (19124326 bytes) muxed
25738 frames successfully decoded, 0 decoding errors
[AVIOContext @ 0x7f98a0c1dc40] Statistics: 4 seeks, 76 writeouts
[AVIOContext @ 0x7f98a29045c0] Statistics: 10902846 bytes read, 29 seeks
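
    One way to see whether the missing seconds are real or only a metadata artefact is to compare the source's audio stream duration with the output's format duration using ffprobe. A minimal check, assuming the same short.mp4 / short.wav files from the report above:

    ffprobe -v error -select_streams a:0 -show_entries stream=duration -of default=noprint_wrappers=1:nokey=1 short.mp4
    ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 short.wav

    As for the second command, a raw .aac (ADTS) stream carries no container-level duration, so the 00:12:56.29 figure is most likely an estimate derived from the bitrate rather than a real change in length. If the goal is to extract the audio without re-encoding, copying the AAC stream into an M4A container preserves the original timing metadata; a sketch, again assuming the short.mp4 source:

    ffmpeg -i short.mp4 -vn -acodec copy short.m4a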


    


  • How to export audio from an iPhone video file with FFmpeg?

    11 January 2020, by DanScripter

    UPDATE 1: the problem resides in the input read stream (see below).

    I am using fluent-ffmpeg (version 2.1.2) to get a .mp3 file out of a video file that I input as a stream via .createReadStream(). I output the file as a .pipe() to a remoteWriteStream.

    ffmpeg -i pipe:0 -vn -f mp3 -acodec mp3 -movflags frag_keyframe+empty_moov pipe:1

    This works fine with .mp4, .webm and .mov (codec: prores) files.

    But it somehow does not want to work with a .mov from an iPhone 11. FFmpeg does not give me any error when running the above command. It creates the .mp3 file, but the size is just 152 B and it is not playable.

    I ffprobed the iPhone .mov file and it returns this:

    "{ streams:
      [ { index: 0,
          codec_name: 'hevc',
          codec_long_name: 'H.265 / HEVC (High Efficiency Video Coding)',
          profile: 'Main',
          codec_type: 'video',
          codec_time_base: '1111/33300',
          codec_tag_string: 'hvc1',
          codec_tag: '0x31637668',
          width: 1920,
          height: 1080,
          coded_width: 1920,
          coded_height: 1088,
          has_b_frames: 2,
          sample_aspect_ratio: '0:1',
          display_aspect_ratio: '0:1',
          pix_fmt: 'yuv420p',
          level: 120,
          color_range: 'tv',
          color_space: 'bt709',
          color_transfer: 'bt709',
          color_primaries: 'bt709',
          chroma_location: 'unspecified',
          field_order: 'unknown',
          timecode: 'N/A',
          refs: 1,
          id: 'N/A',
          r_frame_rate: '30000/1001',
          avg_frame_rate: '33300/1111',
          time_base: '1/600',
          start_pts: 'N/A',
          start_time: 'N/A',
          duration_ts: 6666,
          duration: 11.11,
          bit_rate: 7611708,
          max_bit_rate: 'N/A',
          bits_per_raw_sample: 'N/A',
          nb_frames: 333,
          nb_read_frames: 'N/A',
          nb_read_packets: 'N/A',
          tags: [Object],
          disposition: [Object] },
        { index: 1,
          codec_name: 'aac',
          codec_long_name: 'AAC (Advanced Audio Coding)',
          profile: 'unknown',
          codec_type: 'audio',
          codec_time_base: '1/44100',
          codec_tag_string: 'mp4a',
          codec_tag: '0x6134706d',
          sample_fmt: 'fltp',
          sample_rate: 44100,
          channels: 2,
          channel_layout: 'stereo',
          bits_per_sample: 0,
          id: 'N/A',
          r_frame_rate: '0/0',
          avg_frame_rate: '0/0',
          time_base: '1/44100',
          start_pts: 'N/A',
          start_time: 'N/A',
          duration_ts: 489951,
          duration: 11.11,
          bit_rate: 135091,
          max_bit_rate: 192000,
          bits_per_raw_sample: 'N/A',
          nb_frames: 481,
          nb_read_frames: 'N/A',
          nb_read_packets: 'N/A',
          tags: [Object],
          disposition: [Object] },
        { index: 2,
          codec_name: 'unknown',
          codec_long_name: 'unknown',
          profile: 'unknown',
          codec_type: 'data',
          codec_tag_string: 'mebx',
          codec_tag: '0x7862656d',
          id: 'N/A',
          r_frame_rate: '0/0',
          avg_frame_rate: '0/0',
          time_base: '1/600',
          start_pts: 'N/A',
          start_time: 'N/A',
          duration_ts: 6666,
          duration: 11.11,
          bit_rate: 2670,
          max_bit_rate: 'N/A',
          bits_per_raw_sample: 'N/A',
          nb_frames: 38,
          nb_read_frames: 'N/A',
          nb_read_packets: 'N/A',
          tags: [Object],
          disposition: [Object] },
        { index: 3,
          codec_name: 'unknown',
          codec_long_name: 'unknown',
          profile: 'unknown',
          codec_type: 'data',
          codec_tag_string: 'mebx',
          codec_tag: '0x7862656d',
          id: 'N/A',
          r_frame_rate: '0/0',
          avg_frame_rate: '0/0',
          time_base: '1/600',
          start_pts: 'N/A',
          start_time: 'N/A',
          duration_ts: 6666,
          duration: 11.11,
          bit_rate: 7,
          max_bit_rate: 'N/A',
          bits_per_raw_sample: 'N/A',
          nb_frames: 1,
          nb_read_frames: 'N/A',
          nb_read_packets: 'N/A',
          tags: [Object],
          disposition: [Object] },
        { index: 4,
          codec_name: 'unknown',
          codec_long_name: 'unknown',
          profile: 'unknown',
          codec_type: 'data',
          codec_tag_string: 'mebx',
          codec_tag: '0x7862656d',
          id: 'N/A',
          r_frame_rate: '0/0',
          avg_frame_rate: '0/0',
          time_base: '1/600',
          start_pts: 'N/A',
          start_time: 'N/A',
          duration_ts: 6666,
          duration: 11.11,
          bit_rate: 18117,
          max_bit_rate: 'N/A',
          bits_per_raw_sample: 'N/A',
          nb_frames: 333,
          nb_read_frames: 'N/A',
          nb_read_packets: 'N/A',
          tags: [Object],
          disposition: [Object] } ],
     format:
      { filename: 'pipe:0',
        nb_streams: 5,
        nb_programs: 0,
        format_name: 'mov,mp4,m4a,3gp,3g2,mj2',
        format_long_name: 'QuickTime / MOV',
        start_time: 'N/A',
        duration: 11.11,
        size: 'N/A',
        bit_rate: 'N/A',
        probe_score: 100,
        tags:
         { major_brand: 'qt  ',
           minor_version: '0',
           compatible_brands: 'qt  ',
           creation_time: '2020-01-11T12:33:36.000000Z',
           'com.apple.quicktime.make': 'Apple',
           'com.apple.quicktime.model': 'iPhone 11',
           'com.apple.quicktime.software': '13.3',
           'com.apple.quicktime.creationdate': '2020-01-11T13:33:36+0100' } },
     chapters: [] }"

    The created .mp3 file results in undefined.

    I already tried taking the .mov, re-encoding it to an .mp4 via ffmpeg, and then running the above code to get the .mp3 - it still does not work.

    Any advice on how I can make this work?

    Thanks!


    UPDATE 1
    The problem resides in the input stream! When I download the file to the local machine and input it as a local file, not as a read stream, it works perfectly.

    I am creating the stream from a Google Cloud bucket like this:

    const myBucket = storage.bucket('myBucket');
    const remoteReadStream = myBucket.file(file).createReadStream();

    Since this code works perfectly with all other codecs, what may be the issue when creating the read stream from Google Cloud for an H.265 file?
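
    For reference: since Update 1 shows the same command succeeding once the file is on disk, one workaround is to download the object to a temporary file and give fluent-ffmpeg a seekable local path instead of pipe:0 (on a pipe, FFmpeg cannot seek back to a moov atom stored at the end of the file, which iPhone recordings may have). A rough sketch, reusing the storage, myBucket, file and remoteWriteStream names from the question:

    const os = require('os');
    const path = require('path');
    const ffmpeg = require('fluent-ffmpeg');

    async function extractMp3(file, remoteWriteStream) {
      const localPath = path.join(os.tmpdir(), path.basename(file));

      // Download the object to a temp file so FFmpeg gets a seekable input.
      await myBucket.file(file).download({ destination: localPath });

      ffmpeg(localPath)
        .noVideo()                 // same intent as -vn in the original command
        .audioCodec('libmp3lame')  // explicit MP3 encoder instead of '-acodec mp3'
        .format('mp3')
        .pipe(remoteWriteStream, { end: true });
    }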