Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (109)

  • Enhancing it visually

    10 April 2011

    MediaSPIP is based on a system of themes and templates ("squelettes"). The templates define where information is placed on the page, defining a specific use of the platform, while the themes provide the overall graphic design.
    Anyone can propose a new graphic theme or template and make it available to the community.

  • Adding user-specific information and other author-related behaviour changes

    12 April 2011, by

    The simplest way to add information to authors is to install the Inscription3 plugin. It also allows you to modify certain user-related behaviours (see its documentation for more information).
    It is also possible to add fields to authors by installing the "champs extras 2" and "Interface pour champs extras" plugins.

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in OGV and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed in order to retrieve the data needed for search engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
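    For illustration only, here is a hedged sketch of the kind of ffmpeg conversions such a pipeline could run; the file names and codec choices below are assumptions, not MediaSPIP's actual commands or settings:

    # Hypothetical transcodes of an upload into the web formats mentioned above.
    ffmpeg -i upload.mov -c:v libvpx -c:a libvorbis video.webm
    ffmpeg -i upload.mov -c:v libtheora -c:a libvorbis video.ogv
    ffmpeg -i upload.mov -c:v libx264 -c:a aac video.mp4
    ffmpeg -i upload.wav -c:a libvorbis audio.ogg
    ffmpeg -i upload.wav -c:a libmp3lame audio.mp3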

On other sites (11917)

  • Using ffmpeg to read mp4 file from stdin and extract image is failing with "Invalid data found when processing input"

    16 March 2023, by jsindos

    Using ffmpeg, I'm trying to read an mp4 file from stdin and save the last frame of the file as a jpg:

    cat PXL_20221118_051057924.TS.mp4 | ffmpeg -sseof -3 -i - -f mp4 -update 1 -q:v 1 last.jpg

    The command works fine when not running from stdin.

    When running from stdin, an empty file is created, with the following error message:

    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fcb4d0041c0] stream 1, offset 0x2eaee0d: partial file
    pipe:: Invalid data found when processing input

    I've posted the full output below.

    ffmpeg version 5.1.2 Copyright (c) 2000-2022 the FFmpeg developers
  built with Apple clang version 14.0.0 (clang-1400.0.29.202)
  configuration: --prefix=/usr/local/Cellar/ffmpeg/5.1.2_5 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox
  libavutil      57. 28.100 / 57. 28.100
  libavcodec     59. 37.100 / 59. 37.100
  libavformat    59. 27.100 / 59. 27.100
  libavdevice    59.  7.100 / 59.  7.100
  libavfilter     8. 44.100 /  8. 44.100
  libswscale      6.  7.100 /  6.  7.100
  libswresample   4.  7.100 /  4.  7.100
  libpostproc    56.  6.100 / 56.  6.100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fcb4d0041c0] stream 2, offset 0x2c: partial file
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'pipe:':
  Metadata:
    major_brand     : isom
    minor_version   : 131072
    compatible_brands: isomiso2mp41
    creation_time   : 2022-11-18T05:11:23.000000Z
    com.android.capture.fps: 30.000000
  Duration: 00:00:23.74, start: 0.000000, bitrate: N/A
  Stream #0:0[0x1](eng): Data: none (mett / 0x7474656D), 45 kb/s (default)
    Metadata:
      creation_time   : 2022-11-18T05:11:23.000000Z
      handler_name    : MetaHandle
  Stream #0:1[0x2](eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 192 kb/s (default)
    Metadata:
      creation_time   : 2022-11-18T05:11:23.000000Z
      handler_name    : SoundHandle
      vendor_id       : [0][0][0][0]
  Stream #0:2[0x3](eng): Video: hevc (Main) (hvc1 / 0x31637668), yuvj420p(pc, bt709), 1920x1080, 19524 kb/s, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 90k tbn (default)
    Metadata:
      creation_time   : 2022-11-18T05:11:23.000000Z
      handler_name    : VideoHandle
      vendor_id       : [0][0][0][0]
    Side data:
      displaymatrix: rotation of -90.00 degrees
  Stream #0:3[0x4](eng): Data: none (mett / 0x7474656D) (default)
    Metadata:
      creation_time   : 2022-11-18T05:11:23.000000Z
      handler_name    : MetaHandle
Stream mapping:
  Stream #0:2 -> #0:0 (hevc (native) -> h264 (libx264))
  Stream #0:1 -> #0:1 (aac (native) -> aac (native))
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fcb4d0041c0] stream 1, offset 0x2eaee0d: partial file
pipe:: Invalid data found when processing input
[libx264 @ 0x7fcb4d10b500] -qscale is ignored, -crf is recommended.
[libx264 @ 0x7fcb4d10b500] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x7fcb4d10b500] profile High, level 4.0, 4:2:0, 8-bit
[libx264 @ 0x7fcb4d10b500] 264 - core 164 r3095 baee400 - H.264/MPEG-4 AVC codec - Copyleft 2003-2022 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=18 lookahead_threads=3 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to '/Users/josephtsindos/Downloads/last2.jpg':
  Metadata:
    major_brand     : isom
    minor_version   : 131072
    compatible_brands: isomiso2mp41
    com.android.capture.fps: 30.000000
    encoder         : Lavf59.27.100
  Stream #0:0(eng): Video: h264 (avc1 / 0x31637661), yuvj420p(pc), 1080x1920, q=2-31, 30 fps, 15360 tbn (default)
    Metadata:
      creation_time   : 2022-11-18T05:11:23.000000Z
      handler_name    : VideoHandle
      vendor_id       : [0][0][0][0]
      encoder         : Lavc59.37.100 libx264
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
      displaymatrix: rotation of -0.00 degrees
  Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 128 kb/s (default)
    Metadata:
      creation_time   : 2022-11-18T05:11:23.000000Z
      handler_name    : SoundHandle
      vendor_id       : [0][0][0][0]
      encoder         : Lavc59.37.100 aac
frame=    0 fps=0.0 q=0.0 Lsize=       0kB time=00:00:00.00 bitrate=N/A speed=   0x    
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
[aac @ 0x7fcb4d10c840] Qavg: nan
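    For context: -sseof asks ffmpeg to seek relative to the end of the input, and a pipe is not seekable, which is the most likely reason the piped run stops with "partial file". Note also that -f mp4, placed before the output, forces the MP4 muxer onto last.jpg, which is probably unintended. A hedged workaround sketch (not the asker's solution) is to write the stream to a seekable temporary file first; paths below are placeholders:

    # Materialise the piped data into a temporary file so ffmpeg can seek in it.
    tmp=/tmp/lastframe.$$.mp4
    cat PXL_20221118_051057924.TS.mp4 > "$tmp"    # or whatever produces the stream
    ffmpeg -sseof -3 -i "$tmp" -update 1 -q:v 1 last.jpg
    rm -f "$tmp"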


    


  • FFmpeg Kit issue in Banuba Video SDK: "FFmpegKit failed to start on brand"

    10 May 2023, by Vivek Makwana

    I am trying to integrate the Banuba Video Editor SDK in Flutter.

    The issue I am facing is as follows.

    The Banuba SDK works with native Android code in Flutter, so Banuba requires the FFmpeg dependency to be defined on the native Android side.

    I am already using the 'ffmpeg_kit_flutter' plugin for Flutter.

    So the two FFmpeg kits conflict and some of the functions are not found; because of that I am receiving the error below.

    When the app is launched I get the following crash:

    java.lang.Error: FFmpegKit failed to start on brand: google, model: sdk_gphone_x86, device: generic_x86_arm, api level: 30, abis: x86 armeabi-v7a armeabi, 32bit abis: x86 armeabi-v7a armeabi, 64bit abis: .
        at com.arthenica.ffmpegkit.NativeLoader.loadLibrary(NativeLoader.java:50)
        at com.arthenica.ffmpegkit.NativeLoader.loadFFmpegKit(NativeLoader.java:189)
        at com.arthenica.ffmpegkit.FFmpegKitConfig.<clinit>(FFmpegKitConfig.java:145)
        at com.arthenica.ffmpegkit.FFmpegKitConfig.enableFFmpegSessionCompleteCallback(FFmpegKitConfig.java:864)
        at com.arthenica.ffmpegkit.flutter.FFmpegKitFlutterPlugin.registerGlobalCallbacks(FFmpegKitFlutterPlugin.java:168)
        at com.arthenica.ffmpegkit.flutter.FFmpegKitFlutterPlugin.init(FFmpegKitFlutterPlugin.java:652)
        at com.arthenica.ffmpegkit.flutter.FFmpegKitFlutterPlugin.onAttachedToActivity(FFmpegKitFlutterPlugin.java:198)
        at io.flutter.embedding.engine.FlutterEngineConnectionRegistry.attachToActivityInternal(FlutterEngineConnectionRegistry.java:351)
        at io.flutter.embedding.engine.FlutterEngineConnectionRegistry.attachToActivity(FlutterEngineConnectionRegistry.java:324)
        at io.flutter.embedding.android.FlutterActivityAndFragmentDelegate.onAttach(FlutterActivityAndFragmentDelegate.java:194)
        at io.flutter.embedding.android.FlutterActivity.onCreate(FlutterActivity.java:498)
        at com.sound.it.MainActivity.onCreate(MainActivity.kt:43)
        at android.app.Activity.performCreate(Activity.java:8000)
        at android.app.Activity.performCreate(Activity.java:7984)
        at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1309)
        at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3422)
        at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3601)
        at android.app.servertransaction.LaunchActivityItem.execute(LaunchActivityItem.java:85)
        at android.app.servertransaction.TransactionExecutor.executeCallbacks(TransactionExecutor.java:135)
        at android.app.servertransaction.TransactionExecutor.execute(TransactionExecutor.java:95)
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2066)
        at android.os.Handler.dispatchMessage(Handler.java:106)
        at android.os.Looper.loop(Looper.java:223)
        at android.app.ActivityThread.main(ActivityThread.java:7656)
        at java.lang.reflect.Method.invoke(Native Method)
        at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:592)
        at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:947)
    Caused by: java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "av_log_default_callback" referenced by "/data/app/~~umF3qlB9U53L4peU9nKYuQ==/com.sound.it-agqiFlNmMe17AHMK5gYIHw==/lib/x86/libffmpegkit.so"...
        at java.lang.Runtime.loadLibrary0(Runtime.java:1087)
        at java.lang.Runtime.loadLibrary0(Runtime.java:1008)
        at java.lang.System.loadLibrary(System.java:1664)
        at com.arthenica.ffmpegkit.NativeLoader.loadLibrary(NativeLoader.java:48)


    Looking for solutions.
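    For context, a plausible first check (unverified) is whether the final APK ends up containing two different FFmpeg builds whose native libraries disagree on symbols such as av_log_default_callback. The APK path below is a placeholder; adjust it to the Flutter build output:

    # List the packaged native libraries and look for duplicate or mismatched FFmpeg builds.
    unzip -l build/app/outputs/flutter-apk/app-debug.apk | grep -E 'lib/.*(libav|ffmpeg)'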


  • ffmpeg: "Input/output error" reading a gif from a named pipe?

    23 October 2022, by Xirado

    I've been trying to debug this for hours now, and cannot seem to understand why it doesn't work.

    I want to overlay a png onto all frames of a gif, and since writing all the files to disk is slow, I wanted to use pipes.

    I created a named pipe using mkfifo imgstream1 and then launch ffmpeg:

    ffmpeg -report -hide_banner -v 9 -loglevel 99 -y -f gif -i imgstream1 out1.gif

    (This isn't actually useful as it just spits out the same gif again, but it's simpler for debugging.)

    After ffmpeg launches it waits for data from the pipe, so I write an actual gif file to the pipe using this Kotlin code:

    import java.io.File

    // Read the whole gif into memory and write it to the named pipe in one go.
    val backgroundFifo = File("imgstream1")
    val inputOne = File("background.gif").readBytes()
    println("background.gif -> ${inputOne.size} bytes")
    val output = backgroundFifo.outputStream()
    output.write(inputOne)
    output.flush()
    output.close()

    It prints: background.gif -> 7233247 bytes

    Now ffmpeg should read the bytes and spit out a gif, but it always seems to fail doing so:

    Opening an input file: imgstream1.
    [gif @ 0x7714180] Opening 'imgstream1' for reading
    [file @ 0x7714980] Setting default whitelist 'file,crypto,data'
    [AVIOContext @ 0x7724dc0] Statistics: 7233247 bytes read, 0 seeks
    imgstream1: Input/output error

    Sadly this does not give any further information, and I'm already on the highest log level...

    I've tried multiple things:

    • -f image2pipe -> works, but only loads the first 2 frames of the gif
    • Renaming the pipe to imgstream1.gif so ffmpeg thinks it's a gif file: no change.
    • Different ffmpeg versions
    • Using the actual gif as an ffmpeg input works

    At this point I have no idea what the problem might be, as ffmpeg seems to load all bytes. It's worth noting that doing cat imgstream1 > file.gif produces the desired gif.

    Any help would be appreciated!
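    For context, a hedged sketch of the overlay itself done over a plain stdin pipe instead of a named pipe; overlay.png is a placeholder, and the output relies on ffmpeg's default gif palette handling:

    # Read the gif from stdin and composite the png onto every frame.
    cat background.gif | ffmpeg -y -f gif -i pipe:0 -i overlay.png \
        -filter_complex "[0:v][1:v]overlay=0:0" out.gif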
