
Other articles (67)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Updating from version 0.1 to 0.2

    24 June 2013, by

    Explanation of the notable changes involved in moving from MediaSPIP version 0.1 to version 0.3. What's new
    Regarding software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to make other modifications (...)

On other sites (8570)

  • ffmpeg: "Input/output error" reading a gif from a named pipe?

    23 October 2022, by Xirado

    I've been trying to debug this for hours now, and cannot seem to understand why it doesn't work.

    


    I want to overlay a png onto all frames of a gif, and since writing all the files to disk is slow, I wanted to use pipes.
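    (For the overlay step itself, separate from the pipe problem, ffmpeg's overlay filter composites one input onto another. A minimal sketch, assuming hypothetical file names watermark.png and overlaid.gif alongside the background.gif used below:)

ffmpeg -i background.gif -i watermark.png -filter_complex "[0:v][1:v]overlay=0:0" overlaid.gif

    (overlay=0:0 places the png at the top-left corner; the rest of this question is about feeding the gif in without touching the disk.)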

    


    I created a named pipe using mkfifo imgstream1 and then launched ffmpeg:

    


    ffmpeg -report -hide_banner -v 9 -loglevel 99 -y -f gif -i imgstream1 out1.gif


    


    (This isn't actually useful as it just spits out the same gif again, but it's simpler for debugging.)

    


    After ffmpeg launches it waits for data from the pipe, so I write an actual gif file to the pipe using this Kotlin code:

    


// Read the whole gif into memory
val backgroundFifo = File("imgstream1")
val inputOne = File("background.gif").readBytes()
println("background.gif -> ${inputOne.size} bytes")

// Opening the fifo for writing blocks until a reader (ffmpeg) opens the other end
val output = backgroundFifo.outputStream()
output.write(inputOne)
output.flush()
output.close()


    


    It prints: background.gif -> 7233247 bytes

    


    Now ffmpeg should read the bytes and spit out a gif, but it always seems to fail doing so.

    


    Opening an input file: imgstream1.
[gif @ 0x7714180] Opening 'imgstream1' for reading
[file @ 0x7714980] Setting default whitelist 'file,crypto,data'
[AVIOContext @ 0x7724dc0] Statistics: 7233247 bytes read, 0 seeks
imgstream1: Input/output error


    


    Sadly this does not give any further information; I'm already on the highest log level...

    


    I've tried multiple things:

    


      

    • -f image2pipe -> works, but only loads the first 2 frames of the gif
    • Renaming the pipe to imgstream1.gif so ffmpeg thinks it's a gif file: no change
    • Different ffmpeg versions
    • Using the actual gif directly as an ffmpeg input works


    At this point I have no idea what the problem might be, as ffmpeg seems to load all bytes.
    It's worth noting that doing cat imgstream1 > file.gif produces the desired gif.
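    (Since cat shows the bytes themselves arrive intact, one workaround worth sketching, not from the original post, is to skip the named pipe and stream the gif straight into ffmpeg's standard input, which ffmpeg can read as pipe:0. A minimal Kotlin sketch under that assumption:)

import java.io.File

fun main() {
    // Launch ffmpeg reading the gif from stdin (pipe:0) instead of a named pipe.
    // -loglevel error keeps ffmpeg's output small so the output pipe does not fill up
    // while we are still writing the input.
    val process = ProcessBuilder(
        "ffmpeg", "-y", "-loglevel", "error", "-f", "gif", "-i", "pipe:0", "out1.gif"
    ).redirectErrorStream(true).start()

    // Stream the gif bytes into ffmpeg's stdin, then close it so ffmpeg sees EOF.
    process.outputStream.use { stdin ->
        File("background.gif").inputStream().use { gif -> gif.copyTo(stdin) }
    }

    // Print whatever ffmpeg reported and wait for it to exit.
    process.inputStream.bufferedReader().forEachLine { println(it) }
    println("ffmpeg exited with ${process.waitFor()}")
}

    (Like the fifo, stdin is not seekable, so a demuxer that needs to seek can still complain, but this removes the mkfifo bookkeeping and ties ffmpeg's lifetime to the writer.)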

    


    Any help would be appreciated!

    


  • FFmpegKit issue in Banuba Video SDK: "FFmpegKit failed to start on brand"

    10 May 2023, by Vivek Makwana

    I am trying to integrate the Banuba Video Editor SDK into a Flutter app.

    


    The issue I am facing is as follows.

    


    The Banuba SDK works through native Android code in Flutter.

    


    So Banuba requires an FFmpeg dependency to be defined on the native Android side.

    


    And I am already using the 'ffmpeg_kit_flutter' plugin in Flutter.

    


    As a result, the two FFmpeg kits conflict and some functions are not found, which leads to the error below.

    


    When the app is launched, I get this crash:

    


java.lang.Error: FFmpegKit failed to start on brand: google, model: sdk_gphone_x86, device: generic_x86_arm, api level: 30, abis: x86 armeabi-v7a armeabi, 32bit abis: x86 armeabi-v7a armeabi, 64bit abis: .
    at com.arthenica.ffmpegkit.NativeLoader.loadLibrary(NativeLoader.java:50)
    at com.arthenica.ffmpegkit.NativeLoader.loadFFmpegKit(NativeLoader.java:189)
    at com.arthenica.ffmpegkit.FFmpegKitConfig.<clinit>(FFmpegKitConfig.java:145)
    at com.arthenica.ffmpegkit.FFmpegKitConfig.enableFFmpegSessionCompleteCallback(FFmpegKitConfig.java:864)
    at com.arthenica.ffmpegkit.flutter.FFmpegKitFlutterPlugin.registerGlobalCallbacks(FFmpegKitFlutterPlugin.java:168)
    at com.arthenica.ffmpegkit.flutter.FFmpegKitFlutterPlugin.init(FFmpegKitFlutterPlugin.java:652)
    at com.arthenica.ffmpegkit.flutter.FFmpegKitFlutterPlugin.onAttachedToActivity(FFmpegKitFlutterPlugin.java:198)
    at io.flutter.embedding.engine.FlutterEngineConnectionRegistry.attachToActivityInternal(FlutterEngineConnectionRegistry.java:351)
    at io.flutter.embedding.engine.FlutterEngineConnectionRegistry.attachToActivity(FlutterEngineConnectionRegistry.java:324)
    at io.flutter.embedding.android.FlutterActivityAndFragmentDelegate.onAttach(FlutterActivityAndFragmentDelegate.java:194)
    at io.flutter.embedding.android.FlutterActivity.onCreate(FlutterActivity.java:498)
    at com.sound.it.MainActivity.onCreate(MainActivity.kt:43)
    at android.app.Activity.performCreate(Activity.java:8000)
    at android.app.Activity.performCreate(Activity.java:7984)
    at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1309)
    at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3422)
    at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3601)
    at android.app.servertransaction.LaunchActivityItem.execute(LaunchActivityItem.java:85)
    at android.app.servertransaction.TransactionExecutor.executeCallbacks(TransactionExecutor.java:135)
    at android.app.servertransaction.TransactionExecutor.execute(TransactionExecutor.java:95)
    at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2066)
    at android.os.Handler.dispatchMessage(Handler.java:106)
    at android.os.Looper.loop(Looper.java:223)
    at android.app.ActivityThread.main(ActivityThread.java:7656)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:592)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:947)
Caused by: java.lang.UnsatisfiedLinkError: dlopen failed: cannot locate symbol "av_log_default_callback" referenced by "/data/app/~~umF3qlB9U53L4peU9nKYuQ==/com.sound.it-agqiFlNmMe17AHMK5gYIHw==/lib/x86/libffmpegkit.so"...
    at java.lang.Runtime.loadLibrary0(Runtime.java:1087)
    at java.lang.Runtime.loadLibrary0(Runtime.java:1008)
    at java.lang.System.loadLibrary(System.java:1664)
    at com.arthenica.ffmpegkit.NativeLoader.loadLibrary(NativeLoader.java:48)
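    (The Caused by line, dlopen being unable to locate the symbol "av_log_default_callback" for libffmpegkit.so, is typically a sign that two different FFmpeg native builds are packaged into the APK and the library resolves against the wrong one, which matches the conflict described above. One way this is often approached is to keep only a single FFmpeg build in the app, for example by excluding one side's transitive FFmpeg artifact. A hedged Gradle (Kotlin DSL) sketch only; the artifact coordinates below are hypothetical placeholders, not taken from the Banuba or ffmpeg_kit_flutter documentation:)

// app/build.gradle.kts -- sketch with hypothetical coordinates
dependencies {
    implementation("com.banuba.sdk:ve-sdk:1.0.0") {                  // hypothetical Banuba artifact
        // Assumption: Banuba pulls in its own FFmpeg build; exclude it so that only
        // the native libraries bundled by ffmpeg_kit_flutter are packaged.
        exclude(group = "com.arthenica", module = "ffmpeg-kit-full") // hypothetical module name
    }
}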


    Looking for a solution.


  • Is there a way to cut movement "dead air" on a screen recording? [closed]

    16 May 2023, by Raelbe

    I have got a couple of screen recordings of a painting I've done, and I've managed to concat the files together.


    Unfortunately, there is a lot of "dead air" in the video (where I have left my desk, so there is no movement happening on screen). Is there a way to cut out this downtime?


    I found an example that another artist uses for his screen recordings, so I plugged in my own file paths. This is what I used:


    .\ffmpeg -f concat -safe 0 -i "merge.txt" -vf npdecimate=hi=64*12:lo=64*5:frac=0.33,seipts=N/30/TB,"setpts=0.25*PTS" -r 30 -crf 30 -an Illu_Test.mp4


    I got this error message at the end:


    [AVFilterGraph @ 000001cadfe5b1c0] No option name near 'N/30/TB'
    [AVFilterGraph @ 000001cadfe5b1c0] Error parsing a filter description around: ,setpts=0.25*PTS
    [AVFilterGraph @ 000001cadfe5b1c0] Error parsing filterchain 'npdecimate=hi=64*12:lo=64*5:frac=0.33,seipts=N/30/TB,setpts=0.25*PTS' around: ,setpts=0.25*PTS
    Error reinitializing filters!
    Failed to inject frame into filter network: Invalid argument
    Error while processing the decoded data for stream #0:0
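    (The parse errors above are consistent with misspelled filter names in the chain: npdecimate should be mpdecimate and seipts should be setpts. A hedged sketch of the corrected command, keeping the same parameters and quoting the whole filtergraph, untested against these files:)

    .\ffmpeg -f concat -safe 0 -i "merge.txt" -vf "mpdecimate=hi=64*12:lo=64*5:frac=0.33,setpts=N/30/TB,setpts=0.25*PTS" -r 30 -crf 30 -an Illu_Test.mp4

    (mpdecimate drops near-duplicate frames, the first setpts rebuilds timestamps for 30 fps output, and the second setpts speeds the result up 4x.)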


    So I chopped it up a bit; this is what I used to concat the files, and it worked perfectly:


    .\ffmpeg -f concat -safe 0 -i "merge.txt" -crf 30 -an Illu_Test.mp4


    Now I'm looking to cut out the seconds with no movement. I'm unsure what the -crf option does (as stated, I am brand new to this). The original artist states that:


    "This is the tolerance level that determines whether there has been enough change between frames or not to be considered as detected motion."


    Any help would be appreciated.


    (Apologies if the format of this question is wrong)
