
Other articles (31)

  • Customising by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present changes to your MédiaSPIP, or news about your projects, on your MédiaSPIP using the news section.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the form used to create a news item.
    In the news item creation form, for a document of the "news item" type, the fields offered by default are: publication date (customise the publication date) (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed Médiaspip is at version 0.2 or higher. If in doubt, contact the administrator of your MédiaSpip to find out.

On other sites (4839)

  • FFmpegKitFlutter, Error : MissingPluginException(No implementation found for method ffmpegSession on channel flutter.arthenica.com/ffmpeg_kit)

    12 February, by 7Solomon

    I have this simple Flutter function to compress audio files:

    


    import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
    import 'package:ffmpeg_kit_flutter/return_code.dart';

    Future<String> compressAudio(String inputPath) async {
        try {
          //     ...
          final command = '-y -i "$inputPath" ' // Input file
              '-af "loudnorm=I=-16:TP=-1.5:LRA=11,' // Loudness normalization
              'acompressor=threshold=${_config['threshold_db']}:'
              'ratio=${_config['ratio']}:'
              'attack=${_config['attack']}:'
              'release=${_config['release']},'
              'highpass=f=20,lowpass=f=20000" ' // Audio filters
              '-ar 44100 ' // Sample rate
              '-b:a 128k ' // Bitrate
              '-codec:a libmp3lame ' // MP3 encoder
              '-q:a 2 ' // Quality setting for LAME (0-9, lower is better)
              '-map_metadata 0 ' // Copy metadata
              '"$outputPath"'; // Output file

          // Execute FFmpeg command
          final session = await FFmpegKit.execute(command);
          final returnCode = await session.getReturnCode();
          final logs = await session.getLogs();
          print('FFmpeg logs: $logs');

          if (ReturnCode.isSuccess(returnCode)) {
            return outputFileName;
          } else {
            final logs = await session.getLogs();
            throw Exception(
                'FFmpeg process failed with code $returnCode\nLogs: $logs');
          }
        } catch (e, stackTrace) {
          print('Error: $e');
          print('Stack trace: $stackTrace');
          throw Exception('Failed to compress audio: $e\nStack trace: $stackTrace');
        }
      }


    And I get this error:

    Error: MissingPluginException(No implementation found for method ffmpegSession on channel flutter.arthenica.com/ffmpeg_kit)


    This is the stack trace:


    flutter: Error: MissingPluginException(No implementation found for method ffmpegSession on channel flutter.arthenica.com/ffmpeg_kit)
    flutter: Stack trace: #0      MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:332:7)
    <asynchronous suspension>
    #1      AbstractSession.createFFmpegSession (package:ffmpeg_kit_flutter/abstract_session.dart:71:11)
    <asynchronous suspension>
    #2      FFmpegSession.create (package:ffmpeg_kit_flutter/ffmpeg_session.dart:40:21)
    <asynchronous suspension>
    #3      FFmpegKit.executeWithArguments (package:ffmpeg_kit_flutter/ffmpeg_kit.dart:44:9)
    <asynchronous suspension>
    #4      FileProcessor.compressAudio (package:predigt_upload_fl/file.dart:182:23)
    <asynchronous suspension>
    #5      _DetailPageState._handleSubmit (package:predigt_upload_fl/GUIs/LiveStreamDetailPage.dart:334:30)
    <asynchronous suspension>

    ══╡ EXCEPTION CAUGHT BY SERVICES LIBRARY ╞══════════════════════════════════════════════════════════
    flutter.arthenica.com/ffmpeg_kit_event)

    When the exception was thrown, this was the stack:
    #0      MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:332:7)
    <asynchronous suspension>
    #1      EventChannel.receiveBroadcastStream.<anonymous closure> (package:flutter/src/services/platform_channel.dart:676:9)
    <asynchronous suspension>


    And I think this comes from the abstract class FFmpegKitPlatform extends PlatformInterface inside ffmpeg_kit_flutter_platform_interface.dart, because a lot of the functions in there are not implemented.


    This does not happen with just one function:


        // ...
        final logLevel = await _getLogLevel();
        print('logLevel: $logLevel');
        if (logLevel != null) {
          FFmpegKitConfig.setLogLevel(logLevel);
        }
        final version = FFmpegKitFactory.getVersion();
        final platform = await FFmpegKitConfig.getPlatform();
        final arch = await ArchDetect.getArch();
        final packageName = await Packages.getPackageName();
        await FFmpegKitConfig.enableRedirection();
        final isLTSPostfix = (await FFmpegKitConfig.isLTSBuild()) ? "-lts" : "";
        // ...


    All of these functions, and more in FFmpegKitInitializer, are not implemented, so I'm pretty sure I'm missing something more than just an outdated version.


    These are my dependencies:


    dependencies:
      flutter:
        sdk: flutter
      path_provider: ^2.0.15
      just_audio: ^0.9.34
      file_picker: ^5.3.1
      path: ^1.8.3
      id3_codec: ^1.0.3
      ftpconnect: ^2.0.5
      http: ^1.1.0
      shared_preferences: ^2.2.0
      html: ^0.15.5
      youtube_explode_dart: ^2.3.9
      intl: ^0.19.0
      ffmpeg_kit_flutter: ^6.0.3


    I'm pretty new to Flutter development, so I'm not quite sure how to approach this problem, because every other FFmpeg wrapper I tried also has problems that I couldn't fix.


    If you need any other information, feel free to ask me, because I also don't know what someone would need in order to fix the problem.
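
    A minimal probe, sketched below using only calls that already appear in the code above (FFmpegKit.execute, getReturnCode, ReturnCode.isSuccess), can help narrow this down: if even a trivial command throws MissingPluginException, the plugin's native side is not registered at all, which points at the platform build rather than at the compression command. In that situation a full rebuild (flutter clean, flutter pub get, then a cold restart rather than a hot reload) is usually the first thing to try.

    import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
    import 'package:ffmpeg_kit_flutter/return_code.dart';

    // Smoke test (illustrative, not part of the original project): run a trivial
    // FFmpeg invocation to check whether the native side of ffmpeg_kit_flutter
    // responds at all on this platform.
    Future<void> probeFFmpegKit() async {
      final session = await FFmpegKit.execute('-version');
      final returnCode = await session.getReturnCode();
      print('ffmpeg_kit probe: return code $returnCode, '
          'success: ${ReturnCode.isSuccess(returnCode)}');
    }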


  • Accurate cut mp4 avconv or ffmpeg

    3 September 2014, by diogopms
    avconv -y -i "/files/33.mp4" -ss 0.000000 -t 11.8 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-0.000000.ts"
    avconv -y -i "/files/33.mp4" -ss 11.800000 -t 11.768254 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-11.800000.ts"
    avconv -y -i "/files/33.mp4" -ss 23.568254 -t 11.888617 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-23.568254.ts"
    avconv -y -i "/files/33.mp4" -ss 35.456871 -t 11.423129 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-35.456871.ts"
    avconv -y -i "/files/33.mp4" -ss 46.880000 -t 12.075465 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-46.880000.ts"
    avconv -y -i "/files/33.mp4" -ss 58.955465 -t 11.404535 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-58.955465.ts"
    avconv -y -i "/files/33.mp4" -ss 70.360000 -t 12.094059 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-70.360000.ts"
    avconv -y -i "/files/33.mp4" -ss 82.454059 -t 11.385941 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-82.454059.ts"
    avconv -y -i "/files/33.mp4" -ss 93.840000 -t 12.112653 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-93.840000.ts"
    avconv -y -i "/files/33.mp4" -ss 105.952653 -t 11.367347 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-105.952653.ts"
    avconv -y -i "/files/33.mp4" -ss 117.320000 -t 12.131247 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-117.320000.ts"
    avconv -y -i "/files/33.mp4" -ss 129.451247 -t 11.348753 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-129.451247.ts"
    avconv -y -i "/files/33.mp4" -ss 140.800000 -t 12.149841 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-140.800000.ts"

    cat bla-*.ts > final.ts ---> the final movie is not fluid. What do I need to do to fix it?

    Can anyone help with the final solution?
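
    For reference, a likely explanation: with -c copy, avconv/ffmpeg can only cut on keyframes, so each segment actually starts at the nearest keyframe rather than at the exact -ss position, and the concatenated result ends up with small gaps or duplicated frames. One possible alternative, sketched here in ffmpeg syntax (not verified against these files; the roughly 12-second segment length and the paths are taken from the commands above), is to let the segment muxer do the splitting so that timestamps stay continuous across segments:

    ffmpeg -y -i "/files/33.mp4" -c copy -bsf:v h264_mp4toannexb -f segment -segment_time 12 -segment_format mpegts "/encoder/live/bla-%03d.ts"
    cat /encoder/live/bla-*.ts > final.ts

    If the cut points must be frame-accurate rather than keyframe-aligned, the video has to be re-encoded (for example with -c:v libx264) instead of being stream-copied.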

  • Make a fifo pipe in Java (Windows), write some data into it, let another process read the pipe

    9 November 2017, by vs93

    My objective is to create a named pipe (fifo) in Windows (Java), write some data (coming from a camera) into it, and invoke an ffmpeg command to make an MP4 from that data. But I suspect that it is not opening a fifo pipe; rather, it is opening a file. How can I make it open a fifo pipe? (A sketch of an alternative approach follows the code below.)
    Here is the code:

    public void run () {
    String mp4Folder = "C://Users/user_2/Desktop/streamDestination";            // Where the MP4 is to be kept
    VFrame frame = null;
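    // frame is presumably filled from the camera in code omitted from this excerpt;
    // as posted, it is still null when it is first dereferenced below.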
    int actualPipeValLen = 0;
    FileOutputStream requestStream = null;
    long lastProcessedTS = 0;
    final String FIFOPATH = "D://FIFO//";
    final String PIPE_NAME = FIFOPATH + "myfifo";
    final String MKFIFOCOMMAND = "mkfifo -m=rw " + PIPE_NAME;
    final String DELFIFOCOMMAND = "rm -r " + PIPE_NAME;
    String mp4FileName = mp4Folder + File.separator + "1.mp4";
    mp4FileName = mp4FileName.replace("\\", "/");
    long firstTimeStamp = 0;
    try {
       Runtime.getRuntime().exec(MKFIFOCOMMAND);
    } catch (IOException e1) {
       e1.printStackTrace();
    }
    if(firstTimeStamp == 0) {
       firstTimeStamp = frame.getTimestamp();
    }
    if((frame.getTimestamp() - firstTimeStamp) > (15 * 1000)) {
       if(requestStream != null) {
           requestStream.close();
           requestStream = null;
       }
       if(actualPipeValLen > 0) {
           String[] ffmpeg = new String[] {"ffmpeg", "-i", PIPE_NAME , "-vcodec", "copy", mp4FileName };
           Process ffmpegProcess = Runtime.getRuntime().exec(ffmpeg);
           actualPipeValLen = 0;
           firstTimeStamp = lastProcessedTS;
           Thread.sleep(2 * 1000);
           try {
               Runtime.getRuntime().exec(DELFIFOCOMMAND);
           } catch (IOException e1) {
               e1.printStackTrace();
           }
           System.exit(0);
       }
    } else {
       System.out.println("Writing into pipe : " + actualPipeValLen);
       if(requestStream == null) {
           requestStream = new  FileOutputStream(PIPE_NAME);
       }
       requestStream.write(frame.getFrame());
       actualPipeValLen += frame.getFrame().length;
       lastProcessedTS = frame.getTimestamp();
    }

    }
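
    For reference: mkfifo and rm are Unix commands, so on a plain Windows system Runtime.exec(MKFIFOCOMMAND) fails (or does nothing useful) unless a Unix-like toolset such as Cygwin is installed, and new FileOutputStream(PIPE_NAME) then simply creates an ordinary file named myfifo, which matches the behaviour described above. One possible alternative, sketched below as an illustration rather than a drop-in fix (the class name is made up, and it assumes ffmpeg can read the camera's stream directly, possibly with an explicit input format such as -f h264 before -i -), is to skip the pipe entirely and write the frames to ffmpeg's standard input:

    import java.io.IOException;
    import java.io.OutputStream;

    // Illustrative sketch: start ffmpeg so that it reads its input from stdin
    // ("-i -") and writes the MP4, then push the frame bytes straight into the
    // child process's stdin instead of going through a fifo.
    public class FfmpegStdinWriter {

        public static void main(String[] args) throws IOException, InterruptedException {
            String mp4FileName = "C:/Users/user_2/Desktop/streamDestination/1.mp4";

            ProcessBuilder pb = new ProcessBuilder(
                    "ffmpeg", "-y", "-i", "-",                  // read the input from stdin
                    "-vcodec", "copy", mp4FileName);
            pb.redirectOutput(ProcessBuilder.Redirect.INHERIT); // show ffmpeg's console output
            pb.redirectError(ProcessBuilder.Redirect.INHERIT);  // show ffmpeg's log output
            Process ffmpeg = pb.start();

            try (OutputStream toFfmpeg = ffmpeg.getOutputStream()) {
                // In the real application this would be frame.getFrame() for each
                // camera frame; an empty array stands in here as a placeholder.
                byte[] frameBytes = new byte[0];
                toFfmpeg.write(frameBytes);
            }                                                   // closing stdin signals end of input

            int exitCode = ffmpeg.waitFor();
            System.out.println("ffmpeg exited with code " + exitCode);
        }
    }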