
Media (1)
-
SWFUpload Process
6 September 2011, by
Updated: September 2011
Language: French
Type: Text
Other articles (55)
-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer flash player is used.
The HTML5 player was created specifically for MediaSPIP: it is fully customizable graphically to match a chosen theme.
These technologies make it possible to deliver video and sound both on conventional computers (...) -
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (4090)
-
FFmpegKitFlutter, Error : MissingPluginException(No implementation found for method ffmpegSession on channel flutter.arthenica.com/ffmpeg_kit)
12 February, by 7Solomon
I have this simple Flutter function to compress audio files:


import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/return_code.dart';


Future<String> compressAudio(String inputPath) async {
  try {
    // ...
    final command = '-y -i "$inputPath" ' // Input file
        '-af "loudnorm=I=-16:TP=-1.5:LRA=11,' // Loudness normalization
        'acompressor=threshold=${_config['threshold_db']}:'
        'ratio=${_config['ratio']}:'
        'attack=${_config['attack']}:'
        'release=${_config['release']},'
        'highpass=f=20,lowpass=f=20000" ' // Audio filters
        '-ar 44100 ' // Sample rate
        '-b:a 128k ' // Bitrate
        '-codec:a libmp3lame ' // MP3 encoder
        '-q:a 2 ' // Quality setting for LAME (0-9, lower is better)
        '-map_metadata 0 ' // Copy metadata
        '"$outputPath"'; // Output file

    // Execute FFmpeg command
    final session = await FFmpegKit.execute(command);
    final returnCode = await session.getReturnCode();
    final logs = await session.getLogs();
    print('FFmpeg logs: $logs');

    if (ReturnCode.isSuccess(returnCode)) {
      return outputFileName;
    } else {
      throw Exception(
          'FFmpeg process failed with code $returnCode\nLogs: $logs');
    }
  } catch (e, stackTrace) {
    print('Error: $e');
    print('Stack trace: $stackTrace');
    throw Exception('Failed to compress audio: $e\nStack trace: $stackTrace');
  }
}


And I get this error

Error: MissingPluginException(No implementation found for method ffmpegSession on channel flutter.arthenica.com/ffmpeg_kit)


This is the stack trace:


flutter: Error: MissingPluginException(No implementation found for method ffmpegSession on channel flutter.arthenica.com/ffmpeg_kit)
flutter: Stack trace: #0 MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:332:7)
<asynchronous suspension>
#1 AbstractSession.createFFmpegSession (package:ffmpeg_kit_flutter/abstract_session.dart:71:11)
<asynchronous suspension>
#2 FFmpegSession.create (package:ffmpeg_kit_flutter/ffmpeg_session.dart:40:21)
<asynchronous suspension>
#3 FFmpegKit.executeWithArguments (package:ffmpeg_kit_flutter/ffmpeg_kit.dart:44:9)
<asynchronous suspension>
#4 FileProcessor.compressAudio (package:predigt_upload_fl/file.dart:182:23)
<asynchronous suspension>
#5 _DetailPageState._handleSubmit (package:predigt_upload_fl/GUIs/LiveStreamDetailPage.dart:334:30)
<asynchronous suspension>


══╡ EXCEPTION CAUGHT BY SERVICES LIBRARY ╞══════════════════════════════════════════════════════════
flutter.arthenica.com/ffmpeg_kit_event)

When the exception was thrown, this was the stack:
#0 MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:332:7)
<asynchronous suspension>
#1 EventChannel.receiveBroadcastStream.<anonymous closure> (package:flutter/src/services/platform_channel.dart:676:9)
<asynchronous suspension>


And I think this comes from the abstract class FFmpegKitPlatform extends PlatformInterface inside ffmpeg_kit_flutter_platform_interface.dart, because a lot of the functions in there are not implemented.


And it is not limited to a single function:


// ...
final logLevel = await _getLogLevel();
print('logLevel: $logLevel');
if (logLevel != null) {
  FFmpegKitConfig.setLogLevel(logLevel);
}
final version = FFmpegKitFactory.getVersion();
final platform = await FFmpegKitConfig.getPlatform();
final arch = await ArchDetect.getArch();
final packageName = await Packages.getPackageName();
await FFmpegKitConfig.enableRedirection();
final isLTSPostfix = (await FFmpegKitConfig.isLTSBuild()) ? "-lts" : "";
// ...




All of these functions, and more in FFmpegKitInitializer, are not implemented, so I'm pretty sure I'm missing something more than just an outdated version.
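MissingPluginException is what the Dart side of a Flutter plugin throws when no native implementation is registered for a method channel, so every call on that channel fails, not just one. A minimal diagnostic sketch (not part of the original question; 'getLogLevel' is used only as an illustrative method name) that probes the channel named in the error could look like this:

import 'package:flutter/services.dart';

// Probe the ffmpeg_kit_flutter method channel directly. If even a basic call
// throws MissingPluginException, the native (Android/iOS) side of the plugin
// was never attached; typically the app then needs a full stop and rebuild
// (flutter clean, then flutter run) rather than a hot reload after adding
// the dependency.
Future<void> probeFFmpegKitChannel() async {
  const channel = MethodChannel('flutter.arthenica.com/ffmpeg_kit');
  try {
    final result = await channel.invokeMethod<int>('getLogLevel');
    print('Native side answered: $result');
  } on MissingPluginException {
    print('No native implementation registered for ffmpeg_kit_flutter.');
  }
}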


These are my dependencies:


dependencies:
  flutter:
    sdk: flutter
  path_provider: ^2.0.15
  just_audio: ^0.9.34
  file_picker: ^5.3.1
  path: ^1.8.3
  id3_codec: ^1.0.3
  ftpconnect: ^2.0.5
  http: ^1.1.0
  shared_preferences: ^2.2.0
  html: ^0.15.5
  youtube_explode_dart: ^2.3.9
  intl: ^0.19.0
  ffmpeg_kit_flutter: ^6.0.3



I'm pretty new to Flutter development, so I'm not quite sure how to approach this problem, because every other FFmpeg wrapper I tried also has problems that I couldn't fix.


If you need any other information, feel free to ask, because I also don't know what someone would need in order to fix the problem.


-
Accurate cut mp4 avconv or ffmpeg
3 September 2014, by diogopms
avconv -y -i "/files/33.mp4" -ss 0.000000 -t 11.8 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-0.000000.ts"
avconv -y -i "/files/33.mp4" -ss 11.800000 -t 11.768254 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-11.800000.ts"
avconv -y -i "/files/33.mp4" -ss 23.568254 -t 11.888617 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-23.568254.ts"
avconv -y -i "/files/33.mp4" -ss 35.456871 -t 11.423129 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-35.456871.ts"
avconv -y -i "/files/33.mp4" -ss 46.880000 -t 12.075465 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-46.880000.ts"
avconv -y -i "/files/33.mp4" -ss 58.955465 -t 11.404535 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-58.955465.ts"
avconv -y -i "/files/33.mp4" -ss 70.360000 -t 12.094059 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-70.360000.ts"
avconv -y -i "/files/33.mp4" -ss 82.454059 -t 11.385941 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-82.454059.ts"
avconv -y -i "/files/33.mp4" -ss 93.840000 -t 12.112653 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-93.840000.ts"
avconv -y -i "/files/33.mp4" -ss 105.952653 -t 11.367347 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-105.952653.ts"
avconv -y -i "/files/33.mp4" -ss 117.320000 -t 12.131247 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-117.320000.ts"
avconv -y -i "/files/33.mp4" -ss 129.451247 -t 11.348753 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-129.451247.ts"
avconv -y -i "/files/33.mp4" -ss 140.800000 -t 12.149841 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-140.800000.ts"cat bla-*.ts > final.ts ---> this final movie is not fluid....What i need to do do fix it
Can anyone help with the final solution?
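One likely reason for the jerky result, not stated in the original post, is that with -c copy each separate -ss/-t invocation can effectively only cut on keyframes, so the independently cut segments overlap or drop frames at the joins. A one-pass sketch using ffmpeg's segment muxer (segment duration and file names are illustrative, not taken from the post) keeps the stream continuous across the resulting .ts files:

# Split once, at keyframes, without duplicating or dropping frames between segments
ffmpeg -y -i "/files/33.mp4" \
       -c copy -bsf:v h264_mp4toannexb \
       -f segment -segment_time 12 -segment_format mpegts \
       "/encoder/live/bla-%03d.ts"

# Concatenate as before
cat /encoder/live/bla-*.ts > final.ts

Frame-accurate cut points (rather than the nearest keyframe) would require re-encoding instead of -c copy.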
-
make fifo pipe in Java (Windows), write some data into it, let another process read the pipe
9 November 2017, by vs93
My objective is to create a named pipe (fifo) in Windows (Java), write some data (coming from a camera) into it, and invoke an ffmpeg command to make an mp4 from that data. But I suspect that it is not opening a fifo pipe; rather, it is opening a file. How can I make it open a fifo pipe?
Here is the code:

public void run() {
    String mp4Folder = "C://Users/user_2/Desktop/streamDestination"; // Where the MP4 is to be kept
    VFrame frame = null;
    int actualPipeValLen = 0;
    FileOutputStream requestStream = null;
    long lastProcessedTS = 0;
    final String FIFOPATH = "D://FIFO//";
    final String PIPE_NAME = FIFOPATH + "myfifo";
    final String MKFIFOCOMMAND = "mkfifo -m=rw " + PIPE_NAME;
    final String DELFIFOCOMMAND = "rm -r " + PIPE_NAME;
    String mp4FileName = mp4Folder + File.separator + "1.mp4";
    mp4FileName = mp4FileName.replace("\\", "/");
    long firstTimeStamp = 0;

    try {
        Runtime.getRuntime().exec(MKFIFOCOMMAND);
    } catch (IOException e1) {
        e1.printStackTrace();
    }

    if (firstTimeStamp == 0) {
        firstTimeStamp = frame.getTimestamp();
    }

    if ((frame.getTimestamp() - firstTimeStamp) > (15 * 1000)) {
        if (requestStream != null) {
            requestStream.close();
            requestStream = null;
        }
        if (actualPipeValLen > 0) {
            String[] ffmpeg = new String[] {"ffmpeg", "-i", PIPE_NAME, "-vcodec", "copy", mp4FileName};
            Process ffmpegProcess = Runtime.getRuntime().exec(ffmpeg);
            actualPipeValLen = 0;
            firstTimeStamp = lastProcessedTS;
            Thread.sleep(2 * 1000);
            try {
                Runtime.getRuntime().exec(DELFIFOCOMMAND);
            } catch (IOException e1) {
                e1.printStackTrace();
            }
            System.exit(0);
        }
    } else {
        System.out.println("Writing into pipe : " + actualPipeValLen);
        if (requestStream == null) {
            requestStream = new FileOutputStream(PIPE_NAME);
        }
        requestStream.write(frame.getFrame());
        actualPipeValLen += frame.getFrame().length;
        lastProcessedTS = frame.getTimestamp();
    }
}
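A different approach, sketched below and not taken from the original post, avoids the FIFO entirely. Windows does not provide POSIX named pipes, so a mkfifo run from a Unix emulation layer is generally not usable as a pipe by a plain Java FileOutputStream; the suspicion in the question, that an ordinary file gets opened instead, is most likely right. Writing the camera data straight into ffmpeg's standard input sidesteps that. Paths and the frame source are placeholders:

import java.io.IOException;
import java.io.OutputStream;

public class FfmpegStdinWriter {
    public static void main(String[] args) throws IOException, InterruptedException {
        String mp4FileName = "C:/Users/user_2/Desktop/streamDestination/1.mp4";

        // "-i -" makes ffmpeg read its input from standard input.
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-y", "-i", "-", "-vcodec", "copy", mp4FileName);
        pb.redirectErrorStream(true);
        Process ffmpeg = pb.start();

        try (OutputStream toFfmpeg = ffmpeg.getOutputStream()) {
            // Placeholder: replace with the real loop that pulls VFrame data
            // from the camera and writes frame.getFrame() here.
            byte[] frameBytes = new byte[0];
            toFfmpeg.write(frameBytes);
        } // closing stdin signals end-of-input so ffmpeg can finalize the mp4

        int exitCode = ffmpeg.waitFor();
        System.out.println("ffmpeg exited with " + exitCode);
    }
}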