
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (14)
-
Emballe médias: what is it for?
4 February 2011 — This plugin is designed to manage sites that publish documents of all types.
It creates "media" items, meaning: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether it is audio, video, image or text; only one document can be linked to a "media" article; -
Use, discuss, criticize
13 April 2011 — Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users. -
Other interesting software
13 April 2011 — We don’t claim to be the only ones doing what we do, and certainly not the best at it; we simply try to do it well and to keep getting better.
The following list gathers software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to emulate.
We don’t know them well and haven’t tried them, but you can take a peek.
Videopress
Website: http://videopress.com/
License: GNU/GPL v2
Source code: (...)
On other sites (3783)
-
FFmpeg Integration in .NET MAUI for Android [closed]
15 June 2024, by Billy Vanegas — I'm facing a challenge integrating FFmpeg into my .NET MAUI project for Android. While everything works smoothly on Windows with Visual Studio 2022, I'm having a hard time replicating this on the Android platform. I've explored various NuGet packages such as FFMpegCore, which are wrappers around FFmpeg but don't include FFmpeg itself, and I'm still at a loss.


I've tried following the instructions for integrating ffmpeg-kit for Android, but I keep running into issues, resulting in repeated failures and growing confusion. It feels like there is no straightforward way to seamlessly incorporate FFmpeg into a .NET MAUI project that works consistently across both iOS and Android.


The Problem:


I need to convert MP3 files to WAV format using FFmpeg on the Android platform within a .NET MAUI project. I’m using the FFMpegCore library and have downloaded the FFmpeg binaries from the official FFmpeg website.


However, when attempting to use these binaries on an Android emulator, I encounter a permission denied error in the working directory:
/data/user/0/com.companyname.projectname/files/ffmpeg


Here’s the code snippet where the issue occurs:


await FFMpegArguments
    .FromFileInput(mp3Path)
    .OutputToFile(wavPath, true, options => options
        .WithAudioCodec("pcm_s16le")
        .WithAudioSamplingRate(44100)
        .WithAudioBitrate(320000))
    .ProcessAsynchronously();



I've updated AndroidManifest.xml with permissions, but the issue persists.


I've created a method ConvertMp3ToWav to handle the conversion.
I also have a method ExtractFFmpegBinaries that extracts the FFmpeg binaries from the app assets, but it seems the permission issue might be tied to how these binaries are accessed or executed.


AndroidManifest.xml:


<?xml version="1.0" encoding="utf-8"?>
<manifest>
    <application></application>
    <!-- permission entries not shown -->
</manifest>



Method ConvertMp3ToWav:


private async Task ConvertMp3ToWav(string mp3Path, string wavPath)
{
    try
    {
        // Create the output directory if it does not exist
        var directory = Path.GetDirectoryName(wavPath);
        if (!Directory.Exists(directory))
            Directory.CreateDirectory(directory!);

        // Create an empty WAV file if it does not exist yet
        if (!File.Exists(wavPath))
        {
            Console.WriteLine($"File not found {wavPath}, creating empty file.");
            using var fs = new FileStream(wavPath, FileMode.CreateNew);
        }

        // Warn if the MP3 input file is missing
        if (!File.Exists(mp3Path))
            Console.WriteLine($"File not found {mp3Path}");

        // Extract the FFmpeg binary shipped in the app assets
        string? ffmpegBinaryPath = await ExtractFFmpegBinaries(Platform.AppContext);

        // Point FFMpegCore at the folder containing the extracted binary
        FFMpegCore.GlobalFFOptions.Configure(new FFOptions { BinaryFolder = Path.GetDirectoryName(ffmpegBinaryPath!)! });

        // Convert MP3 to WAV (16-bit PCM, 44.1 kHz)
        await FFMpegArguments
            .FromFileInput(mp3Path)
            .OutputToFile(wavPath, true, options => options
                .WithAudioCodec("pcm_s16le")
                .WithAudioSamplingRate(44100)
                .WithAudioBitrate(320000))
            .ProcessAsynchronously();
    }
    catch (Exception ex)
    {
        Console.WriteLine($"An error occurred during the conversion process: {ex.Message}");
        throw;
    }
}
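
For reference, this is how I call it; the file names here are just placeholders:


// Hypothetical call site: "input.mp3" and "output.wav" are made-up names,
// placed in MAUI's per-app data directory.
string mp3Path = Path.Combine(FileSystem.AppDataDirectory, "input.mp3");
string wavPath = Path.Combine(FileSystem.AppDataDirectory, "output.wav");
await ConvertMp3ToWav(mp3Path, wavPath);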



Method ExtractFFmpegBinaries:


private async Task<string> ExtractFFmpegBinaries(Context context)
{
    var architectureFolder = "x86"; // Adjust according to device architecture
    var ffmpegBinaryName = "ffmpeg";
    var ffmpegBinaryPath = Path.Combine(context.FilesDir!.AbsolutePath, ffmpegBinaryName);
    var tempFFMpegFileName = Path.Combine(FileSystem.AppDataDirectory, ffmpegBinaryName);

    if (!File.Exists(ffmpegBinaryPath))
    {
        try
        {
            // Copy the binary out of the APK assets into app storage
            var assetPath = $"Libs/{architectureFolder}/{ffmpegBinaryName}";
            using var assetStream = context.Assets!.Open(assetPath);
            await using (var tempFFMpegFile = File.OpenWrite(tempFFMpegFileName))
            {
                await assetStream.CopyToAsync(tempFFMpegFile);
            }

            // Mark the extracted binary as executable (the file is closed before chmod runs)
            Java.Lang.Runtime.GetRuntime()!.Exec($"chmod 755 {tempFFMpegFileName}");
        }
        catch (Exception ex)
        {
            Console.WriteLine($"An error occurred while extracting FFmpeg binaries: {ex.Message}");
            throw;
        }
    }
    else
    {
        Console.WriteLine($"FFmpeg binaries already extracted to: {ffmpegBinaryPath}");
    }

    return tempFFMpegFileName;
}
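
Since architectureFolder is hard-coded to "x86", I also sketched how the folder could be picked from the device ABI at run time; this assumes I ship one binary per ABI under the same Libs/<abi>/ffmpeg asset layout:


// Sketch only (assumption: assets are organised as Libs/<abi>/ffmpeg).
// Build.SupportedAbis lists the ABIs the device supports, most preferred first.
// Needs: using System.Linq;
private static string GetArchitectureFolder()
{
    var abi = Android.OS.Build.SupportedAbis?.FirstOrDefault();
    return abi switch
    {
        "arm64-v8a" => "arm64-v8a",
        "armeabi-v7a" => "armeabi-v7a",
        "x86_64" => "x86_64",
        _ => "x86",
    };
}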


What I Need:


How can I correctly integrate and use FFmpeg in my .NET MAUI project for Android? Specifically:


- How to properly set up and configure FFmpeg binaries for use on Android within a .NET MAUI project.
- How to resolve the permission denied error when attempting to execute the FFmpeg binary (see the sketch after this list).
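
One workaround I'm considering (not verified yet): since Android 10 (API 29), apps can no longer execute files placed in their own writable storage, so chmod 755 on a binary under FilesDir does not help and a "permission denied" there is expected. The executable would instead have to ship inside the APK as a native library (for example lib/<abi>/libffmpeg.so) so that it ends up in the app's native-library directory, which is still executable. A rough sketch of launching it from there, assuming that packaging and file name:


// Sketch only: assumes an ffmpeg executable was packaged as lib/<abi>/libffmpeg.so
// so the installer places it in the app's native-library directory.
// Needs: using System.Diagnostics;
private static async Task<int> RunFFmpegFromNativeLibDir(string arguments)
{
    string nativeLibDir = Platform.AppContext.ApplicationInfo!.NativeLibraryDir!;
    string ffmpegPath = Path.Combine(nativeLibDir, "libffmpeg.so"); // assumed file name

    var psi = new ProcessStartInfo
    {
        FileName = ffmpegPath,
        Arguments = arguments,
        RedirectStandardError = true,
        UseShellExecute = false,
    };

    using var process = Process.Start(psi)!;
    string log = await process.StandardError.ReadToEndAsync();
    process.WaitForExit();
    Console.WriteLine(log);
    return process.ExitCode;
}


A call such as RunFFmpegFromNativeLibDir($"-i \"{mp3Path}\" \"{wavPath}\"") would then do the MP3-to-WAV conversion directly; whether FFMpegCore itself can be pointed at the native-library directory depends on the binary name it expects, so this sketch bypasses it.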






-
Unable to port FFmpeg C library into android
28 April 2016, by Navjot Bedi — What I want to do exactly: access the ffmpeg.c file, change its int main(int argc, char **argv) function into a JNI function, and pass the FFmpeg command as a string.
I have tried to port the FFmpeg C library to Android (ARM processor). I tried the following different approaches.
1st Try: using the official FFmpeg installation documentation. Steps as follows:
a) git clone git://source.ffmpeg.org/ffmpeg.git ffmpeg
b) Read the INSTALL file.
c) Download the x264 library and build it using build_x264.sh, which builds successfully:
NDK=/home/nav/Work/android/ndk
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt
PLATFORM=$NDK/platforms/android-8/arch-arm
PREFIX=/home/nav/28ffmpeg/android-ffmpeg
./configure --prefix=$PREFIX --enable-static --enable-pic --disable-asm --disable-cli --host=arm-linux --cross-prefix=$PREBUILT/linux-x86/bin/arm-linux-androideabi- --sysroot=$PLATFORM
make
sudo make install
sudo ldconfig
d) Then I downloaded the FFmpeg sources from http://ffmpeg.org/releases/ffmpeg-snapshot.tar.bz2 and built them with build_android.sh:
#!/bin/bash
NDK=/home/nav/Work/android/ndk
PLATFORM=$NDK/platforms/android-8/arch-arm
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86
PREFIX=/home/nav/28ffmpeg/android-ffmpeg
function build_one
{
./configure --target-os=linux --prefix=$PREFIX \
--enable-cross-compile \
--enable-runtime-cpudetect \
--disable-asm \
--arch=arm \
--cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
--cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
--disable-stripping \
--nm=$PREBUILT/bin/arm-linux-androideabi-nm \
--sysroot=$PLATFORM \
--enable-nonfree \
--enable-version3 \
--disable-everything \
--enable-gpl \
--disable-doc \
--enable-avresample \
--enable-demuxer=rtsp \
--enable-muxer=rtsp \
--disable-ffplay \
--disable-ffserver \
--enable-ffmpeg \
--disable-ffprobe \
--enable-libx264 \
--enable-encoder=libx264 \
--enable-decoder=h264 \
--enable-protocol=rtp \
--enable-hwaccels \
--enable-zlib \
--disable-devices \
--disable-avdevice \
--extra-cflags="-I/home/android-ffmpeg/include -fPIC -DANDROID -D__thumb__ -mthumb -Wfatal-errors -Wno-deprecated -mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=armv7-a" \
--extra-ldflags="-L/home/android-ffmpeg/lib"
make -j4 install
$PREBUILT/bin/arm-linux-androideabi-ar d libavcodec/libavcodec.a inverse.o
$PREBUILT/bin/arm-linux-androideabi-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -L$PREFIX/lib -soname libffmpeg.so -shared -nostdlib -z,noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg.so libavcodec/libavcodec.a libavfilter/libavfilter.a libavresample/libavresample.a libavformat/libavformat.a libavutil/libavutil.a libswscale/libswscale.a -lc -lm -lz -ldl -llog -lx264 --warn-once --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/arm-linux-androideabi/4.4.3/libgcc.a
}
build_one
e) Initially it works, but then these errors appear:
libavformat/libavformat.a(log2_tab.o):(.rodata+0x0): multiple definition of `ff_log2_tab'
libavcodec/libavcodec.a(log2_tab.o):(.rodata+0x0): first defined here
libavutil/libavutil.a(log2_tab.o):(.rodata+0x0): multiple definition of `ff_log2_tab'
libavcodec/libavcodec.a(log2_tab.o):(.rodata+0x0): first defined here
build_android.sh: 48: build_one: not found
Result: libffmpeg.so not found.
2nd Try: Then I followed the steps in http://dl.dropbox.com/u/22605641/ffmpeg_android/main.html -> Builds
a) I downloaded Pre-Build libffmpeg.so from above link.
b) Add libffmpeg.so to libs/armeabi/ .
c) Make Android.mk
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg
LOCAL_SRC_FILES := libffmpeg.so
include $(PREBUILT_SHARED_LIBRARY)
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := main
LOCAL_STATIC_LIBRARIES += ffmpeg
LOCAL_SRC_FILES := ffmpeg-test.c
include $(BUILD_SHARED_LIBRARY)
d) Then I did the whole NDK setup, copied ffmpeg.c from the library to ffmpeg-test.c, changed its int main function into my JNI function, and included all the necessary header files.
Error:
Console:
/home/nav/Work/android/ndk/ndk-build all
Prebuilt : libffmpeg.so <= jni/
Install : libffmpeg.so => libs/armeabi/libffmpeg.so
Compile thumb : main <= ffmpeg-test.c
jni/ffmpeg-test.c: In function 'print_report':
jni/ffmpeg-test.c:1139:94: warning: incompatible implicit declaration of built-in function 'log2' [enabled by default]
SharedLibrary : libmain.so
/home/nav/Work/android/ndk/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: ./obj/local/armeabi/objs/main/ffmpeg-test.o: in function check_keyboard_interaction:jni/ffmpeg-test.c:2496: error: undefined reference to 'qp_hist'
...and many other undefined references.
Please tell me where I am going wrong. I need this urgently.
-
How to add subtitles using FFmpeg-kit ?
17 November 2024, by Mohammed Bekele — I'm running a Flutter app that uses the FFmpegKit package to burn subtitles onto a video. I have a list of words with timings and map it to generate SRT and ASS files, but executing the command didn't work.


First, here is how I generated the ASS file:


Future<String> _createAssFile() async {
  String filePath = await getAssOutputFilePath();

  final file = File(filePath);
  final buffer = StringBuffer();

  // Write ASS headers
  buffer.writeln('[Script Info]');
  buffer.writeln('Title: Generated ASS');
  buffer.writeln('ScriptType: v4.00+');
  buffer.writeln('Collisions: Normal');
  buffer.writeln('PlayDepth: 0');
  buffer.writeln('Timer: 100,0000');
  buffer.writeln('[V4+ Styles]');
  buffer.writeln(
      'Format: Name, Fontname, Fontsize, PrimaryColour, SecondaryColour, OutlineColour, BackColour, Bold, Italic, Underline, StrikeOut, ScaleX, ScaleY, Spacing, Angle, BorderStyle, Outline, Shadow, Alignment, MarginL, MarginR, MarginV, Encoding');
  buffer.writeln(
      'Style: Default,Arial,40,&H00FFFFFF,&H000000FF,&H00000000,&H80000000,1,1,1,1,100,100,0,0,1,1,1,2,10,10,10,1');
  buffer.writeln('[Events]');
  buffer.writeln(
      'Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text');

  // Write events (subtitles)
  for (int i = 0; i < widget.words.length; i++) {
    final word = widget.words[i];
    final startTime = _formatAssTime(word['startTime'].toDouble());
    final endTime = _formatAssTime(word['endTime'].toDouble());
    final text = word['word'];

    buffer.writeln('Dialogue: 0,$startTime,$endTime,Default,,0,0,0,,$text');
  }

  await file.writeAsString(buffer.toString());
  return filePath;
}

String _formatAssTime(double seconds) {
  final int hours = seconds ~/ 3600;
  final int minutes = ((seconds % 3600) ~/ 60);
  final int secs = (seconds % 60).toInt();
  final int millis = ((seconds - secs) * 1000).toInt() % 1000;

  return '${hours.toString().padLeft(1, '0')}:${minutes.toString().padLeft(2, '0')}:${secs.toString().padLeft(2, '0')}.${(millis ~/ 10).toString().padLeft(2, '0')}';
}


Then I used this command, which is the official way of adding an ASS file to a video:


String newCmd = "-i $videoPath -vf ass=$assFilePath -c:a copy $_outputPath";



Yet when executing this command, it did not work, so I changed it to this command:


String newCmd = "-i $videoPath -i $assFilePath $_outputPath";



Well, that command works, but the styling is not being applied. So I tried yet another command to filter and position the ASS file:


String newCmd = "-i $videoPath -i $assFilePath -filter_complex \"[0:v][1:s]ass=\\an5:text='%{event.text}',scale=iw*0.8:ih*0.8,setdar=16/9[outv]\" -map [outv] -c:a copy $_outputPath";



This made it even worse: the app crashed when I ran it. Here is the log printed for this command before the crash:


F/libc (19624): Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x19 in tid 21314 (pool-4-thread-7), pid 19624 (ple.caption_app)
*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
Build fingerprint: 'samsung/a15nsxx/a15:14/UP1A.231005.007/A155FXXU1AWKA:user/release-keys'
Revision: '5'
ABI: 'arm64'
Processor: '7'
Timestamp: 2024-07-10 19:27:40.812476860+0300
Process uptime: 1746s
Cmdline: com.example.caption_app
pid: 19624, tid: 21314, name: pool-4-thread-7 >>> com.example.caption_app <<<
uid: 10377
tagged_addr_ctrl: 0000000000000001 (PR_TAGGED_ADDR_ENABLE)
signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0000000000000019
Cause: null pointer dereference
 x0 0000000000000001 x1 b400006ec98d8660 x2 0000000000000000 x3 0000000000000002
 x4 0000006f62d1ea12 x5 b400006e04f745ea x6 352f35372f64352f x7 7f7f7f7f7f7f7f7f
 x8 632ace36577a905e x9 632ace36577a905e x10 0000006e191378c0 x11 0000000000000001
 x12 0000000000000001 x13 0000000000000000 x14 0000000000000004 x15 0000000000000008
 x16 0000006e27ffa358 x17 0000006e27dbfb00 x18 0000006e00d48000 x19 0000006f62d1f0e8
 x20 0000006f62d24000 x21 0000006e27ffc5d0 x22 0000006e27ffc5f0 x23 0000000000000000
 x24 b400006f3f089248 x25 0000006f62d1f220 x26 b400006f3f057b20 x27 0000000000000002
 x28 0000000000000088 x29 0000006f62d1f100
 lr 0000006e27dbfb0c sp 0000006f62d1f0a0 pc 0000006e27dbfb10 pst 0000000060001000
1 total frames
backtrace:
 #00 pc 0000000000121b10 /data/app/~~oWqjrGA2indQhuEw6u_J2A==/com.example.caption_app-u2Ofk54FVtMc5D-i3SLC6g==/base.apk!libavfilter.so (offset 0x10a46000) (avfilter_inout_free+16) 
Lost connection to device.



I tried the normal command on my local Windows machine, and this is what I used:


ffmpeg -i F:\ffmpeg\video.avi -vf subtitles='F\:\\ffmpeg\\caption.srt':force_style='Fontsize=24' F:\ffmpeg\new.mp4



As you can see above, when the subtitles filter is used, each path gets extra backslashes and an escaped colon after the drive letter. But I don't know whether this applies to Android devices. How can I make sense of this?
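
If I understand the escaping correctly, the extra backslashes in the Windows command are only there because the drive-letter colon and the path backslashes would otherwise be parsed as filter-option separators and escapes. An Android path has no drive letter, so quoting the filter argument should be enough; a hypothetical equivalent (the path is made up for illustration) would be:


ffmpeg -i input.mp4 -vf "ass=/data/user/0/com.example.caption_app/files/subtitles.ass" -c:a copy output.mp4


I also suspect the bundled FFmpeg build has to include libass for the ass and subtitles filters to work at all, but I haven't confirmed which ffmpeg-kit flavour ships it.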