
Other articles (72)
-
Writing a news item
21 June 2013
Present the changes to your MediaSPIP, or news about your projects, on your MediaSPIP using the news section.
In the default MediaSPIP theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News item creation form. For a document of type "news item", the default fields are: Publication date (customize the publication date) (...)
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used as a fallback.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
On other sites (10163)
-
Adding ffmpeg OMX codec to Genymotion Android 4.4.2 emulator
22 April 2016, by photon
Basic Question:
Is there a way to add a new audio codec to the Genymotion Android emulator, short of downloading the entire Android source, learning how to build it, and creating my own version of Android?
Context:
I have written a Java Android app that acts as an audio renderer, as well as being a DLNA/OpenHome server and client. Think "BubbleUpnp" without video. My primary development platform is Win8.1. The program started as an ActiveState "pure-perl" DLNA MediaServer on Windows, which I then ported to Ubuntu, and which I got working under Android a few years ago. It was pretty funky ... all UI presented through an HTTP server/jquery/jquery-ui, served from an Ubuntu shell running under Android (a trick in itself), serving up HTML pages to Chrome running on the same (Android) device. Besides being "funky", it had a major drawback: it required a valid IP address to work, as I could not figure out how to get Ubuntu to have a local loopback device for a 127.0.0.1 localhost. I use the app as a "car stereo" on my boat (which is my home), which is often not hooked up to the internet.
I had a hard time getting started in Android app development because the speed of the Android emulators in Eclipse was horrid, and the ADB drivers did not work from Win8 for the longest time.
Then one day, about a year ago, I ran into Genymotion (kudos to the authors), and all of a sudden I had a workable Android development environment. So I added a Java implementation of the DLNA server, which then grew into a renderer as well, using Android's MediaPlayer class, then gained the ability to act as a DLNA control point, and more recently OpenHome servers and renderers too.
In a separate effort, I created a build environment for a program called fpCalc, based on ffmpeg, on a variety of platforms, including Win, Linux, and Android x86, arm, and arm7 devices (bitbucket.org/phorton1/), and ran an extensive series of tests to determine the validity and longevity of fpcalc fingerprints, discovering that the fingerprint changes depending on the version of ffmpeg it was built against. That is a separate topic, to be sure, but in the process I learned at least a bit about how to build ffmpeg, as well as Android shared libraries, JNI interfaces, etc.
So now the Android-Java version of the program has advanced past the old perl version, and I am debating whether I want to keep trying to build the perl version (and/or add a wxPerl UI to it).
One issue that has arisen, for me, is that the Genymotion emulator does not support WMA decoding, as Android dropped support for WMA some time back due to licensing issues. Yet my music library has a significant number of tunes in WMA files, and I don't want to "convert" them. My carefully thought-out philosophy is that my program does not modify the contents, or tags, or anything else in the original media files that I have accumulated, or will receive in the future, treating them instead as "artifacts" worth preserving as-is. No conversion is going to make a file "better" than it was, and I wish to preserve ALL of the original sources for ALL of my music going forward.
So, I’m thinking, gee, I can build FFMPEG on 7 different platforms, and I see all these references to "OMX FFMPEG Codec Support for Android" on the net, so I’m thinking, "All I need to do is create the OMX Component and somehow get it into Genymotion".
I have studied up on OMX and OpenMAX IL, seen Michael Chen's posts, and seen the Stack Overflow questions
How to make ffmpeg codec componet as OMX component
and
Android: How to integrate a decoder to multimedia framework
and Cedric Fung’s page https://vec.io/posts/use-android-hardware-decoder-with-omxcodec-in-ndk, and Michael Chen’s repository at https://github.com/omxcodec , as well as virtually every other page on the net that mentions any combination of libstagefright, OMX, Genymotion, and FFMPEG.
(This page would not let me put more than 2 links, as I don't have a "10" reputation, or I would have listed some of the other sources I have seen.)
My Linux development environment is an Ubuntu 12.04 vbox running on my Win machine. I have downloaded and run the Android-x86 ISO as a vbox, and IT contains the ffmpeg codecs, but unfortunately it supports neither a wifi interface nor the vbox "guest additions", so it has a really funky mouse. I tried for about 3 days to address those two issues, but in the end do not feel it is usable for my purposes. I really like the way Genymotion "feels", particularly the mouse support, so I'd like to keep Genymotion as my "Windows Android" virtual device under which I may run my program, and deprecate and stop using my old perl source ...
except Genymotion does not support WMA files ...
Several side notes:
(a) There is no good way to write a single-sourced application in Java that runs natively in Windows AND as an Android app.
(b) I don’t want to reboot my Windows machine to a "real" Android device just to play my music files. The machine has to stay in Windows as I use it for other things as well.
(c) I am writing this as my machine is in the 36th hour of downloading the entire AOSP source code base to a partition in my Ubuntu vbox, while I sit in a hotel room on a not-so-good internet connection in Panama City, Panama, before I return to my boat in remote Bocas del Toro, Panama, where the internet connection is even worse.
(d) I did get WMA decoding to work in my app by calling my FFMPEG executable from Java (converting to either WAV/PCM or AAC), but because of limitations in Android's MediaPlayer it does not work well, particularly for remotely hosted WMA files ... MediaPlayer insists on having the whole file present before it starts to play, which can take several seconds or longer, and I am hoping that by getting a 'real' WMA codec underneath MediaPlayer, that problem will just disappear ....
So, I'm trying to figure this whole mess out. There are a lot of tantalizing clues and suggestions, but what I have found, or at least what I am starting to believe, is that if I want to add a simple WMA audio decoding codec to Android (Genymotion), not only do I have to download basically the ENTIRE AOSP source tree and learn a new set of tools (repo, etc.), but I have to (be able to) rebuild, from scratch, the entire Android system, especially libstagefright.so, in such a way as to be COMPLETELY compatible with the existing one in Genymotion, while at the same time adding ffmpeg codecs à la Michael Chen's page.
And I'm just asking: could it really be that difficult?
Anyways, this makes me crazy. Is there no way to just build a new component, or at worst a new OMX core, and add it to Genymotion WITHOUT building all of Android, preferably based only on the OMX .h files? Or do I REALLY have to replace the existing libstagefright.so, which means, basically, rebuilding all of Android ...
p.s. I thought it would be nice to get this figured out, build it, and then post the installable new FFMPEG codecs someplace for other people to use, so that they don’t also grow warts on their ears and have steam shooting out of their eyeballs, while they get old trying to figure it out ....
-
Integrating FFMPEG using cmake: undefined reference when linking libraries
21 November 2018, by Hamed Momeni
I've been trying to cross-compile ffmpeg for different Android CPU archs for a couple of days now, and I have finally succeeded in the task. But now that I need to integrate these pre-built .so files in my project, I'm facing errors which have me baffled.
This is the CMakeLists.txt which I'm using:
cmake_minimum_required(VERSION 3.4.1)
# convert SDK path to forward slashes on Windows
file(TO_CMAKE_PATH ${PATH_TO_SUPERPOWERED} PATH_TO_SUPERPOWERED)
set(CMAKE_VERBOSE_MAKEFILE on)
include_directories(src/main/cpp)
include_directories(${PATH_TO_SUPERPOWERED})
include_directories(${PATH_TO_FFMPEG}/${ANDROID_ABI}/include)
add_library(
avutil
SHARED
${PATH_TO_FFMPEG}/${ANDROID_ABI}/lib/libavutil.so
)
set_target_properties(avutil PROPERTIES LINKER_LANGUAGE CXX)
add_library(
avformat
SHARED
${PATH_TO_FFMPEG}/${ANDROID_ABI}/lib/libavformat.so
)
set_target_properties(avformat PROPERTIES LINKER_LANGUAGE CXX)
add_library(
avcodec
SHARED
${PATH_TO_FFMPEG}/${ANDROID_ABI}/lib/libavcodec.so
)
set_target_properties(avcodec PROPERTIES LINKER_LANGUAGE CXX)
add_library(
Canto
SHARED
src/main/cpp/Dubsmash.cpp
src/main/cpp/Karaoke.cpp
src/main/cpp/Singing.cpp
src/main/cpp/EditDubsmash.cpp
${PATH_TO_SUPERPOWERED}/AndroidIO/SuperpoweredAndroidAudioIO.cpp
)
# link the native library against the following libraries
target_link_libraries(
Canto
avutil
avformat
avcodec
${PATH_TO_SUPERPOWERED}/libSuperpoweredAndroid${ANDROID_ABI}.a
OpenSLES
log
android
)
And this is the source of the file I'm getting errors on:
#include
#include <android/log.h>
#include <string>
// unrelated includes
#include <libavformat/avformat.h>
#include <libavutil/dict.h>
#define log_write __android_log_write
#define log_print __android_log_print
// unrelated code parts
int print_file_data(const char *filePath) {
AVFormatContext *fmt_ctx = NULL;
AVDictionaryEntry *tag = NULL;
int ret;
if ((ret = avformat_open_input(&fmt_ctx, filePath, NULL, NULL)))
return ret;
while ((tag = av_dict_get(fmt_ctx->metadata, "", tag, AV_DICT_IGNORE_SUFFIX)))
log_print(ANDROID_LOG_DEBUG, "%s=%s\n", tag->key, tag->value);
avformat_close_input(&fmt_ctx);
return 0;
}
And finally the errors themselves:
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':app:externalNativeBuildDebug'.
> Build command failed.
Error while executing process /home/hamed/dev/android-tools/android-sdk-linux/cmake/3.6.4111459/bin/cmake with arguments {--build /home/hamed/dev/projects/canto/Canto/app/.externalNativeBuild/cmake/debug/x86 --target Canto}
[1/1] Linking CXX shared library ../../../../build/intermediates/cmake/debug/obj/x86/libCanto.so
FAILED: : && /home/hamed/dev/android-tools/android-sdk-linux/ndk-bundle/toolchains/llvm/prebuilt/linux-x86_64/bin/clang++ --target=i686-none-linux-android16 --gcc-toolchain=/home/hamed/dev/android-tools/android-sdk-linux/ndk-bundle/toolchains/x86-4.9/prebuilt/linux-x86_64 --sysroot=/home/hamed/dev/android-tools/android-sdk-linux/ndk-bundle/sysroot -fPIC -isystem /home/hamed/dev/android-tools/android-sdk-linux/ndk-bundle/sysroot/usr/include/i686-linux-android -g -DANDROID -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -mstackrealign -Wa,--noexecstack -Wformat -Werror=format-security -std=c++11 -fsigned-char -I/home/hamed/dev/projects/canto/Canto/app/../Superpowered -I/home/hamed/dev/projects/canto/Canto/app/../ffmpeg -O0 -fno-limit-debug-info -Wl,--exclude-libs,libgcc.a -Wl,--exclude-libs,libatomic.a -nostdlib++ --sysroot /home/hamed/dev/android-tools/android-sdk-linux/ndk-bundle/platforms/android-16/arch-x86 -Wl,--build-id -Wl,--warn-shared-textrel -Wl,--fatal-warnings -L/home/hamed/dev/android-tools/android-sdk-linux/ndk-bundle/sources/cxx-stl/llvm-libc++/libs/x86 -Wl,--no-undefined -Wl,-z,noexecstack -Qunused-arguments -Wl,-z,relro -Wl,-z,now -shared -Wl,-soname,libCanto.so -o ../../../../build/intermediates/cmake/debug/obj/x86/libCanto.so CMakeFiles/Canto.dir/src/main/cpp/Dubsmash.cpp.o CMakeFiles/Canto.dir/src/main/cpp/Karaoke.cpp.o CMakeFiles/Canto.dir/src/main/cpp/Singing.cpp.o CMakeFiles/Canto.dir/src/main/cpp/EditDubsmash.cpp.o CMakeFiles/Canto.dir/home/hamed/dev/projects/canto/Canto/Superpowered/AndroidIO/SuperpoweredAndroidAudioIO.cpp.o ../../../../build/intermediates/cmake/debug/obj/x86/libavutil.so ../../../../build/intermediates/cmake/debug/obj/x86/libavformat.so ../../../../../Superpowered/libSuperpoweredAndroidx86.a -lOpenSLES -llog -landroid -latomic -lm "/home/hamed/dev/android-tools/android-sdk-linux/ndk-bundle/sources/cxx-stl/llvm-libc++/libs/x86/libc++_static.a" "/home/hamed/dev/android-tools/android-sdk-linux/ndk-bundle/sources/cxx-stl/llvm-libc++/libs/x86/libc++abi.a" "/home/hamed/dev/android-tools/android-sdk-linux/ndk-bundle/sources/cxx-stl/llvm-libc++/libs/x86/libandroid_support.a" && :
/home/hamed/dev/projects/canto/Canto/app/src/main/cpp/Dubsmash.cpp:295: error: undefined reference to 'avformat_open_input(AVFormatContext**, char const*, AVInputFormat*, AVDictionary**)'
/home/hamed/dev/projects/canto/Canto/app/src/main/cpp/Dubsmash.cpp:298: error: undefined reference to 'av_dict_get(AVDictionary const*, char const*, AVDictionaryEntry const*, int)'
/home/hamed/dev/projects/canto/Canto/app/src/main/cpp/Dubsmash.cpp:301: error: undefined reference to 'avformat_close_input(AVFormatContext**)'
clang++: error: linker command failed with exit code 1 (use -v to see invocation)
ninja: build stopped: subcommand failed.
Obviously the linker cannot resolve the needed FFmpeg functions, but why?!
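A hint worth spelling out: the undefined references above are printed with full C++ signatures (avformat_open_input(AVFormatContext**, char const*, AVInputFormat*, AVDictionary**)), which is what name mangling produces when C headers are compiled as C++. FFmpeg is a plain C library, so from C++ its headers are normally wrapped in extern "C". A minimal sketch of the guarded includes, reusing the headers from the snippet above:
// FFmpeg is plain C; without extern "C" the C++ compiler emits mangled
// references that the C symbols in libavformat/libavutil can never match.
extern "C" {
#include <libavformat/avformat.h>
#include <libavutil/dict.h>
}
Separately, CMake's intended pattern for prebuilt binaries is an IMPORTED target, e.g. add_library(avformat SHARED IMPORTED) plus set_target_properties(avformat PROPERTIES IMPORTED_LOCATION ${PATH_TO_FFMPEG}/${ANDROID_ABI}/lib/libavformat.so), rather than declaring the .so as the source of a new library.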
UPDATE
This is the result of running nm on libavutil.so:
0000000000000510 t atexit
0000000000000500 t __atexit_handler_wrapper
0000000000002008 A __bss_start
U __cxa_atexit
U __cxa_finalize
0000000000002000 d __dso_handle
0000000000001dd8 d _DYNAMIC
0000000000002008 A _edata
00000000000004e0 t __emutls_unregister_key
0000000000002008 A _end
00000000000005e8 r __FRAME_END__
0000000000001fd8 d _GLOBAL_OFFSET_TABLE_
0000000000000258 r ndk_build_number
0000000000000218 r ndk_version
0000000000000200 r note_android_ident
0000000000000214 r note_data
0000000000000298 r note_end
000000000000020c r note_name
00000000000004d0 t __on_dlclose
00000000000004f0 t __on_dlclose_late
and on libavformat:
0000000000000520 t atexit
0000000000000510 t __atexit_handler_wrapper
0000000000002008 A __bss_start
U __cxa_atexit
U __cxa_finalize
0000000000002000 d __dso_handle
0000000000001dd8 d _DYNAMIC
0000000000002008 A _edata
00000000000004f0 t __emutls_unregister_key
0000000000002008 A _end
00000000000005f8 r __FRAME_END__
0000000000001fd8 d _GLOBAL_OFFSET_TABLE_
0000000000000258 r ndk_build_number
0000000000000218 r ndk_version
0000000000000200 r note_android_ident
0000000000000214 r note_data
0000000000000298 r note_end
000000000000020c r note_name
00000000000004e0 t __on_dlclose
0000000000000500 t __on_dlclose_late
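(A note on reading these listings: plain nm prints the static symbol table, while the functions a shared library actually exports live in the dynamic symbol table, shown by nm -D or objdump -T. Both listings above contain only CRT bookkeeping symbols and no av* entries at all, so it is worth re-checking the libraries with nm -D before drawing conclusions from them.)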
UPDATE 2
I applied @szatmary's suggestion and finally managed to build the apk file, but upon reaching the System.loadLibrary call an exception occurs indicating that the linker cannot find libavutil.so.56. I tried changing the path of the library in the cmake file to actually point to the version-numbered lib file, to no avail:
java.lang.UnsatisfiedLinkError: dlopen failed: library "libavutil.so.56" not found
at java.lang.Runtime.loadLibrary0(Runtime.java:1016)
at java.lang.System.loadLibrary(System.java:1657)
at com.hmomeni.canto.activities.EditActivity.<init>(EditActivity.kt:26)
at java.lang.Class.newInstance(Native Method)
at android.app.Instrumentation.newActivity(Instrumentation.java:1174)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2669)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2856)
at android.app.ActivityThread.-wrap11(Unknown Source:0)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1589)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6494)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:438)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:807)
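(For what it's worth, this failure mode is a known Android limitation rather than a path problem: the Android dynamic linker only loads libraries whose file names end in .so, so a dependency recorded as libavutil.so.56 cannot be resolved on-device no matter where the file is placed. FFmpeg builds targeting Android are therefore usually configured so that the soname itself is unversioned, e.g. libavutil.so or libavutil-56.so, making the library's DT_NEEDED entries match what the loader accepts.)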
-
FFMPEG Audio decode and draw waveform
7 April 2016, by Haris
I am trying to decode audio and draw the waveform using ffmpeg; the input audio data is AV_SAMPLE_FMT_S16P. Basically I am following the tutorial here, and the audio plays fine with libao. Now I need to plot the waveform from the decoded data. Currently I am writing the left and right channels to separate csv files and plotting them in Excel, but the waveform is quite different from the one Audacity shows for the same audio clip. When I analyzed the values written to csv, most were close to the maximum of uint16_t (65535); there are some lower values, but the majority are high peaks.
Here is the source code:
const char* input_filename="/home/user/Music/Clip.mp3";
av_register_all();
AVFormatContext* container=avformat_alloc_context();
if(avformat_open_input(&container,input_filename,NULL,NULL)<0){
endApp("Could not open file");
}
if(avformat_find_stream_info(container, NULL)<0){
endApp("Could not find file info");
}
av_dump_format(container,0,input_filename,false);
int stream_id=-1;
int i;
for(i=0;i<container->nb_streams;i++){
if(container->streams[i]->codec->codec_type==AVMEDIA_TYPE_AUDIO){
stream_id=i;
break;
}
}
if(stream_id==-1){
endApp("Could not find Audio Stream");
}
AVDictionary *metadata=container->metadata;
AVCodecContext *ctx=container->streams[stream_id]->codec;
AVCodec *codec=avcodec_find_decoder(ctx->codec_id);
if(codec==NULL){
endApp("cannot find codec!");
}
if(avcodec_open2(ctx,codec,NULL)<0){
endApp("Codec cannot be found");
}
AVPacket packet;
av_init_packet(&packet);
//AVFrame *frame=avcodec_alloc_frame();
AVFrame *frame=av_frame_alloc();
int buffer_size=AVCODEC_MAX_AUDIO_FRAME_SIZE+ FF_INPUT_BUFFER_PADDING_SIZE;
// MSVC can't do variable size allocations on stack, ohgodwhy
uint8_t *buffer = new uint8_t[buffer_size];
packet.data=buffer;
packet.size =buffer_size;
int frameFinished=0;
int plane_size;
ofstream fileCh1,fileCh2;
fileCh1.open ("ch1.csv");
fileCh2.open ("ch2.csv");
AVSampleFormat sfmt=ctx->sample_fmt;
while(av_read_frame(container,&packet)>=0)
{
if(packet.stream_index==stream_id){
int len=avcodec_decode_audio4(ctx,frame,&frameFinished,&packet);
int data_size = av_samples_get_buffer_size(&plane_size, ctx->channels,
frame->nb_samples,
ctx->sample_fmt, 1);
if(frameFinished){
int write_p=0;
// QTime t;
switch (sfmt){
case AV_SAMPLE_FMT_S16P:
for (int nb=0;nb<plane_size/sizeof(uint16_t);nb++){
for (int ch = 0; ch < ctx->channels; ch++) {
if(ch==0)
fileCh1 <<((uint16_t *) frame->extended_data[ch])[nb]<<"\n";
else if(ch==1)
fileCh2 <<((uint16_t *) frame->extended_data[ch])[nb]<<"\n";
}
}
break;
}
} else {
DBG("frame failed");
}
}
av_free_packet(&packet);
}
fileCh1.close();
fileCh2.close();
avcodec_close(ctx);
avformat_close_input(&container);
delete[] buffer; // buffer was allocated with new[], so delete[] is required
return 0;
Edit:
I have attached the waveform image drawn using OpenCV. Here I scaled the sample values to the 0-255 range and took the value 127 as 0 (the Y-axis zero line). Then, for each sample, I draw a line from (x,127) to (x, sample value), where x=1,2,3,...
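The csv values near 65535 point at a sign problem: AV_SAMPLE_FMT_S16P is signed 16-bit planar, but the loop above reads the planes as uint16_t, so negative samples wrap to values just below 65536 and quiet passages turn into apparent peaks. A minimal sketch of reading a plane as signed data and mapping each sample onto the 0-255 drawing range described above (the function and parameter names are illustrative, not from the original code):
#include <cstdint>
#include <cstddef>

// Convert one plane of AV_SAMPLE_FMT_S16P samples into pixel Y values.
// S16P is *signed* 16-bit planar, so the plane must be read as int16_t;
// casting to uint16_t wraps negative samples to values near 65535.
void plane_to_pixels(const std::uint8_t *plane, int plane_size, int *out_y) {
    const std::int16_t *samples =
        reinterpret_cast<const std::int16_t *>(plane);
    std::size_t n = plane_size / sizeof(std::int16_t);
    for (std::size_t i = 0; i < n; ++i) {
        // Shift [-32768, 32767] to [0, 65535], then scale to [0, 255];
        // a zero sample lands on 127, the intended Y-axis zero line.
        int v = static_cast<int>(samples[i]) + 32768;
        out_y[i] = (v * 255) / 65535;
    }
}
In the decode loop above, the same fix amounts to casting frame->extended_data[ch] to int16_t * (and writing signed values to the csv), after which the plot should line up with what Audacity shows.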