
Media (1)
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (28)
The SPIPmotion queue
28 November 2010, by
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...)
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation by users as well as developers, including: critiques of existing features and functions; articles contributed by developers, administrators, content producers and editors; screenshots to illustrate the above; translations of existing documentation into other languages.
To contribute, register for the project users' mailing (...)
Managing object creation and editing rights
8 February 2011, by
By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, in particular: writing content on the site, adjustable in the form template management; adding notes to articles; adding captions and annotations to images;
On other sites (4898)
Matomo NAMED 2023 Hi-Tech Awards finalist
1 August 2023, by Lance, Press Releases
Android java.lang.UnsatisfiedLinkError: android error for .so libraries
14 September 2017, by Burak iren
I added .so libraries to my project for the ffmpeg camera, but I get some errors when I load the library.
I am getting an error like "java.lang.UnsatisfiedLinkError: android".
My .so files are under src/main/jniLibs/armeabi. When I load the .so file I get the error and the library will not load. I have tried many answers, but the error is still there.
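One thing I notice (see the log and the gradle file below): the class loader is searching the arm64 native library directories, while my files are under armeabi. To double-check which ABI folders actually end up inside the packaged APK, I believe the bundled .so files can be listed straight from the archive, for example (the APK path is only a placeholder, and this assumes unzip and grep are available; Android Studio's APK Analyzer shows the same information):
unzip -l path/to/app-debug.apk | grep '\.so$'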
If you need any other config files, I can post them here.
The error log is below:
09-14 17:40:33.739 19728-20196/? E/AndroidRuntime: FATAL EXCEPTION: AsyncTask #6
Process: com.leadtimeapp.io.internal.debug, PID: 19728
java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:309)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
Caused by: java.lang.UnsatisfiedLinkError: dalvik.system.PathClassLoader[DexPathList[[zip file "/data/app/com.leadtimeapp.io.internal.debug-2/base.apk"],nativeLibraryDirectories=[/data/app/com.leadtimeapp.io.internal.debug-2/lib/arm64, /data/app/com.leadtimeapp.io.internal.debug-2/base.apk!/lib/arm64-v8a, /vendor/lib64, /system/lib64]]] couldn't find "libjniavformat.so"
at java.lang.Runtime.loadLibrary(Runtime.java:367)
at java.lang.System.loadLibrary(System.java:1076)
at com.googlecode.javacpp.Loader.loadLibrary(Loader.java:711)
at com.googlecode.javacpp.Loader.load(Loader.java:586)
at com.googlecode.javacpp.Loader.load(Loader.java:540)
at com.googlecode.javacv.cpp.avformat.<clinit>(avformat.java:40)
at com.googlecode.javacv.cpp.avformat.av_register_all(Native Method)
at com.leadtimeapp.io.ffmpeg.NewFFmpegFrameRecorder.<clinit>(NewFFmpegFrameRecorder.java:302)
at com.leadtimeapp.io.ffmpeg.FFmpegRecorderActivity.initVideoRecorder(FFmpegRecorderActivity.java:464)
at com.leadtimeapp.io.ffmpeg.FFmpegRecorderActivity.access$1400(FFmpegRecorderActivity.java:66)
at com.leadtimeapp.io.ffmpeg.FFmpegRecorderActivity$2.doInBackground(FFmpegRecorderActivity.java:368)
at com.leadtimeapp.io.ffmpeg.FFmpegRecorderActivity$2.doInBackground(FFmpegRecorderActivity.java:360)
at android.os.AsyncTask$2.call(AsyncTask.java:295)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
Caused by: java.lang.UnsatisfiedLinkError: dalvik.system.PathClassLoader[DexPathList[[zip file "/data/app/com.leadtimeapp.io.internal.debug-2/base.apk"],nativeLibraryDirectories=[/data/app/com.leadtimeapp.io.internal.debug-2/lib/arm64, /data/app/com.leadtimeapp.io.internal.debug-2/base.apk!/lib/arm64-v8a, /vendor/lib64, /system/lib64]]] couldn't find "libavformat.so"
at java.lang.Runtime.loadLibrary(Runtime.java:367)
at java.lang.System.loadLibrary(System.java:1076)
at com.googlecode.javacpp.Loader.loadLibrary(Loader.java:711)
at com.googlecode.javacpp.Loader.load(Loader.java:577)
at com.googlecode.javacpp.Loader.load(Loader.java:540)
at com.googlecode.javacv.cpp.avformat.<clinit>(avformat.java:40)
at com.googlecode.javacv.cpp.avformat.av_register_all(Native Method)
at com.leadtimeapp.io.ffmpeg.NewFFmpegFrameRecorder.<clinit>(NewFFmpegFrameRecorder.java:302)
at com.leadtimeapp.io.ffmpeg.FFmpegRecorderActivity.initVideoRecorder(FFmpegRecorderActivity.java:464)
at com.leadtimeapp.io.ffmpeg.FFmpegRecorderActivity.access$1400(FFmpegRecorderActivity.java:66)
at com.leadtimeapp.io.ffmpeg.FFmpegRecorderActivity$2.doInBackground(FFmpegRecorderActivity.java:368)
at com.leadtimeapp.io.ffmpeg.FFmpegRecorderActivity$2.doInBackground(FFmpegRecorderActivity.java:360)
at android.os.AsyncTask$2.call(AsyncTask.java:295)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
09-14 17:40:34.168 19728-19790/? E/NativeCrypto: ssl=0x7f643ae900
cert_verify_callback x509_store_ctx=0x7f76e3e1a0 arg=0x0
09-14 17:40:34.168 19728-19790/? E/NativeCrypto: ssl=0x7f643ae900
cert_verify_callback calling verifyCertificateChain authMethod=ECDHE_RSA
My gradle file is:
apply plugin: 'com.android.application'
apply plugin: 'io.fabric'
apply plugin: 'com.github.triplet.play'
apply plugin: 'project-report'
def buildVersionCode = new Date().format("yyMMddHHmm",
TimeZone.getTimeZone("Europe/Istanbul")).toInteger()
def buildVersionName = "1.1.2"
def SupportVersion = '25.3.1'
def OkHttpVersion = '3.2.0'
def RetrofitVersion = '2.0.2'
def FacebookVersion = '4.+'
def GSonVersion = '2.6.2'
android {
compileSdkVersion 25
buildToolsVersion "25.0.3"
playAccountConfigs {
defaultAccountConfig {
...
}
}
defaultConfig {
applicationId "...."
resValue "string", "app_name", "..."
resValue "string", "face_app_id", "...."
minSdkVersion 16
playAccountConfig = playAccountConfigs.defaultAccountConfig
//noinspection OldTargetApi
targetSdkVersion 25
versionCode buildVersionCode
versionName buildVersionName
multiDexEnabled true
renderscriptTargetApi 19
renderscriptSupportModeEnabled true
}
signingConfigs {
....
}
lintOptions {
abortOnError false // true by default
checkAllWarnings false
checkReleaseBuilds false
ignoreWarnings true // false by default
quiet true // false by default
}
sourceSets {
main.jni.srcDirs = []
main.jniLibs.srcDirs = ['libs']
}
buildTypes {
release {
minifyEnabled true
debuggable false
signingConfig signingConfigs.release
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
applicationVariants.all { variant ->
variant.outputs.each { output ->
output.outputFile = new File(output.outputFile.parent,
output.outputFile.name.replace("-release", "-release-" + buildVersionName + "-" + buildVersionCode)
)
}
}
}
debug {
debuggable true
zipAlignEnabled true
applicationIdSuffix ".debug"
}
productFlavors {
arm7 {
// in the future, ndk.abiFilter might also work
ndk {
abiFilter 'armeabi-v7a'
}
}
arm8 {
ndk {
abiFilters 'arm64-v8a'
}
}
arm {
ndk {
abiFilter 'armeabi'
}
}
x86 {
ndk {
abiFilter 'x86'
}
}
x86_64 {
ndk {
abiFilter 'x86_64'
}
}
mips {
ndk {
abiFilters 'mips', 'mips64'
}
}
universal {
ndk {
abiFilters 'mips', 'mips64', 'x86', 'x86_64'
}
}
}
}
lintOptions {
abortOnError false
checkReleaseBuilds false
checkAllWarnings true
htmlReport true
htmlOutput file("lint-report.html")
checkReleaseBuilds true
disable 'MissingTranslation'
disable 'UnusedResources'
disable 'IconLauncherShape'
disable 'GoogleAppIndexingApiWarning'
disable 'SelectableText'
error 'CommitTransaction'
error 'InconsistentArrays'
error 'Registered'
error 'ManifestOrder'
}
}
play {
track = 'alpha'
}
dependencies {
compile fileTree(dir: 'libs', include: ['*.jar', '*.so'])
compile files('libs/javacpp.jar')
compile files('libs/javacv.jar')
compile project(':camera')
compile project(':VideoRecorder')
compile("com.android.support:appcompat-v7:$SupportVersion") {
exclude group: 'com.android.support', module: 'support-v4'
exclude group: 'com.android.support', module: 'support-annotations'
}
compile("com.android.support:cardview-v7:$SupportVersion") {
exclude group: 'com.android.support', module: 'support-annotations'
}
compile("com.android.support:design:$SupportVersion") {
exclude group: 'com.android.support', module: 'support-core-ui'
exclude group: 'com.android.support', module: 'appcompat-v7'
exclude group: 'com.android.support', module: 'support-v4'
}
compile("com.android.support:support-v4:$SupportVersion") {
exclude group: 'com.android.support', module: 'support-media-compat'
exclude group: 'com.android.support', module: 'support-compat'
}
compile('com.android.support:multidex:1.0.1') {
exclude group: 'com.android.support', module: 'support-vector-drawable'
}
compile(group: 'org.bytedeco', name: 'javacv-platform', version: '1.3') {
exclude group: 'org.bytedeco.javacpp-presets'
}
compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '3.2.1-1.3'
compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '3.2.1-1.3', classifier: 'android-arm'
compile "com.google.code.gson:gson:$GSonVersion"
compile "com.squareup.okhttp3:okhttp:$OkHttpVersion"
compile "com.squareup.retrofit2:retrofit:$RetrofitVersion"
compile "com.facebook.android:facebook-android-sdk:$FacebookVersion"
compile('com.crashlytics.sdk.android:crashlytics:2.6.8@aar') {
transitive = true;
}
compile('com.twitter.sdk.android:twitter:2.3.0@aar') {
transitive = true;
}
compile 'com.davemorrissey.labs:subsampling-scale-image-view:3.5.0'
compile 'com.github.bumptech.glide:glide:3.7.0'
compile 'de.hdodenhof:circleimageview:2.0.0'
compile 'com.wdullaer:materialdatetimepicker:2.3.0'
compile 'com.github.zhaokaiqiang.klog:library:1.4.0'
compile 'com.github.filippudak.progresspieview:library:1.0.4'
compile 'com.daimajia.swipelayout:library:1.2.0@aar'
compile 'com.amazonaws:aws-android-sdk-s3:2.4.7'
compile 'io.paperdb:paperdb:1.5'
compile 'org.greenrobot:eventbus:3.0.0'
compile 'com.jakewharton:butterknife:8.4.0'
compile 'com.flipboard:bottomsheet-core:1.5.0'
compile 'com.flipboard:bottomsheet-commons:1.5.0'
compile 'com.yqritc:android-scalablevideoview:1.0.4'
compile 'com.github.bumptech.glide:okhttp3-integration:1.4.0@aar'
compile 'com.google.android.gms:play-services-gcm:11.0.4'
compile 'me.relex:circleindicator:1.1.8@aar'
compile 'com.github.ihsanbal:scissors:1.1.3'
compile 'org.zakariya.stickyheaders:stickyheaders:0.7.5'
compile 'com.parse.bolts:bolts-android:1.4.0'
compile 'com.parse:parse-android:1.15.8'
compile 'com.getbase:floatingactionbutton:1.10.1'
compile 'com.github.wooplr:Spotlight:1.2.3'
compile 'com.adjust.sdk:adjust-android:4.7.0'
compile 'com.google.android.gms:play-services-analytics:11.0.4'
compile 'com.makeramen:roundedimageview:2.3.0'
compile 'com.squareup.okhttp3:logging-interceptor:3.3.1'
compile 'com.github.castorflex.verticalviewpager:library:19.0.1'
annotationProcessor 'com.jakewharton:butterknife-compiler:8.4.0'
compile 'jp.wasabeef:blurry:2.1.1'
compile 'com.writingminds:FFmpegAndroid:0.3.2'
}
Make ffmpeg choose Nvidia CUDA over Intel QSV (Windows 10 with two video adapters)
3 April 2024, by Bart Lederman
I just set up a 'new' PC with built-in Intel video and an Nvidia card, mostly to speed up video processing with ffmpeg and other programs. At first the built-in Intel adapter was disabled and only the Nvidia card was running. ffmpeg worked as expected: the GPU could be used for decoding and encoding.


However: VirtualDub, a program I use frequently, has a problem with Nvidia cards (at least on Windows 10). The display gets screwed up, previews don't work, and all sorts of other problems occur. I tried all of the various discussion boards, and nobody has a good solution. (The problem is apparently split between VirtualDub and Nvidia, since other programs such as VideoLAN, Avidemux, HandBrake, OBS Studio, etc. all appear to work fine.)


So I re-enabled the on-board Intel adapter and made it the primary, and the only adapter with a monitor attached. The Nvidia card is still there, but with no monitor connected. I really only need it for hardware acceleration.


HandBrake and OBS Studio found the card and used it with no problem.


However, my batch file that specified cuda for both decoding and encoding failed to run. The ffmpeg command that includes -hwaccel cuda resulted in:


[h264 @ 000002783beaa700] Hardware is lacking required capabilities 
[h264 @ 000002783beaa700] Failed setup for format cuda: hwaccel initialisation returned error.
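
(For reference, the acceleration methods and device types compiled into a given build can be listed with the standard options below; I note them here mainly as a sanity check that cuda/nvdec support is actually present.)

ffmpeg -hwaccels
ffmpeg -init_hw_device list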



I also tried -hwaccel nvenc, which is rejected. It's apparently not a synonym in this version of ffmpeg:


ffmpeg version 4.3.1-2021-01-01-full_build-www.gyan.dev Copyright (c) 2000-2021 the FFmpeg developers
built with gcc 10.2.0 (Rev5, Built by MSYS2 project)
configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-lzma --enable-libsnappy --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libdav1d --enable-libzvbi --enable-librav1e --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
libavutil 56. 51.100 / 56. 51.100
libavcodec 58. 91.100 / 58. 91.100
libavformat 58. 45.100 / 58. 45.100
libavdevice 58. 10.100 / 58. 10.100
libavfilter 7. 85.100 / 7. 85.100
libswscale 5. 7.100 / 5. 7.100
libswresample 3. 7.100 / 3. 7.100
libpostproc 55. 7.100 / 55. 7.100



When I use QSV acceleration on my other PC I have to do this:


-init_hw_device qsv=qsv -hwaccel qsv



so I tried


-init_hw_device cuda=cuda -hwaccel cuda



but that didn't work either.
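
For the record, the documented -init_hw_device syntax also allows giving the device a name and an explicit GPU ordinal, and -hwaccel_device can then refer to that name. Something like this should be equivalent (the name gpu0 and the :0 ordinal are just placeholders for whichever ordinal the Nvidia card actually gets):

ffmpeg -init_hw_device cuda=gpu0:0 -hwaccel cuda -hwaccel_device gpu0 -i "input.avi" -c:v h264_nvenc "output.mp4"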


I've seen comments about the ability to select the GPU if there is more than one board installed, using the -gpu option. However, when I try to use -gpu 0 or -gpu 1 I get :


Codec AVOption gpu (Selects which NVENC capable GPU to use. First GPU is 0, second is 1, and so on.) specified for input file #0 (xxx.avi) is not a decoding option.
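
As far as I can tell from the documentation, -gpu is an option of the h264_nvenc/hevc_nvenc encoders rather than a decoding option, so it has to appear after the output codec; on the decoding side the device would be picked with -hwaccel_device instead. Roughly like this (untested placement, illustrative file names):

ffmpeg -hwaccel cuda -hwaccel_device 0 -i "input.avi" -c:a copy -c:v h264_nvenc -gpu 0 "output.mp4"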


I looked at:


https://github.com/FFmpeg/FFmpeg/commit/527a1e213167123d24d014bc0b956ef43d9d6542


to get more information on -init_hw_device, but I'm sorry to say that what's on that page makes no sense to me at all. There are no examples, and no explanation of how to actually select a device.


I looked at:


https://docs.nvidia.com/video-technologies/video-codec-sdk/ffmpeg-with-nvidia-gpu/


which has an 'example' of -init_hw_device, and I did a cut and paste of what they had there into my batch file, but it was rejected.


I also looked at:


How to to burn subtitles based image on video using ’overlay_cuda’, ffmpeg video filter


which has two examples of how to initialize a cuda device, and they don't work for me either. -init_hw_device cuda=cuda is accepted without error, but then -hwaccel cuda still fails. Trying to use the hw-accelerated filter scale_cuda also fails.


So how do I get the Nvidia card to decode video when it's not the only graphics adapter? I was able to decode video when only the Nvidia card was active, so there "must" be a way to get to it now. I just need to know how to tell ffmpeg to use the card that is there. Since it has no problem finding the card for encoding, shouldn't it also still be able to find the card for decoding and filters? Or am I really the first person ever to have both Intel and Nvidia graphics adapters working on the same system and trying to use ffmpeg with hardware acceleration?


=====================


Latest update.


I had tried the examples on the Nvidia FFmpeg transcoding guide web page, and as mentioned previously I still got errors. I did a cut and paste from that web page to my command window, and ffmpeg still did not find the correct graphics adapter.


However, I do have a work-around. I don't particularly like it, but it works.


First: Windows 10 does not understand the concept of a graphics adapter that doesn't have a monitor attached to it. Even though graphics processors (specifically Nvidia) are available without any actual video output and are used in supercomputers and elsewhere for high-speed stream processing, Windows will not let you access the card's settings if there is no monitor attached. The Nvidia control center also will not allow you to access any of the card's settings, and you can't set processor affinity.


So I connected a second monitor, and set up the Nvidia card as the primary.


Now ffmpeg -hwaccel cuda works the first time. The command I was using before:


ffmpeg -hide_banner -hwaccel cuda -i "input.avi" -c:a copy -ac 1 -c:v h264_nvenc -preset hq -movflags faststart -qp 30 "output.mp4"



was failing because it couldn't find the Nvidia adapter. This command now works correctly the first time and uses hardware acceleration for both decode and encode. (The audio portion is irrelevant; if I also re-encode the audio the results are the same.)


With scaling, the command was like this:


ffmpeg -hide_banner -hwaccel cuda -i "input.avi" -c:a copy -ac 1 -c:v h264_nvenc -preset hq -vf "scale=640:480" -movflags faststart -qp 30 "output.mp4"



This works. However:


ffmpeg -hide_banner -hwaccel cuda -i "input.avi" -c:a copy -ac 1 -c:v h264_nvenc -preset hq -vf "scale_cuda=640:480" -movflags faststart -qp 30 "output.mp4"



fails with:


Impossible to convert between the formats supported by the filter 'graph 0 input from stream 0:0' and the filter 'auto_scaler_0'
Error reinitializing filters!
Failed to inject frame into filter network: Function not implemented
Error while processing the decoded data for stream #0:0



I was able to get around this by rearranging things into what seems to me to be an unnecessarily convoluted syntax.


ffmpeg -hide_banner -hwaccel cuvid -hwaccel_output_format cuda -i "input.avi" -c:a aac -b:a 192k -ar 48000 -vf "scale_cuda=856:480" -c:v h264_nvenc -preset hq -movflags faststart -qp 26 "output.mp4"



Having to specify the output format twice seems weird, but Task Manager shows near 100% Video Decode activity, and the time it takes to do this indicates to me that the scale_cuda filter is being used.
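
(My reading of the documentation, so treat this as interpretation rather than fact: -hwaccel_output_format cuda is what keeps the decoded frames in GPU memory, which is why scale_cuda can consume them directly. The same pipeline spelled with the cuda hwaccel instead of cuvid should behave the same way:)

ffmpeg -hide_banner -hwaccel cuda -hwaccel_output_format cuda -i "input.avi" -c:a aac -b:a 192k -ar 48000 -vf "scale_cuda=856:480" -c:v h264_nvenc -preset hq -movflags faststart -qp 26 "output.mp4"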


I don't particularly like having to use a second monitor (If VirtualDub worked properly I probably wouldn't have to), but I'm willing to live with it. It appears that if you have two different video cards and you want to use hardware acceleration on one of them it has to be the primary.


I haven't tested if Intel QSV is still accessible, nor have I tried switching the order of the graphics adapters back to completely verify the source of the problem, and I'm not really planning to do so (unless some of you think that would be useful). I get the definite impression that few people, if any, have tried to get both an Nvidia and an Intel adapter to provide hardware video acceleration on the same system. I will try to access QSV to see if using both accelerators is an improvement.


I can live with the weird command line to get the cuda filters to work, but if anyone knows a better way to do it I think it would be helpful to post it here for future reference, in case anyone else runs into a similar problem. None of the examples of using cuda-accelerated filters that I've found on any of the many websites I've read worked exactly as given.


==================


The good news:


It's possible to use both Nvidia and QSV hardware in at least some cases.


This command works:


ffmpeg -hide_banner -hwaccel dxva2 -i "input.avi" -c:a copy ^
 -c:v h264_qsv -vf "crop=1920:1044:0:0" -preset veryfast -profile:v high -level 4.1 -qp 22 "output.mp4"



Task Manager says Nvidia is decoding the input, and GPU-Z says Intel is also active, so it must be doing the encoding.


The bad news: I can't figure out a way to use both a CUDA filter and a standard filter in the same process.


This does not work:


ffmpeg -hide_banner -hwaccel cuvid -hwaccel_output_format cuda -i "input.avi" -c:a aac -b:a 192k -ar 48000 -vf "scale_cuda=856:480,crop=1280:696:0:24" -c:v h264_nvenc -preset hq -movflags faststart -qp 30 "output.mp4"



Reversing the order of scale_cuda and crop (with appropriate adjustments to the numbers) also does not work. There are errors about not being able to transfer the processing stream.
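
One pattern described in the ffmpeg filtering documentation for mixing a GPU filter with a CPU filter is to download the frames back to system memory between the two with hwdownload. A sketch of what that would look like here (the crop numbers are only placeholders and must fit inside the scaled frame; h264_nvenc accepts the resulting system-memory frames):

ffmpeg -hide_banner -hwaccel cuda -hwaccel_output_format cuda -i "input.avi" -c:a copy -vf "scale_cuda=856:480,hwdownload,format=nv12,crop=856:452:0:24" -c:v h264_nvenc -preset hq -qp 30 "output.mp4"

This gives up part of the benefit of keeping everything on the GPU, but it should at least let the two kinds of filters coexist in one command.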


I will try the changes in the latest comment, but I think I may have tried it before and something didn't work. But I will check again.


In my web searches I have not found an example of 'mixed' filters.


I did see "-crop" and "-resize" on the Nvidia ffmpeg transcode web page, similar to this:


–crop 0x36x0x0 –resize 1280x696



Once again, I did a cut and paste from the Nvidia web page into my command window and it didn't work. If there is a way to invoke these Nvidia options that has been tested and found to actually work, I would really like to see it.
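
From the ffmpeg documentation, -crop and -resize appear to be options of the cuvid decoders (h264_cuvid and friends), so my understanding is that they only take effect when that decoder is selected explicitly and the options are placed before -i. Note also that the dashes quoted above from the Nvidia page are typographic dashes, which ffmpeg will not parse as option prefixes, so they have to be retyped as plain hyphens. A sketch of what I mean (untested here):

ffmpeg -hide_banner -c:v h264_cuvid -crop 0x36x0x0 -resize 1280x696 -i "input.avi" -c:a copy -c:v h264_nvenc -preset hq -qp 30 "output.mp4"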