
Other articles (75)
-
(De)Activating features (plugins)
18 February 2011, by
To manage the addition and removal of extra features (plugins), MediaSPIP uses SVP as of version 0.2.
SVP makes it easy to activate plugins from the MediaSPIP configuration area.
To get there, go to the configuration area and then open the "Gestion des plugins" page.
MediaSPIP ships by default with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work perfectly with each (...) -
Enabling visitor registration
12 April 2011, by
It is also possible to enable visitor registration, which lets anyone open an account themselves on the channel in question, for example in the context of open projects.
To do so, go to the site's configuration area and choose the "Gestion des utilisateurs" submenu. The first form shown corresponds to this feature.
By default, MediaSPIP created at initialisation a menu item in the top menu of the page leading to (...) -
The farm's regular Cron tasks
1 December 2010, by
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance of the shared farm on a regular basis. Combined with a system Cron on the central site of the farm, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
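By way of illustration only (the URL and schedule below are hypothetical, not taken from the MediaSPIP documentation), such a system Cron can be as simple as a crontab entry on the central server that fetches the central site once a minute, which is enough to keep the super Cron firing:
# Hypothetical crontab entry on the central server of the farm
* * * * * curl -fsS https://farm-central.example.org/ > /dev/null 2>&1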
On other sites (8135)
-
java.lang.UnsatisfiedLinkError: Couldn't load ffmpeg library, findLibrary returned null
22 February 2017, by Muthukumar Subramaniam
I am new to live streaming from Android to YouTube. My project structure and Gradle build file are shown below; the FFmpeg library cannot be loaded at runtime (a packaging check is sketched after the code).
Note: I am working on Windows 10 with Android Studio 2.1.2.
apply plugin: 'com.android.application'

android {
    compileSdkVersion 23
    buildToolsVersion "23.0.1"

    packagingOptions {
        exclude 'META-INF/DEPENDENCIES'
        exclude 'META-INF/NOTICE'
        exclude 'META-INF/NOTICE.txt'
        exclude 'META-INF/LICENSE'
        exclude 'META-INF/LICENSE.txt'
    }

    defaultConfig {
        applicationId "com.ephron.mobilizerapp"
        minSdkVersion 14
        targetSdkVersion 23
        versionCode 1
        versionName "1.4"
        multiDexEnabled true
    }

    dexOptions {
        javaMaxHeapSize "4g"
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
    }

    sourceSets {
        main {
            assets.srcDirs = ['src/main/assets', 'src/main/assets/']
        }
    }

    sourceSets {
        main {
            jni.srcDirs = ['src/main/jni', 'src/main/jni/libs']
            jni.srcDirs = ['libs']
            // The last assignment wins: an empty jni.srcDirs disables the automatic NDK build for this module.
            jni.srcDirs = []
        }
    }
}

dependencies {
    testCompile 'junit:junit:4.12'
    compile files('libs/httpclient-4.5.2.jar')
    compile files('libs/httpcore-4.4.4.jar')
    compile files('libs/httpmime-4.2.1.jar')
    compile files('libs/YouTubeAndroidPlayerApi.jar')
    compile 'com.android.support:appcompat-v7:23.1.0'
    compile 'com.google.android.gms:play-services:10.0.1'
    compile 'com.google.code.gson:gson:2.2.2'
    compile 'com.google.firebase:firebase-messaging:9.2.0'
    compile 'testfairy:testfairy-android-sdk:1.+@aar'
    compile 'com.android.support:multidex:1.0.0'
    compile 'com.mcxiaoke.volley:library:1.0.19'
    compile 'com.squareup.picasso:picasso:2.5.2'
    compile 'cn.aigestudio.wheelpicker:WheelPicker:1.1.2'
    compile 'com.google.android.gms:play-services-maps:10.0.1'
    compile 'com.google.apis:google-api-services-youtube:v3-rev182-1.22.0'
    compile 'com.google.api-client:google-api-client-android:1.22.0'
    compile 'com.google.http-client:google-http-client-gson:1.19.0'
    compile 'com.google.android.gms:play-services-ads:10.0.1'
    compile 'com.google.android.gms:play-services-auth:10.0.1'
    compile 'com.google.android.gms:play-services-gcm:10.0.1'
    compile files('libs/ffmpeg-android.jar')
}
apply plugin: 'com.google.gms.google-services'

My project structure is shown at the link below:
https://www.screencast.com/t/E0TFsMUi1
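Note that the Gradle file above ends up with jni.srcDirs = [], which turns off the automatic NDK build, so a prebuilt libffmpeg.so has to reach the APK some other way. As a sketch only (these paths are the Android Gradle plugin's default jniLibs location, not taken from the screenshot), System.loadLibrary("ffmpeg") can only succeed if the APK ends up containing something like:
app/src/main/jniLibs/armeabi-v7a/libffmpeg.so
app/src/main/jniLibs/arm64-v8a/libffmpeg.so
app/src/main/jniLibs/x86/libffmpeg.so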
Application.mk file
APP_OPTIM := release
APP_ABI := all
APP_STL := gnustl_static
APP_CPPFLAGS := -frtti -fexceptions

Android.mk file
#
# Copyright (c) 2014 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except
# in compliance with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software distributed under the License
# is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express
# or implied. See the License for the specific language governing permissions and limitations under
# the License.
# set NDK_PROJECT_PATH := D:/MobilizerApp/app/src/main/jni
WORKING_DIR := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_PATH := $(WORKING_DIR)/../third_party/lame/libmp3lame
LOCAL_MODULE := lame
LOCAL_C_INCLUDES := $(WORKING_DIR)/../third_party/lame/libmp3lame \
$(WORKING_DIR)/../third_party/lame/include
LOCAL_CFLAGS := -DSTDC_HEADERS -std=c99
LOCAL_ARM_MODE := arm
APP_OPTIM := release
LOCAL_SRC_FILES := VbrTag.c \
bitstream.c \
encoder.c \
fft.c \
gain_analysis.c \
id3tag.c \
lame.c \
mpglib_interface.c \
newmdct.c \
presets.c \
psymodel.c \
quantize.c \
quantize_pvt.c \
reservoir.c \
set_get.c \
tables.c \
takehiro.c \
util.c \
vbrquantize.c \
version.c
#include $(BUILD_STATIC_LIBRARY)
include $(BUILD_STATIC_LIBRARY)
#include $(CLEAR_VARS)
#LOCAL_MODULE := mp3lame_a
#LOCAL_STATIC_LIBRARIES := lame
#include $(BUILD_EXECUTABLE)
include $(CLEAR_VARS)
LOCAL_PATH := $(WORKING_DIR)
LOCAL_MODULE := ffmpeg
LOCAL_CFLAGS := -DHAVE_AV_CONFIG_H -std=c99 -D__STDC_CONSTANT_MACROS -DSTDC_HEADERS -Wno-deprecated-declarations
LOCAL_SRC_FILES := ffmpeg-jni.c
LOCAL_C_INCLUDES := $(WORKING_DIR)/libavcodec $(WORKING_DIR)/libavcodec/arm $(WORKING_DIR)/libavformat $(WORKING_DIR)/libavutil $(WORKING_DIR)/libavutil/arm
LOCAL_STATIC_LIBRARIES := lame
LOCAL_LDLIBS := -llog -lm -lz $(WORKING_DIR)/../third_party/lib/libavformat.a $(WORKING_DIR)/../third_party/lib/libavcodec.a $(WORKING_DIR)/../third_party/lib/libavfilter.a $(WORKING_DIR)/../third_party/lib/libavresample.a $(WORKING_DIR)/../third_party/lib/libswscale.a $(WORKING_DIR)/../third_party/lib/libavutil.a $(WORKING_DIR)/../third_party/lib/libx264.a $(WORKING_DIR)/../third_party/lib/libpostproc.a $(WORKING_DIR)/../third_party/lib/libswresample.a $(WORKING_DIR)/../third_party/lib/libfdk-aac.a
APP_OPTIM := release
include $(BUILD_SHARED_LIBRARY)

Ffmpeg.java
package com.ephronsystem.mobilizerapp;

public class Ffmpeg {

    static {
        System.loadLibrary("ffmpeg");
    }

    public static native boolean init(int width, int height, int audio_sample_rate, String rtmpUrl);

    public static native void shutdown();

    // Returns the size of the encoded frame.
    public static native int encodeVideoFrame(byte[] yuv_image);

    public static native int encodeAudioFrame(short[] audio_data, int length);
}
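One quick way to tell whether this is a packaging problem rather than a code problem (a hypothetical check, not part of the original post): "findLibrary returned null" generally means that no libffmpeg.so was packaged for the ABI of the device or emulator running the app. Listing the native libraries inside the built APK shows what actually shipped, for example with unzip in a Unix-like shell:
# List the native libraries packaged per ABI; the APK path is an assumed default output location.
unzip -l app/build/outputs/apk/app-debug.apk | grep 'lib/'
If no lib/<abi>/libffmpeg.so line appears for the test device's ABI, the UnsatisfiedLinkError is expected regardless of the Java code.
-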
Why are Cb and Cr planes displaced differently from lum by the displace complex filter in ffmpeg?
6 May 2017, by Neb
I have a video encoded with the yuv420p pixel format and I want to displace its pixels. I'm using ffmpeg and its new displace filter. The filter takes as inputs (besides the video to be displaced) two displacement maps, one for the X axis and one for the Y axis. I decided to create the displacement maps directly in ffmpeg, using the nullsrc video source and the geq filter to set the values of the three planes: lum, Cb and Cr. The script is the following:
ffmpeg -i INPUT.mp4 -f lavfi -i nullsrc=size=${WIDTH}x${HEIGHT}:d=0.1,geq='lum=128+30*sin(2*PI*X/400):Cb=128+30*sin(2*PI*X/400):Cr=128+30*sin(2*PI*X/400)' -f lavfi -i nullsrc=size=${WIDTH}x${HEIGHT}:d=0.1,geq='lum=128+30*sin(2*PI*X/400):Cb=128+30*sin(2*PI*X/400):Cr=128+30*sin(2*PI*X/400)' -lavfi '[0][1][2]displace' OUTPUT.mp4
I used the example provided in the ffmpeg documentation, since the exact expression used in geq is irrelevant to the problem.
At the end of the processing, the pixels of the input video are not displaced properly: I can clearly see a sort of ghost video carrying the colour information underneath a displaced but black-and-white one.
After some tests, I noticed that in the displacement maps only the luma plane came out as expected, while the chrominance planes were displaced too, but differently from luma, which is the origin of the plane misalignment in the resulting video, as you can see in the extracted frames. I also noticed that the videos describing the Cb and Cr planes of the displacement maps have half the resolution of the luma plane.
My question is: how can I correctly set up the Cb and Cr planes in the geq definition so that they are exactly identical to the luma plane?
It would also be great if someone could explain why ffmpeg gives such different output for the luma and Cb/Cr planes even though the expression provided is the same.
If it can help, I'm using the ffmpeg 3.3 static build.
Thanks for your time.
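A plausible explanation and workaround, offered here as a hypothesis rather than a confirmed fix: geq evaluates each plane's expression on that plane's own grid, and in a yuv420p stream the Cb and Cr planes are only half the size of the luma plane, so the same expression sampled over the half-size chroma grid produces a different pattern, and displace then shifts the chroma planes differently from luma. Forcing the map streams into a non-subsampled pixel format before geq should make the three planes identical; a sketch along the lines of the original command (untested, with the same WIDTH/HEIGHT placeholders and the documented lowercase lum/cb/cr option names):
ffmpeg -i INPUT.mp4 -f lavfi -i nullsrc=size=${WIDTH}x${HEIGHT}:d=0.1,format=yuv444p,geq='lum=128+30*sin(2*PI*X/400):cb=128+30*sin(2*PI*X/400):cr=128+30*sin(2*PI*X/400)' -f lavfi -i nullsrc=size=${WIDTH}x${HEIGHT}:d=0.1,format=yuv444p,geq='lum=128+30*sin(2*PI*X/400):cb=128+30*sin(2*PI*X/400):cr=128+30*sin(2*PI*X/400)' -lavfi '[0][1][2]displace' OUTPUT.mp4
With format=yuv444p in front of geq, all three planes of each map have the full WIDTHxHEIGHT resolution, so identical expressions yield identical planes. If the filter graph still negotiates everything back to yuv420p, forcing the main input through format=yuv444p as well may be necessary.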
-
vf_hwmap: Add reverse mapping for hardware frames
4 March 2017, by Mark Thompson
vf_hwmap: Add reverse mapping for hardware frames
This is something of a hack. It allocates a new hwframe context for
the target format, then maps it back to the source link and overwrites
the input link hw_frames_ctx so that the previous filter will receive
the frames we want from ff_get_video_buffer(). It may fail if
the previous filter imposes any additional constraints on the frames
it wants to use as output.