
Other articles (26)
- MediaSPIP Core: Configuration
9 November 2010 — By default, MediaSPIP Core provides three different configuration pages (these pages rely on the CFG configuration plugin to work): a page specific to the general configuration of the skeleton; a page specific to the configuration of the site's home page; a page specific to the configuration of the sectors.
It also provides an additional page, which only appears when certain plugins are enabled, for controlling the display and the specific features (...)
- Customising by adding your logo, banner or background image
5 September 2013 — Some themes take three customisation elements into account: adding a logo; adding a banner; adding a background image.
- Media-specific libraries and software
10 December 2010 — For correct and optimal operation, several things need to be taken into consideration.
After installing apache2, mysql and php5, it is important to install the other required software, whose installation is described in the related links: a set of multimedia libraries (x264, libtheora, libvpx) used for encoding and decoding video and audio, in order to support as many file types as possible (cf. this tutorial); FFMpeg with the maximum number of decoders and (...)
On other sites (6973)
- Building a shared library from static libraries for ffmpeg 2.5.2
7 January 2015, by abijinx
I am currently building a shared library for ffmpeg as myffmpeg.so, using ffmpeg 0.8.6. I achieve this by combining static libs of the individual ffmpeg modules with a few additional static libraries. The main makefile used is as follows:
LOCAL_PATH := $(call my-dir)
include $(LOCAL_PATH)/Android_Files.mk
include $(CLEAR_VARS)
LOCAL_MODULE:= libavcodec
LOCAL_SRC_FILES:= $(MY_AVCODEC_FILES)
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE:= libavfilter
LOCAL_SRC_FILES:= $(MY_AVFILTER_FILES)
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE:= libavformat
LOCAL_SRC_FILES:= $(MY_AVFORMAT_FILES)
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE:= libavutil
LOCAL_SRC_FILES:= $(MY_AVUTIL_FILES)
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE:= libswscale
LOCAL_SRC_FILES:= $(MY_SWSCALE_FILES)
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE:= ffmpeg_myc
LOCAL_SRC_FILES:= ffmpeg.c cmdutils.c
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := myffmpeg
LOCAL_WHOLE_STATIC_LIBRARIES := libavcodec libavfilter libavformat libavutil libswscale ffmpeg_c
LOCAL_LDFLAGS += -lz -lm -llog
ifeq ($(FF_ENABLE_AMR),yes)
LOCAL_WHOLE_STATIC_LIBRARIES += opencore-amrnb opencore-amrwb
endif
ifeq ($(FF_ENABLE_AAC),yes)
LOCAL_WHOLE_STATIC_LIBRARIES += vo-aacenc
endif
include $(BUILD_SHARED_LIBRARY)
The above makefile uses only a few selected source files, as listed in the FF_SOURCE_FILES variables, to build the individual static libs, then combines all of those with the additional libs to create the shared library myffmpeg.so.
Now I am trying to get a similar output with ffmpeg 2.5.2 for the 64-bit ARM architecture. This time I have built the latest ffmpeg using the following configure script and generated the static libraries of the different ffmpeg modules, such as libavcodec.a, libavfilter.a, etc.:
#!/bin/bash
ABI=aarch64-linux-android
NDK=
SYSROOT=$NDK/platforms/android-21/arch-arm64/
TOOLCHAIN=$NDK/toolchains/aarch64-linux-android-4.9/prebuilt/linux-x86_64
CPU=arm64
PREFIX=$(pwd)/android/$CPU
./configure \
--prefix=$PREFIX \
--cc=$TOOLCHAIN/bin/$ABI-gcc \
--enable-static \
--disable-doc \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-network \
--disable-ffserver \
--disable-devices \
--disable-avdevice \
--disable-swscale-alpha \
--disable-doc \
--disable-symver \
--disable-neon \
--enable-optimizations \
--cross-prefix=$TOOLCHAIN/bin/aarch64-linux-android- \
--target-os=linux \
--arch=arm64 \
--enable-cross-compile \
--sysroot=$SYSROOT \
--enable-libopencore-amrnb \
--enable-libopencore-amrwb \
--enable-libvo-aacenc \
$ADDITIONAL_CONFIGURE_FLAG
make clean
make
make install
Now I am trying to combine the generated static libraries, i.e.
- libavcodec.a
- libavfilter.a
- libavformat.a
- libavutil.a
- libswresample.a
- libswscale.a
with
- ffmpeg_myc
- opencore-amrnb
- opencore-amrwb
and generate a myffmpeg.so shared library by running ndk-build.
I tried copying the generated ffmpeg static libraries from the ffmpeg/android folder into obj/local/arm64-v8a (the path where all the static libs were previously created with 0.8.6, according to the makefile). I also changed the makefile, expecting it to use the already generated ffmpeg libraries. But I found that the generated shared library was far too small and did not include the generated ffmpeg libs.
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := myffmpeg
LOCAL_WHOLE_STATIC_LIBRARIES := libavcodec libavfilter libavformat libavutil libswscale ffmpeg_c
LOCAL_LDFLAGS += -lz -lm -llog
ifeq ($(FF_ENABLE_AMR),yes)
LOCAL_WHOLE_STATIC_LIBRARIES += opencore-amrnb opencore-amrwb
endif
ifeq ($(FF_ENABLE_AAC),yes)
LOCAL_WHOLE_STATIC_LIBRARIES += vo-aacenc
endif
include $(BUILD_SHARED_LIBRARY)
I would like to know whether I am on the right track and only need to make some minor changes, or whether I should start with an entirely different approach. Any suggestions would be helpful.
Thanks in advance
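One plausible explanation for the tiny .so is that this second makefile references libavcodec, libavfilter and the other modules in LOCAL_WHOLE_STATIC_LIBRARIES without ever defining them, so ndk-build has nothing to link. A common way to hand already-built archives to ndk-build is to declare each one as a prebuilt static library. The following is only an untested sketch: the FFMPEG_LIB_PATH value is an assumption based on the PREFIX used in the configure script above and must point at wherever make install actually placed the .a files.

LOCAL_PATH := $(call my-dir)

# Assumed location of the ffmpeg 2.5.2 archives, relative to this Android.mk
# (configure used PREFIX=$(pwd)/android/$CPU with CPU=arm64).
FFMPEG_LIB_PATH := ffmpeg/android/arm64/lib

include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
LOCAL_SRC_FILES := $(FFMPEG_LIB_PATH)/libavcodec.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavformat
LOCAL_SRC_FILES := $(FFMPEG_LIB_PATH)/libavformat.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavfilter
LOCAL_SRC_FILES := $(FFMPEG_LIB_PATH)/libavfilter.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libavutil
LOCAL_SRC_FILES := $(FFMPEG_LIB_PATH)/libavutil.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libswresample
LOCAL_SRC_FILES := $(FFMPEG_LIB_PATH)/libswresample.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := libswscale
LOCAL_SRC_FILES := $(FFMPEG_LIB_PATH)/libswscale.a
include $(PREBUILT_STATIC_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := myffmpeg
# Whole-archive linking copies every object from the prebuilt archives into the
# shared library instead of keeping only the symbols the wrapper code references.
LOCAL_WHOLE_STATIC_LIBRARIES := libavcodec libavformat libavfilter libavutil libswresample libswscale
LOCAL_LDLIBS := -lz -lm -llog
include $(BUILD_SHARED_LIBRARY)

The ffmpeg_myc module built from ffmpeg.c and cmdutils.c, and the opencore-amrnb/amrwb and vo-aacenc blocks, can be added back to the final module exactly as in the 0.8.6 makefile. If only part of the FFmpeg API is actually called, using LOCAL_STATIC_LIBRARIES instead of LOCAL_WHOLE_STATIC_LIBRARIES lets the linker drop unused objects and yields a noticeably smaller .so.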
- How to convert images to video using FFMpeg for embedded applications?
19 April 2019, by zthatch56
I'm encoding images as video with FFmpeg using custom C code rather than Linux commands, because I am developing the code for an embedded system.
I am currently working through the first dranger tutorial and the code provided in the following question.
I have found some "less abstract" code at the following GitHub location, and I plan to use it as well:
https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/encode_video.c
My end goal is simply to save video on an embedded system using embedded C source code, and I am coming up the learning curve too slowly. So, in summary, my question is: does it seem like I am following the correct path here? I know that my system does not come with hardware for video codec conversion, which means I need to do it in software, but I am unsure whether FFmpeg is even a feasible option for embedded work, because I have yet to compile it.
The biggest red flag for me thus far is that FFmpeg uses dynamic memory allocation. I am unfamiliar with how to assess the amount of dynamic memory that it uses. This is very important information to me, and if anyone is familiar with the amount of memory used or how to assess it before compiling, I would greatly appreciate the input.
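For the feasibility question, one low-effort way to get a feel for FFmpeg's footprint on a constrained target, before tackling heap usage, is to configure a deliberately minimal build and inspect the resulting library sizes. The sketch below is purely illustrative: the mpeg4 encoder, mp4 muxer and file protocol are placeholder names for whatever the project actually needs, and real cross-compilation flags (toolchain, sysroot, arch) would still have to be added, as in the configure script quoted earlier on this page.

#!/bin/bash
# Illustrative minimal FFmpeg build: disable everything, then re-enable only the
# pieces the application needs (component names here are placeholders).
./configure \
  --disable-everything \
  --disable-programs \
  --disable-doc \
  --disable-network \
  --disable-avdevice \
  --enable-small \
  --enable-encoder=mpeg4 \
  --enable-muxer=mp4 \
  --enable-protocol=file \
  --enable-static \
  --disable-shared
make

As for dynamic memory, most of the allocation in a typical encode path goes to the codec context and the frame and packet buffers, so peak heap usage grows roughly with frame resolution and the number of frames kept in flight; measuring it with a malloc wrapper or a heap profiler on a desktop build of the same code path is usually more reliable than trying to estimate it before compiling.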
- JavaCV: Video concatenation [on hold]
20 June 2014, by user3266577
I am working on a project that concatenates videos together, but the only option I have found is to use FFmpegFrameGrabbers to iterate through both files, grab their frames, and then generate a new file from them using an FFMpegRecorder, which seems quite inefficient given the number of frames I have to loop through.
This feels like going the long way around the problem; is there a simpler way of concatenating two videos generated by FFMpegRecorder?