
Media (1)
-
Ogg detection bug
22 March 2013, by
Updated: April 2013
Language: French
Type: Video
Other articles (40)
-
The farm's regular Cron tasks
1 December 2010, by
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance of the shared farm on a regular basis. Coupled with a system Cron on the central site of the farm, this generates regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)
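As a rough illustration, the system Cron on the central site usually comes down to a crontab entry that requests the site at a fixed interval; the URL below is only a placeholder, not the real farm address:
# Hypothetical crontab entry on the central server: one request per minute,
# so the super Cron (and, through it, each instance's Cron) is triggered regularly.
* * * * * wget -q -O /dev/null "https://farm.example.org/"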
-
The accepted formats
28 January 2010, by
The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
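A quick, hypothetical way to narrow those listings down to a specific codec or container (the names grepped for here are just examples):
# Is H.264 support available in this build?
ffmpeg -codecs 2>/dev/null | grep -i h264
# Are the ogg container and the theora codec present?
ffmpeg -formats 2>/dev/null | grep -i ogg
ffmpeg -codecs 2>/dev/null | grep -i theora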
The video formats accepted as input
This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
The possible output video formats
To begin with, we (...)
-
The authorizations overridden by plugins
27 April 2010, by
MediaSPIP core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page
On other sites (5946)
-
Android - Choosing between MediaRecorder, MediaCodec and FFmpeg
15 March 2017, by Rohan Stark
I am working on a video recording and sharing application for Android. The specifications of the app are as follows:
- Recording a 10 second (maximum) video from inside the app (not using the device’s camera app)
- No further editing on the video
- Storing the video in a Firebase Cloud Storage (GCS) bucket
- Downloading and playing of the said video by other users
From the research I did on SO and other sources, I have found the following (please correct me if I am wrong):
The three options and their respective features are:
1. FFmpeg
- Capable of achieving the above goal and has extensive answers and explanations on sites like SO, however
- Increases the APK size by 20-30 MB (large library)
- Runs the risk of not working properly on certain 64-bit devices
2. MediaRecorder
- Reliable and supported by most devices
- Will store files in .mp4 format (unless converted to h264)
- Easier for playback (no decoding needed)
- Adds the mp4 and 3gp headers
- Increases latency according to this question
3. MediaCodec
- Low level
- Will require MediaCodec, MediaMuxer, and MediaExtractor
- Output in h264 (without using MediaMuxer for playback)
- Good for video manipulations (though, not required in my use case)
- Not supported by pre-4.3 (API 18) devices
- More difficult to implement and code (my opinion - please correct me if I am wrong)
- Unavailability of extensive information, tutorials, answers or samples (Bigflake.com being the only exception)
After spending days on this, I still can’t figure out which approach suits my particular use case. Please elaborate on what I should do for my application. If there’s a completely different approach, then I am open to that as well.
My biggest criteria are that the video encoding process be as efficient as possible and that the video stored in the cloud take up as little space as possible without compromising on video quality.
Also, I’d be grateful if you could suggest the appropriate format for saving and distributing the video in Firebase Storage, and point me to tutorials or samples of your suggested approach.
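As a rough illustration of the output I am aiming for (the CRF, bitrate and file names below are placeholders, not settings I have decided on), something along these lines would be acceptable:
# Hypothetical re-encode of a recorded clip before upload:
# H.264 video + AAC audio in MP4, quality-based rate control,
# moov atom moved to the front so playback can start while downloading.
ffmpeg -i recorded_clip.mp4 \
       -c:v libx264 -preset medium -crf 23 \
       -c:a aac -b:a 96k \
       -movflags +faststart \
       upload_ready.mp4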
Thank you in advance! And sorry for the long read.
-
ffmpeg stream ip cam from rtsp to http
15 March 2017, by mihalp
I'm trying to restream an IP camera from my home network to a web page (a sort of video player in WordPress). I have the RTSP URL of the camera and I want an HTTP output. It has been bothering me for 2 days now and I'm not sure what I'm missing...
My ffserver.conf file:
HTTPPort 8090 # Port to bind the server to
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 10000 # Maximum bandwidth per client
RTSPPort 8002
# set this high enough to exceed stream bitrate
CustomLog -
<Feed cam.ffm>
File /tmp/cam.ffm
FileMaxSize 10G
ACL allow 192.168.0.0 192.168.255.255
</Feed>
<Stream cam.mpg>
Feed cam.ffm
Format mpeg
VideoCodec libx264
VideoSize 1280x720
VideoFrameRate 6
VideoBufferSize 40
AVOptionVideo flags +global_header
PreRoll 15
StartSendOnKey
VideoBitRate 4096
NoAudio
</Stream>
<Stream status.html> # Server status URL
Format status
# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>
<Redirect index.html> # Just a URL redirect for index
# Redirect index.html to the appropriate site
URL url/
</Redirect>
When I enter ffserver -f /etc/ffserver.conf, I get:
root@test:/tmp# ffserver -f /etc/ffserver.conf
ffserver version 2.6.9 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.9.2 (Debian 4.9.2-10)
configuration: --prefix=/usr --extra-cflags='-g -O2 -fstack-protector-strong -Wformat -Werror=format-security ' --extra-ldflags='-Wl,-z,relro' --cc='ccache cc' --enable-shared --enable-libmp3lame --enable-gpl --enable-nonfree --enable-libvorbis --enable-pthreads --enable-libfaac --enable-libxvid --enable-postproc --enable-x11grab --enable-libgsm --enable-libtheora --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libx264 --enable-libspeex --enable-nonfree --disable-stripping --enable-libvpx --enable-libschroedinger --disable-encoder=libschroedinger --enable-version3 --enable-libopenjpeg --enable-librtmp --enable-avfilter --enable-libfreetype --enable-libvo-aacenc --disable-decoder=amrnb --enable-libvo-amrwbenc --enable-libaacplus --libdir=/usr/lib/x86_64-linux-gnu --disable-vda --enable-libbluray --enable-libcdio --enable-gnutls --enable-frei0r --enable-openssl --enable-libass --enable-libopus --enable-fontconfig --enable-libpulse --disable-mips32r2 --disable-mipsdspr1 --disable-mipsdspr2 --enable-libvidstab --enable-libzvbi --enable-avresample --disable-htmlpages --disable-podpages --enable-libutvideo --enable-libfdk-aac --enable-libx265 --enable-libiec61883 --enable-vaapi --enable-libdc1394 --disable-altivec --shlibdir=/usr/lib/x86_64-linux-gnu
libavutil 54. 20.100 / 54. 20.100
libavcodec 56. 26.100 / 56. 26.100
libavformat 56. 25.101 / 56. 25.101
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 11.102 / 5. 11.102
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 1.100 / 1. 1.100
libpostproc 53. 3.100 / 53. 3.100
/etc/ffserver.conf:30: Setting default value for video bit rate tolerance = 1024000. Use NoDefaults to disable it.
/etc/ffserver.conf:30: Setting default value for video rate control equation = tex^qComp. Use NoDefaults to disable it.
/etc/ffserver.conf:30: Setting default value for video max rate = 8192000. Use NoDefaults to disable it.
Wed Mar 15 20:23:41 2017 FFserver started.
and when I enter
ffmpeg -i rtsp://<ip of camera>:8002 http://<ip of this>:8090/cam.ffm
nothing happens. I can also reach
http://<ip of my server>:8090/status.html
but no luck with
http://<ip of my server>:8090/cam.mpg
Any suggestions?
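For reference, my understanding is that the feed command usually looks roughly like this; the camera address and stream path below are placeholders, not my exact setup:
# Minimal sketch: pull the camera stream over RTSP (TCP tends to be more
# reliable than UDP for IP cameras) and push it into the ffserver feed;
# ffserver then applies the codec and bitrate settings from the <Stream cam.mpg> section.
ffmpeg -rtsp_transport tcp \
       -i "rtsp://192.168.0.10:8002/stream" \
       http://127.0.0.1:8090/cam.ffm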
Thank you!
-
Compile error for a simple library in Android using latest FFmpeg library with NDK
21 March 2017, by satyres
I'm trying to get FFmpeg working in Android. After successfully compiling it on Ubuntu 64-bit, and creating the Android.mk under the source/ffmpeg-3.2.4/android/arm folder as well as the Android.mk in my own Android project, I'm unable to compile a simple program and create the .so file!
By the way, I've searched over the net and on Stack Overflow for 2 days now without luck!
Here is the Android.mk:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := MyLibrary
LOCAL_SRC_FILES := MyNativeFunctions.c
LOCAL_LDLIBS := -lz -llog
LOCAL_STATIC_LIBRARIES := libavformat_static libavcodec_static libavutil_static
include $(BUILD_SHARED_LIBRARY)
$(call import-module,ffmpeg-3.2.4/android/arm)
The native functions:
#include <jni.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/avutil.h>
#include <android/log.h>
#define LOG_TAG "mylib"
#define LOGI(...) __android_log_print(ANDROID_LOG_INFO, LOG_TAG, __VA_ARGS__)
#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, __VA_ARGS__)
jint Java_com_example_home_hellondk_MyNativeFunctions_TestNdk(JNIEnv * env, jobject this, jstring filename)
{
av_register_all();
return 0;
}
When I use ndk-build, here is the error I got:
/home/home/Android/Sdk/ndk-bundle/sources/ffmpeg-3.2.4/android/arm/lib/libavformat.a: error adding symbols: File in wrong format
clang++: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [obj/local/arm64-v8a/libMyLibrary.so] Error 1
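Side note: the obj/local/arm64-v8a path in the error suggests ndk-build is also building for 64-bit ABIs, while the prebuilt FFmpeg .a files were configured for 32-bit ARM only. A minimal sketch of how one might confirm and work around that mismatch (the paths and the jni/Application.mk location are assumptions based on a default project layout):
# Print the ELF machine type of the static lib's members (expecting ARM, not AArch64)
readelf -h /home/home/Android/Sdk/ndk-bundle/sources/ffmpeg-3.2.4/android/arm/lib/libavformat.a | grep -m1 Machine
# Build only for the 32-bit ARM ABI that matches the prebuilt libraries
ndk-build APP_ABI=armeabi-v7a
# Or pin the ABI permanently (hypothetical default jni/ layout)
echo 'APP_ABI := armeabi-v7a' >> jni/Application.mk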
[EDIT]
Build_android.sh:
#!/bin/bash
NDK=/home/home/Android/Sdk/ndk-bundle
SYSROOT=$NDK/platforms/android-24/arch-arm/
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
function build_one
{
./configure \
--prefix=$PREFIX \
--disable-shared \
--enable-static \
--disable-doc \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--disable-avdevice \
--disable-doc \
--disable-symver \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--target-os=linux \
--arch=arm \
--enable-cross-compile \
--sysroot=$SYSROOT \
--extra-cflags="-Os -fpic $ADDI_CFLAGS" \
--extra-ldflags="$ADDI_LDFLAGS" \
$ADDITIONAL_CONFIGURE_FLAG
make clean
make
make install
}
CPU=arm
PREFIX=$(pwd)/android/$CPU
ADDI_CFLAGS="-marm"
build_one
[EDIT]
The problem has been solved by adding the missing libraries in
LOCAL_STATIC_LIBRARIES := libavformat_static libavcodec_static libswscale_static libavutil_static libswresample_static
and for all users of Android Studio, do not forget to add this block to your Gradle file, otherwise it won't work:
sourceSets.main {
jni.srcDirs = []
jniLibs.srcDir 'src/main/libs'
}
Best regards