
Other articles (80)
-
Adding user-specific information and other author-related behaviour changes
12 April 2011 — The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you modify certain user-related behaviours (refer to its documentation for more information).
It is also possible to add fields to authors by installing the plugins Champs Extras 2 and Interface pour Champs Extras.
-
Common problems
10 March 2010 — PHP with safe_mode enabled
One of the main sources of problems stems from the PHP configuration, in particular having safe_mode enabled.
The solution would be either to disable safe_mode or to place the script in a directory that Apache can access for the site.
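The first option, if you control the server's PHP configuration, is a one-line php.ini change; a minimal sketch, assuming a PHP version older than 5.4 (where safe_mode still exists) and a distribution-dependent php.ini location:
; php.ini — disable safe_mode, then restart Apache for it to take effect
safe_mode = Off
-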
MediaSPIP 0.1 Beta version
25 April 2011 — MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to perform other manual (...)
On other sites (12983)
-
Not able to configure FFserver for RTSP
2 November 2015, by Sujith — I am kind of new to the streaming world... so please forgive me if I ask a dumb question.
I am trying to stream my pre-recorded file over RTSP through FFserver.
My config file is:
RTSPPort 8544
<Feed feed2.ffm>
File /home/xyz/tmp/feed2.ffm
FileMaxSize 200K
ACL allow 127.0.0.1
</Feed>
<Stream test.sdp>
Feed feed2.ffm
Format rtsp
VideoFrameRate 15
VideoSize 352x240
VideoBitRate 256
VideoBufferSize 40
VideoGopSize 30
AudioBitRate 64
StartSendOnKey
</stream>After starting the server, it give the below log :
$ ./ffserver -f doc/ffserver.conf
ffserver version 0.11.1 Copyright (c) 2000-2012 the FFmpeg developers
built on Sep 17 2012 19:46:38 with gcc 4.1.2 20080704 (Red Hat 4.1.2-52)
configuration: --enable-gpl --enable-libmp3lame --enable-libtheora --enable-libvo-aacenc
--enable-libvorbis --enable-libvpx --enable-libx264 --enable-version3
libavutil 51. 54.100 / 51. 54.100
libavcodec 54. 23.100 / 54. 23.100
libavformat 54. 6.100 / 54. 6.100
libavdevice 54. 0.100 / 54. 0.100
libavfilter 2. 77.100 / 2. 77.100
libswscale 2. 1.100 / 2. 1.100
libswresample 0. 15.100 / 0. 15.100
libpostproc 52. 0.100 / 52. 0.100
Wed Sep 19 17:03:32 2012 FFserver started.
And now from my VLC client I am trying the URL: rtsp://xxx.xxx.xxx.xxx:8554/test.sdp
But what happens is that there is no response from FFserver.
I have no clue what the problem might be. Thanks in advance.
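One detail worth checking, drawn only from what is quoted above: the config binds RTSPPort 8544, while the VLC URL requests port 8554, so the client may simply be knocking on the wrong port. A minimal sketch of the two ways to make them agree:
# either bind FFserver to the port the client already uses:
RTSPPort 8554
# or point VLC at the configured port instead:
rtsp://xxx.xxx.xxx.xxx:8544/test.sdp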
-
android ffmpeg bad video output
20 August 2014, by Sujith Manjavana — I'm following this tutorial to create my first FFmpeg app. I have successfully built the shared libs and compiled the project without any errors. But when I run the app on my Nexus 5, the output is this
Here is the native code:
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#include <libavutil/pixfmt.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <wchar.h>
#include <pthread.h>
#include <android/log.h>
#include <android/bitmap.h>
#include <android/native_window.h>
#include <android/native_window_jni.h>
#define LOG_TAG "android-ffmpeg-tutorial02"
#define LOGI(...) __android_log_print(4, LOG_TAG, __VA_ARGS__);
#define LOGE(...) __android_log_print(6, LOG_TAG, __VA_ARGS__);
ANativeWindow* window;
char *videoFileName;
AVFormatContext *formatCtx = NULL;
int videoStream;
AVCodecContext *codecCtx = NULL;
AVFrame *decodedFrame = NULL;
AVFrame *frameRGBA = NULL;
jobject bitmap;
void* buffer;
struct SwsContext *sws_ctx = NULL;
int width;
int height;
int stop;
jint naInit(JNIEnv *pEnv, jobject pObj, jstring pFileName) {
AVCodec *pCodec = NULL;
int i;
AVDictionary *optionsDict = NULL;
videoFileName = (char *)(*pEnv)->GetStringUTFChars(pEnv, pFileName, NULL);
LOGI("video file name is %s", videoFileName);
// Register all formats and codecs
av_register_all();
// Open video file
if(avformat_open_input(&formatCtx, videoFileName, NULL, NULL)!=0)
return -1; // Couldn't open file
// Retrieve stream information
if(avformat_find_stream_info(formatCtx, NULL)<0)
return -1; // Couldn't find stream information
// Dump information about file onto standard error
av_dump_format(formatCtx, 0, videoFileName, 0);
// Find the first video stream
videoStream=-1;
for(i=0; i<formatCtx->nb_streams; i++) {
if(formatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
videoStream=i;
break;
}
}
if(videoStream==-1)
return -1; // Didn't find a video stream
// Get a pointer to the codec context for the video stream
codecCtx=formatCtx->streams[videoStream]->codec;
// Find the decoder for the video stream
pCodec=avcodec_find_decoder(codecCtx->codec_id);
if(pCodec==NULL) {
fprintf(stderr, "Unsupported codec!\n");
return -1; // Codec not found
}
// Open codec
if(avcodec_open2(codecCtx, pCodec, &optionsDict)<0)
return -1; // Could not open codec
// Allocate video frame
decodedFrame=avcodec_alloc_frame();
// Allocate an AVFrame structure
frameRGBA=avcodec_alloc_frame();
if(frameRGBA==NULL)
return -1;
return 0;
}
jobject createBitmap(JNIEnv *pEnv, int pWidth, int pHeight) {
int i;
//get Bitmap class and createBitmap method ID
jclass javaBitmapClass = (jclass)(*pEnv)->FindClass(pEnv, "android/graphics/Bitmap");
jmethodID mid = (*pEnv)->GetStaticMethodID(pEnv, javaBitmapClass, "createBitmap", "(IILandroid/graphics/Bitmap$Config;)Landroid/graphics/Bitmap;");
//create Bitmap.Config
//reference: https://forums.oracle.com/thread/1548728
const wchar_t* configName = L"ARGB_8888";
int len = wcslen(configName);
jstring jConfigName;
if (sizeof(wchar_t) != sizeof(jchar)) {
//wchar_t is defined as different length than jchar(2 bytes)
jchar* str = (jchar*)malloc((len+1)*sizeof(jchar));
for (i = 0; i < len; ++i) {
str[i] = (jchar)configName[i];
}
str[len] = 0;
jConfigName = (*pEnv)->NewString(pEnv, (const jchar*)str, len);
} else {
//wchar_t is defined same length as jchar(2 bytes)
jConfigName = (*pEnv)->NewString(pEnv, (const jchar*)configName, len);
}
jclass bitmapConfigClass = (*pEnv)->FindClass(pEnv, "android/graphics/Bitmap$Config");
jobject javaBitmapConfig = (*pEnv)->CallStaticObjectMethod(pEnv, bitmapConfigClass,
(*pEnv)->GetStaticMethodID(pEnv, bitmapConfigClass, "valueOf", "(Ljava/lang/String;)Landroid/graphics/Bitmap$Config;"), jConfigName);
//create the bitmap
return (*pEnv)->CallStaticObjectMethod(pEnv, javaBitmapClass, mid, pWidth, pHeight, javaBitmapConfig);
}
jintArray naGetVideoRes(JNIEnv *pEnv, jobject pObj) {
jintArray lRes;
if (NULL == codecCtx) {
return NULL;
}
lRes = (*pEnv)->NewIntArray(pEnv, 2);
if (lRes == NULL) {
LOGI("cannot allocate memory for video size");
return NULL;
}
jint lVideoRes[2];
lVideoRes[0] = codecCtx->width;
lVideoRes[1] = codecCtx->height;
(*pEnv)->SetIntArrayRegion(pEnv, lRes, 0, 2, lVideoRes);
return lRes;
}
void naSetSurface(JNIEnv *pEnv, jobject pObj, jobject pSurface) {
if (0 != pSurface) {
// get the native window reference
window = ANativeWindow_fromSurface(pEnv, pSurface);
// set format and size of window buffer
ANativeWindow_setBuffersGeometry(window, 0, 0, WINDOW_FORMAT_RGBA_8888);
} else {
// release the native window
ANativeWindow_release(window);
}
}
jint naSetup(JNIEnv *pEnv, jobject pObj, int pWidth, int pHeight) {
width = pWidth;
height = pHeight;
//create a bitmap as the buffer for frameRGBA
bitmap = createBitmap(pEnv, pWidth, pHeight);
if (AndroidBitmap_lockPixels(pEnv, bitmap, &buffer) < 0)
return -1;
//get the scaling context
sws_ctx = sws_getContext (
codecCtx->width,
codecCtx->height,
codecCtx->pix_fmt,
pWidth,
pHeight,
AV_PIX_FMT_RGBA,
SWS_BILINEAR,
NULL,
NULL,
NULL
);
// Assign appropriate parts of bitmap to image planes in pFrameRGBA
// Note that pFrameRGBA is an AVFrame, but AVFrame is a superset
// of AVPicture
avpicture_fill((AVPicture *)frameRGBA, buffer, AV_PIX_FMT_RGBA,
pWidth, pHeight);
return 0;
}
void finish(JNIEnv *pEnv) {
//unlock the bitmap
AndroidBitmap_unlockPixels(pEnv, bitmap);
av_free(buffer);
// Free the RGB image
av_free(frameRGBA);
// Free the YUV frame
av_free(decodedFrame);
// Close the codec
avcodec_close(codecCtx);
// Close the video file
avformat_close_input(&formatCtx);
}
void decodeAndRender(JNIEnv *pEnv) {
ANativeWindow_Buffer windowBuffer;
AVPacket packet;
int i=0;
int frameFinished;
int lineCnt;
while(av_read_frame(formatCtx, &packet)>=0 && !stop) {
// Is this a packet from the video stream?
if(packet.stream_index==videoStream) {
// Decode video frame
avcodec_decode_video2(codecCtx, decodedFrame, &frameFinished,
&packet);
// Did we get a video frame?
if(frameFinished) {
// Convert the image from its native format to RGBA
sws_scale
(
sws_ctx,
(uint8_t const * const *)decodedFrame->data,
decodedFrame->linesize,
0,
codecCtx->height,
frameRGBA->data,
frameRGBA->linesize
);
// lock the window buffer
if (ANativeWindow_lock(window, &windowBuffer, NULL) < 0) {
LOGE("cannot lock window");
} else {
// draw the frame on buffer
LOGI("copy buffer %d:%d:%d", width, height, width*height*4);
LOGI("window buffer: %d:%d:%d", windowBuffer.width,
windowBuffer.height, windowBuffer.stride);
memcpy(windowBuffer.bits, buffer, width * height * 4);
// unlock the window buffer and post it to display
ANativeWindow_unlockAndPost(window);
// count number of frames
++i;
}
}
}
// Free the packet that was allocated by av_read_frame
av_free_packet(&packet);
}
LOGI("total No. of frames decoded and rendered %d", i);
finish(pEnv);
}
/**
* start the video playback
*/
void naPlay(JNIEnv *pEnv, jobject pObj) {
//create a new thread for video decode and render
pthread_t decodeThread;
stop = 0;
// pthread expects a void *(*)(void *) start routine; the cast below makes
// this compile. Note that a JNIEnv* is thread-local in JNI, so pEnv must
// not simply be reused on the new thread.
pthread_create(&decodeThread, NULL, (void *(*)(void *)) decodeAndRender, NULL);
}
/**
* stop the video playback
*/
void naStop(JNIEnv *pEnv, jobject pObj) {
stop = 1;
}
jint JNI_OnLoad(JavaVM* pVm, void* reserved) {
JNIEnv* env;
if ((*pVm)->GetEnv(pVm, (void **)&env, JNI_VERSION_1_6) != JNI_OK) {
return -1;
}
JNINativeMethod nm[8];
nm[0].name = "naInit";
nm[0].signature = "(Ljava/lang/String;)I";
nm[0].fnPtr = (void*)naInit;
nm[1].name = "naSetSurface";
nm[1].signature = "(Landroid/view/Surface;)V";
nm[1].fnPtr = (void*)naSetSurface;
nm[2].name = "naGetVideoRes";
nm[2].signature = "()[I";
nm[2].fnPtr = (void*)naGetVideoRes;
nm[3].name = "naSetup";
nm[3].signature = "(II)I";
nm[3].fnPtr = (void*)naSetup;
nm[4].name = "naPlay";
nm[4].signature = "()V";
nm[4].fnPtr = (void*)naPlay;
nm[5].name = "naStop";
nm[5].signature = "()V";
nm[5].fnPtr = (void*)naStop;
jclass cls = (*env)->FindClass(env, "roman10/tutorial/android_ffmpeg_tutorial02/MainActivity");
//Register methods with env->RegisterNatives.
(*env)->RegisterNatives(env, cls, nm, 6);
return JNI_VERSION_1_6;
}
Here is the build.sh:
#!/bin/bash
NDK=$HOME/Desktop/adt/android-ndk-r9
SYSROOT=$NDK/platforms/android-9/arch-arm/
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64
function build_one
{
./configure \
--prefix=$PREFIX \
--enable-shared \
--disable-static \
--disable-doc \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--disable-avdevice \
--disable-doc \
--disable-symver \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--target-os=linux \
--arch=arm \
--enable-cross-compile \
--sysroot=$SYSROOT \
--extra-cflags="-Os -fpic $ADDI_CFLAGS" \
--extra-ldflags="$ADDI_LDFLAGS" \
$ADDITIONAL_CONFIGURE_FLAG
make clean
make
make install
}
CPU=arm
PREFIX=$(pwd)/android/$CPU
ADDI_CFLAGS="-marm"
build_one
It works on the Galaxy Tab 2. What can I do to make it work on all devices? Please help me.
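A plausible culprit for corrupted output on some devices but not others (a sketch of a hypothesis, not a confirmed fix): decodeAndRender copies the frame with a single memcpy of width * height * 4 bytes, which assumes the window buffer's stride equals the frame width. On many devices ANativeWindow allocates rows wider than the image, so the copy should step by windowBuffer.stride, which is given in pixels. A stride-aware version of that copy, using only variables already present in the code above:
// replaces the single memcpy after ANativeWindow_lock() succeeds
uint8_t *dst = (uint8_t *) windowBuffer.bits;
uint8_t *src = (uint8_t *) buffer;
int row;
for (row = 0; row < height; row++) {
    // copy one row of RGBA pixels; destination rows are stride pixels
    // (4 bytes each) apart, source rows are width pixels apart
    memcpy(dst + row * windowBuffer.stride * 4,
           src + row * width * 4,
           width * 4);
}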
-
stagefright for ffmpeg: undefined symbol '_ZTIN7android11MediaSourceE'
14 July 2014, by user3051473 — Hi all... I want to try FFmpeg hardware acceleration with libstagefright_h264 on an Android device.
It seems the source code is a bit old relative to FFmpeg 2.2.1 and Android 4.2.2.
So I tried to update the source code, but even when I write just a little code like the following:
struct FFSource : public MediaSource {
virtual status_t start(MetaData *params) { return -1; }
virtual status_t stop() { return -1; }
virtual sp<MetaData> getFormat() { return sp<MetaData>(); }
virtual status_t read(
MediaBuffer **buffer, const ReadOptions *options) { return -1; }
};
static av_cold int Stagefright_init(AVCodecContext *avctx)
{
av_log(avctx, AV_LOG_ERROR, "In function: %s\n", __FUNCTION__);
FFSource *decoder;
sp<MediaSource> mediaSource;
decoder = new FFSource();
return -1;
}
It tells me the following when I run the code:
cannot locate symbol "_ZTIN7android11MediaSourceE" referenced by "libavcodec-55.so"...
And here is the result of the nm command on libavcodec-55.so:
[cxm@ubuntu ffmpeg-2.2.1.lsc]$nm ./libavcodec/libavcodec.so | grep Source
003a31a8 t _ZN7android11MediaSource10setBuffersERKNS_6VectorIPNS_11MediaBufferEEE
003a319c t _ZN7android11MediaSource5pauseEv
U _ZN7android11MediaSourceC2Ev
U _ZN7android11MediaSourceD0Ev
U _ZN7android11MediaSourceD1Ev
U _ZN7android11MediaSourceD2Ev
003a31d0 t _ZN8FFSource4readEPPN7android11MediaBufferEPKNS0_11MediaSource11ReadOptionsE
003a31bc t _ZN8FFSource4stopEv
003a31b4 t _ZN8FFSource5startEPN7android8MetaDataE
003a31c4 t _ZN8FFSource9getFormatEv
003a3340 t _ZN8FFSourceD0Ev
003a32e0 t _ZN8FFSourceD1Ev
007b29f0 d _ZTC8FFSource0_N7android11MediaSourceE
007b29e0 d _ZTI8FFSource
U _ZTIN7android11MediaSourceE
006ac028 r _ZTS8FFSource
007b2970 d _ZTT8FFSource
U _ZTv0_n12_N7android11MediaSourceD0Ev
U _ZTv0_n12_N7android11MediaSourceD1Ev
003a3330 t _ZTv0_n12_N8FFSourceD0Ev
003a32d0 t _ZTv0_n12_N8FFSourceD1Ev
007b2980 d _ZTV8FFSource
Also, I cannot find _ZTIN7android11MediaSourceE in libstagefright.so. I think there is some mistake in how I wrote the source code rather than a mis-link of the stagefright library.
I use the Android standalone toolchain to compile the FFmpeg source, with the following configuration:
function config_lsf
{
# build flags
PREFIX=/home/cxm/work/Android/proj/dy-proj-careye/dy-proj-careye/jni/src/lib/android/
FLAGS="--prefix=$PREFIX"
FLAGS="$FLAGS --disable-static"
FLAGS="$FLAGS --disable-doc"
FLAGS="$FLAGS --disable-programs"
FLAGS="$FLAGS --disable-avdevice"
FLAGS="$FLAGS --disable-symver"
FLAGS="$FLAGS --enable-shared"
FLAGS="$FLAGS --cross-prefix=arm-linux-androideabi-"
FLAGS="$FLAGS --target-os=linux"
FLAGS="$FLAGS --arch=arm"
FLAGS="$FLAGS --enable-cross-compile"
FLAGS="$FLAGS --disable-decoder=h264"
FLAGS="$FLAGS --disable-decoder=h264_vdpau"
FLAGS="$FLAGS --enable-libstagefright-h264"
# cflags
CFLAGS="-Os -fpic -marm"
ANDROID_SOURCE=../mokesoures/
ANDROID_LIBS=../android-libs
CFLAGS="$CFLAGS -I$ANDROID_SOURCE/frameworks/av/include"
CFLAGS="$CFLAGS -I$ANDROID_SOURCE/frameworks/native/include/"
CFLAGS="$CFLAGS -I$ANDROID_SOURCE/system/core/include"
CFLAGS="$CFLAGS -I$ANDROID_SOURCE/hardware/libhardware/include"
CFLAGS="$CFLAGS -I$ANDROID_SOURCE/frameworks/native/include/media/openmax/"
# ld flags
LDFLAGS="-Wl,--fix-cortex-a8"
LDFLAGS="$LDFLAGS -lstdc++ -lgnustl_shared -lstagefright -lutils -lbinder"
LDFLAGS="$LDFLAGS -L$ANDROID_LIBS"
LDFLAGS="$LDFLAGS -Wl,-rpath-link,$ANDROID_LIBS"
./configure.android $FLAGS --extra-cflags="$CFLAGS" \
--extra-ldflags="$LDFLAGS"
sed -i 's/#define\ HAVE_POSIX_MEMALIGN\ 1/#define\ HAVE_POSIX_MEMALIGN\ 0/' ./config.h
make -j4 V=1 2>&1 | tee log
make install
}
Please help with this topic. Thanks very much.
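A suggestion for narrowing this down (not part of the original post): the nm output above shows _ZTIN7android11MediaSourceE as undefined (U) in libavcodec-55.so, so at runtime the dynamic linker must find it in some loaded library. You can check whether the libstagefright.so you link against actually exports it as a dynamic symbol, assuming the ANDROID_LIBS path from the script above:
# -D lists dynamic (exported) symbols only
arm-linux-androideabi-nm -D ../android-libs/libstagefright.so | grep _ZTIN7android11MediaSourceE
# no output means the typeinfo is not exported, so -lstagefright cannot resolve it at load time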