
Other articles (71)
-
The user profile
12 April 2011, by
Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is automatically created when MediaSPIP is initialised; it is visible only when the visitor is logged in to the site.
The user can edit their profile from their author page: a "Modifier votre profil" (edit your profile) link in the navigation is (...)
-
Configuring language support
15 November 2010, by
Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section where support for new languages can be enabled.
Each newly added language can still be disabled as long as no object has been created in that language. In that case, it becomes greyed out in the configuration and (...)
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The player has been created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (4597)
-
How to understand the given ffplay C code snippet?
20 July 2015, by Jerikc XIONG
The following code snippet is from ffplay:
static int decoder_decode_frame(Decoder *d, AVFrame *frame, AVSubtitle *sub) {
    int got_frame = 0;

    do {
        int ret = -1;

        if (d->queue->abort_request)
            return -1;

        if (!d->packet_pending || d->queue->serial != d->pkt_serial) {
            AVPacket pkt;
            do {
                if (d->queue->nb_packets == 0)
                    SDL_CondSignal(d->empty_queue_cond);
                if (packet_queue_get(d->queue, &pkt, 1, &d->pkt_serial) < 0)
                    return -1;
                if (pkt.data == flush_pkt.data) {
                    avcodec_flush_buffers(d->avctx);
                    d->finished = 0;
                    d->next_pts = d->start_pts;
                    d->next_pts_tb = d->start_pts_tb;
                }
            } while (pkt.data == flush_pkt.data || d->queue->serial != d->pkt_serial);
            av_free_packet(&d->pkt);
            d->pkt_temp = d->pkt = pkt;
            d->packet_pending = 1;
        }

        switch (d->avctx->codec_type) {
            case AVMEDIA_TYPE_VIDEO:
                ret = avcodec_decode_video2(d->avctx, frame, &got_frame, &d->pkt_temp);
                if (got_frame) {
                    if (decoder_reorder_pts == -1) {
                        frame->pts = av_frame_get_best_effort_timestamp(frame);
                    } else if (decoder_reorder_pts) {
                        frame->pts = frame->pkt_pts;
                    } else {
                        frame->pts = frame->pkt_dts;
                    }
                }
                break;
            case AVMEDIA_TYPE_AUDIO:
                ret = avcodec_decode_audio4(d->avctx, frame, &got_frame, &d->pkt_temp);
                if (got_frame) {
                    AVRational tb = (AVRational){1, frame->sample_rate};
                    if (frame->pts != AV_NOPTS_VALUE)
                        frame->pts = av_rescale_q(frame->pts, d->avctx->time_base, tb);
                    else if (frame->pkt_pts != AV_NOPTS_VALUE)
                        frame->pts = av_rescale_q(frame->pkt_pts, av_codec_get_pkt_timebase(d->avctx), tb);
                    else if (d->next_pts != AV_NOPTS_VALUE)
                        frame->pts = av_rescale_q(d->next_pts, d->next_pts_tb, tb);
                    if (frame->pts != AV_NOPTS_VALUE) {
                        d->next_pts = frame->pts + frame->nb_samples;
                        d->next_pts_tb = tb;
                    }
                }
                break;
            case AVMEDIA_TYPE_SUBTITLE:
                ret = avcodec_decode_subtitle2(d->avctx, sub, &got_frame, &d->pkt_temp);
                break;
        }

        if (ret < 0) {
            d->packet_pending = 0;
        } else {
            d->pkt_temp.dts =
            d->pkt_temp.pts = AV_NOPTS_VALUE;
            if (d->pkt_temp.data) {
                if (d->avctx->codec_type != AVMEDIA_TYPE_AUDIO)
                    ret = d->pkt_temp.size;
                d->pkt_temp.data += ret;
                d->pkt_temp.size -= ret;
                if (d->pkt_temp.size <= 0)
                    d->packet_pending = 0;
            } else {
                if (!got_frame) {
                    d->packet_pending = 0;
                    d->finished = d->pkt_serial; // FLAG
                }
            }
        }
    } while (!got_frame && !d->finished);

    return got_frame;
}

It is difficult for me to understand the following code:
d->finished = d->pkt_serial; // FLAG
Can anyone help me?
Thanks.
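For context, here is a minimal, self-contained C sketch (not taken from the ffplay sources; the struct and function names are only illustrative) of the idea behind storing a serial in d->finished rather than a plain boolean: every seek flushes the packet queue and bumps its serial, so an "end of stream" flag is only meaningful if it was recorded for the current serial.

/* Illustrative sketch only: not the real ffplay structures. */
#include <stdio.h>

typedef struct PacketQueue {
    int serial;            /* bumped every time the queue is flushed (e.g. on seek) */
} PacketQueue;

typedef struct Decoder {
    PacketQueue *queue;
    int pkt_serial;        /* serial of the packets currently being decoded */
    int finished;          /* 0, or the serial for which the decoder ran dry */
} Decoder;

/* What the line marked "FLAG" does: remember *which* packet generation
 * has been fully decoded, instead of a bare true/false. */
static void decoder_mark_finished(Decoder *d) {
    d->finished = d->pkt_serial;
}

/* How a caller can interpret the flag: the stream is only really at EOF
 * if the recorded serial matches the queue's current serial. */
static int decoder_is_finished(const Decoder *d) {
    return d->finished == d->queue->serial;
}

int main(void) {
    PacketQueue q = { 1 };
    Decoder d = { &q, 1, 0 };

    decoder_mark_finished(&d);
    printf("EOF for current serial? %d\n", decoder_is_finished(&d));  /* 1 */

    q.serial++;            /* a seek flushes the queue and bumps the serial */
    d.pkt_serial = q.serial;
    d.finished   = 0;      /* reset, as the flush-packet path in the snippet does */
    printf("EOF after seek? %d\n", decoder_is_finished(&d));          /* 0 */
    return 0;
}

With a scheme like this, a stale finished flag left over from before a seek is ignored automatically; as far as I understand the snippet, that is why the FLAG line records d->pkt_serial instead of simply setting a boolean.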
-
Android: Native Code, libx264
11 September 2011, by NadavRub
I am working with Android OS; my aim is to do proprietary image processing and send the result out of the device as an H.264 video stream. For this I will need a compression tool, and I have considered the open-source libx264 (which compiled quite easily in the Android environment).
Having said that, will libx264 know how to identify the underlying CPU/instruction set, or would I need to maintain a specialized binary for each CPU type (e.g. Nvidia/ARM)?
The last thing I want is to maintain a specific version of the code/binary for each CPU type...
Any help would be appreciated.
Nadav
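One way to check this on a device is a small hypothetical probe (not part of the original post, and assuming only the public x264.h API): x264_param_default() fills param.cpu with the CPU flag set the library detected or was built for, so running it on the target hardware shows whether, for example, the NEON path is enabled.

/* Hypothetical check, not from the original post: print the CPU flags
 * that this libx264 build reports, to verify on the target device which
 * instruction-set paths (e.g. NEON) it will use.
 * Build against your Android-compiled libx264, e.g.: cc probe.c -lx264 */
#include <stdio.h>
#include <x264.h>

int main(void) {
    x264_param_t param;
    x264_param_default(&param);   /* fills param.cpu via x264's CPU detection */

    printf("x264 cpu flags: 0x%08x\n", (unsigned)param.cpu);
#ifdef X264_CPU_NEON
    printf("NEON path enabled: %s\n",
           (param.cpu & X264_CPU_NEON) ? "yes" : "no");
#endif
    return 0;
}

In practice, one binary per ABI (e.g. armeabi-v7a) is the usual arrangement on Android, rather than one per chip vendor.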
-
How to configure FFMPEG to generate position-independent code on iOS
8 July 2013, by madadi
I am developing an app on iOS which uses FFMPEG for video decoding. I want to get position-independent code from FFMPEG working in my app. The --enable-pic option in the FFMPEG configuration generates position-independent code only if I also set --disable-asm. I concluded this when I ran otool on my app (otool -hv 'appname'). Below are the details.

If I set --enable-pic alone, this is the output from otool:

magic    cputype cpusubtype caps filetype ncmds sizeofcmds flags
MH_MAGIC ARM     V7         0x00 EXECUTE  29    3724       NOUNDEFS DYLDLINK TWOLEVEL WEAK_DEFINES BINDS_TO_WEAK

If I set --enable-pic and --disable-asm, this is the output from otool:

magic    cputype cpusubtype caps filetype ncmds sizeofcmds flags
MH_MAGIC ARM     V7         0x00 EXECUTE  51    5520       NOUNDEFS DYLDLINK TWOLEVEL WEAK_DEFINES BINDS_TO_WEAK PIE

Below are my configuration settings for FFMPEG (build-ffmpeg-ios.sh):
if [ `uname` = "Darwin" ]; then
    SEDCMD="sed -i '' "
else
    SEDCMD="sed -i "
fi
make clean
./configure \
--prefix=$PREFIX \
--enable-version3 \
--enable-static \
--disable-shared \
--disable-doc \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--disable-avdevice \
--disable-avfilter \
--disable-postproc \
--enable-small \
--cross-prefix="$DEVROOT/usr/bin/" \
--enable-cross-compile \
--target-os=darwin \
--arch=$ARCH \
--cpu=$CPU \
--cc="$CC" \
--as="$AS" \
--extra-cflags="$CFLAGS" \
--extra-ldflags="$CFLAGS" \
--disable-symver \
--disable-debug \
--disable-indevs \
--disable-encoders \
--disable-bsfs \
--enable-pic \
--disable-filters \
$CONFIGURE_FLAGS || exit 1;
$SEDCMD 's/HAVE_INLINE_ASM 1/HAVE_INLINE_ASM 0/g' config.h
make install || exit 1;

And below is the makefile I use (iOS.mk):
FFMPEG_LIBS := libavcodec.a libavformat.a libavutil.a libswscale.a
IOS_ARMV7_FFMPEG_LIB := $(addprefix build/ios/armv7/lib/, $(FFMPEG_LIBS))
IOS_UNI_FFMPEG_LIB := $(addprefix build/ios/universal/lib/, $(FFMPEG_LIBS))
IPHONE_SDK_VERSION := "5.0"
IPHONE_DEPLOY_SDK := "3.2"
ALL_IOS_LIBS := $(IOS_UNI_FFMPEG_LIB)
build/ios/universal/lib/%.a: build/ios/armv7/lib/%.a
	mkdir -p build/ios/universal/lib
	lipo -create -output $@ $^

$(IOS_ARMV7_FFMPEG_LIB):
	export PREFIX="`$(ABSPATH) build/ios/armv7`" && \
	export ARCH="arm" && \
	export CPU="cortex-a8" && \
	export IOS_BASE_SDK="$(IPHONE_SDK_VERSION)" && \
	export IOS_DEPLOY_TGT="$(IPHONE_DEPLOY_SDK)" && \
	export CONFIGURE_FLAGS="--enable-neon" && \
	export PATH="`$(ABSPATH) venders`:$$PATH" && \
	cd venders/ffmpeg && bash ../iphone-compile.sh arm7 ../build-ffmpeg-ios.sh

Can someone please help me get FFMPEG up and running with both 'pic' and assembly optimization enabled? Without assembly optimization, FFMPEG would not be helpful to me, as video decoding is extremely slow.