
Media (1)
-
The Pirate Bay from Belgium
1 April 2013, by
Updated: April 2013
Language: French
Type: Image
Other articles (26)
-
MediaSPIP Core: Configuration
9 November 2010, by
MediaSPIP Core provides three configuration pages by default (these pages rely on the CFG configuration plugin to work): a page for the general configuration of the template set; a page for configuring the site's home page; a page for configuring the sections.
It also provides an additional page, shown only when certain plugins are enabled, for controlling their display and specific features (...)
-
Personalise by adding your logo, banner or background image
5 September 2013, by
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Media-specific libraries and software
10 December 2010, by
For correct and optimal operation, several things need to be taken into consideration.
After installing apache2, mysql and php5, it is important to install the other necessary software, whose installation is described in the related links: a set of multimedia libraries (x264, libtheora, libvpx) used for encoding and decoding video and audio, in order to support as many file types as possible (cf. this tutorial); FFMpeg with the maximum number of decoders and (...)
On other sites (7373)
-
Decoding H.264 individual nal units
1 June 2015, by madprogrammer2015
I am currently sending individual NAL units across a network. These NAL units are generated by x264. Is it possible to feed these NAL units individually into avcodec_decode_video2?
Or do I have to concatenate the NAL units until they represent a complete frame? If that's the case, how is that done?
I have also read that I could receive the SPS and PPS packets, then wait for at least one more packet, and attempt to decode. Is this correct?
Any advice would be greatly appreciated.
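One common approach, sketched below as a minimal, unverified example: let ffmpeg's own H.264 parser reassemble the incoming Annex-B NAL units into complete frames before decoding. The decode_stream function name and the omission of error handling are illustrative assumptions, not the library's prescribed usage.

#include <libavcodec/avcodec.h>

/* Feed arbitrary chunks of an Annex-B H.264 byte stream (individual NAL
 * units included); av_parser_parse2() only emits data once a complete
 * frame (access unit) has been assembled, which is then safe to pass to
 * avcodec_decode_video2(). */
static void decode_stream(AVCodecContext *ctx, const uint8_t *buf, int buf_size)
{
    AVCodecParserContext *parser = av_parser_init(AV_CODEC_ID_H264);
    AVFrame *frame = av_frame_alloc();

    while (buf_size > 0) {
        uint8_t *frame_data = NULL;
        int frame_size = 0;
        int used = av_parser_parse2(parser, ctx, &frame_data, &frame_size,
                                    buf, buf_size,
                                    AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
        buf += used;
        buf_size -= used;

        if (frame_size > 0) {          /* a full access unit is ready */
            AVPacket pkt;
            int got_frame = 0;
            av_init_packet(&pkt);
            pkt.data = frame_data;
            pkt.size = frame_size;
            /* got_frame stays 0 until SPS/PPS have been seen and a
             * decodable picture has arrived. */
            avcodec_decode_video2(ctx, frame, &got_frame, &pkt);
        }
    }
    av_frame_free(&frame);
    av_parser_close(parser);
}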
-
ffplay subscription to rtmp stream fails on first try
2 January 2018, by ehouse
I have an Electron application that does some video streaming through ffmpeg. It works fine on Windows, but I am having a small problem with it on Mac. I stream the desktop with ffmpeg and try to subscribe to the rtmp stream. I know the rtmp stream is live because I can successfully grab a frame from it.
The problem is that when I subscribe with ffplay, it does not work; the output looks like this:
2018/01/01 20:57:25:618 [web] nan : 0.000 fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0
When I subscribe again (while the previous process is still ongoing), it works:
2018/01/01 20:57:25:828 [web] 340.93 M-V: 0.001 fd= 0 aq= 0KB vq= 131KB sq= 0B f=0/0
The only difference in the command-line output is this:
Input #0, live_flv, from 'rtmp:{stream-url-here}':
2018/01/01 20:57:15:427 [web] Metadata:
2018/01/01 20:57:15:427 [web] Server : NGINX RTMP (github.com/sergey-dryabzhinsky/nginx-rtmp-module)
2018/01/01 20:57:15:427 [web] displayWidth : 1024
2018/01/01 20:57:15:428 [web] displayHeight : 640
2018/01/01 20:57:15:429 [web] fps : 20
2018/01/01 20:57:15:429 [web] profile :
2018/01/01 20:57:15:430 [web] level :
2018/01/01 20:57:15:430 [web] Duration: 00:00:00.00, start: 316.500000, bitrate: N/A
2018/01/01 20:57:15:430 [web] Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 1024x640, 20 fps, 20 tbr, 1k tbn, 40 tbc
I am trying to figure out why it only works the second time I subscribe. The block above is the only difference, which I assume signals that the stream info was successfully received, but I still do not see why that happens only on the second attempt.
Thanks!!
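A hedged guess rather than a confirmed fix: ffmpeg's native RTMP client accepts a rtmp_live option, and enlarging the probe window sometimes helps when the first connection starts before the in-band metadata and parameter sets arrive. Something along these lines could be worth trying (the URL placeholder is kept from the post):

# Treat the stream as live and probe more data before giving up.
ffplay -rtmp_live live -probesize 5M -analyzeduration 5M "rtmp:{stream-url-here}"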
-
Building a shared library from static libraries for ffmpeg 2.5.2
7 January 2015, by abijinx
I am currently building a shared library for ffmpeg as myffmpeg.so using ffmpeg 0.8.6. I achieve this by combining static libs of the individual ffmpeg modules with a few additional static libraries. The main makefile is as follows:
LOCAL_PATH := $(call my-dir)
include $(LOCAL_PATH)/Android_Files.mk
include $(CLEAR_VARS)
LOCAL_MODULE:= libavcodec
LOCAL_SRC_FILES:= $(MY_AVCODEC_FILES)
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE:= libavfilter
LOCAL_SRC_FILES:= $(MY_AVFILTER_FILES)
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE:= libavformat
LOCAL_SRC_FILES:= $(MY_AVFORMAT_FILES)
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE:= libavutil
LOCAL_SRC_FILES:= $(MY_AVUTIL_FILES)
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE:= libswscale
LOCAL_SRC_FILES:= $(MY_SWSCALE_FILES)
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE:= ffmpeg_myc
LOCAL_SRC_FILES:= ffmpeg.c cmdutils.c
include $(LOCAL_PATH)/Android_Common.mk
include $(BUILD_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := myffmpeg
LOCAL_WHOLE_STATIC_LIBRARIES := libavcodec libavfilter libavformat libavutil libswscale ffmpeg_myc
LOCAL_LDFLAGS += -lz -lm -llog
ifeq ($(FF_ENABLE_AMR),yes)
LOCAL_WHOLE_STATIC_LIBRARIES += opencore-amrnb opencore-amrwb
endif
ifeq ($(FF_ENABLE_AAC),yes)
LOCAL_WHOLE_STATIC_LIBRARIES += vo-aacenc
endif
include $(BUILD_SHARED_LIBRARY)
The above makefile uses only a few selected source files (the MY_*_FILES variables) to build each static lib, then combines all of those with the additional libs to create the shared library myffmpeg.so.
Now I am trying to get a similar output with ffmpeg 2.5.2 for the 64-bit ARM architecture. This time I have built the latest ffmpeg using the following configure script, generating the static libraries of the different ffmpeg modules (libavcodec.a, libavfilter.a, etc.):
#!/bin/bash
ABI=aarch64-linux-android
NDK=
SYSROOT=$NDK/platforms/android-21/arch-arm64/
TOOLCHAIN=$NDK/toolchains/aarch64-linux-android-4.9/prebuilt/linux-x86_64
CPU=arm64
PREFIX=$(pwd)/android/$CPU
./configure \
--prefix=$PREFIX \
--cc=$TOOLCHAIN/bin/$ABI-gcc \
--enable-static \
--disable-doc \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-network \
--disable-ffserver \
--disable-devices \
--disable-avdevice \
--disable-swscale-alpha \
--disable-doc \
--disable-symver \
--disable-neon \
--enable-optimizations \
--cross-prefix=$TOOLCHAIN/bin/aarch64-linux-android- \
--target-os=linux \
--arch=arm64 \
--enable-cross-compile \
--sysroot=$SYSROOT \
--enable-libopencore-amrnb \
--enable-libopencore-amrwb \
--enable-libvo-aacenc \
$ADDITIONAL_CONFIGURE_FLAG
make clean
make
make install
Now I am trying to combine the generated static libraries, i.e.
-
libavcodec.a
libavfilter.a
libavformat.a
libavutil.a
libswresample.a
libswscale.a
with -
ffmpeg_myc opencore-amrnb opencore-amrwb
and generate a myffmpeg.so shared library by running ndk-build.
I tried pasting the generated ffmpeg static libraries from the ffmpeg/android folder into obj/local/arm64-v8a (the path where all the static libs were previously created in the 0.8.6 build, according to the makefile). I also changed the makefile, expecting it to use the already-generated ffmpeg libraries. But I found that the size of the generated shared library was far too small and that it had not included the generated ffmpeg libs.
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := myffmpeg
LOCAL_WHOLE_STATIC_LIBRARIES := libavcodec libavfilter libavformat libavutil libswscale ffmpeg_myc
LOCAL_LDFLAGS += -lz -lm -llog
ifeq ($(FF_ENABLE_AMR),yes)
LOCAL_WHOLE_STATIC_LIBRARIES += opencore-amrnb opencore-amrwb
endif
ifeq ($(FF_ENABLE_AAC),yes)
LOCAL_WHOLE_STATIC_LIBRARIES += vo-aacenc
endif
include $(BUILD_SHARED_LIBRARY)
I would like to know whether I am going the right way and only minor changes are needed, or whether I should start with an entirely different approach. Any suggestions would be helpful.
Thanks in advance
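One plausible reading of the symptom, sketched below under stated assumptions: ndk-build silently ignores names in LOCAL_WHOLE_STATIC_LIBRARIES that were never declared as modules, so archives merely copied into obj/local are skipped. Declaring each prebuilt archive as a PREBUILT_STATIC_LIBRARY module should make it visible; the paths are an assumption based on the PREFIX in the configure script above.

include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
# Path assumed from PREFIX=$(pwd)/android/$CPU with CPU=arm64.
LOCAL_SRC_FILES := ffmpeg/android/arm64/lib/libavcodec.a
include $(PREBUILT_STATIC_LIBRARY)

# Repeat for libavformat, libavfilter, libavutil, libswresample and
# libswscale, then list those module names in LOCAL_WHOLE_STATIC_LIBRARIES
# exactly as in the makefile above.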
-