
Media (33)
-
Stereo master soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
#7 Ambience
16 October 2011, by
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#3 The Safest Place
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#4 Emo Creates
15 October 2011, by
Updated: February 2013
Language: English
Type: Audio
Other articles (42)
-
Requesting the creation of a channel
12 March 2010, by
Depending on how the platform is configured, the user may have two different ways of requesting the creation of a channel. The first is at the time of registration; the second, after registration, by filling out a request form.
Both methods ask for the same information and work in much the same way: the prospective user must fill in a series of form fields that first of all give the administrators information about (...) -
Diogene: creating specific edit-form masks for content
26 October 2010, by
Diogene is one of the SPIP plugins enabled by default (as an extension) when MediaSPIP is initialized.
What this plugin is for
Creating form masks
The Diogène plugin lets you create form masks specific to each sector for the three SPIP object types: articles, sections (rubriques), and sites.
It thus makes it possible to define, for a given sector, one form mask per object, adding or removing fields so as to make the form (...) -
Specific configuration for PHP5
4 February 2011, by
PHP5 is required; you can install it by following this specific tutorial.
It is recommended to disable safe_mode at first; however, if it is correctly configured and the necessary binaries are accessible, MediaSPIP should work correctly with safe_mode enabled.
Specific modules
Certain specific PHP modules must be installed, via your distribution's package manager or manually: php5-mysql for connectivity with the (...)
On other sites (6797)
-
How to Choose the Optimal Multi-Touch Attribution Model for Your Organisation
13 March 2023, by Erin — Analytics Tips -
ffmpeg decoding through the C API leads to artefacts when the input resolution is 1200x1600. Am I doing something wrong?
26 February 2023, by Antonio
Using the C API and FFmpeg 5.1 I have been able to encode h264 videos with libx264 on Android.
Now I want to replay them on Linux inside my C++ application. These videos play correctly in a browser and in other players I tried, such as mplayer or ffplay from ffmpeg. I can also unroll the frames with
ffmpeg -i recording.mp4 -start_number 0 -qscale:v 5 %06d.jpg
and the images look alright.

However, in my C++ application, every now and then, but in a very repeatable way, I get artifacts (like the bright pixels showing up above the monitor). They do not accumulate, even though they are not related to keyframes. So whatever error is going on, it does not seem to have an impact on subsequent frames. I use OpenCV to visualize the output, and I am pretty sure the problem is not the conversion to BGR, because the artifact is already there if I simply show the Y channel (luminance, grayscale).


These artifacts show up in videos that I have recorded at a 1200x1600 resolution. It should be noted that 1200 is not divisible by 32, so ffmpeg does add some padding, but I am dealing with it and it is not an issue. Videos recorded at 1920x1440 are replayed with no artifacts. Two sample videos can be found here for download.
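To illustrate what I mean by dealing with the padding, here is a minimal sketch (copy_luma_plane is just an illustrative helper, not the exact code from my application) of copying only the visible part of the luma plane row by row, since frame->linesize[0] can be larger than frame->width:

#include <cstdint>
#include <cstring>
#include <vector>
extern "C" {
#include <libavutil/frame.h>
}

// Illustrative helper: copy only the visible frame->width bytes of each luma
// row, skipping the alignment padding that makes frame->linesize[0] larger
// than frame->width for 1200-pixel-wide video.
std::vector<uint8_t> copy_luma_plane(const AVFrame* frame) {
    std::vector<uint8_t> plane(static_cast<size_t>(frame->width) * frame->height);
    for (int y = 0; y < frame->height; ++y) {
        std::memcpy(plane.data() + static_cast<size_t>(y) * frame->width,
                    frame->data[0] + static_cast<size_t>(y) * frame->linesize[0],
                    frame->width);
    }
    return plane;
}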


Here follows the code I am using; at the bottom you can see a picture of my decoded image with the artifact and the same frame as unrolled by the ffmpeg command line. It should be noted that I am working with a custom-built version of ffmpeg from Conan packages, while the unrolling is done with the command-line ffmpeg that comes with Ubuntu.


extern "C" {
#include <libavcodec></libavcodec>avcodec.h>
#include <libavformat></libavformat>avformat.h>
}

#include 
#include 
#include 

#include <iostream>
int main(int argc, char** argv) {

 int ret;

 auto pkt = av_packet_alloc();
 if (!pkt) {
 std::cerr << "Failed av_packet_alloc()" << std::endl;
 exit(1);
 }

 AVFormatContext* av_format = avformat_alloc_context();
 ret = avformat_open_input(&av_format, FILE_NAME, nullptr, nullptr);
 if (ret < 0) {
 std::cerr << "Failed avformat_open_input, Error: " << ret << std::endl;
 ///Error codes https://stackoverflow.com/questions/12780931/ffmpeg-exit-status-1094995529
 exit(1);
 }
 av_dump_format(av_format, 0, FILE_NAME, 0);
 auto video_st_number = av_find_best_stream(av_format, AVMEDIA_TYPE_VIDEO, -1, -1, nullptr, 0);
 if (video_st_number < 0) {
 std::cerr << "av_find_best_stream couldn't find video stream" << std::endl;
 exit(1);
 }
 auto video_st = av_format->streams[video_st_number];
 auto codec_id = video_st->codecpar->codec_id;
 std::cout << "Duration " << video_st->duration << std::endl;
 std::cout << "n_frames " << video_st->nb_frames << std::endl;

 auto frame = av_frame_alloc();
 if (!frame) {
 fprintf(stderr, "Could not allocate video frame\n");
 exit(1);
 }

 auto codec = avcodec_find_decoder(codec_id);
 if (!codec) {
 fprintf(stderr, "Codec not found\n");
 exit(1);
 }

 auto c = avcodec_alloc_context3(codec);
 if (!c) {
 fprintf(stderr, "Could not allocate video codec context\n");
 exit(1);
 }


 if ((ret = avcodec_parameters_to_context(c, video_st->codecpar))) {
 fprintf(stderr, "Failed avcodec_parameters_to_context\n");
 exit(1);
 }

 c->pix_fmt = AV_PIX_FMT_YUV420P;///Not really necessary
 c->thread_count = 1;///No impact

 /* open it */
 if (avcodec_open2(c, codec, NULL) < 0) {
 fprintf(stderr, "Could not open codec\n ");
 exit(1);
 }

 std::size_t counter = 0;
 std::size_t n_keyframes = 0;

 while (ret >= 0) {
 ret = av_read_frame(av_format, pkt);
 if (pkt->size == 0) {
 std::cout << "Skipping packet of size zero" << std::endl;
 av_packet_unref(pkt);
 continue;
 }
 while (avcodec_send_packet(c, pkt) != 0) {
 if (avcodec_receive_frame(c, frame) != 0) {
 std::cerr << "Error receiving frame" << std::endl;
 exit(1);
 } else {
 n_keyframes += frame->key_frame;
 std::cout << "Decoded " << ++counter << " frames. Frame No. " << frame->pts / pkt->duration << " "
 << frame->decode_error_flags << " " << frame->key_frame << " " << n_keyframes << " "
 << frame->pkt_dts << std::endl;
 }
 display(frame);
 }
 av_packet_unref(pkt);
 }

 avcodec_send_packet(c, nullptr);
 std::cout << "Flushing decoder" << std::endl;

 while (avcodec_receive_frame(c, frame) == 0) {
 n_keyframes += frame->key_frame;
 std::cout << "Decoded " << ++counter << " frames. Frame No. " << frame->pts << " " << frame->decode_error_flags
 << " " << frame->key_frame << " " << n_keyframes << " " << frame->pkt_dts << std::endl;

 display(frame);
 }

 avcodec_free_context(&c);
 avformat_free_context(av_format);
 av_frame_free(&frame);
 av_packet_free(&pkt);

 return 0;
}
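For reference, this is a minimal sketch of the send/receive pattern as documented for avcodec_send_packet() / avcodec_receive_frame(), which drains all pending frames after each packet; decode_packet is just an illustrative helper name, and display() is the same function shown further down:

// Sketch of the documented decode pattern: send one packet, then drain every
// frame the decoder has ready before feeding the next packet. Passing a null
// packet flushes the decoder at end of stream.
static void decode_packet(AVCodecContext* c, const AVPacket* pkt, AVFrame* frame) {
    int ret = avcodec_send_packet(c, pkt);
    if (ret < 0) {
        std::cerr << "Error sending packet to decoder" << std::endl;
        return;
    }
    while (true) {
        ret = avcodec_receive_frame(c, frame);
        if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
            return;  // decoder needs more input, or is fully flushed
        if (ret < 0) {
            std::cerr << "Error during decoding" << std::endl;
            return;
        }
        display(frame);
    }
}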




For completeness, this is the display function, using OpenCV:

void display(const AVFrame* frame) {
 static std::vector<uint8_t> yuv_buffer;
 yuv_buffer.resize(frame->linesize[0] * 3 / 2 * frame->width);
 cv::Mat mYUV(frame->height * 3 / 2, frame->width, CV_8UC1, yuv_buffer.data(), frame->linesize[0]);
 memcpy(mYUV.ptr(), frame->data[0], frame->linesize[0] * frame->height);
 //cv::imshow("grayscale", mYUV.rowRange(0, frame->height));
 //cv::imshow("u", cv::Mat(frame->height / 2, frame->width / 2, CV_8UC1, frame->data[1], frame->linesize[1]));
 //cv::imshow("v", cv::Mat(frame->height / 2, frame->width / 2, CV_8UC1, frame->data[2], frame->linesize[2]));

 int dest_row = frame->height;
 for (int j = 0; j < frame->height / 2; j++) {
 memcpy(mYUV.ptr(dest_row), frame->data[1] + frame->linesize[1] * j, frame->width);
 j++;
 memcpy(mYUV.ptr(dest_row) + frame->width / 2, frame->data[1] + frame->linesize[1] * j, frame->width);
 dest_row++;
 }
 for (int j = 0; j < frame->height / 2; j++) {
 memcpy(mYUV.ptr(dest_row), frame->data[2] + frame->linesize[2] * j, frame->width);
 j++;
 memcpy(mYUV.ptr(dest_row) + frame->width / 2, frame->data[2] + frame->linesize[2] * j, frame->width);
 dest_row++;
 }
 cv::Mat mRGB(frame->height, frame->width, CV_8UC3);
 cvtColor(mYUV, mRGB, cv::COLOR_YUV2BGR_I420, 3);
 cv::imshow("Video", mRGB);
 cv::waitKey(0);
}
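For comparison, here is a rough sketch of doing the same conversion with libswscale instead of the manual plane copies, so that the source linesize padding is handled by sws_scale; display_sws is just an illustrative name, not something my application currently uses:

extern "C" {
#include <libswscale/swscale.h>
}

// Illustrative variant of display(): let sws_scale read the planes using
// frame->linesize, so any row padding is skipped during the YUV -> BGR
// conversion, and show the result with OpenCV as before.
void display_sws(const AVFrame* frame) {
    static SwsContext* sws = nullptr;
    sws = sws_getCachedContext(sws, frame->width, frame->height,
                               static_cast<AVPixelFormat>(frame->format),
                               frame->width, frame->height, AV_PIX_FMT_BGR24,
                               SWS_BILINEAR, nullptr, nullptr, nullptr);
    cv::Mat bgr(frame->height, frame->width, CV_8UC3);
    uint8_t* dst_data[4] = {bgr.data, nullptr, nullptr, nullptr};
    int dst_linesize[4] = {static_cast<int>(bgr.step[0]), 0, 0, 0};
    sws_scale(sws, frame->data, frame->linesize, 0, frame->height,
              dst_data, dst_linesize);
    cv::imshow("Video", bgr);
    cv::waitKey(0);
}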




Note: the AVFrame -> cv::Mat converter is now available in a corrected version as an answer here.

-
Can I configure FFmpeg-2.4.3 to use on Android, on Windows 64-bit?
13 November 2014, by Ngo Ky
I want to build ffmpeg-2.4.3 (the latest C library) to use on Android for managing video.
My OS is Windows 7 Ultimate x64.
Cygwin64 is installed in C:/cywin64/ and has make installed.
The NDK is in D:\solfware\Programming\android-ndk-r10c. I followed the tutorial from http://www.roman10.net/how-to-build-ffmpeg-for-android/comment-page-1/
My project structure is as shown. I created build_android.sh like roman10's, with the following code:
# ARMv7+Neon (Cortex-A8)
# Customizing:
# 1. Feel free to change ./configure parameters for more features
# 2. To adapt other ARM variants
# set $CPU and $OPTIMIZE_CFLAGS
# call build_one
######################################################
NDK=D:\solfware\Programming\android-ndk-r10c
PLATFORM=$NDK/platforms/android-8/arch-arm/
PREBUILT=$NDK/prebuilt/windows-x86_64
function build_one
{
./configure --target-os=linux \
--prefix=$PREFIX \
--enable-cross-compile \
--extra-libs="-lgcc" \
--arch=arm \
--cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
--cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
--nm=$PREBUILT/bin/arm-linux-androideabi-nm \
--sysroot=$PLATFORM \
--extra-cflags=" -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 $OPTIMIZE_CFLAGS " \
--disable-shared \
--enable-static \
--extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog" \
--disable-everything \
--enable-demuxer=mov \
--enable-demuxer=h264 \
--disable-ffplay \
--enable-protocol=file \
--enable-avformat \
--enable-avcodec \
--enable-decoder=rawvideo \
--enable-decoder=mjpeg \
--enable-decoder=h263 \
--enable-decoder=mpeg4 \
--enable-decoder=h264 \
--enable-parser=h264 \
--disable-network \
--enable-zlib \
--disable-avfilter \
--disable-avdevice \
$ADDITIONAL_CONFIGURE_FLAG
make clean
make -j4 install
$PREBUILT/bin/arm-linux-androideabi-ar d libavcodec/libavcodec.a inverse.o
$PREBUILT/bin/arm-linux-androideabi-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -soname libffmpeg.so -shared -nostdlib -z,noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg.so libavcodec/libavcodec.a libavformat/libavformat.a libavutil/libavutil.a libswscale/libswscale.a -lc -lm -lz -ldl -llog --warn-once --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/arm-linux-androideabi/4.4.3/libgcc.a
}
#arm v6
#CPU=armv6
#OPTIMIZE_CFLAGS="-marm -march=$CPU"
#PREFIX=./android/$CPU
#ADDITIONAL_CONFIGURE_FLAG=
#build_one
#arm v7vfpv3
CPU=armv7-a
OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=$CPU "
PREFIX=./android/$CPU
ADDITIONAL_CONFIGURE_FLAG=
build_one
#arm v7vfp
#CPU=armv7-a
#OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU "
#PREFIX=./android/$CPU-vfp
#ADDITIONAL_CONFIGURE_FLAG=
#build_one
#arm v7n
#CPU=armv7-a
#OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=neon -marm -march=$CPU -mtune=cortex-a8"
#PREFIX=./android/$CPU
#ADDITIONAL_CONFIGURE_FLAG=--enable-neon
#build_one
#arm v6+vfp
#CPU=armv6
#OPTIMIZE_CFLAGS="-DCMP_HAVE_VFP -mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU"
#PREFIX=./android/${CPU}_vfp
#ADDITIONAL_CONFIGURE_FLAG=
#build_one
I have set my paths as above.
I built it in Eclipse and everything was OK, but when I ran it on Android it unfortunately crashed.
I found these errors in the Logcat monitor:
E/AndroidRuntime(1931): java.lang.UnsatisfiedLinkError: Couldn't load avcodec-55 from loader dalvik.system.PathClassLoader[DexPathList [1]]: findLibrary returned null
E/AndroidRuntime(1931): at java.lang.Runtime.loadLibrary(Runtime.java:358)
In my activity I just load the libraries like this:
static {
//System.loadLibrary("avutil-52");
System.loadLibrary("avcodec-55");
System.loadLibrary("avformat-55");
System.loadLibrary("swscale-2");
System.loadLibrary("tutorial02");
}
I checked in DDMS: in app-lib there is no library at all.
The question is: can I compile ffmpeg on Windows?
If so, how can I configure build_android.sh to point to the Android $PREBUILT?
PLATFORM=$NDK/platforms/android-8/arch-arm/
PREBUILT=$NDK/prebuilt/windows-x86_64
function build_one
{
./configure --target-os=linux \
--prefix=$PREFIX \
--enable-cross-compile \
--extra-libs="-lgcc" \
--arch=arm \
--cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
--cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
--nm=$PREBUILT/bin/arm-linux-androideabi-nm \
--sysroot=$PLATFORM \
--extra-cflags=" -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 $OPTIMIZE_CFLAGS " \
--disable-shared \
I found that arm-linux-androideabi-gcc does not exist in $NDK/prebuilt/windows-x86_64/bin.
How can I configure it so that it works on Windows? Thanks in advance.