Advanced search

Media (0)

Keyword: - Tags -/flash

No media matching your criteria is available on this site.

Other articles (70)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • Changing the graphic theme

    22 February 2011

    The graphic theme does not change the actual placement of the elements on the page; it only modifies their appearance.
    The placement can indeed change, but that change is purely visual and does not affect the semantic structure of the page.
    Modifying the graphic theme in use
    To change the graphic theme in use, the zen-garden plugin must be activated on the site.
    Then simply go to the configuration area of the (...)

On other sites (13228)

  • FFmpeg avcodec_encode_video2 access violation

    28 February 2016, by JustPingo

    I’ve been trying to encode a frame using FFmpeg with Visual C++. Here is how I do it.
    I first have a packed RGB24 image buffer. I convert it to YUV using the following formulas:

    Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
    U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
    V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;

    I implemented it like this:

    void rgb8toYuv(uchar* rgb, uchar* yuv, uint pixelAmount) {
       uchar r, g, b;
       for (uint i = 0; i < pixelAmount; i++) {
           r = rgb[3 * i];
           g = rgb[3 * i + 1];
           b = rgb[3 * i + 2];
           yuv[3 * i] = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
           yuv[3 * i + 1] = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
           yuv[3 * i + 2] = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
       }
    }

    I open everything like this (I’m using malloc because I’m used to it in C and this is my first C++ program; I guess it shouldn’t cause any problems?):

    AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVFormatContext* outContext;
    avformat_alloc_output_context2(&outContext, NULL, "mp4", filepath);

    AVStream* video = avformat_new_stream(outContext, codec);
    video->codec->bit_rate = VIDEOBITRATE;
    video->codec->width = VIDEOWIDTH;
    video->codec->height = VIDEOHEIGHT;
    video->time_base = fps;
    video->codec->gop_size = 10;
    video->codec->max_b_frames = 1;
    video->codec->pix_fmt = AV_PIX_FMT_YUV420P;
    video->codec->codec_id = AV_CODEC_ID_H264;
    video->codec->codec_type = AVMEDIA_TYPE_VIDEO;

    avio_open(&outContext->pb, filepath, AVIO_FLAG_READ_WRITE);
    avformat_write_header(outContext, NULL);

    AVFrame* frame = av_frame_alloc();
    frame->width = VIDEOWIDTH;
    frame->height = VIDEOHEIGHT;
    frame->format = AV_PIX_FMT_YUV420P;

    Then, here is the function I use to encode a frame:

    void encodeFrame(uint currentFrame, uchar* data) { // RGB data
       uchar* yuvData = (uchar*) malloc(videoWidth * videoHeight * 3);
       rgb8toYuv(data, yuvData, videoWidth * videoHeight);
       av_image_fill_arrays(frame->data, frame->linesize, yuvData, AV_PIX_FMT_YUV420P, videoWidth, videoHeight, 3); // I'm not sure about that 3, I couldn't find any documentation about it

       AVPacket* packet = (AVPacket*) malloc(sizeof(AVPacket));
       memset(packet, 0, sizeof(AVPacket));
       av_init_packet(packet);
       packet->data = NULL;
       packet->size = 0;

       frame->pts = currentFrame; // I don't know if this is correct either
       avcodec_encode_video2(video->codec, packet, frame, NULL);
       av_interleaved_write_frame(outContext, packet);
       av_packet_unref(packet);

       free(yuvData);
       free(packet);
    }

    However, this causes an Access violation writing location 0x00000000 in avcodec_encode_video2. I checked the errors returned by each of FFmpeg’s functions, and they all seem to work except av_image_fill_arrays, which returns a weird 1382400 value, although according to the debugger’s RAM-viewing tool, everything gets filled correctly.

    It seems like avcodec_encode_video2 tries to access an object that shouldn’t be NULL, but I can’t find what it could be, as I followed a lot of source examples, and I don’t know what I did wrong.

    Thanks in advance!

    EDIT: After applying the fix suggested by Edgar Rokyan (which is setting the 4th argument to an int pointer), I now get an access violation on 0x00000024, still with avformat_alloc_output_context2. I believe the problem is similar, but I still can’t find anything.

  • FFMpeg - Split Window RTMP - Delay on Second Stream

    22 February 2016, by Nick Smit

    I’m trying to combine two live RTMP sources into one split screen output with combined audio. The output is then sent on to a receiving RTMP server.

    Using the following command, which uses the same RTMP input for both feeds, I’ve managed to get the above working, however the input on the left is delayed by about 2 seconds from the one on the right.

    ffmpeg -re -i rtmp://myserver.tld/live/stream_key -re -i rtmp://myserver.tld/live/stream_key \
    -filter_complex "\
    nullsrc=size=1152x720 [base];\
    [0:v] crop=576:720 [upperleft];\
    [1:v] crop=576:720 [upperright];\
    [base][upperleft] overlay=shortest=1 [tmp1];\
    [tmp1][upperright] overlay=shortest=1:x=576;\
    [0:a][1:a]amix \
    " -c:a libfdk_aac -ar 44100 -threads 32 -c:v libx264 -g 50 -preset ultrafast -tune zerolatency -f flv rtmp://myserver.tld/live/new_stream_key

    Output:

    ffmpeg version N-76137-gb0bb1dc Copyright (c) 2000-2015 the FFmpeg developers
    built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04)
    configuration: --prefix=/home/ubuntu/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/ubuntu/ffmpeg_build/include --extra-ldflags=-L/home/ubuntu/ffmpeg_build/lib --bindir=/home/ubuntu/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree
    libavutil      55.  4.100 / 55.  4.100
    libavcodec     57.  7.100 / 57.  7.100
    libavformat    57.  8.102 / 57.  8.102
    libavdevice    57.  0.100 / 57.  0.100
    libavfilter     6. 12.100 /  6. 12.100
    libswscale      4.  0.100 /  4.  0.100
    libswresample   2.  0.100 /  2.  0.100
    libpostproc    54.  0.100 / 54.  0.100
    [flv @ 0x3a0e940] video stream discovered after head already parsed
    Input #0, flv, from 'rtmp://myserver.tld/live/stream_key':
    Metadata:
    Server          : NGINX RTMP (github.com/arut/nginx-rtmp-module)
    displayWidth    : 1152
    displayHeight   : 720
    fps             : 29
    profile         :
    level           :
    Duration: 00:00:00.00, start: 5.717000, bitrate: N/A
    Stream #0:0: Audio: aac (LC), 48000 Hz, stereo, fltp, 163 kb/s
    Stream #0:1: Video: h264 (High), yuv420p, 1152x720, 30.30 fps, 29.97 tbr, 1k tbn, 59.94 tbc
    [flv @ 0x3a49e00] video stream discovered after head already parsed
    Input #1, flv, from 'rtmp://myserver.tld/live/stream_key':
    Metadata:
    Server          : NGINX RTMP (github.com/arut/nginx-rtmp-module)
    displayWidth    : 1152
    displayHeight   : 720
    fps             : 29
    profile         :
    level           :
    Duration: 00:00:00.00, start: 9.685000, bitrate: N/A
    Stream #1:0: Audio: aac (LC), 48000 Hz, stereo, fltp, 163 kb/s
    Stream #1:1: Video: h264 (High), yuv420p, 1152x720, 30.30 fps, 29.97 tbr, 1k tbn, 59.94 tbc
    [libx264 @ 0x3a9cd60] Application has requested 32 threads. Using a thread count greater than 16 is not recommended.
    [libx264 @ 0x3a9cd60] using SAR=1/1
    [libx264 @ 0x3a9cd60] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
    [libx264 @ 0x3a9cd60] profile Constrained Baseline, level 3.1
    [libx264 @ 0x3a9cd60] 264 - core 142 r2389 956c8d8 - H.264/MPEG-4 AVC codec - Copyleft 2003-2014 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=11 lookahead_threads=11 sliced_threads=1 slices=11 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=50 keyint_min=5 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
    Output #0, flv, to 'rtmp://myserver.tld/live/new_stream_key':
    Metadata:
    Server          : NGINX RTMP (github.com/arut/nginx-rtmp-module)
    displayWidth    : 1152
    displayHeight   : 720
    fps             : 29
    profile         :
    level           :
    encoder         : Lavf57.8.102
    Stream #0:0: Video: h264 (libx264) ([7][0][0][0] / 0x0007), yuv420p, 1152x720 [SAR 1:1 DAR 8:5], q=-1--1, 25 fps, 1k tbn, 25 tbc (default)
    Metadata:
    encoder         : Lavc57.7.100 libx264
    Stream #0:1: Audio: aac (libfdk_aac) ([10][0][0][0] / 0x000A), 44100 Hz, stereo, s16, 128 kb/s (default)
    Metadata:
    encoder         : Lavc57.7.100 libfdk_aac
    Stream mapping:
    Stream #0:0 (aac) -> amix:input0
    Stream #0:1 (h264) -> crop
    Stream #1:0 (aac) -> amix:input1
    Stream #1:1 (h264) -> crop
    overlay -> Stream #0:0 (libx264)
    amix -> Stream #0:1 (libfdk_aac)
    Press [q] to stop, [?] for help
    [flv @ 0x3a0e940] Thread message queue blocking; consider raising the thread_queue_size option (current value: 512)
    frame=   81 fps= 20 q=15.0 size=     674kB time=00:00:03.24 bitrate=1703.3kbits/frame=  102 fps= 22 q=22.0 size=     945kB time=00:00:04.08 bitrate=1896.4kbits/

    Is there any way to force FFmpeg to read both RTMP inputs at the same time?

  • Unable to build x264 for Android : configure doesn't work with cross-compile flags

    18 February 2016, by Pavel S.

    There is a problem when I try to build the x264 lib in order to make it work with the ffmpeg lib. I use Ubuntu 14.04.
    I have cloned fresh x264 sources.
    But when I run the ./configure script, I get several issues:

    1. It doesn’t accept cross-compile flags (--cross-prefix, --host, --sysroot).
      Here’s how I run the configure script:

      ./configure     --enable-pic \
                 --enable-static \
                 --disable-cli \
                 --host=arm-linux (or ARM, it doesn't work either) \
                 --cross-prefix=$ANDROID_NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-$HOST_ARCH/bin/arm-linux-androideabi- \
                 --sysroot=$ANDROID_NDK/platforms/android-14/arch-arm \

    In this case the default configuration is used.

    When I don’t pass the last 3 flags, the first 3 flags are successfully applied to the config.

    2. With any flags passed to the configure script, I see these errors in config.log:

      checking for -lpostproc... no
      Failed commandline was:
      gcc conftest.c -m64  -Wall -I. -I$(SRCPATH) -std=gnu99 -mpreferred-stack-boundary=5  -lpostproc  -m64  -lm -lpthread -o conftest
      ...
      /usr/bin/ld: cannot find -lpostproc
      collect2: error: ld returned 1 exit status

    I see the same for:

    ...
    conftest.c:1:32: fatal error: libswscale/swscale.h: No such file or directory
    #include <libswscale/swscale.h>

    Here’s the full config.log:
    http://pastebin.com/U6aHKc28

    I guess I probably need to install ffmpeg on my Ubuntu to properly build x264?

    Any advice?