Other articles (12)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded as MP4, Ogv and WebM (supported by HTML5), with MP4 also supported by Flash.
    Audio files are encoded as MP3 and Ogg (supported by HTML5), with MP3 also supported by Flash.
    Where possible, text is analyzed in order to retrieve the data needed for search-engine indexing, and is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

  • The plugin: Podcasts.

    14 July 2010, by

    The problem of podcasting is, once again, one that reveals the state of standardization of data transport on the Internet.
    Two interesting formats exist: the one developed by Apple, strongly geared toward the use of iTunes, whose spec is here; and the "Media RSS Module" format, which is more "open" and is notably backed by Yahoo and the Miro software.
    File types supported in the feeds
    Apple's format only allows the following types in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

On other sites (7172)

  • Create video from images, add audio and animation within images without FFMPEG in Android

    28 February 2017, by Vivek Bhatt

    I want to create a video from multiple images without using FFMPEG in Android. I also have to add multiple audio tracks over it. And finally there should be some animation within each image. I have tried this using FFMPEG, but it takes a very long time, say 4-5 minutes for 10 photos. Is there any other, more efficient way to do this?

    It is a 3-step process:

    1. First, I create a video from each image, with audio and added text.
    2. Then I concatenate all the videos into a single video.
    3. Finally, I add a background audio track to the concatenated video.

    I am sharing the commands here:

    1.
    String[] cmds = { "-y"
    , "-r", "1", "-vcodec", "mjpeg", "-loop", "1", "-t", "1"
    , "-i", image
    , "-i", audio
    , "-vf"
    , "zoompan=z=’if(lte(zoom,1),1.1,max(1.001,zoom-0.0015))’:d=125,drawbox=y=ih/PHI:color=black@0.4:width=iw:height=120:t=max,drawtext=fontfile=/storage/emulated/0/ffmpeg/font.ttf:text="+text+":fontcolor=white:fontsize=60:x=(w-tw)/2:y=(h/PHI)+th"
    , "-c:v", "libx264"
    , "-preset", "ultrafast", "-t", "5", "-c:a", "aac", "-b:a", "128k", "-pix_fmt", "yuv420p"
    , "video1.mp4"
    };

    2.

    String cmd = "-y -i video1.mp4 -i video2.mp4 -i video3.mp4 -filter_complex concat=n=3:v=1:a=1 -preset ultrafast concat_video.mp4";
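    // concat=n=3:v=1:a=1 joins the three inputs, each contributing one video and one audio stream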

    3.

    cmd = "-y -i bg_audio.mp3 -i concat_video.mp4 -shortest -preset ultrafast -filter_complex [a1][a2]amerge,pan=stereo:c0

  • Stream publishing using ffmpeg rtmp: network bandwidth not fully utilized

    14 February 2017, by DeducibleSteak

    I’m developing an application that needs to publish a media stream to an rtmp "ingestion" url (as used in YouTube Live, or as input to Wowza Streaming Engine, etc), and I’m using the ffmpeg library (programmatically, from C/C++, not the command line tool) to handle the rtmp layer. I’ve got a working version ready, but am seeing some problems when streaming higher bandwidth streams to servers with worse ping. The problem exists both when using the ffmpeg "native"/builtin rtmp implementation and the librtmp implementation.

    When streaming to a local target server with low ping through a good network (specifically, a local Wowza server), my code has so far handled every stream I’ve thrown at it and managed to upload everything in real time - which is important, since this is meant exclusively for live streams.

    However, when streaming to a remote server with a worse ping (e.g. the youtube ingestion urls on a.rtmp.youtube.com, which for me have 50+ms pings), lower bandwidth streams work fine, but with higher bandwidth streams the network is underutilized - for example, for a 400kB/s stream, I’m only seeing 140kB/s network usage, with a lot of frames getting delayed/dropped, depending on the strategy I’m using to handle network pushback.

    Now, I know this is not a problem with the network connection to the target server, because I can successfully upload the stream in real time when using the ffmpeg command line tool to the same target server or using my code to stream to a local Wowza server which then forwards the stream to the youtube ingestion point.

    So the network connection is not the problem and the issue seems to lie with my code.

    I’ve timed various parts of my code and found that when the problem appears, calls to av_write_frame / av_interleaved_write_frame (I never mix & match them, I am always using one version consistently in any specific build, it’s just that I’ve experimented with both to see if there is any difference) sometimes take a really long time - I’ve seen those calls sometimes take up to 500-1000ms, though the average "bad case" is in the 50-100ms range. Not all calls to them take this long, most return instantly, but the average time spent in these calls grows bigger than the average frame duration, so I’m not getting a real time upload anymore.

    The main suspect, it seems to me, could be the rtmp Acknowledgement Window mechanism, where a sender of data waits for a confirmation of receipt after sending every N bytes, before sending any more data - this would explain the available network bandwidth not being fully used, since the client would simply sit there and wait for a response (which takes a longer time because of the lower ping), instead of using the available bandwidth. Though I haven’t looked at ffmpeg’s rtmp/librtmp code to see if it actually implements this kind of throttling, so it could be something else entirely.
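
    As a side note, here is a minimal sketch of the kind of per-call timing used to gather the numbers above, assuming C++11 and the same libavformat API as in the snippets below; the helper name and the budget parameter are illustrative, not part of the original application:

    extern "C" {
    #include <libavformat/avformat.h>
    }
    #include <chrono>
    #include <cstdio>

    // Wraps av_interleaved_write_frame, measures how long one send takes,
    // and reports calls that exceed a per-frame time budget.
    static int timed_write_frame(AVFormatContext * pContext, AVPacket * pPacket, double dBudgetMs)
    {
        const auto tBefore = std::chrono::steady_clock::now();
        const int nWriteError = av_interleaved_write_frame(pContext, pPacket);
        const auto tAfter = std::chrono::steady_clock::now();

        const double dElapsedMs = std::chrono::duration<double, std::milli>(tAfter - tBefore).count();
        if (dElapsedMs > dBudgetMs)
            std::fprintf(stderr, "write_frame took %.1f ms (budget %.1f ms)\n", dElapsedMs, dBudgetMs);
        return nWriteError;
    }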

    The full code of the application is too much to post here, but here are some important snippets:

    Format context creation:

    const int nAVFormatContextCreateError = avformat_alloc_output_context2(&m_pAVFormatContext, nullptr, "flv", m_sOutputUrl.c_str());

    Stream creation:

    m_pVideoAVStream = avformat_new_stream(m_pAVFormatContext, nullptr);
    m_pVideoAVStream->id = m_pAVFormatContext->nb_streams - 1;

    m_pAudioAVStream = avformat_new_stream(m_pAVFormatContext, nullptr);
    m_pAudioAVStream->id = m_pAVFormatContext->nb_streams - 1;

    Video stream setup:

    m_pVideoAVStream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
    m_pVideoAVStream->codecpar->codec_id = AV_CODEC_ID_H264;
    m_pVideoAVStream->codecpar->width = nWidth;
    m_pVideoAVStream->codecpar->height = nHeight;
    m_pVideoAVStream->codecpar->format = AV_PIX_FMT_YUV420P;
    m_pVideoAVStream->codecpar->bit_rate = 10 * 1000 * 1000;
    m_pVideoAVStream->time_base = AVRational { 1, 1000 };

    m_pVideoAVStream->codecpar->extradata_size = int(nTotalSizeRequired);
    m_pVideoAVStream->codecpar->extradata = (uint8_t*)av_malloc(m_pVideoAVStream->codecpar->extradata_size + AV_INPUT_BUFFER_PADDING_SIZE);
    // Fill in the extradata here - I'm sure I'm doing that correctly.

    Audio stream setup:

    m_pAudioAVStream->time_base = AVRational { 1, 1000 };
    // Let's leave the creation of m_pAudioCodecContext out of the scope of this question; I'm quite sure everything is done right there.
    const int nAudioCodecCopyParamsError = avcodec_parameters_from_context(m_pAudioAVStream->codecpar, m_pAudioCodecContext);

    Opening the connection:

    const int nAVioOpenError = avio_open2(&m_pAVFormatContext->pb, m_sOutputUrl.c_str(), AVIO_FLAG_WRITE);

    Starting the stream:

    AVDictionary * pOptions = nullptr;
    const int nWriteHeaderError = avformat_write_header(m_pAVFormatContext, &pOptions);

    Sending a video frame:

    AVPacket pkt = { 0 };
    av_init_packet(&pkt);
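    // With time_base = {1, 1000}, the timestamps below are in milliseconds.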
    pkt.dts = nTimestamp;
    pkt.pts = nTimestamp;
    pkt.duration = nDuration; // I know that I have the wrong duration sometimes, but I don't think that's the issue.
    pkt.data = pFrameData;
    pkt.size = pFrameDataSize;
    pkt.flags = bKeyframe ? AV_PKT_FLAG_KEY : 0;
    pkt.stream_index = m_pVideoAVStream->index;
    const int nWriteFrameError = av_write_frame(m_pAVFormatContext, &pkt); // This is where too much time is spent.

    Sending an audio frame:

    AVPacket pkt = { 0 };
    av_init_packet(&pkt);
    pkt.pts = m_nTimestampMs;
    pkt.dts = m_nTimestampMs;
    pkt.duration = m_nDurationMs;
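    // Note: pkt.data and pkt.size are presumably filled in as for the video packet; the excerpt omits them.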
    pkt.stream_index = m_pAudioAVStream->index;
    const int nWriteFrameError = av_write_frame(m_pAVFormatContext, &pkt);

    Any ideas? Am I on the right track in thinking about the Acknowledgement Window? Or am I doing something else completely wrong?

  • Compiling FFmpeg statically using NDK with OpenSSL

    27 February 2017, by David Barishev

    I have been trying to compile FFmpeg into a static library in order to use it in my Android application, but I couldn't get it to work.

    I'm working with FFmpeg 3.2.4 and NDK r13b, using Bash on Windows 10 (Ubuntu 14.04).

    Here is what I did:

    • I made a standalone toolchain for x86_64 and API 21 using:
      python make_standalone_toolchain.py --api 21 --arch x86_64 --install-dir {}

    • Made a configuration script:

      ./configure \
      --target-os=android                                     \
      --arch=x86_64                                              \
      --prefix=/home/david/ffmpeg_x86_64_build                          \
      --cross-prefix=/home/david/x86_64_toolchain/bin/x86_64-linux-android- \
      --sysroot=/home/david/x86_64_toolchain/sysroot                    \
      --enable-cross-compile                                  \
      --pkg-config-flags="--static"                           \
      --enable-ffmpeg                                         \
      --disable-ffplay                                        \
      --disable-ffprobe                                       \
      --disable-ffserver                                      \
      --disable-doc                                           \
      --disable-htmlpages                                     \
      --disable-manpages                                      \
      --disable-podpages                                      \
      --disable-txtpages                                      \
      --extra-cflags="-fPIC"                                  \
      --extra-cxxflags="-fPIC"                                \
      --disable-shared --enable-static \
      --enable-yasm

      make
      make install

    It produced an FFmpeg executable; however, when I ran it on my API 23 emulator, I got an error message: error: only position independent executables (PIE) are supported.
    This happened even though I used -fPIC.

    How can I fix it? Also, I'm not sure about my configuration; there weren't any up-to-date sources on how to compile it correctly for every ABI (arm, arm64, x86, x86_64, mips, mips64) that I need for my application.
    I have seen many scripts, and I'm not too familiar with compiling native code, so I wasn't sure which settings I need, for example C flags etc.

    To be precise about how I tried to configure FFmpeg:

    • I need a static library
    • I only need the ffmpeg command-line utility
    • I want to compile the library for every ABI I listed above. This configuration tried to compile for x86_64.
    • Running on Android, of course

    I would greatly appreciate some help on how to configure and compile this correctly.

    EDIT

    Thanks to @Alex Cohn I have been able to compile FFmpeg; however, I couldn't get it to compile with --enable-openssl, and I have been faced with a problem.

    Since we are providing a system root, the compiler won't look for files on my host machine, so we need to compile OpenSSL for Android from source.
    I used the OpenSSL-For-Android project and its precompiled library.

    I added these lines to the configure script:

    --extra-cflags='-m32  -fPIE -I./OpenSSL-For-Android/openssl-1.0.2j/include' \
    --extra-ldflags='-m32 -Wl,-eh-frame-hdr -Wl,-m,elf_i386 -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now  -fPIE -pie -L./OpenSSL-For-Android/openssl-1.0.2j/lib/x86'

    I tried to tell configure that it should search for the OpenSSL headers there, and that it should link against that library.

    But it fails on this test, as shown in config.log:

    BEGIN /tmp/ffconf.JVeoLDmr.c
       1   extern int SSL_library_init();
       2   int main(void){ SSL_library_init(); }
    END /tmp/ffconf.JVeoLDmr.c
    /home/david/ndk/toolchains/x86-4.9/prebuilt/linux-x86_64/bin/i686-linux-android-gcc --sysroot=/home/david/ndk/platforms/android-9/arch-x86/ -isysroot /home/david/ndk/platforms/android-9/arch-x86/ -D_ISOC99_SOURCE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -Dstrtod=avpriv_strtod -DPIC -m32 -fPIE -I./OpenSSL-For-Android/openssl-1.0.2j/include -std=c99 -fomit-frame-pointer -fPIC -pthread -c -o /tmp/ffconf.x5yr8e11.o /tmp/ffconf.JVeoLDmr.c
    /home/david/ndk/toolchains/x86-4.9/prebuilt/linux-x86_64/bin/i686-linux-android-gcc -m32 -Wl,-eh-frame-hdr -Wl,-m,elf_i386 -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now -fPIE -pie -L./OpenSSL-For-Android/openssl-1.0.2j/lib/x86 --sysroot=/home/david/ndk/platforms/android-9/arch-x86/ -isysroot /home/david/ndk/platforms/android-9/arch-x86/ -Wl,--as-needed -Wl,-z,noexecstack -o /tmp/ffconf.mxzx6kFL /tmp/ffconf.x5yr8e11.o -lssl -lcrypto -lws2_32 -lgdi32 -lm -pthread
    /home/david/ndk/toolchains/x86-4.9/prebuilt/linux-x86_64/bin/../lib/gcc/i686-linux-android/4.9.x/../../../../i686-linux-android/bin/ld: error: cannot find -lssl
    /home/david/ndk/toolchains/x86-4.9/prebuilt/linux-x86_64/bin/../lib/gcc/i686-linux-android/4.9.x/../../../../i686-linux-android/bin/ld: error: cannot find -lws2_32
    /home/david/ndk/toolchains/x86-4.9/prebuilt/linux-x86_64/bin/../lib/gcc/i686-linux-android/4.9.x/../../../../i686-linux-android/bin/ld: error: cannot find -lgdi32
    /tmp/ffconf.x5yr8e11.o:ffconf.JVeoLDmr.c:function main: error: undefined reference to 'SSL_library_init'

    It seems like it doesn't use the precompiled shared library I passed in (even though it is compiled for Android), and it doesn't find another instance of OpenSSL, since the sysroot is the Android AOSP one (this is what I am thinking; please correct me if I'm wrong, as I'm not experienced enough in this field).
    How can this be fixed? I have seen some setups with pkg-config, but I have no clue how to configure it correctly.