Advanced search

Media (91)

Other articles (100)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Emballe médias: what is it for?

    4 February 2011

    This plugin is designed to manage sites for publishing documents of all types.
    It creates "media", namely: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether it is audio, video, image or text; only one document can be linked to a so-called "media" article;

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (9316)

  • Where is my ffmpeg stream getting saved to?

    14 March 2013, by Chris

    I'm just starting to explore ffmpeg (ultimately for use with OpenCV), and I'm running this command:

    root@beaglebone:/# ffmpeg -f video4linux2 -r 25 -s 640x480 -i /dev/video0 /out.avi

    At that point the camera indicator light turns on and it appears to be capturing. However, when I end it with CTRL+C, the file is nowhere to be found.

    Any thoughts?

    Full output:

    root@beaglebone:/# ffmpeg -f video4linux2 -r 25 -s 640x480 -i /dev/video0 /out.avi
    ffmpeg version v0.7.4, Copyright (c) 2000-2011 the Libav developers
     built on Oct  9 2012 10:50:57 with gcc 4.5.4 20120305 (prerelease)
     configuration: --enable-shared --enable-pthreads --enable-gpl --enable-postproc --enable-avfilter --cross-prefix=arm-angstrom-linux-gnueabi- --prefix=/usr --enable-ffserver --enable-ffplay --enable-x11grab --enable-libtheora --enable-libvorbis --arch=arm --target-os=linux --enable-cross-compile --extra-cflags=' -fexpensive-optimizations -fomit-frame-pointer -O4 -ffast-math -march=armv7-a -fno-tree-vectorize -mthumb-interwork -mfloat-abi=softfp -mfpu=neon -mtune=cortex-a8 --sysroot=/home/koen/setup-scripts/build/tmp-angstrom_v2012_05-eglibc/sysroots/beaglebone' --extra-ldflags='-Wl,-O1 -Wl,--hash-style=gnu -Wl,--as-needed' --sysroot=/home/koen/setup-scripts/build/tmp-angstrom_v2012_05-eglibc/sysroots/beaglebone --enable-hardcoded-tables --cpu=cortex-a8
     libavutil    51.  7. 0 / 51.  7. 0
     libavcodec   53.  6. 0 / 53.  6. 0
     libavformat  53.  3. 0 / 53.  3. 0
     libavdevice  53.  0. 0 / 53.  0. 0
     libavfilter   2.  4. 0 /  2.  4. 0
     libswscale    2.  0. 0 /  2.  0. 0
     libpostproc  52.  0. 0 / 52.  0. 0
    ^C
    root@beaglebone:/# ls
    bin   dev  home  lost+found  mnt  proc  sbin  tmp  var
    boot  etc  lib   media       opt  run   sys   usr
    root@beaglebone:/#
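
    For what it's worth, the transcript above stops right after the version banner, before ffmpeg prints any "Input #0" or "Output #0" information, which suggests the v4l2 input was still being opened when CTRL+C was hit, so no output file had been written yet. A minimal way to narrow this down (purely a sketch; the -loglevel verbose flag, the -t 5 duration and the /root/out.avi path are illustrative additions, not from the question) is to raise the log level, record for a fixed time, and then check the exact output path:

    root@beaglebone:/# ffmpeg -loglevel verbose -f video4linux2 -r 25 -s 640x480 -i /dev/video0 -t 5 /root/out.avi
    root@beaglebone:/# ls -l /root/out.avi
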
  • Image to MPEG on Linux works, same code on Android = green video

    27 November 2014, by JScoobyCed

    EDIT
    I have checked the execution and found that the error is not (yet) at the swscale point. My current issue is that the JPG image is not found:
    No such file or directory
    when calling avformat_open_input(&pFormatCtx, imageFileName, NULL, NULL);
    Before you tell me I need to register anything, I can tell you I already did (I updated the code below).
    I also added the Android permission to access external storage (I don't think it is related to Android, since I can already write to /mnt/sdcard/, where the image is also located). See the diagnostic sketch at the end of this question.
    END EDIT

    I have been through several tutorials (including the few posted on SO, e.g. http://dranger.com/ffmpeg/, how to compile ffmpeg for Android..., and the dolphin-player source code). Here is what I have:
    . Compiled ffmpeg for Android
    . Ran basic tutorials using the NDK to create a dummy video on my Android device
    . Been able to generate an MPEG2 video from images on Ubuntu, using a modified version of the dummy-video code above and a lot of Googling
    . Running the new code on an Android device gives a green-screen video (duration 1 second, whatever the number of frames I encode)

    I saw another post about an iPhone in a similar situation which mentioned that ARM processor optimizations could be the culprit. I tried a few extra ldflags (-arch armv7-a and similar) without success.

    I include at the end the code that loads the image. Is there something that has to be done differently on Android than on Linux? Is my ffmpeg build not correct for Android video encoding?

    void copyFrame(AVCodecContext *destContext, AVFrame* dest,
               AVCodecContext *srcContext, AVFrame* source) {
    struct SwsContext *swsContext;
    swsContext = sws_getContext(srcContext->width, srcContext->height, srcContext->pix_fmt,
                   destContext->width, destContext->height, destContext->pix_fmt,
                   SWS_FAST_BILINEAR, NULL, NULL, NULL);
    sws_scale(swsContext, source->data, source->linesize, 0, srcContext->height, dest->data, dest->linesize);
    sws_freeContext(swsContext);
    }

    int loadFromFile(const char* imageFileName, AVFrame* realPicture, AVCodecContext* videoContext) {
    AVFormatContext *pFormatCtx = NULL;
    avcodec_register_all();
    av_register_all();

    int ret = avformat_open_input(&pFormatCtx, imageFileName, NULL, NULL);
    if (ret != 0) {
       // ERROR happening here
       // Can't open image file. Use strerror(AVERROR(ret)) for details
       return ERR_CANNOT_OPEN_IMAGE;
    }

    AVCodecContext *pCodecCtx;

    pCodecCtx = pFormatCtx->streams[0]->codec;
    pCodecCtx->width = W_VIDEO;
    pCodecCtx->height = H_VIDEO;
    pCodecCtx->pix_fmt = PIX_FMT_YUV420P;

    // Find the decoder for the video stream
    AVCodec *pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
    if (!pCodec) {
       // Codec not found
       return ERR_CODEC_NOT_FOUND;
    }

    // Open codec
    if (avcodec_open(pCodecCtx, pCodec) < 0) {
       // Could not open codec
       return ERR_CANNOT_OPEN_CODEC;
    }

    //
    AVFrame *pFrame;

    pFrame = avcodec_alloc_frame();

    if (!pFrame) {
       // Can't allocate memory for AVFrame
       return ERR_CANNOT_ALLOC_MEM;
    }

    int frameFinished;
    int numBytes;

    // Determine required buffer size and allocate buffer
    numBytes = avpicture_get_size(PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);
    uint8_t *buffer = (uint8_t *) av_malloc(numBytes * sizeof (uint8_t));

    avpicture_fill((AVPicture *) pFrame, buffer, PIX_FMT_YUV420P, pCodecCtx->width, pCodecCtx->height);
    AVPacket packet;
    int res = 0;
    while (av_read_frame(pFormatCtx, &packet) >= 0) {
       if (packet.stream_index != 0)
           continue;

       ret = avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
       if (ret > 0) {
           // now, load the useful info into realPicture
           copyFrame(videoContext, realPicture, pCodecCtx, pFrame);
           // Free the packet that was allocated by av_read_frame
           av_free_packet(&packet);
           return 0;
       } else {
           // Error decoding frame. Use strerror(AVERROR(ret)) for details
           res = ERR_DECODE_FRAME;
       }
    }
    av_free(pFrame);

    // close codec
    avcodec_close(pCodecCtx);

    // Close the image file
    av_close_input_file(pFormatCtx);

    return res;
    }

    Some ./configure options:
    --extra-cflags="-O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 -mfloat-abi=softfp -mfpu=vfp -marm -march=armv7-a -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -D_LARGEFILE64_SOURCE"

    --extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog"

    --arch=armv7-a --enable-armv5te --enable-armv6 --enable-armvfp --enable-memalign-hack
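
    As a side note (not part of the original question): since the comments in loadFromFile() suggest looking at strerror(AVERROR(ret)) for details, here is a small sketch of how the failing avformat_open_input() call can be turned into a readable message with libavutil's av_strerror() and the Android log. The helper name and the "ImageLoad" tag are made up for illustration, and it assumes av_register_all() has already been called, as it is in the code above.

    #include <libavformat/avformat.h>
    #include <libavutil/error.h>
    #include <android/log.h>

    // Sketch only: report why an input file could not be opened.
    static int try_open_image(const char *imageFileName) {
        AVFormatContext *pFormatCtx = NULL;
        int ret = avformat_open_input(&pFormatCtx, imageFileName, NULL, NULL);
        if (ret < 0) {
            char errbuf[128];
            av_strerror(ret, errbuf, sizeof(errbuf));  // e.g. "No such file or directory"
            __android_log_print(ANDROID_LOG_ERROR, "ImageLoad",
                                "avformat_open_input(%s) failed: %s", imageFileName, errbuf);
            return ret;
        }
        av_close_input_file(pFormatCtx);  // same old-style close used in the question's code
        return 0;
    }

    If this still reports "No such file or directory", the exact path string handed down from the Java side is the first thing to double-check.
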

  • ffmpeg + ffserver: "Broken ffmpeg default settings detected"

    18 October 2012, by Chris Nolet

    I'm just trying to connect ffmpeg to ffserver and stream rawvideo.

    I keep getting the error "broken ffmpeg default settings detected" from libx264, and then "Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height" from ffmpeg, before it exits.

    I'm launching ffmpeg with the command: ffmpeg -f x11grab -s 320x480 -r 10 -i :0.0 -tune zerolatency http://localhost:8090/feed1.ffm

    My ffserver.conf file (for ffserver) looks like this:

    Port 8090
    BindAddress 0.0.0.0
    MaxHTTPConnections 2000
    MaxClients 1000
    MaxBandwidth 1000
    CustomLog -
    NoDaemon

    <Feed feed1.ffm>
     ACL allow 127.0.0.1
    </Feed>

    <stream>
     Feed feed1.ffm
     Format asf

     NoAudio

     VideoBitRate 128
     VideoBufferSize 400
     VideoFrameRate 24
     VideoSize 320x480

     VideoGopSize 12

     VideoQMin 1
     VideoQMax 31

     VideoCodec libx264
    </stream>

    <stream>
     Format status
    </stream>

    And the full output is:

    ffmpeg version N-45614-g364c60b Copyright (c) 2000-2012 the FFmpeg developers
     built on Oct 17 2012 04:34:04 with Apple clang version 4.1 (tags/Apple/clang-421.11.65) (based on LLVM 3.1svn)
     configuration: --enable-shared --enable-libx264 --enable-libmp3lame --enable-x11grab --enable-gpl --enable-version3 --enable-nonfree --enable-hardcoded-tables --cc=/usr/bin/clang --host-cflags='-Os -w -pipe -march=native -Qunused-arguments -mmacosx-version-min=10.7' --extra-cflags='-x objective-c' --extra-ldflags='-framework Foundation -framework Cocoa -framework CoreServices -framework ApplicationServices -lobjc'
     libavutil      51. 76.100 / 51. 76.100
     libavcodec     54. 66.100 / 54. 66.100
     libavformat    54. 32.101 / 54. 32.101
     libavdevice    54.  3.100 / 54.  3.100
     libavfilter     3. 19.103 /  3. 19.103
     libswscale      2.  1.101 /  2.  1.101
     libswresample   0. 16.100 /  0. 16.100
     libpostproc    52.  1.100 / 52.  1.100
    [x11grab @ 0x7f87dc01e200] device: :0.0 -> display: :0.0 x: 0 y: 0 width: 320 height: 480
    [x11grab @ 0x7f87dc01e200] Estimating duration from bitrate, this may be inaccurate
    Input #0, x11grab, from ':0.0':
     Duration: N/A, start: 1350517708.386699, bitrate: 49152 kb/s
       Stream #0:0: Video: rawvideo (BGRA / 0x41524742), bgra, 320x480, 49152 kb/s, 10 tbr, 1000k tbn, 10 tbc
    [tcp @ 0x7f87dc804120] TCP connection to localhost:8090 failed: Connection refused
    [tcp @ 0x7f87dc804b20] TCP connection to localhost:8090 failed: Connection refused
    [libx264 @ 0x7f87dd801000] broken ffmpeg default settings detected
    [libx264 @ 0x7f87dd801000] use an encoding preset (e.g. -vpre medium)
    [libx264 @ 0x7f87dd801000] preset usage: -vpre <speed> -vpre <profile>
    [libx264 @ 0x7f87dd801000] speed presets are listed in x264 --help
    [libx264 @ 0x7f87dd801000] profile is optional; x264 defaults to high
    Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
     Metadata:
       creation_time   : now
       Stream #0:0: Video: h264, yuv420p, 160x128, q=2-31, 128 kb/s, 1000k tbn, 10 tbc
    Stream mapping:
     Stream #0:0 -> #0:0 (rawvideo -> libx264)
    Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height

    Any help much appreciated :)
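
    For what it's worth, the libx264 lines in the output above spell out their own suggestion: give the encoder a preset. A hedged sketch of the same command with a preset added (the "medium" name is simply the example the log gives, and it assumes the libx264 ffpreset files that ship with FFmpeg are installed where ffmpeg can find them):

    ffmpeg -f x11grab -s 320x480 -r 10 -i :0.0 -vcodec libx264 -vpre medium -tune zerolatency http://localhost:8090/feed1.ffm

    The two "Connection refused" lines are a separate matter: they mean nothing was accepting connections on localhost:8090 at the moment of those attempts, i.e. ffserver was not yet reachable when ffmpeg first tried to connect.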