Advanced search

Media (91)

Other articles (40)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The HTML5 player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010, by

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information about the file's audio and video streams; generation of a thumbnail: extraction of a (...)

On other sites (7892)

  • ffmpeg command line working on emulator but not on device in Android

    18 April 2013, by rams

    I want to convert a sequence of images into a video. I completed this task successfully in the emulator, but on a real device it does not work. What is the problem? I put the ffmpeg executable and its libraries in the system/bin and system/lib folders.

    // Make the ffmpeg binary executable, then run ffmpeg in the working directory `dir`
    Process chmod = Runtime.getRuntime().exec("chmod 777 /system/bin/ffmpeg");
    chmod.waitFor();
    chperm = Runtime.getRuntime().exec("ffmpeg -f image2 -i img%4d.jpg adi.avi", null, dir);
    chperm.waitFor(); // wait for ffmpeg to finish before reporting success
    Toast.makeText(getBaseContext(), "success", Toast.LENGTH_SHORT).show();

  • x264 & libavcodec

    25 January 2012, by moose

    After spending a considerable amount of time trying to build the ffmpeg static library with the x264 encoder on Windows, I have spent some more time writing an example with it.
    Of course, there are tons of "instructions" on how to build, how to use, bla bla... But none of them works on Windows. I guess the Linux guys are in a better position here. Now, the zillion-dollar question is: "What's the purpose of all that?" Not only is this useless on Windows, but I could have bought a third-party library that actually works.

    If somebody is about to say "But it works!", I must say: give me working proof. I don't care about 200x100 at 10 fps; I don't need H264 for that. Show me how to compress a single second of 1080i footage. It's H264, it's cross-platform (sounds funny if you ask me), Google is using it (it has to be perfect, right?), some more hype here...

  • sws_scale YUV -> RGB distorted image

    3 January 2015, by Sami susu

    I want to convert a YUV420P image (received from an H.264 stream) to RGB, while also resizing it, using sws_scale.
    The size of the original image is 480 × 800. Converting with the same dimensions works fine.
    But when I try to change the dimensions, I get a distorted image, with the following pattern:

    • changing to 481 × 800 will yield a distorted B&W image which looks like it’s cut in the middle
    • 482 × 800 will be even more distorted
    • 483 × 800 is distorted but in color
    • 484 × 800 is ok (scaled correctly).

    So the pattern holds: scaling only works correctly if the difference between the source and destination widths is divisible by 4.

    Here is a sample of the code I use to decode and convert the image. All calls report success.

    int srcX = 480;
    int srcY = 800;
    int dstX = 481; // or 482, 483 etc
    int dstY = 800;

    AVFrame* avFrameYUV = avcodec_alloc_frame();
    avpicture_fill((AVPicture *)avFrameYUV, decoded_yuv_frame, PIX_FMT_YUV420P, srcX , srcY);

    AVFrame *avFrameRGB = avcodec_alloc_frame();

    AVPacket avPacket;
    av_init_packet(&avPacket);
    avPacket.size = read; // size of raw data
    avPacket.data = raw_data; // raw data before decoding to YUV

    int frame_decoded = 0;
    int decoded_length = avcodec_decode_video2(g_avCodecContext, avFrameYUV, &frame_decoded, &avPacket);
    int size = dstX * dstY * 3;

    struct SwsContext *img_convert_ctx = sws_getContext(srcX, srcY, SOURCE_FORMAT, dstX, dstY, PIX_FMT_BGR24, SWS_BICUBIC, NULL, NULL, NULL);

    avpicture_fill((AVPicture *)avFrameRGB, rgb_frame, PIX_FMT_RGB24, dstX, dstY);
    sws_scale(img_convert_ctx, avFrameYUV->data, avFrameYUV->linesize, 0, srcY, avFrameRGB->data, avFrameRGB->linesize);

    // draws the resulting frame with windows BitBlt
    DrawBitmap(hdc, dstX, dstY, rgb_frame, size);

    sws_freeContext(img_convert_ctx);
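
    A "works only when the width is a multiple of 4" pattern is typical of a row-stride mismatch rather than a sws_scale bug: Windows DIBs (the kind BitBlt draws) require every scanline to be padded to a 4-byte boundary, so for 24-bit pixels any width whose byte size width*3 is not DWORD-aligned will shear the image even if the scaled data is correct. This is an assumed diagnosis (the post does not show DrawBitmap's internals); a minimal sketch of the arithmetic:

    ```c
    #include <assert.h>
    #include <stdio.h>

    /* Bytes per row in a tightly packed 24-bit buffer (linesize == width*3),
     * versus the DWORD-aligned stride a Windows DIB expects for that width. */
    static int packed_stride(int width) { return width * 3; }
    static int dib_stride(int width)    { return (width * 3 + 3) & ~3; }

    int main(void) {
        int widths[] = { 480, 481, 482, 483, 484 };
        for (int i = 0; i < 5; i++) {
            int w = widths[i];
            printf("width %d: packed=%d dib=%d mismatch=%d\n",
                   w, packed_stride(w), dib_stride(w),
                   dib_stride(w) - packed_stride(w));
        }
        /* Only widths where width*3 is already a multiple of 4 line up. */
        assert(dib_stride(480) == packed_stride(480)); /* ok, as observed */
        assert(dib_stride(481) != packed_stride(481)); /* sheared image   */
        assert(dib_stride(484) == packed_stride(484)); /* ok again at +4  */
        return 0;
    }
    ```

    If this is the cause, either copy each output row into a DWORD-padded buffer before blitting, or pass the frame's actual `avFrameRGB->linesize[0]` as the bitmap stride instead of assuming it equals `dstX * 3`. (Separately, note that the code fills `avFrameRGB` as PIX_FMT_RGB24 but the scaler context outputs PIX_FMT_BGR24; the two formats have the same stride, so this swaps colour channels but would not cause the shearing.)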