Media (91)

Other articles (93)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010, by

    The central/master site of the farm needs several additional plugins, compared with the channel sites, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-verification API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (7805)

  • How can I use the avformat_open_input function (ffmpeg)?

    5 May 2016, by johncarrie

    I have bought an HD HDMI-to-UVC device, which takes an HDMI video source as input and outputs video over UVC.
    I connect it between laptop A (HDMI input source) and laptop B (USB output).
    I have installed Ubuntu 14.04 desktop on laptop B and Windows 8.1 on laptop A.
    Laptop B also has the ffmpeg, OpenCV and SDL libraries installed.
    My target is to capture video and audio from A on B, via the HD HDMI-to-UVC device.
    So I have decided to use ffmpeg’s libav libraries.
    Following an example I found, I used the avformat_open_input function, but it returned an error.
    I think the error occurred because the second parameter of avformat_open_input (const char *url) was invalid.
    I understand that the url should be something like video:video device name:audio:audio card name.
    How can I indicate the device names? (A sketch of the kind of call I am trying to make follows the device listings below.)

    Here’s the result of the v4l2-ctl --list-devices command in the terminal.

    HD WebCam (usb-0000:02:03.0-1):  
       /dev/video0
    HD TV CAM (usb-0000:03:00.0-2.1):  
       /dev/video1

    And the result of arecord -l in the terminal.

    **** List of CAPTURE Hardware Devices ****
    card 0: AudioPCI [Ensoniq AudioPCI], device 0: ES1371/1 [ES1371 DAC2/ADC]
      Subdevices: 1/1
      Subdevice #0: subdevice #0
    card 1: CAM [HD TV CAM], device 0: USB Audio [USB Audio]
      Subdevices: 1/1
      Subdevice #0: subdevice #0
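
    For illustration, here is the kind of call I am trying to get working: a minimal, untested sketch that opens the HD TV CAM through libavdevice's video4linux2 input format. "/dev/video1" comes from the v4l2-ctl listing above, and open_capture is just a name made up for this sketch.

    #include <libavdevice/avdevice.h>
    #include <libavformat/avformat.h>

    int open_capture(AVFormatContext **fmt_ctx)
    {
        avdevice_register_all();   /* register the v4l2/alsa capture devices */

        /* "video4linux2" is the libavdevice demuxer for /dev/video* nodes */
        AVInputFormat *v4l2 = av_find_input_format("video4linux2");
        if (!v4l2)
            return AVERROR_DEMUXER_NOT_FOUND;

        /* "/dev/video1" is the HD TV CAM from v4l2-ctl --list-devices */
        int ret = avformat_open_input(fmt_ctx, "/dev/video1", v4l2, NULL);
        if (ret < 0)
            return ret;

        return avformat_find_stream_info(*fmt_ctx, NULL);
    }

    Audio could presumably be opened the same way through the "alsa" input format, with "hw:1,0" (card 1, device 0 from the arecord -l output) as the url.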

    Thank you.

  • texture rendering issue on iOS using OpenGL ES in Unity project

    28 March 2016, by Time1ess

    I’m working on a project, part of which involves streaming video to my iPhone. Currently I use my laptop to create the video stream to my iPhone with ffmpeg. The shell command is below:

    ffmpeg \
       -f avfoundation -i "1" -s 1280*720 -r 29.97 \
       -c:v mpeg2video -q:v 20 -pix_fmt yuv420p -g 1 -threads 4\
       -f mpegts udp://192.168.1.102:6666

    with this, I successfully create my video stream.

    In Unity, I want to decode the video stream into a texture. Since I’m new to both ffmpeg and Unity, I went through some ffmpeg and Unity tutorials and followed them to create my native library. Some of the code is below (ask me if more is needed):

    In my library:

    buffer alloc:

    uint8_t *buffer;
    int buffer_size;

    /* size of one RGBA frame at the target view dimensions */
    buffer_size = avpicture_get_size(AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT);

    buffer = (uint8_t *) av_malloc(buffer_size * sizeof(uint8_t));

    /* point pFrameRGB's data/linesize at the allocated buffer */
    avpicture_fill((AVPicture *) pFrameRGB, buffer, AV_PIX_FMT_RGBA,
                  VIEW_WIDTH, VIEW_HEIGHT);

    getContext:

       /* scaler context: convert decoded frames to RGBA at the view size */
       is->sws_ctx = sws_getContext
       (
        is->video_st->codec->width,
        is->video_st->codec->height,
        is->video_st->codec->pix_fmt,
        VIEW_WIDTH,
        VIEW_HEIGHT,
        AV_PIX_FMT_RGBA,
        SWS_BILINEAR,
        NULL,
        NULL,
        NULL
        );

    sws_scale:

    sws_scale(
             is->sws_ctx,
             (uint8_t const * const *)pFrame->data,
             pFrame->linesize,
             0,
             is->video_st->codec->height,
             pFrameRGB->data,
             pFrameRGB->linesize
             );
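
    For context, the decode loop that produces pFrame and feeds the sws_scale call above looks roughly like this (simplified sketch; is->pFormatCtx and is->videoStream are placeholder names for my own struct fields):

    AVPacket packet;
    int frameFinished = 0;

    while (av_read_frame(is->pFormatCtx, &packet) >= 0) {
        if (packet.stream_index == is->videoStream) {
            /* decode one video packet into pFrame */
            avcodec_decode_video2(is->video_st->codec, pFrame,
                                  &frameFinished, &packet);
            if (frameFinished) {
                /* convert the decoded YUV frame to RGBA for the texture */
                sws_scale(is->sws_ctx,
                          (uint8_t const * const *)pFrame->data, pFrame->linesize,
                          0, is->video_st->codec->height,
                          pFrameRGB->data, pFrameRGB->linesize);
            }
        }
        av_free_packet(&packet);
    }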

    texture render:

    static void UNITY_INTERFACE_API OnRenderEvent(int texID)
    {
       GLuint gltex = (GLuint)(size_t)(texID);

       glBindTexture(GL_TEXTURE_2D, gltex);

       glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
                       GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);

       glGetError();
       return;
    }

    extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc()
    {
       return OnRenderEvent;
    }

    In Unity:

    texture created:

       private Texture2D texture;
       private int texID;
       texture = new Texture2D (width, height, TextureFormat.RGBA32, false);
       texture.filterMode = FilterMode.Point;
       texture.Apply ();
       GetComponent<Renderer> ().material.mainTexture = texture;
       texID = texture.GetNativeTexturePtr ().ToInt32();

    update func:

       void Update ()
       {
           GL.IssuePluginEvent(GetRenderEventFunc(), texID);
       }

    Video stream info:

    Input #0, mpegts, from 'udp://0.0.0.0:6666':
     Duration: N/A, start: 2.534467, bitrate: N/A
     Program 1
       Metadata:
         service_name    : Service01
         service_provider: FFmpeg
       Stream #0:0[0x100]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc

    Leaving other details aside: my library works fine in the Unity simulator, but when I compiled all my libraries for arm64, used the Xcode project that Unity generated to build my app, and ran it, I couldn’t get any texture rendered on my iPhone. I checked my network and I’m sure the data had been sent to my iPhone, and the debug log showed me that frames were being decoded successfully and that the OnRenderEvent function was being called.

    I’m confused and have tried to find an answer on Stack Overflow, but maybe because I’m a beginner I can’t find one, so I’m asking you guys for help.

    FYI:

    Unity 5.3.2f1 Personal

    Xcode 7.2.1

    iOS 9.2.1

    ffmpeg 3.0

  • HTML5 video not playing properly in Firefox browser on OS X

    10 August 2016, by Brian J

    The issue:

    I recently made a video that I would like to embed in my website as HTML5 video. I have the video in .mp4 and .ogv formats. Here is a test page that I have set up with the video:

    http://www.briancjenkins.com/video_test/

    Here is the relevant HTML code that I used to embed the video:

    <video width="75%" controls="controls" loop="loop" poster="US_Inflation_Unemployment_Monthly_BP_Filtered.png">
     <source src="US_Inflation_Unemployment_Monthly_BP_Filtered.mp4" type="video/mp4">
     <source src="US_Inflation_Unemployment_Monthly_BP_Filtered.ogv" type="video/ogg">
     Your browser does not support the video tag.
    </video>

    The video plays perfectly on Safari (iOS and OS X), Chrome (OS X), and Firefox (iOS).

    However, the video does not play properly in Firefox on OS X. On both my desktop and my laptop, when I play the video the time bar moves, but there is no video. However, if I copy and paste the video URL into the Firefox address bar, the video plays fine. Here is the direct link to the .ogv-format video:

    http://www.briancjenkins.com/video_test/US_Inflation_Unemployment_Monthly_BP_Filtered.ogv

    I would like to know why the video is not playing in Firefox on OS X and how to fix the problem.

    What I know about the issue:

    I know that plenty of people have had trouble with HTML5 video and Firefox. I also know from reading other questions that in some cases the server sends an incorrect MIME type. I ran:

    $ curl -I http://www.briancjenkins.com/video_test/US_Inflation_Unemployment_Monthly_BP_Filtered.ogv

    and found that this does not seem to be the case here:

    Server: GitHub.com
    Content-Type: video/ogg

    which isn’t too surprising, since Firefox will play the video hosted by GitHub, just not when it’s embedded in a webpage. Note that I also tried converting to .webm as an alternative to .ogv, but that didn’t solve the issue.

    About the video

    The video was created in Python with Matplotlib. I used Matplotlib to save the video to .mp4 and then used ffmpeg to convert the video to .ogv format. Here is the ffmpeg command I used:

    $ ffmpeg -i US_Inflation_Unemployment_Monthly_BP_Filtered.mp4 -acodec libvorbis -ac 2 -ab 128k -ar 44100 -b:v 1800k  US_Inflation_Unemployment_Monthly_BP_Filtered.ogv

    It’s possible that I am not doing the conversion correctly, but I have tried several different variations of the conversion command. The video in question is hosted by GitHub.