
Other articles (41)

  • General document management

    13 May 2011

    MediaSPIP never modifies the original document that is uploaded.
    For each uploaded document it performs two successive operations: it creates an additional version that can easily be viewed online, while keeping the original downloadable in case it cannot be read in a web browser; and it retrieves the original document's metadata to describe the file textually.
    The tables below explain what MediaSPIP can do (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • Support for all types of media

    10 April 2011

    Unlike many modern document-sharing applications and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and more (OpenOffice, Microsoft Office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)

On other sites (4969)

  • Texture rendering on iOS using OpenGL ES in Unity project

    30 March 2016, by Time1ess

    I'm working on a project, part of which involves streaming video to my iPhone. I use my laptop to create the video stream with ffmpeg.

    The streaming command in shell is below:

    ffmpeg \
       -f avfoundation -i "1" -s 1280*720 -r 29.97 \
       -c:v mpeg2video -q:v 20 -pix_fmt yuv420p -g 1 -threads 4 \
       -f mpegts udp://192.168.1.102:6666

    With this, I successfully create my video stream.

    In Unity, I want to decode the video stream to create a texture. After going through some ffmpeg and Unity tutorials, I created my link library. Some of the code is below (ask me if more is needed):

    In my library:

    buffer alloc:

    uint8_t *buffer;
    int buffer_size;
    buffer_size = avpicture_get_size(AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT);

    buffer = (uint8_t *) av_malloc(buffer_size*sizeof(uint8_t));

    avpicture_fill((AVPicture *) pFrameRGB, buffer, AV_PIX_FMT_RGBA,
                  VIEW_WIDTH, VIEW_HEIGHT);
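
    The allocation of pFrame and pFrameRGB themselves is not shown here; as a rough sketch only (the names simply follow the snippets, using the FFmpeg 3.0 API), it would typically look like:

    /* Presumed setup, not part of the original snippets: */
    AVFrame *pFrame    = av_frame_alloc();   /* holds the decoded YUV frame */
    AVFrame *pFrameRGB = av_frame_alloc();   /* RGBA frame backed by "buffer" above */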

    getContext:

       is->sws_ctx = sws_getContext
       (
        is->video_st->codec->width,
        is->video_st->codec->height,
        is->video_st->codec->pix_fmt,
        VIEW_WIDTH,
        VIEW_HEIGHT,
        AV_PIX_FMT_RGBA,
        SWS_BILINEAR,
        NULL,
        NULL,
        NULL
        );

    sws_scale:

    sws_scale(
             is->sws_ctx,
             (uint8_t const * const *)pFrame->data,
             pFrame->linesize,
             0,
             is->video_st->codec->height,
             pFrameRGB->data,
             pFrameRGB->linesize
             );
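
    For context, a minimal sketch of the decode loop that drives the calls above (simplified; is->format_ctx and is->video_stream_index are placeholder names for fields not shown in these snippets, and avcodec_decode_video2 is the decode call available in ffmpeg 3.0):

    AVPacket packet;
    int got_frame = 0;

    /* Hypothetical driver loop: read packets, decode, convert to RGBA. */
    while (av_read_frame(is->format_ctx, &packet) >= 0) {
        if (packet.stream_index == is->video_stream_index) {
            avcodec_decode_video2(is->video_st->codec, pFrame, &got_frame, &packet);
            if (got_frame) {
                /* Same conversion as above: YUV420P -> RGBA into pFrameRGB. */
                sws_scale(is->sws_ctx,
                          (uint8_t const * const *)pFrame->data, pFrame->linesize,
                          0, is->video_st->codec->height,
                          pFrameRGB->data, pFrameRGB->linesize);
            }
        }
        av_packet_unref(&packet);
    }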

    texture render:

    static void UNITY_INTERFACE_API OnRenderEvent(int texID)
    {
       GLuint gltex = (GLuint)(size_t)(texID);

       glBindTexture(GL_TEXTURE_2D, gltex);

       glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
                       GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);

       glGetError();
       return;
    }
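
    Note that the glGetError() call above discards its result; as a small sketch only (assuming <cstdio> is included for fprintf), the value could be checked to confirm the upload succeeded:

    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        fprintf(stderr, "glTexSubImage2D failed: 0x%x\n", err);   /* e.g. GL_INVALID_OPERATION */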

    extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc()
    {
       return OnRenderEvent;
    }

    In Unity:

    texture creation:

       private Texture2D texture;
       private int texID;
       texture = new Texture2D (width, height, TextureFormat.RGBA32, false);
       texture.filterMode = FilterMode.Point;
       texture.Apply ();
       GetComponent<Renderer> ().material.mainTexture = texture;
       texID = texture.GetNativeTexturePtr ().ToInt32();

    update func:

       void Update ()
       {
           GL.IssuePluginEvent(GetRenderEventFunc(), texID);
       }

    Video stream info:

    Input #0, mpegts, from 'udp://0.0.0.0:6666':
     Duration: N/A, start: 2.534467, bitrate: N/A
     Program 1
       Metadata:
         service_name    : Service01
         service_provider: FFmpeg
       Stream #0:0[0x100]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc

    Leaving other details aside: my library works fine in the Unity simulator, but when I compiled all my libraries for arm64 and used the Xcode project that Unity created to build and run my app, I couldn't get any texture rendered on my iPhone. I checked my network and I'm sure the data had been sent to my iPhone, and the debug log showed that frames were successfully decoded and that the OnRenderEvent function was called.

    FYI:

    Unity 5.3.2f1 Personal

    Xcode 7.2.1

    iOS 9.2.1

    ffmpeg 3.0

  • ffmpeg x264 encoding cpu usage

    4 December 2016, by Amin Fazlali

    I want to convert an mkv-formatted video to mp4 using the ffmpeg application, and for that I ran the command below in a terminal:

    ffmpeg -y -i c38a4990774b3c23.mkv -c:v libx264 -c:a aac -r 25 -strict -2 -map_metadata -1 -movflags faststart -vf "crop=1920:800:0:4, scale=iw*min(426/iw\,240/ih):ih*min(426/iw\,240/ih), pad=426:240:(426-iw*min(426/iw\,240/ih))/2:(240-ih*min(426/iw\,240/ih))/2, setsar=sar=1" output.mp4

    I have compiled ffmpeg with the --enable-pthread configuration option.

    When I run this command on my personal PC with a 3.2 GHz quad-core CPU, it uses 60% of the overall CPU and encodes video at 150 fps; but when I run it on a production server with eight 2.4 GHz dual-core CPUs (16 cores), it only uses up to 20% of the overall CPU and encodes video at 97 fps.

    I have also tried a ramdisk, but got no performance improvement.

  • Running error: [h264 @ 0x10af4a0] AVC: nal size [big number]

    27 March 2017, by Beanocean

    I am using the libavformat APIs to get video frames from an MP4 video file. My code (C++) runs fine on my personal computer, but when I try to deploy it to a computing server, something strange happens. In the function av_read_frame(), some errors appear.

    [h264 @ 0x10af4a0] AVC: nal size 555453589
    [h264 @ 0x10af4a0] AVC: nal size 555453589
    [h264 @ 0x10af4a0] no frame!

    My code is like this:

    if (av_read_frame(_p_format_ctx, &_packet) < 0) {
       return false;
    }
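
    As a sketch only (keeping the same variable names, and assuming <cstdio> and libavutil/error.h are included for av_err2str), one way to see which concrete error av_read_frame is reporting is to keep the return code and log it:

    int ret = av_read_frame(_p_format_ctx, &_packet);
    if (ret < 0) {
        /* Log the specific error instead of silently returning false. */
        fprintf(stderr, "av_read_frame: %s\n", av_err2str(ret));
        return false;
    }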

    When this error occurs, the program doesn't exit, but the final results are wrong.

    The OS of the computing server is Linux, with kernel 2.6.32.
    The version of FFmpeg is 3.2.4.
    The version of gcc is 4.8.2.