Other articles (111)

  • Authorizations overridden by plugins

    27 April 2010, by Mediaspip core

    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.

  • Submitting bugs and patches

    10 April 2011

    Unfortunately, no piece of software is ever perfect...
    If you think you have found a bug, report it in our ticket system, taking care to include the relevant information: the type and exact version of the browser in which you see the anomaly; as precise a description as possible of the problem encountered; if possible, the steps to reproduce the problem; a link to the site/page in question;
    If you think you have fixed the bug yourself (...)

On other sites (11560)

  • How to change mp4 aspect ratio to 16:9 using ffmpeg?

    1 May 2016, by user1788736

    I have an MP4 video from which I copied four minutes using ffmpeg. After uploading it to YouTube, I noticed the uploaded video has black bars on both sides (left and right). After searching for a way to remove those black bars, I found that I need to use yt:stretch=16:9. However, the yt:stretch=16:9 tag does not remove the black bars in the YouTube apps on iPhone and Samsung smart TVs.

    Could an expert help me change the aspect ratio of the original MP4 video to 16:9 using ffmpeg (without losing video quality) for re-uploading to YouTube? Thanks in advance.

    I have two types of source, with the following information:

    1) Resolution: 720x576, frame rate: 25, codec: H264 - MPEG-4 AVC (part 10) (avc1)
    2) Resolution: 848x480, frame rate: 24.804393, codec: H264 - MPEG-4 AVC (part 10) (avc1)
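
    As an aside (not part of the original question): this kind of stream information can be read with ffprobe, e.g.:

      # Print the video stream's codec, dimensions and frame rate:
      ffprobe -v error -select_streams v:0 \
        -show_entries stream=codec_name,width,height,avg_frame_rate \
        -of default=noprint_wrappers=1 originalVideo.mp4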

    The ffmpeg command used to trim the original video:

      ffmpeg -i originalVideo.mp4 -ss 00:25:55 -t 00:04:02 -acodec copy -vcodec copy videoForYoutube.mp4
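
    One possible approach (a sketch, not from the question): MP4 stores a display aspect ratio at the container level, which ffmpeg can usually rewrite without re-encoding, i.e. without quality loss; if a player ignores the container value, re-encoding to square 16:9 pixels is the fallback:

      # Lossless: rewrite the container-level display aspect ratio, copying
      # all streams (some players may still honor the aspect stored in the
      # H.264 bitstream instead):
      ffmpeg -i videoForYoutube.mp4 -aspect 16:9 -c copy videoForYoutube_169.mp4

      # Fallback (re-encodes the video): 720x576 scaled to 1024x576 is exactly
      # 16:9 with square pixels, since 576 * 16/9 = 1024:
      ffmpeg -i videoForYoutube.mp4 -vf "scale=1024:576,setsar=1" -c:a copy videoForYoutube_169.mp4
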
  • Compile FFMPEG for iOS development

    7 April 2016, by Alec Gamble

    So I've been googling around and there seem to be a lot of answers to this question... for people who know what they're doing. Unfortunately I am not one of those people. I had a version of FFMPEG I was using in a project, and now I want to port that project to iOS. I have a version of FFMPEG with LAME installed on my desktop, and in my C++ application I was just accessing it through the command line via the system() method. I also just downloaded the FFMPEG iOS build script from here:

    https://github.com/kewlbear/FFmpeg-iOS-build-script

    and the gas-preprocessor from the link on that page, but I'm unsure what to do with them now and how to get to a state where I can compile a basic FFMPEG test onto my iPhone.


    I tried adding the libraries as described in How to Build FFMpeg as iOS Framework, so the FFmpeg libraries now appear in my project hierarchy (screenshot omitted).

    but when I do:

    #import <libavcodec/avcodec.h> as suggested

    or any other import I can think of, it always errors out saying the header can't be found. I'm not sure whether this is because I'm not setting the search paths correctly, or my import line is wrong, or something else...
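
    A rough sketch of the usual workflow, assuming the script name and output layout described in the kewlbear/FFmpeg-iOS-build-script README (verify against the repo):

      # Build static FFmpeg libraries for the desired architectures; the
      # script collects them in an FFmpeg-iOS/ output directory:
      ./build-ffmpeg.sh arm64 x86_64

      # In Xcode: add the FFmpeg-iOS/lib/*.a files under "Link Binary With
      # Libraries" and add FFmpeg-iOS/include to Header Search Paths, after
      # which the following should resolve:
      #   #include <libavcodec/avcodec.h>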

  • Texture rendering on iOS using OpenGL ES in Unity project

    30 March 2016, by Time1ess

    I'm working on a project, part of which is streaming video to my iPhone. I use ffmpeg on my laptop to create the video stream and send it to my iPhone.

    The shell command for the stream is below:

    ffmpeg \
       -f avfoundation -i "1" -s 1280x720 -r 29.97 \
       -c:v mpeg2video -q:v 20 -pix_fmt yuv420p -g 1 -threads 4 \
       -f mpegts udp://192.168.1.102:6666

    With this, I successfully create my video stream.
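
    (As a sanity check, not from the question: such a stream can be verified on the receiving end with ffplay.)

      # Listen on the same UDP port and render the incoming MPEG-TS stream:
      ffplay udp://0.0.0.0:6666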

    In Unity, I want to decode the video stream into a texture. After going through some ffmpeg and Unity tutorials, I created my native plugin library. Some of the code is below (ask me if more is needed):

    In my library:

    buffer alloc:

    uint8_t *buffer;
    int buffer_size;

    // Size of one RGBA frame at the target dimensions.
    buffer_size = avpicture_get_size(AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT);
    buffer = (uint8_t *) av_malloc(buffer_size * sizeof(uint8_t));

    // Point pFrameRGB's data/linesize members at the allocated buffer.
    avpicture_fill((AVPicture *) pFrameRGB, buffer, AV_PIX_FMT_RGBA,
                   VIEW_WIDTH, VIEW_HEIGHT);

    getContext:

    // Conversion context: decoded frame size/format -> RGBA at the view size.
    is->sws_ctx = sws_getContext(
        is->video_st->codec->width,
        is->video_st->codec->height,
        is->video_st->codec->pix_fmt,
        VIEW_WIDTH,
        VIEW_HEIGHT,
        AV_PIX_FMT_RGBA,
        SWS_BILINEAR,
        NULL, NULL, NULL);

    sws_scale:

    // Convert the decoded frame into the RGBA buffer used for the upload.
    sws_scale(is->sws_ctx,
              (uint8_t const * const *)pFrame->data,
              pFrame->linesize,
              0,
              is->video_st->codec->height,
              pFrameRGB->data,
              pFrameRGB->linesize);

    texture render:

    // Called by Unity on the render thread; uploads the latest RGBA frame
    // into the native OpenGL ES texture.
    static void UNITY_INTERFACE_API OnRenderEvent(int texID)
    {
        GLuint gltex = (GLuint)(size_t)(texID);

        glBindTexture(GL_TEXTURE_2D, gltex);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
                        GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);
        glGetError();
    }

    extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc()
    {
        return OnRenderEvent;
    }

    In Unity:

    texture created:

    private Texture2D texture;
    private int texID;
    texture = new Texture2D (width, height, TextureFormat.RGBA32, false);
    texture.filterMode = FilterMode.Point;
    texture.Apply ();
    GetComponent<Renderer> ().material.mainTexture = texture;
    texID = texture.GetNativeTexturePtr ().ToInt32();

    update func:

    void Update ()
    {
        // Queue the native render callback on Unity's render thread,
        // passing the texture's native id as the event id.
        GL.IssuePluginEvent(GetRenderEventFunc(), texID);
    }

    Video stream info:

    Input #0, mpegts, from 'udp://0.0.0.0:6666':
     Duration: N/A, start: 2.534467, bitrate: N/A
     Program 1
       Metadata:
         service_name    : Service01
         service_provider: FFmpeg
       Stream #0:0[0x100]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc

    Other details aside: my library works fine in the Unity simulator, but when I compiled all my libraries for arm64 and used the Xcode project that Unity generated to build my app and ran it, I couldn't get any texture rendered on my iPhone. I checked my network and I'm sure the data is reaching my iPhone, and the debug log shows that frames are being decoded successfully and that the OnRenderEvent function is being called.

    FYI:

    Unity 5.3.2f1 Personal

    Xcode 7.2.1

    iOS 9.2.1

    ffmpeg 3.0