Other articles (47)

  • The SPIPmotion queue

    28 November 2010

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...)

  • Submit enhancements and plugins

    13 April 2011

    If you have developed a new extension to add one or more useful features to MediaSPIP, let us know and its integration into the core MediaSPIP functionality will be considered.
    You can use the development discussion list to ask for help with creating a plugin; since MediaSPIP is based on SPIP, you can also use the SPIP discussion list, SPIP-Zone.

  • Customising by adding your own logo, banner or background image

    5 September 2013

    Some themes take three customisation elements into account: adding a logo; adding a banner; adding a background image.

On other sites (6941)

  • Cannot use FFmpeg in Xcode iOS Project (file .h not found)

    19 March 2016, by BlackBox

    I followed this guide almost step by step.

    Almost, because, as a user suggested, I downloaded an already-built ffmpeg iOS library from here.

    I followed the guide from "Linking static libraries in Xcode" onwards, but I still cannot import the ffmpeg header files. (So I do have the .a files.)

    For example

    #include "avformat.h"
    // or
    #import "libavformat/avformat.h"
    // or
    #import <libavformat></libavformat>avformat.h>

    None of these work.

    Note that those .a files are currently in my project directory; indeed, if I import a .a file, Xcode doesn't complain that it can't be found, but compiling then fails with UTF-8 complaints, because .a files are compiled libraries and cannot be imported that way.

    I also set the Header Search Paths for the project as suggested, and have the config.log file, but nothing works.
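
    For reference, a minimal sketch of what the include normally looks like once Header Search Paths points at the directory containing the libavformat/, libavcodec/, etc. folders; the directory layout below is hypothetical, not from the original question:

    /* Hypothetical layout of the pre-built package:
     *   $(PROJECT_DIR)/FFmpeg-iOS/include/libavformat/avformat.h
     *   $(PROJECT_DIR)/FFmpeg-iOS/lib/libavformat.a
     * with Header Search Paths containing $(PROJECT_DIR)/FFmpeg-iOS/include. */
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* The .a files themselves are never imported in source; they are added to
     * the target's "Link Binary With Libraries" build phase (or via Library
     * Search Paths plus -lavformat, -lavcodec, ... in Other Linker Flags). */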

    I also notice that the libraries are missing from every FFmpeg example project I was able to find on GitHub.

    Any ideas?

  • Using ffmpeg to display a static image if an RTMP input source is missing

    19 March 2016, by iameli

    Here is what I would like ffmpeg to output:

    • If I am streaming from my iPhone to my RTMP server, ffmpeg should output the live video from my iPhone.
    • If not, ffmpeg should output a blank red screen.

    Here’s what I have so far. It sort of works.

    ffmpeg \
     -f lavfi \
     -re \
     -i 'color=s=320x240:r=30:c=red' \
     -thread_queue_size 512 \
     -i 'rtmp://localhost/stream/iphone' \
     -c:v libx264 \
     -f flv \
     -filter_complex "\
       [1:v]scale=320:240[stream]; \
       [0:v][stream]overlay=0:0:eof_action=pass[output] \
     "\
     -map '[output]' \
     -tune zerolatency \
     'rtmp://localhost/stream/output'

    What happens: it boots up and starts streaming my iPhone's output no problem. When I disconnect, it hangs for a long time, perhaps 20 seconds. Then it starts outputting red, okay. But then if I reconnect my phone, it doesn't resume. It's still red. Two questions:

    • Is there a way to configure the buffering so that it starts outputting red as soon as it stops getting data from the RTMP stream?
    • Is there a way to have it auto-retry, so that after the RTMP stream returns, it switches back?

    Full verbose output, if that’s helpful. I’m using the latest git version of ffmpeg as of 2016-03-18 on Ubuntu Wily. The RTMP server is nginx-rtmp.

  • texture rendering issue on iOS using OpenGL ES in Unity project

    28 March 2016, by Time1ess

    I'm working on a project, part of which is streaming video to my iPhone; currently I use my laptop to create the video stream to my iPhone with ffmpeg. The shell command for the stream is below:

    ffmpeg \
       -f avfoundation -i "1" -s 1280x720 -r 29.97 \
       -c:v mpeg2video -q:v 20 -pix_fmt yuv420p -g 1 -threads 4 \
       -f mpegts udp://192.168.1.102:6666

    with this, I successfully create my video stream.

    In Unity, I want to decode the video stream to create a texture. After going through some ffmpeg and Unity tutorials (since I'm new to both of them), I followed the tutorials and created my linked library. Some of the code is below (ask me if more is needed):

    In my library:

    buffer alloc:

    uint8_t *buffer;
    int buffer_size;
    buffer_size = avpicture_get_size(AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT);

    buffer = (uint8_t *) av_malloc(buffer_size*sizeof(uint8_t));

    avpicture_fill((AVPicture *) pFrameRGB, buffer, AV_PIX_FMT_RGBA,
                  VIEW_WIDTH, VIEW_HEIGHT);
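
    (Not part of the original post: in ffmpeg 3.0 the avpicture_* helpers are deprecated, so an equivalent allocation using the libavutil image utilities would look roughly like this, assuming the same pFrameRGB and VIEW_WIDTH/VIEW_HEIGHT.)

    #include <libavutil/imgutils.h>

    /* Same RGBA buffer allocation with the non-deprecated helpers
       (1-byte alignment, matching the avpicture_* behaviour above). */
    int buffer_size = av_image_get_buffer_size(AV_PIX_FMT_RGBA,
                                               VIEW_WIDTH, VIEW_HEIGHT, 1);
    uint8_t *buffer = (uint8_t *) av_malloc(buffer_size);

    av_image_fill_arrays(pFrameRGB->data, pFrameRGB->linesize, buffer,
                         AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT, 1);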

    getContext:

       is->sws_ctx = sws_getContext
       (
        is->video_st->codec->width,
        is->video_st->codec->height,
        is->video_st->codec->pix_fmt,
        VIEW_WIDTH,
        VIEW_HEIGHT,
        AV_PIX_FMT_RGBA,
        SWS_BILINEAR,
        NULL,
        NULL,
        NULL
        );

    sws_scale:

    sws_scale(
             is->sws_ctx,
             (uint8_t const * const *)pFrame->data,
             pFrame->linesize,
             0,
             is->video_st->codec->height,
             pFrameRGB->data,
             pFrameRGB->linesize
             );
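
    The read/decode loop itself isn't shown in the question; a minimal sketch of what presumably feeds this sws_scale call, using the ffmpeg 3.0-era API (pFormatCtx and video_stream are hypothetical field names, not from the post):

    AVPacket packet;
    int frameFinished = 0;

    /* Read packets, decode video frames, and convert them to RGBA. */
    while (av_read_frame(is->pFormatCtx, &packet) >= 0) {
        if (packet.stream_index == is->video_stream) {
            avcodec_decode_video2(is->video_st->codec, pFrame,
                                  &frameFinished, &packet);
            if (frameFinished) {
                sws_scale(is->sws_ctx,
                          (uint8_t const * const *)pFrame->data,
                          pFrame->linesize, 0,
                          is->video_st->codec->height,
                          pFrameRGB->data, pFrameRGB->linesize);
            }
        }
        av_packet_unref(&packet);
    }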

    texture render:

    static void UNITY_INTERFACE_API OnRenderEvent(int texID)
    {
       GLuint gltex = (GLuint)(size_t)(texID);

       glBindTexture(GL_TEXTURE_2D, gltex);

       glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
                       GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);

       glGetError();
       return;
    }

    extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc()
    {
       return OnRenderEvent;
    }
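
    One thing worth checking (my suggestion, not part of the original code): the glGetError() result is discarded, so a failing glTexSubImage2D would be silent. A variant that logs the error code, assuming <stdio.h> is available:

    static void UNITY_INTERFACE_API OnRenderEvent(int texID)
    {
       GLuint gltex = (GLuint)(size_t)(texID);

       glBindTexture(GL_TEXTURE_2D, gltex);
       glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
                       GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);

       /* GL_INVALID_OPERATION or GL_INVALID_VALUE here would point at a
          size/format mismatch with the RGBA32 texture created on the
          Unity side. */
       GLenum err = glGetError();
       if (err != GL_NO_ERROR)
           printf("glTexSubImage2D failed: 0x%x\n", err);
    }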

    In Unity:

    texture created:

       private Texture2D texture;
       private int texID;
       texture = new Texture2D (width, height, TextureFormat.RGBA32, false);
       texture.filterMode = FilterMode.Point;
       texture.Apply ();
       GetComponent<Renderer> ().material.mainTexture = texture;
       texID = texture.GetNativeTexturePtr ().ToInt32();

    update func:

       void Update ()
       {
           GL.IssuePluginEvent(GetRenderEventFunc(), texID);
       }

    Video stream info:

    Input #0, mpegts, from 'udp://0.0.0.0:6666':
     Duration: N/A, start: 2.534467, bitrate: N/A
     Program 1
       Metadata:
         service_name    : Service01
         service_provider: FFmpeg
       Stream #0:0[0x100]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc

    Leaving other details aside: my library works fine in the Unity simulator, but when I compiled all my libraries for arm64 and used the Xcode project that Unity created to build and run my app, I couldn't get any texture rendered on my iPhone. I checked my network and I'm sure that data had been sent to my iPhone, and the debug log showed me that frames were being decoded successfully and that the OnRenderEvent function was being called.

    I'm confused and have tried to find an answer on Stack Overflow; maybe because I'm a beginner I can't find one, so I'm asking you guys for help, please.

    FYI:

    Unity 5.3.2f1 Personal

    Xcode 7.2.1

    iOS 9.2.1

    ffmpeg 3.0