Advanced search

Media (0)

No media matching your criteria is available on this site.

Other articles (104)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP means:
    Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography, and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use within the Semantic Web.
    XMP makes it possible to record, in the form of an XML document, information about a file: title, author, history (...)
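
    For illustration, a minimal sketch of what such an XMP packet can look like; the values are invented, and the namespaces are the standard RDF and Dublin Core ones:

    <x:xmpmeta xmlns:x="adobe:ns:meta/">
      <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
        <rdf:Description rdf:about=""
            xmlns:dc="http://purl.org/dc/elements/1.1/">
          <!-- title and author of the described file; example values only -->
          <dc:title>
            <rdf:Alt><rdf:li xml:lang="x-default">An example title</rdf:li></rdf:Alt>
          </dc:title>
          <dc:creator>
            <rdf:Seq><rdf:li>An example author</rdf:li></rdf:Seq>
          </dc:creator>
        </rdf:Description>
      </rdf:RDF>
    </x:xmpmeta>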

  • Use, discuss, criticize

    13 April 2011, by

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

  • Installation in farm mode

    4 February 2011, by

    Farm mode makes it possible to host several MediaSPIP sites while installing its functional core only once.
    It is the method we use on this very platform.
    Using farm mode requires some familiarity with SPIP's inner workings, unlike the standalone version, which requires no real specific knowledge since SPIP's usual private area is no longer used.
    First of all, you must have installed the same files as the installation (...)

On other sites (7778)

  • ffmpeg build from source fails in docker container?

    24 March 2016, by John Allard

    I’m trying to make some changes to the ffmpeg source code (yes, I’m a masochist), and to start I booted an Arch Linux docker container, installed the requirements, downloaded ffmpeg source code, and tried to compile, but I’m getting some extremely odd errors.

    Compile command:

    ./configure --bindir=~/ffmpeg_build --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libx264 --enable-nonfree

    Output:

    ./configure:unset:3338: no such hash table element: mktemp
    pr: /tmp/ffconf.uQI7CeV.c: No such file or directory.
    pr: /tmp/ffconf.uQI7CeV.c: No such file or directory.
    pr: /tmp/ffconf.uQI7CeV.c: No such file or directory.
    pr: /tmp/ffconf.uQI7CeV.c: No such file or directory.
    ... (24 times)
    pr: /tmp/ffconf.uQI7CeV.c: No such file or directory.
    pr: /tmp/ffconf.r547UgWy.m: No such file or directory.
    ./configure:53378: parse error near '}'
    ==> ERROR: A failure occurred in build()
       Aborting...
    The build failed.

    This doesn't seem to be a problem with ffmpeg so much as with the container. If I check find /tmp -name ffconf\* I see that the files do exist, and they contain

    extern int getrusage();
    int main(void){ getrusage(); }

    What in the hell is going on? Hash tables? mktemp not working? Files not being found?

    Edit:

    Here is what is on those line numbers in the file that gives the errors:

    3338: configure - unset -f mktemp
    5338: configure - check_builtin gmtime_r time.h "time_t *time; struct tm *tm; gmtime_r(time, tm)"
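
    One hypothesis, not confirmed in the post: "no such hash table element" is zsh's wording for a failed unset (bash does not error there at all), and "parse error near '}'" is also zsh phrasing, which suggests configure's #!/bin/sh line resolves to zsh inside this minimal container. A hedged way to check, and to work around it by forcing bash:

    # Which shell actually backs /bin/sh in this container?
    readlink -f /bin/sh

    # If it is not bash, install bash and run configure under it explicitly,
    # with the same flags as above
    pacman -S --noconfirm bash
    bash ./configure --bindir=~/ffmpeg_build --enable-gpl ...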

    Edit 2: Here's the Dockerfile (cloned from here https://hub.docker.com/r/greyltc/archlinux/ /dockerfile/)

    # Arch Linux baseline docker container
    # Generated on Sat Mar 19 14:26:28 GMT 2016 using code in this GitHub repo:
    # https://github.com/greyltc/docker-archlinux
    FROM scratch
    MAINTAINER Grey Christoforo <grey@christoforo.net>

    # copy in super minimal root filesystem archive
    ADD archlinux.tar.xz /

    # perform initial container setup tasks
    RUN setup-arch-docker-container

    # this allows the system profile to be sourced at every shell
    ENV ENV /etc/profile

  • FFmpeg cannot open video file after adding the GLSurfaceView to render frames

    4 April 2016, by Kyle Lo

    The source code works perfectly without any modification.

    I successfully use the below function to play the specified video.

    playview.openVideoFile("/sdcard/Test/mv.mp4");

    And for research purposes I need to display the frames using OpenGL ES, so I removed the original method below.

    ANativeWindow* window = ANativeWindow_fromSurface(env, javaSurface);

    ANativeWindow_Buffer buffer;
    if (ANativeWindow_lock(window, &buffer, NULL) == 0) {
     memcpy(buffer.bits, pixels,  w * h * 2);
     ANativeWindow_unlockAndPost(window);
    }

    ANativeWindow_release(window);

    And I added a FrameRenderer class to my project:

    public class FrameRenderer implements GLSurfaceView.Renderer {

       public long time = 0;
       public short framerate = 0;
       public long fpsTime = 0;
       public long frameTime = 0;
       public float avgFPS = 0;
       private PlayNative mNative = null;

       @Override
       public void onSurfaceCreated(GL10 gl, EGLConfig config) {/*do nothing*/}

       @Override
       public void onSurfaceChanged(GL10 gl, int width, int height) {

       }

       @Override
       public void onDrawFrame(GL10 gl) {
           mNative.render();
       }
    }

    On the native side I created a corresponding method in VideoPlay.cpp, and I only use glClearColor to test whether the OpenGL function works or not.

    void VideoPlay::render() {
       glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
       glClear(GL_COLOR_BUFFER_BIT);
    }

    And my onCreate is as below:

    protected void onCreate(Bundle savedInstanceState) {
           // TODO Auto-generated method stub
           super.onCreate(savedInstanceState);
           setContentView(R.layout.main_layout);

           playview = new PlayView(this);

           playview.openVideoFile("/sdcard/test_tt_racing.mp4");
           //playview.openVideoFile("/sdcard/big_buck_bunny.mp4");

           GLSurfaceView surface = (GLSurfaceView)findViewById(R.id.surfaceviewclass);
           surface.setRenderer(new FrameRenderer());
           ...

    Then, testing it on the phone, the screen becomes red, which means the GLSurfaceView and OpenGL work fine.

    But after I press the play button, the whole app gets stuck, and this shows up in the log.

    My question is: why can't I open the video, whose path is exactly the same as the previous one, right after I added the GLSurfaceView renderer, and how can I fix it?
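
    One thing worth checking in the snippets above (an observation, since the full project is not shown): mNative is never assigned in FrameRenderer, so onDrawFrame would throw a NullPointerException on the GL thread as soon as rendering starts, and GLSurfaceView renders continuously by default, which can compete with a decoder running on another thread. A hedged sketch of the wiring, where setNative and getNative are hypothetical accessors, not methods from the question:

    FrameRenderer renderer = new FrameRenderer();
    // Hypothetical: hand the renderer a real PlayNative instead of leaving it null
    renderer.setNative(playview.getNative());

    GLSurfaceView surface = (GLSurfaceView) findViewById(R.id.surfaceviewclass);
    surface.setRenderer(renderer);
    // Draw only when the decoder reports a new frame instead of spinning the
    // GL thread continuously (setRenderMode must be called after setRenderer)
    surface.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    // ...then call surface.requestRender() from the frame-ready callback.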

  • How to have multiple websocket RTSP streams?

    6 October 2020, by kinx

    After spending some time reading various open-source projects on how to develop RTSP and WebSocket streams, I've almost built a simple project that allows me to display multiple streams on the page.

    I have a working example of just one stream with the code below. A single URL in an array is sent to the client via WebSocket, and JSMpeg displays it with some success.

    However, I'm not sure how to build this so that I have multiple sockets, with an RTSP stream in each one, and how to give each socket URL its own id. The idea is to encrypt the URL and, when the client requests the list of streams, send that back as a socket id, and have JSMpeg request that data.

    Server:

// Imports the snippet relies on but does not show: Node's EventEmitter,
// the ws package (WebSocket.Server below), and child_process for ffmpeg
const { EventEmitter } = require("events");
const WebSocket = require("ws");
const child_process = require("child_process");

class Stream extends EventEmitter {
  constructor() {
    super();
    this.urls = ["rtsp://someIPAddress:554/1"];
    this.urls.map((url) => {
      this.start(url);
    });
  }
  start(url) {
    this.startStream(url);
  }
  setOptions(url) {
    const options = {
      "-rtsp_transport": "tcp",
      "-i": url,
      "-f": "mpegts",
      "-codec:v": "mpeg1video",
      "-codec:a": "mp2",
      "-stats": "",
      "-b:v": "1500k",
      "-ar": "44100",
      "-r": 30,
    };
    let params = [];
    for (let key in options) {
      params.push(key);
      if (String(options[key]) !== "") {
        params.push(String(options[key]));
      }
    }
    params.push("-");
    return params;
  }
  startStream(url) {
    const wss = new WebSocket.Server({ port: 8080 });
    this.child = child_process.spawn("ffmpeg", this.setOptions(url));
    this.child.stdout.on("data", (data) => {
      wss.clients.forEach((client) => {
        client.send(data);
      });
      return this.emit("data", data);
    });
  }
}

const s = new Stream();
s.on("data", (data) => {
  console.log(data);
});


    In the constructor there's an array of URLs; while I only have one here, I'd like to add multiple. I create a websocket and send that back. What I'd like to do is encrypt that URL with Crypto.createHash('md5').update(url).digest('hex') to give it its own id, create a websocket based on that id, send the data to that websocket, and send that along with a list of the other ids to the client.
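
    A minimal sketch of one way to do this (my reading of the intended design, not code from the question; streamsById and the single port-8080 server are assumptions): one HTTP server handles all WebSocket upgrades, and the md5 of each RTSP URL becomes the path that routes a client to the right stream.

const crypto = require("crypto");
const http = require("http");
const WebSocket = require("ws");

const urls = ["rtsp://someIPAddress:554/1" /* ...more URLs */];

// One WebSocket.Server per stream, keyed by md5(url)
const streamsById = new Map();
for (const url of urls) {
  const id = crypto.createHash("md5").update(url).digest("hex");
  const wss = new WebSocket.Server({ noServer: true });
  streamsById.set(id, wss);
  // ...spawn ffmpeg for `url` here and broadcast its stdout to
  //    wss.clients, as in the Stream class above
}

const server = http.createServer((req, res) => {
  if (req.url === "/api/streams") {
    // The client only ever sees the ids, never the raw RTSP URLs
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify([...streamsById.keys()]));
  } else {
    res.statusCode = 404;
    res.end();
  }
});

// Route each upgrade request to the stream whose id is in the path
server.on("upgrade", (req, socket, head) => {
  const id = req.url.slice(1); // "/<md5>" -> "<md5>"
  const wss = streamsById.get(id);
  if (!wss) return socket.destroy();
  wss.handleUpgrade(req, socket, head, (ws) => {
    wss.emit("connection", ws, req);
  });
});

server.listen(8080);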

    Client:

    <canvas id="video" style="width: 100%"></canvas>
    <script type="text/javascript">
      var player = new JSMpeg.Player("ws://localhost:8080", {
        loop: true,
        autoplay: true,
        canvas: document.getElementById("video"),
      });
    </script>

    What I'd like to do here is request /api/streams, get back an array of stream/socket ids, and request each stream from that array.

    But how do I open up multiple sockets with multiple URLs?
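
    A corresponding client-side sketch under the same assumptions (the /api/streams endpoint and per-id socket paths come from the hypothetical server above; one canvas is created per stream):

    // Fetch the list of stream ids, then attach one JSMpeg player per id
    fetch("/api/streams")
      .then((res) => res.json())
      .then((ids) => {
        ids.forEach((id) => {
          var canvas = document.createElement("canvas");
          canvas.style.width = "100%";
          document.body.appendChild(canvas);
          new JSMpeg.Player("ws://localhost:8080/" + id, {
            loop: true,
            autoplay: true,
            canvas: canvas,
          });
        });
      });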
