Advanced search

Media (17)

Word: - Tags -/wired

Other articles (93)

  • Managing creation and editing rights for objects

    8 February 2011

    By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, notably: writing content on the site, adjustable in the form-template management; adding notes to articles; adding captions and annotations to images;

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (HTML, CSS), LaTeX, Google Earth and (...)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

On other sites (11305)

  • ffplay does not play an RTMP stream on a VM with Ubuntu

    7 November 2020, by PiotrKulesza

    I am trying to run my RTMP stream on a VM with Ubuntu installed. The stream is started on the host computer from OBS.

    


    OBS stream settings:

    


    Server: rtmp://192.168.56.102:1935/show
Stream key: stream


    


    OBS sends the stream to the nginx server on the VM running Ubuntu.

    


    RTMP configuration in nginx.conf

    


    rtmp {
    server {
        listen 1935; # Listen on standard RTMP port
        chunk_size 4000;

        application show {
            live on;
            # Turn on HLS
            hls on;
            hls_path /mnt/hls/;
            hls_fragment 3;
            hls_playlist_length 60;
            # disable consuming the stream from nginx as rtmp
            deny play all;
        }
    }
}


    


    When I start the stream, it connects; the connection shows up in netstat.

    


    Output from netstat:

    


    Active Internet connections (w/o servers)
Proto Recv-Q Send-Q Local Address           Foreign Address         State      
tcp        0      0 webapp-VirtualBox:1935  192.168.56.1:56924      ESTABLISHED


    


    But when I try to play the stream with ffplay, it doesn't work. I get the following error:

    


    ffplay version 4.2.4-1ubuntu0.1 Copyright (c) 2003-2020 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.3.0-10ubuntu2)
  configuration: --prefix=/usr --extra-version=1ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 31.100 / 56. 31.100
  libavcodec     58. 54.100 / 58. 54.100
  libavformat    58. 29.100 / 58. 29.100
  libavdevice    58.  8.100 / 58.  8.100
  libavfilter     7. 57.100 /  7. 57.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  5.100 /  5.  5.100
  libswresample   3.  5.100 /  3.  5.100
  libpostproc    55.  5.100 / 55.  5.100
rtmp://192.168.56.102:1935/show/stream: Broken pipeq=    0B f=0/0 


    


    I have tried the following commands to play the stream, but each one gives the same error:

    


    ffplay -i rtmp://192.168.56.102:1935/show/stream
ffplay -i rtmp://webapp-VirtualBox:1935/show/stream
ffplay -i rtmp://localhost:1935/show/stream


    


    I also tried VLC, but it didn't work either.
Can anyone tell me what I am doing wrong, or what I have missed, to play this stream?
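
    One detail worth noting when reading the configuration above: as far as I can tell from the stock nginx-rtmp access directives, "deny play all;" refuses every RTMP play request, so only the HLS files written under /mnt/hls/ would be consumable. Below is a small Python diagnostic sketch to run on the VM; it assumes ffprobe is installed and that the playlist follows nginx-rtmp's default naming of <stream key>.m3u8. Both are assumptions, not details taken from the question.

import os
import subprocess

# Mirrors the question's setup; the playlist path is an assumption based on
# nginx-rtmp writing "<stream key>.m3u8" under hls_path by default.
RTMP_URL = "rtmp://192.168.56.102:1935/show/stream"
HLS_PLAYLIST = "/mnt/hls/stream.m3u8"

def probe(url, timeout=15):
    """Run ffprobe on a URL and report whether it could open any streams."""
    try:
        result = subprocess.run(
            ["ffprobe", "-v", "error", "-show_streams", url],
            capture_output=True, text=True, timeout=timeout,
        )
    except subprocess.TimeoutExpired:
        return False, "timed out"
    ok = result.returncode == 0
    return ok, (result.stdout if ok else result.stderr).strip()

if __name__ == "__main__":
    ok, detail = probe(RTMP_URL)
    # Expected to fail as long as "deny play all;" is in the application block.
    print("RTMP playback:", "OK" if ok else "failed: " + (detail or "no detail"))

    # The playlist should appear a few seconds after OBS starts publishing.
    if os.path.exists(HLS_PLAYLIST):
        print("HLS playlist present:", HLS_PLAYLIST)
    else:
        print("HLS playlist missing:", HLS_PLAYLIST)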

    


  • Decoding an FFmpeg buffer video stream sent over a WebSocket for processing with OpenCV?

    4 October 2019, by Alexis Meneses

    I am having a problem trying to get a frame from a stream that I am sending over a WebSocket.
    I am sending the data from a webcam with ffmpeg using this command:

    ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 -f mpeg1video -b:v 800k -r 30 http://localhost:8092

    Then I receive that stream and rebroadcast it from a Node.js server with the following code:

    var childProcess = require('child_process')
     , express = require('express')
     , http = require('http')
     , morgan = require('morgan')
     , ws = require('ws');

    // configuration files
    var configServer = require('./lib/config/server');

    // app parameters
    var app = express();
    app.set('port', configServer.httpPort);
    app.use(express.static(configServer.staticFolder));
    app.use(morgan('dev'));

    // serve index
    require('./lib/routes').serveIndex(app, configServer.staticFolder);

    // HTTP server
    http.createServer(app).listen(app.get('port'), function () {
     console.log('HTTP server listening on port ' + app.get('port'));
    });


    var STREAM_MAGIC_BYTES = 'jsmp'; // Must be 4 bytes
    var width = 320;
    var height = 240;

    // WebSocket server
    var wsServer = new (ws.Server)({ port: configServer.wsPort });
    console.log('WebSocket server listening on port ' + configServer.wsPort);

    wsServer.on('connection', function(socket) {

     var streamHeader = new Buffer(8);

     streamHeader.write(STREAM_MAGIC_BYTES);
     streamHeader.writeUInt16BE(width, 4);
     streamHeader.writeUInt16BE(height, 6);
     socket.send(streamHeader, { binary: true });
     console.log(streamHeader);

     console.log('New WebSocket Connection (' + wsServer.clients.length + ' total)');

     socket.on('close', function(code, message){
       console.log('Disconnected WebSocket (' + wsServer.clients.length + ' total)');
     });
    });

    wsServer.broadcast = function(data, opts) {
     for(var i in this.clients) {
       if(this.clients[i].readyState == 1) {
         this.clients[i].send(data, opts);
       }
       else {
         console.log('Error: Client (' + i + ') not connected.');
       }
     }


    };

    // HTTP server to accept incoming MPEG1 stream
    http.createServer(function (req, res) {
     console.log(
       'Stream Connected: ' + req.socket.remoteAddress +
       ':' + req.socket.remotePort + ' size: ' + width + 'x' + height
     );

     req.on('data', function (data) {
       wsServer.broadcast(data, { binary: true });
     });
    }).listen(configServer.streamPort, function () {
     console.log('Listening for video stream on port ' + configServer.streamPort);


    });

    module.exports.app = app;

    I am successfully receiving the data sent by this.clients[i].send(data, opts) in my Python program, but I don't know how to decode it so that I can process the image with OpenCV. Any idea?

    What I want to do is:

    import asyncio
    import websockets
    import cv2

    async def hello():
       uri = "ws://192.168.1.170:8094" #URL of the websocket server
       async with websockets.connect(uri) as websocket:
               inf = await websocket.recv()
                # Process the data in order to show the image with OpenCV.

               print(inf)


    asyncio.get_event_loop().run_until_complete(hello())
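
    For completeness, here is one possible way to decode those WebSocket messages, sketched under the following assumptions: the payload is the raw MPEG-1 video elementary stream produced by the ffmpeg command above, the first binary message is the 8-byte "jsmp" header sent by the Node server, and PyAV (pip install av) is available for the decoding. It is an untested sketch, not a drop-in solution.

import asyncio

import av          # PyAV, used here to decode the mpeg1video bitstream
import cv2
import websockets

URI = "ws://192.168.1.170:8094"  # WebSocket server address from the question

async def show_stream():
    # Incremental decoder for the raw MPEG-1 video bytes coming off the socket
    codec = av.CodecContext.create("mpeg1video", "r")

    async with websockets.connect(URI) as websocket:
        while True:
            data = await websocket.recv()
            if isinstance(data, str) or data[:4] == b"jsmp":
                continue  # skip text frames and the initial stream header
            # parse() splits the byte stream into packets; decode() yields frames
            for packet in codec.parse(data):
                for frame in codec.decode(packet):
                    img = frame.to_ndarray(format="bgr24")
                    cv2.imshow("stream", img)
                    if cv2.waitKey(1) & 0xFF == ord("q"):
                        return

asyncio.run(show_stream())

    An alternative, if PyAV is not an option, is to pipe the received bytes into an ffmpeg subprocess that outputs rawvideo and rebuild each frame from fixed-size reads of width x height x 3 bytes, which is roughly what jsmpeg-style players do.
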
  • Not getting partial video content while using ffmpeg

    1 May 2023, by MI Sabic

    I'm trying to send a video partially (in response to HTTP range requests) using Node.js and fluent-ffmpeg, but it fails to send the data.

    


    When I send the video using only the fs module, it works fine. Here's the code:

    


    const express = require("express");
const app = express();  
const fs = require("fs");

const VIDEO_PATH = 'video.mp4';

app.get("/", (req, res) => {
  res.sendFile(__dirname + "/index.html");
})

app.get("/video", (req, res) => {
  const range = req.headers.range;
  if(!range) {
    res.status(400).send("Requires range header!");
  }

  const size = fs.statSync(VIDEO_PATH).size;
  const CHUNK_SIZE = 10**6;
  const start = Number(range.replace(/\D/g, ""));
  const end = Math.min(start + CHUNK_SIZE, size - 1);

  const contentLength = end - start + 1;

  const headers = {
    "Content-Range": `bytes ${start}-${end}/${size}`,
    "Accept-Ranges": 'bytes',
    "Content-Length": contentLength, 
    "Content-Type": "video/mp4"
  }

  res.writeHead(206, headers);

  const videoStream = fs.createReadStream(VIDEO_PATH, {start, end});
  videoStream.pipe(res);
})

app.listen(3000, () => {
  console.log("Server is running on port: ", 3000);
})


    


    When I send the video after processing it with the fluent-ffmpeg module, it doesn't work. I've simplified the code to make it easier to follow. Here's my code:

    


    const express = require("express");
const app = express();  
const fs = require("fs");
const ffmpegStatic = require('ffmpeg-static');
const ffmpeg = require('fluent-ffmpeg');

ffmpeg.setFfmpegPath(ffmpegStatic);

const VIDEO_PATH = 'video.mp4';

app.get("/", (req, res) => {
  res.sendFile(__dirname + "/index.html");
})

app.get("/video", (req, res) => {
  const range = req.headers.range;
  if(!range) {
    res.status(400).send("Requires range header!");
  }

  const size = fs.statSync(VIDEO_PATH).size;
  const CHUNK_SIZE = 10**6;
  const start = Number(range.replace(/\D/g, ""));
  const end = Math.min(start + CHUNK_SIZE, size - 1);

  const contentLength = end - start + 1;

  const headers = {
    "Content-Range": `bytes ${start}-${end}/${size}`,
    "Accept-Ranges": 'bytes',
    "Content-Length": contentLength, 
    "Content-Type": "video/mp4"
  }

  res.writeHead(206, headers);

  const videoStream = fs.createReadStream(VIDEO_PATH, {start, end});

  ffmpeg(videoStream)
    .outputOptions('-movflags frag_keyframe+empty_moov')
    .toFormat('mp4')
    .pipe(res);
})

app.listen(3000, () => {
  console.log("Server is running on port: ", 3000);
})


    


    My index.html

    


<video width="50%" controls="controls">
  <source src="/video" type="video/mp4">
</video>


    Any help would be appreciated. Thanks in advance.
