
Other articles (66)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If necessary, contact your MediaSPIP administrator to find out.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash fallback is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both on conventional computers (...)

On other sites (7519)

  • Programmatically creating an MPD stream using Python

    3 February 2021, by mifol68042

    I have a live RTSP feed from an IP camera that is captured using OpenCV in my Python code. I capture every frame, run some object detection on it, and then need to show the result in my Angular front end.

    Initially I thought of creating an RTMP stream in my code and using that in the Angular app, but then realised that RTMP support in browsers is end-of-life. The alternative I am now planning is to create a stream using MPEG-DASH. While researching this, I found this link on creating an mpd manifest file, but the examples there make no mention of the frames or the video file. On researching further, I realised we could do something like this:

    ffmpeg -i $INPUT.mp4 \
      -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 \
      -b:v:0 250k -filter:v:0 "scale=-2:240" -profile:v:0 baseline \
      -b:v:1 750k -filter:v:1 "scale=-2:480" -profile:v:1 main \
      -use_timeline 1 -use_template 1 -window_size 5 \
      -adaptation_sets "id=0,streams=v id=1,streams=a" -f dash $OUTPUT.mpd

    This is from the command line, but how can I achieve this programmatically? Also, how do I keep updating the mp4 to get a live stream?

    In short, I want to understand how to create a live feed from individual frames via MPEG-DASH.
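
    One possible way to do this programmatically is to keep ffmpeg as the DASH packager but feed it raw frames over a pipe instead of a finished mp4. Below is a minimal sketch of that idea, assuming ffmpeg is on the PATH and the frames come from OpenCV; the RTSP URL, resolution, frame rate, encoder settings and the stream.mpd output name are illustrative assumptions, not values from the question:

    import subprocess
    import cv2

    RTSP_URL = "rtsp://example.com/stream"   # placeholder camera URL (assumption)
    WIDTH, HEIGHT, FPS = 1280, 720, 25       # must match the frames written below

    # ffmpeg reads raw BGR frames on stdin and keeps writing a live DASH
    # manifest (stream.mpd) plus segment files into the working directory.
    ffmpeg_cmd = [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "bgr24",
        "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS),
        "-i", "pipe:0",                      # frames arrive from this script
        "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
        "-f", "dash",
        "-use_timeline", "1", "-use_template", "1",
        "-window_size", "5", "-remove_at_exit", "1",
        "stream.mpd",
    ]
    packager = subprocess.Popen(ffmpeg_cmd, stdin=subprocess.PIPE)

    cap = cv2.VideoCapture(RTSP_URL)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.resize(frame, (WIDTH, HEIGHT))
        # ... run object detection on `frame` here ...
        packager.stdin.write(frame.tobytes())  # push the (annotated) frame to ffmpeg

    cap.release()
    packager.stdin.close()
    packager.wait()

    The stream.mpd file and the segments it references then only need to be served over plain HTTP so that a DASH player (for example dash.js) in the Angular front end can fetch them.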

  • Optimizing Adaptive Streaming with FFmpeg

    25 October 2018, by Ramesh Navi

    I am working on a video-on-demand website using Laravel 5.7, FFmpeg, and the DASH player from dashif.org. I have a few questions.

    Extracting audio like this:

    ffmpeg -i original.mp4 -vn -acodec libvorbis -ab 128k -dash 1 my_audio.webm

    Converting video like this:

    ffmpeg -i original.mp4 -c:v libvpx-vp9 -keyint_min 150 \
    -g 150 -tile-columns 4 -frame-parallel 1  -f webm -dash 1 \
    -an -vf scale=160:90 -b:v 250k -dash 1 video_160x90_250k.webm \
    -an -vf scale=320:180 -b:v 500k -dash 1 video_320x180_500k.webm \
    -an -vf scale=640:360 -b:v 750k -dash 1 video_640x360_750k.webm \
    -an -vf scale=1280:720 -b:v 1500k -dash 1 video_1280x720_1500k.webm

    Creating the manifest like this:

    ffmpeg \
    -f webm_dash_manifest -i video_160x90_250k.webm \
    -f webm_dash_manifest -i video_320x180_500k.webm \
    -f webm_dash_manifest -i video_640x360_750k.webm \
    -f webm_dash_manifest -i video_1280x720_1500k.webm \
    -f webm_dash_manifest -i my_audio.webm \
    -c copy \
    -map 0 -map 1 -map 2 -map 3 -map 4 \
    -f webm_dash_manifest \
    -adaptation_sets "id=0,streams=0,1,2,3 id=1,streams=4" \
    my_video_manifest.mpd

    Now the problems:

    1. Video conversion takes a lot of time on a recent i5 / 8 GB ThinkPad
       running Ubuntu 18. A 4-minute mp4 took more than 30 minutes, and a
       10-minute 720p mp4 took forever; I had to kill the process. Is that
       normal? Any ideas for optimizing it?
    2. I need to find out the original video's dimensions so that I can avoid
       scaling beyond the original. ffprobe works fine on the command line but
       prints far more information than required; is there a simpler way?
       (See the sketch at the end of this post.)
    3. The converted webm files are bigger than the original mp4: the original
       720p mp4 was 33 MB, but the 640p webm is 76 MB. Is that normal, or is
       something wrong?

    Any suggestions to optimize the process are welcome.
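
    On question 2, here is a small sketch of one way to get only the dimensions, assuming ffprobe (installed alongside FFmpeg) is on the PATH; the helper name video_dimensions is illustrative:

    import json
    import subprocess

    def video_dimensions(path):
        """Return (width, height) of the first video stream, via ffprobe."""
        out = subprocess.check_output([
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",                # first video stream only
            "-show_entries", "stream=width,height",  # just these two fields
            "-of", "json",
            path,
        ])
        stream = json.loads(out)["streams"][0]
        return stream["width"], stream["height"]

    width, height = video_dimensions("original.mp4")
    print(width, height)   # e.g. skip any rendition larger than the source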

  • How to live stream video using a Node API (read file with chunk logic)

    28 September 2023, by Mukesh Singh Thakur

    I want to build a live video-streaming API and send the video buffer chunks
    to an HTML page. I am using an RTSP URL. The chunk logic does not work: the
    video only plays for 5 seconds and then stops.

    index.js file

    const express = require('express');
const ffmpeg = require('fluent-ffmpeg');
const fs = require('fs');
const path = require('path');

const app = express();
const port = 3000;

app.get('/', (req, res) => {
  res.sendFile(__dirname + "/index.html");
});

const rtspUrl = 'rtsp://zephyr.rtsp.stream/movie?streamKey=64fd08123635440e7adc17ba31de2036';
const chunkDuration = 5; // Duration of each chunk in seconds


app.get('/video', (req, res) => {
  const outputDirectory = path.join(__dirname, 'chunks');
  if (!fs.existsSync(outputDirectory)) {
    fs.mkdirSync(outputDirectory);
  }

  const startTime = new Date().getTime();
  const outputFileName = `chunk_${startTime}.mp4`;
  const outputFilePath = path.join(outputDirectory, outputFileName);

  const command = ffmpeg(rtspUrl)
    .inputFormat('rtsp')
    // .inputOptions(['-rtsp_transport tcp'])
    .videoCodec('copy')
    .output(outputFilePath)
    .duration(chunkDuration)
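    // duration() adds "-t 5", so this run records a single 5-second file;
    // the 'end' handler below then sends that one finished file as the
    // entire HTTP response.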
    .on('start', () => {
      console.log(`start ${outputFileName}`);
    })
    .on('end', () => {
      console.log(`Chunk ${outputFileName} saved`);
      res.setHeader('Content-Type', 'video/mp4');
      res.sendFile(outputFilePath, (err) => {
        if (err) {
          console.error('Error sending file:', err);
        } else {
          fs.unlinkSync(outputFilePath); // Delete the chunk after it's sent
        }
      });
    })
    .on('error', (error) => {
      console.error('Error: ', error);
    });

  command.run();
});

app.listen(port, () => {
  console.log(`API server is running on port ${port}`);
});

    index.html

    <video width="50%" controls="controls" autoplay="autoplay">
      <source src="/video" type="video/mp4" />
      Your browser does not support the video tag.
    </video>


    package.json


    {
    .....
      "scripts": {
        "test": "echo \"Error: no test specified\" && exit 1",
        "start": "nodemon index.js"
      },
    .....
    }
