Advanced search

Media (2)

Word: - Tags - /media

Other articles (65)

  • General document management

    13 May 2011, by

    MédiaSPIP never modifies the original document uploaded to the site.
    For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while keeping the original available for download in case it cannot be read in a web browser; and retrieving the original document's metadata in order to describe the file textually.
    The tables below explain what MédiaSPIP can do (...)

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add yours using the form at the bottom of the page.

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)
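
    A minimal sketch of the fallback pattern described above (illustration only, not MediaSPIP's actual player code; the flashFallback helper is hypothetical):

    function initPlayer(container, mp4Url, flashFallback) {
      var video = document.createElement('video');
      // Use the native HTML5 <video> element when the browser can play MP4.
      if (video.canPlayType && video.canPlayType('video/mp4')) {
        video.src = mp4Url;
        video.controls = true;
        container.appendChild(video);
      } else {
        // Otherwise hand over to a Flash-based player such as Flowplayer
        // (flashFallback is a hypothetical embedding helper).
        flashFallback(container, mp4Url);
      }
    }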

On other sites (5729)

  • Fragmented mp4 file is not played by MSE

    19 May 2020, by Daniel

    I created a fragmented mp4 file with ffmpeg (from h264) and removed the first 6 moof and mdat pairs.

    So now it still has the correct order of boxes: ftyp, moov, moof, mdat, moof, mdat, ..., but the first moof box has a sequenceNumber of 7.
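
    For illustration (not part of the original question; the file name is hypothetical), a minimal Node.js sketch that walks the top-level boxes of such a file and prints each box type and size, plus the sequenceNumber read from the mfhd box inside each moof:

    const fs = require('fs');

    const buf = fs.readFileSync('fragmented.mp4'); // hypothetical file name
    let offset = 0;
    while (offset + 8 <= buf.length) {
      const size = buf.readUInt32BE(offset);                      // 32-bit box size
      const type = buf.toString('ascii', offset + 4, offset + 8); // 4-character box type
      let extra = '';
      if (type === 'moof') {
        // mfhd normally follows the moof header directly: 8-byte moof header,
        // 8-byte mfhd header, 4 bytes of version/flags, then sequence_number.
        extra = ', sequenceNumber=' + buf.readUInt32BE(offset + 20);
      }
      console.log(type + ' (' + size + ' bytes)' + extra);
      if (size < 8) break; // 0/64-bit box sizes are not handled in this sketch
      offset += size;
    }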

    VLC can play it fine; 'Movies & TV' can also play it, but the first few seconds are black.

    If I drag the file into the browser, it also plays fine.

    However, it is not displayed at all in the browser (Chrome) if I feed it via MSE.
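
    (The page's MSE code is not included in the question; for context, here is a minimal sketch of the kind of feeding logic assumed here. The codec string and the WebSocket endpoint are assumptions and must match the actual file and server.)

    const video = document.querySelector('video');
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);

    mediaSource.addEventListener('sourceopen', () => {
      // The codec string is an assumption (H.264 Main profile); it must match the file.
      const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.4d401f"');
      const queue = [];
      sb.addEventListener('updateend', () => {
        if (queue.length && !sb.updating) sb.appendBuffer(queue.shift());
      });

      const ws = new WebSocket('wss://localhost:8443/websocket'); // hypothetical endpoint
      ws.binaryType = 'arraybuffer';
      ws.onmessage = (e) => {
        // The first appended data must be the initialisation segment (ftyp+moov),
        // followed by complete moof+mdat fragments.
        if (sb.updating || queue.length) queue.push(e.data);
        else sb.appendBuffer(e.data);
      };
    });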

    No error messages are printed, and the media-internals log shows that the video player starts playing within the first second and only suspends at the 18th second:

    Timestamp   Property    Value
00:00:00.000    origin_url  "https://localhost:8443/"
00:00:00.000    kFrameUrl   "https://localhost:8443/websocket/videodemo.html"
00:00:00.000    kFrameTitle "WebSocket and MSE demo"
00:00:00.000    url "blob:https://localhost:8443/3b4d4b1a-7c08-4136-95fe-dabc14fba95f"
00:00:00.000    info    "ChunkDemuxer"
00:00:00.000    pipeline_state  "kStarting"
00:00:01.067    kVideoTracks    [{"alpha mode":"is_opaque","codec":"h264","coded size":"1600x900","color space":"{primaries:BT709, transfer:BT709, matrix:BT709, range:LIMITED}","encryption scheme":"Unencrypted","flipped":false,"has_extra_data":false,"natural size":"1600x900","profile":"h264 main","rotation":"0°","visible rect":"0,0 1600x900"}]
00:00:01.067    debug   "Video rendering in low delay mode."
00:00:01.070    info    "Using D3D11 device for DXVA"
00:00:01.075    kIsVideoDecryptingDemuxerStream false
00:00:01.075    kVideoDecoderName   "MojoVideoDecoder"
00:00:01.075    kIsPlatformVideoDecoder true
00:00:01.075    info    "Selected MojoVideoDecoder for video decoding, config: codec: h264, profile: h264 main, alpha_mode: is_opaque, coded size: [1600,900], visible rect: [0,0,1600,900], natural size: [1600,900], has extra data: false, encryption scheme: Unencrypted, rotation: 0°, flipped: 0, color space: {primaries:BT709, transfer:BT709, matrix:BT709, range:LIMITED}"
00:00:01.075    pipeline_state  "kPlaying"
00:00:01.067    duration    "unknown"
00:00:18.926    pipeline_state  "kSuspending"
00:00:18.926    pipeline_state  "kSuspended"
00:00:18.927    event   "SUSPENDED"

    Here is the video file for reference.

    What is the problem with this file? Why is it not displayed in the browser with MSE?

  • Opencv VideoCapture always returns false on Heroku

    27 June 2022, by Dacian Mujdar

    I'm using the following code to open a video stream:

    import cv2
video = cv2.VideoCapture()
video.open("some_m3u8_link")
success, image = video.read()

    However, even though the code works as intended locally, on Heroku success is always False.

    I'm using the cedar-14 stack with the following buildpacks:

    • heroku/python
    • https://github.com/jonathanong/heroku-buildpack-ffmpeg-latest.git

    (I tried several buildpack options for ffmpeg)

    Running ffmpeg --version on the Heroku instance returns ffmpeg version 4.0-static https://johnvansickle.com/ffmpeg/

    Is there any setting/configuration I missed in order to make it work on deployment? Thank you!

    Later edit: I tried several links for "some_m3u8_link", including from Twitch and other streaming services (including traffic streaming li
    An example for reproducing:

    python -c "import cv2; video=cv2.VideoCapture(); video.open('https://hddn01.skylinewebcams.com/live.m3u8?a=5tm6kfqrhqbpblan9j5d4bmua4'); success, image = video.read(); print(success)"

    Returns True on the local machine and False on Heroku.

    (the link is taken from here)

  • Taking in RTSP stream, converting it to fragmented mp4 using ffmpeg before broadcasting to connected clients through web socket

    24 March 2023, by Shaun

    I am taking in a live RTSP stream and converting it to fragmented mp4 using ffmpeg before redistributing it to connected clients via a websocket. The clients then play it in a normal web browser such as Chrome using Media Source Extensions. Most of the time the code works as intended, but there are occasions when the process of parsing the chunks emitted from the ffmpeg stdout runs into issues. When everything is working well, each chunk of data emitted from ffmpeg is parsed, sliced and regrouped into either an Initialisation Segment (made up of an FTYP box and a MOOV box) or a Media Segment (made up of a MOOF box and an MDAT box). The key steps involved are:

    1. Every byte of data coming out of ffmpeg's stdout will first be pushed into an array.
    2. Next, I will try to locate the box headers ('FTYP', 'MOOV', 'MOOF', 'MDAT') in the data.
    3. After the box headers are located, I will look for the length of each corresponding box (which resides in the first 4 bytes of each box) to locate the start and end of each box.
    4. With the box's length, I will then parse the array to filter out the data.
    5. Once the entire segment is filtered out, I will send it to all connected clients. 'FTYP' and 'MOOV' will only be sent once to each client; for 'MOOF' and 'MDAT', this will be a continuous process for as long as the client(s) are connected.

    However, when things are not working well, my code is for some unknown reason unable to slice the segments correctly. While debugging, I notice that the problem starts with a segment that has a few extra bytes sliced onto its end; it then quickly escalates and the next few segments go totally haywire.

    My suspicion is that this problem has less to do with my code than with the ffmpeg output. Has anyone faced a similar problem before? The following is my code. Any help on this will be much appreciated! Thank you.

    const WebSocket = require("ws");
const { spawn } = require('child_process');
const rtspUrl = "rtsp://localhost:8554/mystream";
const fs = require('fs');
const outputFile = fs.createWriteStream('ffmpeg_output.bin');
const wss = new WebSocket.Server({ port: 4001 });

var buf_chunks_string_holder_array = [];
var initialization_segment_ready_flag = false;
var initialization_segment_to_send = [];
var buffered_media_segment_ready_flag = false;
var buffered_media_segment_to_send = [];
var segment_end_index = { "box_type": "nil", "end_index": 0 };
var next_segment_counter = 0;
var processing_counter_queue = []
var moof_counter = 0;
var mdat_counter = 0;
var checker = '';

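// On a new connection, send the cached initialisation segment (and the most
// recently buffered media segment, if available) to the connected clients so
// playback can start without waiting for the next fragment.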
wss.on("connection", (ws) => {
    console.log("Client connected");
  if (initialization_segment_ready_flag == true) {
    wss.clients.forEach(client => {
      client.send(new Uint8Array(initialization_segment_to_send).buffer);
    })
  }
  if (buffered_media_segment_ready_flag == true) {
    wss.clients.forEach(client => {
      client.send(new Uint8Array(buffered_media_segment_to_send).buffer);
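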
    })
  }

  ws.on("close", () => {
    console.log("Client disconnected");
  });

  ws.addEventListener("error", (error) => {
    console.error("WebSocket error:", error);
  });

});

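// Spawn ffmpeg: read the RTSP stream over TCP and write fragmented MP4
// (frag_keyframe+empty_moov+default_base_moof) to stdout (pipe:1).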
const ffmpeg = spawn("ffmpeg", [
  "-rtsp_transport",
  "tcp",
  "-i",
  rtspUrl,
  "-g",
  "10",
  "-bufsize", 
  "50k",
  "-preset",
  "ultrafast",
  "-tune",
  "zerolatency",
  "-c:v",
  "libx264",
  "-c:a",
  "aac",
  '-f',
  'mp4',
  '-movflags',
  'frag_keyframe+empty_moov+default_base_moof',
  '-min_frag_duration',
  '50000',   
  "pipe:1",
]);

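// Helpers for converting a byte value to a two-character hex string and back.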
function decToHex(dec) {
  return dec.toString(16).padStart(2,'0').toUpperCase();
}

function hexToDec(hex) {
  return parseInt(hex, 16);
}

//This is where the parsing and calculating of the length of each box will take place.
ffmpeg.stdout.on('data', (chunk) => {
  outputFile.write(chunk);
  for (var i=0; i < chunk.length; i ++){
    buf_chunks_string_holder_array.push(chunk[i]);
    checker += decToHex(chunk[i]);
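    // 'checker' holds the last 8 bytes as 16 hex characters: a 32-bit box size
    // followed by the 4-character box type (e.g. '66747970' = 'ftyp').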
    if(checker.length == 16){
      if (checker.slice(-8) === '66747970'){
        let box_size_string = checker.slice(-16, -8);
        let num_bytes = hexToDec(box_size_string);
        next_segment_counter += num_bytes;
      }
      else if(checker.slice(-8) === '6D6F6F76'){
        let box_size_string = checker.slice(-16, -8);
        let num_bytes = hexToDec(box_size_string);
        next_segment_counter += num_bytes;
        segment_end_index = { "box_type": "ftyp&moov", "end_index": next_segment_counter };
        processing_counter_queue.push(segment_end_index);
        next_segment_counter = 0;
      }
      else if(checker.slice(-8) === '6D6F6F66'){
        let box_size_string = checker.slice(-16, -8);
        let num_bytes = hexToDec(box_size_string);
        next_segment_counter += num_bytes;
        moof_counter ++;      
      }
      else if(checker.slice(-8) === '6D646174'){
        let box_size_string = checker.slice(-16, -8);
        let num_bytes = hexToDec(box_size_string);
        next_segment_counter += num_bytes;
        segment_end_index = { "box_type": "moof&mdat", "end_index": next_segment_counter };
        processing_counter_queue.push(segment_end_index);
        next_segment_counter = 0;
        mdat_counter ++;  
      }
      checker = checker.slice(2);
    }
  }
    //This is where the data will be sliced, grouped into their respective segments and sent to the connected clients.
  if (processing_counter_queue.length > 0) {
    var jobs_removal_counter = 0;
    processing_counter_queue.forEach(job_info => {
      if (job_info.box_type == 'ftyp&moov' && buf_chunks_string_holder_array.length >= job_info.end_index) {
        initialization_segment_to_send = buf_chunks_string_holder_array.slice(0, job_info.end_index);
        initialization_segment_ready_flag = true;
        buf_chunks_string_holder_array = buf_chunks_string_holder_array.slice(job_info.end_index);
        jobs_removal_counter++;
      }
      if (job_info.box_type == 'moof&mdat' && buf_chunks_string_holder_array.length >= job_info.end_index) {
        buffered_media_segment_to_send = buf_chunks_string_holder_array.slice(0, job_info.end_index);
        buffered_media_segment_ready_flag = true;
        buf_chunks_string_holder_array = buf_chunks_string_holder_array.slice(job_info.end_index);
        if (buf_chunks_string_holder_array.length != 0 ||
          (buf_chunks_string_holder_array[4] != 102 && buf_chunks_string_holder_array[5] != 116 && buf_chunks_string_holder_array[6] != 121 && buf_chunks_string_holder_array[7] != 112) ||
          (buf_chunks_string_holder_array[4] != 109 && buf_chunks_string_holder_array[5] != 100 && buf_chunks_string_holder_array[6] != 97 && buf_chunks_string_holder_array[7] != 116)) {
          buf_chunks_string_holder_array = [];
          processing_counter_queue = [];
        }
        jobs_removal_counter++;
        if (wss.clients.size >= 1) {
          wss.clients.forEach(client => {
            client.send(new Uint8Array(buffered_media_segment_to_send).buffer);
          })
        }
      } 
      processing_counter_queue = processing_counter_queue.slice(jobs_removal_counter);
    }
    );
  }
});

ffmpeg.stderr.on('data', (data) => {
  console.log(`stderr: ${data}`);
});

ffmpeg.on('close', (code) => {
  console.log(`FFmpeg process exited with code ${code}`);
});