
Other articles (48)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Contribute to its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You just need to sign up for the translators' mailing list to ask for more information.
    Currently MediaSPIP is only available in French and (...)

  • Authorizations overridden by plugins

    27 April 2010

    MediaSPIP core
autoriser_auteur_modifier(), so that visitors can edit their own information on the authors page

On other sites (5695)

  • How to live video stream using node API (Read file with chunk logic)

    28 September 2023, by Mukesh Singh Thakur

    I want to make a live video streaming API and send the video buffer chunks to an HTML page.
I am using an RTSP URL.
The chunk logic does not work: the video only plays for 5 seconds and then stops.

    index.js file

    const express = require('express');
const ffmpeg = require('fluent-ffmpeg');
const fs = require('fs');
const path = require('path');

const app = express();
const port = 3000;

app.get('/', (req, res) => {
  res.sendFile(__dirname + "/index.html");
});

const rtspUrl = 'rtsp://zephyr.rtsp.stream/movie?streamKey=64fd08123635440e7adc17ba31de2036';
const chunkDuration = 5; // Duration of each chunk in seconds


app.get('/video', (req, res) => {
  const outputDirectory = path.join(__dirname, 'chunks');
  if (!fs.existsSync(outputDirectory)) {
    fs.mkdirSync(outputDirectory);
  }

  const startTime = new Date().getTime();
  const outputFileName = `chunk_${startTime}.mp4`;
  const outputFilePath = path.join(outputDirectory, outputFileName);

  const command = ffmpeg(rtspUrl)
    .inputFormat('rtsp')
    // .inputOptions(['-rtsp_transport tcp'])
    .videoCodec('copy')
    .output(outputFilePath)
    .duration(chunkDuration)
    .on('start', () => {
      console.log(`start ${outputFileName}`);
    })
    .on('end', () => {
      console.log(`Chunk ${outputFileName} saved`);
      res.setHeader('Content-Type', 'video/mp4');
      res.sendFile(outputFilePath, (err) => {
        if (err) {
          console.error('Error sending file:', err);
        } else {
          fs.unlinkSync(outputFilePath); // Delete the chunk after it's sent
        }
      });
    })
    .on('error', (error) => {
      console.error('Error: ', error);
    });

  command.run();
});

app.listen(port, () => {
  console.log(`API server is running on port ${port}`);
});


    


    index.html

    <video width="50%" controls="controls" autoplay="autoplay">
      <source src="/video" type="video/mp4">
      Your browser does not support the video tag.
    </video>

    package.json

    {
    .....
      "scripts": {
        "test": "echo \"Error: no test specified\" && exit 1",
        "start": "nodemon index.js"
      },
    .....
    }
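
    A hedged sketch of an alternative approach (not code from the question): rather than recording fixed-length chunk files and sending one per request, ffmpeg's output can be piped straight into the HTTP response as a fragmented MP4, so the browser keeps receiving data for as long as the RTSP source runs. The sketch reuses the rtspUrl and the express/fluent-ffmpeg setup from index.js above; the -movflags values are standard ffmpeg output options that let an MP4 begin playing before it is complete.

const express = require('express');
const ffmpeg = require('fluent-ffmpeg');

const app = express();
const port = 3000;
const rtspUrl = 'rtsp://zephyr.rtsp.stream/movie?streamKey=64fd08123635440e7adc17ba31de2036';

app.get('/video', (req, res) => {
  res.setHeader('Content-Type', 'video/mp4');

  const command = ffmpeg(rtspUrl)
    .inputOptions(['-rtsp_transport tcp'])                  // TCP tends to be more reliable than UDP for RTSP
    .videoCodec('copy')                                     // remux only, no re-encoding
    .outputOptions(['-movflags frag_keyframe+empty_moov'])  // fragmented MP4 so playback can start mid-stream
    .format('mp4')
    .on('error', (err) => {
      console.error('ffmpeg error:', err.message);
      if (!res.headersSent) res.sendStatus(500);
    });

  // Pipe ffmpeg's stdout directly into the HTTP response instead of writing chunk files.
  command.pipe(res, { end: true });

  // Stop the ffmpeg process when the client disconnects.
  req.on('close', () => command.kill('SIGKILL'));
});

app.listen(port, () => {
  console.log(`API server is running on port ${port}`);
});

    With this shape, the index.html above can stay unchanged: its <source src="/video"> element receives one continuous stream instead of a single 5-second file.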

  • ffmpeg - stretched pixel issue

    5 June 2023, by Adunato

    Context

    I'm converting a PNG sequence into a video using FFMPEG. The images are semi-transparent portraits where the background has been removed digitally.

    Issue

    The edge pixels of the subject are stretched all the way to the frame border, creating a fully opaque video.

    Cause Analysis

    The process worked fine in the previous workflow using rembg from the command line. However, since I started using rembg via a Python script with alpha_matting to obtain higher-quality results, the resulting video has these issues.

    The issue is present in both the webm format (target) and mp4 (used for testing).

    Command Used

    Command used for webm is:

    ffmpeg -thread_queue_size 64 -framerate 30 -i <png sequence location> -c:v libvpx -b:v 0 -crf 18 -pix_fmt yuva420p -auto-alt-ref 0 -c:a libvorbis <png output>

    Troubleshooting Steps Taken

    1. PNG visual inspection: the PNG images have a fully transparent background, as desired.

    2. PNG alpha measurement: I have created a couple of Python scripts to look at the alpha level of pixels and confirmed that there is no subtle alpha level in the background pixels.

    3. Exported MP4 with AE: using the native AE renderer, the resulting MP4/H.265 has a black background, so it does not show the stretched pixel issue.

    Image of the Issue

    (image: the stretched-pixel output)

    (image: sample PNG from the sequence)

    Code Context

    The rembg call via the API, using alpha_matting, seems to generate premultiplied alpha, which leaves non-black RGB values in 0-alpha pixels.

    remove(input_data, alpha_matting=True, alpha_matting_foreground_threshold=250,
           alpha_matting_background_threshold=250, alpha_matting_erode_size=12)

    A test using a rough RGB reset of 0-alpha pixels confirms that the images are being played back with their RGB values, with the alpha ignored.

    def reset_alpha_pixels(img):
        # Process each pixel
        data = list(img.getdata())
        new_data = []
        for item in data:
            if item[3] == 0:
                # Fully transparent pixel: zero out the RGB as well
                new_data.append((0, 0, 0, 0))
            else:
                # Keep both the RGB and the alpha value
                new_data.append((item[0], item[1], item[2], item[3]))

        # Update the image data
        img.putdata(new_data)

        return img

    Updates

    • Added Python context to make the question more relevant within SO scope.

  • How to prepare media stream to play using dash.js web player?

    7 April 2016, by Paweł Tobiszewski

    I want to stream media from an nginx server to an Android device and play it using a web player embedded in a web page. The player I want to use is dash.js.
    I also play the same media using other methods (MediaPlayer and ExoPlayer) and they work great. But when I try to use dash.js, I run into a problem with codecs: they are not supported.
    I prepare my streams using ffmpeg and MP4Box, and I have also tried different codecs, like libx264, x264 and x265, always with the same effect.
    My source media are a video in Y4M format and audio in WAV.
    How should I encode them for use with the dash.js player?

    EDIT:
    I get the error "Video Element Error: MEDIA_ERR_DECODE" while trying to decode the video stream.

    Here is the full log:

    [16] EME detected on this user agent! (ProtectionModel_21Jan2015)
    [19] Playback Initialized
    [28] [dash.js 2.0.0] MediaPlayer has been initialized
    [102] Parsing complete: ( xml2json: 3ms, objectiron: 3ms, total: 0.006s)
    [103] Manifest has been refreshed at Thu Apr 07 2016 22:02:52 GMT+0200 (CEST)[1460059372.696]  
    [107] SegmentTimeline detected using calculated Live Edge Time
    [118] MediaSource is open!
    [118] [object Event]
    [119] Duration successfully set to: 18.58
    [119] Added 0 inline events
    [120] video codec: video/mp4;codecs="avc1.640032"
    [132] Schedule controller stopping for video
    [137] No audio data.
    [137] No text data.
    [137] No fragmentedText data.
    [137] No embeddedText data.
    [138] No muxed data.
    [139] Start Event Controller
    [141] Schedule controller starting for video
    [143] Native video element event: play
    [144] Schedule controller starting for video
    [148] loaded video:InitializationSegment:NaN (200, 0ms, 7ms)
    [149] Initialization finished loading
    [154] Getting the request for video time : 0
    [155] SegmentList: 0 / 18.58
    [164] loaded video:MediaSegment:0 (200, 7ms, 1ms)
    [169] Native video element event: loadedmetadata
    [171] Starting playback at offset: 0
    [175] Got enough buffer to start.
    [175] Buffered Range: 0 - 0.999999
    [179] Requesting seek to time: 0
    [181] Prior to making a request for time, NextFragmentRequestRule is aligning index handler's currentTime with bufferedRange.end. 0  was changed to  0.999999
    [182] Getting the request for video time : 0.999999
    [183] SegmentList: 0 / 18.58
    [183] Getting the next request at index: 1
    [184] SegmentList: 1 / 18.58
    [190] loaded video:MediaSegment:1 (200, 5ms, 0ms)
    [192] Buffered Range: 0 - 0.999999
    [195] Getting the request for video time : 2
    [196] Index for video time 2 is 1
    [197] SegmentList: 1 / 18.58
    [197] Getting the next request at index: 2
    [198] SegmentList: 2 / 18.58
    [205] loaded video:MediaSegment:2 (200, 4ms, 1ms)
    [207] Buffered Range: 0 - 0.999999
    [207] Getting the request for video time : 3
    [208] Index for video time 3 is 2
    [208] SegmentList: 2 / 18.58
    [209] Getting the next request at index: 3
    [209] SegmentList: 3 / 18.58
    [212] Video Element Error: MEDIA_ERR_DECODE
    [212] [object MediaError]
    [215] Schedule controller stopping for video
    [219] Native video element event: pause
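
    One plausible lead for this error, sketched below as an assumption rather than a confirmed diagnosis: the manifest advertises the codec string avc1.640032 (H.264 High profile, level 5.0), and MEDIA_ERR_DECODE is what a browser reports when its decoder does not accept the profile/level it is fed. The standard MediaSource API can be asked directly whether that string is supported on the device; the fallback string below is only an illustrative example.

// Run in the browser console on the Android device (illustrative, not from the original post).
const advertised = 'video/mp4; codecs="avc1.640032"'; // from the log above: High profile, level 5.0
const fallback = 'video/mp4; codecs="avc1.42c01e"';   // Baseline profile, level 3.0 (example only)

if ('MediaSource' in window) {
  console.log(advertised, '->', MediaSource.isTypeSupported(advertised));
  console.log(fallback, '->', MediaSource.isTypeSupported(fallback));
} else {
  console.log('MediaSource Extensions are not available in this browser.');
}

    If the advertised string comes back unsupported, re-encoding with a more conservative H.264 profile/level (for example Baseline or Main at level 3.x) before packaging with MP4Box is a common fix.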