
Other articles (111)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible, and development is based on expanding the (...)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is version 0.2 or higher. If in doubt, contact the administrator of your MediaSPIP to find out.

  • Submit bugs and patches

    13 April 2011

    Unfortunately, no software is ever perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps taken that lead to the problem; and a link to the site/page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

On other sites (16551)

  • ffmpeg - stretched pixel issue

    5 June 2023, by Adunato

    Context

    I'm converting a PNG sequence into a video using FFmpeg. The images are semi-transparent portraits whose background has been removed digitally.

    Issue

    The edge pixels of the subject are stretched all the way to the frame border, creating a fully opaque video.

    Cause Analysis

    The process worked fine in the previous workflow, which used rembg from the command line. However, since I started calling rembg from a Python script with alpha_matting enabled to obtain higher-quality results, the resulting video has this issue.

    The issue is present in both the webm format (the target) and mp4 (used for testing).

    Command Used

    The command used for webm is:

    ffmpeg -thread_queue_size 64 -framerate 30 -i <png sequence location> -c:v libvpx -b:v 0 -crf 18 -pix_fmt yuva420p -auto-alt-ref 0 -c:a libvorbis <png output>

    Troubleshooting Steps Taken

    1. PNG visual inspection: the PNG images have a fully transparent background, as desired.
    2. PNG alpha measurement: I created a couple of Python scripts to look at the alpha level of each pixel and confirmed that there is no subtle alpha level in the background pixels.
    3. Exported MP4 with AE: using the native After Effects renderer, the resulting MP4/H.265 has a black background, so it does not show the stretched-pixel issue.
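The alpha measurement in step 2 above can be sketched as a small pure-Python check. This is a hypothetical helper, not the original script; it operates on a flat list of RGBA tuples such as the one returned by Pillow's `Image.getdata()`:

```python
def summarize_zero_alpha(pixels):
    """Count fully transparent pixels and flag any whose RGB is non-black.

    Non-black RGB under zero alpha is harmless in an alpha-aware
    pipeline, but becomes visible if a consumer ignores the alpha
    channel, which is the symptom described in this question.
    """
    zero_alpha = 0
    non_black_rgb = 0
    for r, g, b, a in pixels:
        if a == 0:
            zero_alpha += 1
            if (r, g, b) != (0, 0, 0):
                non_black_rgb += 1
    return zero_alpha, non_black_rgb

# Example: two transparent pixels, one of which carries stray RGB data.
print(summarize_zero_alpha([(0, 0, 0, 0), (120, 80, 60, 0), (10, 10, 10, 255)]))
# → (2, 1)
```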

    Image of the Issue

    (image: screenshot of the stretched-pixel issue)

    Sample PNG Image from sequence

    (image: sample PNG from the sequence)

    Code Context

    The rembg call via the API using alpha_matting seems to generate premultiplied alpha, which uses non-black pixels for 0-alpha pixels.

    remove(input_data, alpha_matting=True, alpha_matting_foreground_threshold=250,
           alpha_matting_background_threshold=250, alpha_matting_erode_size=12)

    A test using a rough RGB reset of 0-alpha pixels confirms that the images are being played back using their RGB values, with alpha ignored.

    def reset_alpha_pixels(img):
        # Zero out the RGB of fully transparent pixels, keep everything else
        data = list(img.getdata())
        new_data = []
        for item in data:
            if item[3] == 0:
                new_data.append((0, 0, 0, 0))
            else:
                new_data.append((item[0], item[1], item[2], item[3]))

        # Update the image data
        img.putdata(new_data)

        return img
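An aside, not from the original post: if the frames really do carry premultiplied alpha, ffmpeg's `unpremultiply` filter may be worth trying, as it would undo the premultiplication during encoding instead of rewriting the PNGs. A sketch, reusing the placeholders from the command above:

```shell
# Same command as above (placeholders kept), with an explicit
# unpremultiply step before encoding; inplace=1 makes the filter use
# the stream's own alpha plane rather than a second input stream.
ffmpeg -thread_queue_size 64 -framerate 30 -i <png sequence location> \
  -vf unpremultiply=inplace=1 \
  -c:v libvpx -b:v 0 -crf 18 -pix_fmt yuva420p -auto-alt-ref 0 \
  -c:a libvorbis <png output>
```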

    Updates

    • Added Python context to make the question more relevant within SO scope.

  • How to prepare a media stream for playback using the dash.js web player?

    7 April 2016, by Paweł Tobiszewski

    I want to stream media from an nginx server to an Android device and play it using a web player embedded in a web page. The player I want to use is dash.js.
    I can play the same media using other methods (MediaPlayer and ExoPlayer) and they work great. But when I try to use dash.js, I face a problem with codecs: they are not supported.
    I prepare my streams using ffmpeg and MP4Box. I have also tried different codecs, such as libx264, x264 and x265, always with the same effect.
    My source media are a video in Y4M format and audio in WAV.
    How should I encode them for the dash.js player?
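Not part of the original question, but a typical encode-and-package pipeline for dash.js looks roughly like the following. File names, bitrates and segment durations are placeholders; the key point is to pick an H.264 profile/level the target device can actually decode (the log below shows avc1.640032, i.e. High profile at level 5.0, which many Android devices cannot handle):

```shell
# Encode Y4M video and WAV audio to an MP4 with a widely supported profile
ffmpeg -i input.y4m -i input.wav \
  -c:v libx264 -profile:v main -level 4.0 -b:v 2M \
  -c:a aac -b:a 128k \
  input.mp4

# Package into DASH segments plus an MPD manifest with MP4Box (GPAC)
MP4Box -dash 2000 -rap -profile dashavc264:live \
  -out manifest.mpd input.mp4
```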

    EDIT:
    I get the error "Video Element Error: MEDIA_ERR_DECODE" while trying to decode the video stream.

    Here is full log :

    [16] EME detected on this user agent! (ProtectionModel_21Jan2015)
    [19] Playback Initialized
    [28] [dash.js 2.0.0] MediaPlayer has been initialized
    [102] Parsing complete: ( xml2json: 3ms, objectiron: 3ms, total: 0.006s)
    [103] Manifest has been refreshed at Thu Apr 07 2016 22:02:52 GMT+0200 (CEST)[1460059372.696]  
    [107] SegmentTimeline detected using calculated Live Edge Time
    [118] MediaSource is open!
    [118] [object Event]
    [119] Duration successfully set to: 18.58
    [119] Added 0 inline events
    [120] video codec: video/mp4;codecs="avc1.640032"
    [132] Schedule controller stopping for video
    [137] No audio data.
    [137] No text data.
    [137] No fragmentedText data.
    [137] No embeddedText data.
    [138] No muxed data.
    [139] Start Event Controller
    [141] Schedule controller starting for video
    [143] Native video element event: play
    [144] Schedule controller starting for video
    [148] loaded video:InitializationSegment:NaN (200, 0ms, 7ms)
    [149] Initialization finished loading
    [154] Getting the request for video time : 0
    [155] SegmentList: 0 / 18.58
    [164] loaded video:MediaSegment:0 (200, 7ms, 1ms)
    [169] Native video element event: loadedmetadata
    [171] Starting playback at offset: 0
    [175] Got enough buffer to start.
    [175] Buffered Range: 0 - 0.999999
    [179] Requesting seek to time: 0
    [181] Prior to making a request for time, NextFragmentRequestRule is aligning index handler's currentTime with bufferedRange.end. 0  was changed to  0.999999
    [182] Getting the request for video time : 0.999999
    [183] SegmentList: 0 / 18.58
    [183] Getting the next request at index: 1
    [184] SegmentList: 1 / 18.58
    [190] loaded video:MediaSegment:1 (200, 5ms, 0ms)
    [192] Buffered Range: 0 - 0.999999
    [195] Getting the request for video time : 2
    [196] Index for video time 2 is 1
    [197] SegmentList: 1 / 18.58
    [197] Getting the next request at index: 2
    [198] SegmentList: 2 / 18.58
    [205] loaded video:MediaSegment:2 (200, 4ms, 1ms)
    [207] Buffered Range: 0 - 0.999999
    [207] Getting the request for video time : 3
    [208] Index for video time 3 is 2
    [208] SegmentList: 2 / 18.58
    [209] Getting the next request at index: 3
    [209] SegmentList: 3 / 18.58
    [212] Video Element Error: MEDIA_ERR_DECODE
    [212] [object MediaError]
    [215] Schedule controller stopping for video
    [219] Native video element event: pause

  • Live streaming through video tag

    1 February 2018, by ElmiS

    I am currently trying to make a video streaming service without using plugins.

    My server uses ffmpeg to transmux an RTSP stream to a fragmented MP4 so that the HTML5 video tag can play it.

    The ffmpeg command is

    > ffmpeg -rtsp_transport tcp -i rtsp_address -movflags frag_keyframe+empty_moov -c:v copy -f mp4 -

    (My RTSP stream has no audio.)

    Then I feed it to the client page through a WebSocket; the page then uses the Media Source Extensions API to pass the data to the video tag.

    This all works on Firefox but does not work in Chrome or Edge/IE, and I have searched all over but could not find an answer as to why it won't play.

    The initial data, such as the width and height of the video, seems to go through, as the size of the video frame changes. However, the video won't start, and when I call video.play() so that it autoplays once loaded, I get an error saying

    Uncaught (in promise) DOMException: The play() request was interrupted by a call to pause().

    It seems like the video automatically pauses itself, since I did not call pause() anywhere in my code.

    My client side code is

     var webSocket;
     var video = document.getElementById("rtspPlayer");
     var mediaSource;
     var sourceArray;
     var strCodec = 'video/mp4; codecs="avc1.420029"';
     var buffer = [];

     window.onload = function(){
       var selector = document.getElementById("select");
       var url = selector.options[selector.selectedIndex].value;

       setUpWebSocket(url);
       setUpMediaSource();
     }

     function setUpWebSocket(url){
       webSocket = new WebSocket(url);
       webSocket.onopen = onWebSocketOpen;
       webSocket.onmessage = onWebSocketData;
     }

     function setUpMediaSource(){
       mediaSource = new window.MediaSource();
       var uri = window.URL.createObjectURL(mediaSource);
       video.setAttribute('src', uri);
       video.setAttribute('type', 'video/mp4');

       mediaSource.onsourceopen = function(){
         mediaSource.duration = Infinity;
         sourceBuffer = mediaSource.addSourceBuffer(strCodec);
         sourceBuffer.onupdateend = readFromBuffer();
       }
     }

     function onWebSocketOpen(){
       video.play();
     }

     function onWebSocketData(data){
       var blob = data.data;
       var fileReader = new FileReader();
       fileReader.onload = function() {
         buffer.push(this.result);
         readFromBuffer();
       };
       fileReader.readAsArrayBuffer(blob);
     }

     function readFromBuffer(){
       if(buffer.length === 0){
         //console.log("Buffer length 0");
         return;
       }
       else if(!sourceBuffer){
         //console.log("No Source Buffer");
         return;
       }
       else if(sourceBuffer.updating){
         //console.log("SourceBuffer Updating");
         return;
       }

       try{
         var data = buffer.shift();
         sourceBuffer.appendBuffer(data);
       }
       catch(e){
         return;
       }
     }

    I'm certain that this is the right codec for the video I want to play, as I saved 30 seconds of the video using ffmpeg and checked the codec with Bento4's mp4info. The 30-second clip also played well through the Media Source pipeline. However, playing live seems impossible for me to solve.

    Please help! I've been trying to solve this problem for days.
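Not from the original post, but two things stand out in the snippet above and are worth a hedged sketch: `sourceBuffer.onupdateend = readFromBuffer()` assigns the function's return value (undefined) instead of the function itself, and `play()` is called from the WebSocket's open handler, before the MediaSource is open. An adjusted version of the `sourceopen` handler, reusing the variables from the code above, might look like:

```javascript
// Sketch: start playback only once the MediaSource is ready,
// and surface (rather than swallow) play() rejections.
mediaSource.addEventListener('sourceopen', function () {
  mediaSource.duration = Infinity;
  sourceBuffer = mediaSource.addSourceBuffer(strCodec);
  // Assign the handler itself; calling readFromBuffer() here would
  // register its return value (undefined) as the handler.
  sourceBuffer.onupdateend = readFromBuffer;

  video.play().catch(function (err) {
    console.log('play() rejected:', err);
  });
});
```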