
Other articles (20)

  • Contribute to translation

    13 April 2011

    You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows MediaSPIP to spread to new linguistic communities.
    To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
    MediaSPIP is currently available in French and English (...)

  • List of compatible distributions

    26 April 2011

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

    Distribution name   Version name           Version number
    Debian              Squeeze                6.x.x
    Debian              Wheezy                 7.x.x
    Debian              Jessie                 8.x.x
    Ubuntu              The Precise Pangolin   12.04 LTS
    Ubuntu              The Trusty Tahr        14.04

    If you want to help us improve this list, you can provide us with access to a machine whose distribution is not mentioned above or send us the necessary fixes to add (...)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps taken that resulted in the problem; and a link to the site / page in question.
    If you think you have solved the bug, open a ticket and attach a corrective patch to it.
    You may also (...)

On other sites (5695)

  • Anomalie #2025: Unwanted highlighting

    20 September 2011, by tetue

    Belt and braces: the pas_surlignable class has been applied by default to all the tags of the dist since Changeset 50225 (so is this now unnecessary?)

  • .MKV Video File not playing on Azure Kinect Viewer

    17 April 2023, by Ashutosh Singla

    I have a video file that contains 4 streams: 3 video streams and one stream for metadata. I want to first extract these streams along with the metadata and then recombine them with the metadata. I would like to play the result in Azure Kinect Viewer so that I can check whether I am doing anything wrong while extracting and copying the streams.

    


    Original Stream Info:

    


    Input #0, matroska,webm, from 'output_master.mkv':
Metadata:
title           : Azure Kinect
encoder         : libmatroska-1.4.9
creation_time   : 2021-05-20T12:11:15.000000Z
K4A_DEPTH_DELAY_NS: 0
K4A_WIRED_SYNC_MODE: MASTER
K4A_COLOR_FIRMWARE_VERSION: 1.6.110
K4A_DEPTH_FIRMWARE_VERSION: 1.6.79
K4A_DEVICE_SERIAL_NUMBER: 000123102712
K4A_START_OFFSET_NS: 298800000
Duration: 00:00:40.03, start: 0.000000, bitrate: 480934 kb/s

Stream #0:0(eng): Video: mjpeg (Baseline) (MJPG / 0x47504A4D), yuvj422p(pc, bt470bg/unknown/unknown), 2048x1536, SAR 1:1 DAR 4:3, 30 fps, 30 tbr, 1000k tbn (default)
Metadata:
  title           : COLOR
  K4A_COLOR_TRACK : 14499183330009048
  K4A_COLOR_MODE  : MJPG_1536P
Stream #0:1(eng): Video: rawvideo (b16g / 0x67363162), gray16be, 640x576, SAR 1:1 DAR 10:9, 30 fps, 30 tbr, 1000k tbn (default)
Metadata:
  title           : DEPTH
  K4A_DEPTH_TRACK : 429408169412322196
  K4A_DEPTH_MODE  : NFOV_UNBINNED
Stream #0:2(eng): Video: rawvideo (b16g / 0x67363162), gray16be, 640x576, SAR 1:1 DAR 10:9, 30 fps, 30 tbr, 1000k tbn (default)
Metadata:
  title           : IR
  K4A_IR_TRACK    : 194324406376800992
  K4A_IR_MODE     : ACTIVE
Stream #0:3: Attachment: none
Metadata:
  filename        : calibration.json
  mimetype        : application/octet-stream
  K4A_CALIBRATION_FILE: calibration.json


    


    I am using the command below to extract Stream #0:0, Stream #0:1 and Stream #0:2, changing -map 0:X each time.

    


    ffmpeg -i output_master.mkv -c copy -allow_raw_vfw 1 -map 0:0 temp_0.mkv 
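
    For reference, the same extraction can be written as a small loop; a minimal sketch, assuming a POSIX shell and the file names used above:

    # copy each video stream (COLOR, DEPTH, IR) into its own Matroska file
    for i in 0 1 2; do
      ffmpeg -i output_master.mkv -c copy -allow_raw_vfw 1 -map 0:"$i" temp_"$i".mkv
    done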


    


    To extract the configuration file from the video and store it in calibration.json, I am using the command below:

    


    ffmpeg -dump_attachment:3 calibration.json -i output_master.mkv


    


    To combine the streams with the configuration file using FFmpeg, I am using the command below:

    


    ffmpeg -i temp_0.mkv -i temp_1.mkv -i temp_2.mkv -c copy -map 0:0 -map 1:0 -map 2:0 -allow_raw_vfw 1 -attach calibration.json -metadata:s:3 mimetype=application/octet-stream out.mkv
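
    Comparing the original stream info above with the reconstructed stream info below, the attachment stream in the original carries a K4A_CALIBRATION_FILE tag that is missing after remuxing. A sketch of the same command that also sets that tag explicitly (whether Azure Kinect Viewer actually requires the tag is an assumption):

    # same mux as above, plus the K4A_CALIBRATION_FILE stream tag on the attachment
    ffmpeg -i temp_0.mkv -i temp_1.mkv -i temp_2.mkv -c copy -map 0:0 -map 1:0 -map 2:0 -allow_raw_vfw 1 -attach calibration.json -metadata:s:3 mimetype=application/octet-stream -metadata:s:3 K4A_CALIBRATION_FILE=calibration.json out.mkv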


    


    Reconstructed Stream Info:

    


      Could not find codec parameters for stream 3 (Attachment: none): unknown codec
  Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options

  Input #0, matroska,webm, from '.\out.mkv':
  Metadata:
  title           : Azure Kinect
  K4A_COLOR_FIRMWARE_VERSION: 1.6.110
  K4A_DEPTH_FIRMWARE_VERSION: 1.6.79
  K4A_DEVICE_SERIAL_NUMBER: 000123102712
  K4A_START_OFFSET_NS: 298800000
  K4A_DEPTH_DELAY_NS: 0
  K4A_WIRED_SYNC_MODE: MASTER
  ENCODER         : Lavf60.3.100
  Duration: 00:00:40.06, start: 0.000000, bitrate: 480559 kb/s

  Stream #0:0(eng): Video: mjpeg (Baseline), yuvj422p(pc, bt470bg/unknown/unknown), 2048x1536, SAR 1:1 DAR 4:3, 30 fps, 30 tbr, 1k tbn (default)
  Metadata:
  title           : COLOR
  K4A_COLOR_TRACK : 14499183330009048
  K4A_COLOR_MODE  : MJPG_1536P
  DURATION        : 00:00:40.029000000

  Stream #0:1(eng): Video: rawvideo, rgb555le, 640x576, SAR 1:1 DAR 10:9, 30 fps, 30 tbr, 1k tbn (default)
  Metadata:
  title           : DEPTH
  K4A_DEPTH_TRACK : 429408169412322196
  K4A_DEPTH_MODE  : NFOV_UNBINNED
  DURATION        : 00:00:40.062000000

  Stream #0:2(eng): Video: rawvideo, rgb555le, 640x576, SAR 1:1 DAR 10:9, 30 fps, 30 tbr, 1k tbn (default)
  Metadata:
  title           : IR
  K4A_IR_TRACK    : 194324406376800992
  K4A_IR_MODE     : ACTIVE
  DURATION        : 00:00:40.062000000

  Stream #0:3: Attachment: none
  Metadata:
  filename        : calibration.json
  mimetype        : application/octet-stream


    


    However, I cannot play the video in Azure Kinect Viewer; it displays "failed to open recording".

    


    Any help would be appreciated.

    


  • Video files recorded in Google Chrome have stuttering audio

    4 June 2018, by maxpaj

    Background

    I’m developing a platform where users can record videos of themselves or their screen and send them as video messages to customers / clients.

    I have limited users to using my application only in Google Chrome, and I'm using the MediaRecorder API to record the video data from the user's screen or webcam. The codecs used for recording are VP8/Opus (WebM container).

    I need the videos to play in as many browsers as possible, so I'm using a 3rd-party service to transcode videos from whatever format I get from the users to an H.264/AAC MP4 container (caniuse MPEG-4/H.264).
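
    The service's internals aren't documented here, but a rough ffmpeg equivalent of that conversion would look something like the following (file names are assumptions):

    # convert the Chrome recording (VP8/Opus in WebM) to H.264/AAC in MP4
    ffmpeg -i recording.webm -c:v libx264 -c:a aac recording.mp4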

    Issue

    Lately I've seen that some videos recorded on Mac OS X machines have the video and audio out of sync, or the video and audio stutter, depending on which player I'm using. I call these video files corrupt, for lack of a better word. Playing a corrupt file in Google Chrome gives smoothly playing audio. Playing the same video in VLC on my Windows machine gives stuttering audio.

    When I run the corrupt video files through the transcoding service I get video files with stuttering audio, no matter which player I’m using.

    This is an unwanted result and pretty much unacceptable, since the audio needs to be smooth so that the recipient of a video is not bothered by the quality.

    Debugging

    According to the transcoding service's support team, this happens because of their mechanisms that try to sync up the audio and video from the corrupt file:

    Inspecting our encoding logs, I've noticed the following kind of warnings:

    [2018-05-16 14:08:38.009] [pcm_s16le @ 0x1d608c0] pcm_encode_frame: filling in for 5856 missing samples (122 ms) before pts 40800 to correct sync!
    [2018-05-16 14:08:38.009] [pcm_s16le @ 0x1d608c0] pcm_encode_frame: dropping 2880 samples (60 ms) at pts 43392 to help correct sync to -3168 samples (-66 ms)!

    The problem here comes from the way that the audio in the original
    source file is encoded.

    -

    you should ensure that the audio is not out of sync (audio timestamps
    are correct) in your source file before submitting the job
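
    One way to check those audio timestamps is to dump them with ffprobe; a minimal sketch, assuming the recording is named recording.webm and its audio is stream a:0 (backwards jumps or gaps in the pts/dts values would point to the sync problem):

    # list the pts/dts of every audio packet in the source file
    ffprobe -v error -select_streams a:0 -show_entries packet=pts_time,dts_time -of csv recording.webm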

    Running a corrupt file through ffmpeg on my own machine, re-encoding with the same codecs, produces the same kind of stuttering video. The logs contain an alarming number of errors. Here is a sample of the log output:

    [libopus @ 0000029938e24d80] Queue input is backward in timeitrate= 194.8kbits/s dup=0 drop=5 speed=0.31x
    [webm @ 0000029938e09b00] Non-monotonous DTS in output stream 0:1; previous: 15434, current: 15394; changing to 15434. This may result in incorrect timestamps in the output file.
    [webm @ 0000029938e09b00] Non-monotonous DTS in output stream 0:1; previous: 15434, current: 15414; changing to 15434. This may result in incorrect timestamps in the output file.
    [libopus @ 0000029938e24d80] Queue input is backward in timeitrate= 193.3kbits/s dup=0 drop=5 speed=0.309x
    [webm @ 0000029938e09b00] Non-monotonous DTS in output stream 0:1; previous: 15539, current: 15499; changing to 15539. This may result in incorrect timestamps in the output file.
    [webm @ 0000029938e09b00] Non-monotonous DTS in output stream 0:1; previous: 15539, current: 15519; changing to 15539. This may result in incorrect timestamps in the output file.
    [libopus @ 0000029938e24d80] Queue input is backward in timeitrate= 192.0kbits/s dup=0 drop=5 speed=0.308x
    [webm @ 0000029938e09b00] Non-monotonous DTS in output stream 0:1; previous: 15667, current: 15627; changing to 15667. This may result in incorrect timestamps in the output file.
    [webm @ 0000029938e09b00] Non-monotonous DTS in output stream 0:1; previous: 15667, current: 15647; changing to 15667. This may result in incorrect timestamps in the output file.
    [libopus @ 0000029938e24d80] Queue input is backward in time
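
    The exact re-encode command isn't shown above, but it was presumably something along these lines (file names are assumptions; libvpx and libopus are ffmpeg's VP8 and Opus encoders):

    # re-encode the corrupt recording with the same VP8/Opus codecs
    ffmpeg -i corrupt.webm -c:v libvpx -c:a libopus reencoded.webm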

    I tried running the same inputs through another transcoding service and those outputs worked a lot better - video was still stuttering but the audio played smoothly, which is more important to the use case of my application.

    To my knowledge, this has so far only occurred for users on Mac OS X machines.

    Questions

    1. Is there anything I can do to make the files work better? Or is this entirely a consequence of how video and audio encoding works in Google Chrome?

    2. One step in the right direction would be simply being able to detect when a video is corrupt. How can I do that?