Advanced search

Media (1)

Keyword: - Tags -/censure

Other articles (44)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player has been created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash fallback is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both on conventional computers (...)

  • From upload to the final video [standalone version]

    31 January 2010, by

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
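
    The two extra actions described in this teaser (retrieving the technical information of the audio/video streams, and extracting a frame for the thumbnail) are the kind of thing typically done with ffprobe and ffmpeg. A minimal Python sketch of that idea, assuming both tools are on the PATH; the file names are hypothetical and this is not SPIPMotion's actual implementation:

    import json
    import subprocess

    def probe_streams(source):
        """Retrieve technical information about the file's audio and video streams."""
        result = subprocess.run(
            ["ffprobe", "-v", "quiet", "-print_format", "json",
             "-show_format", "-show_streams", source],
            capture_output=True, text=True, check=True,
        )
        return json.loads(result.stdout)

    def extract_thumbnail(source, target="thumbnail.jpg", at_seconds=1.0):
        """Generate a thumbnail by extracting a single frame from the source video."""
        subprocess.run(
            ["ffmpeg", "-y", "-ss", str(at_seconds), "-i", source,
             "-frames:v", "1", target],
            check=True,
        )

    info = probe_streams("source.mp4")  # hypothetical "source" document
    print([s["codec_name"] for s in info["streams"]])
    extract_thumbnail("source.mp4")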

On other sites (7429)

  • How to prepare a media stream to play using the dash.js web player?

    7 April 2016, by Paweł Tobiszewski

    I want to stream media from an nginx server to an Android device and play it using a web player embedded in a web page. The player I want to use is dash.js.
    I also play the same media using other methods (MediaPlayer and ExoPlayer) and they work great. But when I try to use dash.js, I run into a problem with the codecs: they are not supported.
    I prepare my streams using ffmpeg and MP4Box, and I have also tried different codecs, such as libx264, x264, x265, always with the same result.
    My source media are video in Y4M format and audio in WAV.
    How should I encode them for use in the dash.js player? (A possible encoding pipeline is sketched after the log below.)

    EDIT:
    I get the error "Video Element Error: MEDIA_ERR_DECODE" while trying to decode the video stream.

    Here is the full log:

    [16] EME detected on this user agent! (ProtectionModel_21Jan2015)
    [19] Playback Initialized
    [28] [dash.js 2.0.0] MediaPlayer has been initialized
    [102] Parsing complete: ( xml2json: 3ms, objectiron: 3ms, total: 0.006s)
    [103] Manifest has been refreshed at Thu Apr 07 2016 22:02:52 GMT+0200 (CEST)[1460059372.696]  
    [107] SegmentTimeline detected using calculated Live Edge Time
    [118] MediaSource is open!
    [118] [object Event]
    [119] Duration successfully set to: 18.58
    [119] Added 0 inline events
    [120] video codec: video/mp4;codecs="avc1.640032"
    [132] Schedule controller stopping for video
    [137] No audio data.
    [137] No text data.
    [137] No fragmentedText data.
    [137] No embeddedText data.
    [138] No muxed data.
    [139] Start Event Controller
    [141] Schedule controller starting for video
    [143] Native video element event: play
    [144] Schedule controller starting for video
    [148] loaded video:InitializationSegment:NaN (200, 0ms, 7ms)
    [149] Initialization finished loading
    [154] Getting the request for video time : 0
    [155] SegmentList: 0 / 18.58
    [164] loaded video:MediaSegment:0 (200, 7ms, 1ms)
    [169] Native video element event: loadedmetadata
    [171] Starting playback at offset: 0
    [175] Got enough buffer to start.
    [175] Buffered Range: 0 - 0.999999
    [179] Requesting seek to time: 0
    [181] Prior to making a request for time, NextFragmentRequestRule is aligning index handler's currentTime with bufferedRange.end. 0  was changed to  0.999999
    [182] Getting the request for video time : 0.999999
    [183] SegmentList: 0 / 18.58
    [183] Getting the next request at index: 1
    [184] SegmentList: 1 / 18.58
    [190] loaded video:MediaSegment:1 (200, 5ms, 0ms)
    [192] Buffered Range: 0 - 0.999999
    [195] Getting the request for video time : 2
    [196] Index for video time 2 is 1
    [197] SegmentList: 1 / 18.58
    [197] Getting the next request at index: 2
    [198] SegmentList: 2 / 18.58
    [205] loaded video:MediaSegment:2 (200, 4ms, 1ms)
    [207] Buffered Range: 0 - 0.999999
    [207] Getting the request for video time : 3
    [208] Index for video time 3 is 2
    [208] SegmentList: 2 / 18.58
    [209] Getting the next request at index: 3
    [209] SegmentList: 3 / 18.58
    [212] Video Element Error: MEDIA_ERR_DECODE
    [212] [object MediaError]
    [215] Schedule controller stopping for video
    [219] Native video element event: pause
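
    For reference, one common way to get from a Y4M/WAV pair to segments that dash.js can decode is to encode with ffmpeg and then segment with MP4Box. The log above advertises the stream as avc1.640032 (H.264 High profile, level 5.0), which many mobile decoders reject, so constraining the profile and level is often the fix. A minimal Python sketch of such a pipeline; the file names, bitrate and profile/level are assumptions, not values taken from the question:

    import subprocess

    SRC_VIDEO = "video.y4m"   # hypothetical source files
    SRC_AUDIO = "audio.wav"

    # Encode to H.264/AAC with a conservative profile/level and yuv420p so the
    # resulting stream stays decodable on mobile devices (avoids High@L5.0).
    subprocess.run([
        "ffmpeg", "-y", "-i", SRC_VIDEO, "-i", SRC_AUDIO,
        "-c:v", "libx264", "-profile:v", "main", "-level", "4.0",
        "-pix_fmt", "yuv420p",
        "-c:a", "aac", "-b:a", "128k",
        "-movflags", "+faststart",
        "encoded.mp4",
    ], check=True)

    # Segment the encoded file for DASH; dash.js is then pointed at stream.mpd.
    subprocess.run([
        "MP4Box", "-dash", "2000", "-rap",
        "-out", "stream.mpd", "encoded.mp4",
    ], check=True)

    Note that on older ffmpeg builds the native aac encoder may additionally require -strict experimental.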
  • Prepare release notes for Libav 10 "Eks"

    18 December 2013, by Anton Khirnov
    Prepare release notes for Libav 10 "Eks"
    

    Additional editing by Diego Biurrun

    • [DBH] doc/RELEASE_NOTES
  • How to show a progress dialog with ffmpeg processing on Android

    19 January 2015, by Wenying Zhu

    I am concatenating two video files into one new file using ffmpeg on Android.
    This takes a long time, so I need to show a progress dialog.
    How can I show a progress bar with ffmpeg on Android?
    I have checked Can ffmpeg show a progress bar ? but I don't understand it well.
    I am a beginner with Android.

    Best regards.
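
    A common way to drive such a progress bar is to read ffmpeg's console output while it runs and parse the time= field against the total duration of the inputs (known in advance, for example via ffprobe). A minimal Python sketch of that parsing logic; on Android the same idea applies, reading the process output on a background thread and posting the percentage to the progress dialog. The concat command and the 120-second total duration below are hypothetical:

    import re
    import subprocess

    TIME_RE = re.compile(r"time=(\d+):(\d+):(\d+(?:\.\d+)?)")

    def run_with_progress(cmd, total_seconds, on_progress):
        """Run an ffmpeg command and report progress parsed from its stderr."""
        proc = subprocess.Popen(cmd, stderr=subprocess.PIPE, text=True)
        for line in proc.stderr:          # ffmpeg writes its status line to stderr
            match = TIME_RE.search(line)
            if match:
                h, m, s = match.groups()
                done = int(h) * 3600 + int(m) * 60 + float(s)
                on_progress(min(100, int(100 * done / total_seconds)))
        return proc.wait()

    # Hypothetical concatenation of two MPEG-TS parts; the total duration would
    # normally be measured from the inputs with ffprobe before starting.
    cmd = ["ffmpeg", "-y", "-i", "concat:part1.ts|part2.ts", "-c", "copy", "out.mp4"]
    run_with_progress(cmd, total_seconds=120.0, on_progress=lambda p: print(f"{p}%"))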