Advanced search

Media (0)

Keyword: - Tags -/interaction

No media matching your criteria is available on the site.

Other articles (102)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Change the publication date

    21 June 2013, by

    How do you change the publication date of a media item?
    You first need to add a "Publication date" field in the appropriate form mask:
    Administer > Form mask configuration > Select "A media item"
    In the "Fields to add" section, check "Publication date"
    Click Save at the bottom of the page

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player used by MediaSPIP was created specifically for it and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)
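
    A rough sketch of that HTML5-first, Flash-fallback approach is shown below; this is not MediaSPIP's actual player code, and the element id, media URL and Flowplayer SWF path are made-up placeholders:

    // Sketch only: prefer the HTML5 <video> element when the browser can decode
    // H.264/MP4, otherwise fall back to a Flash player such as Flowplayer.
    var probe = document.createElement('video');
    var hasHtml5 = !!probe.canPlayType &&
        probe.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"') !== '';

    if (hasHtml5) {
        probe.src = '/IMG/mp4/example.mp4';   // hypothetical media URL
        probe.controls = true;
        document.getElementById('player').appendChild(probe);
    } else if (window.flowplayer) {
        // Classic Flash-era Flowplayer call: flowplayer(containerId, swfUrl, config)
        flowplayer('player', '/flowplayer/flowplayer.swf', {
            clip: { url: '/IMG/mp4/example.mp4' }
        });
    }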

On other sites (12189)

  • lavf: Remove codec_tag from dashenc and smoothstreamingenc

    30 June 2017, by Martin Storsjö

    Currently, the tags enforced and set on the segmenter muxer level
    mismatch what the mp4/ismv muxer uses (since 713efb2c0d013).

    Skip the codec_tag altogether here, to let the user (try to) set
    whichever codec/tag is preferred; the individual chained muxer will
    reject invalid codecs anyway.

    Signed-off-by: Martin Storsjö <martin@martin.st>

    • libavformat/dashenc.c
    • libavformat/smoothstreamingenc.c
  • ffmpeg concat drops audio frames

    5 October 2017, by Shaun

    I have an mp4 file and I want to take two sequential sections of the video out and render them as individual files, later recombining them back into the original video. For instance, with my video video.mp4, I can run

    ffmpeg -i video.mp4 -ss 56 -t 4 out1.mp4
    ffmpeg -i video.mp4 -ss 60 -t 4 out2.mp4

    creating out1.mp4, which contains 00:00:56 to 00:01:00 of video.mp4, and out2.mp4, which contains 00:01:00 to 00:01:04. However, I later want to be able to recombine them quickly (i.e., without re-encoding), so I use the concat demuxer,

    ffmpeg -f concat -safe 0 -i files.txt -c copy concat.mp4

    where files.txt contains

    file out1.mp4
    file out2.mp4

    which in theory should give me back 00:00:56 to 00:01:04 of video.mp4. However, there are always dropped audio frames where the concatenation occurs, creating a very unpleasant sound artifact, an audio blip, if you will.

    [image: missing audio frames]

    I have tried using -async and -af apad when initially creating the two sections of the video, but I am still faced with the same problem and have not found a solution elsewhere. I have experienced this issue in multiple different use cases, so hopefully this simple example will shed some light on the real problem.
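
    For reference, applying those options at the splitting stage would look roughly like the commands below; this is only a guess at how they might be combined (not a confirmed fix), with -async resyncing the audio timestamps and -af apad padding the end of the audio with silence:

    ffmpeg -i video.mp4 -ss 56 -t 4 -async 1 -af apad out1.mp4
    ffmpeg -i video.mp4 -ss 60 -t 4 -async 1 -af apad out2.mp4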

  • displaying a stream of baseline h264 frames in browsers

    6 August 2021, by Thabet Sabha

    So, I have a server that receives a live rtsp stream and then generates baseline h264 frames using ffmpeg, which are then sent via an rtcDataChannel to the browser. While the frames arrive as intended, I can't figure out a way to display them on my html5 videoElement.
    Here is a simplified version of my current approach:

    const remoteStream = new MediaSource();
    myVideoElement.src = window.URL.createObjectURL(remoteStream);

    // called when remoteStream.readyState === "open"
    let sourceBuffer = remoteStream.addSourceBuffer('video/mp4; codecs="avc1.4d002a"');

    // this gets called whenever a new frame is received from the webrtc data channel.
    function onFrame(frame) {
        sourceBuffer.appendBuffer(new Uint8Array(frame));

        /*
        console.log(frame) ==> <Buffer 00 00 01 41 9b a0 22 80 a5 d7 42 ea 34 14 85 ba bc 1b f2 71 0d 8b e1 3c 52 d5 8c ef c1 89 10 c5 05 78 ee 1d 03 8d ... 2896 more bytes>
        */
    }

    ffmpeg options:

    [
        "-rtsp_transport", "tcp",
        "-i", `${rtspCamURL}`,
        "-framerate", "15",
        "-c:v", "libx264",
        "-vprofile", "baseline",
        "-b:v", "600k",
        "-bufsize", "600k",
        "-pix_fmt", "yuv420p",
        "-tune", "zerolatency",
        "-preset", "ultrafast",
        "-f", "rawvideo",
        "-"
    ];

    The ffmpeg stream is then split on the NAL delimiter (to generate individual frames), and each frame is sent via the data channel like so:
    Buffer.concat([nalDelimiter, frame]).
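
    As a rough Node.js sketch of that splitting step (not the poster's actual code: ffmpegProcess and dataChannel are hypothetical placeholders, and the 4-byte 0x00000001 start code is an assumption, since some encoders emit the 3-byte 0x000001 form):

    const NAL_DELIMITER = Buffer.from([0, 0, 0, 1]); // assumed Annex B start code

    let pending = Buffer.alloc(0);

    ffmpegProcess.stdout.on('data', (chunk) => {
        pending = Buffer.concat([pending, chunk]);

        // Forward every complete NAL unit found between two start codes,
        // keeping the (possibly incomplete) tail for the next chunk.
        let start = pending.indexOf(NAL_DELIMITER);
        let next;
        while (start !== -1 &&
               (next = pending.indexOf(NAL_DELIMITER, start + NAL_DELIMITER.length)) !== -1) {
            const frame = pending.slice(start + NAL_DELIMITER.length, next);
            dataChannel.send(Buffer.concat([NAL_DELIMITER, frame])); // as described above
            start = next;
        }
        pending = start === -1 ? pending : pending.slice(start);
    });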

    I am not sure if I'm missing something, as I'm not getting any helpful errors because the MediaSource closes as soon as the first frame arrives, for some reason.

    Or does MediaSource just not support raw h264 frames? If so, is there a workaround to solve this issue (even if it involves changing the ffmpeg params)?
