
Other articles (59)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player used has been created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both to conventional computers (...)

  • From upload to the final video [standalone version]

    31 January 2010, by

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are carried out in addition to the normal behaviour: retrieval of the technical information about the file's audio and video streams; generation of a thumbnail by extracting a (...)
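
    As a rough illustration of what those two extra actions amount to, a minimal sketch with standalone ffprobe/ffmpeg commands (file names are placeholders; this is not necessarily how SPIPMotion itself invokes the tools):

    # Retrieve the technical information about the audio and video streams
    ffprobe -v error -show_streams source.mp4
    # Generate a thumbnail by extracting a single frame from the source video
    ffmpeg -ss 5 -i source.mp4 -frames:v 1 vignette.jpg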

On other sites (7233)

  • avcodec/vc2enc: Don't use bitcount when byte-aligned

    6 October 2022, by Andreas Rheinhardt
    avcodec/vc2enc: Don't use bitcount when byte-aligned
    

    (There is a small issue that is now being treated differently:
    The earlier code would record a position in a buffer that
    is being written to via put_bits(), then write data,
    then overwrite the byte at the position recorded earlier
    and only then flush the PutBitContext. In case there was
    no writeout in the meantime, said flush would overwrite
    what one has just written. This never happened in my tests,
    but maybe it can happen. In this case this commit fixes
    this issue by flushing before overwriting the old data.)

    Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@outlook.com>

    • [DH] libavcodec/vc2enc.c
  • init seg_duration option for dash similar to -hls_init_time option

    5 August 2019, by Saurabh

    Is there any option to set the first segment duration for DASH files, similar to the hls_init_time option for HLS?

    I could not find any such option in the documentation.


    Why I want such an option:

    With the generated DASH files, Android ExoPlayer always plays the first chunk at the lowest quality (bitrate/resolution); after the first chunk, it picks the next chunk based on the user's internet speed. Without an init_seg_duration option, I have to rely only on seg_duration, and I cannot make it very large. The reason I would like to keep it large: to reduce the number of files generated.

    With such an option, I could have passed -init_seg_duration 2 -seg_duration 6, so the video quality would have increased after 2 seconds and the total number of files generated would not have exploded either.
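
    For reference, the segment length is controlled by the dash muxer's seg_duration option; a minimal sketch of its use (placeholder file names and codec choices; -init_seg_duration above remains the hypothetical option asked about and is not an existing ffmpeg flag):

    ffmpeg -i input.mp4 -map 0 -c:v libx264 -c:a aac -f dash -seg_duration 6 out.mpd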

  • Adaptive bit rate streaming of mp4 of different gop size using media source api

    25 December 2018, by wahab khurram

    I want adaptive bitrate streaming of MP4 videos that have different GOP/keyframe sizes.

    I know there are a couple of options for multi-bitrate streaming, e.g. HLS, DASH, etc.

    But I have already uploaded the videos to the server; each video has 360p, 480p and 720p MP4 files, and each video has different keyframe intervals.

    So the real challenge is to build my own multi-bitrate MP4 media player using the Media Source API.

    I have been brainstorming all aspects of this.

    We can only cut H.264 at a keyframe.

    So my real challenge is to know, for each keyframe of the video, the keyframe chunk duration, the offset duration, and the offset byte position in the MP4 file.

    So my question is how I can get the following using ffmpeg, ffprobe, or any other software.

    1- Keyframe chunk duration

    2- Offset video duration

    3- Offset byte position in video.

    The following ffprobe command gives detailed info about each keyframe; maybe this will help:

    ffprobe -i "1080p.mp4" -select_streams v -skip_frame nokey -show_frames

    Thanks!
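
    Building on that command, a sketch that lists only the keyframe packets with their timestamps and byte offsets (packet-level, so no decoding is needed; exact field names can vary slightly between ffprobe versions):

    ffprobe -v error -select_streams v:0 -show_entries packet=pts_time,pos,flags -of csv=p=0 "1080p.mp4" | grep K

    Each matching line gives the keyframe timestamp (pts_time) and its byte position in the file (pos); the keyframe chunk duration is the difference between two consecutive keyframe timestamps.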