
Other articles (30)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used instead.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both to conventional computers (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are performed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
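
    For illustration only, the two extra actions described above (probing the audio/video streams and extracting a thumbnail) correspond to standard ffprobe/ffmpeg invocations along these lines; the exact tooling, options, and file names SPIPMotion actually uses are not shown in this excerpt, so treat these as assumptions:

    # Retrieve technical information about the file's audio and video streams
    ffprobe -v error -show_format -show_streams source-video.mp4

    # Generate a thumbnail by extracting a single frame (here at the 5-second mark)
    ffmpeg -ss 5 -i source-video.mp4 -frames:v 1 thumbnail.jpg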

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)
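
    As a purely illustrative sketch (not MediaSPIP's actual template markup; file names and dimensions are made up), the mechanism described above amounts to an HTML5 video element whose fallback content is a Flash object for browsers without HTML5 support:

    <video controls width="640" height="360" poster="preview.jpg">
      <source src="document.mp4" type="video/mp4" />
      <source src="document.webm" type="video/webm" />
      <!-- Browsers without HTML5 video support render the fallback content below;
           MediaSPIP loads Flowplayer at this point. -->
      <object type="application/x-shockwave-flash" data="flowplayer.swf" width="640" height="360">
        <param name="movie" value="flowplayer.swf" />
      </object>
    </video>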

On other sites (4355)

  • FFmpeg - Generate VTT File From Sprite, Using Spatial Media Fragment

    20 October 2019, by David

    Hi, I am looking to create a .VTT file from a sprite that I have generated using FFmpeg.

    FFmpeg command:

    $"-i {inputMediaFile} -vf \"select = not(mod(n\\, 30)),scale = 120:80,tile = 7x7\" -an -vsync 0 {outputMediaFile}"

    This selects every 30th frame, scales it down to 120x80 pixels, and tiles the frames into an 8x8 grid in the output image.

    I would like to build the .VTT from the generated image in C#. I know the height and width of each individual image in the sprite (120x80), and there are 64 images in total in the output image.

    From this I need to produce a VTT like this:

    WEBVTT

    1
    00:00:00.000 --> 00:00:01.000
    test-00001.jpg#xywh=0,0,120,80

    2
    00:00:01.000 --> 00:00:02.000
    test-00001.jpg#xywh=120,0,120,80

    3
    00:00:02.000 --> 00:00:03.000
    test-00001.jpg#xywh=240,0,120,80

    4
    00:00:03.000 --> 00:00:04.000
    test-00001.jpg#xywh=360,0,120,80

    5
    00:00:04.000 --> 00:00:05.000
    test-00001.jpg#xywh=480,0,120,80

    6
    00:00:05.000 --> 00:00:06.000
    test-00001.jpg#xywh=600,0,120,80

    7
    00:00:06.000 --> 00:00:07.000
    test-00001.jpg#xywh=720,0,120,80

    8
    00:00:07.000 --> 00:00:08.000
    test-00001.jpg#xywh=840,0,120,80

    9
    00:00:08.000 --> 00:00:09.000
    test-00001.jpg#xywh=0,80,120,80

    There are also situations where there are n sprite files.

    I'm hoping there may be a library out there that can handle this, or better still that I can keep it contained within FFmpeg, although based on the FFmpeg docs I don't think that is possible.

    Thanks in advance if anyone has any ideas; it's doable, as I've seen Node.js and Ruby examples.
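
    A minimal C# sketch of the VTT generation, assuming the grid and tile size above (8x8 tiles of 120x80), one-second cues as in the sample, a hypothetical test-0000N.jpg naming scheme for the sprite files, and a made-up total thumbnail count; spill-over into the next sprite file is handled by integer division:

    using System;
    using System.IO;
    using System.Text;

    class SpriteVttGenerator
    {
        static void Main()
        {
            const int tileWidth = 120, tileHeight = 80;        // size of one thumbnail in the sprite
            const int columns = 8, rows = 8;                   // 8x8 grid => 64 tiles per sprite image
            const int totalThumbnails = 150;                   // assumption: number of frames ffmpeg extracted
            const double cueSeconds = 1.0;                     // the sample VTT uses 1 s per cue
            const string spritePattern = "test-{0:00000}.jpg"; // assumption: sprite file naming

            var vtt = new StringBuilder();
            vtt.AppendLine("WEBVTT");
            vtt.AppendLine();

            for (int i = 0; i < totalThumbnails; i++)
            {
                int tilesPerSprite = columns * rows;
                int spriteIndex = i / tilesPerSprite + 1;   // which sprite file (1-based)
                int indexInSprite = i % tilesPerSprite;     // position inside that sprite
                int x = (indexInSprite % columns) * tileWidth;
                int y = (indexInSprite / columns) * tileHeight;

                TimeSpan start = TimeSpan.FromSeconds(i * cueSeconds);
                TimeSpan end = TimeSpan.FromSeconds((i + 1) * cueSeconds);

                vtt.AppendLine((i + 1).ToString());
                vtt.AppendLine($"{start:hh\\:mm\\:ss\\.fff} --> {end:hh\\:mm\\:ss\\.fff}");
                vtt.AppendLine(string.Format(spritePattern, spriteIndex) + $"#xywh={x},{y},{tileWidth},{tileHeight}");
                vtt.AppendLine();
            }

            File.WriteAllText("thumbnails.vtt", vtt.ToString());
        }
    }

    In practice cueSeconds should be 30 divided by the source frame rate, so that the cues line up with the frames actually selected by mod(n,30).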

  • How to fade video using a custom curve function in ffmpeg?

    2 April 2018, by Zoltan

    I would like to fade out a video using a different transition curve than what the fade filter uses (which is probably linear).

    For audio, I can easily choose from a wide variety of curves that the afade filter provides, or I can supply a custom expression to the volume filter, like (sin(PI/2 * min(1\, max(-1\, 1/2 * (t - 3)))) + 1)/2. For video, however, I could not find similar possibilities.

    Based on its description, it seems to me that the geq filter could be misused to achieve this, but I couldn’t get it to work. Additionally, it is very slow, probably because it works on individual pixels, even though the expression only depends on time.
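
    For what it's worth, one form of the geq route that should produce the intended result (a sketch, assuming an 8-bit YUV input, a fade-out starting at t=3 lasting 2 seconds, and a cosine ease; it is indeed per-pixel and therefore slow):

    ffmpeg -i input.mp4 -vf "geq=lum='lum(X,Y)*(1+cos(PI*clip((T-3)/2,0,1)))/2':cb='128+(cb(X,Y)-128)*(1+cos(PI*clip((T-3)/2,0,1)))/2':cr='128+(cr(X,Y)-128)*(1+cos(PI*clip((T-3)/2,0,1)))/2'" -c:a copy output.mp4

    The luma is multiplied by the easing factor while the chroma planes are pulled towards the 128 midpoint, which is what a fade to black amounts to in YUV; any other curve of T can be substituted for the cosine term.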

  • Splitting odd and even frames in GStreamer

    4 April 2014, by user3498379

    I am building an application whereby I need to split a video stream (RTSP/MJPEG) into individual frames. The extracted frames then need to be put into two named pipes, alternating between them: even frames go to pipe one and odd frames go to pipe two. I have been able to achieve this with ffmpeg using the following command:

    ffmpeg -i "rtsp://<ipaddress>/axis-media/media.amp?videocodec=jpeg" -vf select="mod(n-1\,2)" -vcodec mjpeg -f avi -y  -vf select="not(mod(n-1\,2))" -vcodec mjpeg -f avi -y

    However, I have run into an RTP packet size issue with ffmpeg when using large resolutions, e.g. 2048x1536. My question is: is there a GStreamer equivalent command? I have the basic GStreamer command which extracts images:

    gst-launch rtspsrc location=rtsp://<ipaddress>/axis-media/media.amp?videocodec=jpeg ! decodebin2 ! jpegenc ! multifilesink location="frame%d.jpg"

    But now I need the additional piece; any help will be much appreciated.
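
    For what it's worth, the fan-out into two named pipes can be expressed with stock elements in the same gst-launch syntax as above; the two pipe paths are placeholders:

    gst-launch rtspsrc location=rtsp://<ipaddress>/axis-media/media.amp?videocodec=jpeg ! decodebin2 ! jpegenc ! tee name=t \
        t. ! queue ! filesink location=/path/to/pipe_even \
        t. ! queue ! filesink location=/path/to/pipe_odd

    Note that tee only copies every frame into both branches; as far as I know there is no stock element that routes alternate frames to different outputs, so the odd/even split itself would still have to be done in application code, for example with a pad probe or appsink on each branch that drops every second buffer.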