Media (39)

Other articles (111)

  • HTML5 audio and video support

13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are performed in addition to the normal behaviour: retrieval of the technical information about the file's audio and video streams; and generation of a thumbnail by extracting a (...)
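
    As an illustration, here is a minimal sketch of those two extra actions (this is not SPIPMotion's actual code; ffprobe/ffmpeg and the file name source.mp4 are placeholder assumptions):

    import json
    import subprocess as sp

    SOURCE = 'source.mp4'  # hypothetical uploaded "source" document

    # Retrieve the technical information of the file's audio and video streams.
    streams = json.loads(sp.check_output([
        'ffprobe', '-v', 'quiet', '-print_format', 'json',
        '-show_streams', SOURCE,
    ]))['streams']

    # Generate a thumbnail by extracting a single frame.
    sp.check_call(['ffmpeg', '-y', '-ss', '1', '-i', SOURCE,
                   '-vframes', '1', 'vignette.png'])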

On other sites (11184)

  • How can I stream raw video frames AND audio to FFMPEG with Python 2.7?

    18 November 2017, by Just Askin

    I am streaming raw video frames from Pygame to FFmpeg, then sending them to an RTMP stream, but for the life of me, I can't figure out how to send live audio from the same Python module. It does not need to be the Pygame mixer, but I am not opposed to using it if that is where the best answer lies. I'm pretty sure it's not, though.

    My question is this: what is the best strategy for sending live audio output from a program to FFmpeg simultaneously with raw video frames, all from the same Python module?

    My program is large, and eventually I would like to build options to switch audio inputs between a queue of music, a microphone, or any other sounds from any program I want to use. But for the time being, I just want something to work. I am starting off with a simple Espeak command.

    Here is my Python command:

    command = ['ffmpeg', '-re', '-framerate', '22', '-s', '1280x720', '-pix_fmt', 'rgba', '-f', 'rawvideo', '-i', '-', '-f', 's16le', '-ar', '22500', '-i', '/tmp/audio', '-preset', 'ultrafast', '-pix_fmt', 'rgba', '-b:v', '2500', '-s', 'hd720', '-r', '25', '-g', '50', '-crf', '20', '-f', 'flv', 'rtmp://xxx' ]

    pipe = sp.Popen(command, stdin=sp.PIPE)

    Then I send my frames to stdin from within my main while True: loop.
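
    For reference, the per-frame write is roughly this (screen is my Pygame display surface, and pipe is the Popen object above):

    raw = pygame.image.tostring(screen, 'RGBA')  # raw RGBA bytes, matching -pix_fmt rgba
    pipe.stdin.write(raw)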

    The problem I run into with this strategy is that I can't figure out how to shove audio into FFmpeg from within Python without blocking the pipe. After hours of research, I am pretty confident I can't use the same pipe to send the audio along with the frames. I thought a named pipe was my solution (which works when running Espeak outside of Python), but it blocks Python until Espeak is done... so no good.

    I assume I need threading or multiprocessing, but I cannot figure out from the official documentation or any other resources how to solve my problem with it.
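
    Here is a minimal sketch of what I think that threading strategy would look like (get_audio_chunk() is a hypothetical stand-in for whatever produces the raw s16le samples):

    import os
    import threading

    AUDIO_FIFO = '/tmp/audio'

    def feed_audio():
        # Opening a FIFO for writing blocks until FFmpeg opens the read end,
        # so do it in a background thread rather than the render loop.
        with open(AUDIO_FIFO, 'wb') as fifo:
            while True:
                chunk = get_audio_chunk()  # hypothetical s16le source
                if not chunk:
                    break
                fifo.write(chunk)

    if not os.path.exists(AUDIO_FIFO):
        os.mkfifo(AUDIO_FIFO)
    threading.Thread(target=feed_audio).start()
    # The main while True: loop keeps writing raw frames to pipe.stdin.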

    The ['-f', 's16le', '-ar', '22500', '-i', '/tmp/audio'] are settings that work if I run espeak from a separate terminal with espeak 'some text' --stdout > /tmp/audio.

    I am using CentOS 7, Python 2.7, Pygame, and the latest build of FFmpeg.

  • How to use ffmpeg to push my Unity3d process screenshots to an nginx-rtmp server?

    26 March 2019, by Menghui

    I want to broadcast my Unity process screen.

    What I can think of is to take Unity screenshots, pipe the PNGs to an ffmpeg process, and have ffmpeg encode the PNGs to h264 and push the stream to the server.

    How can I use ffmpeg and a named pipe to encode and push the stream?

    1. About PNG to h264: is it possible to push like this? (But I can't play the server's video stream with VLC.)
    .\ffmpeg.exe -framerate 24 -i .\screenshot%03d.png -vcodec libx264 -acodec aac -strict -2 -f h264 rtmp://xxx.xxx.xxx.xxx/live

    (image: png -> rtmp)

    2. About ffmpeg and named pipes: it seems the pipe gets written to, but cannot be read.
    # write
    .\ffmpeg.exe -i test.mp4 -f h264 pipe:MyPipe

    (image: mp4 -> pipe)

    # read and push stream
    .\ffmpeg.exe -re -i pipe:MyPipe -vcodec libx264 -acodec aac -strict -2 -f h264 rtmp://xxx.xxx.xxx.xxx/live

    (image: pipe -> rtmp)

    And the official ffmpeg website does not seem to have any documentation about named pipes.
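
    For reference, here is a minimal, untested sketch of the stdin variant in Python (note -f flv instead of -f h264, since RTMP carries FLV; the file names are placeholders):

    import subprocess as sp

    cmd = ['ffmpeg', '-f', 'image2pipe', '-framerate', '24', '-vcodec', 'png',
           '-i', '-', '-vcodec', 'libx264', '-pix_fmt', 'yuv420p',
           '-f', 'flv', 'rtmp://xxx.xxx.xxx.xxx/live']
    proc = sp.Popen(cmd, stdin=sp.PIPE)

    for name in ('screenshot001.png', 'screenshot002.png'):  # placeholder files
        with open(name, 'rb') as f:
            proc.stdin.write(f.read())

    proc.stdin.close()
    proc.wait()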

    Thank you for your help!

  • avcodec/exr: preserve half-float NaN bits and add fate test

    23 November 2020, by Mark Reid

    Handles NaNs more like the official implementation does, preserving the original bits.
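
    For context, a minimal Python illustration (not the FFmpeg C code) of the half-float NaN encoding involved:

    def is_half_nan(bits):
        # IEEE 754 half precision: NaN = exponent all ones, mantissa non-zero.
        return (bits & 0x7C00) == 0x7C00 and (bits & 0x03FF) != 0

    # 0x7E00 and 0x7D55 are both NaNs but carry different payload bits;
    # copying the raw 16-bit pattern keeps them distinct, whereas a lossy
    # float round-trip may collapse them into one canonical NaN.
    for bits in (0x7E00, 0x7D55, 0x7C00):  # the last one is +infinity, not NaN
        print(hex(bits), is_half_nan(bits))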

    • [DH] libavcodec/exr.c
    • [DH] tests/fate/image.mak
    • [DH] tests/ref/fate/exr-rgb-scanline-zip-half-0x0-0xFFFF