
Other articles (83)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
    To do so, simply activate the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (14461)

  • avcodec/exr : preserve half-float NaN bits and add fate test

    23 November 2020, by Mark Reid

    Handles NaNs more like the official implementation does, preserving the
    original bits (a rough Python sketch of the idea follows the file list below).

    • [DH] libavcodec/exr.c
    • [DH] tests/fate/image.mak
    • [DH] tests/ref/fate/exr-rgb-scanline-zip-half-0x0-0xFFFF
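
    As a rough illustration of the idea only (the real change is in the C code of
    libavcodec/exr.c, and the function name below is hypothetical), a half-float
    NaN can be widened to single precision by expanding its bit pattern by hand
    rather than converting the value, which keeps the sign and mantissa payload:

    import numpy as np

    def half_bits_to_float_bits(h16):
        # Expand a 16-bit half-float bit pattern into a 32-bit float bit pattern.
        sign = (h16 >> 15) & 0x1
        exp = (h16 >> 10) & 0x1F
        mant = h16 & 0x3FF
        if exp == 0x1F and mant:
            # NaN: copy the payload bits instead of producing a canonical quiet NaN
            return (sign << 31) | (0xFF << 23) | (mant << 13)
        # any other value: ordinary half -> single widening
        f32 = np.array([h16], dtype=np.uint16).view(np.float16).astype(np.float32)
        return int(f32.view(np.uint32)[0])

    # e.g. hex(half_bits_to_float_bits(0x7E2A)) == '0x7fc54000' (payload kept)
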
  • How to use ffmpeg to push my Unity3d process screenshots to an nginx-rtmp server?

    26 March 2019, by Menghui

    I want to broadcast my Unity process screen.

    What I can think of is to take Unity screenshots, pipe the PNGs to an ffmpeg process, and have ffmpeg encode them to h264 and push the stream to the server.

    How can I use ffmpeg and a named pipe to encode and push the stream?

    1. About PNG to h264: is it possible to push the stream like this? (But I can't play the server's video stream with VLC.)
    .\ffmpeg.exe -framerate 24 -i .\screenshot%03d.png -vcodec libx264 -acodec aac -strict -2 -f h264 rtmp://xxx.xxx.xxx.xxx/live

    [image: png -> rtmp]

    2. About ffmpeg and named pipes: the pipe seems to get written to, but cannot be read.
    # write
    .\ffmpeg.exe -i test.mp4 -f h264 pipe:MyPipe

    [image: mp4 -> pipe]

    # read and push stream
    .\ffmpeg.exe -re -i pipe:MyPipe -vcodec libx264 -acodec aac -strict -2 -f h264 rtmp://xxx.xxx.xxx.xxx/live

    [image: pipe -> rtmp]

    And the official ffmpeg website does not seem to have any documentation about named pipes. (A rough sketch of feeding the images over stdin instead follows below.)

    Thank you for your help!
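
    One minimal way to sketch the first idea is to feed the PNG screenshots to
    ffmpeg's standard input and let ffmpeg mux to flv (RTMP normally expects an
    flv-muxed stream, which may be why the raw -f h264 attempt could not be
    played). The file names, frame rate and pixel format below are placeholder
    assumptions, not a confirmed answer.

    import subprocess

    # Hypothetical sketch: write PNG frames to ffmpeg's stdin instead of a named
    # pipe, and mux the encoded h264 into flv for the RTMP server.
    cmd = [
        'ffmpeg',
        '-f', 'image2pipe',        # a sequence of images arrives on a pipe
        '-framerate', '24',
        '-c:v', 'png',             # tell the demuxer the piped images are PNG
        '-i', 'pipe:0',            # pipe:0 = standard input
        '-c:v', 'libx264',
        '-pix_fmt', 'yuv420p',
        '-f', 'flv',
        'rtmp://xxx.xxx.xxx.xxx/live',
    ]
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)

    # Placeholder frame files; in Unity the same bytes would come from the
    # screenshot capture instead of disk.
    for path in ['screenshot001.png', 'screenshot002.png']:
        with open(path, 'rb') as f:
            proc.stdin.write(f.read())

    proc.stdin.close()
    proc.wait()

    Writing to the process's standard input also sidesteps the Windows named-pipe
    naming scheme (\\.\pipe\Name) entirely; the same argument list works with
    .\ffmpeg.exe on Windows.
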

  • How can I stream raw video frames AND audio to FFMPEG with Python 2.7?

    18 November 2017, by Just Askin

    I am streaming raw video frames from Pygame to FFMPEG and then sending them to an rtmp stream, but for the life of me, I can't figure out how to send live audio using the same Python module. It does not need to be the Pygame mixer, but I am not opposed to using it if that is where the best answer lies. I'm pretty sure it's not, though.

    My question is this: what is the best strategy to send live audio output from a program to FFMPEG, along with raw video frames, simultaneously from the same Python module?

    My program is large, and eventually I would like to build options to switch audio inputs from a queue of music, a microphone, or any other random sounds from any program I want to use. But for the time being, I just want something to work. I am starting off with a simple Espeak command.

    Here is my Python command:

    command = ['ffmpeg', '-re', '-framerate', '22', '-s', '1280x720', '-pix_fmt', 'rgba', '-f', 'rawvideo', '-i', '-', '-f', 's16le', '-ar', '22500', '-i', '/tmp/audio', '-preset', 'ultrafast', '-pix_fmt', 'rgba', '-b:v', '2500', '-s', 'hd720', '-r', '25', '-g', '50', '-crf', '20', '-f', 'flv', 'rtmp://xxx' ]

    pipe = sp.Popen(command, stdin=sp.PIPE)

    Then I send my frames to stdin from within my main while True: loop.

    The problem I run into with this strategy is that I can't figure out how to shove audio into FFMPEG from within Python without blocking the pipe. After hours of research, I am pretty confident I can't use the same pipe to send the audio along with the frames. I thought a named pipe was my solution (which works when running Espeak outside of Python), but it blocks Python until Espeak is done... so no good.

    I assume I need threading or multiprocessing, but I cannot figure out from the official documentation or any other resources how to solve my problem with it (a rough threaded sketch follows at the end of this question).

    The ['-f', 's16le', '-ar', '22500', '-i', '/tmp/audio'] settings work if I run espeak from a separate terminal with espeak 'some text' --stdout > /tmp/audio.

    I am using CentOS 7, Python 2.7, Pygame, the latest build of FFMPEG,
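
    A minimal sketch of one way to do this, assuming a POSIX FIFO at /tmp/audio
    and a simplified version of the ffmpeg argument list above (the thread
    structure, chunk size and output settings are assumptions, not a confirmed
    solution): keep the raw frames on ffmpeg's stdin and copy Espeak's output
    into the named pipe from a background thread, so the blocking open and
    writes on the FIFO never stall the main loop.

    import os
    import subprocess
    import threading

    AUDIO_FIFO = '/tmp/audio'            # the named pipe ffmpeg reads via -i /tmp/audio
    if not os.path.exists(AUDIO_FIFO):
        os.mkfifo(AUDIO_FIFO)

    command = [
        'ffmpeg', '-re',
        '-f', 'rawvideo', '-pix_fmt', 'rgba', '-s', '1280x720', '-framerate', '22', '-i', '-',
        '-f', 's16le', '-ar', '22500', '-i', AUDIO_FIFO,
        '-c:v', 'libx264', '-preset', 'ultrafast', '-pix_fmt', 'yuv420p',
        '-c:a', 'aac', '-strict', '-2',
        '-f', 'flv', 'rtmp://xxx',
    ]

    def feed_audio():
        # Copy espeak's --stdout output into the FIFO in chunks, mirroring the
        # "espeak 'some text' --stdout > /tmp/audio" command from the question.
        speak = subprocess.Popen(['espeak', 'some text', '--stdout'],
                                 stdout=subprocess.PIPE)
        fifo = open(AUDIO_FIFO, 'wb')    # blocks only this thread until ffmpeg opens the FIFO
        while True:
            chunk = speak.stdout.read(4096)
            if not chunk:
                break
            fifo.write(chunk)
        fifo.close()

    audio_thread = threading.Thread(target=feed_audio)
    audio_thread.daemon = True           # never keeps the program alive on exit
    audio_thread.start()

    pipe = subprocess.Popen(command, stdin=subprocess.PIPE)
    # The main loop stays as before: pipe.stdin.write(raw_rgba_frame_bytes) per frame.

    Swapping the audio source later (a music queue, a microphone) only means
    changing the Popen inside feed_audio(); the FIFO and the video pipe stay
    the same.
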