Advanced search

Media (0)

Word: - Tags -/organisation

No media matching your criteria is available on the site.

Other articles (100)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. To help us fix it, please provide the following information:
    • the browser you are using, including its exact version
    • as precise an explanation of the problem as possible
    • if possible, the steps that led to the problem
    • a link to the site / page in question
    If you think you have fixed the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

  • Enabling visitor registration

    12 April 2011, by

    It is also possible to enable visitor registration, which lets anyone open an account on the channel in question themselves, for example as part of open projects.
    To do so, simply go to the site's configuration area and choose the "Gestion des utilisateurs" (user management) submenu. The first form shown corresponds to this feature.
    By default, during its initialisation MediaSPIP created a menu item in the top menu of the page leading (...)

On other sites (12395)

  • avcodec/exr: preserve half-float NaN bits and add fate test

    23 November 2020, by Mark Reid

    Handles NaNs more like the official implementation handles them, preserving
    the original bits.

    • [DH] libavcodec/exr.c
    • [DH] tests/fate/image.mak
    • [DH] tests/ref/fate/exr-rgb-scanline-zip-half-0x0-0xFFFF
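
    To illustrate what "preserving half-float NaN bits" means, here is a hedged Python sketch of the general IEEE bit layout, not the actual libavcodec/exr.c code: when a 16-bit half float is widened to a 32-bit float, the NaN's 10 mantissa bits can be copied into the top of the 23-bit mantissa instead of being replaced by a default quiet NaN, so the original payload survives.

    import struct

    def half_bits_to_float(h):
        """Widen a 16-bit half-float bit pattern to a Python float,
        keeping NaN payload bits instead of collapsing them to a default NaN."""
        sign = (h >> 15) & 0x1
        exp = (h >> 10) & 0x1f
        mant = h & 0x3ff
        if exp == 0x1f:
            # Inf / NaN: copy the mantissa bits verbatim so the NaN payload survives
            bits32 = (sign << 31) | (0xff << 23) | (mant << 13)
        elif exp == 0:
            if mant == 0:
                bits32 = sign << 31  # signed zero
            else:
                # subnormal half: renormalise into a normal float32
                while not (mant & 0x400):
                    mant <<= 1
                    exp -= 1
                mant &= 0x3ff
                bits32 = (sign << 31) | ((exp + 113) << 23) | (mant << 13)
        else:
            # normal number: rebias the exponent (15 -> 127)
            bits32 = (sign << 31) | ((exp + 112) << 23) | (mant << 13)
        return struct.unpack('<f', struct.pack('<I', bits32))[0]

    # e.g. 0x7e01 is a half-float NaN with a non-default payload; its payload
    # bits end up in the float32 mantissa rather than being discarded.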
  • How to use ffmpeg to push my Unity3d process screenshot to nginx-rtmp server?

    26 March 2019, by Menghui

    I want to broadcast my Unity process screen.

    What I can think of is to take Unity screenshots, pipe the PNGs to an ffmpeg process, have ffmpeg encode them to H.264, and push the stream to the server.

    How do I use ffmpeg and a named pipe to encode and push the stream?

    1. About png to h264: is it possible to push like this? (But I can't play the server's video stream with VLC.)
    .\ffmpeg.exe -framerate 24 -i .\screenshot%03d.png -vcodec libx264 -acodec aac -strict -2 -f h264 rtmp://xxx.xxx.xxx.xxx/live

    [screenshot: png -> rtmp]

    2. About ffmpeg and named pipes: the pipe seems to get written to, but it cannot be read.
    # write
    .\ffmpeg.exe -i test.mp4 -f h264 pipe:MyPipe

    [screenshot: mp4 -> pipe]

    # read and push stream
    .\ffmpeg.exe -re -i pipe:MyPipe -vcodec libx264 -acodec aac -strict -2 -f h264 rtmp://xxx.xxx.xxx.xxx/live

    [screenshot: pipe -> rtmp]

    And the official ffmpeg website does not seem to have any introduction to named pipes.

    Thank you for your help!
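
    A hedged sketch of the stdin-pipe approach (not a verified answer): ffmpeg can read a stream of PNG images from standard input with the image2pipe demuxer, and RTMP servers generally expect an FLV container, so -f flv rather than -f h264 on the output is usually what makes the stream playable. The RTMP URL and the grab_png_bytes() helper below are placeholders, not real endpoints or APIs.

    import subprocess as sp

    RTMP_URL = "rtmp://localhost/live/stream"  # placeholder endpoint

    # ffmpeg reads PNG images from stdin (image2pipe) and pushes H.264 in an
    # FLV container to the RTMP server.
    cmd = [
        "ffmpeg",
        "-f", "image2pipe", "-c:v", "png", "-framerate", "24", "-i", "-",
        "-c:v", "libx264", "-preset", "veryfast", "-pix_fmt", "yuv420p",
        "-f", "flv", RTMP_URL,
    ]
    proc = sp.Popen(cmd, stdin=sp.PIPE)

    def grab_png_bytes():
        """Placeholder for the Unity side: return one screenshot as PNG bytes."""
        raise NotImplementedError

    try:
        while True:
            proc.stdin.write(grab_png_bytes())  # feed one PNG per frame
    finally:
        proc.stdin.close()
        proc.wait()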

  • How can I stream raw video frames AND audio to FFMPEG with Python 2.7?

    18 November 2017, by Just Askin

    I am streaming raw video frames from Pygame to FFMPEG, then sending them to an RTMP stream, but for the life of me I can't figure out how to send live audio using the same Python module. It does not need to be the Pygame mixer, but I am not opposed to using it if that is where the best answer lies. I'm pretty sure it's not, though.

    My question is this: what is the best strategy to send live audio output from a program to FFMPEG, along with raw video frames, simultaneously from the same Python module?

    My program is large, and eventually I would like to build options to switch audio inputs from a queue of music, a microphone, or any other random sounds from any program I want to use. But for the time being, I just want something to work. I am starting off with a simple Espeak command.

    Here is my Python command:

    command = ['ffmpeg', '-re', '-framerate', '22', '-s', '1280x720', '-pix_fmt', 'rgba', '-f', 'rawvideo', '-i', '-', '-f', 's16le', '-ar', '22500', '-i', '/tmp/audio', '-preset', 'ultrafast', '-pix_fmt', 'rgba', '-b:v', '2500', '-s', 'hd720', '-r', '25', '-g', '50', '-crf', '20', '-f', 'flv', 'rtmp://xxx' ]

    pipe = sp.Popen(command, stdin=sp.PIPE)

    Then I send my frames to stdin from within my main while True: loop.

    The problem I run into with this strategy is I can’t figure out how to shove audio into FFMPEG from within Python without blocking the pipe. After hours of research, I am pretty confident I can’t use the pipe to send the audio along with the frames. I thought the named pipe was my solution (which works running Espeak outside of Python), but it blocks Python until the Espeak is done... so no good.

    I assume I need threading or multiprocessing, but I cannot figure out from the official documentation or any other resources how to solve my problem with it.

    The ['-f', 's16le', '-ar', '22500', '-i', '/tmp/audio'] are settings that work if I run espeak from a separate terminal with espeak 'some text' --stdout > /tmp/audio.

    I am using Centos 7, Python 2.7, pygame, the latest build of FFMPEG,
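
    One hedged pattern for this (a sketch, not a tested solution): keep the raw frames on ffmpeg's stdin exactly as above, give ffmpeg a named pipe (FIFO) as its second, audio input, and feed that FIFO from a background thread so the blocking FIFO open and espeak call never stall the main frame loop. The RTMP URL, the feed_audio() helper, and the all-zero stand-in frame below are placeholders; the audio flags mirror the ones from the command in the question.

    import os
    import subprocess as sp
    import threading

    AUDIO_FIFO = '/tmp/audio'
    if not os.path.exists(AUDIO_FIFO):
        os.mkfifo(AUDIO_FIFO)  # named pipe that ffmpeg will read audio from

    # Raw RGBA frames arrive on stdin, signed 16-bit PCM on the named pipe.
    command = ['ffmpeg', '-re',
               '-f', 'rawvideo', '-pix_fmt', 'rgba', '-s', '1280x720',
               '-framerate', '22', '-i', '-',
               '-f', 's16le', '-ar', '22500', '-i', AUDIO_FIFO,
               '-c:v', 'libx264', '-preset', 'ultrafast', '-pix_fmt', 'yuv420p',
               '-c:a', 'aac', '-f', 'flv', 'rtmp://example.invalid/live/stream']
    pipe = sp.Popen(command, stdin=sp.PIPE)

    def feed_audio():
        # Runs in its own thread: the FIFO open and the espeak call block
        # here instead of blocking the video loop.
        sp.call("espeak 'some text' --stdout > " + AUDIO_FIFO, shell=True)

    audio_thread = threading.Thread(target=feed_audio)
    audio_thread.daemon = True
    audio_thread.start()

    while True:  # main loop: keep pushing raw frames (all-zero frame as a stand-in)
        pipe.stdin.write(bytearray(1280 * 720 * 4))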