Media (91)

Other articles (105)

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed to retrieve the data needed for search-engine indexing, and is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
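
    These conversions are typically driven by FFmpeg behind the scenes. As a rough illustration only (a minimal sketch, not MediaSPIP's actual pipeline; filenames and codec settings below are hypothetical), an equivalent batch of conversions could look like this:

    import subprocess

    src = "upload.mov"  # hypothetical uploaded file
    # HTML5-friendly outputs (WebM, Ogv) plus an MP4 for the Flash fallback
    jobs = [
        ["ffmpeg", "-y", "-i", src, "-c:v", "libvpx", "-c:a", "libvorbis", "out.webm"],
        ["ffmpeg", "-y", "-i", src, "-c:v", "libtheora", "-c:a", "libvorbis", "out.ogv"],
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-c:a", "aac", "out.mp4"],
    ]
    for cmd in jobs:
        subprocess.run(cmd, check=True)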

  • Adding user-specific information and other author-related behaviour changes

    12 April 2011, by

    The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you modify certain user-related behaviours (refer to its documentation for more information).
    It is also possible to add fields to authors by installing the champs extras 2 and Interface pour champs extras plugins.

  • Managing the farm

    2 March 2010, by

    The farm as a whole is managed by "super admins".
    Certain settings can be adjusted to accommodate the needs of the different channels.
    To begin with, it uses the "Gestion de mutualisation" plugin.

On other websites (10539)

  • How can I open a program using Python with multiprocessing, and send it strings from the main process?

    12 February 2018, by Just Askin

    I have a program that sends frames as strings to FFMPEG using something similar to:

    Working script that currently streams on Ubuntu, without using the multiprocessing module:

    #!/usr/bin/python
    import sys, os
    import subprocess as sp
    import pygame
    from pygame.locals import QUIT, KEYUP, K_ESCAPE
    import pygame.display

    pygame.init()
    os.environ['SDL_VIDEODRIVER'] = 'dummy'
    pygame.display.init()
    Display_Surface = pygame.display.set_mode([1280,720], 0, 32)

    # FFMPEG command and settings
    command = ['ffmpeg', '-framerate', '25', '-s', '1280x720', '-pix_fmt', 'rgba', '-f', 'rawvideo', '-i', '-',
              '-f', 'lavfi', '-i', 'anullsrc=cl=mono',
              '-pix_fmt', 'yuv420p','-s', 'hd720', '-r', '25', '-g', '50',
              '-f', 'flv', 'rtmp://a.rtmp.youtube.com/live2/xxxx-xxxx-xxxx-xxxx']

    pipe = sp.Popen(command, bufsize=0, stdin=sp.PIPE)

    while True:
       # Quit event handling
       for event in pygame.event.get():
           if event.type == QUIT or (event.type == KEYUP and event.key == K_ESCAPE):
               pygame.quit()
               sys.exit()

       pipe.stdin.write(pygame.image.tostring(Display_Surface, "RGBA"))

    pipe.stdin.close()
    pygame.display.quit()
    os._exit(0)

    This works fine, except for the fact that it is killing my CPU, which in turn causes my live stream to freeze often. The stupid GIL won’t let FFMPEG run on another CPU/Core while I have 3 perfectly good cores doing nothing.

    I just whipped up some code to open FFMPEG in another process. (By the way, I’m familiar with threading.Thread, but not Multiprocessing).

    import os
    import subprocess as sp
    import multiprocessing

    class FFMPEG_Consumer():

       def __init__(self):
           proc = multiprocessing.Process(target=self.start_ffmpeg)
           proc.start()

       def start_ffmpeg(self):
           command = ['ffmpeg','-pix_fmt', 'rgba', '-f', 'rawvideo', '-i', '-',
                       '-f', 'lavfi', '-i', 'anullsrc=channel_layout=stereo:sample_rate=44100',
                      '-pix_fmt', 'yuv420p','-s', 'hd720', '-f', 'flv', 'rtmp://example.com']

           pipe = sp.Popen(command, bufsize=-1, stdin=sp.PIPE)

       def send_down_the_pipe(self, frame):
           pipe.stdin.write(frame)

    ffmpeg = FFMPEG_Consumer()

    For anyone who knows how to use multiprocessing, I'm sure you will immediately see that this does not work, because I can't share variables this way across processes. But it does open FFMPEG on another core.

    Most online tutorials and resources focus on creating pools of workers and queues that feed those workers something to process until a job is finished. I, however, am trying to send a new string to FFMPEG on every iteration.

    How can I pipe my string to that process/instance of FFMPEG?

    Or is what I'm trying to do not possible?
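
    For reference, the pool/queue pattern those tutorials describe can also be adapted to a streaming loop. Here is a minimal sketch (the FFmpeg settings and stream URL are hypothetical): the main process pushes each frame string into a multiprocessing.Queue, and a worker process owns the FFmpeg pipe and drains the queue:

    import multiprocessing
    import subprocess as sp

    def ffmpeg_worker(frame_queue):
        # The worker owns the FFmpeg process and writes every queued frame to its stdin.
        command = ['ffmpeg', '-framerate', '25', '-s', '1280x720', '-pix_fmt', 'rgba',
                   '-f', 'rawvideo', '-i', '-',
                   '-pix_fmt', 'yuv420p', '-f', 'flv', 'rtmp://example.com/stream']
        pipe = sp.Popen(command, stdin=sp.PIPE)
        while True:
            frame = frame_queue.get()
            if frame is None:  # sentinel value: shut down cleanly
                break
            pipe.stdin.write(frame)
        pipe.stdin.close()
        pipe.wait()

    if __name__ == '__main__':
        frames = multiprocessing.Queue(maxsize=60)  # bound memory if FFmpeg falls behind
        worker = multiprocessing.Process(target=ffmpeg_worker, args=(frames,))
        worker.start()
        # main loop: frames.put(pygame.image.tostring(Display_Surface, "RGBA"))
        frames.put(None)   # when done, send the sentinel and wait for the worker
        worker.join()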

    This was the working solution (with dumbed-down FFMPEG settings):

    #!/usr/bin/python
    import sys, os, multiprocessing
    import subprocess as sp
    import pygame
    from pygame.locals import QUIT, KEYUP, K_ESCAPE
    import pygame.display

    pygame.init()
    os.environ['SDL_VIDEODRIVER'] = 'dummy'
    pygame.display.init()
    Display_Surface = pygame.display.set_mode([1280,720], 0, 32)

    class FFMPEGConsumer(object):
       def __init__(self):
           self._r, self._w = multiprocessing.Pipe()
           self.reader = os.fdopen(self._r.fileno(), 'r')
           self.writer = os.fdopen(self._w.fileno(), 'w', 0)
           self.proc = None

       def start_ffmpeg(self):

           command = ['ffmpeg', '-framerate', '25', '-s', '1280x720', '-pix_fmt', 'rgba', '-f', 'rawvideo', '-i', '-',
              '-f', 'lavfi', '-i', 'anullsrc=cl=mono',
              '-pix_fmt', 'yuv420p','-s', 'hd720', '-r', '25', '-g', '50',
              '-f', 'flv', 'rtmp://a.rtmp.youtube.com/live2/xxxx-xxxx-xxxx-xxxx']

           self.proc = sp.Popen(command, bufsize=-1, stdin=self.reader)

       def send_down_the_pipe(self, frame):
           self.writer.write(frame)
           #print self._stdin.read()

       def __del__(self):
           self.reader.close()
           self.writer.close()

    ffmpeg = FFMPEGConsumer()
    ffmpeg.start_ffmpeg()

    while True:
       # Quit event handling
       for event in pygame.event.get():
           if event.type == QUIT or (event.type == KEYUP and event.key == K_ESCAPE):
               pygame.quit()
               sys.exit()

       ffmpeg.send_down_the_pipe(pygame.image.tostring(Display_Surface, "RGBA"))

    ffmpeg.writer.close()
    pygame.display.quit()
    os._exit(0)

    All cores are firing and no lag so far!!!

  • FFmpeg: mix audio clips at given times into a main audio file

    5 May 2020, by Aramil

    I have been recording small audio clips for an audiobook, and I have the start time of each one in seconds. The full recording is, let's say, 60 minutes long. I am thinking of creating a silent audio file of the same duration, but how can I add each clip at its given start time? It does not matter if the clips overlap. I tried using concat and inpoint without the blank file and the output was empty (I am using WAV files), which is why I had the idea of using a master blank file as a base.

    If possible I would really appreciate any example.

    Thanks
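
    One common way to do this with FFmpeg is to delay each clip with the adelay filter and sum everything with amix on top of the silent base file. The sketch below assumes the silent base track already exists; the filenames and offsets are hypothetical, and note that amix attenuates its inputs, so volumes may need adjusting afterwards:

    import subprocess as sp

    # Hypothetical inputs: a silent base track the length of the book,
    # plus clips with their start offsets in seconds.
    base = "silence_60min.wav"
    clips = [("clip1.wav", 12), ("clip2.wav", 95), ("clip3.wav", 1800)]

    inputs = ["-i", base]
    filters = []
    labels = ""
    for idx, (path, start) in enumerate(clips, start=1):
        inputs += ["-i", path]
        ms = int(start * 1000)
        # adelay takes milliseconds, one value per channel (two for stereo)
        filters.append(f"[{idx}:a]adelay={ms}|{ms}[a{idx}]")
        labels += f"[a{idx}]"

    # amix with duration=first keeps the base track's length; overlapping clips are summed
    filters.append(f"[0:a]{labels}amix=inputs={len(clips) + 1}:duration=first[out]")

    cmd = ["ffmpeg", "-y", *inputs,
           "-filter_complex", ";".join(filters),
           "-map", "[out]", "audiobook_mix.wav"]
    sp.run(cmd, check=True)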

    


  • Why are rectangular boxes ([]) showing instead of text in burned subtitles when using ffmpeg/MoviePy on Google Colab?

    14 May, by Lavish

    I'm working on a Python script that adds subtitles to a video using MoviePy and burns/hardcodes them directly onto the video. The subtitles contain Hindi text, and I’ve specified a custom font that supports Devanagari (Hindi) script.

    


    The code works perfectly fine on my local machine, the subtitles appear correctly with Hindi characters. However, when I run the same code on Google Colab, the subtitles display as rectangular boxes (something like this -> [][][]) instead of proper characters.

    


    Things I've tried:

    • Ensured the font used supports Hindi (I'm using NotoSansDevanagari-Regular.ttf renamed as font.ttf).
    • Uploaded the font to Colab and specified the full path correctly.
    • Verified that the text is passed as a proper Unicode string.

    Here's the code snippet:

    import os
    import subprocess

    def add_subtitles(video_path, subtitles_path, output_path):
        """Adds subtitles using FFmpeg with proper path escaping."""
        # Convert to absolute paths and normalize
        video_path = os.path.abspath(video_path)
        subtitles_path = os.path.abspath(subtitles_path)
        output_path = os.path.abspath(output_path)

        # Font and subtitle paths
        font_path = "input_files/font.ttf"
        font_path = os.path.abspath(font_path).replace("\\", "\\\\")
        subtitles_path_escaped = os.path.abspath(subtitles_path).replace("\\", "\\\\")

        # Escape backslashes in paths
        subtitles_path = subtitles_path.replace("\\", "\\\\")

        # Remove all files in final_videos (base_dir is defined elsewhere in the script)
        final_dir = os.path.join(base_dir, "final_videos")
        for f in os.listdir(final_dir):
            if os.path.isfile(os.path.join(final_dir, f)):
                os.remove(os.path.join(final_dir, f))

        os.makedirs(os.path.dirname(output_path), exist_ok=True)
        escaped_path = subtitles_path.replace(':', '\\:').replace('\\', '\\\\')

        cmd = [
            "ffmpeg",
            "-i", video_path,
            "-vf", f"subtitles={escaped_path}:force_style='FontFile={font_path}'",
            "-c:v", "libx264",
            "-c:a", "copy",
            "-preset", "fast",
            "-crf", "22",
            output_path
        ]

        # Debug: print the exact command being executed
        print("Executing:", " ".join(cmd))
        try:
            subprocess.run(cmd, check=True, capture_output=True, text=True)
            print(f"✅ Success! Output saved to: {output_path}")
        except subprocess.CalledProcessError as e:
            print(f"❌ FFmpeg failed with error:\n{e.stderr}")



    


    What could be causing this issue on Colab, and how can I get non-English subtitles (like Hindi) to render properly when burning subtitles using MoviePy/ffmpeg in a Colab environment?
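
    One likely cause is that libass on Colab has no Devanagari-capable font to fall back on, and FontFile is not a standard ASS style field for force_style, so the custom font may simply be ignored. A hedged sketch of the usual workaround (paths and names below are hypothetical): point the subtitles filter at the font's directory with fontsdir and select the family by its internal name with FontName:

    import subprocess

    # Hypothetical paths: fontsdir tells libass where to look for font files, and
    # force_style selects the font by its family name (not by file name).
    cmd = [
        "ffmpeg", "-y", "-i", "input.mp4",
        "-vf", ("subtitles=subs.srt:fontsdir=input_files:"
                "force_style='FontName=Noto Sans Devanagari'"),
        "-c:v", "libx264", "-c:a", "copy", "-preset", "fast", "-crf", "22",
        "output.mp4",
    ]
    subprocess.run(cmd, check=True)

    Installing a Devanagari-capable font system-wide on the Colab VM (so fontconfig can find it) is another option worth trying.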