
Media (1)
-
The Slip - Artworks
26 September 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (110)
-
Farm management
2 March 2010, by
The farm as a whole is managed by "super admins".
Certain settings can be adjusted to suit the needs of the different channels.
To begin with, it uses the "Gestion de mutualisation" plugin
-
Channel creation request
12 March 2010, by
Depending on how the platform is configured, the user may have two different ways of requesting the creation of a channel. The first is at the moment of registration; the second, after registration, by filling in a request form.
Both methods ask for the same information and work in much the same way: the prospective user has to fill in a series of form fields that first of all give the administrators information about (...)
-
Creating farms of unique websites
13 April 2011, by
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (16555)
-
Anomalie #2427 (New): thumbnail on an image that is too large
24 November 2011, by Ben
15:02 let's recap 15:03 from what you describe, and what happens from time to time: 15:03 a user posts a HUGE image, SPIP tries to generate a thumbnail of that image, but PHP can't manage it, generally because there isn't enough memory 15:03 ok 15:04 fil> it's an awful situation Benny_b (...)
-
How to make a circular waveform video with a background image using ffmpeg?
20 August 2024, by Ansh Rathod
Hi, I want to make a circular waveform video using ffmpeg, but with a background image. I have tried this command many times and it didn't work; I also asked ChatGPT over and over with no success, so now I'm wondering whether it is even possible in a single command:


ffmpeg -i input.mp3 -i background.png -filter_complex "[0:a]showwaves=size=100x100:colors=white:draw=full:mode=p2p[v];[v]format=rgba,geq='p(mod((2*W/(2*PI))*(PI+atan2(0.5*H-Y,X-W/2)),W), H-2*hypot(0.5*H-Y,X-W/2))'[vout];[1:v][vout]overlay=(W-w)/2:(H-h)/2[outv]" -map "[outv]" -map 0:a -pix_fmt yuv420p output.mp4



I used this command without the background image, and it works:



ffmpeg -i input.mp3 -filter_complex "[0:a]showwaves=size=100x100:colors=white:draw=full:mode=p2p[v]; [v]format=rgba,geq='p(mod((2*W/(2*PI))*(PI+atan2(0.5*H-Y,X-W/2)),W), H-2*hypot(0.5*H-Y,X-W/2))'[vout]" -map "[vout]" -map 0:a -pix_fmt yuv420p output.mp4
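One possible reason the first command fails while this one works, assuming the filter graph itself is sound: background.png is a still image, so fed as a plain input it produces a single frame, the overlay's main branch ends almost immediately, and the video collapses to a fraction of a second. Below is a sketch of the same command with the image looped for the duration of the audio; -loop 1 and -shortest are the only changes, and the geq remap is kept exactly as posted (not verified here).

ffmpeg -i input.mp3 -loop 1 -i background.png -filter_complex "[0:a]showwaves=size=100x100:colors=white:draw=full:mode=p2p[v];[v]format=rgba,geq='p(mod((2*W/(2*PI))*(PI+atan2(0.5*H-Y,X-W/2)),W), H-2*hypot(0.5*H-Y,X-W/2))'[vout];[1:v][vout]overlay=(W-w)/2:(H-h)/2[outv]" -map "[outv]" -map 0:a -shortest -pix_fmt yuv420p output.mp4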




-
How to dynamically overlay images with ffmpeg
23 May, by Rorschy
As part of a bigger project, I'm trying to stream a live feed with ffmpeg over RTSP while also dynamically changing the subtitles depending on the situation.


As of now, I'm able to live stream with no issue. I also came across a solution for the subtitles by using a text file.


However, I'd like to avoid having this text file in my project, so I thought about rendering the subtitles as an image and overlaying it on the screen-capture stream. With my current solution, though, the data is only streamed when I kill the running code (and then only a few seconds of it).


Here is the current code for this problem:


import subprocess
import threading
import string
import random
import time
import io
from PIL import Image, ImageDraw, ImageFont

RTSP_URL = "..."
ffmpeg = None

def generate_subtitle():
    width = 640
    height = 100
    font_size = 32
    while True:
        if ffmpeg:
            try:
                # Random placeholder text standing in for the real subtitle
                text = ''.join(random.choice(string.ascii_uppercase + string.digits) for _ in range(10))

                # Semi-transparent black banner for the subtitle
                image = Image.new("RGBA", (width, height), (0, 0, 0, 128))
                draw = ImageDraw.Draw(image)

                try:
                    font = ImageFont.truetype("arial.ttf", font_size)
                except IOError:
                    font = ImageFont.load_default()

                # Centre the text on the banner
                bbox = draw.textbbox((0, 0), text, font=font)
                text_width = bbox[2] - bbox[0]
                text_height = bbox[3] - bbox[1]

                x = (width - text_width) // 2
                y = (height - text_height) // 2

                draw.text((x, y), text, font=font, fill=(255, 255, 255, 255))

                # Encode the banner as PNG and push it into ffmpeg's stdin (Input 1)
                buffer = io.BytesIO()
                image.save(buffer, format="PNG")
                ffmpeg.stdin.write(buffer.getvalue())
                ffmpeg.stdin.flush()
                time.sleep(5)
            except Exception as e:
                print("Error while sending image:", e)
                break
        else:
            time.sleep(1)

def run_ffmpeg():
    global ffmpeg
    ffmpeg = subprocess.Popen([
        'ffmpeg',

        # Input 0: capture the desktop
        "-f", "gdigrab",
        "-offset_x", "0",
        "-offset_y", "0",
        "-video_size", "1920x1080",
        "-i", "desktop",

        # Input 1: PNG overlay from stdin
        "-f", "image2pipe",
        "-vcodec", "png",
        "-i", "-",

        # Filter to overlay Input 1 on Input 0
        "-filter_complex", "[0:v][1:v]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)-10",

        # Output settings
        "-vcodec", "libx264",
        "-preset", "ultrafast",
        "-tune", "zerolatency",
        "-g", "30",
        "-sc_threshold", "0",
        "-f", "rtsp",
        RTSP_URL
    ], stdin=subprocess.PIPE)


threading.Thread(target=run_ffmpeg, daemon=True).start()
threading.Thread(target=generate_subtitle, daemon=True).start()

while True:
    time.sleep(1)




My question is: how can I stream the data correctly?
If there is another way to change the subtitles dynamically without using a text file or a temporary file, I'd be glad to hear it.
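A possible explanation for the stall, offered as an assumption rather than a verified diagnosis: the image2pipe input is read at the demuxer's default frame rate, so each PNG written to stdin advances the overlay input's timeline by only a few hundredths of a second, while the desktop capture advances in real time. The overlay filter then waits for the lagging overlay stream and buffers output; only when the pipe is closed on exit does that input reach end-of-file, the last image is repeated, and a few seconds of buffered video are flushed to the RTSP server. A sketch of a workaround under that assumption is to declare a frame rate on the pipe input and re-send the current image at that same rate, even when it has not changed. OVERLAY_FPS and render_current_png below are hypothetical names, not part of the original script, and the rest of the script (imports, Popen call) is assumed unchanged apart from the added -framerate option:

# Input 1 options in the Popen argument list ("-framerate" is the only addition):
#   "-f", "image2pipe",
#   "-framerate", str(OVERLAY_FPS),
#   "-vcodec", "png",
#   "-i", "-",

OVERLAY_FPS = 5  # assumed overlay update rate, in frames per second

def feed_overlay(ffmpeg_proc, render_current_png):
    # Push the latest subtitle PNG into ffmpeg's stdin OVERLAY_FPS times per second,
    # so the overlay input's timeline keeps pace with the desktop capture.
    period = 1.0 / OVERLAY_FPS
    while True:
        png_bytes = render_current_png()  # returns the most recently rendered PNG as bytes
        ffmpeg_proc.stdin.write(png_bytes)
        ffmpeg_proc.stdin.flush()
        time.sleep(period)

If the PNG route turns out to be unnecessary, another option worth investigating is ffmpeg's drawtext filter, which accepts runtime commands (for example through the zmq filter, when the build includes it), so the subtitle text can be changed while streaming without any text file or temporary file.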