
Media (1)
-
The Great Big Beautiful Tomorrow
28 October 2011, by
Updated: October 2011
Language: English
Type: Text
Other articles (36)
-
Mediabox : ouvrir les images dans l’espace maximal pour l’utilisateur
8 February 2011, by
Image display is constrained by the width allowed by the site's design (which depends on the theme in use), so images are shown at a reduced size. To take advantage of the full space available on the user's screen, a feature can be added that displays the image in a media box overlaid on top of the rest of the content.
To do this, the "Mediabox" plugin must be installed.
Configuring the media box
As soon as (...) -
Customising by adding your own logo, banner or background image
5 September 2013, by
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If in doubt, contact your MediaSPIP administrator to find out.
On other sites (5501)
-
Combining Video and Audio Buffers in Memory Using FFmpeg and Python
22 August 2024, by RUBICK
I need to combine video and audio streams into a single MP4 file. My goal is to handle this entirely in memory and then send the resulting file directly to the user without storing any files on disk.


I am consistently running into problems with the FFmpeg process when trying to combine the video and audio streams.


@bot.callback_query_handler(func=lambda call: call.data.startswith("download_mp4"))
def handle_download_mp4(call):
    itag, url = call.data.split("|")[1:]
    yt = YouTube(url)

    video_stream = yt.streams.get_by_itag(itag)
    audio_stream = yt.streams.filter(only_audio=True).first()

    video_buffer = io.BytesIO()
    audio_buffer = io.BytesIO()

    # downloading video and audio to memory
    video_stream.stream_to_buffer(video_buffer)
    audio_stream.stream_to_buffer(audio_buffer)

    video_buffer.seek(0)
    audio_buffer.seek(0)
    combined_buffer = io.BytesIO()

    process = None
    try:
        process = (
            ffmpeg
            .input('pipe:0')
            .input('pipe:1')
            .output('pipe:2', format='mp4', vcodec='libx264', acodec='aac')
            .run_async(pipe_stdin=True, pipe_stdout=True, pipe_stderr=True)
        )

        # write the video and audio buffers to ffmpeg's stdin
        process.stdin.write(video_buffer.read())
        process.stdin.write(audio_buffer.read())
        process.stdin.close()

        combined_buffer.write(process.stdout.read())
        process.wait()
        combined_buffer.seek(0)

    except Exception as e:
        bot.send_message(call.message.chat.id, f"Error during processing: {str(e)}")
        if process:
            process.kill()
        return

    file_size = combined_buffer.getbuffer().nbytes

    if file_size > MAX_FILE_SIZE:
        bot.send_message(call.message.chat.id, "The combined file is too large to download :(")
        combined_buffer.close()
        return

    # send the combined buffer to the user
    bot.send_document(call.message.chat.id, combined_buffer, visible_file_name=f"{yt.title}.mp4")



Here's the workflow I'm aiming for:


1. Download video and audio streams from YouTube.
2. Combine these streams in memory using FFmpeg.
3. Send the combined MP4 file to the user.


I'm using pytube to download the video and audio streams and ffmpeg to merge them. However, I'm encountering issues with the combination process; specifically, I'm struggling with how to correctly merge the video and audio streams using in-memory buffers.
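For what it's worth, two details usually break this exact setup: ffmpeg reads one input per pipe, so writing both buffers to the same stdin interleaves two containers into one unreadable stream (and `pipe:1`/`pipe:2` name stdout and stderr, not extra input pipes); and a plain MP4 cannot be written to a non-seekable pipe without fragmented-MP4 flags. A minimal sketch of one workaround, bypassing ffmpeg-python and passing the audio on an extra inherited file descriptor (`mux_buffers` is a hypothetical helper; assumes the `ffmpeg` binary is on `PATH` with libx264 support):

```python
import os
import subprocess
import threading

def mux_buffers(video_bytes: bytes, audio_bytes: bytes) -> bytes:
    """Merge an in-memory video buffer (fed on stdin) and an in-memory
    audio buffer (fed on an extra pipe) into a fragmented MP4, in memory."""
    audio_rd, audio_wr = os.pipe()  # second input pipe, just for the audio
    cmd = [
        "ffmpeg", "-y",
        "-i", "pipe:0",              # video arrives on stdin
        "-i", f"pipe:{audio_rd}",    # audio arrives on the inherited fd
        "-c:v", "libx264", "-c:a", "aac",
        # a plain MP4 needs a seekable output; fragmented MP4 can stream to a pipe
        "-movflags", "frag_keyframe+empty_moov",
        "-f", "mp4", "pipe:1",
    ]
    proc = subprocess.Popen(
        cmd,
        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
        stderr=subprocess.DEVNULL,
        pass_fds=(audio_rd,),        # keep the extra fd open across exec
    )
    os.close(audio_rd)               # the child now holds its own copy

    # feed the audio from a thread so a full pipe on one input
    # cannot deadlock the write to the other
    def feed_audio():
        with os.fdopen(audio_wr, "wb") as f:
            f.write(audio_bytes)

    t = threading.Thread(target=feed_audio)
    t.start()
    out, _ = proc.communicate(input=video_bytes)
    t.join()
    if proc.returncode != 0:
        raise RuntimeError("ffmpeg failed")
    return out
```

In the handler above this would replace the ffmpeg-python block with something like `combined_buffer = io.BytesIO(mux_buffers(video_buffer.read(), audio_buffer.read()))`. If the extra-fd trick feels fragile, temporary files (e.g. via `tempfile`) are the simpler, more portable fallback.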


-
Stopping rq worker doesn't stop underlying ffmpeg process
23 March 2020, by sqr
I am fairly new to python and rq, and have come to a point I can't solve by myself.
I am using ffmpeg-python to encode livestreams; this is distributed across rq workers and displayed on a web app using flask. Since the livestreams can go on forever, I need some way to stop this process while it is still executing. Opening the terminal where the rq worker is executing the task and pressing 'q' (ffmpeg's shortcut to quit) works and marks the job as OK, but I need to be able to do this from my web app.
I have tried getting the worker ID and sending it a SIGKILL; this stops the worker, but the task keeps running, which is something I don't understand at all. It's as if the actual ffmpeg process were being executed somewhere else and stopping the worker didn't stop ffmpeg. Note that I am not using ffmpeg.run_async, I am using ffmpeg.run, which as far as my limited knowledge goes should not execute asynchronously. While the stream is being encoded the worker is marked as busy and has the job ID properly assigned, so I really don't understand why, when the worker is killed, the underlying process keeps running.
If instead of a SIGKILL I send a SIGTERM, the worker says it's waiting for a warm exit and never closes, as the ffmpeg process is still doing its thing.
One of my ideas was to send a 'q' keystroke to the worker (which I have no idea how to do, even after some research), or to switch from rq to celery, which supposedly supports cancelling tasks that are already executing.
This is my routes file:
@app.route('/streamings', methods=['GET', 'POST'])
@login_required
def streamings():
    ...
    if form2.submit_stop.data and form2.validate():
        conn1 = Redis.from_url('redis://')
        queue = rq.Queue('tasks-q', connection=Redis.from_url('redis://'))
        workers = rq.Worker.all(queue=queue)
        for worker in workers:
            peine = worker.get_current_job_id()
            if peine == form2.fld1.data:
                os.kill(worker.pid, signal.SIGKILL)
and this is my tasks file:
def restream(origin, server, stream_key):
    stream_server = generate_url(server, stream_key)
    try:
        stream_map = None
        stream1 = ffmpeg.input(get_manifest(origin), re=None)
        stream2 = ffmpeg.input('mosca_66.png')
        stream_ol = ffmpeg.overlay(stream1, stream2, x='main_w-overlay_w-50', y='50')
        a1 = stream1.audio
        stream = ffmpeg.output(stream_ol, a1, stream_server, format='flv', vcodec='libx264', acodec='aac', preset='medium', g='120', crf='23', maxrate='4M', bufsize='5M', channel_layout='stereo')
        print(stream.get_args())
        ffmpeg.run(stream)
    except:
        set_complete()
Any insight on possible solutions would be greatly appreciated.
Thanks
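The behaviour described above is expected: ffmpeg.run itself spawns ffmpeg as a separate child process, so killing the rq worker merely orphans the child rather than stopping it. One common pattern is to launch the child in its own session and signal the whole process group; a sketch (not rq-specific; `start_encoder`/`stop_encoder` are hypothetical helper names):

```python
import os
import signal
import subprocess

def start_encoder(cmd):
    """Launch a long-running child (e.g. an ffmpeg command line) in its
    own session/process group, independent of the worker's lifetime."""
    return subprocess.Popen(cmd, start_new_session=True)

def stop_encoder(proc, timeout=5):
    """Politely stop the whole process group; escalate if it ignores us."""
    pgid = os.getpgid(proc.pid)
    os.killpg(pgid, signal.SIGTERM)
    try:
        proc.wait(timeout=timeout)
    except subprocess.TimeoutExpired:
        os.killpg(pgid, signal.SIGKILL)
        proc.wait()
    return proc.returncode
```

Applied to the task above, that means switching from ffmpeg.run(stream) to something like proc = ffmpeg.run_async(stream, pipe_stdin=True), storing proc.pid in Redis keyed by the job ID, and having the /streamings route signal that pid instead of the worker's. With pipe_stdin=True you can also write b'q' to proc.stdin to trigger ffmpeg's own graceful quit, which matches what pressing 'q' in the terminal does.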
-
How does one loop a video using ffmpeg on Android?
15 July 2015, by Magicomiralles
I'm using hiteshsondhi88's "FFmpeg Android Java" library and it works fine for scaling, transcoding, and overlaying. However, when I try to loop a video I just get the same un-looped video as my output.
My code:
String internalPath = getFilesDir().getPath(); // get internal directory
String command = "-y -f concat -i " + internalPath + "/myList.txt -c copy " + internalPath + "/final.mp4";
ffmpeg.execute(command, ffmpegResponseHandler);
The text in "myList.txt" is generated programmatically, but it looks like this:
file 'final_unlooped.mp4'
file 'final_unlooped.mp4'
file 'final_unlooped.mp4'
file 'final_unlooped.mp4'
file 'final_unlooped.mp4'
Finally, the log:
ffmpeg version n2.4.2 Copyright (c) 2000-2014 the FFmpeg developers
built on Oct 7 2014 15:11:41 with gcc 4.8 (GCC)
configuration: --target-os=linux --cross-prefix=/home/sb/Source-Code/ffmpeg-android/toolchain-android/bin/i686-linux-android- --arch=x86 --cpu=i686 --enable-runtime-cpudetect --sysroot=/home/sb/Source-Code/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/sb/Source-Code/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/sb/Source-Code/ffmpeg-android/build/x86 --extra-cflags='-I/home/sb/Source-Code/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all -march=i686' --extra-ldflags='-L/home/sb/Source-Code/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
libavutil 54. 7.100 / 54. 7.100
libavcodec 56. 1.100 / 56. 1.100
libavformat 56. 4.101 / 56. 4.101
libavdevice 56. 0.100 / 56. 0.100
libavfilter 5. 1.100 / 5. 1.100
libswscale 3. 0.100 / 3. 0.100
libswresample 1. 1.100 / 1. 1.100
libpostproc 53. 0.100 / 53. 0.100
Input #0, concat, from '/data/data/com.testApp/files/myList.txt':
Duration: N/A, start: 0.000000, bitrate: 744 kb/s
Stream #0:0: Video: h264 (High) (avc1 / 0x31637661), yuv420p, 640x640, 744 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc
Output #0, mp4, to '/data/data/com.testApp/files/final.mp4':
Metadata:
encoder : Lavf56.4.101
Stream #0:0: Video: h264 ([33][0][0][0] / 0x0021), yuv420p, 640x640, q=2-31, 744 kb/s, 25 fps, 12800 tbn, 12800 tbc
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
frame= 115 fps=0.0 q=-1.0 Lsize= 420kB time=00:00:04.48 bitrate= 768.1kbits/s
video:418kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.509634%
The OnSuccess() method in the response handler fires, indicating that the execution was successful, but the output is still unlooped.
This command works fine on Windows, but for some reason it won't work on Android. Any ideas what it could be, or what I am doing wrong?
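One thing worth checking: the concat demuxer requires one `file '...'` directive per line, so if the programmatic generation ever joins the entries without newline characters, ffmpeg effectively sees a single entry and the output comes out un-looped, exactly as the log's 115-frame output suggests. For illustration, a sketch of the list generation in Python (`write_concat_list` is a hypothetical helper; on the Java side the equivalent is a loop appending `"file '...'\n"`):

```python
import os

def write_concat_list(list_path, media_path, repeats):
    """Write an ffmpeg concat-demuxer playlist: one `file '...'` directive
    per line, using an absolute path so the demuxer resolves it reliably."""
    with open(list_path, "w") as f:
        for _ in range(repeats):
            f.write("file '%s'\n" % os.path.abspath(media_path))
    return list_path
```

If a newer ffmpeg build rejects absolute paths in the list as "unsafe", add `-safe 0` before `-i` in the command.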