
Other articles (70)
-
Websites made with MediaSPIP
2 May 2011
This page lists some websites based on MediaSPIP.
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first stable version of MediaSPIP.
Its official release date is 21 June 2013, and it is announced here.
The zip file provided here contains only the MediaSPIP sources, in standalone form.
As with the previous version, all software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make other modifications (...)
-
Creating farms of unique websites
13 April 2011
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things):
implementation costs to be shared between several different projects / individuals
rapid deployment of multiple unique sites
creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (13016)
-
Splitting my code into OpenCV and UDP and the differences between using OpenCV and UDP
18 April 2024, by Sagiv Shaniv
1) I wrote Python code to receive video in real time, compress it, duplicate the stream, and then send it to both OpenCV and UDP using ffmpeg. I would like to know how I can duplicate the stream so that it goes to both UDP and OpenCV (without sending it to another device) without affecting the frame rate.


This is the code I have used so far:


import subprocess
import cv2

# Start ffmpeg process to capture video from USB, encode it in H.264, and send it over UDP and to virtual video device
ffmpeg_cmd = [
 'ffmpeg',
 '-f', 'v4l2', # Input format for USB camera
 '-video_size', '1920x1080', # Video size
 '-i', '/dev/video2', # USB device
 '-c:v', 'libx264', # H.264 codec
 '-preset', 'ultrafast', # Preset for speed
 '-tune', 'zerolatency', # Tune for zero latency
 '-b:v', '2M', # Bitrate
 '-bufsize', '5M', # Buffer size
 '-pix_fmt', 'yuv420p', # Specify pixel format
 '-filter_complex', '[0:v]split=2[out1][out2]', # Split the video stream
 '-map', '[out1]', # Map the first output to UDP
 '-f', 'mpegts', # Output format for UDP
 'udp://192.168.1.100:8000', # UDP destination
 '-map', '[out2]', # Map the second output to virtual video device
 '-f', 'v4l2', # Output format for virtual video device
 '-video_size', '1920x1080', # Video size for virtual video device
 '-pix_fmt', 'yuv420p', # Specify pixel format for virtual video device
 '/dev/video1' # Virtual video device
]

ffmpeg_process = subprocess.Popen(ffmpeg_cmd)

v4l2_cap = cv2.VideoCapture(1)  # Open the virtual video device (/dev/video1)

while True:
    ret, frame = v4l2_cap.read()  # Read frame from virtual video device
    if not ret:
        break
    cv2.imshow('Frame', frame)  # Display frame
    if cv2.waitKey(1) & 0xFF == ord('q'):  # Exit on 'q' key press
        break

# Clean up
cv2.destroyAllWindows()
ffmpeg_process.terminate()
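For illustration, here is a minimal sketch (untested; assuming the same camera at /dev/video2 and the same UDP target as above) of one way to feed both consumers from a single ffmpeg process without the virtual video device: the second split branch is written as raw BGR frames to ffmpeg's stdout and read straight into NumPy for OpenCV.

import subprocess

import cv2
import numpy as np

width, height = 1920, 1080
frame_size = width * height * 3  # bytes per bgr24 frame

# One ffmpeg process: split the camera feed, encode one branch for UDP,
# and emit the other branch as raw BGR frames on stdout for OpenCV.
ffmpeg_cmd = [
    'ffmpeg',
    '-f', 'v4l2',
    '-video_size', f'{width}x{height}',
    '-i', '/dev/video2',
    '-filter_complex', '[0:v]split=2[udp][cv]',
    '-map', '[udp]',
    '-c:v', 'libx264', '-preset', 'ultrafast', '-tune', 'zerolatency',
    '-b:v', '2M', '-pix_fmt', 'yuv420p',
    '-f', 'mpegts', 'udp://192.168.1.100:8000',
    '-map', '[cv]',
    '-f', 'rawvideo', '-pix_fmt', 'bgr24',
    'pipe:1',
]

proc = subprocess.Popen(ffmpeg_cmd, stdout=subprocess.PIPE)

while True:
    buf = proc.stdout.read(frame_size)  # blocks until one full frame (or EOF)
    if len(buf) < frame_size:
        break
    frame = np.frombuffer(buf, np.uint8).reshape((height, width, 3))
    cv2.imshow('Frame', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cv2.destroyAllWindows()
proc.terminate()

Because the raw branch never goes through an encoder or a loopback driver, its cost is mostly the pipe copy itself; whether it sustains the camera's full frame rate on the target hardware would need to be measured.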



2) When I get the video straight from the device and send it over UDP, I get 25 FPS with this code:


import cv2
import subprocess
import time
import os

width = 1920
height = 1080
fps = 40 # Target FPS (unused below; the ffmpeg command sets its own -framerate)
proc = None
os.environ['LD_LIBRARY_PATH'] = '/opt/vc/lib' # Set the library path

def stream_video():
 command = [
 'ffmpeg',
 '-f', 'v4l2',
 '-input_format', 'mjpeg',
 '-video_size', '1920x1080',
 '-framerate', '30',
 '-thread_queue_size', '512',
 '-i', '/dev/video2',
 '-f', 'lavfi',
 '-i', 'sine=frequency=440:sample_rate=48000',
 '-pix_fmt', 'yuvj420p',
 '-c:v', 'libx264',
 '-c:a', 'aac',
 '-b:v', '5000k',
 '-b:a', '128k',
 '-profile:v', 'baseline',
 '-preset', 'ultrafast',
 '-x264-params', 'tune=zerolatency',
 '-g', '60',
 '-f', 'mpegts',
 'udp://192.168.1.100:8000'
]


    try:
        proc = subprocess.Popen(command, stdin=subprocess.PIPE)
    finally:
        proc.stdin.close()
        proc.wait()

if __name__ == '__main__':
    stream_video()



However, when I get the video from OpenCV, I get only 9 FPS with this code:


import subprocess
import cv2
import numpy as np

width = 1920
height = 1080
fps = 30

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
cap.set(cv2.CAP_PROP_FPS, fps)

command = [
 'ffmpeg',
 '-f', 'rawvideo',
 '-pix_fmt', 'bgr24', # OpenCV frames are BGR24; without this, ffmpeg assumes yuv420p and misreads the stream
 '-s', f'{width}x{height}',
 '-r', str(fps),
 '-i', '-', # Read from stdin
 '-f', 'lavfi',
 '-i', 'sine=frequency=440:sample_rate=48000',
 '-pix_fmt', 'yuv420p',
 '-c:v', 'libx264',
 '-c:a', 'aac',
 '-b:v', '5M',
 '-b:a', '128k',
 '-profile:v', 'baseline',
 '-preset', 'ultrafast',
 '-x264-params', 'tune=zerolatency',
 '-g', '60',
 '-f', 'mpegts',
 'udp://192.168.1.100:8000'
]

try:
    proc = subprocess.Popen(command, stdin=subprocess.PIPE)
    while True:
        ret, frame = cap.read()
        if not ret:
            break
        proc.stdin.write(frame.tobytes())
finally:
    cap.release()
    proc.stdin.close()
    proc.wait()



How can I capture video through OpenCV without affecting the frame rate (I need the frames for later image processing) and without compromising on quality or resolution?


I have tried capturing video frames using OpenCV and sending them over UDP using ffmpeg. I expected to maintain the original frame rate of the video without compromising on quality or resolution. However, I noticed a significant drop in frame rate when using OpenCV compared to directly capturing from the device. Specifically, I achieved 25 FPS when capturing and sending directly from the device using ffmpeg, but only 9 FPS when capturing frames using OpenCV and then sending them over UDP using ffmpeg.
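One detail worth checking (an assumption, not a confirmed diagnosis): the direct ffmpeg command above requests MJPEG from the camera with '-input_format mjpeg', while cv2.VideoCapture defaults to the driver's raw format (often YUYV), which many UVC cameras can only deliver at a few FPS at 1920x1080. A minimal sketch of requesting MJPEG through OpenCV's V4L2 backend:

import cv2

# Ask the V4L2 driver for MJPEG before setting the resolution, mirroring
# the '-input_format mjpeg' option used in the direct ffmpeg command.
cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc(*'MJPG'))
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)
cap.set(cv2.CAP_PROP_FPS, 30)

If the camera honors the request, cap.get(cv2.CAP_PROP_FPS) should report the higher rate.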


Thank you


-
How to process and upload large video files directly to cloud with ffmpeg but without fragmented MP4 ?
9 April 2024, by volume one
I am using ffmpeg via fluent-ffmpeg for Node.js to process videos uploaded by users.

The problem I have is that if a user uploads a huge movie file, say 8GB in size, I don't want to store it on the server, as the server will soon run out of disk space.


I thought a way to tackle this was to stream the output from ffmpeg straight to cloud storage like AWS S3. The only way to do this (I believe) is using a PassThrough() stream:

import { PassThrough } from 'node:stream';
import FFMpeg from 'fluent-ffmpeg';


const PassThroughStream = new PassThrough();


FFMpeg('/testvideo.mp4')
  .videoCodec('libx264')
  .audioCodec('libmp3lame')
  .size('640x480')
  // Stream input requires manually specifying the input format
  .inputFormat('mp4')
  // Stream output requires manually specifying the output format
  .format('mp4')
  // Must be fragmented for streaming to work. This causes the duration problem.
  .outputOptions('-movflags dash')
  .pipe(PassThroughStream, { end: true });



When the video is created as a fragmented MP4, there is no duration associated with the file, which means it has no length metadata. That makes playback difficult in a browser and is unacceptable.

The only way I have been able to get a proper length property set in the file's metadata is by not using fragmented MP4 (that is, the -movflags dash part in the code above). But without fragmentation, I cannot stream the output directly to cloud storage; I have to save the file somewhere locally first.

I think I am missing something but don't know what. How could this be solved? I want to process the video and write the output to AWS S3 without storing the file locally and without creating a fragmented MP4.
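For comparison, here is the same pipe-to-cloud idea sketched in Python (assuming boto3 is available and that its multipart upload accepts the non-seekable pipe; the bucket and key names are placeholders): ffmpeg writes the encoded output to stdout and the upload consumes the pipe, so nothing touches the local disk. Note that this still produces a fragmented MP4 (a plain MP4 needs a seekable output to write its moov atom), so it illustrates the streaming half of the problem, not a fix for the missing duration.

import subprocess

import boto3  # AWS SDK for Python

# ffmpeg writes the transcoded file to stdout instead of to disk.
# A pipe target forces fragmented MP4 here as well, for the same
# reason as with the Node.js PassThrough stream above.
ffmpeg = subprocess.Popen(
    [
        'ffmpeg', '-i', 'testvideo.mp4',
        '-c:v', 'libx264', '-c:a', 'libmp3lame',
        '-vf', 'scale=640:480',
        '-movflags', 'frag_keyframe+empty_moov',
        '-f', 'mp4', 'pipe:1',
    ],
    stdout=subprocess.PIPE,
)

s3 = boto3.client('s3')
# upload_fileobj performs a multipart upload, reading the pipe in chunks.
s3.upload_fileobj(ffmpeg.stdout, 'my-example-bucket', 'videos/output.mp4')
ffmpeg.wait()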


-
avcodec/mips/aaccoder_mips: Remove MIPS-specific aaccoder
15 March 2024, by Andreas Rheinhardt
avcodec/mips/aaccoder_mips: Remove MIPS-specific aaccoder
ff_aac_coder_init_mips() modifies a static const structure of
function pointers. This will crash if the binary uses relro
and is a data race in any case.

Furthermore, it points to a maintainability issue: the
AACCoefficientsEncoder structures have been constified
in commit fd9212f2edfe9b107c3c08ba2df5fd2cba5ab9e3,
a Libav commit merged in 318778de9ebec276cb9dfc65509231ca56590d13.
Libav did not have the MIPS-specific AAC code and so this was
fine for them; yet FFmpeg had them, but this was not recognized.

Commit 75a099fc734a4ee2b1347d0a3d8c53d883b95174 points to another
maintainability issue: contrary to ordinary DSP code, this code
here is way more complex and needs to be constantly kept in sync
with the ordinary code which it mimics and replaces. Said commit
is the only commit actually changing aaccoder.c in the last few
years and the same change has not been performed for the MIPS
clone ; before that, it even happened several times that the mips
code was broken due to changes of the generic code (see commits
97437bd17a8c5d4135b2f3b1b299bd7bb72ce02c and
de262d018d7d7d9c967af1dfd1b861c4b9eb2a60 or
860dbe0275e57cbf4228f3f653f872ff66ca596b or
933309a6ca0f18bf1d40e917fff455221f57fb4b or
b65ffa316e377213c29736929beba584d0d80d7c). This might even lead
to scenarios where someone changing non-dsp aacenc code would
have to modify mips inline asm in order to keep them in sync.
This is obviously a significant burden (if the AAC encoder were
actively developed).

Finally, the code does not even compile here due to errors like
"Error: float register should be even, was 1".

Reviewed-by: Lynne <dev@lynne.ee>
Reviewed-by: Jean-Baptiste Kempf <jb@videolan.org>
Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@outlook.com>