
Media (33)
-
Stereo master soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Audio
-
#7 Ambience
16 October 2011
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#3 The Safest Place
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#4 Emo Creates
15 October 2011
Updated: February 2013
Language: English
Type: Audio
Other articles (60)
-
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded in MP4, Ogv and WebM (supported by HTML5), with MP4 also supported by Flash.
Audio files are encoded in MP3 and Ogg (supported by HTML5), with MP3 also supported by Flash.
Where possible, text is analyzed to retrieve the data needed for search engine indexing, and then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
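As a rough illustration of the conversion step described above (this is not MediaSPIP's actual code; ffmpeg is assumed to be on the PATH and the file names are hypothetical), a Python sketch might drive ffmpeg like this:

import subprocess

def transcode_for_web(source):
    """Produce the renditions mentioned above: MP4 (H.264/AAC), WebM (VP8/Vorbis) and Ogv (Theora/Vorbis)."""
    jobs = {
        'output.mp4': ['-c:v', 'libx264', '-c:a', 'aac'],
        'output.webm': ['-c:v', 'libvpx', '-c:a', 'libvorbis'],
        'output.ogv': ['-c:v', 'libtheora', '-c:a', 'libvorbis'],
    }
    for target, codecs in jobs.items():
        # One ffmpeg pass per target format, overwriting any existing output
        subprocess.run(['ffmpeg', '-y', '-i', source, *codecs, target], check=True)

transcode_for_web('upload_original.mov')  # hypothetical uploaded file
-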
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation by users as well as developers, including:
critique of existing features and functions
articles contributed by developers, administrators, content producers and editors
screenshots to illustrate the above
translations of existing documentation into other languages
To contribute, register to the project users’ mailing (...)
-
Authorizations overridden by plugins
27 April 2010
Mediaspip core
autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page
On other sites (9801)
-
Issues with Publishing and Subscribing Rates for H.264 Video Streaming over RabbitMQ
7 October 2024, by Luis
I am working on a project to stream an H.264 video file using RabbitMQ (AMQP protocol) and display it in a web application. The setup involves capturing video frames, encoding them, sending them to RabbitMQ, and then consuming and decoding them on the web application side using Flask and Flask-SocketIO.


However, I am encountering performance issues with the publishing and subscribing rates in RabbitMQ. I cannot seem to achieve more than 10 messages per second. This is not sufficient for smooth video streaming.
I need help to diagnose and resolve these performance bottlenecks.


Here is my code:


- Video Capture and Publishing Script:




# Imports implied by the snippet
import logging
import os
import struct
import subprocess
import time
from contextlib import contextmanager
from datetime import datetime

import cv2
import numpy as np
import pika

# RabbitMQ setup
RABBITMQ_HOST = 'localhost'
EXCHANGE = 'DRONE'
CAM_LOCATION = 'Out_Front'
KEY = f'DRONE_{CAM_LOCATION}'
QUEUE_NAME = f'DRONE_{CAM_LOCATION}_video_queue'

# Path to the H.264 video file
VIDEO_FILE_PATH = 'videos/FPV.h264'

# Configure logging
logging.basicConfig(level=logging.INFO)

@contextmanager
def rabbitmq_channel(host):
    """Context manager to handle RabbitMQ channel setup and teardown."""
    connection = pika.BlockingConnection(pika.ConnectionParameters(host))
    channel = connection.channel()
    try:
        yield channel
    finally:
        connection.close()

def initialize_rabbitmq(channel):
    """Initialize RabbitMQ exchange and queue, and bind them together."""
    channel.exchange_declare(exchange=EXCHANGE, exchange_type='direct')
    channel.queue_declare(queue=QUEUE_NAME)
    channel.queue_bind(exchange=EXCHANGE, queue=QUEUE_NAME, routing_key=KEY)

def send_frame(channel, frame):
    """Encode the video frame using FFmpeg and send it to RabbitMQ."""
    ffmpeg_path = 'ffmpeg/bin/ffmpeg.exe'
    cmd = [
        ffmpeg_path,
        '-f', 'rawvideo',
        '-pix_fmt', 'rgb24',
        '-s', '{}x{}'.format(frame.shape[1], frame.shape[0]),
        '-i', 'pipe:0',
        '-f', 'h264',
        '-vcodec', 'libx264',
        '-pix_fmt', 'yuv420p',
        '-preset', 'ultrafast',
        'pipe:1'
    ]

    # A new ffmpeg process is spawned for every single frame
    start_time = time.time()
    process = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = process.communicate(input=frame.tobytes())
    encoding_time = time.time() - start_time

    if process.returncode != 0:
        logging.error("ffmpeg error: %s", err.decode())
        raise RuntimeError("ffmpeg error")

    frame_size = len(out)
    logging.info("Sending frame with shape: %s, size: %d bytes", frame.shape, frame_size)
    timestamp = time.time()
    formatted_timestamp = datetime.fromtimestamp(timestamp).strftime('%H:%M:%S.%f')
    logging.info(f"Timestamp: {timestamp}")
    logging.info(f"Formatted Timestamp: {formatted_timestamp[:-3]}")
    # Prepend the publish timestamp (an 8-byte double) to the encoded frame
    timestamp_bytes = struct.pack('d', timestamp)
    message_body = timestamp_bytes + out
    channel.basic_publish(exchange=EXCHANGE, routing_key=KEY, body=message_body)
    logging.info(f"Encoding time: {encoding_time:.4f} seconds")

def capture_video(channel):
    """Read video from the file, encode frames, and send them to RabbitMQ."""
    if not os.path.exists(VIDEO_FILE_PATH):
        logging.error("Error: Video file does not exist.")
        return
    cap = cv2.VideoCapture(VIDEO_FILE_PATH)
    if not cap.isOpened():
        logging.error("Error: Could not open video file.")
        return
    try:
        while True:
            start_time = time.time()
            ret, frame = cap.read()
            read_time = time.time() - start_time
            if not ret:
                break
            frame_rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            frame_rgb = np.ascontiguousarray(frame_rgb)  # Ensure the frame is contiguous
            send_frame(channel, frame_rgb)
            cv2.imshow('Video', frame)
            if cv2.waitKey(1) & 0xFF == ord('q'):
                break
            logging.info(f"Read time: {read_time:.4f} seconds")
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == '__main__':
    # Entry point implied by the functions above: open a channel, declare the
    # exchange/queue, then stream frames from the file.
    with rabbitmq_channel(RABBITMQ_HOST) as channel:
        initialize_rabbitmq(channel)
        capture_video(channel)



- The backend (Flask):




# Imports implied by the snippet
import struct
import subprocess
import threading
import time
from datetime import datetime

import cv2
import numpy as np
import pika
from flask import Flask, render_template
from flask_cors import CORS
from flask_socketio import SocketIO

app = Flask(__name__)
CORS(app)
socketio = SocketIO(app, cors_allowed_origins="*")

RABBITMQ_HOST = 'localhost'
EXCHANGE = 'DRONE'
CAM_LOCATION = 'Out_Front'
QUEUE_NAME = f'DRONE_{CAM_LOCATION}_video_queue'

def initialize_rabbitmq():
    connection = pika.BlockingConnection(pika.ConnectionParameters(RABBITMQ_HOST))
    channel = connection.channel()
    channel.exchange_declare(exchange=EXCHANGE, exchange_type='direct')
    channel.queue_declare(queue=QUEUE_NAME)
    channel.queue_bind(exchange=EXCHANGE, queue=QUEUE_NAME, routing_key=f'DRONE_{CAM_LOCATION}')
    return connection, channel

def decode_frame(frame_data):
    # FFmpeg command to decode H.264 frame data
    ffmpeg_path = 'ffmpeg/bin/ffmpeg.exe'
    cmd = [
        ffmpeg_path,
        '-f', 'h264',
        '-i', 'pipe:0',
        '-pix_fmt', 'bgr24',
        '-vcodec', 'rawvideo',
        '-an', '-sn',
        '-f', 'rawvideo',
        'pipe:1'
    ]
    # A new ffmpeg process is spawned for every received message
    process = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    start_time = time.time()  # Start timing the decoding process
    out, err = process.communicate(input=frame_data)
    decoding_time = time.time() - start_time  # Calculate decoding time

    if process.returncode != 0:
        print("ffmpeg error: ", err.decode())
        return None
    frame_size = (960, 1280, 3)  # frame dimensions expected by the frontend
    frame = np.frombuffer(out, np.uint8).reshape(frame_size)
    print(f"Decoding time: {decoding_time:.4f} seconds")
    return frame

def format_timestamp(ts):
    dt = datetime.fromtimestamp(ts)
    return dt.strftime('%H:%M:%S.%f')[:-3]

def rabbitmq_consumer():
    connection, channel = initialize_rabbitmq()
    for method_frame, properties, body in channel.consume(QUEUE_NAME):
        message_receive_time = time.time()  # Time when the message is received

        # Extract the timestamp from the message body
        timestamp_bytes = body[:8]
        frame_data = body[8:]
        publish_timestamp = struct.unpack('d', timestamp_bytes)[0]

        print(f"Message Receive Time: {message_receive_time:.4f} ({format_timestamp(message_receive_time)})")
        print(f"Publish Time: {publish_timestamp:.4f} ({format_timestamp(publish_timestamp)})")

        frame = decode_frame(frame_data)
        decode_time = time.time() - message_receive_time  # Calculate decode time

        if frame is not None:
            _, buffer = cv2.imencode('.jpg', frame)
            frame_data = buffer.tobytes()
            socketio.emit('video_frame', {'frame': frame_data, 'timestamp': publish_timestamp}, namespace='/')
            emit_time = time.time()  # Time after emitting the frame

            # Log the time taken to emit the frame and its size
            rtt = emit_time - publish_timestamp  # Calculate RTT from publish to emit
            print(f"Current Time: {emit_time:.4f} ({format_timestamp(emit_time)})")
            print(f"RTT: {rtt:.4f} seconds")
            print(f"Emit time: {emit_time - message_receive_time:.4f} seconds, Frame size: {len(frame_data)} bytes")
        channel.basic_ack(method_frame.delivery_tag)

@app.route('/')
def index():
    return render_template('index.html')

@socketio.on('connect')
def handle_connect():
    print('Client connected')

@socketio.on('disconnect')
def handle_disconnect():
    print('Client disconnected')

if __name__ == '__main__':
    consumer_thread = threading.Thread(target=rabbitmq_consumer)
    consumer_thread.daemon = True
    consumer_thread.start()
    socketio.run(app, host='0.0.0.0', port=5000)




How can I optimize the publishing and subscribing rates to handle a higher number of messages per second?


Any help or suggestions would be greatly appreciated!


I attempted to use threading and multiprocessing to handle multiple frames concurrently, and I tried to optimize the frame decoding function to make it faster, but with no success.
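For reference, here is a minimal sketch of the kind of threaded pipeline described above. It is purely illustrative, not the author's actual attempt: encoding is offloaded to a thread pool while a single thread owns the RabbitMQ channel (a pika BlockingConnection channel is not thread-safe), and encode_frame is a hypothetical stand-in (JPEG via OpenCV) for the per-frame ffmpeg pipe shown earlier.

import queue
import threading
from concurrent.futures import ThreadPoolExecutor

import cv2
import pika

RABBITMQ_HOST = 'localhost'
EXCHANGE = 'DRONE'
KEY = 'DRONE_Out_Front'

def encode_frame(frame):
    # Stand-in encoder (JPEG via OpenCV); the ffmpeg pipe above could be used instead
    ok, buf = cv2.imencode('.jpg', frame)
    return buf.tobytes() if ok else None

def encode_and_queue(frame, out_queue):
    data = encode_frame(frame)
    if data is not None:
        out_queue.put(data)

def publisher(out_queue):
    # A single thread owns the connection and channel
    connection = pika.BlockingConnection(pika.ConnectionParameters(RABBITMQ_HOST))
    channel = connection.channel()
    channel.exchange_declare(exchange=EXCHANGE, exchange_type='direct')
    while True:
        payload = out_queue.get()
        if payload is None:  # sentinel: stop publishing
            break
        channel.basic_publish(exchange=EXCHANGE, routing_key=KEY, body=payload)
    connection.close()

def stream(video_path, workers=4):
    out_queue = queue.Queue(maxsize=32)  # bounded queue applies back-pressure
    pub_thread = threading.Thread(target=publisher, args=(out_queue,), daemon=True)
    pub_thread.start()
    cap = cv2.VideoCapture(video_path)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while True:
            ret, frame = cap.read()
            if not ret:
                break
            # Encoding happens in worker threads; frame order is no longer
            # guaranteed unless a sequence number is added to each message
            pool.submit(encode_and_queue, frame, out_queue)
    cap.release()
    out_queue.put(None)
    pub_thread.join()

stream('videos/FPV.h264')

The bounded queue keeps encoding from running arbitrarily far ahead of publishing; whether this helps in practice depends on where the real bottleneck is (per-frame process spawning, encoding, or the broker itself).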


-
What is the fastest ffmpeg video codec for decoding?
11 December 2022, by MarkA
I am using ffmpeg on Linux to transcode video files. The files are video from a race car camera. They have been downloaded from Youtube in "webm" format. I want to compare two of the videos, side by side, using GridPlayer, which uses vlc as its underlying video processor. GridPlayer has very nice frame-by-frame controls, but they are very slow. What video codec should I use to impose the least decoding overhead on vlc/GridPlayer for smoother playback?


I have tried re-encoding as h264, 1920x1080, 30 fps, in an mp4 container. I have since discovered a "-tune fastdecode" option that seems to be helpful, along with resizing to 854x480. Any other suggestions?
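For what it's worth, a minimal sketch of the kind of re-encode described above (file names are assumed; the flags are standard ffmpeg CLI options):

import subprocess

subprocess.run([
    'ffmpeg', '-y',
    '-i', 'race_lap.webm',        # hypothetical source downloaded from Youtube
    '-vf', 'scale=854:480',       # downscale, as mentioned above
    '-r', '30',
    '-c:v', 'libx264',
    '-preset', 'fast',
    '-tune', 'fastdecode',        # favour decoder speed over compression efficiency
    '-c:a', 'aac',
    'race_lap_fastdecode.mp4',
], check=True)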


-
Fatal error detected by JRE: libavcodec involved
16 November 2015, by hdf
I'm experiencing this issue whenever I try to run my Java program. Could someone give me clues on how to fix it?
EDIT: the issue only occurs after a new environment has been set up.
#
# A fatal error has been detected by the Java Runtime Environment:
#
#
#
# JRE version: OpenJDK Runtime Environment (7.0_85-b01) (build 1.7.0_85-b01)
# Java VM: OpenJDK 64-Bit Server VM (24.85-b03 mixed mode linux-amd64 compressed oops)
# Derivative: IcedTea 2.6.1
# Distribution: Ubuntu 14.04 LTS, package 7u85-2.6.1-5ubuntu0.14.04.1
# Problematic frame:
# C [libavcodec.so.52.123.0+0x442060]
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# If you would like to submit a bug report, please include
# instructions on how to reproduce the bug and visit:
# http://icedtea.classpath.org/bugzilla
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
--------------- T H R E A D ---------------
Current thread (0x00007f8f70141800): JavaThread "threadpool-14" daemon [_thread_in_native, id=14508, stack(0x00007f8fa73a0000,0x00007f8fa74a1000)]
siginfo:si_signo=SIGSEGV: si_errno=0, si_code=128 (), si_addr=0x0000000000000000
Registers:
RAX=0x00007f8f701d2110, RBX=0x00007f8fa65030d0, RCX=0x00007f8fa688efd0, RDX=0x00007f8fa68904b0
RSP=0x00007f8fa749f380, RBP=0x00007f8fa749f3d8, RSI=0x000000000000000c, RDI=0x00007f8f701d2110
R8 =0x0000000000000000, R9 =0x0000000000000000, R10=0x00007f8fc5eb56e0, R11=0x00007f8fc591af50
R12=0x0000000000000000, R13=0x00007f8f701419d8, R14=0x00007f8fa749f578, R15=0x00007f8f70141800
RIP=0x00007f8fa688f060, EFLAGS=0x0000000000010202, CSGSFS=0x0000000000000033, ERR=0x0000000000000000
TRAPNO=0x000000000000000d
Top of Stack: (sp=0x00007f8fa749f380)
0x00007f8fa749f380: 00007f8fa688f1a5 00007f8fa688ffe5
0x00007f8fa749f390: 00007f8fa6890025 00007f8fa6890065
0x00007f8fa749f3a0: 00007f8fa68900a5 00007f8fa68900e5
0x00007f8fa749f3b0: 00007f8fa6890125 00007f8fa68904b5
0x00007f8fa749f3c0: 00007f8fa6890678 00007f8fa7198f31
0x00007f8fa749f3d0: 0000000000000000 00007f8f70158540
0x00007f8fa749f3e0: 00007f8fa749f468 00007f8fa71966ae
0x00007f8fa749f3f0: 0000000776cfa560 0000000776cfa560...
Instructions: (pc=0x00007f8fa688f060)
0x00007f8fa688f040: c4 e3 5d 0c c0 55 c5 e4 5c c8 c5 e4 58 c0 c5 fc
0x00007f8fa688f050: 29 07 c5 fc 29 4f 20 c3 0f 1f 84 00 00 00 00 00
0x00007f8fa688f060: c5 fc 28 57 40 c5 fc 28 5f 60 c5 ec 5c fb c5 ec
0x00007f8fa688f070: 58 d3 c5 c4 57 3d 26 e0 13 00 c5 ec c6 df be c5
Register to memory mapping:
RAX=0x00007f8f701d2110 is an unknown value
RBX=0x00007f8fa65030d0: av_fft_calc+0 in /usr/local/lib/libavcodec.so.52.123.0 at 0x00007f8fa644d000
RCX=0x00007f8fa688efd0: <offset 0x441fd0> in /usr/local/lib/libavcodec.so.52.123.0 at 0x00007f8fa644d000
RDX=0x00007f8fa68904b0: <offset 0x4434b0> in /usr/local/lib/libavcodec.so.52.123.0 at 0x00007f8fa644d000
RSP=0x00007f8fa749f380 is pointing into the stack for thread: 0x00007f8f70141800
RBP=0x00007f8fa749f3d8 is pointing into the stack for thread: 0x00007f8f70141800
RSI=0x000000000000000c is an unknown value
RDI=0x00007f8f701d2110 is an unknown value
R8 =0x0000000000000000 is an unknown value
R9 =0x0000000000000000 is an unknown value
R10=0x00007f8fc5eb56e0: <offset 0xe006e0> in /usr/lib/jvm/java-7-openjdk-amd64/jre/lib/amd64/server/libjvm.so at 0x00007f8fc50b5000
R11=0x00007f8fc591af50: <offset 0x865f50> in /usr/lib/jvm/java-7-openjdk-amd64/jre/lib/amd64/server/libjvm.so at 0x00007f8fc50b5000
R12=0x0000000000000000 is an unknown value
R13=0x00007f8f701419d8 is an unknown value
R14=0x00007f8fa749f578 is pointing into the stack for thread: 0x00007f8f70141800
R15=0x00007f8f70141800 is a thread
Stack: [0x00007f8fa73a0000,0x00007f8fa74a1000], sp=0x00007f8fa749f380, free space=1020k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
C [libavcodec.so.52.123.0+0x442060]
This occurs on different physical instances, so chances are pretty low that it's a hardware issue. Thanks!