
Other articles (63)
-
Improving the base version
13 September 2013
Nice multiple selection
The Chosen plugin improves the usability of multiple-selection fields. See the two images below to compare.
All it takes is to enable the Chosen plugin (Site configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...) -
Installation in farm mode
4 February 2011
Farm mode makes it possible to host several MediaSPIP sites while installing their functional core only once.
This is the method we use on this very platform.
Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge since the usual SPIP private area is no longer used.
To begin with, you must have installed the same files as the installation (...) -
Automatic backup of SPIP channels
1 April 2010
When setting up an open platform, it is important for hosts to have fairly regular backups available in order to deal with any problem that may arise.
This task relies on two SPIP plugins: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which creates a zip archive of the site's important data (the documents, the elements (...)
On other sites (12626)
-
Unable to retrieve video stream from RTSP URL inside Docker container
6 February, by birdalugur
I have a FastAPI application running inside a Docker container that is trying to stream video from an RTSP camera URL using OpenCV. The setup works fine locally, but when running inside Docker, the /video endpoint does not return a stream and times out. Below are the details of the issue.

Docker Setup:

Dockerfile:


FROM python:3.10.12

# System libraries required by opencv-python (cv2) at import time
RUN apt-get update && apt-get install -y \
    libgl1-mesa-glx \
    libglib2.0-0

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

CMD ["python", "app.py"]




Docker Compose:

services:
  api:
    build: ./api
    ports:
      - "8000:8000"
    depends_on:
      - redis
      - mongo
    networks:
      - app_network
    volumes:
      - ./api:/app
    environment:
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - MONGO_URI=mongodb://mongo:27017/app_db

  frontend:
    build: ./frontend
    ports:
      - "3000:3000"
    depends_on:
      - api
    networks:
      - app_network
    volumes:
      - ./frontend:/app
      - /app/node_modules

  redis:
    image: "redis:alpine"
    restart: always
    networks:
      - app_network
    volumes:
      - redis_data:/data

  mongo:
    image: "mongo:latest"
    restart: always
    networks:
      - app_network
    volumes:
      - mongo_data:/data/db

networks:
  app_network:
    driver: bridge

volumes:
  redis_data:
  mongo_data:




Issue:

When I try to access the /video endpoint, the following warning appears:

[ WARN:0@46.518] global cap_ffmpeg_impl.hpp:453 _opencv_ffmpeg_interrupt_callback Stream timeout triggered after 30037.268665 ms



However, locally, the RTSP stream works fine using OpenCV with the same code.


Additional Information:

- Network: The Docker container can successfully ping the camera IP (10.100.10.94).
- Local Video: I can read frames from a local video file without issues.
- RTSP Stream: I am able to access the RTSP stream directly using OpenCV locally, but not inside the Docker container.








Code:

Here's the relevant part of the code in my api/app.py:

import cv2
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

# FastAPI application instance (created elsewhere in the full app; included so the snippet runs)
app = FastAPI()

RTSP_URL = "rtsp://deneme:155115@10.100.10.94:554/axis-media/media.amp?adjustablelivestream=1&fps=10"

def generate_frames():
    cap = cv2.VideoCapture(RTSP_URL)
    if not cap.isOpened():
        print("Failed to connect to RTSP stream.")
        return

    while True:
        success, frame = cap.read()
        if not success:
            print("Failed to capture frame.")
            break

        # Encode each frame as JPEG for the MJPEG response.
        _, buffer = cv2.imencode(".jpg", frame)
        frame_bytes = buffer.tobytes()

        yield (
            b"--frame\r\n" b"Content-Type: image/jpeg\r\n\r\n" + frame_bytes + b"\r\n"
        )

    cap.release()

@app.get("/video")
async def video_feed():
    """Return MJPEG stream to the browser."""
    return StreamingResponse(
        generate_frames(), media_type="multipart/x-mixed-replace; boundary=frame"
    )



Has anyone faced similar issues or have suggestions on how to resolve this?
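
One difference worth ruling out is the RTSP transport: FFmpeg-based capture negotiates the RTP media over UDP by default, and Docker's bridge/NAT frequently drops those incoming UDP packets even when ping to the camera works. Below is a minimal sketch (assuming opencv-python with the FFmpeg backend) that forces interleaved TCP via the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable, which OpenCV reads as key;value pairs separated by |:

import os

# Force RTSP over TCP before the capture is opened; OpenCV's FFmpeg backend
# parses this variable as "key;value|key;value".
os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "rtsp_transport;tcp"

import cv2

# Same camera URL as in the question.
RTSP_URL = "rtsp://deneme:155115@10.100.10.94:554/axis-media/media.amp?adjustablelivestream=1&fps=10"

cap = cv2.VideoCapture(RTSP_URL, cv2.CAP_FFMPEG)  # explicitly select the FFmpeg backend
print("opened:", cap.isOpened())

ok, frame = cap.read()
print("first frame:", ok, None if frame is None else frame.shape)
cap.release()

If TCP transport alone does not help, running the api service with network_mode: host (Linux only) or testing the same URL with plain ffmpeg inside the container can help separate an OpenCV problem from a Docker networking one.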



 -
Scalable Webinar Features Using Open-Source Tools? [closed]
31 January, by Firas Ben said
I am searching for a scalable webinar solution that can handle 1000+ concurrent users. I have explored platforms like BigBlueButton but encountered scalability limitations in real-world scenarios.


My requirements include:


- Support for RTMP and HLS streaming.
- Chat and screen-sharing functionalities.
- Ability to integrate with custom APIs.








I’d like to know how to address these challenges using open-source tools. For instance:


- What configurations are necessary to scale tools like BigBlueButton for large audiences?
- Are there specific architectural patterns or server setups recommended for handling this user load?






Any guidance or examples would be appreciated.
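
For the streaming piece specifically, the usual open-source pattern is to keep a single ingest point and fan out to viewers over HLS behind a CDN or caching reverse proxy, so concurrency is handled by static-file delivery rather than by the conferencing server. As a hedged illustration only (the ingest URL and output directory below are hypothetical), this Python sketch drives ffmpeg to transmux an RTMP feed to HLS without re-encoding:

import subprocess
from pathlib import Path

# Hypothetical ingest URL and output directory; adjust to your own setup.
RTMP_INGEST = "rtmp://localhost/live/webinar"
HLS_DIR = Path("/var/www/hls")
HLS_DIR.mkdir(parents=True, exist_ok=True)

# Transmux RTMP to HLS with stream copy (-c copy): no transcoding, so a small
# origin can feed a CDN that absorbs the 1000+ concurrent viewers.
cmd = [
    "ffmpeg",
    "-i", RTMP_INGEST,
    "-c", "copy",
    "-f", "hls",
    "-hls_time", "2",
    "-hls_list_size", "5",
    "-hls_flags", "delete_segments",
    str(HLS_DIR / "webinar.m3u8"),
]
subprocess.run(cmd, check=True)

Chat, screen sharing and API integration would still come from a separate component (for example BigBlueButton or a WebRTC SFU); this sketch only covers the one-to-many video path.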


-
iOS Radio App: Need to extract and stream audio-only from HLS streams with video content
20 December 2024, by Bader Alghamdi
I'm developing an iOS radio app that plays various HLS streams. The challenge is that some stations broadcast HLS streams containing both audio and video (example: https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8), but I want to:


- Extract and play only the audio track
- Support AirPlay for audio-only streaming
- Minimize data usage by not downloading video content

Technical Details:

- iOS 17+
- Swift 6
- Using AVFoundation for playback
- Current implementation uses AVPlayer with AVPlayerItem

Current Code Structure:


import AVFoundation
import Combine

class StreamPlayer: ObservableObject {
    @Published var isPlaying = false
    private var player: AVPlayer?
    private var playerItem: AVPlayerItem?

    func playStream(url: URL) {
        let asset = AVURLAsset(url: url)
        playerItem = AVPlayerItem(asset: asset)
        player = AVPlayer(playerItem: playerItem)
        player?.play()
    }
}




Stream Analysis: When analyzing the video stream using FFmpeg:

Input #0, hls, from 'https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8':
  Stream #0:0: Video: h264, yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 25 fps
  Stream #0:1: Audio: aac, 44100 Hz, stereo, fltp




Attempted Solutions:

Using MobileFFmpeg:


let command = [
 "-i", streamUrl,
 "-vn",
 "-acodec", "aac",
 "-ac", "2",
 "-ar", "44100",
 "-b:a", "128k",
 "-f", "mpegts",
 "udp://127.0.0.1:12345"
].joined(separator: " ")

ffmpegProcess = MobileFFmpeg.execute(command)

Issue: While FFmpeg successfully extracts audio, playback through AVPlayer doesn't work reliably.


Tried using HLS output:


let command = [
 "-i", streamUrl,
 "-vn",
 "-acodec", "aac",
 "-ac", "2",
 "-ar", "44100",
 "-b:a", "128k",
 "-f", "hls",
 "-hls_time", "2",
 "-hls_list_size", "3",
 outputUrl.path
]




Issue: Creates temporary files but faces synchronization issues with live streams.



Testing URLs:

Audio+Video: https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8
Audio Only: https://mbcfm-radio.mbc.net/mbcfm-radio.m3u8



Requirements:

- Real-time audio extraction from HLS stream
- Maintain live streaming capabilities
- Full AirPlay support
- Minimal data usage (avoid downloading video content)
- Handle network interruptions gracefully












Questions:

- What's the most efficient way to extract only audio from an HLS stream in real-time?
- Is there a way to tell AVPlayer to ignore video tracks completely?
- Are there better alternatives to FFmpeg for this specific use case?
- What's the recommended approach for handling AirPlay with modified streams?










Any guidance or alternative approaches would be greatly appreciated. Thank you!


What I Tried:

- Direct AVPlayer Implementation:
  - Used standard AVPlayer to play HLS stream
  - Expected it to allow selecting audio-only tracks
  - Result: Always downloads both video and audio, consuming unnecessary bandwidth









- FFmpeg Audio Extraction:

let command = [
 "-i", "https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8",
 "-vn", // Remove video
 "-acodec", "aac", // Audio codec
 "-ac", "2", // 2 channels
 "-ar", "44100", // Sample rate
 "-b:a", "128k", // Bitrate
 "-f", "mpegts", // Output format
 "udp://127.0.0.1:12345" // Local stream
]
ffmpegProcess = MobileFFmpeg.execute(command)




Expected: Clean audio stream that AVPlayer could play
Result: FFmpeg extracts audio but AVPlayer can't play the UDP stream



- HLS Segmented Approach:

let command = [
 "-i", streamUrl,
 "-vn",
 "-acodec", "aac",
 "-f", "hls",
 "-hls_time", "2",
 "-hls_list_size", "3",
 outputUrl.path
]




Expected: Create local HLS playlist with audio-only segments
Result: Creates files but faces sync issues with live stream



Expected Behavior:

- Stream plays audio only
- Minimal data usage (no video download)
- Working AirPlay support
- Real-time playback without delays










Actual Results:

- Either downloads full video stream (wasteful)
- Or fails to play extracted audio
- AirPlay issues with modified streams
- Sync problems with live content
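
On the question of making AVPlayer ignore video: before re-packaging on the device, it can be worth checking whether the source's master playlist already advertises an audio-only rendition that the app could point AVPlayer at directly. A rough, desktop-side Python sketch (not part of the iOS app; the parsing is a heuristic, not a full HLS parser, and the URLs are the ones from the question):

import urllib.request

# Playlist URLs taken from the question.
URLS = [
    "https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8",
    "https://mbcfm-radio.mbc.net/mbcfm-radio.m3u8",
]

def audio_only_candidates(playlist_url):
    """Return playlist lines that look like audio-only renditions.

    A master playlist advertises alternate audio with #EXT-X-MEDIA:TYPE=AUDIO,
    and a variant whose CODECS attribute lists only an audio codec (e.g. mp4a)
    is audio-only. This is a rough check, not a full HLS parser.
    """
    with urllib.request.urlopen(playlist_url, timeout=10) as resp:
        lines = resp.read().decode("utf-8", errors="replace").splitlines()
    hits = []
    for line in lines:
        if line.startswith("#EXT-X-MEDIA:") and "TYPE=AUDIO" in line:
            hits.append(line)
        elif line.startswith("#EXT-X-STREAM-INF:") and "mp4a" in line and "avc1" not in line:
            hits.append(line)
    return hits

for url in URLS:
    print(url)
    for hit in audio_only_candidates(url):
        print("  ", hit)

If no audio-only rendition exists, some form of server-side or on-device re-packaging (as attempted with FFmpeg above) remains necessary to avoid downloading the video track.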