
Media (2)
-
SPIP - plugins - embed code - Example
2 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
Publishing an image simply
13 April 2011, by
Updated: February 2012
Language: French
Type: Video
Other articles (64)
-
Configuring language support
15 November 2010, by
Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administer" section of the site.
From there, in the navigation menu, you can access a "Language management" section that lets you enable support for new languages.
Each newly added language can still be disabled as long as no object has been created in that language. In that case, it becomes greyed out in the configuration and (...)
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Sites made with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page.
On other sites (9472)
-
Audio-Video Delay When Fetching From NGINX-RTMP live-Mode
17 November 2020, by Suuuehgi
We get an RTSP stream and mix it together with line-in over PulseAudio.


This looks something like:


ffmpeg \
 -use_wallclock_as_timestamps 1 -fflags +genpts \
 -max_delay 2000000 -thread_queue_size 1024 \
 -i "rtsp://url" \
 -use_wallclock_as_timestamps 1 -fflags +genpts \
 -max_delay 2000000 -thread_queue_size 1024 \
 -itsoffset <offset> \
 -f pulse \
 [...]


So far so good. This kind of works when fetching the RTSP stream directly.


As soon as we route the RTSP stream through an NGINX-RTMP loopback
(live mode) beforehand,


ffmpeg -i rtsp://url -c copy -an -f flv rtmp://localhost/live

ffmpeg \
 -use_wallclock_as_timestamps 1 -fflags +genpts \
 -max_delay 2000000 -thread_queue_size 1024 \
 -i "rtmp://localhost/live" \
 -use_wallclock_as_timestamps 1 -fflags +genpts \
 -max_delay 2000000 -thread_queue_size 1024 \
 -itsoffset <offset> \
 -f pulse \
 [...]


we get a delay of close to 5 s within the output (audio-video offset).


The configuration of rtmp://localhost/live is:


application live {
    live on;
    sync 10ms;
    record off;
    allow publish 127.0.0.1;
    deny publish all;
}



What causes the delay, and how can we get rid of it?


The RTMP server itself does not cause a noticeable delay, so I assume this is a timestamp issue, but my wisdom ends with the options written above.
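
One possible workaround (an assumption on my part, not something stated in the post) would be to compensate for the roughly 5 s of extra video latency introduced by the RTMP loopback by delaying the audio input's timestamps by the same amount via -itsoffset, which is already where the offset placeholder sits. The 5.0 value below is purely illustrative and would have to be measured for the actual setup:

ffmpeg \
 -use_wallclock_as_timestamps 1 -fflags +genpts \
 -max_delay 2000000 -thread_queue_size 1024 \
 -i "rtmp://localhost/live" \
 -use_wallclock_as_timestamps 1 -fflags +genpts \
 -max_delay 2000000 -thread_queue_size 1024 \
 -itsoffset 5.0 \
 -f pulse \
 [...]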


-
Streaming live video from iOS [closed]
15 February 2018, by John
I need to stream video from the iPhone/iPad camera to a server. It looks like this will need to be done with AVCaptureSession, but I don't know how best to architect this.
I found this post:
streaming video FROM an iPhone
But it doesn't handle the "live" part: latency needs to be 2 or 3 seconds at most. Devices can be constrained to 4 or 4S capability if needed, and there is no requirement for HD; VGA is probably what we'll end up with. I assume any solution would use ffmpeg; I haven't found a more appropriate library.
How is this best accomplished?
-
Stream to Facebook Live using OpenCV
23 May 2022, by Lanzy Erin
I am planning to stream a video file to Facebook Live, but I want to programmatically edit its frames first, for example by adding text. My problem is that I don't know how to properly send the data to Facebook Live. I tried ffmpeg, but it doesn't work.


Here is the code I tried:


import subprocess
import cv2

rtmp_url = "rtmps://live-api-s.facebook.com:443/rtmp/FB-1081417119476224-0-AbwwMK91tFTjFy2j"

path = "7.mp4"
cap = cv2.VideoCapture(path)

# gather video info to pass on to ffmpeg
fps = int(cap.get(cv2.CAP_PROP_FPS))
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

# command and params for ffmpeg
command = ['ffmpeg',
 '-y',
 '-f', 'rawvideo',
 '-vcodec', 'rawvideo',
 '-pix_fmt', 'bgr24',
 '-s', f"{width}x{height}",
 '-r', str(fps),
 '-i', '-',
 '-c:v', 'libx264',
 '-pix_fmt', 'yuv420p',
 '-preset', 'ultrafast',
 '-f', 'flv',
 rtmp_url]

# using subprocess and pipe to fetch frame data
p = subprocess.Popen(command, stdin=subprocess.PIPE)

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        print("frame read failed")
        break

    # YOUR CODE FOR PROCESSING FRAME HERE
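
    # Purely illustrative (not part of the original post): one way to "edit"
    # a frame before streaming it is to overlay text with cv2.putText.
    cv2.putText(frame, "LIVE", (30, 60),
                cv2.FONT_HERSHEY_SIMPLEX, 1.5, (0, 0, 255), 3)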

    # write to pipe
    p.stdin.write(frame.tobytes())