
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (35)
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first stable release of MediaSPIP.
Its official release date is June 21, 2013, and it is announced here.
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
Libraries and binaries specific to video and audio processing
31 January 2010
The following software and libraries are used by SPIPmotion in one way or another.
Required binaries: FFmpeg: the main encoder; it transcodes almost all types of video and audio files into formats playable on the Internet (see this tutorial for its installation); Oggz-tools: tools for inspecting Ogg files; MediaInfo: retrieves information from most video and audio formats;
Optional additional binaries: flvtool2: (...) -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match the chosen theme.
These technologies make it possible to deliver video and audio both to conventional computers (...)
On other sites (6027)
-
FFMPEG multi livestream - recorded stream sent to different services like YT and Twitch at different times (on different button clicks)
4 October 2022, by Ganesh
Trying for the last 10 days and still no success: I am creating a Python application that will accept a URL, visit that URL using Chromium, capture that screen, and send that real-time screen recording to different livestream acceptors such as YouTube Live, Twitch, Twitter, Facebook Live or some other services, and several of these could be active at the same time.


There are two challenges (both depend on a user action, such as different button clicks):


- At the time the livestream starts, only one livestream acceptor is known; other acceptors could be added via another API at any time, or might never be added during the whole livestream.
- Any of the streams could be stopped at any moment, including the first one, which started the original livestreaming service.






To solve these challenges I am trying the following process (I used an MP4 file as the source to simplify things):


- Create a stream and store it into PIPE.stdout:




import subprocess as sp

# Read test.mp4 in real time (-re) and write an FLV stream to stdout (pipe:1)
ffmpeg_Command_get_stream = 'ffmpeg -re -i test.mp4 -f flv pipe:1'
ffmpeg_Command_get_stream = ffmpeg_Command_get_stream.split()
pipe = sp.Popen(ffmpeg_Command_get_stream,
                stdout=sp.PIPE,
                stderr=sp.PIPE,
                bufsize=8000000,
                shell=True,
                universal_newlines=True)
out, err = pipe.communicate()



- Then send that stream, with the help of FFmpeg, to the livestream acceptor when the YouTube livestream button is clicked (a wiring sketch follows this list):


ffmpeg_Command_send_stream = ['ffmpeg','-i',pipe.stdout,'-f','flv',RTMPURL_YOUTUBE]
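
As a point of reference, here is a minimal sketch, not the original poster's code, of how the two ffmpeg processes could be connected, with the first process's stdout passed directly as the second process's stdin; RTMPURL_YOUTUBE is a placeholder value.

import subprocess as sp

RTMPURL_YOUTUBE = "rtmp://a.rtmp.youtube.com/live2/STREAM-KEY"  # placeholder, not a real key

# First process: read the source in real time (-re) and write FLV to stdout.
get_stream = sp.Popen(
    ['ffmpeg', '-re', '-i', 'test.mp4', '-f', 'flv', 'pipe:1'],
    stdout=sp.PIPE)

# Second process: read FLV from stdin ('pipe:0') and push it to the RTMP endpoint.
# Passing get_stream.stdout as stdin connects the two processes directly.
send_stream = sp.Popen(
    ['ffmpeg', '-i', 'pipe:0', '-c', 'copy', '-f', 'flv', RTMPURL_YOUTUBE],
    stdin=get_stream.stdout)

get_stream.stdout.close()  # let the first process notice if the second one exits
send_stream.wait()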






Update: trying to explain it a little more:


Step 1: I need a real-time stream from the first command, so I used -re in FFmpeg.


Step 2: use the above stream as the input of another command and send its output as a livestream to YouTube (or Twitch/Facebook). But this second step should happen only when the user clicks the "YT LiveStream" button. The tricky thing is that there are multiple buttons (YT LiveStream, Twitch LiveStream, Facebook LiveStream), the user can click any of them at any time, and can also click all of them one by one (a rough sketch of one possible approach follows).
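
For illustration only, here is a minimal sketch, not the original poster's code, of how per-service output processes could be started and stopped from button clicks while a single reader fans the source stream out to every active output. All names (outputs, start_output, stop_output, fan_out) and the 64 KiB chunk size are assumptions. Note that an output started mid-stream will not have received the container header, which is why a local relay such as an nginx-rtmp server is often used for this kind of dynamic fan-out instead.

import subprocess as sp
import threading

# Illustrative bookkeeping: one ffmpeg process per active service.
outputs = {}                 # service name -> ffmpeg process pushing to that RTMP URL
outputs_lock = threading.Lock()

def start_output(service_name, rtmp_url):
    # Called from a button click: start pushing the shared stream to one endpoint.
    with outputs_lock:
        if service_name not in outputs:
            outputs[service_name] = sp.Popen(
                ['ffmpeg', '-i', 'pipe:0', '-c', 'copy', '-f', 'flv', rtmp_url],
                stdin=sp.PIPE)

def stop_output(service_name):
    # Called from a button click: stop one endpoint without touching the others.
    with outputs_lock:
        proc = outputs.pop(service_name, None)
    if proc is not None:
        proc.stdin.close()
        proc.terminate()

def fan_out(source_stdout):
    # Copy the shared source stream to every currently active output process.
    for chunk in iter(lambda: source_stdout.read(65536), b''):
        with outputs_lock:
            procs = list(outputs.values())
        for proc in procs:
            proc.stdin.write(chunk)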




Sorry for the bad explanation.


What am I doing wrong? Is this possible, or do I need to go with another process?


Any help would be greatly appreciated.


-
Why doesn't the ffmpeg output display the stream in the browser? [closed]
10 May 2024, by Tebyy
Why is it that when I create a livestream in Python using ffmpeg, and then open the page in a browser, the page keeps loading continuously and I see binary data in the PyCharm logs? There are no errors displayed, and the code seems correct to me. I even tried saving to a file for testing purposes, and when I play that video, everything works fine. Does anyone know what might be wrong here?


Code:


def generate_frames():
    cap = cv2.VideoCapture(os.path.normpath(app_root_dir().joinpath("data/temp", "video-979257305707693982.mp4")))
    while cap.isOpened():
        ret, frame = cap.read()
        if not ret:
            break

        yield frame


@app.route('/video_feed')
def video_feed():
    ffmpeg_command = [
        'ffmpeg', '-f', 'rawvideo', '-pix_fmt', 'bgr24',
        '-s:v', '1920x1080', '-r', '60',
        '-i', '-', '-vf', 'setpts=2.5*PTS',  # Video Speed
        '-c:v', 'libvpx-vp9', '-g', '60', '-keyint_min', '60',
        '-b:v', '6M', '-minrate', '4M', '-maxrate', '12M', '-bufsize', '8M',
        '-crf', '0', '-deadline', 'realtime', '-tune', 'psnr', '-quality', 'good',
        '-tile-columns', '6', '-threads', '8', '-lag-in-frames', '16',
        '-f', 'webm', '-'
    ]
    ffmpeg_process = subprocess.Popen(ffmpeg_command, stdin=subprocess.PIPE, stderr=subprocess.PIPE, bufsize=-1)
    frames_generator = generate_frames()
    for frame in frames_generator:
        ffmpeg_process.stdin.write(frame)
        ffmpeg_process.stdin.flush()

    ffmpeg_process.stdin.close()
    ffmpeg_process.wait()

    def generate_video_stream(process):
        startTime = time.time()
        buffer = []
        sentBurst = False
        for chunk in iter(lambda: process.stderr.read(4096), b''):
            buffer.append(chunk)

            # Minimum buffer time, 3 seconds
            if sentBurst is False and time.time() > startTime + 3 and len(buffer) > 0:
                sentBurst = True
                for i in range(0, len(buffer) - 2):
                    print("Send initial burst #", i)
                    yield buffer.pop(0)

            elif time.time() > startTime + 3 and len(buffer) > 0:
                yield buffer.pop(0)

            process.poll()
            if isinstance(process.returncode, int):
                if process.returncode > 0:
                    print('FFmpeg Error', process.returncode)

                break

    return Response(stream_with_context(generate_video_stream(ffmpeg_process)), mimetype='video/webm', content_type="video/webm; codecs=vp9", headers=Headers([("Connection", "close")]))
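
For comparison, here is a minimal sketch, not the code from the question, of the more usual pattern in which the encoded WebM bytes are read from the encoder's stdout (stderr only carries ffmpeg's log output) and streamed in the Flask response; the route name and the input file are placeholders.

import subprocess
from flask import Flask, Response, stream_with_context

app = Flask(__name__)

@app.route('/video_feed_sketch')        # illustrative route name
def video_feed_sketch():
    # Encode a placeholder input to WebM and write the result to stdout ('-').
    process = subprocess.Popen(
        ['ffmpeg', '-re', '-i', 'input.mp4',
         '-c:v', 'libvpx-vp9', '-f', 'webm', '-'],
        stdout=subprocess.PIPE)

    def stream():
        # The encoded media comes out of stdout, chunk by chunk.
        for chunk in iter(lambda: process.stdout.read(4096), b''):
            yield chunk

    return Response(stream_with_context(stream()), mimetype='video/webm')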




-
MPEG DASH: Playing video segments from .m4s files instead of byte ranges in the MPD file using Simple DASH Player
1 November 2018, by Trycoder
I have followed the tutorial linked below to create an MPEG-DASH player using HTML5 and JavaScript: Building a simple MPEG dash player using HTML5 and JS. In the given tutorial, it is possible to play the video segments using byte ranges, but for my application I need the following features:
- The video player should read the .m4s segment files and play the video, instead of using byte ranges.
- The amount of data in the MediaSource buffer should be calculated periodically, as well as the amount of space left in the buffer.
Are these possible in the given player, or is there a better MPEG-DASH player with the above features?
PS: The MPD file is created using FFmpeg.
ffmpeg -f avfoundation -video_size 1280x720 -framerate 30 -i 0 -vcodec libx264 -acodec aac -b:v 800k -f dash -use_template 0 -min_seg_duration 4000 -single_file 1 -start_at_zero -live 1 ffmpeg.mpd
Also, I tried using the dash.js player, but its documentation is very extensive. Can we get the size of the source buffer in dash.js (the space available in the source buffer and also the amount of space filled)? This is the main feature required for my project.
Edit:
Code which I tried