
Other articles (111)
-
The SPIPmotion queue
28 November 2010
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table consists of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
-
The farm's regular Cron tasks
1 December 2010
Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of all the sites in the shared hosting (mutualisation) on a regular basis. Combined with a system Cron on the central site of the shared hosting, this generates regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)
-
The authorisations overridden by the plugins
27 April 2010
Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page
Sur d’autres sites (8971)
-
Using ffmpeg to stream a static image and audio input over RTMP [on hold]
30 October 2017, by Chad
Using a Raspberry Pi, stream audio in and use a static image as the video input through ffmpeg over RTMP to a cloud video provider (DaCast in this instance).
So far, I've gone through many blog posts, Stack Overflow questions, and package documentation. I've found that most of the posts are no longer valid with newer versions of ffmpeg, or don't quite line up with what I am trying to achieve.
However, I have figured out the right settings to stream the Raspberry Pi Camera v2 with the audio in.
ffmpeg \
-f alsa -ac 1 -i plughw:1,0 \
-f v4l2 -s 1920x1080 -r 30 -input_format h264 -i /dev/video0 \
-vcodec copy -preset veryfast -r 15 -g 30 -b:v 64k -ar 44100 -threads 6 -b:a 96k -bufsize 3000k \
-f flv rtmp://streaming_server_url
But can’t seem to get it right to replace the video input with a static image.
I have tried removing the 3rd line and adding
-loop 1 -i '/path/to/image.jpg'
The logs look like:
[alsa @ 0x55e4b980] Thread message queue blocking; consider raising the thread_queue_size option (current value: 1024)
[alsa @ 0x55e4b980] ALSA buffer xrun.
0kB time=00:00:00.00 bitrate=N/A speed= 0x
[alsa @ 0x55e4b980] ALSA buffer xrun.
130kB time=00:00:00.27 bitrate=3822.5kbits/s speed=0.0403x
...
I have also tried looping a 4 second video, with similar outcomes.
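For reference, here is a minimal sketch of the kind of command that might work for this, assuming the still image has to be re-encoded with libx264 (a JPEG cannot be stream-copied the way the camera's H.264 output can). The image path, the streaming URL, and the -framerate / -thread_queue_size values are placeholders and assumptions, not something tested on this hardware:

# Sketch only: loop a still image as the video source, capture ALSA audio, push over RTMP
ffmpeg \
-re -loop 1 -framerate 5 -i /path/to/image.jpg \
-f alsa -ac 1 -thread_queue_size 4096 -i plughw:1,0 \
-c:v libx264 -preset veryfast -tune stillimage -pix_fmt yuv420p -r 15 -g 30 -b:v 64k \
-c:a aac -ar 44100 -b:a 96k \
-bufsize 3000k -f flv rtmp://streaming_server_url

Raising -thread_queue_size on the ALSA input is the usual response to the "Thread message queue blocking" warning in the log above, and -re keeps the looped image from being read faster than real time.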
My setup, for context:
- Raspberry Pi 3 Model B
- USB Audio Device (Sabrent USB External Stereo Sound Adapter)
- Ubuntu MATE 16.04.2 (Xenial)
- ffmpeg version 3.2-2+rpi1 xenial1.7 (I can post what is configured with the build, if needed)
-
Using ffmpeg with Flash Media Server and HDS
20 April 2012, by Jonathan
I want to use ffmpeg to encode and publish a live stream to Flash Media Server. In order to support iOS devices, I need to implement HTTP Live Streaming as well. The video needs to be in H.264 format and the audio should be AAC. I don't have much experience working with ffmpeg, and I'm having a hard time getting this to work. This is the command that I've tried (and some variations as well):
ffmpeg.exe -threads 15 -f dshow -i video="USB2.0 UVC WebCam":audio="Microphone (Realtek High Defini" \
-map_channel 0.1.1 -r 24 -acodec libvo_aacenc -ar 22050 -ab 128k -vcodec libx264 \
-s vga -vb 100k -f flv "rtmp:///livepkgr/livestream1?adbe-live-event=liveevent" \
-r 24 -acodec libvo_aacenc -ar 22050 -ab 128k -vcodec libx264 -s qvga -vb 200k \
-f flv "rtmp:///livepkgr/livestream2?adbe-live-event=liveevent" \
-r 24 -acodec libvo_aacenc -ar 22050 -ab 128k -vcodec libx264 -s vga -vb 350k \
-f flv "rtmp:///livepkgr/livestream3?adbe-live-event=liveevent"

When I run this, it appears to connect to FMS, but then I get a lot of error messages about dropped frames; I'm not sure whether ANY frames get encoded successfully. My CPU usage is very high as well. I get a 404 error from FMS when I enter the URL of the *.m3u8 file for one of the individual streams (the main livestream.m3u8 file is accessible, though). I have also tried outputting to a file instead of FMS, with no success. All I get is some very garbled sound and no video.
Any suggestions for what options/commands I should use to get this working? Is anyone using ffmpeg with FMS to do HTTP Dynamic Streaming / HLS with MP4 video? I've been struggling to get HDS/HLS working for some time now, and any help would be much appreciated! It shouldn't make a difference, but I'm using FMS on Amazon EC2 with their AMI image.
Thanks!
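For comparison, here is a minimal sketch of a single-rendition version that is sometimes used as a first test against livepkgr. The FMS host name is a placeholder (the question's URLs omit it), the capture device names are copied from the question as-is, and the AAC encoder depends on how ffmpeg was built (libvo_aacenc as in the question, or the native aac encoder in newer builds):

# Sketch only: one DirectShow capture, one H.264/AAC rendition published to livepkgr
ffmpeg -f dshow -i video="USB2.0 UVC WebCam":audio="Microphone (Realtek High Defini" \
-vcodec libx264 -preset veryfast -s vga -vb 350k -g 48 -r 24 \
-acodec libvo_aacenc -ar 22050 -ab 128k \
-f flv "rtmp://fms-host/livepkgr/livestream?adbe-live-event=liveevent"

Starting from one rendition like this makes it easier to tell whether the dropped frames come from the encoder running out of CPU or from the connection to FMS; the additional bitrates can be added back one at a time once a single stream plays cleanly.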