
Other articles (88)
-
Customising by adding your logo, banner or background image
5 September 2013 — Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013 — Present changes to your MédiaSPIP, or news from your projects, on your MédiaSPIP using the news section.
In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customise the news-item creation form.
News-item creation form: for a document of type news item, the default fields are: publication date (customise the publication date) (...) -
Publishing on MédiaSpip
13 June 2013 — Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.
On other sites (15574)
-
Nginx rtmp module - on_publish fires multiple times instead of once
29 July 2017, by Stephen Wright — This is copied and pasted from the bug report I created on the rtmp-module by Arut. I am not completely sure whether it is a bug or me not understanding how the module works; I have read all of the module's directives at https://github.com/arut/nginx-rtmp-module/wiki/Directives
A proper explanation follows; if the code is not displayed properly I will edit and fix it.
Hi, I've been using the module and finding it very good!
I think I have found an issue, though it may be me misunderstanding the directives.
Essentially I want to fire a script (/usr/local/bin/make_thumbnail.sh) which creates a thumbnail automatically from a stream (using ffmpeg). The idea is to do this for every stream as soon as it is published, to create a function a bit like Twitch TV where the streamer does not have to specify any thumbnail image; authenticated users simply start a stream (streams will later be authenticated, but are not yet). The script also writes data into the database, but that stage works fine and I don't believe the issue is related: if I comment out those lines, the thumbnail creation still works and my issue continues.
Initially this was done using the exec directive, as I believe I misread the documentation; I now think exec does not fit my problem, since "When publishing stops the process is terminated." Does this mean it will execute continually until the stream stops?
I have started using the exec_publish directive to try to fix this, however the same issue occurs: the entire script repeats approximately every 15-17 seconds, a new thumbnail is created, and a new database entry is created with all the correct information.
Below is the relevant nginx.conf section. Please ignore any incorrect indentation; I couldn't see a way to indent blocks of code and it's late here. Assume all code is indented correctly unless you believe that could be the issue, in which case I will post it properly indented as soon as I can.
application live {
allow play all;
live on;
record all;
record_path /var/stream/video_recordings/;
record_unique on;
hls on;
hls_nested on;
hls_path /var/stream/HLS/live;
hls_fragment 10s;
# on publish, create a thumbnail using the first second of the stream and save it in /var/stream/video_recordings/thumbnails
exec_publish usr/local/bin/make_thumbnail.sh $name;
The rest can be pasted or attached if needed, but it is a working nginx config for rtmp + the website.
The simplest version of make_thumbnail.sh is pasted below. I have omitted the variables I use for the database entries, but as the script works without fail from the terminal I believe this to be an nginx issue (if I run the command manually under the nginx user, e.g. sudo -u nginx /usr/local/bin/make_thumbnail.sh with a name matching any running stream, it works and only executes once as I would expect; all permissions in the script are OK and tested).
make_thumbnail.sh
#!/bin/bash
TIME=$(date +%s)
NAME=$1
echo "time: "
FILENAME=${TIME}_${NAME}
# Grab a single frame from the live RTMP stream and save it as the thumbnail
ffmpeg -i rtmp://192.168.0.98:1935/live/$1 -vframes 1 -s 150x150 -ss 10 -strftime 1 /var/stream/video_recordings/thumbnails/"$FILENAME.jpg";
#Writes path to video into database
mysql --user=$DB_USER --password=$DB_PASSWD $DB_NAME << EOF
INSERT INTO $TABLE3 (thumbnailfile) VALUES ('$FILENAME');
set @last_id_in_thumbnails = LAST_INSERT_ID();
INSERT INTO $TABLE (created_at, updated_at, thumnailID) VALUES
(NOW(),NOW(),@last_id_in_thumbnails);
SET @last_id_in_livestreams = LAST_INSERT_ID();
INSERT INTO $TABLE2 (created_at, updated_at, filename,liveID) VALUES
(NOW(),NOW(),'$FILENAME',@last_id_in_livestreams);
EOF
I have not got the nginx rtmp logs set up; I can obviously do this, however some of the logs appear in the nginx error.log. Strangely, the latest stream I tried did not show up in the access log, however I think this is because I did not attempt to connect to it via any method. I don't fully understand the error.log; in my inexperience I decided to use nginx, with which I am not very familiar, and I am finding it very difficult to troubleshoot this issue. It appears to me that, as part of the RTMP protocol, either my streaming software (OBS) is directly pinging the rtmp stream or it is being pinged by the server to ensure the connection is still there, and this ping is what seems to trigger the script again.
I have left a stream running for approximately 4 minutes without interacting with the server, the streaming software, or the computer running the stream, and I ensured the internet connection was constant, as my first thought was that the connection had dropped. However, on inspecting the database, the execution always repeats after at least 11 seconds, though usually 16; I can't figure out how to select the closest dates from the database, but there have been at least a few 17-second differences (potentially when…).
I am unsure whether this is an issue or intended behaviour, but I do require this to finish a university degree. I'm not asking for answers, but if it is a legitimate issue then I would be happy to spend as much time as I can commit on it, given some insight into what is causing it; and if there is a workaround, I believe it should be documented somewhere. I have googled how to make exec commands run only once on publish. I can't seem to pinpoint where in the log the issue is happening, however I think it is something to do with the excerpts below. I would attach the file, but I can't seem to select all the lines after the timestamp of starting a stream.
2017/07/26 18:17:35 [info] 1451#0: *2229 exec: starting managed child
'ffmpeg', client: 192.168.0.78, server: 0.0.0.0:1935
2017/07/26 18:17:35 [info] 1451#0: *2412 client connected '192.168.0.98'
2017/07/26 18:17:35 [info] 1451#0: *2412 connect: app='live' args=''
flashver='LNX 9,0,124,2' swf_url='' tc_url='rtmp://192.168.0.98:1935/live'
page_url='' acodecs=4071 vcodecs=252 object_encoding=0, client:
192.168.0.98, server: 0.0.0.0:1935
2017/07/26 18:17:35 [info] 1451#0: *2412 createStream, client: 192.168.0.98,
server: 0.0.0.0:1935
2017/07/26 18:17:35 [info] 1451#0: *2412 play: name='newname' args=''
start=-2000 duration=0 reset=0 silent=0, client: 192.168.0.98, server:
0.0.0.0:1935
2017/07/26 18:17:36 [info] 1451#0: *2410 recv() failed (104: Connection
reset by peer), client: 192.168.0.98, server: 0.0.0.0:1935
2017/07/26 18:17:36 [info] 1451#0: *2410 disconnect, client: 192.168.0.98,
server: 0.0.0.0:1935
2017/07/26 18:17:36 [info] 1451#0: *2410 deleteStream, client: 192.168.0.98,
server: 0.0.0.0:1935
2017/07/26 18:17:36 [notice] 1451#0: signal 17 (SIGCHLD) received
2017/07/26 18:17:36 [notice] 1451#0: unknown process 10487 exited with code
0
2017/07/26 18:17:36 [info] 1451#0: *2229 exec: child 10487 exited; ignoring,
client: 192.168.0.78, server: 0.0.0.0:1935
2017/07/26 18:17:41 [info] 1451#0: *2229 exec: starting managed child
'usr/local/bin/make_thumbnail.sh', client: 192.168.0.78, server:
0.0.0.0:1935
2017/07/26 18:17:41 [info] 1451#0: *2413 client connected '192.168.0.98'
2017/07/26 18:17:41 [info] 1451#0: *2413 connect: app='live' args=''
flashver='LNX 9,0,124,2' swf_url='' tc_url='rtmp://192.168.0.98:1935/live'
page_url='' acodecs=4071 vcodecs=252 object_encoding=0, client:
192.168.0.98,
server: 0.0.0.0:1935
2017/07/26 18:17:41 [info] 1451#0: *2413 createStream, client: 192.168.0.98,
server: 0.0.0.0:1935
2017/07/26 18:17:41 [info] 1451#0: *2413 play: name='newname' args=''
start=-2000 duration=0 reset=0 silent=0, client: 192.168.0.98, server:
0.0.0.0:1935
2017/07/26 18:17:43 [info] 1451#0: *2229 exec: starting managed child
'ffmpeg',
client: 192.168.0.78, server: 0.0.0.0:1935
2017/07/26 18:17:43 [info] 1451#0: *2414 client connected '192.168.0.98'
2017/07/26 18:17:43 [info] 1451#0: *2414 connect: app='live' args=''
flashver='LNX 9,0,124,2' swf_url='' tc_url='rtmp://192.168.0.98:1935/live'
page_url='' acodecs=4071 vcodecs=252 object_encoding=0, client:
192.168.0.98,
server: 0.0.0.0:1935
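As a possible workaround while I investigate, I am considering putting a lock-file guard in front of the thumbnail step so that repeated invocations for the same stream name exit early. This is only an untested sketch of my own; the wrapper name and the lock directory /tmp/make_thumbnail_locks are arbitrary choices, not anything from the module documentation.
make_thumbnail_once.sh
#!/bin/bash
# Untested wrapper around make_thumbnail.sh: only the first invocation per
# stream name does any work, later invocations exit immediately.
NAME=$1
LOCKDIR=/tmp/make_thumbnail_locks   # arbitrary location, my own choice
mkdir -p "$LOCKDIR"
# mkdir on the per-stream directory is atomic, so only one invocation
# for a given stream name can succeed in creating it.
if ! mkdir "$LOCKDIR/$NAME" 2>/dev/null; then
    exit 0
fi
/usr/local/bin/make_thumbnail.sh "$NAME"
I would then point exec_publish at this wrapper and remove the per-stream lock directory from an exec_publish_done script, so that a new publish of the same stream name can create a fresh thumbnail.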
-
Homepage Design: Best Practices & Examples
5 October 2022, by Erin -
ffmpeg configuration difficulty with filter_complex and hls
4 February 2020, by akc42 — I am trying to set up ffmpeg so that it will record from a microphone and at the same time encode the result into a .flac file, for later syncing up with some video I will be making.
The microphone is plugged into a Raspberry Pi (4B); I am currently trying it with a Blue Yeti mic, but I think I can do the same with a Focusrite Scarlett 2i2 plugged in instead. I was puzzling over how to start the recording on the server and decided I could do it from a web browser if I made a simple nodejs server that spawned ffmpeg as a child process.
But then I was inspired by this sample ffmpeg command, which displays a volume meter (on my desktop, which has a graphical interface):
ffmpeg -hide_banner -i 'http://distribution.bbb3d.renderfarming.net/video/mp4/bbb_sunflower_1080p_30fps_normal.mp4' -filter_complex "showvolume=rate=25:f=0.95:o=v:m=p:dm=3:h=80:w=480:ds=log:s=2" -c:v libx264 -c:a aac -f mpegts - | ffplay -window_title "Peak Volume" -i -
What if I could stream the video produced by the showvolume filter to the web browser that I am using to control the ffmpeg process? (Note: I don't want to send the audio with this.) So I tried to read up on HLS (since the control device will be an iPad - in fact that is what I will record the video on), and came up with this command:
ffmpeg -hide_banner -f alsa -ac 2 -ar 48k -i hw:CARD=Microphone -filter_complex "asplit=2[main][vol],[vol]showvolume=rate=25:f=0.95:o=v:m=p:dm=3:h=80:w=480:ds=log:s=2[vid]" -map [main] -c:a:0 flac recordings/session_$(date +%a_%d_%b_%Y___%H_%M_%S).flac -map [vid] -preset veryfast -g 25 -an -sc_threshold 0 -c:v:1 libx264 -b:v:1 2000k -maxrate:v:1 2200k -bufsize:v:3000k -f hls -hls_time 4 -hls_flags independent_segments delete_segments -strftime 1 -hls_segment_filename recordings/volume-%Y%m%d-%s.ts recordings/volume.m3u8
The problem is that I am finding the documentation a bit opaque as to what happens once I have generated two streams - the main audio and a video stream - and this command throws both a warning and an error:
The warning is
Guessed Channel Layout for Input Stream #0.0 : stereo
and the error is
[NULL @ 0x1baa130] Unable to find a suitable output format for 'hls'
hls: Invalid argument
What I am trying to do is set up the stream labels [main] and [vol] as I split the incoming audio into two parts, then pass [vol] through the showvolume filter and end up with stream [vid].
I think I then need to use -map to specify encoding the [main] stream down to flac and writing it out to a file (the file exists after I run the command, although it has zero length), and use another -map to pass [vid] through to the -f hls section. But I think I have something wrong at this stage. Can someone help me get this command right?
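For reference, this is my current best guess at a corrected form of the command. It is only an unverified sketch; the changes are my own assumptions: a ';' between the two filter chains instead of a ',', per-output options placed directly before each output file, the hls_flags joined with '+', plain (non-indexed) codec and bitrate options because each output only carries one stream, and an explicit -pix_fmt yuv420p for player compatibility:
ffmpeg -hide_banner -f alsa -ac 2 -ar 48000 -i hw:CARD=Microphone \
  -filter_complex "asplit=2[main][vol];[vol]showvolume=rate=25:f=0.95:o=v:m=p:dm=3:h=80:w=480:ds=log:s=2[vid]" \
  -map "[main]" -c:a flac recordings/session_$(date +%a_%d_%b_%Y___%H_%M_%S).flac \
  -map "[vid]" -c:v libx264 -preset veryfast -g 25 -sc_threshold 0 \
  -b:v 2000k -maxrate 2200k -bufsize 3000k -pix_fmt yuv420p \
  -f hls -hls_time 4 -hls_flags independent_segments+delete_segments \
  -strftime 1 -hls_segment_filename recordings/volume-%Y%m%d-%s.ts \
  recordings/volume.m3u8
If this is along the right lines, the flac output and the HLS output should each pick up only the options written between their own -map and their output path.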