
Other articles (45)
-
Automatic backup of SPIP channels
1 April 2010
As part of setting up an open platform, it is important for hosts to have fairly regular backups on hand in order to recover from any problem that may arise.
To carry out this task, two SPIP plugins are used: Saveauto, which makes a regular backup of the database in the form of a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which builds a zip archive of the site's important data (documents, elements (...) A rough sketch of this backup approach is given just after this article list. -
Automatic installation script for MediaSPIP
25 April 2011
To overcome the installation difficulties caused mainly by server-side software dependencies, an "all-in-one" bash installation script was created to make this step easier on a server running a compatible Linux distribution.
You must have SSH access to your server and a "root" account in order to use it, which is what allows the dependencies to be installed. Contact your host if you do not have these.
The documentation on using the installation script (...) -
Automated installation script for MediaSPIP
25 April 2011
To overcome the difficulties caused mainly by server-side software dependencies, an "all-in-one" installation script written in bash was created to facilitate this step on a server with a compatible Linux distribution.
You must have SSH access to your server and a root account in order to use it, so that the dependencies can be installed. Contact your provider if you do not have them.
The documentation on using this installation script is available here.
The code of this (...)
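As a rough illustration of the backup approach described in the first article above (a regular MySQL dump plus a zip archive of the site's important files), and not the actual code of the Saveauto or mes_fichiers_2 plugins, a cron-driven shell script could look like the sketch below. All paths, the database name, and credentials handling are placeholders.

#!/bin/bash
# Hypothetical sketch of the backup approach described above: a MySQL dump
# of the SPIP database plus a zip archive of the site's important files.
# Paths, database name and credentials handling are placeholders.

BACKUP_DIR=/var/backups/spip
SITE_DIR=/var/www/spip
DB_NAME=spip
STAMP=$(date +%Y%m%d-%H%M)

mkdir -p "$BACKUP_DIR"

# Database dump, re-importable through phpMyAdmin or the mysql client
mysqldump --single-transaction "$DB_NAME" > "$BACKUP_DIR/db-$STAMP.sql"

# Zip archive of the uploaded documents and the configuration
zip -qr "$BACKUP_DIR/files-$STAMP.zip" "$SITE_DIR/IMG" "$SITE_DIR/config"

Run from cron (daily, for example), this gives the kind of regular, restorable backups the article describes.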
On other sites (4835)
-
Use ffmpeg to stream live content to Azure Media Services
19 March 2018, by Dadicool
I've been trying to stream content to Azure Media Services using ffmpeg, as it's one of the options described here: http://azure.microsoft.com/blog/2014/09/18/azure-media-services-rtmp-support-and-live-encoders/
My command is:
ffmpeg -v verbose -i 300.mp4 -strict -2 -c:a aac -b:a 128k -ar 44100 -r 30 -g 60 -keyint_min 60 -b:v 400000 -c:v libx264 -preset medium -bufsize 400k -maxrate 400k -f flv rtmp://nessma-****.channel.mediaservices.windows.net:1935/live/584c99f5c47f424d9e83ac95364331e7
I have made sure that the streaming endpoint has one active streaming unit, and that the channel is actually Ready; I even got it to start streaming (which makes a PublishURL available).
When I execute the ffmpeg command to start streaming, I keep getting the following error:
ffmpeg version 2.5.2 Copyright (c) 2000-2014 the FFmpeg developers
built on Dec 30 2014 11:31:18 with llvm-gcc 4.2.1 (LLVM build 2336.11.00)
configuration: --prefix=/Volumes/Ramdisk/sw --enable-gpl --enable-pthreads --enable-version3 --enable-libspeex --enable-libvpx --disable-decoder=libvpx --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-avfilter --enable-libopencore_amrwb --enable-libopencore_amrnb --enable-filters --enable-libgsm --enable-libvidstab --enable-libx265 --arch=x86_64 --enable-runtime-cpudetect
libavutil 54. 15.100 / 54. 15.100
libavcodec 56. 13.100 / 56. 13.100
libavformat 56. 15.102 / 56. 15.102
libavdevice 56. 3.100 / 56. 3.100
libavfilter 5. 2.103 / 5. 2.103
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 1.100 / 1. 1.100
libpostproc 53. 3.100 / 53. 3.100
Routing option strict to both codec and muxer layer
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f9a0a002c00] overread end of atom 'colr' by 1 bytes
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f9a0a002c00] stream 0, timescale not set
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f9a0a002c00] max_analyze_duration 5000000 reached at 5003637 microseconds
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '300.mp4':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: mp42isomavc1
creation_time : 2014-01-11 05:39:32
genre : Trailer
artist : Warner Bros.
title : 300: Rise of an Empire - Trailer 2
encoder : HandBrake 0.9.9 2013051800
date : 2014
Duration: 00:02:33.24, start: 0.000000, bitrate: 7377 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080 (1920x1088), 7219 kb/s, 23.98 fps, 23.98 tbr, 90k tbn, 47.95 tbc (default)
Metadata:
creation_time : 2014-01-11 05:39:32
encoder : JVT/AVC Coding
Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 157 kb/s (default)
Metadata:
creation_time : 2014-01-11 05:39:32
Stream #0:2: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 101x150 [SAR 72:72 DAR 101:150], 90k tbr, 90k tbn, 90k tbc
rtmp://nessma-****.channel.mediaservices.windows.net:1935/live/584c99f5c47f424d9e83ac95364331e7: Input/output error
The Azure blog post clearly states that this should be possible, but I can't find a working example anywhere.
Environment:
- Mac OS X Mavericks
- FFmpeg installed from an official build
- 300.mp4: 1080p trailer of the latest 300 movie
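One way to narrow down an Input/output error like the one above, sketched here rather than taken from the original post, is to separate the encoding from the RTMP push: first write the same FLV output to a local file to confirm the encoding settings, then retry the push with -re (read the input at its native frame rate), which is commonly needed when feeding a file to a live ingest point. This is a troubleshooting sketch, not a confirmed fix.

# 1) Validate the encoding settings locally; if this step fails, the
#    problem is not the Azure endpoint.
ffmpeg -v verbose -i 300.mp4 -strict -2 -c:a aac -b:a 128k -ar 44100 \
 -r 30 -g 60 -keyint_min 60 -b:v 400000 -c:v libx264 -preset medium \
 -bufsize 400k -maxrate 400k -f flv test-local.flv

# 2) Push the same encode at native frame rate (-re) to the ingest URL
#    from the question.
ffmpeg -re -v verbose -i 300.mp4 -strict -2 -c:a aac -b:a 128k -ar 44100 \
 -r 30 -g 60 -keyint_min 60 -b:v 400000 -c:v libx264 -preset medium \
 -bufsize 400k -maxrate 400k -f flv \
 "rtmp://nessma-****.channel.mediaservices.windows.net:1935/live/584c99f5c47f424d9e83ac95364331e7"

If the first command succeeds but the second still fails, the error is more likely on the ingest side (channel state, network, or URL) than in the encoding options.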
-
Node.js - ffmpeg - unable to live-convert a webcam to an HLS file
26 May 2020, by Jeff55
Okay, so long story short: I need to be able to accept a video stream from my USB port and offer it up to my frontend (React) using HLS/RTMP/WebRTC. I opted for HLS because it seems to have the fewest dependencies, and this may end up running on a Pi, so simplicity is good.
const { spawn } = require('child_process');
var args = [
 '-pix_fmt', 'uyvy422',
 '-f', 'avfoundation',
 '-framerate', '30',
 '-i', '0',
 '-c:v', 'libx264',
 '-crf', '21',
 '-preset', 'veryfast',
 '-f', 'hls',
 '-hls_time', '4',
 '-hls_playlist_type', 'event', 'stream.m3u8'
]
const ffmpeg = spawn('ffmpeg', args);

ffmpeg.stdout.on('data', (data) => {
 console.log(`stdout: ${data}`);
});

ffmpeg.stderr.on('data', (data) => {
 console.error(`stderr: ${data}`);
});

ffmpeg.on('close', (code, signal) => {
 if(code) console.log(`child process exited with code ${code}`);
 else if(signal) console.log(`child process exited with signal ${signal}`);
});

My expectation is that this should grab the 0th video input and encode it to HLS. Unfortunately, I get this as a result.
stderr: ffmpeg version 4.2.3
stderr: Copyright (c) 2000-2020 the FFmpeg developers
 built with Apple clang version 11.0.3 (clang-1103.0.32.59)
 configuration: --prefix=/usr/local/Cellar/ffmpeg/4.2.3 --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags=-fno-stack-check --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --disable-libjack --disable-indev=jack
 libavutil 56. 31.100 / 56. 31.100
 libavcodec 58. 54.100 / 58. 54.100
 libavformat 58. 29.100 / 58. 29.100
 libavdevice 58. 8.100 / 58. 8.100
 libavfilter 7. 57.100 / 7. 57.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 5.100 / 5. 5.100
 libswresample 3. 5.100 / 3. 5.100
 libpostproc 55. 5.100 / 55. 5.100

stderr: [AVFoundation input device @ 0x7fcac2604600] Configuration of video device failed, falling back to default.

child process exited with signal SIGABRT

I have no clue why I'm getting a SIGABRT signal, but when I manually type in the device (-i "FaceTime HD Camera") I don't get the config error; the process still skips its usual progress output (frame= 38 fps=0.0 q=27.0 size=N/A time=00:00:00.46 bitrate=N/A speed=0.869x) and just exits with a SIGABRT.

If I type the exact command (ffmpeg -f avfoundation -framerate 30 -i "0" -c:v libx264 -crf 21 -preset veryfast -f hls -hls_time 4 -hls_playlist_type event out/test.m3u8), everything works as expected and I get an output.

I read here that it could be due to a static build, but I'm not really sure why that would cause an issue. For reference, this is on macOS and will most likely eventually move to Ubuntu.

Any idea why this is happening? Thanks.
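Since the log above shows the AVFoundation device configuration failing just before the SIGABRT, a reasonable first check, offered here as a sketch rather than a confirmed diagnosis, is to list the capture devices and to compare the spawn arguments with the command line that reportedly works. Notably, the spawn args add a -pix_fmt uyvy422 input option that the working command line does not have; whether that difference is the cause is not something the post confirms.

# List the AVFoundation capture devices with their indexes and names;
# the video device requested with -i "0" should appear here.
ffmpeg -f avfoundation -list_devices true -i ""

# The command line reported to work, written out as the exact tokens the
# spawn() args array would need to reproduce.
ffmpeg -f avfoundation -framerate 30 -i "0" \
 -c:v libx264 -crf 21 -preset veryfast \
 -f hls -hls_time 4 -hls_playlist_type event out/test.m3u8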