
Media (1)
-
La conservation du net art au musée. Les stratégies à l’œuvre
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (5)
-
Customizing categories
21 June 2013
Form for creating a category
For those who know SPIP well, a category can be thought of as a section (rubrique).
For a document of type "category", the fields offered by default are: Text
This form can be modified under:
Administration > Configuration des masques de formulaire.
For a document of type "media", the fields not displayed by default are: Short description (Descriptif rapide)
It is also in this configuration section that you can specify the (...) -
Emballe médias: what is it for?
4 February 2011
This plugin is designed to manage sites that publish documents of all types.
It creates "media" items, namely: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether it is audio, video, image or text; only one document can be linked to a given "media" article; -
PHP5-specific configuration
4 February 2011
PHP5 is required; you can install it by following this dedicated tutorial.
It is recommended to disable safe_mode at first; however, if it is configured correctly and the necessary binaries are accessible, MediaSPIP should work correctly with safe_mode enabled.
Specific modules
Certain specific PHP modules must be installed, either through your distribution's package manager or manually: php5-mysql for connectivity with the (...)
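Purely as an illustration (the article is truncated here, so the full module list is not shown): on a Debian-based system of that era, such modules would typically be installed through the package manager, for example:

# Hypothetical example for a Debian/Ubuntu system whose repositories still ship PHP5 packages;
# the full article lists the complete set of modules required by MediaSPIP.
sudo apt-get install php5-mysql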
On other sites (2131)
-
libav how to change stream codec
9 November 2018, by Rodolfo Picoreti
I am trying to reproduce with libav the same thing that the following ffmpeg command does:
ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 -f mpegts -codec:v mpeg1video -s 640x480 -b:v 1000k -bf 0 -muxdelay 0.001 http://localhost:8081/supersecret
I manage to reproduce most of it. The problem is that when the "mpegts" output context is allocated (the avformat_alloc_output_context2 call in init_format_context below), the codec selected by default is "mpeg2video", but I need it to be "mpeg1video". I tried forcing it by overwriting format_context->oformat->video_codec, and it kind of worked, although I get a lot of artifacts in the image, so I am guessing this is not how you do it. How can I properly change the codec in this case (that is, the "-codec:v mpeg1video" part of the ffmpeg command)?
C++ code being used:
#include <cassert>
#include <exception>
#include <stdexcept>

#include <opencv2/core.hpp>
#include <opencv2/highgui.hpp>
#include <opencv2/imgproc.hpp>

extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/error.h>
#include <libavutil/frame.h>
#include <libavutil/imgutils.h>
}

struct AVStreamer {
  AVFormatContext* format_context;
  AVStream* stream;
  AVCodecContext* codec_context;
  AVCodec* codec;
  AVFrame* frame;
  int64_t next_pts;

  void init_format_context(const char* url) {
    avformat_alloc_output_context2(&format_context, nullptr, "mpegts", url);
    if (format_context == nullptr) throw std::runtime_error("Could not create output context");
    // Override the muxer's default video codec (mpeg2video for mpegts) with MPEG-1.
    format_context->oformat->video_codec = AV_CODEC_ID_MPEG1VIDEO;
  }

  void init_codec() {
    auto codec_id = format_context->oformat->video_codec;
    codec = avcodec_find_encoder(codec_id);
    if (codec == nullptr) throw std::runtime_error("Could not find encoder");
  }

  void init_stream() {
    stream = avformat_new_stream(format_context, nullptr);
    if (stream == nullptr) throw std::runtime_error("Failed to alloc stream");
    stream->id = format_context->nb_streams - 1;
  }

  void init_codec_context() {
    codec_context = avcodec_alloc_context3(codec);
    if (codec_context == nullptr) throw std::runtime_error("Failed to alloc encoding context");
    auto codec_id = format_context->oformat->video_codec;
    codec_context->codec_id = codec_id;
    codec_context->bit_rate = 400000;
    codec_context->width = 640;
    codec_context->height = 480;
    stream->time_base = AVRational{1, 30};
    codec_context->time_base = stream->time_base;
    codec_context->gop_size = 30;  // one intra frame every gop_size
    // codec_context->max_b_frames = 0;  // output delayed by max_b_frames
    codec_context->pix_fmt = AV_PIX_FMT_YUV420P;
    if (codec_context->codec_id == AV_CODEC_ID_MPEG1VIDEO) { codec_context->mb_decision = 2; }
    if (format_context->oformat->flags & AVFMT_GLOBALHEADER) {
      codec_context->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
    }
  }

  void init_frame() {
    frame = av_frame_alloc();
    if (frame == nullptr) throw std::runtime_error("Failed to alloc frame");
    frame->format = codec_context->pix_fmt;
    frame->width = codec_context->width;
    frame->height = codec_context->height;
    auto status = av_frame_get_buffer(frame, 32);
    if (status < 0) { throw std::runtime_error("Could not allocate frame data.\n"); }
  }

  void open_stream(const char* url) {
    int status = avcodec_open2(codec_context, codec, nullptr);
    if (status != 0) throw std::runtime_error("Failed to open codec");
    // copy the stream parameters to the muxer
    status = avcodec_parameters_from_context(stream->codecpar, codec_context);
    if (status < 0) throw std::runtime_error("Could not copy the stream parameters");
    av_dump_format(format_context, 0, url, 1);
    if (!(format_context->oformat->flags & AVFMT_NOFILE)) {
      status = avio_open(&format_context->pb, url, AVIO_FLAG_WRITE);
      if (status < 0) throw std::runtime_error("Could not open output file");
    }
    // Write the stream header, if any.
    status = avformat_write_header(format_context, nullptr);
    if (status < 0) throw std::runtime_error("Error occurred when opening output file");
  }

  AVStreamer(const char* url) : next_pts(0) {
    init_format_context(url);
    init_codec();
    init_stream();
    init_codec_context();
    init_frame();
    open_stream(url);
  }

  virtual ~AVStreamer() {
    avformat_free_context(format_context);
    avcodec_free_context(&codec_context);
    av_frame_free(&frame);
  }

  // Convert a BGR OpenCV frame to YUV planes, point the AVFrame at them, encode and mux.
  void send(cv::Mat const& image) {
    cv::cvtColor(image, image, CV_BGR2YUV);
    cv::Mat planes[3];
    cv::split(image, planes);
    cv::pyrDown(planes[1], planes[1]);
    cv::pyrDown(planes[2], planes[2]);
    if (av_frame_make_writable(frame) < 0) {
      throw std::runtime_error("Failed to make frame writable");
    }
    frame->data[0] = planes[0].data;
    frame->linesize[0] = planes[0].step;
    frame->data[1] = planes[1].data;
    frame->linesize[1] = planes[1].step;
    frame->data[2] = planes[2].data;
    frame->linesize[2] = planes[2].step;
    frame->pts = next_pts++;
    AVPacket packet;
    av_init_packet(&packet);
    int status = avcodec_send_frame(codec_context, frame);
    if (status < 0) throw std::runtime_error("Send frame failed");
    status = avcodec_receive_packet(codec_context, &packet);
    if (status == AVERROR(EAGAIN)) { return; }
    if (status < 0) throw std::runtime_error("Receive packet failed");
    av_packet_rescale_ts(&packet, codec_context->time_base, stream->time_base);
    packet.stream_index = stream->index;
    av_interleaved_write_frame(format_context, &packet);
  }
};

int main(int argc, char** argv) {
  av_register_all();
  avformat_network_init();
  auto url = argc == 2 ? argv[1] : "http://localhost:8081/supersecret";
  AVStreamer streamer(url);
  cv::VideoCapture video(0);
  assert(video.isOpened() && "Failed to open video");
  for (;;) {
    cv::Mat image;
    video >> image;
    streamer.send(image);
  }
}
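For what it's worth, one commonly suggested direction for this kind of problem, offered here only as a hedged sketch rather than a verified fix: instead of overwriting format_context->oformat->video_codec (which mutates a structure owned by libavformat), request the MPEG-1 encoder explicitly, drop that assignment in init_format_context(), and take the codec id from the encoder itself. The fragment below reuses the member names of the AVStreamer struct above; everything else would stay as in the original code.

// Sketch only: pick the MPEG-1 encoder explicitly instead of patching oformat.
void init_codec() {
  codec = avcodec_find_encoder(AV_CODEC_ID_MPEG1VIDEO);
  if (codec == nullptr) throw std::runtime_error("Could not find MPEG-1 encoder");
}

void init_codec_context() {
  codec_context = avcodec_alloc_context3(codec);
  if (codec_context == nullptr) throw std::runtime_error("Failed to alloc encoding context");
  codec_context->codec_id = codec->id;  // AV_CODEC_ID_MPEG1VIDEO
  // ... remaining settings unchanged from the original init_codec_context() ...
}

Whether this alone explains the artifacts is a separate question; they may also be related to how frame->data and frame->linesize are pointed at the OpenCV plane buffers after av_frame_get_buffer().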
-
nginx : [emerg] invalid port in url "http://192.168.0.100:80/live" in nginx.conf - Restreaming OBS to LAN
17 November 2018, by popek069
I want to restream OBS to the LAN, so I set up an nginx server. The server receives the stream from OBS over RTMP and restreams it over HTTP so it can be viewed from another device.
Streaming from OBS works, but when I start nginx I get an error:
PS C:\Users\popek\Downloads\nginx> .\nginx.exe -s reload
nginx: [emerg] invalid port in url "http://192.168.0.100:80/live" in C:\Users\popek\Downloads\nginx/conf/nginx.conf:187
I'm new to nginx. I'm running Windows 10; the nginx server and OBS are on the same PC, with IP 192.168.0.100.
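As background, and purely as a hedged sketch (it assumes the RTMP support comes from the widely used nginx-rtmp-module, which the question does not state, and the paths and ports below are placeholders): in that module, push targets are normally rtmp:// destinations, and HTTP delivery to LAN viewers is more commonly done by letting the module write HLS fragments and serving them from an http location, roughly like this:

rtmp {
    server {
        listen 1935;
        application live {
            live on;
            hls on;
            hls_path /tmp/hls;   # placeholder path; adjust for Windows
            hls_fragment 3s;
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {
            # serve the playlist and segments written by the rtmp block above
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;           # parent directory of hls_path
            add_header Cache-Control no-cache;
        }
    }
}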
I'd also like to re-encode the stream with ffmpeg if possible. I know ffmpeg; I just don't know how to set the input and output.
Config (nginx.conf):
#user nobody;

# multiple workers works!
worker_processes 2;

#error_log logs/error.log;
#error_log logs/error.log notice;
#error_log logs/error.log info;

#pid logs/nginx.pid;

events {
    worker_connections 8192;
    # max value 32768, nginx recycling connections+registry optimization =
    # this.value * 20 = max concurrent connections currently tested with one worker
    # C1000K should be possible depending there is enough ram/cpu power
    # multi_accept on;
}

http {
    #include /nginx/conf/naxsi_core.rules;
    include mime.types;
    default_type application/octet-stream;

    #log_format main '$remote_addr:$remote_port - $remote_user [$time_local] "$request" '
    #                '$status $body_bytes_sent "$http_referer" '
    #                '"$http_user_agent" "$http_x_forwarded_for"';

    #access_log logs/access.log main;

    # # loadbalancing PHP
    # upstream myLoadBalancer {
    #     server 127.0.0.1:9001 weight=1 fail_timeout=5;
    #     server 127.0.0.1:9002 weight=1 fail_timeout=5;
    #     server 127.0.0.1:9003 weight=1 fail_timeout=5;
    #     server 127.0.0.1:9004 weight=1 fail_timeout=5;
    #     server 127.0.0.1:9005 weight=1 fail_timeout=5;
    #     server 127.0.0.1:9006 weight=1 fail_timeout=5;
    #     server 127.0.0.1:9007 weight=1 fail_timeout=5;
    #     server 127.0.0.1:9008 weight=1 fail_timeout=5;
    #     server 127.0.0.1:9009 weight=1 fail_timeout=5;
    #     server 127.0.0.1:9010 weight=1 fail_timeout=5;
    #     least_conn;
    # }

    sendfile off;
    #tcp_nopush on;

    server_names_hash_bucket_size 128;

    ## Start: Timeouts ##
    client_body_timeout 10;
    client_header_timeout 10;
    keepalive_timeout 30;
    send_timeout 10;
    keepalive_requests 10;
    ## End: Timeouts ##

    #gzip on;

    server {
        #listen 80;
        server_name localhost;

        #charset koi8-r;

        #access_log logs/host.access.log main;

        ## Caching Static Files, put before first location
        #location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
        #    expires 14d;
        #    add_header Vary Accept-Encoding;
        #}

        # For Naxsi remove the single # line for learn mode, or the ## lines for full WAF mode
        location / {
            #include /nginx/conf/mysite.rules; # see also http block naxsi include line
            ##SecRulesEnabled;
            ##DeniedUrl "/RequestDenied";
            ##CheckRule "$SQL >= 8" BLOCK;
            ##CheckRule "$RFI >= 8" BLOCK;
            ##CheckRule "$TRAVERSAL >= 4" BLOCK;
            ##CheckRule "$XSS >= 8" BLOCK;
            root html;
            index index.html index.htm;
        }

        # For Naxsi remove the ## lines for full WAF mode, redirect location block used by naxsi
        ##location /RequestDenied {
        ##    return 412;
        ##}

        ## Lua examples!
        # location /robots.txt {
        #     rewrite_by_lua '
        #         if ngx.var.http_host ~= "localhost" then
        #             return ngx.exec("/robots_disallow.txt");
        #         end
        #     ';
        # }

        #error_page 404 /404.html;

        # redirect server error pages to the static page /50x.html
        #
        error_page 500 502 503 504 /50x.html;
        location = /50x.html {
            root html;
        }

        # proxy the PHP scripts to Apache listening on 127.0.0.1:80
        #
        #location ~ \.php$ {
        #    proxy_pass http://127.0.0.1;
        #}

        # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
        #
        #location ~ \.php$ {
        #    root html;
        #    fastcgi_pass 127.0.0.1:9000; # single backend process
        #    fastcgi_pass myLoadBalancer; # or multiple, see example above
        #    fastcgi_index index.php;
        #    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        #    include fastcgi_params;
        #}

        # deny access to .htaccess files, if Apache's document root
        # concurs with nginx's one
        #
        #location ~ /\.ht {
        #    deny all;
        #}
    }

    # another virtual host using mix of IP-, name-, and port-based configuration
    #
    #server {
    #    listen 8000;
    #    listen somename:8080;
    #    server_name somename alias another.alias;
    #    location / {
    #        root html;
    #        index index.html index.htm;
    #    }
    #}

    # HTTPS server
    #
    #server {
    #    listen 443 ssl spdy;
    #    server_name localhost;
    #    ssl on;
    #    ssl_certificate cert.pem;
    #    ssl_certificate_key cert.key;
    #    ssl_session_timeout 5m;
    #    ssl_prefer_server_ciphers On;
    #    ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
    #    ssl_ciphers ECDH+AESGCM:ECDH+AES256:ECDH+AES128:ECDH+3DES:RSA+AESGCM:RSA+AES:RSA+3DES:!aNULL:!eNULL:!MD5:!DSS:!EXP:!ADH:!LOW:!MEDIUM;
    #    location / {
    #        root html;
    #        index index.html index.htm;
    #    }
    #}
}

rtmp {
    server {
        listen 1935;
        chunk_size 4096;

        application live {
            live on;
            record off;
            hls on;
            push http://192.168.0.100:80/live ;
        }
    }
}
-
Multiple format changes in FFmpeg for thermal camera
6 February 2023, by Greynol4
I'm having trouble putting together a command that processes the raw output of a UVC thermal camera so that it can be colorized and then sent to a virtual device, with the intention of streaming it over RTSP. This is on a Raspberry Pi 3B+ running 32-bit Bullseye.


The original code that works perfectly for previewing it is:


ffmpeg -input_format yuyv422 -video_size 256x384 -i /dev/video0 -vf 'crop=h=(ih/2):y=(ih/2)' -pix_fmt yuyv422 -f rawvideo - | ffplay -pixel_format gray16le -video_size 256x192 -f rawvideo -i - -vf 'normalize=smoothing=10, format=pix_fmts=rgb48, pseudocolor=p=inferno'



Essentially, this takes the raw data, crops out the useful portion, and pipes it to ffplay, where it is treated as 16-bit grayscale (in this case gray16le), normalized, converted to 48-bit RGB, and finally run through a pseudocolor filter.


I haven't been able to translate this into an ffmpeg-only pipeline, because it either throws codec or format errors, or converts the 16-bit data to 10-bit even though I need the 16-bit. I have tried using v4l2loopback and two instances of ffmpeg in separate windows to see where the error is actually occurring, but I suspect that introduces more format issues that distract from the original problem. The closest I have been able to get is:


ffmpeg -input_format yuyv422 -video_size 256x384 -i /dev/video0 -vf 'crop=h=(ih/2):y=(ih/2)' -pix_fmt yuyv422 -f rawvideo /dev/video3


Followed by


ffmpeg -video_size 256x192 -i /dev/video3 -f rawvideo -pix_fmt gray16le -vf 'normalize=smoothing=10,format=pix_fmts=rgb48, pseudocolor=p=inferno' -f rawvideo -f v4l2 /dev/video4


This results in a non-colorized but somewhat useful image, with certain temperatures showing up as missing pixels, whereas the command piped into ffplay shows a properly colorized stream with no missing pixels.
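For comparison, here is a hedged, untested sketch of one way to mirror the working ffplay pipeline while writing to the loopback device instead: keep the same two-process pipe, but replace ffplay with a second ffmpeg whose input options (gray16le, 256x192) are given before -i -, and which outputs to v4l2. It assumes /dev/video4 is the v4l2loopback device and that it accepts yuv420p; both are assumptions, not facts from the question.

ffmpeg -input_format yuyv422 -video_size 256x384 -i /dev/video0 -vf 'crop=h=(ih/2):y=(ih/2)' -pix_fmt yuyv422 -f rawvideo - | ffmpeg -f rawvideo -pixel_format gray16le -video_size 256x192 -framerate 25 -i - -vf 'normalize=smoothing=10, format=pix_fmts=rgb48, pseudocolor=p=inferno' -pix_fmt yuv420p -f v4l2 /dev/video4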


I'll include my configuration and the log from the preview command, but the log doesn't show errors unless I try to modify parameters and presumably mess up the syntax.


ffmpeg -input_format yuyv422 -video_size 256x384 -i /dev/video0 -vf 'crop=h=(ih/2):y=(ih/2)' -pix_fmt yuyv422 -f rawvideo - | ffplay -pixel_format gray16le -video_size 256x192 -f rawvideo -i - -vf 'normalize=smoothing=10, format=pix_fmts=rgb48, pseudocolor=p=inferno'
ffplay version N-109758-gbdc76f467f Copyright (c) 2003-2023 the FFmpeg developers
 built with gcc 10 (Raspbian 10.2.1-6+rpi1)
 configuration: --prefix=/usr/local --enable-nonfree --enable-gpl --enable-hardcoded-tables --disable-ffprobe --disable-ffplay --enable-libx264 --enable-libx265 --enable-sdl --enable-sdl2 --enable-ffplay
 libavutil 57. 44.100 / 57. 44.100
 libavcodec 59. 63.100 / 59. 63.100
 libavformat 59. 38.100 / 59. 38.100
 libavdevice 59. 8.101 / 59. 8.101
 libavfilter 8. 56.100 / 8. 56.100
 libswscale 6. 8.112 / 6. 8.112
 libswresample 4. 9.100 / 4. 9.100
 libpostproc 56. 7.100 / 56. 7.100
ffmpeg version N-109758-gbdc76f467f Copyright (c) 2000-2023 the FFmpeg developers
 built with gcc 10 (Raspbian 10.2.1-6+rpi1)
 configuration: --prefix=/usr/local --enable-nonfree --enable-gpl --enable-hardcoded-tables --disable-ffprobe --disable-ffplay --enable-libx264 --enable-libx265 --enable-sdl --enable-sdl2 --enable-ffplay
 libavutil 57. 44.100 / 57. 44.100
 libavcodec 59. 63.100 / 59. 63.100
 libavformat 59. 38.100 / 59. 38.100
 libavdevice 59. 8.101 / 59. 8.101
 libavfilter 8. 56.100 / 8. 56.100
 libswscale 6. 8.112 / 6. 8.112
 libswresample 4. 9.100 / 4. 9.100
 libpostproc 56. 7.100 / 56. 7.100
Input #0, video4linux2,v4l2, from '/dev/video0':B sq= 0B f=0/0 
 Duration: N/A, start: 242.040935, bitrate: 39321 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 256x384, 39321 kb/s, 25 fps, 25 tbr, 1000k tbn
Stream mapping:.000 fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0 
 Stream #0:0 -> #0:0 (rawvideo (native) -> rawvideo (native))
Press [q] to stop, [?] for help
Output #0, rawvideo, to 'pipe:': 0KB vq= 0KB sq= 0B f=0/0 
 Metadata:
 encoder : Lavf59.38.100
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422(tv, progressive), 256x192, q=2-31, 19660 kb/s, 25 fps, 25 tbn
 Metadata:
 encoder : Lavc59.63.100 rawvideo
frame= 0 fps=0.0 q=0.0 size= 0kB time=-577014:32:22.77 bitrate= -0.0kbInput #0, rawvideo, from 'fd:': 0KB vq= 0KB sq= 0B f=0/0 
 Duration: N/A, start: 0.000000, bitrate: 19660 kb/s
 Stream #0:0: Video: rawvideo (Y1[0][16] / 0x10003159), gray16le, 256x192, 19660 kb/s, 25 tbr, 25 tbn
frame= 13 fps=0.0 q=-0.0 size= 1152kB time=00:00:00.52 bitrate=18148.4kbitsframe= 25 fps= 24 q=-0.0 size= 2304kB time=00:00:01.00 bitrate=18874.4kbitsframe= 39 fps= 25 q=-0.0 size= 3648kB time=00:00:01.56 bitrate=19156.7kbitsframe= 51 fps= 24 q=-0.0 size= 4800kB time=00:00:02.04 bitrate=19275.3kbitsframe= 64 fps= 24 q=-0.0 size= 6048kB time=00:00:02.56 bitrate=19353.6kbitsframe= 78 fps= 25 q=-0.0 size= 7392kB time=00:00:03.12 bitrate=19408.7kbits





I'd also like to know the correct option to stop it scrolling through every frame in the log, as well as links to beginner-friendly resources for adapting a command into a script; that's outside the purview of this question, but any direction on those would be much appreciated.
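On the log-scrolling point only, and as a general ffmpeg note rather than anything specific to this camera: the per-frame progress line is normally silenced with -nostats, and overall verbosity is controlled with -loglevel (plus -hide_banner for the build banner). For example, applied to the first command from the two-window attempt above:

ffmpeg -hide_banner -nostats -loglevel warning -input_format yuyv422 -video_size 256x384 -i /dev/video0 -vf 'crop=h=(ih/2):y=(ih/2)' -pix_fmt yuyv422 -f rawvideo /dev/video3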