
Other articles (67)
-
Sites built with MediaSPIP
2 May 2011. This page presents a few of the sites running MediaSPIP.
You can of course add your own via the form at the bottom of the page. -
Submit bugs and patches
13 April 2011. Unfortunately, software is never perfect.
If you think you have found a bug, report it using our ticket system. To help us fix it, please provide the following information: the browser you are using, including its exact version; as precise a description of the problem as possible; if possible, the steps that led to the problem; and a link to the site / page in question.
If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
You may also (...) -
HTML5 audio and video support
10 April 2011. MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used instead.
The HTML5 player was created specifically for MediaSPIP: its look can be fully customised to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
On other sites (6506)
-
Using FFMPEG to make HLS clips from H264
21 November 2017, by Tyler Brooks. I am using a Hi35xx camera processor from HiSilicon. It is an ARM9 with a video pipeline bolted onto the side. At one end of the pipeline is the CMOS sensor; at the other end is an H.264 encoder. When I turn on the pipeline, the encoder outputs H.264 NAL packets like this:
frame0: <SPS>,<PPS>,<SEI>,<key frame>
frame1: <delta frame>
frame2: <delta frame>
...
frameN: <delta frame>
frameN+1: <SPS>,<PPS>,<SEI>,<key frame>
frameN+2: <delta frame>
frameN+3: <delta frame>
...
etc.

I am turning that into HLS clips by doing the following (pseudo code for clarity):
av_register_all();
avformat_network_init();
avformat_alloc_output_context2(&ctx_out, NULL, "hls", "./foo.m3u8");
strm_out = avformat_new_stream(ctx_out, NULL);
codec_out = strm_out->codecpar;
codec_out->codec_id = AV_CODEC_ID_H264;
codec_out->codec_type = AVMEDIA_TYPE_VIDEO;
codec_out->width = encoder_width;
codec_out->height = encoder_height;
codec_out->bit_rate = encoder_bitrate;
codec_out->codec_tag = 0;
avformat_write_header(ctx_out, NULL);
while (get_packet_from_pipeline_encoder(&encoder_packet)) {
    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.stream_index = 0;
    pkt.dts = AV_NOPTS_VALUE;
    pkt.pts = AV_NOPTS_VALUE;
    pkt.duration = (1000000 / FRAMERATE); // frame duration in microseconds
    pkt.data = encoder_packet.data;
    pkt.size = encoder_packet.size;
    if (is_keyframe(&encoder_packet)) {
        pkt.flags |= AV_PKT_FLAG_KEY;
    }
    av_write_frame(ctx_out, &pkt);
}
av_write_trailer(ctx_out);
avformat_free_context(ctx_out);

This seems to work fine, except that the resulting HLS frame rate is not right. Of course, this happens because I am not setting the pts/dts stuff correctly, and ffmpeg lets me know that. So I have two questions:
- Am I going about this right?
- How can I set the pts/dts stuff correctly?
The encoder is giving me packets and I am submitting them as frames. Those <SPS>, <PPS> and <SEI> packets are really out-of-band data and don't really have a timestamp. How can I submit them correctly?
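The excerpt ends with the question, so here is a minimal sketch of one common fix, not the asker's final code: it reuses the names from the pseudo code above and assumes a constant FRAMERATE with no B-frames (so pts == dts). The idea is to pick an explicit time base for the stream before writing the header, then stamp every packet in that time base. The SPS/PPS/SEI need no timestamps of their own: as the listing above shows, the encoder already glues them to the keyframe they describe, so they can simply stay inside the keyframe's packet.

strm_out->time_base = (AVRational){1, 90000}; // 90 kHz, the MPEG-TS tick rate
avformat_write_header(ctx_out, NULL);         // the muxer may adjust time_base

int64_t frame_index = 0;
while (get_packet_from_pipeline_encoder(&encoder_packet)) {
    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.stream_index = 0;
    pkt.data = encoder_packet.data; // keyframes keep their SPS/PPS/SEI prefix
    pkt.size = encoder_packet.size;
    // duration of one frame, converted into the stream's time base
    pkt.duration = av_rescale_q(1, (AVRational){1, FRAMERATE}, strm_out->time_base);
    pkt.pts = pkt.dts = frame_index * pkt.duration;
    frame_index++;
    if (is_keyframe(&encoder_packet))
        pkt.flags |= AV_PKT_FLAG_KEY;
    av_write_frame(ctx_out, &pkt);
}

With monotonically increasing timestamps the HLS muxer can compute segment durations, and the output frame rate comes out right.
-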
What is the proper syntax to use ffmpeg to stream H.264 using RTSP over an HTTP tunnel?
1 December 2017, by NewtownGuy. I'm trying to send an H.264 video stream at 10 fps over RTSP over an HTTP tunnel, so the video can be accessed remotely through a firewall, ideally using only a single port for all communications. I can't just use plain RTSP: it needs one open port on which the stream is requested, which is fine, but it also opens two randomly chosen server ports for the video stream, and those can't be mapped through the router to the outside world (a common problem).
I tried VLC, but it won't let me control the server ports that it opens. ffmpeg seems to have more capability for selecting ports, but I can't get the syntax right. Here's the command I'm using, where my H.264 stream at 10 fps comes from a pipe, /home/vout1, and I tried limiting the server ports in case it won't let me use just one port for everything:
root@Z-1:~# ffmpeg -r 10 -i /home/vout1 -f rtsp -rtsp_transport http -min_port 25000 -max_port 25009 rtsp://localhost:8554
Here's the result; the error messages appear near the end:
ffmpeg version 3.2.4-static http://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 5.4.1 (Debian 5.4.1-5) 20170205
configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc-5 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gray --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopus --enable-librtmp --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libxvid
libavutil 55. 34.101 / 55. 34.101
libavcodec 57. 64.101 / 57. 64.101
libavformat 57. 56.101 / 57. 56.101
libavdevice 57. 1.100 / 57. 1.100
libavfilter 6. 65.100 / 6. 65.100
libswscale 4. 2.100 / 4. 2.100
libswresample 2. 3.100 / 2. 3.100
libpostproc 54. 1.100 / 54. 1.100
Input #0, h264, from '/home/vout1':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p(progressive), 960x540, 25 fps, 25 tbr, 1200k tbn, 50 tbc
[rtsp @ 0x3d68b30] Unsupported lower transport method, only UDP and TCP are supported for output.
Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> mpeg4 (native))
    Last message repeated 1 times
ffmpeg sees the stream, since it got the resolution right, but it thinks my stream runs at the default 25 fps even though I specified -r 10 to say the frame rate is only 10 fps. Second, the stream is not being created.
What is the proper command line syntax, and how can I make ffmpeg use one port for everything, even if I can only have one stream?
Thank you in advance for your help.
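The excerpt ends here, but one hedged note for context: the error above reflects the fact that ffmpeg only supports HTTP tunnelling when it acts as an RTSP client reading a stream; for RTSP output the lower transport can only be UDP or TCP. The closest single-port setup is therefore TCP interleaving, which carries the RTP data inside the one RTSP connection. An untested sketch (it assumes an RTSP server is listening on port 8554 to accept the published stream, and the /stream path is arbitrary):

ffmpeg -framerate 10 -i /home/vout1 -c:v copy -f rtsp -rtsp_transport tcp rtsp://localhost:8554/stream

Here -framerate 10 (rather than -r 10) sets the raw h264 demuxer's input rate, and -c:v copy avoids the unwanted h264 -> mpeg4 transcode visible in the stream mapping above.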
-
Live video stream on server (PC) from images sent by robot through UDP
3 February 2018, by Richard Knop. Hmm, I found this, which seems promising:
http://sourceforge.net/projects/mjpg-streamer/
Ok. I will try to explain clearly and in some detail what I am trying to do.
I have a small humanoid robot with a camera and a wifi stick (this is the robot). The robot's wifi stick has an average transfer rate of 1769 KB/s. The robot has a 500 MHz CPU and 256 MB of RAM, which is not enough for any serious computation (and there are already a couple of modules running on the robot for motion, vision, sonar, speech, etc.).
I have a PC from which I control the robot. I am trying to have the robot walk around the room while I watch, on the PC, a live video stream of what the robot sees.
What I already have working: the robot is walking as I want it to and taking images with the camera. The images are being sent over UDP to the PC, where I am receiving them (I have verified this by saving the incoming images to disk).
The camera returns images of 640 x 480 px in the YUV422 colorspace. I am sending the images with lossy compression (JPEG) because I am trying to get the best possible FPS on the PC. I do the JPEG compression on the robot with the PIL library.
My questions:
-
Could somebody please give me some ideas about how to convert the incoming JPEG images to a live video stream? I understand that I will need some video encoder for that. Which video encoder do you recommend? FFMPEG or something else? I am very new to video streaming, so I want to know what is best for this task. I'd prefer to use Python to write this, so I would prefer some video encoder or library with a Python API. But I guess if the library has a good command line API it doesn't have to be in Python. (A sketch of one option follows after the code samples below.)
-
What is the best FPS I could get out of this, given the 1769 KB/s average wifi transfer rate and the dimensions of the images? Should I use a different compression than JPEG? (A rough estimate follows after this list.)
-
I will be happy to see any code examples. Links to articles explaining how to do this would be fine, too.
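As an aside not found in the original post, a rough answer to the bandwidth part of question 2, under stated assumptions: a 640 x 480 JPEG at moderate quality is often on the order of 40 KB (this varies widely with scene content and quality setting), so 1769 KB/s divided by 40 KB per frame gives roughly 44 fps as a pure bandwidth ceiling. At 10 to 30 fps the wifi link is therefore unlikely to be the bottleneck; JPEG compression time on the 500 MHz CPU is the more likely limit.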
Some code samples. Here is how I am sending the JPEG images from the robot to the PC (shortened, simplified snippet). This runs on the robot:
# lots of code here (socket / StringIO / PIL.Image imports, plus camProxy,
# nameId and addr, are all set up in the elided code)
UDPSock = socket(AF_INET, SOCK_DGRAM)
while 1:
    image = camProxy.getImageLocal(nameId)
    size = (image[0], image[1])          # width, height
    data = image[6]                      # raw pixel buffer
    im = Image.fromstring("YCbCr", size, data)
    s = StringIO.StringIO()
    im.save(s, "JPEG")                   # compress in memory
    UDPSock.sendto(s.getvalue(), addr)   # one JPEG per datagram
    camProxy.releaseImage(nameId)
UDPSock.close()
# lots of code here

Here is how I am receiving the images on the PC. This runs on the PC:
# lots of code here (socket import, addr and buf are set up in the elided code)
UDPSock = socket(AF_INET, SOCK_DGRAM)
UDPSock.bind(addr)
while 1:
    data, addr = UDPSock.recvfrom(buf)
    # here I need to create a stream from the data,
    # which contains a JPEG image
UDPSock.close()
# lots of code here
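A possible direction, sketched here rather than taken from the original thread: since each datagram already contains a whole JPEG image, one of the simplest approaches is to hand the frames to ffmpeg as an MJPEG image pipe and let it encode and publish the stream. The destination URL below is made up for illustration:

ffmpeg -f image2pipe -framerate 10 -c:v mjpeg -i - -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts udp://127.0.0.1:1234

From Python, this command could be started with subprocess.Popen(..., stdin=subprocess.PIPE) and each received data buffer written to its stdin inside the loop above; -tune zerolatency keeps the x264 encoder from buffering frames, which matters for a live view.
-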