
Other articles (53)
-
Multilang: improving the interface for multilingual blocks
18 February 2011
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
Once it is activated, MediaSPIP init automatically sets up a preconfiguration so that the new feature is immediately operational. No manual configuration step is therefore required.
-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or higher. If in doubt, contact the administrator of your MédiaSpip to find out.
On other sites (10217)
-
ffmpeg works but gstreamer does not work for an RTSP camera
11 January 2021, by Jinmo Chong
I'm using an Nvidia Jetson TX2 device.


With the following command, I can connect and capture an image with ffmpeg.


$ /usr/bin/ffmpeg -y -rtsp_transport tcp -i rtsp://admin:admin@192.168.10.131/1/profile -frames 1 snapshot.png


ffmpeg version 3.4.8-0ubuntu0.2 Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 7 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04)
 configuration: --prefix=/usr --extra-version=0ubuntu0.2 --toolchain=hardened --libdir=/usr/lib/aarch64-linux-gnu --incdir=/usr/include/aarch64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-librsvg --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
 libavutil 55. 78.100 / 55. 78.100
 libavcodec 57.107.100 / 57.107.100
 libavformat 57. 83.100 / 57. 83.100
 libavdevice 57. 10.100 / 57. 10.100
 libavfilter 6.107.100 / 6.107.100
 libavresample 3. 7. 0 / 3. 7. 0
 libswscale 4. 8.100 / 4. 8.100
 libswresample 2. 9.100 / 2. 9.100
 libpostproc 54. 7.100 / 54. 7.100
Guessed Channel Layout for Input Stream #0.1 : mono
Input #0, rtsp, from 'rtsp://admin:admin@192.168.10.131/1/profile':
 Metadata:
 title : SDP Descrption
 comment : SDP Description
 Duration: N/A, start: 0.000000, bitrate: N/A
 Stream #0:0: Video: h264 (High), yuvj420p(pc, bt709, progressive), 1920x1080, 25 fps, 30 tbr, 90k tbn, 50 tbc
 Stream #0:1: Audio: pcm_alaw, 8000 Hz, mono, s16, 64 kb/s
 Stream #0:2: Data: none
Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> png (native))
Press [q] to stop, [?] for help
[swscaler @ 0x55a8f70c70] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x55a8f70c70] No accelerated colorspace conversion found from yuv420p to rgb24.
Output #0, image2, to 'snapshot.png':
 Metadata:
 title : SDP Descrption
 comment : SDP Description
 encoder : Lavf57.83.100
 Stream #0:0: Video: png, rgb24, 1920x1080, q=2-31, 200 kb/s, 30 fps, 30 tbn, 30 tbc
 Metadata:
 encoder : Lavc57.107.100 png
frame= 1 fps=0.0 q=-0.0 Lsize=N/A time=00:00:00.03 bitrate=N/A dup=1 drop=1 speed=0.066x 
video:1982kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown


But with gstreamer (I'm using version 1.14.5), I am not able to access the RTSP feed.


$ gst-launch-1.0 uridecodebin uri=rtsp://admin:admin@192.168.10.131/1/profile ! fakesink


Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://admin:admin@192.168.10.131/1/profile
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (request) SETUP stream 2
ERROR: from element /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source: Could not read from resource.
Additional debug info:
gstrtspsrc.c(5917): gst_rtsp_src_receive_response (): /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source:
Could not receive message. (Timeout while waiting for server response)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...


I have also removed the gstreamer-ugly package, but it still does not work!
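
One detail stands out: the working ffmpeg command forces TCP interleaving (-rtsp_transport tcp), while rtspsrc negotiates UDP transport by default and the SETUP requests above are where it times out. A sketch worth trying (assuming the camera only serves RTP over TCP; untested on this device) is to force TCP on the GStreamer side as well via the rtspsrc protocols property:

$ gst-launch-1.0 rtspsrc location=rtsp://admin:admin@192.168.10.131/1/profile protocols=tcp ! fakesink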


Ref: https://forums.developer.nvidia.com/t/rtsp-gstreamer-simple-recieve-and-store-in-file/157535/12


Let me know if you can help me! Thanks!


-
ffmpeg options for encoding a video as mpeg2video but with a .mov extension
20 September 2020, by Selene
I just finished a project and need to produce output in a specific format: it should be exactly the same as the format of the video I received.
The best way for me to identify the source format was to use ffprobe. Here is its output:


Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'input.mov':
 Metadata:
 creation_time : 2020-02-27T04:15:23.000000Z
 Duration: 00:00:22.13, start: 0.000000, bitrate: 111320 kb/s
 Stream #0:0(eng): Video: mpeg2video (4:2:2) (xd5c / 0x63356478), yuv422p(tv, bt709, top coded first (swapped)), 1920x1080 [SAR 1:1 DAR 16:9], 109779 kb/s, 54.94 fps, 54.94 tbr, 5494 tbn, 50 tbc (default)
 Metadata:
 creation_time : 2020-02-27T04:15:23.000000Z
 handler_name : Gestor de contenido de vídeo Apple
 encoder : XDCAM HD422 1080i50 (50 Mb/s)
 Stream #0:1(eng): Audio: pcm_s16be (twos / 0x736F7774), 48000 Hz, 2 channels, s16, 1536 kb/s (default)
 Metadata:
 creation_time : 2020-02-27T04:15:23.000000Z
 handler_name : Gestor de contenido de sonido Apple


I did a lot of video work on the above file and, as part of my pipeline, converted it to ProRes 4444. Now I need to get this video back into the same format as above.


A couple of questions on the format: if I understand correctly, mpeg2video is MPEG-2, which would not normally appear in a .mov file, yet the source is in a MOV container. Why?


Does the encoder recorded in the input metadata matter? Specifically, the XDCAM?


An alternative would be to use Media Encoder, but even that application doesn't seem to offer MOV + MPEG-2, and when the container is MOV it almost forces Apple ProRes to keep the resolution high. Also, none of its options let me set the fps to the source rate of 54.94; the closest available option is 59.94.
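
For what it's worth, mpeg2video inside a .mov is normal: MOV is only a container, and Sony's XDCAM HD422 flavour is precisely MPEG-2 4:2:2 video plus PCM audio in a MOV (or MXF) wrapper. A minimal ffmpeg sketch that mirrors the ffprobe output above (filenames are placeholders, the 50 Mb/s CBR settings follow the "XDCAM HD422 1080i50 (50 Mb/s)" encoder tag, -tag:v xd5c copies the source fourcc, and the interlacing flags match "top coded first"; untested against this exact file):

$ ffmpeg -i prores_input.mov -c:v mpeg2video -pix_fmt yuv422p -b:v 50M -minrate 50M -maxrate 50M -bufsize 17825792 -flags +ildct+ilme -top 1 -tag:v xd5c -c:a pcm_s16be -ar 48000 output.mov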


Please help,


-
Multiple applications accessing the same DeckLink device (OpenCV and RTMP pushing)? What options do I have?
7 September 2020, by Giorgi Aptsiauri
Description:


I have a requirement that I must do two things with a single camera stream from a DeckLink device: OpenCV processing and RTMP live streaming. If it is relevant, the device is a DeckLink 8K Pro.


The DeckLink device (one particular port) does not allow more than one application to access the video stream. My two applications are:


- A C++ OpenCV application, which must run image-processing algorithms on the video stream.
- Unreal Media Server, which must push the stream via RTMP to a remote endpoint that redistributes it to clients.


Both applications must be running on the same Windows workstation.


Problem:


If either application is running, it has exclusive access to the video stream, so the other application cannot read it, and there seems to be no direct solution to this.


So I am looking for a smart solution some of you may have implemented before, or an idea that might work for me.


My solution so far:


In this solution, Unreal Media Server is replaced with FFmpeg.


Since some effort has to be shown before posting here: I have spent the past week browsing possible solutions and came up with an FFmpeg-based approach in which FFmpeg is the only application reading the DeckLink video stream, and it does exactly two things in a single command: it pushes video packets via RTMP and creates a local UDP stream, which OpenCV's VideoCapture class can pick up. For testing, I used Twitch as the RTMP server and VLC as the UDP client. It works, and it seems like a good solution except for the delay introduced by the UDP streaming, which is about 0.4 seconds. Unfortunately, I cannot evaluate the RTMP delay because Twitch adds its own delay of more than 5 seconds, but that is not a problem for now.


Here's the FFmpeg solution (note that I am using the computer's webcam here, but the same applies to the DeckLink).


Stream via:


ffmpeg -threads:v 2 -threads:a 8 -filter_threads 2 -thread_queue_size 512 -y -f dshow -video_size 640x480 -pixel_format yuyv422 -framerate 30 -rtbufsize 100M -i video="HD WebCam" -f dshow -rtbufsize 100M -i audio="Microphone Array (Realtek High Definition Audio(SST))" -preset ultrafast -vcodec libx264 -tune zerolatency -b 900k -map 0:v:0 -f mpegts udp://127.0.0.1:5555 -pix_fmt yuv420p -c:v libx264 -qp:v 19 -profile:v high -rc:v cbr_ld_hq -level:v 4.2 -r:v 60 -g:v 120 -bf:v 3 -refs:v 16 -f flv rtmp://live-fra05.twitch.tv/app/stream_key
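
One way to shave latency off this (a sketch only: the device names, bitrate, and Twitch URL are carried over from the command above, and I have not tested it on a DeckLink) is to encode once and duplicate the already-encoded stream with FFmpeg's tee muxer, instead of running two independent encodes:

$ ffmpeg -f dshow -rtbufsize 100M -i video="HD WebCam" -f dshow -rtbufsize 100M -i audio="Microphone Array (Realtek High Definition Audio(SST))" -map 0:v -map 1:a -c:v libx264 -preset ultrafast -tune zerolatency -pix_fmt yuv420p -b:v 900k -c:a aac -f tee "[f=mpegts]udp://127.0.0.1:5555|[f=flv]rtmp://live-fra05.twitch.tv/app/stream_key"

OpenCV (or ffplay) would then read udp://127.0.0.1:5555 exactly as before.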


Play via:


ffplay -probesize 32 -sync ext udp://127.0.0.1:5555


I want to hear how you would improve my solution to better fit the problem (e.g. lower latency), or whether you have a better solution than mine.


Thanks in advance.