
Media (1)
-
Collections - Quick creation form
19 February 2013, by
Updated: February 2013
Language: French
Type: Image
Other articles (111)
-
Frequent problems
10 March 2010, by — PHP with safe_mode enabled
One of the main sources of problems stems from the PHP configuration, in particular from safe_mode being enabled.
The solution would be either to disable safe_mode or to place the script in a directory that Apache can access for the site -
Automatic installation script for MediaSPIP
25 April 2011, by — To work around installation difficulties, caused mainly by server-side software dependencies, an all-in-one bash installation script was created to make this step easier on a server running a compatible Linux distribution.
You need SSH access to your server and a "root" account in order to use it, which is what allows the dependencies to be installed. Contact your hosting provider if you do not have these.
The documentation on using the installation script (...) -
List of compatible distributions
26 April 2011, by — The table below is the list of Linux distributions compatible with the automated installation script of MediaSPIP.

Distribution name   Version name           Version number
Debian              Squeeze                6.x.x
Debian              Wheezy                 7.x.x
Debian              Jessie                 8.x.x
Ubuntu              The Precise Pangolin   12.04 LTS
Ubuntu              The Trusty Tahr        14.04

If you want to help us improve this list, you can provide us with access to a machine whose distribution is not mentioned above, or send the necessary fixes to add (...)
On other sites (10965)
-
FFmpeg's common filter "v360" missing
26 September 2021, by Niklas E. — I'm trying to convert a 180° fisheye video to a normal/regular video using the v360 filter of FFmpeg.

This is the command I tried:

ffmpeg -i in.mp4 -vf "v360=input=fisheye:output=flat:iv_fov=180:v_fov=90" out.mp4


But the output clearly says

No such filter: 'v360'

although v360 is a common filter listed in the docs, and other filters I used before worked just fine. I tried updating/reinstalling and searching for solutions, but nothing fixed it.

Why is the filter missing? How can I debug this? Should I do the task with another program entirely?
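
For reference, this is the kind of check I would run to see whether the installed build actually ships the filter (a minimal sketch; nothing here is specific to my video):

# List the filters compiled into this ffmpeg build and look for v360;
# no output means the build simply does not include the filter.
ffmpeg -hide_banner -filters | grep v360

# Show which binary is being run and how it was configured, in case an
# older system ffmpeg is picked up instead of the one I think I updated.
which ffmpeg
ffmpeg -version

A build that predates the filter will not list it, no matter which options are passed on the command line.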


Command output:


ffmpeg version 4.2.4-1ubuntu0.1 Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 9 (Ubuntu 9.3.0-10ubuntu2)
 configuration: --prefix=/usr --extra-version=1ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
 libavutil 56. 31.100 / 56. 31.100
 libavcodec 58. 54.100 / 58. 54.100
 libavformat 58. 29.100 / 58. 29.100
 libavdevice 58. 8.100 / 58. 8.100
 libavfilter 7. 57.100 / 7. 57.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 5.100 / 5. 5.100
 libswresample 3. 5.100 / 3. 5.100
 libpostproc 55. 5.100 / 55. 5.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'in.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: mp42mp41
 creation_time : 2021-09-11T14:18:33.000000Z
 Duration: 00:02:48.02, start: 0.000000, bitrate: 26056 kb/s
 Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 2160x1080 [SAR 1:1 DAR 2:1], 25924 kb/s, 50 fps, 50 tbr, 50k tbn, 100 tbc (default)
 Metadata:
 creation_time : 2021-09-11T14:18:33.000000Z
 handler_name : Mainconcept MP4 Video Media Handler
 encoder : AVC Coding
 Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 125 kb/s (default)
 Metadata:
 creation_time : 2021-09-11T14:18:33.000000Z
 handler_name : Mainconcept MP4 Sound Media Handler
Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
 Stream #0:1 -> #0:1 (aac (native) -> aac (native))
Press [q] to stop, [?] for help
[AVFilterGraph @ 0x55ee57567340] No such filter: 'v360'
Error reinitializing filters!
Failed to inject frame into filter network: Invalid argument
Error while processing the decoded data for stream #0:0
Conversion failed!



-
ffmpeg, /dev/video0, -f decklink
20 March 2019, by Camille Goudeseune — I'm trying to capture video from a PCI card, the Blackmagic DeckLink Mini Recorder, via ffmpeg, on a headless host running Ubuntu 18.04.2 LTS, hopefully with a command like
ffmpeg -f decklink -i /dev/video0 ...
How can I make that work? I have two obstacles.
No /dev/video0
ffmpeg -i /dev/video0 ...

fails:

/dev/video0: No such device or address

v4l2-ctl --list-devices

fails with the same error message.

I built /dev/video0, and it looks okay:

mknod /dev/video0 c 81 0
chown root.video /dev/video0
chmod g+rw /dev/video0

To compare this file with a working one, I ran

strace cat /dev/video0

on this host, and on another host (Ubuntu 14) with a working /dev/video0. The outputs began to differ here (good, then bad):

fstat(1, {st_mode=S_IFREG|0644, st_size=0, ...}) = 0
open("/dev/video0", O_RDONLY) = 3
fstat(3, {st_mode=S_IFCHR|0660, st_rdev=makedev(81, 0), ...}) = 0
fadvise64(3, 0, 0, POSIX_FADV_SEQUENTIAL) = 0
----
fstat(1, {st_mode=S_IFCHR|0620, st_rdev=makedev(136, 0), ...}) = 0
openat(AT_FDCWD, "/dev/video0", O_RDONLY) = -1 ENXIO (No such device or address)

So /dev/video0 is broken at a level lower than ffmpeg or v4l2 or even cat.
On Ubuntu 14, man 8 MAKEDEV suggests that the error message means that "the kernel does not have the driver configured or loaded." This Ubuntu 18 host lacks that manpage, but it does have a few /snap/core/*/sbin/MAKEDEV, all the same, so I tried

/snap/core/6350/sbin/MAKEDEV -n -v video

It would have created over a hundred devices videoXX, radioXX, vtxXX, vbiXX. Those devices didn't exist yet, so it seemed harmless to try it:

rm /dev/video0; /snap/core/6350/sbin/MAKEDEV video

That rebuilt /dev/video0, but "No such device" remains, from cat or ffmpeg.
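
A couple of further checks I can run to see whether the kernel ever registered a V4L2 device for this card (a minimal sketch; on a modern system the device node would normally be created by udev rather than by hand):

# Any V4L2 device the kernel knows about shows up here; an empty
# directory means no driver has registered a video capture device.
ls /sys/class/video4linux/

# Kernel messages from the Blackmagic driver, if it loaded and saw the card.
dmesg | grep -i -e blackmagic -e decklink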
No decklink
ffmpeg -f decklink ...

fails with

Unknown input format: 'decklink'

Neither black nor deck nor link is mentioned by ffmpeg -devices (fbdev, lavfi, oss, v4l2) and ffmpeg -formats (about 350), either for Ubuntu's own version 3.4.4-0ubuntu0.18.04.1, or for version N-93330-g7ff89574c7 compiled from source on 2019 Mar 13:

git clone https://git.ffmpeg.org/ffmpeg.git ffmpeg
cd ffmpeg
./configure --enable-nonfree --disable-doc --disable-w32threads --enable-pthreads

(Although ./configure --help mentions --enable-decklink, using that yielded "ERROR: DeckLinkAPI.h not found." updatedb && locate DeckLinkAPI.h finds no file with that name, either.)

The DeckLink PCI card is recognized by hwinfo and lspci. lsmod reports the loaded modules blackmagic and blackmagic_io. Maybe the PCI card is installed ok, but ffmpeg just can't reach it because I can't configure it for that.
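
In case it helps to see what I think the missing step is: as far as I understand, DeckLinkAPI.h ships with the Blackmagic DeckLink SDK rather than with the driver package, so this is roughly the configure invocation I would expect to need once the SDK is downloaded and unpacked. The SDK path below is only a placeholder, not something that exists on this host:

# Point configure at the SDK's Linux headers so --enable-decklink can find DeckLinkAPI.h;
# $HOME/BlackmagicSDK is a placeholder for wherever the SDK gets unpacked.
./configure --enable-nonfree --enable-decklink \
            --extra-cflags="-I$HOME/BlackmagicSDK/Linux/include"
make -j"$(nproc)"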
Edit: Rebooting didn't fix anything.
-
Using FFMPEG for Bidirectional Voice Communication with Symmetric RTP
3 February 2024, by Batuhan Öksüz — I am trying to set up voice communication between two peers. The first peer is my local machine, which is behind a symmetric NAT. The second peer is my server running on an AWS EC2 instance, which has a public IP address. I want to use FFMPEG to send the audio stream through RTP while at the same time listening on a known port to receive the audio stream the remote peer sends to my device. In order not to deal with NAT traversal issues, I want to be able to use, for receiving, the same IP address and port number I use for sending on my local device. Is this plausible? I'm starting to think that it is not possible, and here is my rationale:


- If my understanding is correct, UDP doesn't allow full-duplex communication; that is, one cannot use one IP:port pair for both sending and receiving data packets at the same time. Is that correct? (See the sketch right after this list.)
- If one wants to bind a socket that has already been bound, the function throws a bind error saying "Address already in use." I figured there is a UDP option to allow binding to the same port or address even if they are in use. These options are namely SO_REUSEADDR and SO_REUSEPORT. However, this page states that it is only possible if the port is in the TIME_WAIT state. So this also supports my suspicion.
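
To sanity-check the first point, here is a minimal sketch I could run between the two machines, assuming the OpenBSD variant of netcat and reusing the addresses and port from my commands further down:

# On the Mac (192.168.1.64): one UDP socket, bound to local port 16386,
# used both to send to and to receive from the Pi.
nc -u -p 16386 192.168.1.72 16386

# On the Raspberry Pi (192.168.1.72): the mirror image.
nc -u -p 16386 192.168.1.64 16386

If text typed on one side appears on the other in both directions, then a single UDP socket can evidently send and receive at the same time, and the limitation I'm worried about would not be in UDP itself.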






On the other hand, there is symmetric RTP, which clearly states that a device can receive an RTP stream on the same address and port from which it simultaneously transmits. In the exact words of the RFC:




A device supports symmetric RTP if it selects, communicates, and uses
IP addresses and port numbers such that, when receiving a
bidirectional RTP media stream on UDP port "A" and IP address "a", it
also transmits RTP media for that stream from the same source UDP
port "A" and IP address "a". That is, it uses the same UDP port to
transmit and receive one RTP stream.






A device that doesn't support symmetric RTP would transmit RTP from a
different port, or from a different IP address, than the port and IP
address used to receive RTP for that bidirectional media steam.




So this is where I get confused. Does symmetric RTP somehow work around the limitations of UDP? What am I getting wrong?


Now, going back to FFMPEG and the use of symmetric RTP, I see there is an rtp option we can use to set it up, the so-called localrtpport=n. I can find almost no explanation of what it does and how it's useful, though. Can anyone help me with that? As far as I can tell, this option tells FFMPEG to use port "n" as the outbound port when transmitting an RTP stream. So if the receiver transmitted its stream to this port, then the problem of the symmetric NAT requirement would be resolved. Or so I thought...


To paint a better picture, here are my FFMPEG commands (I'm trying everything on my local network in these commands):


# My Mac with en0 IP of 192.168.1.64
ffmpeg \
-hide_banner \
-re \
-fflags +genpts \
-f lavfi -i aevalsrc="sin(400*2*PI*t)" \
-ar 8000 \
-f mulaw \
-f rtp -reuse 1 "rtp://192.168.1.72:9193?reuse=1&localrtpport=16386&localrtcpport=16387" \
-protocol_whitelist udp,rtp \
-i rtp://192.168.1.64:16386 \
audio_signal_with_symmetric_rtp.mp3



Here I am simply generating a fixed sound signal and outputting it in mulaw format through RTP. I am using the 'localrtpport' option to set my outbound port, and I am expecting to receive the remote peer's stream on the same port. This command starts running and waits for the incoming stream. As soon as I start transmitting the stream from my Raspberry Pi, which is on the same wireless network, I get the dreaded "Address already in use." error and the process terminates. Here is the command I use on the Raspberry:


# My Raspberry Pi with wlan0 IP of 192.168.1.72
ffmpeg \
-hide_banner \
-re \
-f lavfi -i aevalsrc="sin(400*2*PI*t)" \
-ar 8000 \
-f mulaw \
-f rtp "rtp://192.168.1.64:16386?reuse=1&localrtpport=16386"



TLDR


The short form of the question comes down to this: how can I make use of symmetric RTP with FFMPEG and receive a stream on the same port from which I am actively transmitting another stream? Is what I'm asking impossible? Should I go for an alternate route and try to set up a TURN server for my system? Any help would be appreciated.