
Media (1)
-
SPIP - plugins - embed code - Example
2 September 2013, by
Updated: September 2013
Language: French
Type: Image
Other articles (51)
-
Publishing on MediaSPIP
13 June 2013. Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out. -
No talk of markets, clouds, etc.
10 April 2011. The vocabulary used on this site tries to avoid any reference to the buzzwords that flourish so freely on web 2.0 and in the companies that live off it.
You are therefore invited to avoid using terms such as "Brand", "Cloud", "Market", etc.
Our motivation is above all to create a simple tool, accessible to everyone, that encourages the sharing of creative work on the Internet and allows authors to keep as much autonomy as possible.
No "Gold or Premium contract" is therefore planned, no (...) -
Adding user-specific information and other author-related behaviour changes
12 April 2011. The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you modify certain user-related behaviours (refer to its documentation for more information).
It is also possible to add fields to authors by installing the plugins "champs extras 2" and "Interface pour champs extras".
On other sites (9641)
-
FFMPEG Unable to find a suitable output format for 'pipe:' pipe:: Invalid argument
1 April 2021, by ashiyaa nunhuck. I want to stream video from my drone in real time to a web UI using ffmpeg, but I get the following error.
The code works, but I don't think I should have gotten this error.
Can somebody help with this issue? Below are my drone commands along with the video-streaming code. I have only included the code that receives the data.
Below is the error:
pipe error


My code is as follows:


import contextlib
import logging
import os
import signal
import socket
import subprocess
import threading
import time

# Singleton is a custom metaclass assumed to be defined elsewhere in the project.

logger = logging.getLogger(__name__)

DEFAULT_DISTANCE = 0.30
DEFAULT_SPEED = 10
DEFAULT_DEGREE = 10

FRAME_X = int(960/3)
FRAME_Y = int(720/3)
FRAME_AREA = FRAME_X * FRAME_Y

FRAME_SIZE = FRAME_AREA * 3
FRAME_CENTER_X = FRAME_X / 2
FRAME_CENTER_Y = FRAME_Y / 2

CMD_FFMPEG = (f'ffmpeg -hwaccel auto -hwaccel_device opencl -i pipe:0 '
              f'-pix_fmt bgr24 -s {FRAME_X}x{FRAME_Y} -f rawvideo pipe:1')


class DroneManager(metaclass=Singleton):
    def __init__(self, host_ip='192.168.10.2', host_port=8890,
                 drone_ip='192.168.10.1', drone_port=8889,
                 is_imperial=False, speed=DEFAULT_SPEED):
        self.host_ip = host_ip
        self.host_port = host_port
        self.drone_ip = drone_ip
        self.drone_port = drone_port
        self.drone_address = (drone_ip, drone_port)
        self.is_imperial = is_imperial
        self.speed = speed
        self.socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.socket.bind((self.host_ip, self.host_port))

        self.response = None
        self.stop_event = threading.Event()
        self._response_thread = threading.Thread(target=self.receive_response,
                                                 args=(self.stop_event,))
        self._response_thread.start()

        self.patrol_event = None
        self.is_patrol = False
        self._patrol_semaphore = threading.Semaphore(1)
        self._thread_patrol = None

        # ffmpeg reads the raw drone video from stdin (pipe:0) and writes
        # decoded bgr24 frames to stdout (pipe:1).
        self.proc = subprocess.Popen(CMD_FFMPEG.split(' '),
                                     stdin=subprocess.PIPE,
                                     stdout=subprocess.PIPE)
        self.proc_stdin = self.proc.stdin
        self.proc_stdout = self.proc.stdout

        self.video_port = 11111

        self._receive_video_thread = threading.Thread(
            target=self.receive_video,
            args=(self.stop_event, self.proc_stdin,
                  self.host_ip, self.video_port,))
        self._receive_video_thread.start()

        self.send_command('command')
        self.send_command('streamon')
        self.set_speed(self.speed)

    def receive_response(self, stop_event):
        while not stop_event.is_set():
            try:
                self.response, ip = self.socket.recvfrom(3000)
                logger.info({'action': 'receive_response',
                             'response': self.response})
            except socket.error as ex:
                logger.error({'action': 'receive_response',
                              'ex': ex})
                break

    def __del__(self):
        self.stop()

    def stop(self):
        self.stop_event.set()
        retry = 0
        while self._response_thread.is_alive():
            time.sleep(0.3)
            if retry > 30:
                break
            retry += 1
        self.socket.close()
        os.kill(self.proc.pid, signal.CTRL_C_EVENT)

    def send_command(self, command):
        logger.info({'action': 'send_command', 'command': command})
        self.socket.sendto(command.encode('utf-8'), self.drone_address)

        # Wait briefly for the drone's reply, collected by receive_response().
        retry = 0
        while self.response is None:
            time.sleep(0.3)
            if retry > 3:
                break
            retry += 1

        if self.response is None:
            response = None
        else:
            response = self.response.decode('utf-8')
        self.response = None
        return response

    def takeoff(self):
        return self.send_command('takeoff')

    def land(self):
        return self.send_command('land')

    def move(self, direction, distance):
        distance = float(distance)
        if self.is_imperial:
            distance = int(round(distance * 30.48))
        else:
            distance = int(round(distance * 100))
        return self.send_command(f'{direction} {distance}')

    def up(self, distance=DEFAULT_DISTANCE):
        return self.move('up', distance)

    def down(self, distance=DEFAULT_DISTANCE):
        return self.move('down', distance)

    def left(self, distance=DEFAULT_DISTANCE):
        return self.move('left', distance)

    def right(self, distance=DEFAULT_DISTANCE):
        return self.move('right', distance)

    def forward(self, distance=DEFAULT_DISTANCE):
        return self.move('forward', distance)

    def back(self, distance=DEFAULT_DISTANCE):
        return self.move('back', distance)

    def set_speed(self, speed):
        return self.send_command(f'speed {speed}')

    def clockwise(self, degree=DEFAULT_DEGREE):
        return self.send_command(f'cw {degree}')

    def counter_clockwise(self, degree=DEFAULT_DEGREE):
        return self.send_command(f'ccw {degree}')

    def flip_front(self):
        return self.send_command('flip f')

    def flip_back(self):
        return self.send_command('flip b')

    def flip_left(self):
        return self.send_command('flip l')

    def flip_right(self):
        return self.send_command('flip r')

    def patrol(self):
        if not self.is_patrol:
            self.patrol_event = threading.Event()
            self._thread_patrol = threading.Thread(
                target=self._patrol,
                args=(self._patrol_semaphore, self.patrol_event,))
            self._thread_patrol.start()
            self.is_patrol = True

    def stop_patrol(self):
        if self.is_patrol:
            self.patrol_event.set()
            retry = 0
            while self._thread_patrol.is_alive():
                time.sleep(0.3)
                if retry > 300:
                    break
                retry += 1
            self.is_patrol = False

    def _patrol(self, semaphore, stop_event):
        is_acquire = semaphore.acquire(blocking=False)
        if is_acquire:
            logger.info({'action': '_patrol', 'status': 'acquire'})
            with contextlib.ExitStack() as stack:
                stack.callback(semaphore.release)
                status = 0
                while not stop_event.is_set():
                    status += 1
                    if status == 1:
                        self.up()
                    if status == 2:
                        self.clockwise(180)
                    if status == 3:
                        self.down()
                    if status == 4:
                        status = 0
                    time.sleep(5)
        else:
            logger.warning({'action': '_patrol', 'status': 'not_acquire'})

    def receive_video(self, stop_event, pipe_in, host_ip, video_port):
        # Receive the drone's UDP video packets and forward them into ffmpeg's stdin.
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock_video:
            sock_video.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            sock_video.settimeout(5)
            sock_video.bind((host_ip, video_port))
            data = bytearray(2048)
            while not self.stop_event.is_set():
                try:
                    size, addr = sock_video.recvfrom_into(data)
                    logger.info({'action': 'receive_video', 'data': data})
                except socket.timeout as ex:
                    logger.warning({'action': 'receive_video', 'ex': ex})
                    time.sleep(0.5)
                    continue
                except socket.error as ex:
                    logger.error({'action': 'receive_video', 'ex': ex})
                    break

                try:
                    pipe_in.write(data[:size])
                    pipe_in.flush()
                except Exception as ex:
                    logger.info({'action': 'receive_video', 'ex': ex})
                    break
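
For context, here is a minimal sketch (not part of the original post) of how the raw bgr24 frames that ffmpeg writes to pipe:1 could be read back from proc_stdout and turned into image arrays; NumPy is assumed to be installed, and the generator name is illustrative only:

import numpy as np

def frame_generator(drone_manager):
    # Read exactly one frame's worth of packed bgr24 bytes per iteration.
    # FRAME_SIZE, FRAME_X and FRAME_Y are the module-level constants defined above.
    while True:
        raw = drone_manager.proc_stdout.read(FRAME_SIZE)
        if raw is None or len(raw) < FRAME_SIZE:
            break  # ffmpeg exited or the pipe was closed
        yield np.frombuffer(raw, dtype=np.uint8).reshape((FRAME_Y, FRAME_X, 3))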



-
Dash.js not playing mpd files made with ffmpeg
31 December 2022, by Macster. I'm using ffmpeg to create the chunks and manifest of a webm file that I want to live-stream with Dash.js. Unfortunately, Dash.js won't play the mpd file, no matter how I create the chunks and manifest. However, the sample mpd URL from Dash.js works.


Commands


ffmpeg -re -r 25 -i Dash/strm.webm
-map 0:v:0
-pix_fmt yuv420p
-c:v libvpx
-s 640x480 -keyint_min 60 -g 60 -speed 6 -tile-columns 4 -frame-parallel 1 -threads 8 -static-thresh 0 -max-intra-rate 300 -deadline realtime -lag-in-frames 0 -error-resilient 1
-b:v 3000k
-f webm_chunk
-header "Dash/glass_360.hdr"
-chunk_start_index 1 Dash/glass_360_%d.chk
-map 0:a:0
-c:a libvorbis
-b:a 128k -ar 44100
-f webm_chunk
-audio_chunk_duration 2000
-header Dash/glass_171.hdr
-chunk_start_index 1 Dash/glass_171_%d.chk


//Manifest
ffmpeg
-f webm_dash_manifest -live 1
-i Dash/glass_360.hdr
-f webm_dash_manifest -live 1
-i Dash/glass_171.hdr
-c copy
-map 0 -map 1
-f webm_dash_manifest -live 1
-adaptation_sets "id=0,streams=0 id=1,streams=1"
-chunk_start_index 1
-chunk_duration_ms 2000
-time_shift_buffer_depth 7200
-minimum_update_period 7200 Dash/glass_video_manifest.mpd
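
For orientation, this is my reading of what the two commands above should leave in the Dash/ directory (not output taken from the original post):

Dash/glass_360.hdr                           # video initialization header
Dash/glass_360_1.chk, Dash/glass_360_2.chk, ...   # video chunks
Dash/glass_171.hdr                           # audio initialization header
Dash/glass_171_1.chk, Dash/glass_171_2.chk, ...   # 2-second audio chunks (-audio_chunk_duration 2000)
Dash/glass_video_manifest.mpd                # manifest produced by the second command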



Manifest output


ffmpeg version git-2020-05-27-8b5ffae Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 9.3.1 (GCC) 20200523
 configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
 libavutil 56. 49.100 / 56. 49.100
 libavcodec 58. 87.101 / 58. 87.101
 libavformat 58. 43.100 / 58. 43.100
 libavdevice 58. 9.103 / 58. 9.103
 libavfilter 7. 83.100 / 7. 83.100
 libswscale 5. 6.101 / 5. 6.101
 libswresample 3. 6.100 / 3. 6.100
 libpostproc 55. 6.100 / 55. 6.100
Input #0, webm_dash_manifest, from 'Dash/glass_360.hdr':
 Metadata:
 ENCODER : Lavf58.43.100
 Duration: N/A, bitrate: N/A
 Stream #0:0(eng): Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 1k tbr, 1k tbn, 1k tbc (default)
 Metadata:
 ALPHA_MODE : 1
 ENCODER : Lavc58.87.101 libvpx
 webm_dash_manifest_file_name: glass_360.hdr
 webm_dash_manifest_track_number: 1
Input #1, webm_dash_manifest, from 'Dash/glass_171.hdr':
 Metadata:
 ENCODER : Lavf58.43.100
 Duration: N/A, bitrate: N/A
 Stream #1:0(eng): Audio: vorbis, 44100 Hz, mono, fltp (default)
 Metadata:
 ENCODER : Lavc58.87.101 libvorbis
 webm_dash_manifest_file_name: glass_171.hdr
 webm_dash_manifest_track_number: 1
Output #0, webm_dash_manifest, to 'Dash/glass_video_manifest.mpd':
 Metadata:
 encoder : Lavf58.43.100
 Stream #0:0(eng): Video: vp8, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 1k tbr, 1k tbn, 1k tbc (default)
 Metadata:
 ALPHA_MODE : 1
 ENCODER : Lavc58.87.101 libvpx
 webm_dash_manifest_file_name: glass_360.hdr
 webm_dash_manifest_track_number: 1
 Stream #0:1(eng): Audio: vorbis, 44100 Hz, mono, fltp (default)
 Metadata:
 ENCODER : Lavc58.87.101 libvorbis
 webm_dash_manifest_file_name: glass_171.hdr
 webm_dash_manifest_track_number: 1
Stream mapping:
 Stream #0:0 -> #0:0 (copy)
 Stream #1:0 -> #0:1 (copy)
Press [q] to stop, [?] for help
frame= 0 fps=0.0 q=-1.0 Lsize= 1kB time=00:00:00.00 bitrate=N/A speed= 0x
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:4kB muxing overhead: unknown



Manifest file
(glass_video_manifest.mpd)

I tried to delete the ContentComponent element, as suggested in other questions, but it didn't work.

<?xml version="1.0" encoding="UTF-8"?>

<Period start="PT0S">
  <AdaptationSet mimeType="video/webm" codecs="vp8" lang="eng" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
    <ContentComponent type="video"/>
    <SegmentTemplate timescale="1000" duration="2000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
    <Representation bandwidth="1000000" width="640" height="480" codecs="vp8" mimeType="video/webm" startWithSAP="1"/>
  </AdaptationSet>
  <AdaptationSet mimeType="audio/webm" codecs="vorbis" lang="eng" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
    <ContentComponent type="audio"/>
    <SegmentTemplate timescale="1000" duration="2000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
    <Representation bandwidth="128000" audioSamplingRate="44100" codecs="vorbis" mimeType="audio/webm" startWithSAP="1"/>
  </AdaptationSet>
</Period>




Dash.js player


<script>
(function(){
  // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
  var url = "http://localhost:8081/videos/Dash/glass_live_manifest.mpd";
  var player = dashjs.MediaPlayer().create();

  // config
  targetLatency = 2.0;        // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
  minDrift = 0.05;            // Minimum latency deviation allowed before activating catch-up mechanism.
  catchupPlaybackRate = 0.5;  // Maximum catch-up rate, as a percentage, for low latency live streams.
  stableBuffer = 2;           // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
  bufferAtTopQuality = 2;     // The time that the internal buffer target will be set to once playing the top quality.

  player.updateSettings({
    'streaming': {
      'liveDelay': 2,
      'liveCatchUpMinDrift': 0.05,
      'liveCatchUpPlaybackRate': 0.5,
      'stableBufferTime': 2,
      'bufferTimeAtTopQuality': 2,
      'bufferTimeAtTopQualityLongForm': 2,
      'bufferToKeep': 2,
      'bufferAheadToKeep': 2,
      'lowLatencyEnabled': true,
      'fastSwitchEnabled': true,
      'abr': {
        'limitBitrateByPortal': true
      },
    }
  });

  console.log(player.getSettings());

  setInterval(() => {
    console.log('Live latency= ', player.getCurrentLiveLatency());
    console.log('Buffer length= ', player.getBufferLength('video'));
  }, 3000);

  player.initialize(document.querySelector("#videoPlayer"), url, true);
})();
</script>



Chrome


{debug: {…}, streaming: {…}}
dash.all.min.js:2 XHR finished loading: GET "http://localhost:8081/videos/Dash/glass_live_manifest.mpd".
load @ dash.all.min.js:2
C @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
se @ dash.all.min.js:2
te @ dash.all.min.js:2
initialize @ dash.all.min.js:2
(anonymous) @ Dash:92
(anonymous) @ Dash:94
DevTools failed to load SourceMap: Could not parse content for http://localhost:8081/js/dash.all.min.js.map: Cannot read property 'length' of undefined
Dash:88 Live latency= NaN
Dash:89 Buffer length= NaN
Dash:88 Live latency= NaN
Dash:89 Buffer length= NaN
Dash:88 Live latency= NaN
Dash:89 Buffer length= NaN
Dash:88 Live latency= NaN
Dash:89 Buffer length= NaN
Dash:88 Live latency= NaN
Dash:89 Buffer length= NaN



UPDATE


Well, it seems the problem in general was that the MPDs wouldn't play from that /dash folder. So I took a look into the code and found a bad route. Anyway, the MPD wouldn't start with the command I originally used, probably because it creates a dynamic manifest, as @Markus Schumann says. So I'm going with a new one, which seems to be working for now, but not very well.

ffmpeg -y -re -i strm.webm 
-c:v libx264 -x264opts "keyint=24:min-keyint=24:no-scenecut" 
-r 24 -c:a aac -b:a 128k -bf 1 -b_strategy 0 -sc_threshold 0 -pix_fmt yuv420p 
-map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -b:v:0 250k 
-filter:v:0 "scale=-2:240" -profile:v:0 baseline -b:v:1 750k 
-filter:v:1 "scale=-2:480" -profile:v:1 main -b:v:2 1500k 
-filter:v:2 "scale=-2:720" -profile:v:2 high 
-use_timeline 1 -use_template 1 -window_size 5 -adaptation_sets "id=0,streams=v id=1,streams=a" 
-f dash glass_video_manifest.mpd
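
For reference, and only as an illustration I am adding (the attribute values are placeholders, not taken from the question's files): the "dynamic manifest" point shows up in the MPD root element, which the pasted manifest above no longer includes. Roughly:

<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic" availabilityStartTime="...">
  <!-- live presentation: the player keeps refreshing the manifest -->
</MPD>

<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static" mediaPresentationDuration="...">
  <!-- on-demand presentation: fetched once and played as a fixed timeline -->
</MPD>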



-
Bash: displaying selected output, do not print unnecessary output
2 July 2014, by Guillaume. I don't know if it's possible: I'm using ffmpeg and I would like to reduce the output of a command. I get this result:
ffmpeg version 2.2.git Copyright (c) 2000-2014 the FFmpeg developers
built on Jun 17 2014 11:08:12 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
configuration: --prefix=/usr/local --enable-gpl --enable-nonfree --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-libvidstab --enable-libx265
libavutil 52. 89.100 / 52. 89.100
libavcodec 55. 67.100 / 55. 67.100
libavformat 55. 43.100 / 55. 43.100
libavdevice 55. 13.101 / 55. 13.101
libavfilter 4. 8.100 / 4. 8.100
libswscale 2. 6.100 / 2. 6.100
libswresample 0. 19.100 / 0. 19.100
libpostproc 52. 3.100 / 52. 3.100
Input #0, hls,applehttp, from 'http://ftvodhdsecz-f.akamaihd.net/i/streaming-adaptatif_france-dom-tom/2014/S26/J7/104904507-20140629-,398,632,934,k.mp4.csmil/index_2_av.m3u8?null=':
Duration: 00:51:05.07, start: 0.100667, bitrate: 0 kb/s
Program 0
Metadata:
variant_bitrate : 0
Stream #0:0: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p, 704x396 [SAR 1:1 DAR 16:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
Stream #0:1: Audio: aac ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 67 kb/s
Stream #0:2: Data: timed_id3 (ID3 / 0x20334449)
File '/media/path/video.mkv' already exists. Overwrite ? [y/N] y
[matroska @ 0x2958840] Error parsing AAC extradata, unable to determine samplerate.
Output #0, matroska, to '/media/path/video.mkv':
Metadata:
encoder : Lavf55.43.100
Stream #0:0: Video: h264 (H264 / 0x34363248), yuv420p, 704x396 [SAR 1:1 DAR 16:9], q=2-31, 25 fps, 1k tbn, 90k tbc
Stream #0:1: Audio: aac ([255][0][0][0] / 0x00FF), 48000 Hz, stereo, 67 kb/s
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Stream #0:1 -> #0:1 (copy)
Press [q] to stop, [?] for help
[hls,applehttp @ 0x2864c20] Failed to open segment of playlist 0ate= 844.6kbits/s
frame= 3000 fps=174 q=-1.0 Lsize= 12325kB time=00:02:00.00 bitrate= 841.4kbits/s
I would just like to have these 4 pieces of information:
1) Duration: 00:51:05.07, start: 0.100667, bitrate: 0 kb/s
2) File '/media/path/video.mkv' already exists. Overwrite ? [y/N] y
3) Output #0, matroska, to '/media/path/video.mkv':
4) frame= 3000 fps=174 q=-1.0 Lsize= 12325kB time=00:02:00.00 bitrate= 841.4kbits/s
I've tried the -v option, but the output is either -v info (this long output), -v warning, or -v error. None of these is what I would like to have.
I've seen this question, but there the output is cleared completely. Can I make an exception for specific strings?
Thanks all. Edit: the line in my script looks like this:
ffmpeg -i "${M3U2}" -vcodec copy -acodec copy "${Directory}/${PROG}_${ID}.mkv"
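
One possible way to get closer to that (a sketch on my part, not from the original thread; note that -n replaces the interactive overwrite prompt, which does not combine well with a piped stderr):

# -hide_banner drops the version/configuration block; grep keeps only the wanted lines.
# The running "frame=..." counter is written with carriage returns, so it only shows up
# once the final progress line is flushed.
ffmpeg -hide_banner -n -i "${M3U2}" -vcodec copy -acodec copy \
  "${Directory}/${PROG}_${ID}.mkv" 2>&1 \
  | grep -E "Duration:|already exists|Output #0|frame="

The -v/-loglevel option already tried only filters by severity, so it cannot keep individual informational lines like these; filtering the text itself is the simpler route.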