Advanced search

Media (1)

Keyword: - Tags -/stallman

Other articles (93)

  • Mediabox: opening images in the maximum space available to the user

    8 February 2011

    Image display is constrained by the width allowed by the site’s design (which depends on the theme in use), so images are shown at a reduced size. To take advantage of all the space available on the user’s screen, a feature can be added that displays the image in a media box overlaid on top of the rest of the content.
    To do this, the "Mediabox" plugin must be installed.
    Configuring the media box
    As soon as (...)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media-sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Authorizations overridden by plugins

    27 April 2010

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

On other sites (12024)

  • Bash: displaying selected output, do not print unnecessary output

    2 July 2014, by Guillaume

    I don’t know if this is possible:
    I’m using ffmpeg and I would like to reduce the output of a command. I have this result:

    ffmpeg version 2.2.git Copyright (c) 2000-2014 the FFmpeg developers
     built on Jun 17 2014 11:08:12 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
     configuration: --prefix=/usr/local --enable-gpl --enable-nonfree --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-libvidstab --enable-libx265
     libavutil      52. 89.100 / 52. 89.100
     libavcodec     55. 67.100 / 55. 67.100
     libavformat    55. 43.100 / 55. 43.100
     libavdevice    55. 13.101 / 55. 13.101
     libavfilter     4.  8.100 /  4.  8.100
     libswscale      2.  6.100 /  2.  6.100
     libswresample   0. 19.100 /  0. 19.100
     libpostproc    52.  3.100 / 52.  3.100
    Input #0, hls,applehttp, from 'http://ftvodhdsecz-f.akamaihd.net/i/streaming-adaptatif_france-dom-tom/2014/S26/J7/104904507-20140629-,398,632,934,k.mp4.csmil/index_2_av.m3u8?null=':
     Duration: 00:51:05.07, start: 0.100667, bitrate: 0 kb/s
     Program 0
       Metadata:
         variant_bitrate : 0
       Stream #0:0: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p, 704x396 [SAR 1:1 DAR 16:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
       Stream #0:1: Audio: aac ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 67 kb/s
       Stream #0:2: Data: timed_id3 (ID3  / 0x20334449)
    File '/media/path/video.mkv' already exists. Overwrite ? [y/N] y
    [matroska @ 0x2958840] Error parsing AAC extradata, unable to determine samplerate.
    Output #0, matroska, to '/media/path/video.mkv':
     Metadata:
       encoder         : Lavf55.43.100
       Stream #0:0: Video: h264 (H264 / 0x34363248), yuv420p, 704x396 [SAR 1:1 DAR 16:9], q=2-31, 25 fps, 1k tbn, 90k tbc
       Stream #0:1: Audio: aac ([255][0][0][0] / 0x00FF), 48000 Hz, stereo, 67 kb/s
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
     Stream #0:1 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    [hls,applehttp @ 0x2864c20] Failed to open segment of playlist 0ate= 844.6kbits/s    
    frame= 3000 fps=174 q=-1.0 Lsize=   12325kB time=00:02:00.00 bitrate= 841.4kbits/s

    I would just like to have these 4 pieces of information:

    1) Duration: 00:51:05.07, start: 0.100667, bitrate: 0 kb/s
    2)  File '/media/path/video.mkv' already exists. Overwrite ? [y/N] y
    3)  Output #0, matroska, to '/media/path/video.mkv':
    4)  frame= 3000 fps=174 q=-1.0 Lsize=   12325kB time=00:02:00.00 bitrate= 841.4kbits/s

    I’ve tried the -v option, but the output is either -v info (the long output above), -v warning, or -v error. None of these gives me what I would like to have.
    I’ve seen this question, but there the output is suppressed entirely. Can I make an exception for specific strings?
    Thanks all.

    Edit: the line in my script looks like this:

    ffmpeg -i "${M3U2}" -vcodec copy -acodec copy "${Directory}/${PROG}_${ID}.mkv"
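
    A minimal sketch of one way to do this filtering in Bash (assumptions: GNU grep for --line-buffered, and overwriting with -y, since an interactive "Overwrite ?" prompt does not pass usefully through a pipe; the wording of the matched lines can differ between ffmpeg versions):

    # ffmpeg logs to stderr, so 2>&1 feeds the log into the pipe; tr turns the
    # carriage-return progress updates into separate lines so grep can see them.
    # M3U2, Directory, PROG and ID are the variables from the script above.
    ffmpeg -y -i "${M3U2}" -vcodec copy -acodec copy "${Directory}/${PROG}_${ID}.mkv" 2>&1 \
      | tr '\r' '\n' \
      | grep --line-buffered -E 'Duration:|^Output #0|^frame='

    This prints every refresh of the frame= line rather than only the last one; thinning that further (or testing for the output file before calling ffmpeg, to replace the suppressed overwrite prompt) is left to the surrounding script.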

  • FFMPEG Unable to find a suitable output format for 'pipe:' pipe:: Invalid argument

    1 April 2021, by ashiyaa nunhuck

    I want to stream video from my drone in real time to a web UI using ffmpeg, but I get the following error. The code works, but I don’t think I should have gotten this error. Can somebody help with this issue? These are my drone commands along with the video-streaming code; I have only included the code that receives the data.
    Below is the error:

    pipe error

    My code is as follows:

    logger = logging.getLogger(__name__)

DEFAULT_DISTANCE = 0.30
DEFAULT_SPEED = 10
DEFAULT_DEGREE = 10

FRAME_X = int(960/3)
FRAME_Y = int(720/3)
FRAME_AREA = FRAME_X * FRAME_Y

FRAME_SIZE = FRAME_AREA * 3
FRAME_CENTER_X = FRAME_X / 2
FRAME_CENTER_Y = FRAME_Y / 2

CMD_FFMPEG = (f'ffmpeg - hwaccel auto -hwaccel_device opencl -i pipe:0 '
              f'-pix_fmt bgr24 -s {FRAME_X}x{FRAME_Y}  -f rawvideo pipe:1')


class DroneManager(metaclass=Singleton):
    def __init__(self, host_ip='192.168.10.2', host_port=8890,
                 drone_ip='192.168.10.1', drone_port=8889,
                 is_imperial=False, speed=DEFAULT_SPEED):
        self.host_ip = host_ip
        self.host_port = host_port
        self.drone_ip = drone_ip
        self.drone_port = drone_port
        self.drone_address = (drone_ip, drone_port)
        self.is_imperial = is_imperial
        self.speed = speed
        self.socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.socket.bind((self.host_ip, self.host_port))

        self.response = None
        self.stop_event = threading.Event()
        self._response_thread = threading.Thread(target=self.receive_response,
                                                 args=(self.stop_event,))
        self._response_thread.start()

        self.patrol_event = None
        self.is_patrol = False
        self._patrol_semaphore = threading.Semaphore(1)
        self._thread_patrol = None

        self.proc = subprocess.Popen(CMD_FFMPEG.split(' '),
                                     stdin=subprocess.PIPE,
                                     stdout=subprocess.PIPE)
        self.proc_stdin = self.proc.stdin
        self.proc_stdout = self.proc.stdout

        self.video_port = 11111

        self._receive_video_thread = threading.Thread(
            target=self.receive_video,
            args=(self.stop_event, self.proc_stdin,
                  self.host_ip, self.video_port,))
        self._receive_video_thread.start()

        self.send_command('command')
        self.send_command('streamon')
        self.set_speed(self.speed)

    def receive_response(self, stop_event):
        while not stop_event.is_set():
            try:
                self.response, ip = self.socket.recvfrom(3000)
                logger.info({'action': 'receive_response',
                             'response': self.response})
            except socket.error as ex:
                logger.error({'action': 'receive_response',
                              'ex': ex})
                break

    def __dell__(self):
        self.stop()

    def stop(self):
        self.stop_event.set()
        retry = 0
        while self._response_thread.is_alive():
            time.sleep(0.3)
            if retry > 30:
                break
            retry += 1
        self.socket.close()
        os.kill(self.proc.pid, signal.CTRL_C_EVENT)

    def send_command(self, command):
        logger.info({'action': 'send_command', 'command': command})
        self.socket.sendto(command.encode('utf-8'), self.drone_address)

        retry = 0
        while self.response is None:
            time.sleep(0.3)
            if retry > 3:
                break
            retry += 1

        if self.response is None:
            response = None
        else:
            response = self.response.decode('utf-8')
        self.response = None
        return response

    def takeoff(self):
        return self.send_command('takeoff')

    def land(self):
        return self.send_command('land')

    def move(self, direction, distance):
        distance = float(distance)
        if self.is_imperial:
            distance = int(round(distance * 30.48))
        else:
            distance = int(round(distance * 100))
        return self.send_command(f'{direction} {distance}')

    def up(self, distance=DEFAULT_DISTANCE):
        return self.move('up', distance)

    def down(self, distance=DEFAULT_DISTANCE):
        return self.move('down', distance)

    def left(self, distance=DEFAULT_DISTANCE):
        return self.move('left', distance)

    def right(self, distance=DEFAULT_DISTANCE):
        return self.move('right', distance)

    def forward(self, distance=DEFAULT_DISTANCE):
        return self.move('forward', distance)

    def back(self, distance=DEFAULT_DISTANCE):
        return self.move('back', distance)

    def set_speed(self, speed):
        return self.send_command(f'speed {speed}')

    def clockwise(self, degree=DEFAULT_DEGREE):
        return self.send_command(f'cw {degree}')

    def counter_clockwise(self, degree=DEFAULT_DEGREE):
        return self.send_command(f'ccw {degree}')

    def flip_front(self):
        return self.send_command('flip f')

    def flip_back(self):
        return self.send_command('flip b')

    def flip_left(self):
        return self.send_command('flip l')

    def flip_right(self):
        return self.send_command('flip r')

    def patrol(self):
        if not self.is_patrol:
            self.patrol_event = threading.Event()
            self._thread_patrol = threading.Thread(
                target=self._patrol,
                args=(self._patrol_semaphore, self.patrol_event,))
            self._thread_patrol.start()
            self.is_patrol = True

    def stop_patrol(self):
        if self.is_patrol:
            self.patrol_event.set()
            retry = 0
            while self._thread_patrol.is_alive():
                time.sleep(0.3)
                if retry > 300:
                    break
                retry += 1
            self.is_patrol = False

    def _patrol(self, semaphore, stop_event):
        is_acquire = semaphore.acquire(blocking=False)
        if is_acquire:
            logger.info({'action': '_patrol', 'status': 'acquire'})
            with contextlib.ExitStack() as stack:
                stack.callback(semaphore.release)
                status = 0
                while not stop_event.is_set():
                    status += 1
                    if status == 1:
                        self.up()
                    if status == 2:
                        self.clockwise(180)
                    if status == 3:
                        self.down()
                    if status == 4:
                        status = 0
                    time.sleep(5)
        else:
            logger.warning({'action': '_patrol', 'status': 'not_acquire'})

    def receive_video(self, stop_event, pipe_in, host_ip, video_port):
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock_video:
            sock_video.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            sock_video.settimeout(5)
            sock_video.bind((host_ip, video_port))
            data = bytearray(2048)
            while not self.stop_event.is_set():
                try:
                    size, addr = sock_video.recvfrom_into(data)
                    logger.info({'action': 'receive_video', 'data': data})
                except socket.timeout as ex:
                    logger.warning({'action': 'receive_video', 'ex': ex})
                    time.sleep(0.5)
                    continue
                except socket.error as ex:
                    logger.error({'action': 'receive_video', 'ex': ex})
                    break

                try:
                    pipe_in.write(data[:size])
                    pipe_in.flush()
                except Exception as ex:
                    logger.info({'action': 'receive_video', 'ex': ex})
                    break
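
A hedged guess, based only on the CMD_FFMPEG string above: the first option is written as "- hwaccel" with a space, so ffmpeg treats the lone "-" as an output URL (pipe:) before any output format has been declared, which is exactly what produces "Unable to find a suitable output format for 'pipe:'". A sketch of the intended command as a single shell line, with the space removed (320x240 corresponds to the FRAME_X and FRAME_Y values computed above; whether -hwaccel_device opencl is usable depends on the local ffmpeg build):

# Corrected form of CMD_FFMPEG as a shell command: "-hwaccel" is one token,
# compressed video comes in on stdin (pipe:0), raw BGR frames go out on stdout (pipe:1).
ffmpeg -hwaccel auto -hwaccel_device opencl -i pipe:0 \
       -pix_fmt bgr24 -s 320x240 -f rawvideo pipe:1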


    


  • MPEG-DASH create initialization segment

    3 novembre 2014, par static

    I’m segmenting the desktop video capture using ffmpeg -segment and sending the segments over the network so that they can be served to clients and played with dash.js. The problem is that the player searches for the initialization segment, and I can’t figure out how to create it.
    I create the segments using this ffmpeg command:

    ffmpeg -rtbufsize 1500M -f dshow -r 15 -i video="UScreenCapture"
     -flags +global_header -vcodec libvpx -crf 10 -quality good -keyint_min 15 -g 15
     -cpu-used 3 -b:v 1000k -qmin 10 -qmax 42 -threads 2 -vf scale=-1:480 -bufsize 1500
     -map 0 -f stream_segment -segment_time 2 -segment_format webm http://localhost:3000/stream/22/%03d

    The manifest that I create for the stream looks something like this:

    <MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic" availabilityStartTime="2014-06-19T07:47:40.079Z" minBufferTime="PT0S" profiles="urn:mpeg:dash:profile:isoff-live:2011" suggestedPresentationDelay="PT40S" maxSegmentDuration="PT2.000S" minimumUpdatePeriod="PT1000M">
      <Period bitstreamSwitching="true" start="PT0S">
        <AdaptationSet mimeType="video/webm" segmentAlignment="true" startWithSAP="1" maxWidth="1280" maxHeight="720" maxFrameRate="15">
          <ContentComponent contentType="video"/>
          <SegmentTemplate presentationTimeOffset="0" timescale="90000" media="$Number$/" duration="180000" startNumber="0"/>
          <Representation width="853" height="480" frameRate="15" bandwidth="1000000" codecs="vp8"/>
        </AdaptationSet>
      </Period>
    </MPD>

    The player’s debug mode prints the following:

    Getting the request for time: 0 dash.all.js:2073
    Index for time 0 is 0 dash.all.js:2073
    Waiting for more video buffer before starting playback. dash.all.js:2073
    BufferController video seek: 0 dash.all.js:2073
    Marking a special seek for initial video playback. dash.all.js:2073
    Start searching for initialization. dash.all.js:2073
    Perform init search: stream/22/ dash.all.js:2073
    Getting the request for time: 0 dash.all.js:2073
    Index for time 0 is 0 dash.all.js:2073
    Data changed - loading the video fragment for time: 0 dash.all.js:2073
    Getting the request for time: 0

    How can I create the initialization segment for the generated segments? I can’t seem to get it to work.
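
    One direction that may help (a sketch under assumptions, not a verified recipe): for WebM, the initialization segment is everything that precedes the first Cluster (the EBML header, segment information and track entries), and a live-profile SegmentTemplate has to point at it explicitly, for example with initialization="init.webm" alongside media="$Number$.chk". ffmpeg builds newer than the 2.2 build quoted earlier on this page include a webm_chunk muxer whose header option writes exactly that initialization data to a separate file while numbering the media chunks; the command below is a hypothetical adaptation of the capture command above and should be checked against the local build:

    # Hypothetical sketch: split the live VP8 capture into an initialization header
    # plus numbered media chunks; chunk boundaries follow keyframes, so -r and -g
    # together control the chunk duration. The stream/22/ paths echo the question.
    ffmpeg -rtbufsize 1500M -f dshow -r 15 -i video="UScreenCapture" \
           -c:v libvpx -b:v 1000k -crf 10 -quality good -keyint_min 30 -g 30 \
           -vf scale=-1:480 -threads 2 \
           -f webm_chunk \
           -header "stream/22/init.webm" \
           -chunk_start_index 0 \
           stream/22/%d.chk

    With -r 15 and -g 30, each chunk should cover roughly the 2-second segment duration advertised in the manifest; the MPD’s startNumber should match -chunk_start_index, and the SegmentTemplate media pattern should match the chunk file names.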