
Other articles (34)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used instead.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both on conventional computers (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, it is necessary to create a SPIP article and attach the "source" video document to it.
    When this document is attached to the article, two actions in addition to the normal behaviour are executed: retrieval of the technical information of the file's audio and video streams; and generation of a thumbnail: extraction of a (...)
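
    The two extra actions described above (probing stream information and extracting a thumbnail frame) map naturally onto ffprobe/ffmpeg invocations. A sketch of the commands involved (SPIPMotion's actual implementation is not shown in this excerpt, so the helper names and the default seek offset here are illustrative):

    ```python
    def probe_streams_cmd(source: str) -> list[str]:
        """Command that dumps a file's audio/video stream info as JSON."""
        return ["ffprobe", "-v", "error", "-show_streams",
                "-print_format", "json", source]


    def thumbnail_cmd(source: str, thumb: str, at_seconds: float = 1.0) -> list[str]:
        """Command that extracts a single frame as a thumbnail image."""
        return ["ffmpeg", "-ss", str(at_seconds), "-i", source,
                "-frames:v", "1", thumb]
    ```

    Both command lists can be executed with subprocess.run; parsing ffprobe's JSON output yields codec, resolution, and duration for each stream.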

On other sites (7178)

  • How to create a named pipe in Windows?

    10 February 2023, by Che

    Is it possible, without much effort (e.g. from the command line or PowerShell), to create a named pipe?

    My goal is to write continuously to that pipe from an ffmpeg process.

    Without creating the pipe first, the following command

    ffmpeg -i "path\to\my\File\name of my File" -f webm \\.\pipe\from_ffmpeg

    fails with

    "\\.\pipe\from_ffmpeg : No such file or directory"

    In the big picture, I want to read a live web video stream, analyze it, and take live actions based on it. I am working with OpenCV in Java on a Windows machine. At the moment I have several ffmpeg processes, each of which records a different sector (i.e. the pixel regions (45, 45, 100, 100) and (200, 200, 100, 100), as (x, y, height, width)). The results are saved as JPG files in the filesystem and are then opened by a Java process. This works, but I think I would gain significant performance by piping my input directly into the Java process instead of taking the long way over files.

    I know there's an option to live-capture video streams via OpenCV, but the framework does not support as many formats as ffmpeg does.

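
    For context, a named pipe must exist, with a reader or server attached, before ffmpeg can open it as an output. A minimal POSIX sketch of that create-then-connect dance using os.mkfifo (the path name is arbitrary; the Windows equivalent via pywin32's win32pipe is only noted in the comments, not shown):

    ```python
    import os
    import tempfile
    import threading


    def pipe_roundtrip(payload: bytes) -> bytes:
        """Create a named pipe (FIFO), write payload into it, read it back."""
        # POSIX: the pipe is a filesystem entry created with os.mkfifo.
        # Windows has no mkfifo; pipes live under \\.\pipe\ and must be
        # created by a server, e.g. pywin32's win32pipe.CreateNamedPipe.
        path = os.path.join(tempfile.mkdtemp(), "from_ffmpeg")
        os.mkfifo(path)

        result = {}

        def reader():
            # Opening a FIFO for reading blocks until a writer connects,
            # which is why the reader runs in its own thread here.
            with open(path, "rb") as f:
                result["data"] = f.read()

        t = threading.Thread(target=reader)
        t.start()
        # An ffmpeg process pointed at `path` would take the writer role here.
        with open(path, "wb") as f:
            f.write(payload)
        t.join()
        os.unlink(path)
        return result["data"]
    ```

    The same rendezvous applies on Windows: the consumer creates and listens on the pipe first, then ffmpeg connects to it as a writer.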
  • How can I pipe JPEG files into FFMPEG and create an RTSP, H.264 stream?

    25 January 2023, by Michael Schmidt

    I have an input RTSP stream that I would like to manipulate on a frame-by-frame basis using OpenCV. After these changes are applied, I'd like to create a separate RTSP stream from those frames. I'm piping the resulting JPEG images to FFMPEG via STDIN. I need the intermediate frame to be a JPEG.

    In other words, I must conform to this pattern:
RTSP IN -> Create JPEG as input -> manipulation, JPEG out -> RTSP

    The PROBLEM I'm trying to solve deals with a codec at this point. See the last few lines of FFMPEG's error output.

    Here is what I have:

    import subprocess

    import cv2


    def open_ffmpeg_stream_process():
        args = (
            "ffmpeg -re -stream_loop -1 "
            "-f jpeg_pipe "
            "-s 512x288 "
            "-i pipe:0 "
            "-c:v h264 "
            "-f rtsp "
            "rtsp://localhost:8100/out0"
        ).split()
        return subprocess.Popen(args, stdin=subprocess.PIPE)

    video_source = cv2.VideoCapture('rtsp://localhost:9100/in0')

    frame_cnt = 0
    FRAME_SKIP = 30

    ffmpeg_process = open_ffmpeg_stream_process()
    while video_source.isOpened():
        # Read first, so that skipped frames are actually consumed and dropped.
        ok, frame = video_source.read()
        if not ok:
            break

        frame_cnt += 1
        if frame_cnt % FRAME_SKIP:
            continue
        frame_cnt = 0

        # imencode already returns a uint8 array of JPEG bytes.
        _, jpg = cv2.imencode('.jpg', frame)

        # Work on the JPEG occurs here, and the output will be a JPEG

        ffmpeg_process.stdin.write(jpg.tobytes())

        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    video_source.release()

    


    Here is FFMPEG's output:

    


    ffmpeg version 4.4.2-0ubuntu0.22.04.1 Copyright (c) 2000-2021 the FFmpeg developers
  built with gcc 11 (Ubuntu 11.2.0-19ubuntu1)
  configuration: ---- noisy configuration stuff ----
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
Input #0, jpeg_pipe, from 'pipe:0':
  Duration: N/A, bitrate: N/A
  Stream #0:0: Video: mjpeg, rgb24(bt470bg/unknown/unknown), 512x288, 25 fps, 25 tbr, 25 tbn, 25 tbc
Stream mapping:
  Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
[libx264 @ 0x562a1ffc5840] using SAR=1/1
[libx264 @ 0x562a1ffc5840] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x562a1ffc5840] profile High, level 3.1, 4:2:0, 8-bit
[libx264 @ 0x562a1ffc5840] 264 - core 163 r3060 5db6aa6 - H.264/MPEG-4 AVC codec - Copyleft 2003-2021 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=18 lookahead_threads=3 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
[tcp @ 0x562a20826c00] Connection to tcp://localhost:8100?timeout=0 failed: Connection refused
Could not write header for output file #0 (incorrect codec parameters ?): Connection refused
Error initializing output stream 0:0 -- 
Conversion failed!


    


    How does one create an H.264 RTSP stream from a series of JPEG frames using FFMPEG?

    


    Notes: The FFMPEG command/subprocess might need unrelated improvements; feel free to comment on my crappy code.

    


    Edit: Oof, I just found out the FFMPEG command doesn't even work standalone.

    


  • avformat/file: add fd option for pipe

    14 December 2022, by Zhao Zhili
    avformat/file: add fd option for pipe

    Signed-off-by: Zhao Zhili <zhilizhao@tencent.com>

    • [DH] doc/protocols.texi
    • [DH] libavformat/file.c
    • [DH] libavformat/version.h