
Other articles (13)

  • Adding notes and captions to images

    7 February 2011

    To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, editing and deleting notes. By default, only site administrators can add notes to images.
    Editing when adding a media item
    When adding a media item of type "image", a new button appears above the preview (...)

  • Accepted formats

    28 January 2010

    The following commands give information about the formats and codecs handled by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)

  • Videos

    21 April 2011

    Like documents of type "audio", MediaSPIP displays videos whenever possible using the HTML5 <video> tag.
    One drawback of this tag is that it is not recognised correctly by some browsers (Internet Explorer, to name one) and that each browser natively handles only certain video formats.
    Its main advantage is that videos are supported natively by browsers, which removes the need for Flash and (...)

On other sites (4049)

  • FFMPEG filtergraph 'warning, too many B-frames in a row' [on hold]

    26 June 2017, by Leif Andersen

    I am trying to concatenate two videos with FFmpeg’s filtergraph. One video is 1920x1080 at 30fps, and the other is 1280x720 at 25fps. Both use yuv420p and have the same pixel density. I am currently ignoring the audio tracks. The following is my filtergraph:

    [video2]fifo[video3];
    [video3]pad=width=1920:height=1080[video9];
    [video9]fps=fps=25[video11];
    [video11]setpts=expr=PTS-STARTPTS[video17];

    [video6]fifo[video7];
    [video7]pad=width=1920:height=1080[video13];
    [video13]fps=fps=25[video15];
    [video15]setpts=expr=PTS-STARTPTS[video19];

    [video17][video19]concat=v=1:a=0:n=2[video21];
    [video21]pad=width=1920:height=1080[video23];
    [video23]fps=fps=25[video25];
    [video25]format=pix_fmts=yuv420p[video27]

    The first chain tries to convert the first video into a common format whose timestamps start at 0 for the concat filter. The second chain does the same for the second video. Finally, the third chain concatenates the two videos and sets some properties for the resulting playlist.

    Unfortunately, when I run this, ffmpeg repeatedly outputs:

    [mpeg4 @ 0x7fc16a810600] warning, too many B-frames in a row

    When it finishes, I see the first video, padded to the correct resolution and frame rate, but instead of the second video I see black. Additionally, the entire output is several days long, starting with the first video and ending in several days of just black.

    I cannot figure out why I am getting this error, as it seems like I am setting the videos to have identical properties. What am I missing?

    Also, for what it’s worth, I am using FFmpeg’s C API rather than the command-line tool. I am using libavformat/libavcodec/libavutil to do the encoding/decoding and libavfilter for the filtergraph.
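    For context, since the graph is driven through the C API, here is a minimal, hedged sketch of how a textual filtergraph like the one above can be parsed and configured with libavfilter. The single-input/single-output wiring, the placeholder source parameters, and the abbreviated error handling are illustrative assumptions (the poster’s graph actually has two inputs), not the original code:

    extern "C" {
    #include <libavfilter/avfilter.h>
    #include <libavfilter/buffersrc.h>
    #include <libavfilter/buffersink.h>
    #include <libavutil/mem.h>
    }

    // Sketch: one "buffer" source feeding a textual filter description, one "buffersink".
    // A graph with two inputs would need a second buffer source and a second entry in
    // the 'outputs' list. Most error checks are omitted for brevity.
    static AVFilterGraph *build_graph(const char *description,
                                      AVFilterContext **src, AVFilterContext **sink)
    {
        AVFilterGraph *graph = avfilter_graph_alloc();
        if (!graph)
            return nullptr;

        // Placeholder source parameters; a real program copies these from the decoder.
        // pix_fmt=0 is AV_PIX_FMT_YUV420P.
        const char *src_args = "video_size=1920x1080:pix_fmt=0:time_base=1/30:pixel_aspect=1/1";
        avfilter_graph_create_filter(src,  avfilter_get_by_name("buffer"),
                                     "in",  src_args, nullptr, graph);
        avfilter_graph_create_filter(sink, avfilter_get_by_name("buffersink"),
                                     "out", nullptr,  nullptr, graph);

        // 'outputs' names the open output pad of our source (it feeds the description);
        // 'inputs' names the open input pad of our sink (it consumes the description's result).
        AVFilterInOut *outputs = avfilter_inout_alloc();
        AVFilterInOut *inputs  = avfilter_inout_alloc();
        outputs->name = av_strdup("in");  outputs->filter_ctx = *src;  outputs->pad_idx = 0; outputs->next = nullptr;
        inputs->name  = av_strdup("out"); inputs->filter_ctx  = *sink; inputs->pad_idx = 0; inputs->next  = nullptr;

        int ret = avfilter_graph_parse_ptr(graph, description, &inputs, &outputs, nullptr);
        if (ret >= 0)
            ret = avfilter_graph_config(graph, nullptr);
        avfilter_inout_free(&inputs);
        avfilter_inout_free(&outputs);

        if (ret < 0) {
            avfilter_graph_free(&graph);
            return nullptr;
        }
        return graph;
    }

    Decoded frames would then be pushed into the source with av_buffersrc_add_frame() and pulled from the sink with av_buffersink_get_frame(), as in the libavfilter documentation examples.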

  • Webcam streaming from Mac using FFmpeg

    13 December 2019, by Galaxy

    I want to stream my webcam from Mac using FFmpeg.

    First I checked the supported devices using ffmpeg -f avfoundation -list_devices true -i ""

    Output:

    [AVFoundation input device @ 0x7fdf1bd03000] AVFoundation video devices:
    [AVFoundation input device @ 0x7fdf1bd03000] [0] USB 2.0 Camera #2
    [AVFoundation input device @ 0x7fdf1bd03000] [1] FaceTime HD Camera
    [AVFoundation input device @ 0x7fdf1bd03000] [2] Capture screen 0
    [AVFoundation input device @ 0x7fdf1bd03000] [3] Capture screen 1
    [AVFoundation input device @ 0x7fdf1bd03000] AVFoundation audio devices:
    [AVFoundation input device @ 0x7fdf1bd03000] [0] Built-in Microphone

    Device [0] is the webcam I want to use.

    Then I tried to capture the webcam using ffmpeg -f avfoundation -i "0" out.mpg

    Output:

    [avfoundation @ 0x7fe7f3810600] Selected framerate (29.970030) is not supported by the device
    [avfoundation @ 0x7fe7f3810600] Supported modes:
    [avfoundation @ 0x7fe7f3810600]   320x240@[120.101366 120.101366]fps
    [avfoundation @ 0x7fe7f3810600]   640x480@[120.101366 120.101366]fps
    [avfoundation @ 0x7fe7f3810600]   800x600@[60.000240 60.000240]fps
    [avfoundation @ 0x7fe7f3810600]   1024x768@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   1280x720@[60.000240 60.000240]fps
    [avfoundation @ 0x7fe7f3810600]   1280x1024@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   1920x1080@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   320x240@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   640x480@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   800x600@[20.000000 20.000000]fps
    [avfoundation @ 0x7fe7f3810600]   1024x768@[6.000002 6.000002]fps
    0: Input/output error

    After that, I tried to stream this webcam from my Mac using ffmpeg -f avfoundation -framerate 30 -i "0" -f mpeg1video -b 200k -r 30 -vf scale=1920:1080 http://127.0.0.1:8082/

    Output:

    [avfoundation @ 0x7f8515012800] An error occurred: The activeVideoMinFrameDuration passed is not supported by the device.  Use -activeFormat.videoSupportedFrameRateRanges to discover valid ranges.0: Input/output error

    I cannot capture or stream this webcam. However, when I used the FaceTime camera instead of this webcam, everything was OK. I’ve been searching for a solution for a few days but still cannot fix it. Does anyone have experience with webcams and FFmpeg on Mac?

  • Android OpenCV - How can I read video files using JNI?

    20 June 2017, by Tiga

    I am developing an application using Android Opencv.

    The app offers two operations:

    • A frame read from the camera is passed to JNI via the native object address
      from Mat.getNativeObjAddr(), and the new image is returned through
      JavaCameraView’s onCameraFrame() function.
    • A video clip in storage is read, each frame is processed the same way
      as in #1, and the resulting image is returned via the onCameraFrame()
      function.

    The first operation is implemented simply, as follows, and works normally:

       @Override
       public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame)
       {
           if (inputFrame != null) {
               // Native call: processes the RGBA frame and writes the result into boardImage
               Detect(inputFrame.rgba().getNativeObjAddr(), boardImage.getNativeObjAddr());
           }
           return boardImage;
       }

    However, the problem occurred in the second operation.
    As far as I know, the files in Java storage are not readable from JNI.

    I have already tried FFmpegMediaPlayer and MediaMetadataRetriever, found through a Google search. However, the getFrameAtTime() function provided by MediaMetadataRetriever took an average of 170 ms to grab a bitmap for a specific frame of a 1920x1080 video. What I have to develop must show the video results in real time at 30 fps. In #1, the native function Detect() takes about 2 ms to process one frame.

    For these reasons, I want to do the following.

    Java sends a video’s path (e.g. /storage/emulated/0/download/video.mp4) to JNI, native functions process the video one frame at a time, and the resulting image is displayed in onCameraFrame().
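    For what it might be worth, here is a minimal, hedged sketch of that idea on the native side using cv::VideoCapture. The class name NativeVideo, the JNI method names, and the assumption that the OpenCV build includes a video I/O backend (e.g. FFmpeg) able to open the file are all illustrative, not from the original post:

    #include <jni.h>
    #include <opencv2/opencv.hpp>

    // Hypothetical native side: Java opens the video once by path, then repeatedly
    // asks for the next frame, passing the address of a Mat obtained with
    // getNativeObjAddr(); that Mat is what onCameraFrame() would return.
    static cv::VideoCapture g_capture;

    extern "C" JNIEXPORT jboolean JNICALL
    Java_com_example_NativeVideo_open(JNIEnv *env, jobject, jstring jpath)
    {
        const char *path = env->GetStringUTFChars(jpath, nullptr);
        bool ok = g_capture.open(path);   // needs a capable video I/O backend in the OpenCV build
        env->ReleaseStringUTFChars(jpath, path);
        return static_cast<jboolean>(ok);
    }

    extern "C" JNIEXPORT jboolean JNICALL
    Java_com_example_NativeVideo_nextFrame(JNIEnv *, jobject, jlong matAddr)
    {
        cv::Mat &out = *reinterpret_cast<cv::Mat *>(matAddr);  // Mat allocated on the Java side
        if (!g_capture.read(out))        // false at end of stream or on error
            return JNI_FALSE;
        // ...apply the same per-frame processing as Detect() to 'out' here...
        return JNI_TRUE;
    }

    On the Java side this could be driven from a background thread or timer at the desired frame rate, reusing the existing boardImage Mat.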

    Is there a proper way to do this? I look forward to your reply. Thank you!