Advanced search

Media (0)


No media matching your criteria is available on this site.

Other articles (90)

  • Customizing the site by adding a logo, a banner or a background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Mediabox: opening images in the largest space available to the user

    8 February 2011

    Image display is restricted by the width allowed by the site's design (which depends on the theme in use), so images are shown at a reduced size. To take advantage of all the space available on the user's screen, a feature can be added that displays the image in a multimedia box appearing above the rest of the content.
    To do this, the "Mediabox" plugin must be installed.
    Configuring the multimedia box
    As soon as (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

On other sites (15462)

  • Flush encoded packets to disk when muxing audio and video

    23 June 2016, by chandu

    I am using the muxing.c example (without any modifications) provided with FFmpeg 3.0 to create an MP4 file (H.264 & AAC) with VS 2013. The sample works fine with the default width and height for video. But when I change the width to 1920 and the height to 1080, the program uses nearly 400 MB (according to Task Manager, in Release mode) throughout its run and never writes the encoded packets to the output file. It writes to the output file (out.mp4) only when avcodec_close() is called at the end.

    I have tried to

    1. free the encoded packet after calling write_frame(),
    2. use avio_flush(),
    3. use avcodec_flush_buffers(),

    but without success.

    Could anybody please tell me how I can save every encoded packet to disk instead of keeping it in RAM, so that memory usage stays low?

    Note: there is no issue with flushing the buffers after recording is over; I do that by calling av_interleaved_write_frame() with a NULL AVPacket.
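
    A minimal sketch of one way to push each packet to the file as soon as it is encoded: write_packet_now is a hypothetical helper (the fmt_ctx, time_base, st and pkt parameters mirror muxing.c's write_frame()), and the trade-off is that the caller must feed audio and video packets in interleaved order itself, because av_write_frame() bypasses the interleaving buffer that av_interleaved_write_frame() may use to hold packets in memory.

    #include <libavformat/avformat.h>

    /* write one encoded packet straight to the output file and release it */
    static int write_packet_now(AVFormatContext *fmt_ctx, const AVRational *time_base,
                                AVStream *st, AVPacket *pkt)
    {
        int ret;

        /* rescale timestamps from the codec time base to the stream time base */
        av_packet_rescale_ts(pkt, *time_base, st->time_base);
        pkt->stream_index = st->index;

        /* av_write_frame() neither buffers for interleaving nor takes
           ownership of pkt, so the packet can be released right away */
        ret = av_write_frame(fmt_ctx, pkt);
        if (ret < 0)
            return ret;

        avio_flush(fmt_ctx->pb);   /* push the AVIO buffer out to the file */
        av_packet_unref(pkt);      /* drop the packet's data reference */
        return 0;
    }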

  • Webcam streaming from Mac using FFmpeg

    22 July 2016, by Galaxy

    I want to stream my webcam from Mac using FFmpeg.

    First I checked the supported devices using ffmpeg -f avfoundation -list_devices true -i ""

    Output:

    [AVFoundation input device @ 0x7fdf1bd03000] AVFoundation video devices:
    [AVFoundation input device @ 0x7fdf1bd03000] [0] USB 2.0 Camera #2
    [AVFoundation input device @ 0x7fdf1bd03000] [1] FaceTime HD Camera
    [AVFoundation input device @ 0x7fdf1bd03000] [2] Capture screen 0
    [AVFoundation input device @ 0x7fdf1bd03000] [3] Capture screen 1
    [AVFoundation input device @ 0x7fdf1bd03000] AVFoundation audio devices:
    [AVFoundation input device @ 0x7fdf1bd03000] [0] Built-in Microphone

    Device [0] is the webcam I want to use.


    Then I tried to capture the webcam using ffmpeg -f avfoundation -i "0" out.mpg

    Output:

    [avfoundation @ 0x7fe7f3810600] Selected framerate (29.970030) is not supported by the device
    [avfoundation @ 0x7fe7f3810600] Supported modes:
    [avfoundation @ 0x7fe7f3810600]   320x240@[120.101366 120.101366]fps
    [avfoundation @ 0x7fe7f3810600]   640x480@[120.101366 120.101366]fps
    [avfoundation @ 0x7fe7f3810600]   800x600@[60.000240 60.000240]fps
    [avfoundation @ 0x7fe7f3810600]   1024x768@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   1280x720@[60.000240 60.000240]fps
    [avfoundation @ 0x7fe7f3810600]   1280x1024@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   1920x1080@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   320x240@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   640x480@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   800x600@[20.000000 20.000000]fps
    [avfoundation @ 0x7fe7f3810600]   1024x768@[6.000002 6.000002]fps
    0: Input/output error

    After that, I tried to stream this webcam from my Mac using ffmpeg -f avfoundation -framerate 30 -i "0" -f mpeg1video -b 200k -r 30 -vf scale=1920:1080 http://127.0.0.1:8082/

    Output:

    [avfoundation @ 0x7f8515012800] An error occurred: The activeVideoMinFrameDuration passed is not supported by the device.  Use -activeFormat.videoSupportedFrameRateRanges to discover valid ranges.0: Input/output error

    I cannot capture or stream from this webcam. However, when I use the FaceTime camera instead, everything works. I have been searching for a solution for a few days but still cannot fix it. Does anyone have experience with webcams and FFmpeg on Mac?
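
    One untested possibility, going only by the mode list printed above (an assumption, not a confirmed fix): ask avfoundation for a resolution and frame rate the camera actually advertises, for example ffmpeg -f avfoundation -framerate 30.000030 -video_size 1920x1080 -i "0" out.mpg, since the input device appears to reject frame rates that do not exactly match a supported mode for the chosen resolution.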

  • JavaCV FFmpegFrameRecorder Video output reddish color

    14 June 2016, by Diego Perozo

    I am trying to make an .mp4 video file out of a group of images using FFmpegFrameRecorder as part of a bigger program, so I set up a test project in which I try to make a video out of 100 instances of the same frame at 25 fps. The program seems to work; however, every time I run it the image comes out reddish, as if a red filter had been applied to it.

    Here's the code snippet:

    public static void main(String[] args) {
        File file = new File("C:/Users/Diego/Desktop/tc-images/image0.jpg");
        BufferedImage img = null;
        try {
            img = ImageIO.read(file);
        } catch (IOException e1) {
            e1.printStackTrace();
        }
        IplImage image = IplImage.createFrom(img);
        FFmpegFrameRecorder recorder =
                new FFmpegFrameRecorder("C:/Users/Diego/Desktop/tc-images/test.mp4", 1920, 1080);
        try {
            recorder.setVideoCodec(13);  // 13 = AV_CODEC_ID_MPEG4
            recorder.setFormat("mp4");
            recorder.setPixelFormat(0);  // 0 = AV_PIX_FMT_YUV420P
            recorder.setFrameRate(25);
            recorder.start();
            for (int i = 0; i < 100; i++) {
                recorder.record(image);
            }
            recorder.stop();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    I'd appreciate it if anybody could tell me what's wrong. Thanks in advance for any help.
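
    A reddish or bluish cast in FFmpeg-based recorders often comes from swapped red and blue channels, i.e. a buffer whose bytes are in BGR order being treated as RGB (or vice versa) when the frame is converted to the pixel format set with setPixelFormat(). As a rough illustration only (a generic libswscale sketch, not JavaCV's internals; convert_to_yuv420p is a made-up helper name), the conversion in question looks roughly like this, and declaring the wrong src_fmt is exactly what swaps the channels:

    #include <stdint.h>
    #include <libavutil/pixfmt.h>
    #include <libswscale/swscale.h>

    /* Convert one packed 24-bit frame to YUV420P. src_fmt must match the
     * real byte order of src_data: AV_PIX_FMT_BGR24 for BGR buffers
     * (e.g. OpenCV image data), AV_PIX_FMT_RGB24 for RGB buffers. */
    static int convert_to_yuv420p(const uint8_t *src_data, int w, int h,
                                  enum AVPixelFormat src_fmt,
                                  uint8_t *dst_data[4], int dst_linesize[4])
    {
        struct SwsContext *sws = sws_getContext(w, h, src_fmt,
                                                w, h, AV_PIX_FMT_YUV420P,
                                                SWS_BILINEAR, NULL, NULL, NULL);
        const uint8_t *src_slice[4] = { src_data, NULL, NULL, NULL };
        int src_stride[4]           = { 3 * w, 0, 0, 0 };  /* packed rows */
        int ret;

        if (!sws)
            return -1;
        ret = sws_scale(sws, src_slice, src_stride, 0, h, dst_data, dst_linesize);
        sws_freeContext(sws);
        return ret;
    }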