Other articles (64)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Participating in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You simply need to subscribe to the translators' mailing list to ask for more information.
    At the moment, MediaSPIP is only available in French and (...)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

On other sites (11541)

  • flush encoded packets to disk when muxing audio and video

    23 June 2016, by chandu

    I am using the muxing.c example (without any modifications) provided with ffmpeg 3.0 to create an MP4 file (H.264 & AAC) with VS 2013. The sample works fine with the default width & height for the video, but when I change the width to 1920 and the height to 1080, the program uses nearly 400 MB (according to Task Manager, in Release mode) throughout its run and never writes the encoded packets to the output file. It only writes to the output file (out.mp4) when avcodec_close() is called at the end.

    I have tried to:

    1. free the encoded packet after calling write_frame(),
    2. call avio_flush(),
    3. call avcodec_flush_buffers(),

    but without success.

    Could anybody please tell me how I can write every encoded packet to disk instead of keeping it in RAM, so that memory usage stays low?

    Note: flushing the buffers after recording is over is not the issue; I already do that by calling av_interleaved_write_frame() with a NULL AVPacket.
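
    One direction that is often suggested for this kind of symptom, as a sketch rather than a verified fix for this exact sample: ask the mov/mp4 muxer for fragmented output, so that finished fragments are written out while encoding instead of being held back until the trailer, and flush the AVIOContext after each packet. The "movflags" values below are standard libavformat mov muxer options; oc, pkt and ret stand for the output format context, the packet being written and a return code, and whether this is actually what keeps the 400 MB in memory in this build is an assumption.

    AVDictionary *opts = NULL;

    /* Hypothetical tweak: fragmented MP4, new fragment at every keyframe,
       empty moov written up front instead of a full index at the end. */
    av_dict_set(&opts, "movflags", "frag_keyframe+empty_moov", 0);

    ret = avformat_write_header(oc, &opts);
    av_dict_free(&opts);
    if (ret < 0) {
        fprintf(stderr, "Error opening output file: %s\n", av_err2str(ret));
        return 1;
    }

    /* ... encode and mux exactly as in muxing.c ... */

    ret = av_interleaved_write_frame(oc, &pkt);
    if (ret == 0)
        avio_flush(oc->pb);   /* push buffered bytes out to the file */

    If memory still grows, it is also worth checking that every frame and packet is unreferenced after use and that audio and video timestamps stay roughly interleaved, since av_interleaved_write_frame() buffers packets internally until it can interleave them.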

  • Webcam streaming from Mac using FFmpeg

    22 July 2016, by Galaxy

    I want to stream my webcam from Mac using FFmpeg.

    First I checked the supported devices using ffmpeg -f avfoundation -list_devices true -i ""

    Output:

    [AVFoundation input device @ 0x7fdf1bd03000] AVFoundation video devices:
    [AVFoundation input device @ 0x7fdf1bd03000] [0] USB 2.0 Camera #2
    [AVFoundation input device @ 0x7fdf1bd03000] [1] FaceTime HD Camera
    [AVFoundation input device @ 0x7fdf1bd03000] [2] Capture screen 0
    [AVFoundation input device @ 0x7fdf1bd03000] [3] Capture screen 1
    [AVFoundation input device @ 0x7fdf1bd03000] AVFoundation audio devices:
    [AVFoundation input device @ 0x7fdf1bd03000] [0] Built-in Microphone

    Device [0] is the webcam I want to use.


    Then I tried to capture the webcam using ffmpeg -f avfoundation -i "0" out.mpg

    Output:

    [avfoundation @ 0x7fe7f3810600] Selected framerate (29.970030) is not supported by the device
    [avfoundation @ 0x7fe7f3810600] Supported modes:
    [avfoundation @ 0x7fe7f3810600]   320x240@[120.101366 120.101366]fps
    [avfoundation @ 0x7fe7f3810600]   640x480@[120.101366 120.101366]fps
    [avfoundation @ 0x7fe7f3810600]   800x600@[60.000240 60.000240]fps
    [avfoundation @ 0x7fe7f3810600]   1024x768@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   1280x720@[60.000240 60.000240]fps
    [avfoundation @ 0x7fe7f3810600]   1280x1024@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   1920x1080@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   320x240@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   640x480@[30.000030 30.000030]fps
    [avfoundation @ 0x7fe7f3810600]   800x600@[20.000000 20.000000]fps
    [avfoundation @ 0x7fe7f3810600]   1024x768@[6.000002 6.000002]fps
    0: Input/output error

    After that, I tried to stream this webcam from my Mac using ffmpeg -f avfoundation -framerate 30 -i "0" -f mpeg1video -b 200k -r 30 -vf scale=1920:1080 http://127.0.0.1:8082/

    Output:

    [avfoundation @ 0x7f8515012800] An error occurred: The activeVideoMinFrameDuration passed is not supported by the device.  Use -activeFormat.videoSupportedFrameRateRanges to discover valid ranges.0: Input/output error

    I cannot capture or stream from this webcam. However, when I used the FaceTime camera instead, everything was OK. I have been searching for a solution for a few days but still cannot fix it. Does anyone have experience with webcams and FFmpeg on a Mac?
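
    A hedged guess based on the supported-modes list printed above: the avfoundation input appears to require an exact match with one of the advertised frame rates, and neither 30 nor 29.97 fps appears in that list, so passing the literal rate from the listing (optionally together with a matching size) may get past the error, for example ffmpeg -f avfoundation -framerate 30.000030 -video_size 1920x1080 -i "0" out.mpg. The framerate and video_size values here are simply copied from the device listing above and are an assumption, not a verified fix.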

  • JavaCV FFmpegFrameRecorder Video output reddish color

    14 June 2016, by Diego Perozo

    I am trying to make an .mp4 video file out of a group of images using FFmpegFrameRecorder, as part of a bigger program, so I set up a test project in which I try to make a video out of 100 instances of the same frame at 25 fps. The program seems to work; however, every time I run it the output looks reddish, as if a red filter had been applied to it.

    Here's the code snippet:

    public static void main(String[] args) {
        // Load the source image that will be used for every frame.
        File file = new File("C:/Users/Diego/Desktop/tc-images/image0.jpg");
        BufferedImage img = null;
        try {
            img = ImageIO.read(file);
        } catch (IOException e1) {
            e1.printStackTrace();
        }

        // Wrap the BufferedImage in an IplImage for the recorder.
        IplImage image = IplImage.createFrom(img);

        // 1920x1080 MP4 output.
        FFmpegFrameRecorder recorder =
                new FFmpegFrameRecorder("C:/Users/Diego/Desktop/tc-images/test.mp4", 1920, 1080);
        try {
            recorder.setVideoCodec(13);  // 13 = AV_CODEC_ID_MPEG4
            recorder.setFormat("mp4");
            recorder.setPixelFormat(0);  // 0 = AV_PIX_FMT_YUV420P
            recorder.setFrameRate(25);
            recorder.start();

            // Record the same frame 100 times (4 seconds at 25 fps).
            for (int i = 0; i < 100; i++) {
                recorder.record(image);
            }
            recorder.stop();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    I’d appreciate it if anybody told me what’s wrong. Thanks in advance for any help.