Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to Install FFmpeg on Databricks [closed]

    20 May, by Riley Cohen

    I’m posting this to save others from the headache I went through getting ffmpeg running on Databricks. The process isn’t super intuitive, especially with Unity Catalog and volume-based restrictions, so here’s a clear step-by-step guide.

    Step #1: In a notebook, add an init script for ffmpeg

    dbutils.fs.put(
      "dbfs:/Volumes/xxxxxxx/default/init/install_ffmpeg.sh",
      """#!/bin/bash
    apt-get update -y
    apt-get install -y ffmpeg
    """,
      overwrite=True
    )
    

    Step #2: Add the init script to the allowlist

    Follow this article: https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/privileges#manage-allowlist


    Step #3: Add the init script in the cluster's advanced settings

    After creating this script, go to your cluster settings in the Databricks UI (Clusters > Edit > Advanced Options > Init Scripts) and add the script path (dbfs:/Volumes/xxxxxxx/default/init/install_ffmpeg.sh). Restart the cluster to apply it. Once the cluster starts with this init script, FFmpeg will be installed and available on each node.

    Step #4: Start/restart the cluster
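
    Once the cluster is back up, a quick sanity check confirms the binary really is on the node. A minimal sketch, run from a notebook cell (it assumes the init script put ffmpeg on the default PATH):

    import subprocess

    # Print the ffmpeg banner from the driver node; raises if the binary is missing
    result = subprocess.run(["ffmpeg", "-version"], capture_output=True, text=True, check=True)
    print(result.stdout.splitlines()[0])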

  • Encoding/Decoding H264 using libav in C++ [closed]

    20 May, by gbock93

    I want to build an application to

    • capture frames in YUYV 4:2:2 format
    • encode them to H264
    • send over network
    • decode the received data
    • display the video stream

    To do so, I wrote two classes: H264Encoder and H264Decoder.

    I am posting only the .cpp contents; the .h files are trivial:

    H264Encoder.cpp

    // (the angle-bracket include targets were stripped from the post; these are the headers this file needs)
    #include "H264Encoder.h" // assumed header name
    
    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavutil/opt.h>
    #include <libavutil/pixdesc.h>
    #include <libswscale/swscale.h>
    }
    
    #include <opencv2/opencv.hpp>
    #include <iostream>
    #include <stdexcept>
    #include <vector>
    #include <cassert>
    
    H264Encoder::H264Encoder(unsigned int width_, unsigned int height_, unsigned int fps_):
        m_width(width_),
        m_height(height_),
        m_fps(fps_),
        m_frame_index(0),
        m_context(nullptr),
        m_frame(nullptr),
        m_packet(nullptr),
        m_sws_ctx(nullptr)
    {
        // Find the video codec
        AVCodec* codec;
        codec = avcodec_find_encoder(AV_CODEC_ID_H264);
        if (!codec)
            throw std::runtime_error("[Encoder]: Error: Codec not found");
    
        // Allocate codec
        m_context = avcodec_alloc_context3(codec);
        if (!m_context)
            throw std::runtime_error("[Encoder]: Error: Could not allocate codec context");
    
        // Configure codec
        av_opt_set(m_context->priv_data, "preset", "ultrafast", 0);
        av_opt_set(m_context->priv_data, "tune", "zerolatency", 0);
        av_opt_set(m_context->priv_data, "crf",           "35", 0); // Range: [0; 51], sane range: [18; 26], lower -> higher compression
    
        m_context->width        = (int)width_;
        m_context->height       = (int)height_;
        m_context->time_base    = {1, (int)fps_};
        m_context->framerate    = {(int)fps_, 1};
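        // (time_base is the unit in which frame pts values are expressed; with
        // time_base = 1/fps, a pts increment of 1 corresponds to one frame)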
        m_context->codec_id     = AV_CODEC_ID_H264;
        m_context->pix_fmt      = AV_PIX_FMT_YUV420P; // 4:2:0 planar input, the most widely supported format for H.264/H.265 encoders
        m_context->bit_rate     = 400000;
        m_context->gop_size     = 10;
        m_context->max_b_frames = 1;
    
        // Open codec
        if (avcodec_open2(m_context, codec, nullptr) < 0)
            throw std::runtime_error("[Encoder]: Error: Could not open codec");
    
        // Allocate frame and its buffer
        m_frame = av_frame_alloc();
        if (!m_frame) 
            throw std::runtime_error("[Encoder]: Error: Could not allocate frame");
    
        m_frame->format = m_context->pix_fmt;
        m_frame->width  = m_context->width;
        m_frame->height = m_context->height;
    
        if (av_frame_get_buffer(m_frame, 0) < 0)
            throw std::runtime_error("[Encoder]: Error: Cannot allocate frame buffer");
        
        // Allocate packet
        m_packet = av_packet_alloc();
        if (!m_packet) 
            throw std::runtime_error("[Encoder]: Error: Could not allocate packet");
    
        // Convert from YUYV422 to YUV420P
        m_sws_ctx = sws_getContext(
            width_, height_, AV_PIX_FMT_YUYV422,
            width_, height_, AV_PIX_FMT_YUV420P,
            SWS_BILINEAR, nullptr, nullptr, nullptr
        );
        if (!m_sws_ctx) 
            throw std::runtime_error("[Encoder]: Error: Could not allocate sws context");
    
        //
        printf("[Encoder]: H264Encoder ready.\n");
    }
    
    H264Encoder::~H264Encoder()
    {
        sws_freeContext(m_sws_ctx);
        av_packet_free(&m_packet);
        av_frame_free(&m_frame);
        avcodec_free_context(&m_context);
    
        printf("[Encoder]: H264Encoder destroyed.\n");
    }
    
    std::vector<uint8_t> H264Encoder::encode(const cv::Mat& img_)
    {
        /*
        - YUYV422 is a packed format. It has 3 components (av_pix_fmt_desc_get((AVPixelFormat)AV_PIX_FMT_YUYV422)->nb_components == 3) but
            data is stored in a single plane (av_pix_fmt_count_planes((AVPixelFormat)AV_PIX_FMT_YUYV422) == 1).
        - YUV420P is a planar format. It has 3 components (av_pix_fmt_desc_get((AVPixelFormat)AV_PIX_FMT_YUV420P)->nb_components == 3) and
            each component is stored in a separate plane (av_pix_fmt_count_planes((AVPixelFormat)AV_PIX_FMT_YUV420P) == 3) with its
            own stride.
        */
        std::cout << "[Encoder]" << std::endl;
        std::cout << "[Encoder]: Encoding img " << img_.cols << "x" << img_.rows << " | element size " << img_.elemSize() << std::endl;
        assert(img_.elemSize() == 2);
    
        uint8_t* input_data[1] = {(uint8_t*)img_.data};
        int input_linesize[1] = {2 * (int)m_width};
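        // YUYV422 is packed: 2 bytes per pixel in a single plane, hence a single
        // data pointer and a linesize of 2 * width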
        
        if (av_frame_make_writable(m_frame) < 0)
            throw std::runtime_error("[Encoder]: Error: Cannot make frame data writable");
    
        // Convert from YUV422 image to YUV420 frame. Apply scaling if necessary
        sws_scale(
            m_sws_ctx,
            input_data, input_linesize, 0, m_height,
            m_frame->data, m_frame->linesize
        );
        m_frame->pts = m_frame_index;
    
        int n_planes = av_pix_fmt_count_planes((AVPixelFormat)m_frame->format);
        std::cout << "[Encoder]: Sending Frame " << m_frame_index << " with dimensions " << m_frame->width << "x" << m_frame->height << "x" << n_planes << std::endl;
        for (int i=0; i<n_planes; i++)
            std::cout << "[Encoder]: plane " << i << " linesize " << m_frame->linesize[i] << std::endl;
    
        // Send frame to codec
        int ret = avcodec_send_frame(m_context, m_frame);
    
        switch (ret)
        {
            case 0:
                std::cout << "[Encoder]: Sent frame to codec" << std::endl;
                // (reconstructed: the original pts bookkeeping referenced m_context->framerate and was lost to formatting)
                m_frame_index += 1;
                break;
            case AVERROR(EAGAIN):
                throw std::runtime_error("[Encoder]: avcodec_send_frame: EAGAIN");
            case AVERROR_EOF:
                throw std::runtime_error("[Encoder]: avcodec_send_frame: EOF");
            case AVERROR(EINVAL):
                throw std::runtime_error("[Encoder]: avcodec_send_frame: EINVAL");
            case AVERROR(ENOMEM):
                throw std::runtime_error("[Encoder]: avcodec_send_frame: ENOMEM");
            default:
                throw std::runtime_error("[Encoder]: avcodec_send_frame: UNKNOWN");
        }
    
        // Receive packet from codec
        std::vector<uint8_t> result;
        while(ret >= 0)
        {
            ret = avcodec_receive_packet(m_context, m_packet);
    
            switch (ret)
            {
            case 0:
                std::cout << "[Encoder]: Received packet from codec of size " << m_packet->size << " bytes " << std::endl;
                result.insert(result.end(), m_packet->data, m_packet->data + m_packet->size);
                av_packet_unref(m_packet);
                break;
    
            case AVERROR(EAGAIN):
                std::cout << "[Encoder]: avcodec_receive_packet: EAGAIN" << std::endl;
                break;
            case AVERROR_EOF:
                std::cout << "[Encoder]: avcodec_receive_packet: EOF" << std::endl;
                break;
            case AVERROR(EINVAL):
                throw std::runtime_error("[Encoder]: avcodec_receive_packet: EINVAL");
            default:
                throw std::runtime_error("[Encoder]: avcodec_receive_packet: UNKNOWN");
            }
        }
    
        std::cout << "[Encoder]: Encoding complete" << std::endl;
        return result;
    }
    

    H264Decoder.cpp

    // (the angle-bracket include targets were stripped from the post; these are the headers this file needs)
    #include "H264Decoder.h" // assumed header name
    
    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavutil/pixdesc.h>
    #include <libswscale/swscale.h>
    }
    
    #include <opencv2/opencv.hpp>
    #include <iostream>
    #include <stdexcept>
    
    H264Decoder::H264Decoder():
        m_context(nullptr),
        m_frame(nullptr),
        m_packet(nullptr)
    {
        // Find the video codec
        AVCodec* codec;
        codec = avcodec_find_decoder(AV_CODEC_ID_H264);
        if (!codec)
            throw std::runtime_error("[Decoder]: Error: Codec not found");
    
        // Allocate codec
        m_context = avcodec_alloc_context3(codec);
        if (!m_context)
            throw std::runtime_error("[Decoder]: Error: Could not allocate codec context");
    
        // Open codec
        if (avcodec_open2(m_context, codec, nullptr) < 0)
            throw std::runtime_error("[Decoder]: Error: Could not open codec");
    
        // Allocate frame
        m_frame = av_frame_alloc();
        if (!m_frame)
            throw std::runtime_error("[Decoder]: Error: Could not allocate frame");
    
        // Allocate packet
        m_packet = av_packet_alloc();
        if (!m_packet) 
            throw std::runtime_error("[Decoder]: Error: Could not allocate packet");
    
        //
        printf("[Decoder]: H264Decoder ready.\n");
    }
    
    H264Decoder::~H264Decoder()
    {
        av_packet_free(&m_packet);
        av_frame_free(&m_frame);
        avcodec_free_context(&m_context);
    
        printf("[Decoder]: H264Decoder destroyed.\n");
    }
    
    bool H264Decoder::decode(uint8_t* data_, size_t size_, cv::Mat& img_)
    {
        std::cout << "[Decoder]" << std::endl;
        std::cout << "[Decoder]: decoding " << size_ << " bytes of data" << std::endl;
    
        // Fill packet
        m_packet->data = data_;
        m_packet->size = size_;
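        // The packet points directly at the caller's buffer; no copy is made, so
        // data_ must remain valid until avcodec_send_packet() returns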
    
        if (size_ == 0)
            return false;
    
        // Send packet to codec
        int send_result = avcodec_send_packet(m_context, m_packet);
    
        switch (send_result)
        {
            case 0:
                std::cout << "[Decoder]: Sent packet to codec" << std::endl;
                break;
            case AVERROR(EAGAIN):
                throw std::runtime_error("[Decoder]: avcodec_send_packet: EAGAIN");
            case AVERROR_EOF:
                throw std::runtime_error("[Decoder]: avcodec_send_packet: EOF");
            case AVERROR(EINVAL):
                throw std::runtime_error("[Decoder]: avcodec_send_packet: EINVAL");
            case AVERROR(ENOMEM):
                throw std::runtime_error("[Decoder]: avcodec_send_packet: ENOMEM");
            default:
                throw std::runtime_error("[Decoder]: avcodec_send_packet: UNKNOWN");
        }
    
        // Receive frame from codec
        int n_planes;
        uint8_t* output_data[1];
        int output_line_size[1];
    
        int receive_result = avcodec_receive_frame(m_context, m_frame);
    
        switch (receive_result)
        {
            case 0:
                n_planes = av_pix_fmt_count_planes((AVPixelFormat)m_frame->format);
                std::cout << "[Decoder]: Received Frame with dimensions " << m_frame->width << "x" << m_frame->height << "x" << n_planes << std::endl;
                for (int i=0; i<n_planes; i++)
                    std::cout << "[Decoder]: plane " << i << " linesize " << m_frame->linesize[i] << std::endl;
    
                // Convert the decoded YUV420P frame to BGR for OpenCV.
                // NOTE: this block was garbled in the original post; it is reconstructed
                // from the question, which reports a sws_scale(...) call here.
                {
                    img_.create(m_frame->height, m_frame->width, CV_8UC3);
                    output_data[0] = img_.data;
                    output_line_size[0] = 3 * m_frame->width;
    
                    SwsContext* sws_ctx = sws_getContext(
                        m_frame->width, m_frame->height, (AVPixelFormat)m_frame->format,
                        m_frame->width, m_frame->height, AV_PIX_FMT_BGR24,
                        SWS_BILINEAR, nullptr, nullptr, nullptr
                    );
                    if (!sws_ctx)
                        throw std::runtime_error("[Decoder]: Error: Could not allocate sws context");
    
                    sws_scale(
                        sws_ctx,
                        m_frame->data, m_frame->linesize, 0, m_frame->height,
                        output_data, output_line_size
                    );
                    sws_freeContext(sws_ctx);
                }
                break;
    
            case AVERROR(EAGAIN):
                std::cout << "[Decoder]: avcodec_receive_frame: EAGAIN" << std::endl;
                return false;
            case AVERROR_EOF:
                std::cout << "[Decoder]: avcodec_receive_frame: EOF" << std::endl;
                return false;
            case AVERROR(EINVAL):
                throw std::runtime_error("[Decoder]: avcodec_receive_frame: EINVAL");
            default:
                throw std::runtime_error("[Decoder]: avcodec_receive_frame: UNKNOWN");
        }
        std::cout << "[Decoder]: Decoding complete" << std::endl;
        return true;
    }
    

    To test the two classes I put together a main.cpp to grab a frame, encode/decode and display the decoded frame (no network transmission in place):

    main.cpp

    while(...)
    {
        // get frame from custom camera class. Format is YUYV 4:2:2
        camera.getFrame(camera_frame);
        // Construct a cv::Mat to represent the grabbed frame
        cv::Mat camera_frame_yuyv = cv::Mat(camera_frame.height, camera_frame.width, CV_8UC2, camera_frame.data.data());
        // Encode image
        std::vector<uint8_t> encoded_data = encoder.encode(camera_frame_yuyv);
        if (!encoded_data.empty())
        {
            // Decode image
            cv::Mat decoded_frame;
            if (decoder.decode(encoded_data.data(), encoded_data.size(), decoded_frame))
            {
                // Display image
                cv::imshow("Camera", decoded_frame);
                cv::waitKey(1);
            }
        }
    }
    

    Compiling and executing the code, I get seemingly random results across subsequent executions:

    • Sometimes the whole loop runs without problems and I see the decoded image.
    • Sometimes the program crashes at the sws_scale(...) call in the decoder with "Assertion desc failed at src/libswscale/swscale_internal.h:757".
    • Sometimes the loop runs but I see a black image, and the message "Slice parameters 0, 720 are invalid" is displayed when executing the sws_scale(...) call in the decoder.

    Why is the behaviour so random? What am I doing wrong with the libav API?

    Some resources I found useful:

  • ffmpeg conversion from h264 for webserver

    20 May, by Xav

    I'm working on a system that records videos on a Raspberry Pi (picamera, h264 format), exports (rsync) 10 s video segments to a remote webserver, and tries to concatenate those segments into a single video file that can be played from the webserver.

    Simplified code to record on RPi:

    import os
    import datetime as dt
    import picamera
    
    camera = picamera.PiCamera()
    camera.resolution = (1920, 1080)
    camera.framerate = 25
    duration = 10 #sec
    
    [...]
    
    camera.start_recording(filename, format='h264')
    while True:
        filename = get_timestamped_filename()
        camera.split_recording(filename)
        camera.wait_recording(duration)
    
    camera.stop_recording()
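
    On the webserver side, the exported 10 s raw .h264 segments can be joined and remuxed into a single MP4 without re-encoding. A minimal sketch, with segment and output paths of my own choosing, using ffmpeg's concat demuxer (raw H.264 carries no timestamps, but its default 25 fps matches the capture settings above):

    import glob
    import subprocess

    # Hypothetical location of the segments rsync'ed from the Pi
    segments = sorted(glob.glob("/data/segments/*.h264"))

    # Build a file list for ffmpeg's concat demuxer
    with open("/tmp/segments.txt", "w") as f:
        for seg in segments:
            f.write(f"file '{seg}'\n")

    subprocess.run([
        "ffmpeg", "-y",
        "-f", "concat", "-safe", "0",   # -safe 0 allows absolute paths in the list
        "-i", "/tmp/segments.txt",
        "-c", "copy",                   # remux only, no re-encode
        "-movflags", "+faststart",      # moov atom first, so browsers can start playback early
        "/var/www/html/video.mp4",
    ], check=True)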
    

    For the conversion, I have tried many options and searches without success. I can get the video (.ts, .mp4, .ogv, .webm) working in the majority of web browsers, but never on iOS/Android (Chrome/Safari). An example of a test that failed with ffmpeg: https://gist.github.com/jaydenseric/220c785d6289bcfd7366

    When I don't try to concatenate but only convert those videos:

    ffmpeg -i input.h264 -profile:v baseline -level 3.0 output.mp4
    

    I have the following output:

    ffmpeg version 4.1.6-1~deb10u1 Copyright (c) 2000-2020 the FFmpeg developers
      built with gcc 8 (Debian 8.3.0-6)
      configuration: --prefix=/usr --extra-version='1~deb10u1' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
      libavutil      56. 22.100 / 56. 22.100
      libavcodec     58. 35.100 / 58. 35.100
      libavformat    58. 20.100 / 58. 20.100
      libavdevice    58.  5.100 / 58.  5.100
      libavfilter     7. 40.101 /  7. 40.101
      libavresample   4.  0.  0 /  4.  0.  0
      libswscale      5.  3.100 /  5.  3.100
      libswresample   3.  3.100 /  3.  3.100
      libpostproc    55.  3.100 / 55.  3.100
    Input #0, h264, from 'input.h264':
      Duration: N/A, bitrate: N/A
        Stream #0:0: Video: h264 (High), yuv420p(progressive), 720x480, 25 fps, 25 tbr, 1200k tbn, 50 tbc
    File 'output.mp4' already exists. Overwrite ? [y/N] y
    Stream mapping:
      Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
    Press [q] to stop, [?] for help
    [libx264 @ 0x556885757980] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
    [libx264 @ 0x556885757980] profile Constrained Baseline, level 3.0
    [libx264 @ 0x556885757980] 264 - core 155 r2917 0a84d98 - H.264/MPEG-4 AVC codec - Copyleft 2003-2018 - http://www.videolan.org/x264.html - options: cabac=0 ref=3 deblock=1:0:0 analyse=0x1:0x111 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=9 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    Output #0, mp4, to 'output.mp4':
      Metadata:
        encoder         : Lavf58.20.100
        Stream #0:0: Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 720x480, q=-1--1, 25 fps, 12800 tbn, 25 tbc
        Metadata:
          encoder         : Lavc58.35.100 libx264
        Side data:
          cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
    frame=  195 fps=0.0 q=-1.0 Lsize=     229kB time=00:00:07.76 bitrate= 241.3kbits/s speed=20.4x    
    video:227kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.687068%
    [libx264 @ 0x556885757980] frame I:1     Avg QP:23.86  size: 24802
    [libx264 @ 0x556885757980] frame P:194   Avg QP:22.46  size:  1067
    [libx264 @ 0x556885757980] mb I  I16..4: 26.4%  0.0% 73.6%
    [libx264 @ 0x556885757980] mb P  I16..4:  1.1%  0.0%  0.4%  P16..4: 20.3%  1.5%  0.6%  0.0%  0.0%    skip:76.1%
    [libx264 @ 0x556885757980] coded y,uvDC,uvAC intra: 34.9% 54.4% 12.3% inter: 1.8% 12.4% 0.0%
    [libx264 @ 0x556885757980] i16 v,h,dc,p: 45% 14% 17% 23%
    [libx264 @ 0x556885757980] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 35% 31% 11%  4%  4%  4%  4%  4%  4%
    [libx264 @ 0x556885757980] i8c dc,h,v,p: 51% 17% 28%  4%
    [libx264 @ 0x556885757980] ref P L0: 80.0% 10.6%  9.4%
    [libx264 @ 0x556885757980] kb/s:237.75
    

    But it is still impossible to play in mobile browsers (iOS, Android).

    Is there anything I'm doing wrong, either in the capturing process or conversion? Do you have any recommendations?
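
    One detail that commonly breaks playback in mobile browsers is the MP4 moov atom sitting at the end of the file. A variant of the conversion above worth trying, sketched here from Python (only the +faststart and explicit pixel format flags are added to the original command):

    import subprocess

    subprocess.run([
        "ffmpeg", "-y", "-i", "input.h264",
        "-profile:v", "baseline", "-level", "3.0",
        "-pix_fmt", "yuv420p",        # 4:2:0 chroma, required by most mobile decoders (already the case here)
        "-movflags", "+faststart",    # move the moov atom to the front for progressive playback
        "output.mp4",
    ], check=True)

    If the file still refuses to play, it is also worth checking that the webserver sends the video/mp4 MIME type and supports HTTP range requests, which iOS requires for video.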

  • Generate video from images with ken burns effect using ffmpeg

    20 May, by user1615796

    I am trying to generate video from images.

    The following code generates the video, at one frame per 3 seconds.

         exec("ffmpeg -f image2 -r 1/3 -loop_input -shortest -y  -i tmp/image%3d.jpg -i tmp/ag1.mp3 -target ntsc-dvd -s 1280x720 tmp/video.mwv");
    

    I want to apply a Ken Burns effect to it. I have searched a lot but couldn't find any solution.

    Can anybody help me out? Any code snippet you can provide will be really appreciated.
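
    ffmpeg's zoompan filter is the usual way to get a Ken Burns style pan/zoom from still images. A minimal sketch for a single image, invoked from Python rather than PHP's exec(); the file names and the 3-second/25 fps timing are assumptions, not taken from the command above:

    import subprocess

    # Slowly zoom into the centre of one image for 75 frames (3 s at zoompan's
    # default 25 fps), scaled to 1280x720. Repeat per image, then concatenate.
    subprocess.run([
        "ffmpeg", "-y", "-i", "tmp/image001.jpg",
        "-vf", "zoompan=z='min(zoom+0.0015,1.5)':d=75:"
               "x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':s=1280x720",
        "-c:v", "libx264", "-pix_fmt", "yuv420p",
        "tmp/clip001.mp4",
    ], check=True)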

  • How to record video and audio from webcam using ffmpeg on Windows ?

    20 May, by Babu

    I want to record video as well as audio from a webcam using ffmpeg.

    I used the following command to find out what devices are available:

    ffmpeg -list_devices true -f dshow -i dummy
    

    And got the result:

    ffmpeg version N-54082-g96b33dd Copyright (c) 2000-2013 the FFmpeg developers
    built on Jun 17 2013 02:05:16 with gcc 4.7.3 (GCC)
    configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-av
    isynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enab
    le-iconv --enable-libass --enable-libbluray --enable-libcaca --enable-libfreetyp
    e --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --ena
    ble-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-l
    ibopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libsp
    eex --enable-libtheora --enable-libtwolame --enable-libvo-aacenc --enable-libvo-
    amrwbenc --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --
    enable-libxvid --enable-zlib
    libavutil      52. 37.101 / 52. 37.101
    libavcodec     55. 16.100 / 55. 16.100
    libavformat    55.  8.103 / 55.  8.103
    libavdevice    55.  2.100 / 55.  2.100
    libavfilter     3. 77.101 /  3. 77.101
    libswscale      2.  3.100 /  2.  3.100
    libswresample   0. 17.102 /  0. 17.102
    libpostproc    52.  3.100 / 52.  3.100
    [dshow @ 024eb460] DirectShow video devices
    [dshow @ 024eb460]  "Sirius USB2.0 Camera"
    [dshow @ 024eb460]  "Vimicro USB Camera (Altair)"
    [dshow @ 024eb460] DirectShow audio devices
    [dshow @ 024eb460]  "Microphone (Realtek High Defini"
    dummy: Immediate exit requested
    

    I am using the following command to get more details about the device:

    ffmpeg -f dshow -list_options true -i video="Vimicro USB Camera (Altair)"
    

    And I am getting the following results:

    ffmpeg version N-54082-g96b33dd Copyright (c) 2000-2013 the FFmpeg developers
    built on Jun 17 2013 02:05:16 with gcc 4.7.3 (GCC)
    configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-av
    isynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enab
    le-iconv --enable-libass --enable-libbluray --enable-libcaca --enable-libfreetyp
    e --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --ena
    ble-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-l
    ibopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libsp
    eex --enable-libtheora --enable-libtwolame --enable-libvo-aacenc --enable-libvo-
    amrwbenc --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --
    enable-libxvid --enable-zlib
    libavutil      52. 37.101 / 52. 37.101
    libavcodec     55. 16.100 / 55. 16.100
    libavformat    55.  8.103 / 55.  8.103
    libavdevice    55.  2.100 / 55.  2.100
    libavfilter     3. 77.101 /  3. 77.101
    libswscale      2.  3.100 /  2.  3.100
    libswresample   0. 17.102 /  0. 17.102
    libpostproc    52.  3.100 / 52.  3.100
    [dshow @ 0249b3e0] DirectShow video device options
    [dshow @ 0249b3e0]  Pin "Capture"
    [dshow @ 0249b3e0]   pixel_format=yuyv422  min s=640x480 fps=30 max   s=640x480 fp
    s=30
    [dshow @ 0249b3e0]   pixel_format=yuyv422  min s=640x480 fps=30 max s=640x480 fp
    s=30
    [dshow @ 0249b3e0]   pixel_format=yuyv422  min s=352x288 fps=30 max s=352x288 fp
    s=30
    [dshow @ 0249b3e0]   pixel_format=yuyv422  min s=352x288 fps=30 max s=352x288 fp
    s=30
    [dshow @ 0249b3e0]   pixel_format=yuyv422  min s=320x240 fps=30 max s=320x240 fp
    s=30
    [dshow @ 0249b3e0]   pixel_format=yuyv422  min s=320x240 fps=30 max s=320x240 fp
    s=30
    [dshow @ 0249b3e0]   pixel_format=yuyv422  min s=176x144 fps=30 max s=176x144 fp
    s=30
    [dshow @ 0249b3e0]   pixel_format=yuyv422  min s=176x144 fps=30 max s=176x144 fp
    s=30
    [dshow @ 0249b3e0]   pixel_format=yuyv422  min s=160x120 fps=30 max s=160x120 fp
    s=30
    [dshow @ 0249b3e0]   pixel_format=yuyv422  min s=160x120 fps=30 max s=160x120 fp
    s=30
    video=Vimicro USB Camera (Altair): Immediate exit requested
    

    When I try the following command to capture the video and audio stream:

    ffmpeg -f dshow -s 320x240 -r 30 -vcodec mjpeg -i video="Vimicro USB Camera (Altair)" e:\output.mp4
    

    I am getting the error message "Could not set video options video=Vimicro USB Camera (Altair): Input/output error".

    Following are the error details:

    ffmpeg version N-54082-g96b33dd Copyright (c) 2000-2013 the FFmpeg developers
    built on Jun 17 2013 02:05:16 with gcc 4.7.3 (GCC)
    configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-av
    isynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enab
    le-iconv --enable-libass --enable-libbluray --enable-libcaca --enable-libfreetyp
    e --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --ena
    ble-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-l
    ibopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libsp
    eex --enable-libtheora --enable-libtwolame --enable-libvo-aacenc --enable-libvo-
    amrwbenc --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --
    enable-libxvid --enable-zlib
    libavutil      52. 37.101 / 52. 37.101
    libavcodec     55. 16.100 / 55. 16.100
    libavformat    55.  8.103 / 55.  8.103
    libavdevice    55.  2.100 / 55.  2.100
    libavfilter     3. 77.101 /  3. 77.101
    libswscale      2.  3.100 /  2.  3.100
    libswresample   0. 17.102 /  0. 17.102
    libpostproc    52.  3.100 / 52.  3.100
    [dshow @ 002f78e0] Could not set video options
    video=Vimicro USB Camera (Altair): Input/output error
    

    And I am unable to figure out what I am doing wrong.

    So if anybody can spot what I am doing wrong, or knows how to record both video and audio from a webcam using ffmpeg, please help me solve this problem.
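
    For reference, dshow can capture video and audio together by naming both devices in a single -i argument. A hedged sketch, invoked from Python; the audio device string below is a placeholder, since the listing above shows the real name truncated, and the device list only advertises yuyv422 (so forcing -vcodec mjpeg on the input may be what the camera rejects):

    import subprocess

    # video=...:audio=... selects both DirectShow devices in one input.
    # The capture uses the camera's native format and is re-encoded to H.264/AAC.
    subprocess.run([
        "ffmpeg",
        "-f", "dshow",
        "-video_size", "320x240", "-framerate", "30",
        "-i", "video=Vimicro USB Camera (Altair):audio=<your microphone name>",
        "-c:v", "libx264", "-pix_fmt", "yuv420p",
        "-c:a", "aac",
        "e:/output.mp4",
    ], check=True)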