
Other articles (98)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010, by

    The central/master site of the farm needs several additional plugins, beyond those used by the channels, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (10091)

  • Unable to stream video file from MediaMTX media server to browser via WebRTC

    8 June 2024, by thegreatjedi

    I took over a repository at work. It's a working demo comprising a web server which receives video and camera feeds from a media server (built from the rtsp-simple-server Docker image) via an RTSP relay server and streams the feeds to the client, all deployed via Docker Compose.

    


    I'm trying to switch over to WebRTC instead. rtsp-simple-server has been renamed to MediaMTX since the demo was created two years ago. This is the relevant section of the updated Docker Compose configuration:

    


media-server:
  image: bluenviron/mediamtx:latest-ffmpeg
  expose:
    - 8889
  init: true
  ports:
    - 8889:8889
  restart: unless-stopped
  volumes:
    - type: bind
      source: ./demo/vids
      target: /vids
    - type: bind
      source: ./demo/mediamtx.yml
      target: /mediamtx.yml
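
    (The MediaMTX runtime logs quoted further down can be followed with Docker Compose's standard log command, using the service name defined above:)

    docker compose logs -f media-server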


    


    Relevant part of the MediaMTX custom configuration in mediamtx.yml:

    


    ###############################################
# Path settings

# Settings in "paths" are applied to specific paths, and the map key
# is the name of the path.
# Any setting in "pathDefaults" can be overridden here.
# It's possible to use regular expressions by using a tilde as prefix,
# for example "~^(test1|test2)$" will match both "test1" and "test2",
# for example "~^prefix" will match all paths that start with "prefix".
paths:
  # example:
  # my_camera:
  #   source: rtsp://my_camera
  ~^demo\d+$:
    runOnDemand: ffmpeg -re -stream_loop -1 -i /vids/$MTX_PATH.mp4 -c:v libvpx -b:v 0 -crf 18 -qmin 18 -qmax 18 -f webm http://localhost:8889/$MTX_PATH/whip

  # Settings under path "all_others" are applied to all paths that
  # do not match another entry.
  all_others:


    


    I've absolutely no experience with WebRTC. This is my first time hearing of this protocol, let alone working with it. From what I understand, I need to convert my demo mp4 videos (which were successfully streaming via RTSP in the previous implementation) to a compatible video codec, so I've opted for VP8.
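
    For what it's worth, a standalone re-encode of one clip to VP8, done outside MediaMTX, would look something like the sketch below (output filename is a placeholder; libvpx's constant-quality mode expects -b:v 0 together with -crf):

    ffmpeg -i demo0.mp4 -c:v libvpx -crf 18 -b:v 0 -an demo0_vp8.webm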

    


    Before trying to stream the videos into my web server, I tested the stream directly in the browser (with the latest versions of both Chrome and Edge). I went to http://localhost:8889/demo0 (which should convert demo0.mp4 to VP8 and then stream it over WebRTC). The video player loaded in the browser, but no video data was received and nothing played. After several seconds, the screen displayed "Error: bad status code 400, retrying in some seconds". In the browser console, it showed:

    


    Failed to load resource: the server responded with a status of 400 (Bad Request)

    


    Inside the MediaMTX container's runtime logs, this is what's displayed:

    


    2024-04-02 14:53:08 ffmpeg version 6.1.1 Copyright (c) 2000-2023 the FFmpeg developers
2024-04-02 14:53:08   built with gcc 13.2.1 (Alpine 13.2.1_git20231014) 20231014
2024-04-02 14:53:08   configuration: --prefix=/usr --disable-librtmp --disable-lzma --disable-static --disable-stripping --enable-avfilter --enable-gpl --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libdrm --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libmp3lame --enable-libopenmpt --enable-libopus --enable-libplacebo --enable-libpulse --enable-librav1e --enable-librist --enable-libsoxr --enable-libsrt --enable-libssh --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-lto=auto --enable-lv2 --enable-openssl --enable-pic --enable-postproc --enable-pthreads --enable-shared --enable-vaapi --enable-vdpau --enable-version3 --enable-vulkan --optflags=-O3 --enable-libjxl --enable-libsvtav1 --enable-libvpl
2024-04-02 14:53:08   libavutil      58. 29.100 / 58. 29.100
2024-04-02 14:53:08   libavcodec     60. 31.102 / 60. 31.102
2024-04-02 14:53:08   libavformat    60. 16.100 / 60. 16.100
2024-04-02 14:53:08   libavdevice    60.  3.100 / 60.  3.100
2024-04-02 14:53:08   libavfilter     9. 12.100 /  9. 12.100
2024-04-02 14:53:08   libswscale      7.  5.100 /  7.  5.100
2024-04-02 14:53:08   libswresample   5.  0.100 /  5.  0.100
2024-04-02 14:53:08   libpostproc    57.  3.100 / 57.  3.100
2024-04-02 14:53:08 Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/vids/demo0.mp4':
2024-04-02 14:53:08   Metadata:
2024-04-02 14:53:08     major_brand     : isom
2024-04-02 14:53:08     minor_version   : 512
2024-04-02 14:53:08     compatible_brands: isomiso2mp41
2024-04-02 14:53:08     encoder         : Lavf58.76.100
2024-04-02 14:53:08   Duration: 00:00:03.47, start: 0.000000, bitrate: 1675 kb/s
2024-04-02 14:53:08   Stream #0:0[0x1](und): Video: mpeg1video (mp4v / 0x7634706D), yuv420p(tv, progressive), 640x360 [SAR 1:1 DAR 16:9], 104857 kb/s, 30 fps, 30 tbr, 90k tbn (default)
2024-04-02 14:53:08     Metadata:
2024-04-02 14:53:08       handler_name    : VideoHandler
2024-04-02 14:53:08       vendor_id       : [0][0][0][0]
2024-04-02 14:53:08     Side data:
2024-04-02 14:53:08       cpb: bitrate max/min/avg: 0/0/0 buffer size: 49152 vbv_delay: N/A
2024-04-02 14:53:08 Stream mapping:
2024-04-02 14:53:08   Stream #0:0 -> #0:0 (mpeg1video (native) -> vp8 (libvpx))
2024-04-02 14:53:08 Press [q] to stop, [?] for help
2024-04-02 14:53:08 [libvpx @ 0x7faa8591b8c0] v1.13.1
2024-04-02 14:53:08 [libvpx @ 0x7faa8591b8c0] Bitrate not specified for constrained quality mode, using default of 256kbit/sec
2024-04-02 14:53:08 Output #0, webm, to 'http://localhost:8889/demo0/whip':
2024-04-02 14:53:08   Metadata:
2024-04-02 14:53:08     major_brand     : isom
2024-04-02 14:53:08     minor_version   : 512
2024-04-02 14:53:08     compatible_brands: isomiso2mp41
2024-04-02 14:53:08     encoder         : Lavf60.16.100
2024-04-02 14:53:08   Stream #0:0(und): Video: vp8, yuv420p(tv, progressive), 640x360 [SAR 1:1 DAR 16:9], q=2-31, 256 kb/s, 30 fps, 1k tbn (default)
2024-04-02 14:53:08     Metadata:
2024-04-02 14:53:08       handler_name    : VideoHandler
2024-04-02 14:53:08       vendor_id       : [0][0][0][0]
2024-04-02 14:53:08       encoder         : Lavc60.31.102 libvpx
2024-04-02 14:53:08     Side data:
2024-04-02 14:53:08       cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
2024-04-02 14:53:18 2024/04/02 06:53:18 INF [path demo0] runOnDemand command stopped: timed out
2024-04-02 14:53:18 2024/04/02 06:53:18 INF [WebRTC] [session 0f460c76] closed: source of path 'demo0' has timed out
[out#0/webm @ 0x7faa859487c0] video:272kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.042856%
2024-04-02 14:53:18 frame=  315 fps= 32 q=18.0 Lsize=     275kB time=00:00:10.46 bitrate= 215.1kbits/s speed=1.05x    
2024-04-02 14:53:18 Exiting normally, received signal 2.


    


    I'm not sure what this is supposed to mean. Why isn't the server able to stream this 3-second, 709 KB video even once? The browser connected to the server and the URL successfully, but no data was transferred.
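
    One sanity check, assuming ffprobe is available alongside ffmpeg (the path below assumes the bind mount from the Compose file), is to confirm what codec is actually inside the source file, since the log above reports mpeg1video rather than h264 for the .mp4:

    ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,width,height,avg_frame_rate -of default=noprint_wrappers=1 /vids/demo0.mp4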

    


    Just in case, I decided to manually convert all of my mp4 files to webm using ffmpeg, and verified with Windows Media Player that the webm videos play. Then I modified MediaMTX's configuration to stream the webm files directly:

    


paths:
  # example:
  # my_camera:
  #   source: rtsp://my_camera
  ~^demo\d+$:
    runOnDemand: ffmpeg -re -stream_loop -1 -i /vids/$MTX_PATH.webm -c copy -f webm http://localhost:8889/$MTX_PATH/whip


    


    However, the error persists:

    


    2024-04-02 15:03:58 ffmpeg version 6.1.1 Copyright (c) 2000-2023 the FFmpeg developers
2024-04-02 15:03:58   built with gcc 13.2.1 (Alpine 13.2.1_git20231014) 20231014
2024-04-02 15:03:58   configuration: --prefix=/usr --disable-librtmp --disable-lzma --disable-static --disable-stripping --enable-avfilter --enable-gpl --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libdrm --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libmp3lame --enable-libopenmpt --enable-libopus --enable-libplacebo --enable-libpulse --enable-librav1e --enable-librist --enable-libsoxr --enable-libsrt --enable-libssh --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-lto=auto --enable-lv2 --enable-openssl --enable-pic --enable-postproc --enable-pthreads --enable-shared --enable-vaapi --enable-vdpau --enable-version3 --enable-vulkan --optflags=-O3 --enable-libjxl --enable-libsvtav1 --enable-libvpl
2024-04-02 15:03:58   libavutil      58. 29.100 / 58. 29.100
2024-04-02 15:03:58   libavcodec     60. 31.102 / 60. 31.102
2024-04-02 15:03:58   libavformat    60. 16.100 / 60. 16.100
2024-04-02 15:03:58   libavdevice    60.  3.100 / 60.  3.100
2024-04-02 15:03:58   libavfilter     9. 12.100 /  9. 12.100
2024-04-02 15:03:58   libswscale      7.  5.100 /  7.  5.100
2024-04-02 15:03:58   libswresample   5.  0.100 /  5.  0.100
2024-04-02 15:03:58   libpostproc    57.  3.100 / 57.  3.100
2024-04-02 15:03:58 Input #0, matroska,webm, from '/vids/demo0.webm':
2024-04-02 15:03:58   Metadata:
2024-04-02 15:03:58     COMPATIBLE_BRANDS: isomiso2mp41
2024-04-02 15:03:58     MAJOR_BRAND     : isom
2024-04-02 15:03:58     MINOR_VERSION   : 512
2024-04-02 15:03:58     ENCODER         : Lavf60.16.100
2024-04-02 15:03:58   Duration: 00:00:03.47, start: 0.000000, bitrate: 217 kb/s
2024-04-02 15:03:58   Stream #0:0: Video: vp8, yuv420p(tv, progressive), 640x360, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 1k tbn (default)
2024-04-02 15:03:58     Metadata:
2024-04-02 15:03:58       HANDLER_NAME    : VideoHandler
2024-04-02 15:03:58       VENDOR_ID       : [0][0][0][0]
2024-04-02 15:03:58       ENCODER         : Lavc60.31.102 libvpx
2024-04-02 15:03:58       DURATION        : 00:00:03.466000000
2024-04-02 15:03:58 Output #0, webm, to 'http://localhost:8889/demo0/whip':
2024-04-02 15:03:58   Metadata:
2024-04-02 15:03:58     COMPATIBLE_BRANDS: isomiso2mp41
2024-04-02 15:03:58     MAJOR_BRAND     : isom
2024-04-02 15:03:58     MINOR_VERSION   : 512
2024-04-02 15:03:58     encoder         : Lavf60.16.100
2024-04-02 15:03:58   Stream #0:0: Video: vp8, yuv420p(tv, progressive), 640x360 [SAR 1:1 DAR 16:9], q=2-31, 30 fps, 30 tbr, 1k tbn (default)
2024-04-02 15:03:58     Metadata:
2024-04-02 15:03:58       HANDLER_NAME    : VideoHandler
2024-04-02 15:03:58       VENDOR_ID       : [0][0][0][0]
2024-04-02 15:03:58       ENCODER         : Lavc60.31.102 libvpx
2024-04-02 15:03:58       DURATION        : 00:00:03.466000000
2024-04-02 15:03:58 Stream mapping:
2024-04-02 15:03:58   Stream #0:0 -> #0:0 (copy)
2024-04-02 15:03:58 Press [q] to stop, [?] for help
2024-04-02 15:04:08 2024/04/02 07:04:08 INF [path demo0] runOnDemand command stopped: timed out
2024-04-02 15:04:08 2024/04/02 07:04:08 INF [WebRTC] [session 829664cb] closed: source of path 'demo0' has timed out
[out#0/webm @ 0x7f04b00515c0] video:281kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.023511%
2024-04-02 15:04:08 size=     284kB time=00:00:10.49 bitrate= 221.3kbits/s speed=1.05x    
2024-04-02 15:04:08 Exiting normally, received signal 2.


    


    The same thing happens when I try to stream my other videos (demo1.mp4, demo2.mp4, etc.). What am I doing wrong?

    


  • C# FFMPEG: Code bugs out and stops producing media files

    24 May 2020, by Hamez

    I've made a joke program in C# that uses ffmpeg to edit videos with different effects, such as stuttering. I've finished 3 effects so far and each of them works on its own, but as soon as I chain one after another, e.g. fx.CrashStutter(0, 2); fx.CrashBeep(2, 2); fx.Wow(4, 2);, the code breaks and no longer produces photo/video files. Once I stop debugging, the file it was supposed to be processing appears. I've used a system where the code loops, trying to execute a command that creates a text file as a marker for when ffmpeg has finished processing a file. The debug console also repeatedly says "The process tried to write to a nonexistent pipe."
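
    Incidentally, the chain of intermediate files each effect builds (cut, reverse, cut, reverse, then concat) could in principle be collapsed into a single ffmpeg invocation, which avoids the marker-file polling entirely. A video-only sketch of what Wow(0, 2) produces, with a placeholder input name, might look like:

    ffmpeg -i source.mp4 -filter_complex "[0:v]trim=duration=0.33,setpts=PTS-STARTPTS[a];[0:v]trim=duration=0.33,setpts=PTS-STARTPTS,reverse[b];[0:v]trim=duration=0.67,setpts=PTS-STARTPTS[c];[0:v]trim=duration=0.67,setpts=PTS-STARTPTS,reverse[d];[a][b][c][d]concat=n=4:v=1:a=0[out]" -map "[out]" W_s0.mp4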

    



    Here's the code for all 3 effects:

    



     public void Wow(double start, double duration)
        {
            if (fxstart == true)
            {
                //MessageBox.Show("WowFX Duration" + duration);
                string folderName = ("W_s" + start);
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("mkdir " + folderName);
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("cd " + folderName);
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("ffmpeg -ss " + start + " -t " + (duration / 6) + " -i " + source + " a.mp4");
                //wait until a.mp4 appears
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\a.txt") == false)
                {
                    /*aha got a live one!*/FXcmd.StandardInput.WriteLine(" echo a > a.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("ffmpeg -i a.mp4 -vf reverse -af areverse b.mp4");
                //wait until b.mp4 appears
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\b.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo b > b.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("ffmpeg -ss " + start + " -t " + (duration / 3) + " -i " + source + " c.mp4");
                //wait until c.mp4 appears
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\c.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo c > c.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("ffmpeg -i c.mp4 -vf reverse -af areverse d.mp4");
                //wait until d.mp4 appears
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\d.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo d > d.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                string[] concatList = { "file 'a.mp4'", "file 'b.mp4'", "file 'c.mp4'", "file 'd.mp4'" };
                //FXcmd.StandardInput.Write("del a.txt, b.txt, c.txt, d.txt");
                //System.Threading.Thread.Sleep(1000);
                System.IO.File.WriteAllLines(("FxSource(Temporary)\\" + folderName + "\\concatList.txt"), concatList);
                System.Threading.Thread.Sleep(1500);
                FXcmd.StandardInput.WriteLine("ffmpeg -f concat -i concatList.txt -c copy " + folderName + ".mp4");
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\" + "1.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo 1 > 1.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("copy " + folderName + ".mp4 ..");
                while (File.Exists("FxSource(Temporary)\\" + folderName + ".mp4") == false)
                {
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("cd ..");
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("cls");
                FXcmd.StandardInput.Flush();
            }
        }
        public void CrashStutter(int start, int duration)
        {
            if (fxstart == true)
            {
                string folderName = ("Cs_s" + start);
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("mkdir " + folderName);
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("cd " + folderName);
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("ffmpeg -ss " + start + " -t 0.1" + " -i " + source + " a.mp4");
                System.Threading.Thread.Sleep(100);
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\a.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo a > a.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("ffmpeg -stream_loop "+10*duration+" -i a.mp4 "+folderName+".mp4");
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\" + "1.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo 1 > 1.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("copy " + folderName + ".mp4 ..");
                while (File.Exists("FxSource(Temporary)\\" + folderName + ".mp4") == false)
                {
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("cd ..");
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("cls");
                FXcmd.StandardInput.Flush();
            }
        }
        public void CrashBeep(int start, int duration)
        {
            //this effect cannot last longer than 7 seconds
            double contrast = 25;
            double red = 0.75;
            if (fxstart == true)
            {
                string folderName = ("Cb_s" + start);
                FXcmd.StandardInput.WriteLine("mkdir " + folderName);
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("cd " + folderName);
                System.Threading.Thread.Sleep(100);
                /*gets stuck*/FXcmd.StandardInput.WriteLine("ffmpeg -i "+source+ " -vf fps=1 a.jpg");
                System.Threading.Thread.Sleep(100);
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\a.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo a > a.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("ffmpeg -i a.jpg -vf eq=contrast="+contrast+" b.jpg");
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\b.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo b > b.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("ffmpeg -i b.jpg -vf colorbalance=rm=" + red + " c.jpg");
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\c.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo > c.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("ffmpeg -loop 1 -i c.jpg -c:v libx264 -t "+ duration +" -pix_fmt yuv420p -vf scale=1920:1080 d.mp4");
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\d.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo d > d.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("cd ..");
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("copy beep.mp3 "+folderName+"/beep.mp3");
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("cd "+folderName);
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("ffmpeg -i beep.mp3 -ss 0 -t " + duration + " e.mp3");
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\e.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo e > e.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("ffmpeg -i d.mp4 -i e.mp3 -c copy -map 0:v:0 -map 1:a:0 " + folderName + ".mp4");
                while (File.Exists("FxSource(Temporary)\\" + folderName + "\\" + "1.txt") == false)
                {
                    FXcmd.StandardInput.WriteLine(" echo 1 > 1.txt");
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("copy " + folderName + ".mp4 ..");
                while (File.Exists("FxSource(Temporary)\\" + folderName + ".mp4") == false)
                {
                    System.Threading.Thread.Sleep(1500);
                }
                FXcmd.StandardInput.WriteLine("cd ..");
                System.Threading.Thread.Sleep(100);
                FXcmd.StandardInput.WriteLine("cls");
                FXcmd.StandardInput.Flush();
            }
        }


    



    Any suggestions? Thanks!

    


  • Demuxing a video media file with FFMPEG

    23 February 2016, by MOHW

    After posting this question, Extracting the h264 part of a video file (demuxing), I was actually able to figure out that:

    1. When I reverted to an older version of FFMPEG (avcodec-55.dll) instead of the avcodec-57.dll I had been using, the code worked perfectly without any error and the resulting h264 file played with ffplay.
    2. When I traced the output while using the avcodec-57.dll version of FFMPEG (the most recent version), there was a warning, "Using AVStream.codec.time_base as a timebase hint to the muxer is deprecated. Set AVStream.time_base instead", after the call to avformat_write_header(ofmt_ctx_v, NULL) and another one after the call to avformat_write_header(ofmt_ctx_a, NULL). The program continued executing; the audio was fine but the .h264 file wasn't.

    The output

    Press any key to continue . . .
    Press any key to continue . . .
    Press any key to continue . . .
    Press any key to continue . . .

    ==============Input Video=============
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'sample.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       title           : 10153755968775490
       encoder         : Lavf57.19.100
     Duration: 00:01:07.27, start: 0.020021, bitrate: 1058 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 400x224 [SAR 199:200 DAR 199:112], 927 kb/s, 15 fps, 15 tbr, 15360 tbn, 30 tbc (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: mp3 (mp4a / 0x6134706D), 48000 Hz, stereo, s16p, 128 kb/s (default)
       Metadata:
         handler_name    : SoundHandler

    ==============Output Video============
    Output #0, h264, to 'sample.h264':
       Stream #0:0: Video: h264 (High), yuv420p, 400x224 [SAR 199:200 DAR 199:112], q=2-31, 927 kb/s, 30 tbc

    ==============Output Audio============
    Output #0, mp3, to 'sample.mp3':
       Stream #0:0: Audio: mp3, 48000 Hz, stereo, s16p, 128 kb/s

    ======================================
     [h264 @ 00a3ee20] Using AVStream.codec.time_base as a timebase hint to the muxer is deprecated. Set AVStream.time_base instead.
     [mp3 @ 00a4fec0] Using AVStream.codec.time_base as a timebase hint to the muxer is deprecated. Set AVStream.time_base instead.
    Press any key to continue . . .

    The code

    #include <stdio.h>

    #define __STDC_CONSTANT_MACROS

    extern "C"
    {
       #include "libavformat/avformat.h"
    }


    #define USE_H264BSF 1

    int main()
    {
       AVOutputFormat *ofmt_a = NULL,*ofmt_v = NULL;
       AVFormatContext *ifmt_ctx = NULL, *ofmt_ctx_a = NULL, *ofmt_ctx_v = NULL;
       AVPacket pkt;
       int ret, i;
       int videoindex=-1,audioindex=-1;
       int frame_index=0;

       const char *in_filename  = "sample.mp4";
       const char *out_filename_v = "sample.h264";
       const char *out_filename_a = "sample.mp3";

       av_register_all();
       //Input
       if ((ret = avformat_open_input(&ifmt_ctx, in_filename, 0, 0)) < 0) {
           printf( "Could not open input file.");
           goto end;
       }
       if ((ret = avformat_find_stream_info(ifmt_ctx, 0)) < 0) {
           printf( "Failed to retrieve input stream information");
           goto end;
       }
    system("pause");

       //Output
       avformat_alloc_output_context2(&ofmt_ctx_v, NULL, NULL, out_filename_v);
       if (!ofmt_ctx_v) {
           printf( "Could not create output context\n");
           ret = AVERROR_UNKNOWN;
           goto end;
       }
       ofmt_v = ofmt_ctx_v->oformat;

    system("pause");

       avformat_alloc_output_context2(&ofmt_ctx_a, NULL, NULL, out_filename_a);
       if (!ofmt_ctx_a) {
           printf( "Could not create output context\n");
           ret = AVERROR_UNKNOWN;
           goto end;
       }
       ofmt_a = ofmt_ctx_a->oformat;
    system("pause");
       for (i = 0; i < ifmt_ctx->nb_streams; i++) {
               //Create output AVStream according to input AVStream
               AVFormatContext *ofmt_ctx;
               AVStream *in_stream = ifmt_ctx->streams[i];
               AVStream *out_stream = NULL;

               if(ifmt_ctx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO){
                   videoindex=i;
                   out_stream=avformat_new_stream(ofmt_ctx_v, in_stream->codec->codec);
                   ofmt_ctx=ofmt_ctx_v;
               }else if(ifmt_ctx->streams[i]->codec->codec_type==AVMEDIA_TYPE_AUDIO){
                   audioindex=i;
                   out_stream=avformat_new_stream(ofmt_ctx_a, in_stream->codec->codec);
                   ofmt_ctx=ofmt_ctx_a;
               }else{
                   break;
               }

               if (!out_stream) {
                   printf( "Failed allocating output stream\n");
                   ret = AVERROR_UNKNOWN;
                   goto end;
               }
               //Copy the settings of AVCodecContext
               if (avcodec_copy_context(out_stream->codec, in_stream->codec) < 0) {
                   printf( "Failed to copy context from input to output stream codec context\n");
                   goto end;
               }
               out_stream->codec->codec_tag = 0;

               if (ofmt_ctx->oformat->flags & AVFMT_GLOBALHEADER)
                   out_stream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
       }
    system("pause");
       //Dump Format------------------
       printf("\n==============Input Video=============\n");
       av_dump_format(ifmt_ctx, 0, in_filename, 0);
       printf("\n==============Output Video============\n");
       av_dump_format(ofmt_ctx_v, 0, out_filename_v, 1);
       printf("\n==============Output Audio============\n");
       av_dump_format(ofmt_ctx_a, 0, out_filename_a, 1);
       printf("\n======================================\n");
       //Open output file
       if (!(ofmt_v->flags & AVFMT_NOFILE)) {
           if (avio_open(&ofmt_ctx_v->pb, out_filename_v, AVIO_FLAG_WRITE) < 0) {
               printf( "Could not open output file '%s'", out_filename_v);
               goto end;
           }
       }

       if (!(ofmt_a->flags & AVFMT_NOFILE)) {
           if (avio_open(&ofmt_ctx_a->pb, out_filename_a, AVIO_FLAG_WRITE) < 0) {
               printf( "Could not open output file '%s'", out_filename_a);
               goto end;
           }
       }

       //Write file header
       if (avformat_write_header(ofmt_ctx_v, NULL) < 0) {
           printf( "Error occurred when opening video output file\n");
           goto end;
       }

       if (avformat_write_header(ofmt_ctx_a, NULL) < 0) {
           printf( "Error occurred when opening audio output file\n");
           goto end;
       }
       system("pause");
    #if USE_H264BSF
       AVBitStreamFilterContext* h264bsfc =  av_bitstream_filter_init("h264_mp4toannexb");
    #endif

       while (1) {
           AVFormatContext *ofmt_ctx;
           AVStream *in_stream, *out_stream;
           //Get an AVPacket
           if (av_read_frame(ifmt_ctx, &pkt) < 0)
               break;
           in_stream  = ifmt_ctx->streams[pkt.stream_index];


           if(pkt.stream_index==videoindex){
               out_stream = ofmt_ctx_v->streams[0];
               ofmt_ctx=ofmt_ctx_v;
               #if USE_H264BSF
                   av_bitstream_filter_filter(h264bsfc, in_stream->codec, NULL, &pkt.data, &pkt.size, pkt.data, pkt.size, 0);
               #endif
               printf("Write Video Packet. size:%d\tpts:%lld\n",pkt.size,pkt.pts);
           }else if(pkt.stream_index==audioindex){
               out_stream = ofmt_ctx_a->streams[0];
               ofmt_ctx=ofmt_ctx_a;
               printf("Write Audio Packet. size:%d\tpts:%lld\n",pkt.size,pkt.pts);
           }else{
               continue;
           }


           //Convert PTS/DTS
           pkt.pts = av_rescale_q_rnd(pkt.pts, in_stream->time_base, out_stream->time_base, (AVRounding)(AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX));
           pkt.dts = av_rescale_q_rnd(pkt.dts, in_stream->time_base, out_stream->time_base, (AVRounding)(AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX));
           pkt.duration = av_rescale_q(pkt.duration, in_stream->time_base, out_stream->time_base);
           pkt.pos = -1;
           pkt.stream_index=0;
           //Write
           if (av_interleaved_write_frame(ofmt_ctx, &pkt) < 0) {
               printf( "Error muxing packet\n");
               break;
           }
           //printf("Write %8d frames to output file\n", frame_index);
           av_free_packet(&pkt);
           frame_index++;
       }
    system("pause");
    #if USE_H264BSF
       av_bitstream_filter_close(h264bsfc);  
    #endif

       //Write file trailer
       av_write_trailer(ofmt_ctx_a);
       av_write_trailer(ofmt_ctx_v);
    end:
       avformat_close_input(&ifmt_ctx);
       /* close output */
       if (ofmt_ctx_a && !(ofmt_a->flags & AVFMT_NOFILE))
           avio_close(ofmt_ctx_a->pb);

       if (ofmt_ctx_v && !(ofmt_v->flags & AVFMT_NOFILE))
           avio_close(ofmt_ctx_v->pb);

       avformat_free_context(ofmt_ctx_a);
       avformat_free_context(ofmt_ctx_v);

       system("pause");
       if (ret < 0 && ret != AVERROR_EOF) {
           printf( "Error occurred.\n");
           return -1;
       }

       return 0;
    }

    EDIT 1
    After reading through http://lists.libav.org/pipermail/libav-devel/2014-June/060048.html, I figured out what was causing the "Using AVStream.codec.time_base as a timebase hint to the muxer is deprecated. Set AVStream.time_base instead" warning. I fixed it by adding out_stream->time_base = in_stream->time_base; before the call to avformat_write_header. The code now runs without any error! The h264 file created with the old FFMPEG (avcodec-55.dll) works fine, while the one created with the recent avcodec-57.dll is still invalid.
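
    For comparison, the same split can be reproduced with the ffmpeg command-line tool, which is a quick way to check what the two output files should look like (using the same sample.mp4 as above):

    ffmpeg -i sample.mp4 -map 0:v:0 -c copy -bsf:v h264_mp4toannexb sample.h264
    ffmpeg -i sample.mp4 -map 0:a:0 -c copy sample.mp3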