Other articles (64)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all of the software dependencies on the server.
    If you want to use this archive for a "farm mode" installation, you will also need to make other modifications (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the following two images for a comparison.
    To use it, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure the plugin (Templates > Chosen) by enabling the use of Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

On other sites (10146)

  • ffmpeg Transcoding Stops After Few Seconds [migrated]

    15 April 2018, by Salem F

    I have been trying to do this for over a week now with no success.
    What I am trying to do is transcode video from a live streaming source and downscale it with FFmpeg, but every time I start transcoding, it broadcasts for 11 seconds and then stops.

    The last command I tried:

    ffmpeg  -re  -i 'http://source.com/1034.ts' -preset ultrafast http://localhost:2052/feed1.ffm

    I tried downloading the .ts file with IDM, and it finished downloading at the exact 12-second mark where FFmpeg stops transcoding.

    Does that mean that FFmpeg downloads the file as a single segment instead of continuing to read the source video, as video players usually do? By the way, I tried the same source in VLC player and it did not stop playing.
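    If FFmpeg really is consuming the URL as one finite segment, one simple workaround to try is relaunching the transcode whenever it exits, so the source is re-requested the way a player would re-request it. The sketch below is in Java to match the code elsewhere on this page; the class name and retry policy are invented for illustration, and the real command would be the ffmpeg invocation above.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical supervisor: relaunch the transcode command whenever it
// exits non-zero, so a source served as a finite segment is re-requested.
public class TranscodeSupervisor {

    /** Runs the command, restarting it up to maxRestarts times on failure. */
    public static int runWithRestarts(List<String> command, int maxRestarts)
            throws Exception {
        int exitCode = -1;
        for (int attempt = 0; attempt <= maxRestarts; attempt++) {
            Process p = new ProcessBuilder(command)
                    .inheritIO()            // pass the child's output through
                    .start();
            exitCode = p.waitFor();
            if (exitCode == 0) break;       // clean exit: stop restarting
            Thread.sleep(1000L);            // brief pause before re-requesting
        }
        return exitCode;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder command; in practice this would be the ffmpeg line
        // from the question (ffmpeg -re -i 'http://source.com/1034.ts' ...).
        List<String> cmd = Arrays.asList("echo", "transcode pass");
        System.out.println("exit=" + runWithRestarts(cmd, 3));
    }
}
```

    Whether this helps depends on why the child process exits; if ffmpeg exits cleanly at end of segment, the exit-code check would need adjusting.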

    I decided to pass the FFmpeg command through the FFserver config file ffserver.conf:

    Launch ffmpeg -i 'http://source.com/1.ts' -copyinkf -codec copy  

    The stream works fine for a while, but after testing a couple of sources I noticed it struggles to transcode HD videos.

    I guess the issue is my VPS KVM server having very limited CPU and RAM (only 128 MB)! I tried using the ultrafast preset, but it did not solve the issue. Another thing: I noticed that when I enable the AVOptionVideo crf setting in ffserver.conf, transcoding runs a bit more smoothly without frame-rate drops.
    Lastly, my server uses a Xeon L5520 CPU, which is outdated; I only get 1/4 of the power of a single core (1/8 of a real core if HyperThreading is counted). :(

    # vlc -I dummy 'https://source.com/1034.ts' --sout '#standard{access=http,mux=flv,dst=localhost:2052}'
    VLC media player 2.2.8 Weatherwax (revision 2.2.7-14-g3cc1d8cba9)
    [09d3fdf0] pulse audio output error: PulseAudio server connection failure: Connection refused
    [09d279c0] core interface error: no suitable interface module
    [09c9b8f8] core libvlc error: interface "globalhotkeys,none" initialization failed
    [09d279c0] dbus interface error: Failed to connect to the D-Bus session daemon: Unable to autolaunch a dbus-daemon without a $DISPLAY for X11
    [09d279c0] core interface error: no suitable interface module
    [09c9b8f8] core libvlc error: interface "dbus,none" initialization failed
    [09d279c0] dummy interface: using the dummy interface module...
    [b5e04ae0] access_output_http access out: Consider passing --http-host=IP on the command line instead.
    [b5e38ab8] ts demux: MPEG-4 descriptor not found for pid 0x101 type 0xf
    [b5e90ae0] packetizer_mpeg4audio decoder: AAC channels: 2 samplerate: 48000
    [flv @ 0xb5e33b40] dimensions not set
    [b5e06360] avformat mux error: could not write header: Invalid argument
    [b5e88ef0] core decoder error: cannot continue streaming due to errors
    [b5e90ae0] core decoder error: cannot continue streaming due to errors

    Here is the output with -loglevel verbose:

    :~# ffmpeg -i http://source.com/1.ts -copyinkf -codec copy  -loglevel verbose  http://127.0.0.1:8090/feed1.ffm
    ffmpeg version 2.6.9 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.9.2 (Debian 4.9.2-10)
     configuration: --prefix=/usr --extra-cflags='-g -O2 -fstack-protector-strong -Wformat -Werror=format-security ' --extra-ldflags='-Wl,-z,relro' --cc='ccache cc' --enable-shared --enable-libmp3lame --enable-gpl --enable-nonfree --enable-libvorbis --enable-pthreads --enable-libfaac --enable-libxvid --enable-postproc --enable-x11grab --enable-libgsm --enable-libtheora --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libx264 --enable-libspeex --enable-nonfree --disable-stripping --enable-libvpx --enable-libschroedinger --disable-encoder=libschroedinger --enable-version3 --enable-libopenjpeg --enable-librtmp --enable-avfilter --enable-libfreetype --enable-libvo-aacenc --disable-decoder=amrnb --enable-libvo-amrwbenc --enable-libaacplus --libdir=/usr/lib/i386-linux-gnu --disable-vda --enable-libbluray --enable-libcdio --enable-gnutls --enable-frei0r --enable-openssl --enable-libass --enable-libopus --enable-fontconfig --enable-libpulse --disable-mips32r2 --disable-mipsdspr1 --disable-mipsdspr2 --enable-libvidstab --enable-libzvbi --enable-avresample --disable-htmlpages --disable-podpages --enable-libutvideo --enable-libfdk-aac --enable-libx265 --enable-libiec61883 --enable-vaapi --enable-libdc1394 --disable-altivec --shlibdir=/usr/lib/i386-linux-gnu
     libavutil      54. 20.100 / 54. 20.100
     libavcodec     56. 26.100 / 56. 26.100
     libavformat    56. 25.101 / 56. 25.101
     libavdevice    56.  4.100 / 56.  4.100
     libavfilter     5. 11.102 /  5. 11.102
     libavresample   2.  1.  0 /  2.  1.  0
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  1.100 /  1.  1.100
     libpostproc    53.  3.100 / 53.  3.100
    Invalid UE golomb code
       Last message repeated 2 times
    Input #0, mpegts, from 'http://source.com/1.ts':
     Duration: N/A, start: 30472.768167, bitrate: N/A
     Program 1
       Metadata:
         service_name    : Service01
         service_provider: FFmpeg
       Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p, 960x540 (960x544) [SAR 1:1 DAR 16:9], 50 fps, 50 tbr, 90k tbn, 100 tbc
       Stream #0:1[0x101]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 105 kb/s
    [graph 0 input from stream 0:1 @ 0x971f2c0] tb:1/48000 samplefmt:fltp samplerate:48000 chlayout:0x3
    [audio format for output stream 0:0 @ 0x9844de0] auto-inserting filter 'auto-inserted resampler 0' between the filter 'Parsed_anull_0' and the filter 'audio format for output stream 0:0'
    [auto-inserted resampler 0 @ 0x97115e0] ch:2 chl:stereo fmt:fltp r:48000Hz -> ch:1 chl:mono fmt:fltp r:22050Hz
    [graph 1 input from stream 0:0 @ 0x96f5d00] w:960 h:540 pixfmt:yuv420p tb:1/90000 fr:50/1 sar:1/1 sws_param:flags=2
    [scaler for output stream 0:1 @ 0x96f5e80] w:352 h:240 flags:'0x4' interl:0
    [scaler for output stream 0:1 @ 0x96f5e80] w:960 h:540 fmt:yuv420p sar:1/1 -> w:352 h:240 fmt:yuv420p sar:40/33 flags:0x4
    Output #0, ffm, to 'http://127.0.0.1:8090/feed1.ffm':
     Metadata:
       creation_time   : now
       encoder         : Lavf56.25.101
       Stream #0:0: Audio: wmav2, 22050 Hz, mono, fltp, 64 kb/s
       Metadata:
         encoder         : Lavc56.26.100 wmav2
       Stream #0:1: Video: msmpeg4v3 (msmpeg4), yuv420p, 352x240 [SAR 40:33 DAR 16:9], q=2-31, 256 kb/s, 50 fps, 1000k tbn, 15 tbc
       Metadata:
         encoder         : Lavc56.26.100 msmpeg4
    Stream mapping:
     Stream #0:1 -> #0:0 (aac (native) -> wmav2 (native))
     Stream #0:0 -> #0:1 (h264 (native) -> msmpeg4v3 (msmpeg4))
    Press [q] to stop, [?] for help
    Invalid UE golomb code
    *** dropping frame 3 from stream 1 at ts 1
       Last message repeated 1 times
    [msmpeg4 @ 0x970f060] warning, clipping 1 dct coefficients to -127..127
    *** dropping frame 4 from stream 1 at ts 2
       Last message repeated 1 times
    *** dropping frame 5 from stream 1 at ts 3
       Last message repeated 1 times
    *** dropping frame 5 from stream 1 at ts 4
    *** dropping frame 6 from stream 1 at ts 4
       Last message repeated 1 times
    *** dropping frame 7 from stream 1 at ts 5
       Last message repeated 1 times
    [msmpeg4 @ 0x970f060] warning, clipping 1 dct coefficients to -127..127
    *** dropping frame 8 from stream 1 at ts 6
       Last message repeated 1 times
    *** dropping frame 8 from stream 1 at ts 7
    *** dropping frame 9 from stream 1 at ts 7
       Last message repeated 1 times
    *** dropping frame 10 from stream 1 at ts 8
       Last message repeated 1 times
    *** dropping frame 11 from stream 1 at ts 9
       Last message repeated 1 times
    *** dropping frame 11 from stream 1 at ts 10
    *** dropping frame 12 from stream 1 at ts 10
       Last message repeated 1 times
    *** dropping frame 13 from stream 1 at ts 11
       Last message repeated 1 times
    *** dropping frame 14 from stream 1 at ts 12
       Last message repeated 1 times
    *** dropping frame 14 from stream 1 at ts 13
    *** dropping frame 15 from stream 1 at ts 13
       Last message repeated 1 times
    *** dropping frame 16 from stream 1 at ts 14
       Last message repeated 1 times
    *** dropping frame 17 from stream 1 at ts 15
       Last message repeated 1 times
    *** dropping frame 17 from stream 1 at ts 16
    *** dropping frame 18 from stream 1 at ts 16
       Last message repeated 1 times
    *** dropping frame 19 from stream 1 at ts 17
       Last message repeated 1 times
    *** dropping frame 20 from stream 1 at ts 18me=00:00:01.33 bitrate= 270.3kbits/s dup=0 drop=39
       Last message repeated 1 times
    *** dropping frame 20 from stream 1 at ts 19
    *** dropping frame 21 from stream 1 at ts 19
       Last message repeated 1 times
    *** dropping frame 22 from stream 1 at ts 20
       Last message repeated 1 times
    *** dropping frame 23 from stream 1 at ts 21
       Last message repeated 1 times
    *** dropping frame 23 from stream 1 at ts 22
    *** dropping frame 24 from stream 1 at ts 22
       Last message repeated 1 times
    *** dropping frame 25 from stream 1 at ts 23
       Last message repeated 1 times
    *** dropping frame 26 from stream 1 at ts 24
       Last message repeated 1 times
    *** dropping frame 26 from stream 1 at ts 25
    *** dropping frame 27 from stream 1 at ts 25
       Last message repeated 1 times
    *** dropping frame 28 from stream 1 at ts 26
       Last message repeated 1 times
    *** dropping frame 29 from stream 1 at ts 27
       Last message repeated 1 times
    *** dropping frame 29 from stream 1 at ts 28
    *** dropping frame 30 from stream 1 at ts 28
       Last message repeated 1 times
    *** dropping frame 31 from stream 1 at ts 29
       Last message repeated 1 times
    *** dropping frame 32 from stream 1 at ts 30
       Last message repeated 1 times
    *** dropping frame 32 from stream 1 at ts 31
    *** dropping frame 33 from stream 1 at ts 31
       Last message repeated 1 times
    *** dropping frame 34 from stream 1 at ts 32
       Last message repeated 1 times
    *** dropping frame 34 from stream 1 at ts 33
    *** dropping frame 35 from stream 1 at ts 33
    *** dropping frame 35 from stream 1 at ts 34
    *** dropping frame 36 from stream 1 at ts 34
       Last message repeated 1 times
    *** dropping frame 37 from stream 1 at ts 35
       Last message repeated 1 times
    Invalid UE golomb code
    *** dropping frame 38 from stream 1 at ts 36
       Last message repeated 1 times
    *** dropping frame 38 from stream 1 at ts 37
    *** dropping frame 39 from stream 1 at ts 37
       Last message repeated 1 times
    *** dropping frame 40 from stream 1 at ts 38
       Last message repeated 1 times
    *** dropping frame 41 from stream 1 at ts 39me=00:00:02.73 bitrate= 311.7kbits/s dup=0 drop=88
       Last message repeated 1 times
    *** dropping frame 41 from stream 1 at ts 40
    *** dropping frame 42 from stream 1 at ts 40
       Last message repeated 1 times
    *** dropping frame 43 from stream 1 at ts 41
       Last message repeated 1 times
    *** dropping frame 44 from stream 1 at ts 42
       Last message repeated 1 times
    *** dropping frame 44 from stream 1 at ts 43
    *** dropping frame 45 from stream 1 at ts 43
       Last message repeated 1 times
    *** dropping frame 46 from stream 1 at ts 44
       Last message repeated 1 times
    *** dropping frame 47 from stream 1 at ts 45
       Last message repeated 1 times
    *** dropping frame 47 from stream 1 at ts 46
    *** dropping frame 48 from stream 1 at ts 46
       Last message repeated 1 times
    *** dropping frame 49 from stream 1 at ts 47
       Last message repeated 1 times
    *** dropping frame 50 from stream 1 at ts 48
       Last message repeated 1 times
    *** dropping frame 50 from stream 1 at ts 49
    *** dropping frame 51 from stream 1 at ts 49
       Last message repeated 1 times
    *** dropping frame 52 from stream 1 at ts 50
       Last message repeated 1 times
    *** dropping frame 53 from stream 1 at ts 51
       Last message repeated 1 times
    [h264 @ 0x9844a00] error while decoding MB 58 12, bytestream -5
    [h264 @ 0x9844a00] concealing 1311 DC, 1311 AC, 1311 MV errors in B frame
    *** dropping frame 53 from stream 1 at ts 52
    No more output streams to write to, finishing.
    frame=   55 fps= 42 q=4.3 Lsize=     152kB time=00:00:03.66 bitrate= 339.6kbits/s dup=0 drop=119
    video:116kB audio:26kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 6.760316%
    Input file #0 (http://source.com/1.ts):
     Input stream #0:0 (video): 174 packets read (220322 bytes); 174 frames decoded;
     Input stream #0:1 (audio): 156 packets read (36657 bytes); 156 frames decoded (159744 samples);
     Total: 330 packets (256979 bytes) demuxed
    Output file #0 (http://127.0.0.1:8090/feed1.ffm):
     Output stream #0:0 (audio): 72 frames encoded (73383 samples); 72 packets muxed (26712 bytes);
     Output stream #0:1 (video): 55 frames encoded; 55 packets muxed (119080 bytes);
     Total: 127 packets (145792 bytes) muxed

    Here is the input URL's file info after I downloaded it to my PC with IDM:

    General
    ID                             : 1 (0x1)
    Complete name                  : D:\1.ts
    Format                         : MPEG-TS
    File size                      : 256 KiB
    Duration                       : 2 s 520 ms
    Overall bit rate mode          : Variable
    Overall bit rate               : 788 kb/s

    Video
    ID                             : 256 (0x100)
    Menu ID                        : 1 (0x1)
    Format                         : AVC
    Format/Info                    : Advanced Video Codec
    Format profile                 : High@L3.1
    Format settings, CABAC         : Yes
    Format settings, RefFrames     : 2 frames
    Codec ID                       : 27
    Duration                       : 2 s 680 ms
    Width                          : 960 pixels
    Height                         : 540 pixels
    Display aspect ratio           : 16:9
    Frame rate                     : 50.000 FPS
    Color space                    : YUV
    Chroma subsampling             : 4:2:0
    Bit depth                      : 8 bits
    Scan type                      : Progressive

    Audio
    ID                             : 257 (0x101)
    Menu ID                        : 1 (0x1)
    Format                         : AAC
    Format/Info                    : Advanced Audio Codec
    Format version                 : Version 4
    Format profile                 : LC
    Muxing mode                    : ADTS
    Codec ID                       : 15
    Duration                       : 2 s 69 ms
    Bit rate mode                  : Variable
    Channel(s)                     : 2 channels
    Channel positions              : Front: L R
    Sampling rate                  : 48.0 kHz
    Frame rate                     : 46.875 FPS (1024 SPF)
    Compression mode               : Lossy
    Delay relative to video        : -12 ms

    Menu
    ID                             : 4096 (0x1000)
    Menu ID                        : 1 (0x1)
    Duration                       : 2 s 520 ms
    List                           : 256 (0x100) (AVC) / 257 (0x101) (AAC)
    Service name                   : Service01
    Service provider               : FFmpeg
    Service type                   : digital television
  • Revision 35042: Something that was lying around

    9 February 2010, by kent1@… — Log

    Something that was lying around

  • Tap to record like in vine using javacv

    8 December 2015, by human123

    I am trying to implement a tap-to-record feature like in Vine. A sample for handling recording (not touch-to-record) provided with javacv is https://github.com/bytedeco/javacv/blob/master/samples/RecordActivity.java. I am trying to modify it so that in the onPreviewFrame method, frames are added to the buffer only while the user has a finger placed on the screen. The stopRecording method then tries to combine these frames into the final video.
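    The gating described above can be sketched independently of the camera API: a bounded ring buffer accepts a preview frame only while a "finger down" flag is set. Everything below (the class and method names, the byte[] frame representation) is invented for illustration and is not part of the javacv sample.

```java
import java.util.Arrays;

// Sketch of touch-gated frame buffering: preview frames go into a bounded
// ring buffer only while the "finger down" flag is set; otherwise they are
// dropped. Timestamps use -1 to mark empty slots, as in the activity below.
public class FrameBufferGate {

    private final byte[][] ring;   // frame data slots
    private final long[] stamps;   // capture timestamps (micros)
    private int index = 0;
    private volatile boolean touching = false;

    public FrameBufferGate(int capacity) {
        ring = new byte[capacity][];
        stamps = new long[capacity];
        Arrays.fill(stamps, -1L);  // -1 marks "no frame recorded"
    }

    public void setTouching(boolean down) { touching = down; }

    /** Called once per preview frame; stores it only while touching. */
    public boolean offer(byte[] frameData, long timestampMicros) {
        if (!touching) return false;         // finger up: drop the frame
        int slot = index++ % ring.length;    // overwrite oldest when full
        ring[slot] = frameData;
        stamps[slot] = timestampMicros;
        return true;
    }

    public int framesStored() { return Math.min(index, ring.length); }
}
```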

    The issue is that if I set the timestamp as in the code snippet below (in the stopRecording method)

    if (t > recorder.getTimestamp())
    {
       recorder.setTimestamp(t);
    }

    the behavior is as below

    Case 1

    If I tap the screen to record for 2 seconds, take my finger off the screen for 3 seconds, and then place it back on the screen to record for another 4 seconds, the resulting video is as follows:

    For the first 2 seconds, the video has recorded content. For the next 3 seconds (while the finger was off the screen), the video just shows the last frame that was recorded before the finger was lifted. Then the video has recorded content for the next 4 seconds. So there seems to be an issue in handling recording while the finger is off the screen.

    Case 2

    Next, I removed the code that sets the timestamp on the recorder (the snippet given above) from the stopRecording method.

    Now the resulting video (for the same steps as in case 1) does not contain the middle 3 seconds when the finger was off the screen, which is what I want. But the video plays at a faster rate. So it seems the timestamp needs to be set so that the video plays at the normal rate.
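    The pause-removal idea can be sketched outside Android: if the total paused time accumulated so far is subtracted from every captured timestamp, the idle gaps disappear while the spacing between frames (and therefore playback speed) is preserved. The class and method names below are invented for this sketch, assuming ~30 fps frames and microsecond timestamps.

```java
import java.util.ArrayList;
import java.util.List;

// Illustration of the timestamp handling discussed above: subtracting the
// accumulated pause time from each captured timestamp removes the idle
// gaps while keeping the original frame spacing, so playback speed stays
// normal. All names here are invented for the sketch.
public class PauseAwareTimestamps {

    private static final long FRAME_INTERVAL_MICROS = 33_333; // ~30 fps

    /**
     * Each segment is {startMicros, endMicros} of finger-down recording.
     * Returns one rebased timestamp per frame, with pauses removed.
     */
    public static List<Long> rebase(List<long[]> segments) {
        List<Long> out = new ArrayList<>();
        long pausedSoFar = 0;
        long prevEnd = -1;
        for (long[] seg : segments) {
            if (prevEnd >= 0) {
                pausedSoFar += seg[0] - prevEnd; // gap between segments = pause
            }
            for (long t = seg[0]; t < seg[1]; t += FRAME_INTERVAL_MICROS) {
                out.add(t - pausedSoFar);        // monotonic, gap-free timestamp
            }
            prevEnd = seg[1];
        }
        return out;
    }
}
```

    With two touch segments at 0–2 s and 5–9 s, the rebased timestamps run continuously from 0 to about 6 s, with no 3-second hole and no speed-up.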

    The full code of my activity is given below. (Note that video recording is mainly handled in the onPreviewFrame and stopRecording methods.)

    public class TouchToRecordActivity extends Activity implements OnClickListener, View.OnTouchListener {

    private final static String CLASS_LABEL = "TouchToRecordActivity";
    private final static String LOG_TAG = CLASS_LABEL;

    private String ffmpeg_link = "/mnt/sdcard/stream.mp4";

    long startTime = 0;
    boolean recording = false;
    boolean rec = false;

    private FFmpegFrameRecorder recorder;

    private boolean isPreviewOn = false;

    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 640;
    private int imageHeight = 480;
    private int destWidth = 480;
    private int frameRate = 30;

    /* audio data getting thread */
    private AudioRecord audioRecord;
    private AudioRecordRunnable audioRecordRunnable;
    private Thread audioThread;
    volatile boolean runAudioThread = true;

    /* video data getting thread */
    private Camera cameraDevice;
    private CameraView cameraView;

    private Frame yuvImage = null;

    /* layout setting */
    private final int bg_screen_bx = 232;
    private final int bg_screen_by = 128;
    private final int bg_screen_width = 700;
    private final int bg_screen_height = 500;
    private final int bg_width = 1123;
    private final int bg_height = 715;
    private final int live_width = 640;
    private final int live_height = 480;
    private int screenWidth, screenHeight;
    private Button btnRecorderControl;

    /* The number of seconds in the continuous record loop (or 0 to disable loop). */
    final int RECORD_LENGTH = 20;
    Frame[] images;
    long[] timestamps;
    ShortBuffer[] samples;
    int imagesIndex, samplesIndex;

    long firstTime = 0;
    long startPauseTime = 0;
    long totalPauseTime = 0;
    long pausedTime = 0;
    long stopPauseTime = 0;
    long totalTime = 0;

    long totalRecordedTS = 0;

    private TextView txtTimer;
    private Handler mHandler = new Handler();

    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

       setContentView(R.layout.touch_main);

       initLayout();
    }

    @Override
    protected void onDestroy() {
       super.onDestroy();

       recording = false;

       if (cameraView != null) {
           cameraView.stopPreview();
       }

       if (cameraDevice != null) {
           cameraDevice.stopPreview();
           cameraDevice.release();
           cameraDevice = null;
       }
    }


    private void initLayout() {

       /* get size of screen */
       Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
       screenWidth = display.getWidth();
       screenHeight = display.getHeight();
       RelativeLayout.LayoutParams layoutParam = null;
       LayoutInflater myInflate = null;
       myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
       RelativeLayout topLayout = new RelativeLayout(this);
       setContentView(topLayout);
       LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.touch_main, null);
       layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
       topLayout.addView(preViewLayout, layoutParam);

       txtTimer = (TextView) preViewLayout.findViewById(R.id.txtTimer);

       /* add control button: start and stop */
       btnRecorderControl = (Button) findViewById(R.id.recorder_control);
       btnRecorderControl.setText("Start");
       btnRecorderControl.setOnClickListener(this);

       /* add camera view */
       int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
       int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
       int prev_rw, prev_rh;
       if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
           prev_rh = display_height_d;
           prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
       } else {
           prev_rw = display_width_d;
           prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
       }
       layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
       layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
       layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);

       cameraDevice = Camera.open();
       Log.i(LOG_TAG, "camera open");
       cameraView = new CameraView(this, cameraDevice);
       topLayout.addView(cameraView, layoutParam);
       topLayout.setOnTouchListener(this);
       Log.i(LOG_TAG, "camera preview start: OK");
    }

    //---------------------------------------
    // initialize ffmpeg_recorder
    //---------------------------------------
    private void initRecorder() {

       Log.w(LOG_TAG, "init recorder");

       if (RECORD_LENGTH > 0) {
           imagesIndex = 0;
           images = new Frame[RECORD_LENGTH * frameRate];
           timestamps = new long[images.length];
           for (int i = 0; i < images.length; i++) {
               images[i] = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
               timestamps[i] = -1;
           }
       } else if (yuvImage == null) {
           yuvImage = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
           Log.i(LOG_TAG, "create yuvImage");
       }
       Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
       recorder = new FFmpegFrameRecorder(ffmpeg_link, destWidth, imageHeight, 1);
       recorder.setFormat("mp4");
       recorder.setVideoCodecName("libx264");
       recorder.setSampleRate(sampleAudioRateInHz);
       // Set in the surface changed method
       recorder.setFrameRate(frameRate);

       Log.i(LOG_TAG, "recorder initialize success");

       audioRecordRunnable = new AudioRecordRunnable();
       audioThread = new Thread(audioRecordRunnable);
       runAudioThread = true;
    }

    public void startRecording() {

       initRecorder();

       mHandler.removeCallbacks(mUpdateTimeTask);
       mHandler.postDelayed(mUpdateTimeTask, 100);

       try {
           recorder.start();
           startTime = System.currentTimeMillis();
           recording = true;
           audioThread.start();

       } catch (FFmpegFrameRecorder.Exception e) {
           e.printStackTrace();
       }
    }

    public void stopRecording() {

       runAudioThread = false;
       try {
           audioThread.join();
       } catch (InterruptedException e) {
           e.printStackTrace();
       }
       audioRecordRunnable = null;
       audioThread = null;

       if (recorder != null && recording) {
           if (RECORD_LENGTH > 0) {
               Log.v(LOG_TAG, "Writing frames");
               try {
                    int firstIndex = imagesIndex % images.length;
                   int lastIndex = (imagesIndex - 1) % images.length;
                   if (imagesIndex <= images.length) {
                       firstIndex = 0;
                       lastIndex = imagesIndex - 1;
                   }
                   if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                       startTime = 0;
                   }
                   if (lastIndex < firstIndex) {
                       lastIndex += images.length;
                   }
                   int videoCounter = 0;
                   for (int i = firstIndex; i <= lastIndex; i++) {
                       if (timestamps[i] == -1) {
                           Log.v(LOG_TAG, "frame not recorded");
                       }
                       if (timestamps[i] != -1) {
                           long t = timestamps[i % timestamps.length] - startTime;
                           if (t >= 0) {

                               videoCounter++;

                               /*if (((i % images.length) != 0) && images[i % images.length] != images[(i % images.length) - 1]) {
                                   if (t > recorder.getTimestamp()) {
                                       recorder.setTimestamp(t);
                                   }*/
                                   Log.v(LOG_TAG, "imageIndex=" + (i % images.length));
                                   recorder.record(images[i % images.length]);
                           /*    }*/
                               Log.v(LOG_TAG, "videoCounter=" + videoCounter);
                           }
                       }
                   }

                   firstIndex = samplesIndex % samples.length;
                   lastIndex = (samplesIndex - 1) % samples.length;
                   if (samplesIndex <= samples.length) {
                       firstIndex = 0;
                       lastIndex = samplesIndex - 1;
                   }
                   if (lastIndex < firstIndex) {
                       lastIndex += samples.length;
                   }
                   for (int i = firstIndex; i <= lastIndex; i++) {
                       if (timestamps[i] != -1) {
                           recorder.recordSamples(samples[i % samples.length]);
                       }
                   }
               } catch (FFmpegFrameRecorder.Exception e) {
                   Log.v(LOG_TAG, e.getMessage());
                   e.printStackTrace();
               }
           }

           recording = false;
           Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
           try {
               recorder.stop();
               recorder.release();
           } catch (FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
           recorder = null;

       }
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {

       if (keyCode == KeyEvent.KEYCODE_BACK) {
           if (recording) {
               stopRecording();
           }

           finish();

           return true;
       }

       return super.onKeyDown(keyCode, event);
    }

    @Override
    public boolean onTouch(View view, MotionEvent motionEvent) {
       switch (motionEvent.getAction()) {
           case MotionEvent.ACTION_DOWN:
               Log.v(LOG_TAG, "ACTION_DOWN" + recording);

               if (!recording) {
                   startRecording();
               } else {
                   stopPauseTime = System.currentTimeMillis();
                   totalPauseTime = stopPauseTime - startPauseTime - ((long) (1.0 / (double) frameRate) * 1000);
                   pausedTime += totalPauseTime;
               }
               rec = true;
               setTotalVideoTime();
               btnRecorderControl.setText(getResources().getString(R.string.stop));
               break;
           case MotionEvent.ACTION_MOVE:
               rec = true;
               setTotalVideoTime();
               break;
           case MotionEvent.ACTION_UP:
               Log.v(LOG_TAG, "ACTION_UP");
               rec = false;
               startPauseTime = System.currentTimeMillis();
               break;
       }
       return true;
    }

    private Runnable mUpdateTimeTask = new Runnable() {
       public void run() {
           if (recording) {
               setTotalVideoTime();
           }
           mHandler.postDelayed(this, 500);
       }
    };

    private synchronized void setTotalVideoTime() {
       totalTime = System.currentTimeMillis() - firstTime - pausedTime - ((long) (1.0 / (double) frameRate) * 1000);
       if (totalTime > 0)
           txtTimer.setText(getRecordingTimeFromMillis(totalTime));
    }

    private String getRecordingTimeFromMillis(long millis) {
       String strRecordingTime = null;
       int seconds = (int) (millis / 1000);
       int minutes = seconds / 60;
       int hours = minutes / 60;

       if (hours >= 0 && hours < 10)
           strRecordingTime = "0" + hours + ":";
       else
           strRecordingTime = hours + ":";

       if (hours > 0)
           minutes = minutes % 60;

       if (minutes >= 0 && minutes < 10)
           strRecordingTime += "0" + minutes + ":";
       else
           strRecordingTime += minutes + ":";

       seconds = seconds % 60;

       if (seconds >= 0 && seconds < 10)
           strRecordingTime += "0" + seconds;
       else
           strRecordingTime += seconds;

       return strRecordingTime;

    }


    //---------------------------------------------
    // audio thread, gets and encodes audio data
    //---------------------------------------------
    class AudioRecordRunnable implements Runnable {

       @Override
       public void run() {
           android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

           // Audio
           int bufferSize;
           ShortBuffer audioData;
           int bufferReadResult;

           bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                   AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
           audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                   AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

           if (RECORD_LENGTH > 0) {
               samplesIndex = 0;
               samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
               for (int i = 0; i < samples.length; i++) {
                   samples[i] = ShortBuffer.allocate(bufferSize);
               }
           } else {
               audioData = ShortBuffer.allocate(bufferSize);
           }

           Log.d(LOG_TAG, "audioRecord.startRecording()");
           audioRecord.startRecording();

           /* ffmpeg_audio encoding loop */
           while (runAudioThread) {
               if (RECORD_LENGTH > 0) {
                   audioData = samples[samplesIndex++ % samples.length];
                   audioData.position(0).limit(0);
               }
               //Log.v(LOG_TAG,"recording? " + recording);
               bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
               audioData.limit(bufferReadResult);
               if (bufferReadResult > 0) {
                   Log.v(LOG_TAG, "bufferReadResult: " + bufferReadResult);
                    // If "recording" isn't true when this thread starts, it never gets set according to this if statement...
                    // Why? Good question...
                   if (recording && rec) {
                       Log.v(LOG_TAG, "Recording audio");
                       if (RECORD_LENGTH <= 0) try {
                           recorder.recordSamples(audioData);
                           //Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                       } catch (FFmpegFrameRecorder.Exception e) {
                           Log.v(LOG_TAG, e.getMessage());
                           e.printStackTrace();
                       }
                   }
               }
           }
           Log.v(LOG_TAG, "AudioThread Finished, release audioRecord");

           /* encoding finish, release recorder */
           if (audioRecord != null) {
               audioRecord.stop();
               audioRecord.release();
               audioRecord = null;
               Log.v(LOG_TAG, "audioRecord released");
           }
       }
    }

    //---------------------------------------------
    // camera thread, gets and encodes video data
    //---------------------------------------------
    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

       private SurfaceHolder mHolder;
       private Camera mCamera;

       public CameraView(Context context, Camera camera) {
           super(context);
           Log.w("camera", "camera view");
           mCamera = camera;
           mHolder = getHolder();
           mHolder.addCallback(CameraView.this);
           mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
           mCamera.setPreviewCallback(CameraView.this);
       }

       @Override
       public void surfaceCreated(SurfaceHolder holder) {
           try {
               stopPreview();
               mCamera.setPreviewDisplay(holder);
           } catch (IOException exception) {
               mCamera.release();
               mCamera = null;
           }
       }

       public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
           stopPreview();

           Camera.Parameters camParams = mCamera.getParameters();
            List<Camera.Size> sizes = camParams.getSupportedPreviewSizes();
            // Sort the supported preview sizes in ascending order of area
            Collections.sort(sizes, new Comparator<Camera.Size>() {

                public int compare(final Camera.Size a, final Camera.Size b) {
                    return a.width * a.height - b.width * b.height;
                }
            });

           camParams.setPreviewSize(imageWidth, imageHeight);

           Log.v(LOG_TAG, "Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);

           camParams.setPreviewFrameRate(frameRate);
           Log.v(LOG_TAG, "Preview Framerate: " + camParams.getPreviewFrameRate());

           mCamera.setParameters(camParams);

            List<Camera.Size> videoSizes = mCamera.getParameters().getSupportedVideoSizes();

           // Set the holder (which might have changed) again
           try {
               mCamera.setPreviewDisplay(holder);
               mCamera.setPreviewCallback(CameraView.this);
               startPreview();
           } catch (Exception e) {
               Log.e(LOG_TAG, "Could not set preview display in surfaceChanged");
           }
       }

       @Override
       public void surfaceDestroyed(SurfaceHolder holder) {
           try {
               mHolder.addCallback(null);
               mCamera.setPreviewCallback(null);
           } catch (RuntimeException e) {
               // The camera has probably just been released, ignore.
           }
       }

       public void startPreview() {
           if (!isPreviewOn && mCamera != null) {
               isPreviewOn = true;
               mCamera.startPreview();
           }
       }

       public void stopPreview() {
           if (isPreviewOn && mCamera != null) {
               isPreviewOn = false;
               mCamera.stopPreview();
           }
       }

       @Override
       public void onPreviewFrame(byte[] data, Camera camera) {
           if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
               startTime = System.currentTimeMillis();
               return;
           }
           if (RECORD_LENGTH > 0) {
               int i = imagesIndex++ % images.length;
               Log.v(LOG_TAG, "recording:" + recording + "rec:" + rec);
               if (recording && rec) {
                   yuvImage = images[i];
                   timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
                   totalRecordedTS++;
               } else {
                   Log.v(LOG_TAG, "recording is paused");
                   yuvImage = null;
                   timestamps[i] = -1;
               }
           }

           /* get video data */
           if (yuvImage != null && recording && rec) {
               if (data.length != imageWidth * imageHeight) {
                   Camera.Size sz = camera.getParameters().getPreviewSize();
                   imageWidth = sz.width;
                   imageHeight = sz.height;
                   destWidth = imageHeight;
                   Log.v(LOG_TAG, "data length:" + data.length);
               }

               ByteBuffer bb = (ByteBuffer) yuvImage.image[0].position(0); // resets the buffer
               int start = 2 * ((imageWidth - destWidth) / 4); // this must be even
               for (int row = 0; row < imageHeight * 3 / 2; row++) {
                   bb.put(data, start, destWidth);
                   start += imageWidth;
               }

           }
       }
    }

    @Override
    public void onClick(View v) {
       if (!recording) {
           startRecording();
           Log.w(LOG_TAG, "Start Button Pushed");
           btnRecorderControl.setText("Stop");
       } else {
           // This will trigger the audio recording loop to stop and then set isRecorderStart = false;
           stopRecording();
           Log.w(LOG_TAG, "Stop Button Pushed");
           btnRecorderControl.setText("Start");
       }
    }
}
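The row-by-row crop in onPreviewFrame can be exercised in isolation. Below is a minimal sketch in plain Java (the class and helper names are hypothetical, and the row count is passed in directly rather than derived from imageHeight * 3 / 2):

```java
import java.util.Arrays;

// Sketch of the center-crop copy done in onPreviewFrame: each preview row of
// imageWidth bytes contributes destWidth bytes to the output, starting at an
// even offset, exactly as in the loop above.
public class CropSketch {

    static byte[] cropRows(byte[] data, int imageWidth, int destWidth, int rows) {
        byte[] out = new byte[destWidth * rows];
        int start = 2 * ((imageWidth - destWidth) / 4); // even start offset, as in the question
        int pos = 0;
        for (int row = 0; row < rows; row++) {
            System.arraycopy(data, start, out, pos, destWidth);
            pos += destWidth;
            start += imageWidth; // jump to the same offset in the next row
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] data = {0, 1, 2, 3, 4, 5, 6, 7}; // two rows of width 4
        System.out.println(Arrays.toString(cropRows(data, 4, 2, 2))); // [0, 1, 4, 5]
    }
}
```

Running the helper on a tiny two-row frame makes it easy to verify that the copy keeps destWidth bytes per row and skips the rest.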

    Changes made as per Alex Cohn’s suggestions

    Suggestion 1 - Estimate average frame rate

       public void stopRecording() {

      ..............................

                               if (((i % images.length) != 0) && images[i % images.length] != images[(i % images.length) - 1]) {
                                   if (t > recorder.getTimestamp()) {
                                       t += 1000000 / frameRate;
                                       recorder.setTimestamp(t);
                                   }

                                   recorder.record(images[i % images.length]);
                               }
                ..........................................


    }

    The change made was adding `t += 1000000 / frameRate;`, but this caused the video to freeze (as in case 1 described above) during the portions where the finger was lifted off the screen.
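The freeze follows from the arithmetic alone: with a fixed increment of 1000000 / frameRate microseconds per frame, N buffered frames always span exactly N frame intervals no matter how long capture was actually paused, so the paused stretch plays back as repeated frozen content rather than skipped time. A numeric sketch (frameRate = 30 assumed purely for illustration):

```java
public class FixedIncrementSketch {
    public static void main(String[] args) {
        int frameRate = 30;  // assumed frame rate for illustration
        long t = 0;
        // 90 frames captured across a session that also contained a long pause:
        // the generated timestamps still only span 90 frame intervals (~3 s),
        // so the pause is not reflected in the timeline at all.
        for (int i = 0; i < 90; i++) {
            t += 1000000 / frameRate;
        }
        System.out.println(t); // 2999970 microseconds, i.e. just under 3 s
    }
}
```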

    Suggestion 2 - Modification in onPreviewFrame()

    long[] timestampsForRecorder;
    private void initRecorder() {

       Log.w(LOG_TAG, "init recorder");

       if (RECORD_LENGTH > 0) {
          .......................................................
           timestampsForRecorder = new long[images.length];
           for (int i = 0; i < images.length; i++) {
               images[i] = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
               timestamps[i] = -1;
               timestampsForRecorder[i] = -1;
           }
       } else if (yuvImage == null) {
           yuvImage = new Frame(destWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
           Log.i(LOG_TAG, "create yuvImage");
       }
       ...................................................
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
           if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
               startTime = SystemClock.elapsedRealtime();
               return;
           }
           if (RECORD_LENGTH > 0) {
               int i = imagesIndex++ % images.length;
               Log.v(LOG_TAG, "recording:" + recording + "rec:" + rec);
               if (recording && rec) {
                   yuvImage = images[i];
                   long thisFrameTime = SystemClock.elapsedRealtime();
                   timestamps[i] = thisFrameTime;
                   long lastFrameTime = timestamps[(int) (imagesIndex == 0 ? startTime : ((imagesIndex-1) % images.length))];
                   Log.v(LOG_TAG, "lastFrameTime:" + lastFrameTime+",stopPauseTime:" + stopPauseTime);
                   if (lastFrameTime > stopPauseTime) {
                       timestampsForRecorder[i] = 1000 * (thisFrameTime - Math.max(stopPauseTime, lastFrameTime));
                   }
               }
           }

          .....................................................
       }

    public void stopRecording() {

       .......................................................

       if (recorder != null && recording) {
           if (RECORD_LENGTH > 0) {
               Log.v(LOG_TAG, "Writing frames");
               try {
                    int firstIndex = imagesIndex % images.length;
                   int lastIndex = (imagesIndex - 1) % images.length;
                   if (imagesIndex <= images.length) {
                       firstIndex = 0;
                       lastIndex = imagesIndex - 1;
                   }
                   if ((startTime = timestampsForRecorder[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                       startTime = 0;
                   }
                   if (lastIndex < firstIndex) {
                       lastIndex += images.length;
                   }
                   for (int i = firstIndex; i <= lastIndex; i++) {

                       if (timestampsForRecorder[i] != -1) {
                           long t = timestampsForRecorder[i % timestampsForRecorder.length] - startTime;
                           if (t >= 0) {

                               if (((i % images.length) != 0) && images[i % images.length] != images[(i % images.length) - 1]) {
                                   if (t > recorder.getTimestamp()) {
                                       recorder.setTimestamp(t);
                                   }
                                   Log.v(LOG_TAG, "imageIndex=" + (i % images.length));
                                   recorder.record(images[i % images.length]);
                               }
                           }
                       }
                   }
                   .............................................
               } catch (FFmpegFrameRecorder.Exception e) {
                  .................................
               }
           }

           ...........................................

       }
    }

    The video recorded using this change had the issue described in case 2 above, i.e., it played back at a faster rate.
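The circular-buffer replay in stopRecording() depends on getting firstIndex and lastIndex right, and that index arithmetic can be checked on its own. A sketch (class and method names are hypothetical) of the bounds computation used above:

```java
import java.util.Arrays;

// Sketch of the replay bounds used in stopRecording(): before the buffer wraps,
// replay runs from slot 0 to the newest frame; after it wraps, replay starts at
// the oldest slot and lastIndex is unwrapped so a single ascending loop works
// with i % length.
public class ReplayBoundsSketch {

    static int[] replayBounds(int imagesIndex, int length) {
        int firstIndex = imagesIndex % length;
        int lastIndex = (imagesIndex - 1) % length;
        if (imagesIndex <= length) {  // buffer never wrapped
            firstIndex = 0;
            lastIndex = imagesIndex - 1;
        }
        if (lastIndex < firstIndex) { // unwrap for a single ascending loop
            lastIndex += length;
        }
        return new int[]{firstIndex, lastIndex};
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(replayBounds(5, 10)));  // [0, 4]  (not wrapped)
        System.out.println(Arrays.toString(replayBounds(13, 10))); // [3, 12] (wrapped, unwrapped end)
    }
}
```

In the wrapped case, iterating i from 3 to 12 and indexing with i % 10 visits slots 3..9 then 0..2, i.e. oldest frame to newest, which is the ordering the replay loop relies on.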