Advanced search

Media (0)

Keyword: - Tags -/organisation

No media matching your criteria is available on this site.

Other articles (63)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administer" section of the site.
    From there, the navigation menu gives access to a "Language management" section where support for new languages can be enabled.
    Each newly added language can still be deactivated as long as no object has been created in that language; once one has, it becomes greyed out in the configuration and (...)

  • Use it, talk about it, critique it

    10 April 2011

    The first thing to do is to talk about it, either directly with the people involved in its development or with people around you, to convince new people to use it.
    The larger the community, the faster it will evolve ...
    A discussion list is available for any exchange between users.

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add your own via the form at the bottom of the page.

On other sites (5460)

  • Real-time playback of two blended videos with alpha channel and synced audio in pygame?

    22 December 2024, by Francesco Calderone

    I need to play two videos with synced sound in real time with Pygame.
Pygame does not currently support video streams, so I am using an FFmpeg subprocess.
The first video is a ProRes 422 HQ background video with no alpha channel.
The second video is a ProRes 4444 overlay with an alpha channel; it needs to be played in real time on top of the first video (with transparency).
All of this needs synced sound from the first (base) video only.

    


    I have tried many libraries, including pymovie, pyav, and opencv. The best result so far has been a subprocess running ffmpeg.

    


    ffmpeg -i testing/stefano_prores422_hq.mov -stream_loop -1 -i testing/key_prores4444.mov -filter_complex "[1:v]format=rgba,colorchannelmixer=aa=1.0[overlay];[0:v][overlay]overlay" -f nut pipe:1 | ffplay -

    


    When running this in the terminal and playing through ffplay, everything is perfect: the overlay looks right, no frames are dropped, and the sound is in sync.

    


    However, feeding that to pygame via a subprocess produces either video delay and dropped frames, or audio that is out of sync.

    


    EXAMPLE ONE:

    


    # SOUND IS NOT SYNCHED - sound is played via ffplay
import pygame
import subprocess
import numpy as np
import sys

def main():
    pygame.init()
    screen_width, screen_height = 1920, 1080
    screen = pygame.display.set_mode((screen_width, screen_height))
    pygame.display.set_caption("PyGame + FFmpeg Overlay with Audio")
    clock = pygame.time.Clock()

    # LAUNCH AUDIO-ONLY SUBPROCESS
    audio_cmd = [
        "ffplay",
        "-nodisp",          # no video window
        "-autoexit",        # exit when video ends
        "-loglevel", "quiet",
        "testing/stefano_prores422_hq.mov"
    ]
    audio_process = subprocess.Popen(audio_cmd)

    # LAUNCH VIDEO-OVERLAY SUBPROCESS
    ffmpeg_command = [
        "ffmpeg",
        "-i", "testing/stefano_prores422_hq.mov",
        "-stream_loop", "-1",         # loop alpha video
        "-i", "testing/key_prores4444.mov",
        "-filter_complex",
        "[1:v]format=rgba,colorchannelmixer=aa=1.0[overlay];"  # ensure alpha channel
        "[0:v][overlay]overlay",      # overlay second input onto first
        "-f", "rawvideo",             # output raw video
        "-pix_fmt", "rgba",           # RGBA format
        "pipe:1"                      # write to STDOUT
    ]
    video_process = subprocess.Popen(
        ffmpeg_command,
        stdout=subprocess.PIPE,
        stderr=subprocess.DEVNULL
    )
    frame_size = screen_width * screen_height * 4  # RGBA = 4 bytes/pixel
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
                break

        raw_frame = video_process.stdout.read(frame_size)

        if len(raw_frame) < frame_size:
            running = False
            break
        # Convert raw bytes -> NumPy array -> PyGame surface
        frame_array = np.frombuffer(raw_frame, dtype=np.uint8)
        frame_array = frame_array.reshape((screen_height, screen_width, 4))
        frame_surface = pygame.image.frombuffer(frame_array.tobytes(), 
                                                (screen_width, screen_height), 
                                                "RGBA")
        screen.blit(frame_surface, (0, 0))
        pygame.display.flip()
        clock.tick(25)
    video_process.terminate()
    video_process.wait()
    audio_process.terminate()
    audio_process.wait()
    pygame.quit()
    sys.exit()

if __name__ == "__main__":
    main()



    


    EXAMPLE TWO:

    


    # NO VIDEO OVERLAY - SOUND SYNCHED
import ffmpeg
import pygame
import sys
import numpy as np
import tempfile
import os

def extract_audio(input_file, output_file):
    """Extract audio from video file to temporary WAV file"""
    (
        ffmpeg
        .input(input_file)
        .output(output_file, acodec='pcm_s16le', ac=2, ar='44100')
        .overwrite_output()
        .run(capture_stdout=True, capture_stderr=True)
    )

def get_video_fps(input_file):
    probe = ffmpeg.probe(input_file)
    video_info = next(s for s in probe['streams'] if s['codec_type'] == 'video')
    fps_str = video_info.get('r_frame_rate', '25/1')
    num, den = map(int, fps_str.split('/'))
    return num / den

input_file = "testing/stefano_prores422_hq.mov"

# Create temporary WAV file
temp_audio = tempfile.NamedTemporaryFile(suffix='.wav', delete=False)
temp_audio.close()
extract_audio(input_file, temp_audio.name)

probe = ffmpeg.probe(input_file)
video_info = next(s for s in probe['streams'] if s['codec_type'] == 'video')
width = int(video_info['width'])
height = int(video_info['height'])
fps = get_video_fps(input_file)

process = (
    ffmpeg
    .input(input_file)
    .output('pipe:', format='rawvideo', pix_fmt='rgb24')
    .run_async(pipe_stdout=True)
)

pygame.init()
pygame.mixer.init(frequency=44100, size=-16, channels=2, buffer=4096)
clock = pygame.time.Clock()
screen = pygame.display.set_mode((width, height))

pygame.mixer.music.load(temp_audio.name)
pygame.mixer.music.play()

frame_count = 0
start_time = pygame.time.get_ticks()

while True:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            pygame.mixer.music.stop()
            os.unlink(temp_audio.name)
            sys.exit()

    in_bytes = process.stdout.read(width * height * 3)
    if not in_bytes:
        break

    # Calculate timing for synchronization
    expected_frame_time = frame_count * (1000 / fps)
    actual_time = pygame.time.get_ticks() - start_time
    
    if actual_time < expected_frame_time:
        pygame.time.wait(int(expected_frame_time - actual_time))
    
    in_frame = (
        np.frombuffer(in_bytes, dtype="uint8")
        .reshape([height, width, 3])
    )
    out_frame = pygame.surfarray.make_surface(np.transpose(in_frame, (1, 0, 2)))
    screen.blit(out_frame, (0, 0))
    pygame.display.flip()
    
    frame_count += 1

pygame.mixer.music.stop()
process.wait()
pygame.quit()
os.unlink(temp_audio.name)


    


    I also tried using the pygame mixer with a separate mp3 audio file, but that didn't work either. Any help on how to sync the sound while keeping playback of both videos at 25 FPS would be greatly appreciated!
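
    One direction that may be worth trying (a minimal, untested sketch, assuming the same test files as above, a 1920x1080 25 fps output, and the base video's audio extracted beforehand to a hypothetical audio.wav, e.g. ffmpeg -i testing/stefano_prores422_hq.mov -vn audio.wav): treat the audio as the master clock. Instead of pacing the loop with clock.tick(25), ask pygame.mixer.music.get_pos() which frame is currently due, and read (or discard) frames from the ffmpeg pipe until the video catches up, so a late frame re-synchronizes instead of accumulating drift.

    # SKETCH - the audio is the master clock; the video loop chases it
import subprocess
import pygame

WIDTH, HEIGHT, FPS = 1920, 1080, 25
FRAME_BYTES = WIDTH * HEIGHT * 4  # RGBA = 4 bytes/pixel

pygame.init()
screen = pygame.display.set_mode((WIDTH, HEIGHT))
pygame.mixer.init(frequency=44100)
pygame.mixer.music.load("audio.wav")  # hypothetical pre-extracted audio track

video = subprocess.Popen([
    "ffmpeg",
    "-i", "testing/stefano_prores422_hq.mov",
    "-stream_loop", "-1", "-i", "testing/key_prores4444.mov",
    "-filter_complex",
    "[1:v]format=rgba,colorchannelmixer=aa=1.0[ov];[0:v][ov]overlay",
    "-f", "rawvideo", "-pix_fmt", "rgba", "pipe:1",
], stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)

pygame.mixer.music.play()
frame_index = 0
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    # get_pos() returns how long the music has been playing, in milliseconds
    # (and -1 once it stops, after which the loop simply idles).
    due = int(pygame.mixer.music.get_pos() * FPS / 1000)

    raw = None
    while frame_index <= due:       # behind the audio? read frames to catch up
        raw = video.stdout.read(FRAME_BYTES)
        frame_index += 1
        if len(raw) < FRAME_BYTES:  # pipe closed: stop cleanly
            running = False
            break

    if raw and len(raw) == FRAME_BYTES:
        frame = pygame.image.frombuffer(raw, (WIDTH, HEIGHT), "RGBA")
        screen.blit(frame, (0, 0))
        pygame.display.flip()
    else:
        pygame.time.wait(2)         # ahead of the audio: yield briefly

video.terminate()
video.wait()
pygame.quit()

    The key point is only that the video loop chases a clock it does not own; the same pattern works with ffplay-driven audio if the loop measures elapsed time from the moment the audio process starts.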

    


  • How to crop the visible portion of a landscape video in Android?

    30 August 2016, by Deepak
    1. I am working on panning and cropping a landscape video using a TextureView. I am halfway there: I can pan the landscape video from left to right and back using this example:
      https://github.com/crust87/Android-VideoCropView.


    2. FFmpeg can crop a particular portion of the video using this command:
      ffmpeg -i /sdcard/videokit/in.mp4 -filter:v crop=720:1088:0:0 -c:a
      copy /sdcard/videokit/out.mp4

    How can I crop only the portion of the video that is visible in the TextureView and save it to local storage on Android?

    crop=720:1088:0:0 hard-codes the width and height, and that crops fine. But how can I get the width and height of the video visible in the TextureView, so I can crop exactly that region and save it to local storage? (A sketch of one possible mapping follows the FFmpeg command near the end of this post.)

    public class MainActivity extends Activity {

       // Layout Components
       private FrameLayout top_frame;

       // Attributes
       private String originalPath;
       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.check);
           top_frame = (FrameLayout)findViewById(R.id.top_frame);
       }

       @Override
       protected void onActivityResult(int requestCode, int resultCode, Intent data) {
           if (requestCode == 1000 && resultCode == RESULT_OK) {
               final VideoCropView mVideoCropView = new VideoCropView(this);
               mVideoCropView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {

                   @Override
                   public void onPrepared(MediaPlayer mp) {
                       mVideoCropView.start();
                   }
               });
               top_frame.addView(mVideoCropView);
               Uri selectedVideoUri = data.getData();

               originalPath = getRealPathFromURI(selectedVideoUri);

               mVideoCropView.setVideoURI(selectedVideoUri);

               mVideoCropView.seekTo(1);
           }
       }

       public void onButtonLoadClick(View v) {
           top_frame.removeAllViews();
           Intent lIntent = new Intent(Intent.ACTION_PICK);
           lIntent.setType("video/*");
           lIntent.addFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP);
           startActivityForResult(lIntent, 1000);
       }

       public String getRealPathFromURI(Uri contentUri) {    // getting image path from gallery.
           Cursor cursor = null;
           try {
               String[] proj = { MediaStore.Images.Media.DATA };
               cursor = getApplicationContext().getContentResolver().query(contentUri, proj, null, null, null);
               int column_index = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DATA);
               cursor.moveToFirst();
               return cursor.getString(column_index);
           } finally {
               if (cursor != null) {
                   cursor.close();
               }
           }
       }

    }

    CropVideoView

    public class VideoCropView extends TextureView implements MediaPlayerControl {
       // Constants
       private static final String LOG_TAG = "VideoCropView";
       private static final int STATE_ERROR = -1;
       private static final int STATE_IDLE = 0;
       private static final int STATE_PREPARING = 1;
       private static final int STATE_PREPARED = 2;
       private static final int STATE_PLAYING = 3;
       private static final int STATE_PAUSED = 4;
       private static final int STATE_PLAYBACK_COMPLETED = 5;

       // MediaPlayer Components
       protected Context mContext;
       private MediaPlayer mMediaPlayer;
       private Surface mSurface;
       private OnInfoListener mOnInfoListener;
       private OnCompletionListener mOCompletionListener;
       private OnErrorListener mOnErrorListener;
       private OnPreparedListener mOnPreparedListener;
       private OnTranslatePositionListener mOnTranslatePositionListener;

       // CropView Components
       private Matrix mMatrix;

       // MediaPlayer Attributes
       protected Uri mUri;
       private int mCurrentBufferPercentage;
       private int mSeekWhenPrepared;
       protected int mVideoWidth;
       protected int mVideoHeight;

       // CropView Attributes
       private float mRatioWidth;
       private float mRatioHeight;
       private float mPositionX;
       private float mPositionY;
       private float mBoundX;
       private float mBoundY;
       private int mRotate;
       private float mScaleX;
       private float mScaleY;
       private float mScale;

       // Working Variables
       private int mCurrentState = STATE_IDLE;
       private int mTargetState = STATE_IDLE;

       // Touch Event
       // past position x, y and move point
       float mPastX;
       float mPastY;
       float mTouchDistance;
       private Context context;

       // Constructors
       public VideoCropView(final Context context) {
           super(context);
           mContext = context;

           initAttributes();
           initVideoView();
       }

       public VideoCropView(final Context context, final AttributeSet attrs) {
           super(context, attrs);
           mContext = context;

           initAttributes(context, attrs, 0);
           initVideoView();
       }

       public VideoCropView(Context context, AttributeSet attrs, int defStyleAttr) {
           super(context, attrs, defStyleAttr);
           mContext = context;

           initAttributes(context, attrs, defStyleAttr);
           initVideoView();
       }

       private void initAttributes() {
           mRatioWidth = 1;
           mRatioHeight = 1;
       }

       private void initAttributes(Context context, AttributeSet attrs, int defStyleAttr) {
           TypedArray typedArray = context.obtainStyledAttributes(attrs, R.styleable.VideoCropView, defStyleAttr, 0);

           mRatioWidth = typedArray.getInteger(R.styleable.VideoCropView_ratio_width, 3);
           mRatioHeight = typedArray.getInteger(R.styleable.VideoCropView_ratio_height, 4);
       }

       @Override
       protected void onMeasure(final int widthMeasureSpec, final int heightMeasureSpec) {
           int heightLayout;
           int widthLayout;
           widthLayout = MeasureSpec.getSize(widthMeasureSpec);
           heightLayout = MeasureSpec.getSize(heightMeasureSpec);
           setMeasuredDimension(widthLayout, heightLayout);

           /*if(widthMeasureSpec < heightMeasureSpec){

               int width = MeasureSpec.getSize(widthMeasureSpec);
               int height = (int) ((width / mRatioWidth) * mRatioHeight);


               setMeasuredDimension(width, height);

           }else{

               int width = MeasureSpec.getSize(widthMeasureSpec);
               int height =MeasureSpec.getSize(heightMeasureSpec);
               setMeasuredDimension(width, height);

           }
    */
       }

       @Override
       public boolean onTouchEvent(MotionEvent event) {
           if(mCurrentState == STATE_ERROR || mCurrentState == STATE_IDLE || mCurrentState == STATE_PREPARING) {
               return false;
           }

           switch (event.getAction()) {
                case MotionEvent.ACTION_DOWN:
                    mPastX = event.getX();
                    mPastY = event.getY();
                    mTouchDistance = 0;
                    break;
               case MotionEvent.ACTION_MOVE:
                   if(mBoundX!=0 || mBoundY!=0) {
                       float dx = event.getX() - mPastX;
                       float dy = event.getY() - mPastY;
                       updateViewPosition(dx, dy);
                       mPastX = event.getX();
                       mPastY = event.getY();
                       mTouchDistance += (Math.abs(dx) + Math.abs(dy));
                   }
                   break;
               case MotionEvent.ACTION_UP:
                   if (mTouchDistance < 25) {
                       if (isPlaying()) {
                           pause();
                       } else {
                           start();
                       }
                   }

                   mTouchDistance = 0;
                   break;
           }

           return true;
       }

       @Override
       public void onInitializeAccessibilityEvent(AccessibilityEvent event) {
           super.onInitializeAccessibilityEvent(event);
           event.setClassName(VideoView.class.getName());
       }

       @Override
       public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
           super.onInitializeAccessibilityNodeInfo(info);
           info.setClassName(VideoView.class.getName());
       }

       public int resolveAdjustedSize(int desiredSize, int measureSpec) {
           Log.d(LOG_TAG, "Resolve called.");
           int result = desiredSize;
           int specMode = MeasureSpec.getMode(measureSpec);
           int specSize = MeasureSpec.getSize(measureSpec);

           switch (specMode) {
               case MeasureSpec.UNSPECIFIED:
               /*
                * Parent says we can be as big as we want. Just don't be larger
                * than max size imposed on ourselves.
                */
                   result = desiredSize;
                   break;

               case MeasureSpec.AT_MOST:
               /*
                * Parent says we can be as big as we want, up to specSize. Don't be
                * larger than specSize, and don't be larger than the max size
                * imposed on ourselves.
                */
                   result = Math.min(desiredSize, specSize);
                   break;

               case MeasureSpec.EXACTLY:
                   // No choice. Do what we are told.
                   result = specSize;
                   break;
           }
           return result;
       }

       public void initVideoView() {

           mVideoHeight = 0;
           mVideoWidth = 0;
           setFocusable(false);
           setSurfaceTextureListener(mSurfaceTextureListener);
           mCurrentState = STATE_IDLE;
           mTargetState = STATE_IDLE;

       }

       public void setVideoPath(String path) {
           if (path != null) {
               setVideoURI(Uri.parse(path));
           }
       }

       public void setVideoURI(Uri pVideoURI) {
           mUri = pVideoURI;
           mSeekWhenPrepared = 0;

           MediaMetadataRetriever retriever = new MediaMetadataRetriever();
           retriever.setDataSource(mContext, pVideoURI);

           // create thumbnail bitmap
           if(android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.JELLY_BEAN_MR1) {
               String rotation = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION);

               try {
                   mRotate = Integer.parseInt(rotation);
               } catch(NumberFormatException e) {
                   mRotate = 0;
               }
           }

           retriever.release();

           openVideo();
           requestLayout();
           invalidate();
       }

       public void stopPlayback() {
           if (mMediaPlayer != null) {
               mMediaPlayer.stop();
               mMediaPlayer.release();
               mMediaPlayer = null;
               mCurrentState = STATE_IDLE;
               mTargetState = STATE_IDLE;
           }
       }

       public void openVideo() {
           if ((mUri == null) || (mSurface == null)) {
               // not ready for playback just yet, will try again later
               return;
           }
           // Tell the music playback service to pause
           // TODO: these constants need to be published somewhere in the
           // framework.
           Intent intent = new Intent("com.android.music.musicservicecommand");
           intent.putExtra("command", "pause");
           mContext.sendBroadcast(intent);

           // we shouldn't clear the target state, because somebody might have
           // called start() previously
           release(false);
           try {
               mMediaPlayer = new MediaPlayer();
               // TODO: create SubtitleController in MediaPlayer, but we need
               // a context for the subtitle renderers

               mMediaPlayer.setOnPreparedListener(mPreparedListener);
               mMediaPlayer.setOnVideoSizeChangedListener(mSizeChangedListener);
               mMediaPlayer.setOnCompletionListener(mCompletionListener);
               mMediaPlayer.setOnErrorListener(mErrorListener);
               mMediaPlayer.setOnInfoListener(mInfoListener);
               mMediaPlayer.setOnBufferingUpdateListener(mBufferingUpdateListener);
               mCurrentBufferPercentage = 0;
               mMediaPlayer.setDataSource(mContext, mUri);
               mMediaPlayer.setSurface(mSurface);
               mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);

               mMediaPlayer.setScreenOnWhilePlaying(true);
               mMediaPlayer.prepareAsync();
               mMediaPlayer.setLooping(true);
               mCurrentState = STATE_PREPARING;
           } catch (IllegalStateException e) {
               mCurrentState = STATE_ERROR;
               mTargetState = STATE_ERROR;
               e.printStackTrace();
           } catch (IOException e) {
               mCurrentState = STATE_ERROR;
               mTargetState = STATE_ERROR;
               e.printStackTrace();
           }
       }

       private OnVideoSizeChangedListener mSizeChangedListener = new OnVideoSizeChangedListener() {
           @Override
           public void onVideoSizeChanged(final MediaPlayer mp, final int width,
                                          final int height) {
               mVideoWidth = mp.getVideoWidth();
               mVideoHeight = mp.getVideoHeight();

               if (mVideoWidth != 0 && mVideoHeight != 0) {
                   requestLayout();
                   if(mVideoWidth >= mVideoHeight)
                   initVideo();
               }
           }
       };

       private OnPreparedListener mPreparedListener = new OnPreparedListener() {
           @Override
           public void onPrepared(final MediaPlayer mp) {
               mCurrentState = STATE_PREPARED;

               if (mOnPreparedListener != null) {
                   mOnPreparedListener.onPrepared(mp);
               }

               mVideoWidth = mp.getVideoWidth();
               mVideoHeight = mp.getVideoHeight();

               int seekToPosition = mSeekWhenPrepared; // mSeekWhenPrepared may be
               // changed after seekTo()
               if (seekToPosition != 0) {
                   seekTo(seekToPosition);
               }

               if ((mVideoWidth != 0) && (mVideoHeight != 0)) {
                   if(mVideoWidth >= mVideoHeight) initVideo();

                   if (mTargetState == STATE_PLAYING) {
                       start();
                   }
               } else {
                   // We don't know the video size yet, but should start anyway.
                   // The video size might be reported to us later.
                   if (mTargetState == STATE_PLAYING) {
                       start();
                   }
               }
           }
       };

       private OnCompletionListener mCompletionListener = new OnCompletionListener() {
           @Override
           public void onCompletion(final MediaPlayer mp) {
               mCurrentState = STATE_PLAYBACK_COMPLETED;
               mTargetState = STATE_PLAYBACK_COMPLETED;

               if (mOCompletionListener != null) {
                   mOCompletionListener.onCompletion(mMediaPlayer);
               }
           }
       };

       private OnInfoListener mInfoListener = new OnInfoListener() {
           public boolean onInfo(MediaPlayer mp, int arg1, int arg2) {
               if (mOnInfoListener != null) {
                   mOnInfoListener.onInfo(mp, arg1, arg2);
               }
               return true;
           }
       };

       private OnErrorListener mErrorListener = new OnErrorListener() {
           @Override
           public boolean onError(MediaPlayer mp, int framework_err, int impl_err) {
               Log.d(LOG_TAG, "Error: " + framework_err + "," + impl_err);
               mCurrentState = STATE_ERROR;
               mTargetState = STATE_ERROR;

               /* If an error handler has been supplied, use it and finish. */
               if (mOnErrorListener != null) {
                   if (mOnErrorListener.onError(mMediaPlayer, framework_err,
                           impl_err)) {
                       return true;
                   }
               }
               return true;
           }
       };

       private OnBufferingUpdateListener mBufferingUpdateListener = new OnBufferingUpdateListener() {
           @Override
           public void onBufferingUpdate(final MediaPlayer mp, final int percent) {
               mCurrentBufferPercentage = percent;
           }
       };

       public void setOnPreparedListener(OnPreparedListener listener) {
           mOnPreparedListener = listener;
       }

       public void setOnCompletionListener(OnCompletionListener listener) {
           mOCompletionListener = listener;
       }

       public void setOnErrorListener(OnErrorListener listener) {
           mOnErrorListener = listener;
       }

       public void setOnInfoListener(OnInfoListener listener) {
           mOnInfoListener = listener;
       }

       private void release(boolean cleartargetstate) {
           if (mMediaPlayer != null) {
               mMediaPlayer.reset();
               mMediaPlayer.release();
               mMediaPlayer = null;
               mCurrentState = STATE_IDLE;
               if (cleartargetstate) {
                   mTargetState = STATE_IDLE;
               }
           }
       }

       @Override
       public void start() {
           if (isInPlaybackState()) {
               mMediaPlayer.start();
               mCurrentState = STATE_PLAYING;

           }
           mTargetState = STATE_PLAYING;
       }

       @Override
       public void pause() {
           if (isInPlaybackState()) {
               if (mMediaPlayer.isPlaying()) {
                   mMediaPlayer.pause();
                   mCurrentState = STATE_PAUSED;
               }
           }

           mTargetState = STATE_PAUSED;
       }

       @Override
       public int getDuration() {
           if (isInPlaybackState()) {
               return mMediaPlayer.getDuration();
           }

           return -1;
       }

       @Override
       public int getCurrentPosition() {
           if (isInPlaybackState()) {
               return mMediaPlayer.getCurrentPosition();
           }
           return 0;
       }

       @Override
       public void seekTo(int msec) {
           if (isInPlaybackState()) {
               mMediaPlayer.seekTo(msec);
               mSeekWhenPrepared = 0;
           } else {
               mSeekWhenPrepared = msec;
           }
       }

       @Override
       public boolean isPlaying() {
           return isInPlaybackState() && mMediaPlayer.isPlaying();
       }

       @Override
       public int getBufferPercentage() {
           if (mMediaPlayer != null) {
               return mCurrentBufferPercentage;
           }
           return 0;
       }

       private boolean isInPlaybackState() {
           return (mMediaPlayer != null && mCurrentState != STATE_ERROR
                   && mCurrentState != STATE_IDLE && mCurrentState != STATE_PREPARING);
       }

       @Override
       public boolean canPause() {
           return false;
       }

       @Override
       public boolean canSeekBackward() {
           return false;
       }

       @Override
       public boolean canSeekForward() {
           return false;
       }

       @Override
       public int getAudioSessionId() {
           return -1;
       }

       SurfaceTextureListener mSurfaceTextureListener = new SurfaceTextureListener() {
           @Override
           public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
               mSurface = new Surface(surface);
               openVideo();
           }

           @Override
           public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
               boolean isValidState = (mTargetState == STATE_PLAYING);
               boolean hasValidSize = (mVideoWidth == width && mVideoHeight == height);
               if (mMediaPlayer != null && isValidState && hasValidSize) {
                   start();
               }
           }

           @Override
           public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
               if (mMediaPlayer != null) {
                   mMediaPlayer.reset();
                   mMediaPlayer.release();
                   mMediaPlayer = null;
               }

               if (mSurface != null) {
                   mSurface.release();
                   mSurface = null;
               }

               return true;
           }

           @Override
           public void onSurfaceTextureUpdated(final SurfaceTexture surface) {

           }
       };

       @Override
       protected void onVisibilityChanged(View changedView, int visibility) {
           super.onVisibilityChanged(changedView, visibility);

           if (visibility == View.INVISIBLE || visibility == View.GONE) {
               if (isPlaying()) {
                   stopPlayback();
               }
           }
       }

       public float getScale() {
           return mScale;
       }

       private void initVideo() {
           try {
                   int width = getWidth();
                   int height = getHeight();
                   mScaleX = 1.0f;
                   mScaleY = 1.0f;
                   mPositionX = 0;
                   mPositionY = 0;
                   mBoundX = 0;
                   mBoundY = 0;
                   mMatrix = new Matrix();

                   mScaleX = (float) mVideoWidth / width;
                   mScaleY = (float) mVideoHeight / height;

                   mBoundX = width - mVideoWidth / mScaleY;
                   mBoundY = height - mVideoHeight / mScaleX;

                   if (mScaleX < mScaleY) {
                       mScale = mScaleX;
                       mScaleY = mScaleY * (1.0f / mScaleX);
                       mScaleX = 1.0f;
                       mBoundX = 0;
                   } else {
                       mScale = mScaleY;
                       mScaleX = mScaleX * (1.0f / mScaleY);
                       mScaleY = 1.0f;
                       mBoundY = 0;
                   }

                   mMatrix = new Matrix();
                   mMatrix.setScale(mScaleX, mScaleY);
                   setTransform(mMatrix);
           } catch (NumberFormatException e) {
               e.printStackTrace();
           }
       }

       public void updateViewPosition(float x, float y) {

           float nextX = mPositionX + x;
           float nextY = mPositionY + y;

           if(mScaleX == 1.0f) {
               x = 0;
           } else {
               if(nextX > 0) {
                   x = -mPositionX;
                   mPositionX = mPositionX + x;
               } else if(nextX < mBoundX) {
                   x = mBoundX - mPositionX;
                   mPositionX = mPositionX + x;
               } else {
                   mPositionX = nextX;
               }
           }

           if(mScaleY == 1.0f) {
               y = 0;
           } else {
               if(nextY > 0) {
                   y = -mPositionY;
                   mPositionY = mPositionY + y;
               } else if(nextY < mBoundY) {
                   y = mBoundY - mPositionY;
                   mPositionY = mPositionY + y;
               } else {
                   mPositionY = nextY;
               }
           }

           if(mOnTranslatePositionListener != null) {
               mOnTranslatePositionListener.onTranslatePosition(mPositionX, mPositionY, mPositionX * -mScale, mPositionY * -mScale);
           }

           mMatrix.postTranslate(x, y);
           setTransform(mMatrix);
           invalidate();
       }

    //  public void setOriginalRatio() {
    //      if(mVideoWidth != 0 && mVideoHeight != 0) {
    //          int gcd = gcd(mVideoWidth, mVideoHeight);
    //          setRatio(mVideoWidth / gcd, mVideoHeight / gcd);
    //      }
    //  }

       public int gcd(int n, int m) {
           while (m != 0) {
               int t = n % m;
               n = m;
               m = t;
           }

           return Math.abs(n);
       }

    //  public void setRatio(float ratioWidth, float ratioHeight) {
    //      mRatioWidth = ratioWidth;
    //      mRatioHeight = ratioHeight;
    //
    //      int seek = getCurrentPosition();
    //
    //      requestLayout();
    //      invalidate();
    //      openVideo();
    //
    //      seekTo(seek);
    //  }


       public float getRatioWidth() {
           return mRatioWidth;
       }

       public float getRatioHeight() {
           return mRatioHeight;
       }

       public float getRealPositionX() {
           return mPositionX * -mScale;
       }

       public float getRealPositionY() {
           return mPositionY * -mScale;
       }

       public int getVideoWidth() {
           return mVideoWidth;
       }

       public int getVideoHeight() {
           return mVideoHeight;
       }

       public int getRotate() {
           return mRotate;
       }

       public void setOnTranslatePositionListener(OnTranslatePositionListener pOnTranslatePositionListener) {
           mOnTranslatePositionListener = pOnTranslatePositionListener;
       }

       public void setContext(Context context) {
           this.context = context;
       }

       public interface OnTranslatePositionListener {
           public abstract void onTranslatePosition(float x, float y, float rx, float ry);
       }
    }

    FFmpeg command for cropping a particular portion:

    ffmpeg -i /sdcard/videokit/in.mp4 -filter:v crop=720:1088:0:0 -c:a copy /sdcard/videokit/out.mp4
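
    To avoid hard-coding crop=720:1088:0:0, the crop values can be derived from the VideoCropView above: getRealPositionX()/getRealPositionY() already return the pan offset translated into video pixels, and getScale() relates view pixels to video pixels, so the visible region is roughly (viewWidth x scale) by (viewHeight x scale) at that offset. A minimal sketch of the mapping (in Python for brevity, and untested against this project; the arithmetic transfers directly into the Activity):

    # SKETCH - map VideoCropView state to an ffmpeg crop command.
#   view_width / view_height -> cropView.getWidth() / cropView.getHeight()
#   scale                    -> cropView.getScale()
#   real_x / real_y          -> cropView.getRealPositionX() / getRealPositionY()
def build_crop_command(view_width, view_height, scale, real_x, real_y,
                       in_path="/sdcard/videokit/in.mp4",
                       out_path="/sdcard/videokit/out.mp4"):
    crop_w = int(view_width * scale)   # visible width, in video pixels
    crop_h = int(view_height * scale)  # visible height, in video pixels
    crop_x = int(real_x)               # pan offset, in video pixels
    crop_y = int(real_y)
    return ("ffmpeg -y -i {src} -filter:v crop={w}:{h}:{x}:{y} -c:a copy {dst}"
            .format(src=in_path, w=crop_w, h=crop_h, x=crop_x, y=crop_y))

# An unscaled 720x1088 view panned to the top-left corner reproduces the
# hard-coded command above:
print(build_crop_command(720, 1088, 1.0, 0, 0))

    The resulting string can then be handed to the same LoadJNI/FFmpeg4Android invocation shown in SimpleExample below.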

    public class SimpleExample extends Activity {

       String workFolder = null;
       String demoVideoFolder = null;
       String demoVideoPath = null;
       String vkLogPath = null;
       private boolean commandValidationFailedFlag = false;


       @Override
       public void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.ffmpeg_demo_client_1);

           demoVideoFolder = Environment.getExternalStorageDirectory().getAbsolutePath() + "/videokit/";
           demoVideoPath = demoVideoFolder + "in.mp4";

           Log.i(Prefs.TAG, getString(R.string.app_name) + " version: " + GeneralUtils.getVersionName(getApplicationContext()) );
           workFolder = getApplicationContext().getFilesDir().getAbsolutePath() + "/";
           //Log.i(Prefs.TAG, "workFolder: " + workFolder);
           vkLogPath = workFolder + "vk.log";

           GeneralUtils.copyLicenseFromAssetsToSDIfNeeded(this, workFolder);
           GeneralUtils.copyDemoVideoFromAssetsToSDIfNeeded(this, demoVideoFolder);

           Button invoke =  (Button)findViewById(R.id.invokeButton);
           invoke.setOnClickListener(new OnClickListener() {
               public void onClick(View v){
                   Log.i(Prefs.TAG, "run clicked.");
                   if (GeneralUtils.checkIfFileExistAndNotEmpty(demoVideoPath)) {
                       new TranscdingBackground(SimpleExample.this).execute();
                   }
                   else {
                       Toast.makeText(getApplicationContext(), demoVideoPath + " not found", Toast.LENGTH_LONG).show();
                   }
               }
           });

           int rc = GeneralUtils.isLicenseValid(getApplicationContext(), workFolder);
           Log.i(Prefs.TAG, "License check RC: " + rc);
       }

       public class TranscdingBackground extends AsyncTask<String, Integer, Integer>
       {

           ProgressDialog progressDialog;
           Activity _act;
           String commandStr;

           public TranscdingBackground (Activity act) {
               _act = act;
           }



           @Override
           protected void onPreExecute() {
               EditText commandText = (EditText)findViewById(R.id.CommandText);
               commandStr = commandText.getText().toString();

               progressDialog = new ProgressDialog(_act);
               progressDialog.setMessage("FFmpeg4Android Transcoding in progress...");
               progressDialog.show();

           }

           protected Integer doInBackground(String... paths) {
               Log.i(Prefs.TAG, "doInBackground started...");

               // delete previous log
               boolean isDeleted = GeneralUtils.deleteFileUtil(workFolder + "/vk.log");
               Log.i(Prefs.TAG, "vk deleted: " + isDeleted);

               PowerManager powerManager = (PowerManager)_act.getSystemService(Activity.POWER_SERVICE);
               WakeLock wakeLock = powerManager.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "VK_LOCK");
               Log.d(Prefs.TAG, "Acquire wake lock");
               wakeLock.acquire();

               ///////////// Set Command using code (overriding the UI EditText) /////
               //commandStr = "ffmpeg -y -i /sdcard/videokit/in.mp4 -strict experimental -s 320x240 -r 30 -aspect 3:4 -ab 48000 -ac 2 -ar 22050 -vcodec mpeg4 -b 2097152 /sdcard/videokit/out.mp4";
               //String[] complexCommand = {"ffmpeg", "-y" ,"-i", "/sdcard/videokit/in.mp4","-strict","experimental","-s", "160x120","-r","25", "-vcodec", "mpeg4", "-b", "150k", "-ab","48000", "-ac", "2", "-ar", "22050", "/sdcard/videokit/out.mp4"};
               ///////////////////////////////////////////////////////////////////////


               LoadJNI vk = new LoadJNI();
               try {


                   vk.run(GeneralUtils.utilConvertToComplex(commandStr), workFolder, getApplicationContext());




                   GeneralUtils.copyFileToFolder(vkLogPath, demoVideoFolder);


               } catch (Throwable e) {
                Log.e(Prefs.TAG, "vk run exception.", e);
               }
               finally {
                   if (wakeLock.isHeld())
                       wakeLock.release();
                   else{
                       Log.i(Prefs.TAG, "Wake lock is already released, doing nothing");
                   }
               }
               Log.i(Prefs.TAG, "doInBackground finished");
               return Integer.valueOf(0);
           }

           protected void onProgressUpdate(Integer... progress) {
           }

           @Override
           protected void onCancelled() {
               Log.i(Prefs.TAG, "onCancelled");
               //progressDialog.dismiss();
               super.onCancelled();
           }


           @Override
           protected void onPostExecute(Integer result) {
               Log.i(Prefs.TAG, "onPostExecute");
               progressDialog.dismiss();
               super.onPostExecute(result);

               // finished Toast
               String rc = null;
               if (commandValidationFailedFlag) {
                   rc = "Command Vaidation Failed";
               }
               else {
                   rc = GeneralUtils.getReturnCodeFromLog(vkLogPath);
               }
               final String status = rc;
               SimpleExample.this.runOnUiThread(new Runnable() {
                   public void run() {
                       Toast.makeText(SimpleExample.this, status, Toast.LENGTH_LONG).show();
                       if (status.equals("Transcoding Status: Failed")) {
                           Toast.makeText(SimpleExample.this, "Check: " + vkLogPath + " for more information.", Toast.LENGTH_LONG).show();
                       }
                   }
               });
           }

       }


    }

  • FFmpeg - how to transcode an input stream while cutting off the first few seconds of video and audio

    25 September 2020, by purplepear24

    I am using ffmpeg to transcode a screen-recording (x11) input stream to MP4. I would like to cut off the first 10 seconds of the stream, which is just a blank screen (this is intentional).

    


    I understand how to trim video with ffmpeg when converting from one mp4 to another, but I can't find any working solution for processing an input stream while accounting for delay and audio/video syncing.

    


    Here is my current code:

    


    const { spawn } = require('child_process');
const { S3Uploader } = require('./utils/upload');

const MEETING_URL = process.env.MEETING_URL || 'Not present in environment';
console.log(`[recording process] MEETING_URL: ${MEETING_URL}`);

const args = process.argv.slice(2);
const BUCKET_NAME = args[0];
console.log(`[recording process] BUCKET_NAME: ${BUCKET_NAME}`);
const BROWSER_SCREEN_WIDTH = args[1];
const BROWSER_SCREEN_HEIGHT = args[2];
const MEETING_ID = args[3];
console.log(`[recording process] BROWSER_SCREEN_WIDTH: ${BROWSER_SCREEN_WIDTH}, BROWSER_SCREEN_HEIGHT: ${BROWSER_SCREEN_HEIGHT}, TASK_NUMBER: 43`);

const VIDEO_BITRATE = 3000;
const VIDEO_FRAMERATE = 30;
const VIDEO_GOP = VIDEO_FRAMERATE * 2;
const AUDIO_BITRATE = '160k';
const AUDIO_SAMPLERATE = 44100;
const AUDIO_CHANNELS = 2;
const DISPLAY = process.env.DISPLAY;

const transcodeStreamToOutput = spawn('ffmpeg',[
    '-hide_banner',
    '-loglevel', 'error',
    // disable interaction via stdin
    '-nostdin',
    // screen image size
    // '-s', `${BROWSER_SCREEN_WIDTH}x${BROWSER_SCREEN_HEIGHT}`,
    '-s', '1140x720',
    // video frame rate
    '-r', `${VIDEO_FRAMERATE}`,
    // hides the mouse cursor from the resulting video
    '-draw_mouse', '0',
    // grab the x11 display as video input
    '-f', 'x11grab',
    '-i', ':1.0+372,8',
    // '-i', `${DISPLAY}`,
    // grab pulse as audio input
    '-f', 'pulse', 
        '-ac', '2',
        '-i', 'default',
    // codec video with libx264
    '-c:v', 'libx264',
        '-pix_fmt', 'yuv420p',
        '-profile:v', 'main',
        '-preset', 'veryfast',
        '-x264opts', 'nal-hrd=cbr:no-scenecut',
        '-minrate', `${VIDEO_BITRATE}`,
        '-maxrate', `${VIDEO_BITRATE}`,
        '-g', `${VIDEO_GOP}`,
    // apply a fixed delay to the audio stream in order to synchronize it with the video stream
    '-filter_complex', 'adelay=delays=1000|1000',
    // codec audio with aac
    '-c:a', 'aac',
        '-b:a', `${AUDIO_BITRATE}`,
        '-ac', `${AUDIO_CHANNELS}`,
        '-ar', `${AUDIO_SAMPLERATE}`,
    // fragment the MP4 so it can be written to a non-seekable output (fixes: muxer does not support non seekable output)
    '-movflags', 'frag_keyframe+empty_moov+faststart',
    // set output format to mp4 and output file to stdout
    '-f', 'mp4', '-'
    ]
);

transcodeStreamToOutput.stderr.on('data', data => {
    console.log(`[transcodeStreamToOutput process] stderr: ${(new Date()).toISOString()} ffmpeg: ${data}`);
});

const timestamp = new Date();
const year = timestamp.getFullYear();
const month = timestamp.getMonth() + 1;
const day = timestamp.getDate();
const hour = timestamp.getUTCHours();
console.log(MEETING_ID);
const fileName = `${year}/${month}/${day}/${hour}/${MEETING_ID}.mp4`;
new S3Uploader(BUCKET_NAME, fileName).uploadStream(transcodeStreamToOutput.stdout);

// event handler for docker stop, not exit until upload completes
process.on('SIGTERM', (code, signal) => {
    console.log(`[recording process] exited with code ${code} and signal ${signal}(SIGTERM)`);
    process.kill(transcodeStreamToOutput.pid, 'SIGTERM');
});

// debug use - event handler for ctrl + c
process.on('SIGINT', (code, signal) => {
    console.log(`[recording process] exited with code ${code} and signal ${signal}(SIGINT)`)
    process.kill(transcodeStreamToOutput.pid, 'SIGTERM');
});

process.on('exit', function(code) {
    console.log('[recording process] exit code', code);
});


    


    Any help would be greatly appreciated!
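
    One avenue worth testing (a sketch, not verified against this capture setup): cut both streams inside the filter graph with trim/atrim plus a timestamp reset, so video and audio are shortened by the same wall-clock amount and stay aligned. Written as a plain command line, the spawn arguments above would become roughly:

    ffmpeg -nostdin -s 1140x720 -r 30 -draw_mouse 0 -f x11grab -i :1.0+372,8 -f pulse -ac 2 -i default -filter_complex "[0:v]trim=start=10,setpts=PTS-STARTPTS[v];[1:a]adelay=delays=1000|1000,atrim=start=10,asetpts=PTS-STARTPTS[a]" -map "[v]" -map "[a]" -c:v libx264 -pix_fmt yuv420p -profile:v main -preset veryfast -c:a aac -b:a 160k -ac 2 -ar 44100 -movflags frag_keyframe+empty_moov+faststart -f mp4 -

    Whether adelay belongs before or after the cut depends on which way the streams drift in practice, so both orderings are worth trying. Putting -ss 10 before each -i is the simpler alternative, but input seeking on live grabs (x11grab/pulse) tends to be less predictable than filtering.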