
On other sites (4483)
-
How to crop the visible portion of a landscape video in Android?
30 August 2016, by Deepak
- I am working on panning and cropping a landscape video using a TextureView. I am halfway there: I can pan the landscape video from left to right and back using this example:
https://github.com/crust87/Android-VideoCropView
- FFmpeg can crop a particular portion of the video with this command:
ffmpeg -i /sdcard/videokit/in.mp4 -filter:v crop=720:1088:0:0 -c:a copy /sdcard/videokit/out.mp4
How can I crop only the part of the video that is visible in the TextureView and save it to local storage on Android?
crop=720:1088:0:0 is a hard-coded width and height, and that crops fine. But how can I get the width, height and offset of the portion that is visible in the TextureView, so that I can crop exactly that region and save it to local storage? A sketch of one possible approach follows.
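One way to avoid the hard-coded values is to derive the crop rectangle from what the VideoCropView below already exposes (getRealPositionX(), getRealPositionY() and getScale()). The helper below is only a sketch of that idea and is not part of the original code; buildCropFilter() is a hypothetical name, and the scale/offset mapping should be checked against the view's initVideo() math before relying on it.
// Hypothetical helper (sketch only): build an FFmpeg crop filter string from
// the crop view's current pan position and scale.
private String buildCropFilter(VideoCropView cropView) {
    // Top-left corner of the visible area inside the full frame, in video pixels.
    int cropX = (int) cropView.getRealPositionX();
    int cropY = (int) cropView.getRealPositionY();
    // Size of the visible area: the view size scaled back into video pixels.
    // Some encoders require even dimensions, so rounding down to even values may be needed.
    int cropWidth = (int) (cropView.getWidth() * cropView.getScale());
    int cropHeight = (int) (cropView.getHeight() * cropView.getScale());
    // FFmpeg's crop filter takes width:height:x:y.
    return "crop=" + cropWidth + ":" + cropHeight + ":" + cropX + ":" + cropY;
}
The returned string can then take the place of crop=720:1088:0:0 in the FFmpeg command shown further down.
MainActivity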
public class MainActivity extends Activity {
// Layout Components
private FrameLayout top_frame;
// Attributes
private String originalPath;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.check);
top_frame = (FrameLayout)findViewById(R.id.top_frame);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == 1000 && resultCode == RESULT_OK) {
final VideoCropView mVideoCropView = new VideoCropView(this);
mVideoCropView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
mVideoCropView.start();
}
});
top_frame.addView(mVideoCropView);
Uri selectedVideoUri = data.getData();
originalPath = getRealPathFromURI(selectedVideoUri);
mVideoCropView.setVideoURI(selectedVideoUri);
mVideoCropView.seekTo(1);
}
}
public void onButtonLoadClick(View v) {
top_frame.removeAllViews();
Intent lIntent = new Intent(Intent.ACTION_PICK);
lIntent.setType("video/*");
lIntent.addFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP);
startActivityForResult(lIntent, 1000);
}
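// Note: the projection below queries MediaStore.Images.Media.DATA even though
// a video was picked; it happens to work because the underlying column name is
// the same, but MediaStore.Video.Media.DATA would be the clearer choice.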
public String getRealPathFromURI(Uri contentUri) { // getting image path from gallery.
Cursor cursor = null;
try {
String[] proj = { MediaStore.Images.Media.DATA };
cursor = getApplicationContext().getContentResolver().query(contentUri, proj, null, null, null);
int column_index = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DATA);
cursor.moveToFirst();
return cursor.getString(column_index);
} finally {
if (cursor != null) {
cursor.close();
}
}
}
}
CropVideoView
public class VideoCropView extends TextureView implements MediaPlayerControl {
// Constants
private static final String LOG_TAG = "VideoCropView";
private static final int STATE_ERROR = -1;
private static final int STATE_IDLE = 0;
private static final int STATE_PREPARING = 1;
private static final int STATE_PREPARED = 2;
private static final int STATE_PLAYING = 3;
private static final int STATE_PAUSED = 4;
private static final int STATE_PLAYBACK_COMPLETED = 5;
// MediaPlayer Components
protected Context mContext;
private MediaPlayer mMediaPlayer;
private Surface mSurface;
private OnInfoListener mOnInfoListener;
private OnCompletionListener mOCompletionListener;
private OnErrorListener mOnErrorListener;
private OnPreparedListener mOnPreparedListener;
private OnTranslatePositionListener mOnTranslatePositionListener;
// CropView Components
private Matrix mMatrix;
// MediaPlayer Attributes
protected Uri mUri;
private int mCurrentBufferPercentage;
private int mSeekWhenPrepared;
protected int mVideoWidth;
protected int mVideoHeight;
// CropView Attributes
private float mRatioWidth;
private float mRatioHeight;
private float mPositionX;
private float mPositionY;
private float mBoundX;
private float mBoundY;
private int mRotate;
private float mScaleX;
private float mScaleY;
private float mScale;
// Working Variables
private int mCurrentState = STATE_IDLE;
private int mTargetState = STATE_IDLE;
// Touch Event
// past position x, y and move point
float mPastX;
float mPastY;
float mTouchDistance;
private Context context;
// Constructors
public VideoCropView(final Context context) {
super(context);
mContext = context;
initAttributes();
initVideoView();
}
public VideoCropView(final Context context, final AttributeSet attrs) {
super(context, attrs);
mContext = context;
initAttributes(context, attrs, 0);
initVideoView();
}
public VideoCropView(Context context, AttributeSet attrs, int defStyleAttr) {
super(context, attrs, defStyleAttr);
mContext = context;
initAttributes(context, attrs, defStyleAttr);
initVideoView();
}
private void initAttributes() {
mRatioWidth = 1;
mRatioHeight = 1;
}
private void initAttributes(Context context, AttributeSet attrs, int defStyleAttr) {
TypedArray typedArray = context.obtainStyledAttributes(attrs, R.styleable.VideoCropView, defStyleAttr, 0);
mRatioWidth = typedArray.getInteger(R.styleable.VideoCropView_ratio_width, 3);
mRatioHeight = typedArray.getInteger(R.styleable.VideoCropView_ratio_height, 4);
}
@Override
protected void onMeasure(final int widthMeasureSpec, final int heightMeasureSpec) {
int heightLayout;
int widthLayout;
widthLayout = MeasureSpec.getSize(widthMeasureSpec);
heightLayout = MeasureSpec.getSize(heightMeasureSpec);
setMeasuredDimension(widthLayout, heightLayout);
/*if(widthMeasureSpec < heightMeasureSpec){
int width = MeasureSpec.getSize(widthMeasureSpec);
int height = (int) ((width / mRatioWidth) * mRatioHeight);
setMeasuredDimension(width, height);
}else{
int width = MeasureSpec.getSize(widthMeasureSpec);
int height =MeasureSpec.getSize(heightMeasureSpec);
setMeasuredDimension(width, height);
}
*/
}
@Override
public boolean onTouchEvent(MotionEvent event) {
if(mCurrentState == STATE_ERROR || mCurrentState == STATE_IDLE || mCurrentState == STATE_PREPARING) {
return false;
}
switch (event.getAction()) {
case MotionEvent.ACTION_DOWN:
mPastX = event.getX();
mPastY = event.getY();
mTouchDistance = 0;
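// Note: there is no break here, so execution falls through to ACTION_MOVE;
// dx and dy are zero on this first event because mPastX/mPastY were just set.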
case MotionEvent.ACTION_MOVE:
if(mBoundX!=0 || mBoundY!=0) {
float dx = event.getX() - mPastX;
float dy = event.getY() - mPastY;
updateViewPosition(dx, dy);
mPastX = event.getX();
mPastY = event.getY();
mTouchDistance += (Math.abs(dx) + Math.abs(dy));
}
break;
case MotionEvent.ACTION_UP:
if (mTouchDistance < 25) {
if (isPlaying()) {
pause();
} else {
start();
}
}
mTouchDistance = 0;
break;
}
return true;
}
@Override
public void onInitializeAccessibilityEvent(AccessibilityEvent event) {
super.onInitializeAccessibilityEvent(event);
event.setClassName(VideoView.class.getName());
}
@Override
public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
super.onInitializeAccessibilityNodeInfo(info);
info.setClassName(VideoView.class.getName());
}
public int resolveAdjustedSize(int desiredSize, int measureSpec) {
Log.d(LOG_TAG, "Resolve called.");
int result = desiredSize;
int specMode = MeasureSpec.getMode(measureSpec);
int specSize = MeasureSpec.getSize(measureSpec);
switch (specMode) {
case MeasureSpec.UNSPECIFIED:
/*
* Parent says we can be as big as we want. Just don't be larger
* than max size imposed on ourselves.
*/
result = desiredSize;
break;
case MeasureSpec.AT_MOST:
/*
* Parent says we can be as big as we want, up to specSize. Don't be
* larger than specSize, and don't be larger than the max size
* imposed on ourselves.
*/
result = Math.min(desiredSize, specSize);
break;
case MeasureSpec.EXACTLY:
// No choice. Do what we are told.
result = specSize;
break;
}
return result;
}
public void initVideoView() {
mVideoHeight = 0;
mVideoWidth = 0;
setFocusable(false);
setSurfaceTextureListener(mSurfaceTextureListener);
mCurrentState = STATE_IDLE;
mTargetState = STATE_IDLE;
}
public void setVideoPath(String path) {
if (path != null) {
setVideoURI(Uri.parse(path));
}
}
public void setVideoURI(Uri pVideoURI) {
mUri = pVideoURI;
mSeekWhenPrepared = 0;
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource(mContext, pVideoURI);
// create thumbnail bitmap
if(android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.JELLY_BEAN_MR1) {
String rotation = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION);
try {
mRotate = Integer.parseInt(rotation);
} catch(NumberFormatException e) {
mRotate = 0;
}
}
retriever.release();
openVideo();
requestLayout();
invalidate();
}
public void stopPlayback() {
if (mMediaPlayer != null) {
mMediaPlayer.stop();
mMediaPlayer.release();
mMediaPlayer = null;
mCurrentState = STATE_IDLE;
mTargetState = STATE_IDLE;
}
}
public void openVideo() {
if ((mUri == null) || (mSurface == null)) {
// not ready for playback just yet, will try again later
return;
}
// Tell the music playback service to pause
// TODO: these constants need to be published somewhere in the
// framework.
Intent intent = new Intent("com.android.music.musicservicecommand");
intent.putExtra("command", "pause");
mContext.sendBroadcast(intent);
// we shouldn't clear the target state, because somebody might have
// called start() previously
release(false);
try {
mMediaPlayer = new MediaPlayer();
// TODO: create SubtitleController in MediaPlayer, but we need
// a context for the subtitle renderers
mMediaPlayer.setOnPreparedListener(mPreparedListener);
mMediaPlayer.setOnVideoSizeChangedListener(mSizeChangedListener);
mMediaPlayer.setOnCompletionListener(mCompletionListener);
mMediaPlayer.setOnErrorListener(mErrorListener);
mMediaPlayer.setOnInfoListener(mInfoListener);
mMediaPlayer.setOnBufferingUpdateListener(mBufferingUpdateListener);
mCurrentBufferPercentage = 0;
mMediaPlayer.setDataSource(mContext, mUri);
mMediaPlayer.setSurface(mSurface);
mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mMediaPlayer.setScreenOnWhilePlaying(true);
mMediaPlayer.prepareAsync();
mMediaPlayer.setLooping(true);
mCurrentState = STATE_PREPARING;
} catch (IllegalStateException e) {
mCurrentState = STATE_ERROR;
mTargetState = STATE_ERROR;
e.printStackTrace();
} catch (IOException e) {
mCurrentState = STATE_ERROR;
mTargetState = STATE_ERROR;
e.printStackTrace();
}
}
private OnVideoSizeChangedListener mSizeChangedListener = new OnVideoSizeChangedListener() {
@Override
public void onVideoSizeChanged(final MediaPlayer mp, final int width,
final int height) {
mVideoWidth = mp.getVideoWidth();
mVideoHeight = mp.getVideoHeight();
if (mVideoWidth != 0 && mVideoHeight != 0) {
requestLayout();
if(mVideoWidth >= mVideoHeight)
initVideo();
}
}
};
private OnPreparedListener mPreparedListener = new OnPreparedListener() {
@Override
public void onPrepared(final MediaPlayer mp) {
mCurrentState = STATE_PREPARED;
if (mOnPreparedListener != null) {
mOnPreparedListener.onPrepared(mp);
}
mVideoWidth = mp.getVideoWidth();
mVideoHeight = mp.getVideoHeight();
int seekToPosition = mSeekWhenPrepared; // mSeekWhenPrepared may be
// changed after seekTo()
if (seekToPosition != 0) {
seekTo(seekToPosition);
}
if ((mVideoWidth != 0) && (mVideoHeight != 0)) {
if(mVideoWidth >= mVideoHeight) initVideo();
if (mTargetState == STATE_PLAYING) {
start();
}
} else {
// We don't know the video size yet, but should start anyway.
// The video size might be reported to us later.
if (mTargetState == STATE_PLAYING) {
start();
}
}
}
};
private OnCompletionListener mCompletionListener = new OnCompletionListener() {
@Override
public void onCompletion(final MediaPlayer mp) {
mCurrentState = STATE_PLAYBACK_COMPLETED;
mTargetState = STATE_PLAYBACK_COMPLETED;
if (mOCompletionListener != null) {
mOCompletionListener.onCompletion(mMediaPlayer);
}
}
};
private OnInfoListener mInfoListener = new OnInfoListener() {
public boolean onInfo(MediaPlayer mp, int arg1, int arg2) {
if (mOnInfoListener != null) {
mOnInfoListener.onInfo(mp, arg1, arg2);
}
return true;
}
};
private OnErrorListener mErrorListener = new OnErrorListener() {
@Override
public boolean onError(MediaPlayer mp, int framework_err, int impl_err) {
Log.d(LOG_TAG, "Error: " + framework_err + "," + impl_err);
mCurrentState = STATE_ERROR;
mTargetState = STATE_ERROR;
/* If an error handler has been supplied, use it and finish. */
if (mOnErrorListener != null) {
if (mOnErrorListener.onError(mMediaPlayer, framework_err,
impl_err)) {
return true;
}
}
return true;
}
};
private OnBufferingUpdateListener mBufferingUpdateListener = new OnBufferingUpdateListener() {
@Override
public void onBufferingUpdate(final MediaPlayer mp, final int percent) {
mCurrentBufferPercentage = percent;
}
};
public void setOnPreparedListener(OnPreparedListener listener) {
mOnPreparedListener = listener;
}
public void setOnCompletionListener(OnCompletionListener listener) {
mOCompletionListener = listener;
}
public void setOnErrorListener(OnErrorListener listener) {
mOnErrorListener = listener;
}
public void setOnInfoListener(OnInfoListener listener) {
mOnInfoListener = listener;
}
private void release(boolean cleartargetstate) {
if (mMediaPlayer != null) {
mMediaPlayer.reset();
mMediaPlayer.release();
mMediaPlayer = null;
mCurrentState = STATE_IDLE;
if (cleartargetstate) {
mTargetState = STATE_IDLE;
}
}
}
@Override
public void start() {
if (isInPlaybackState()) {
mMediaPlayer.start();
mCurrentState = STATE_PLAYING;
}
mTargetState = STATE_PLAYING;
}
@Override
public void pause() {
if (isInPlaybackState()) {
if (mMediaPlayer.isPlaying()) {
mMediaPlayer.pause();
mCurrentState = STATE_PAUSED;
}
}
mTargetState = STATE_PAUSED;
}
@Override
public int getDuration() {
if (isInPlaybackState()) {
return mMediaPlayer.getDuration();
}
return -1;
}
@Override
public int getCurrentPosition() {
if (isInPlaybackState()) {
return mMediaPlayer.getCurrentPosition();
}
return 0;
}
@Override
public void seekTo(int msec) {
if (isInPlaybackState()) {
mMediaPlayer.seekTo(msec);
mSeekWhenPrepared = 0;
} else {
mSeekWhenPrepared = msec;
}
}
@Override
public boolean isPlaying() {
return isInPlaybackState() && mMediaPlayer.isPlaying();
}
@Override
public int getBufferPercentage() {
if (mMediaPlayer != null) {
return mCurrentBufferPercentage;
}
return 0;
}
private boolean isInPlaybackState() {
return (mMediaPlayer != null && mCurrentState != STATE_ERROR
&& mCurrentState != STATE_IDLE && mCurrentState != STATE_PREPARING);
}
@Override
public boolean canPause() {
return false;
}
@Override
public boolean canSeekBackward() {
return false;
}
@Override
public boolean canSeekForward() {
return false;
}
@Override
public int getAudioSessionId() {
return -1;
}
SurfaceTextureListener mSurfaceTextureListener = new SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
mSurface = new Surface(surface);
openVideo();
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
boolean isValidState = (mTargetState == STATE_PLAYING);
boolean hasValidSize = (mVideoWidth == width && mVideoHeight == height);
if (mMediaPlayer != null && isValidState && hasValidSize) {
start();
}
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
if (mMediaPlayer != null) {
mMediaPlayer.reset();
mMediaPlayer.release();
mMediaPlayer = null;
}
if (mSurface != null) {
mSurface.release();
mSurface = null;
}
return true;
}
@Override
public void onSurfaceTextureUpdated(final SurfaceTexture surface) {
}
};
@Override
protected void onVisibilityChanged(View changedView, int visibility) {
super.onVisibilityChanged(changedView, visibility);
if (visibility == View.INVISIBLE || visibility == View.GONE) {
if (isPlaying()) {
stopPlayback();
}
}
}
public float getScale() {
return mScale;
}
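// Note on initVideo() below: it fits the video to the view so that the axis
// with the smaller video/view ratio gets a transform scale of 1.0 (it exactly
// fills the view) while the other axis keeps a scale > 1.0 and overflows,
// which is what makes panning possible. mScale holds the video-to-view pixel
// ratio of the fitted axis, so a view-space distance times mScale is a
// video-space distance. mBoundX/mBoundY are the (negative) drag limits in
// view coordinates, and getRealPositionX()/getRealPositionY() convert the
// current pan offset back into video pixels, i.e. the top-left corner of the
// visible region inside the full frame.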
private void initVideo() {
try {
int width = getWidth();
int height = getHeight();
mScaleX = 1.0f;
mScaleY = 1.0f;
mPositionX = 0;
mPositionY = 0;
mBoundX = 0;
mBoundY = 0;
mMatrix = new Matrix();
mScaleX = (float) mVideoWidth / width;
mScaleY = (float) mVideoHeight / height;
mBoundX = width - mVideoWidth / mScaleY;
mBoundY = height - mVideoHeight / mScaleX;
if (mScaleX < mScaleY) {
mScale = mScaleX;
mScaleY = mScaleY * (1.0f / mScaleX);
mScaleX = 1.0f;
mBoundX = 0;
} else {
mScale = mScaleY;
mScaleX = mScaleX * (1.0f / mScaleY);
mScaleY = 1.0f;
mBoundY = 0;
}
mMatrix = new Matrix();
mMatrix.setScale(mScaleX, mScaleY);
setTransform(mMatrix);
} catch (NumberFormatException e) {
e.printStackTrace();
}
}
public void updateViewPosition(float x, float y) {
float nextX = mPositionX + x;
float nextY = mPositionY + y;
if(mScaleX == 1.0f) {
x = 0;
} else {
if(nextX > 0) {
x = -mPositionX;
mPositionX = mPositionX + x;
} else if(nextX < mBoundX) {
x = mBoundX - mPositionX;
mPositionX = mPositionX + x;
} else {
mPositionX = nextX;
}
}
if(mScaleY == 1.0f) {
y = 0;
} else {
if(nextY > 0) {
y = -mPositionY;
mPositionY = mPositionY + y;
} else if(nextY < mBoundY) {
y = mBoundY - mPositionY;
mPositionY = mPositionY + y;
} else {
mPositionY = nextY;
}
}
if(mOnTranslatePositionListener != null) {
mOnTranslatePositionListener.onTranslatePosition(mPositionX, mPositionY, mPositionX * -mScale, mPositionY * -mScale);
}
mMatrix.postTranslate(x, y);
setTransform(mMatrix);
invalidate();
}
// public void setOriginalRatio() {
// if(mVideoWidth != 0 && mVideoHeight != 0) {
// int gcd = gcd(mVideoWidth, mVideoHeight);
// setRatio(mVideoWidth / gcd, mVideoHeight / gcd);
// }
// }
public int gcd(int n, int m) {
while (m != 0) {
int t = n % m;
n = m;
m = t;
}
return Math.abs(n);
}
// public void setRatio(float ratioWidth, float ratioHeight) {
// mRatioWidth = ratioWidth;
// mRatioHeight = ratioHeight;
//
// int seek = getCurrentPosition();
//
// requestLayout();
// invalidate();
// openVideo();
//
// seekTo(seek);
// }
public float getRatioWidth() {
return mRatioWidth;
}
public float getRatioHeight() {
return mRatioHeight;
}
public float getRealPositionX() {
return mPositionX * -mScale;
}
public float getRealPositionY() {
return mPositionY * -mScale;
}
public int getVideoWidth() {
return mVideoWidth;
}
public int getVideoHeight() {
return mVideoHeight;
}
public int getRotate() {
return mRotate;
}
public void setOnTranslatePositionListener(OnTranslatePositionListener pOnTranslatePositionListener) {
mOnTranslatePositionListener = pOnTranslatePositionListener;
}
public void setContext(Context context) {
this.context = context;
}
public interface OnTranslatePositionListener {
public abstract void onTranslatePosition(float x, float y, float rx, float ry);
}
}
FFmpeg command for cropping a particular portion
ffmpeg -i /sdcard/videokit/in.mp4 -filter:v crop=720:1088:0:0 -c:a copy /sdcard/videokit/out.mp4
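To wire a computed rectangle into the FFmpeg4Android code below, one option (again only a sketch, keeping the demo's in.mp4/out.mp4 paths) is to build commandStr in code instead of reading it from the EditText:
// Sketch: assemble the command from a computed crop filter (for example the
// value returned by the hypothetical buildCropFilter() helper shown earlier)
// instead of the hard-coded crop=720:1088:0:0.
String cropFilter = "crop=640:360:40:120"; // illustrative value only
String commandStr = "ffmpeg -y -i /sdcard/videokit/in.mp4 -filter:v " + cropFilter
        + " -c:a copy /sdcard/videokit/out.mp4";
// TranscdingBackground below then hands this string to
// vk.run(GeneralUtils.utilConvertToComplex(commandStr), workFolder, getApplicationContext()).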
public class SimpleExample extends Activity {
String workFolder = null;
String demoVideoFolder = null;
String demoVideoPath = null;
String vkLogPath = null;
private boolean commandValidationFailedFlag = false;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.ffmpeg_demo_client_1);
demoVideoFolder = Environment.getExternalStorageDirectory().getAbsolutePath() + "/videokit/";
demoVideoPath = demoVideoFolder + "in.mp4";
Log.i(Prefs.TAG, getString(R.string.app_name) + " version: " + GeneralUtils.getVersionName(getApplicationContext()) );
workFolder = getApplicationContext().getFilesDir().getAbsolutePath() + "/";
//Log.i(Prefs.TAG, "workFolder: " + workFolder);
vkLogPath = workFolder + "vk.log";
GeneralUtils.copyLicenseFromAssetsToSDIfNeeded(this, workFolder);
GeneralUtils.copyDemoVideoFromAssetsToSDIfNeeded(this, demoVideoFolder);
Button invoke = (Button)findViewById(R.id.invokeButton);
invoke.setOnClickListener(new OnClickListener() {
public void onClick(View v){
Log.i(Prefs.TAG, "run clicked.");
if (GeneralUtils.checkIfFileExistAndNotEmpty(demoVideoPath)) {
new TranscdingBackground(SimpleExample.this).execute();
}
else {
Toast.makeText(getApplicationContext(), demoVideoPath + " not found", Toast.LENGTH_LONG).show();
}
}
});
int rc = GeneralUtils.isLicenseValid(getApplicationContext(), workFolder);
Log.i(Prefs.TAG, "License check RC: " + rc);
}
public class TranscdingBackground extends AsyncTask<String, Integer, Integer>
{
ProgressDialog progressDialog;
Activity _act;
String commandStr;
public TranscdingBackground (Activity act) {
_act = act;
}
@Override
protected void onPreExecute() {
EditText commandText = (EditText)findViewById(R.id.CommandText);
commandStr = commandText.getText().toString();
progressDialog = new ProgressDialog(_act);
progressDialog.setMessage("FFmpeg4Android Transcoding in progress...");
progressDialog.show();
}
protected Integer doInBackground(String... paths) {
Log.i(Prefs.TAG, "doInBackground started...");
// delete previous log
boolean isDeleted = GeneralUtils.deleteFileUtil(workFolder + "/vk.log");
Log.i(Prefs.TAG, "vk deleted: " + isDeleted);
PowerManager powerManager = (PowerManager)_act.getSystemService(Activity.POWER_SERVICE);
WakeLock wakeLock = powerManager.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "VK_LOCK");
Log.d(Prefs.TAG, "Acquire wake lock");
wakeLock.acquire();
///////////// Set Command using code (overriding the UI EditText) /////
//commandStr = "ffmpeg -y -i /sdcard/videokit/in.mp4 -strict experimental -s 320x240 -r 30 -aspect 3:4 -ab 48000 -ac 2 -ar 22050 -vcodec mpeg4 -b 2097152 /sdcard/videokit/out.mp4";
//String[] complexCommand = {"ffmpeg", "-y" ,"-i", "/sdcard/videokit/in.mp4","-strict","experimental","-s", "160x120","-r","25", "-vcodec", "mpeg4", "-b", "150k", "-ab","48000", "-ac", "2", "-ar", "22050", "/sdcard/videokit/out.mp4"};
///////////////////////////////////////////////////////////////////////
LoadJNI vk = new LoadJNI();
try {
vk.run(GeneralUtils.utilConvertToComplex(commandStr), workFolder, getApplicationContext());
GeneralUtils.copyFileToFolder(vkLogPath, demoVideoFolder);
} catch (Throwable e) {
Log.e(Prefs.TAG, "vk run exception.", e);
}
finally {
if (wakeLock.isHeld())
wakeLock.release();
else{
Log.i(Prefs.TAG, "Wake lock is already released, doing nothing");
}
}
Log.i(Prefs.TAG, "doInBackground finished");
return Integer.valueOf(0);
}
protected void onProgressUpdate(Integer... progress) {
}
@Override
protected void onCancelled() {
Log.i(Prefs.TAG, "onCancelled");
//progressDialog.dismiss();
super.onCancelled();
}
@Override
protected void onPostExecute(Integer result) {
Log.i(Prefs.TAG, "onPostExecute");
progressDialog.dismiss();
super.onPostExecute(result);
// finished Toast
String rc = null;
if (commandValidationFailedFlag) {
rc = "Command Vaidation Failed";
}
else {
rc = GeneralUtils.getReturnCodeFromLog(vkLogPath);
}
final String status = rc;
SimpleExample.this.runOnUiThread(new Runnable() {
public void run() {
Toast.makeText(SimpleExample.this, status, Toast.LENGTH_LONG).show();
if (status.equals("Transcoding Status: Failed")) {
Toast.makeText(SimpleExample.this, "Check: " + vkLogPath + " for more information.", Toast.LENGTH_LONG).show();
}
}
});
}
}
}
-
Code library for image processing [on hold]
1 September 2014, by user3928079
Dear Stack Overflow Users,
I am looking to create my programming portfolio as a graduate of Kennesaw State University by July 2015, when I graduate. I want to create at least 3 cross-platform programs related to video or animation. The package will include a program for playing video and music files, a program for converting video and music files, and either a program for upscaling video and image files or a program that helps animators create animation, like Flipbook Pro. Since Java is cross-platform, and Java is the language I’m more comfortable with, I’ll develop the software in Java, unless of course Python is cross-platform as well. Could you please recommend some image processing libraries that would help me develop these types of software programs? For the upscaling program, an upscaling algorithm from BenVista called S-Spline Max is needed, especially since I want to preserve the detail of the image, because in upscaling you cannot gain new information beyond the original information. What are some good image processing libraries I can use? I know of OpenCV, but what do a lot of programs, high profile or low profile, like Adobe Premiere Pro or DaVinci Resolve, use? FFmpeg is a good video library, but can it easily accomplish what I need? What are some others that can help me in my 11-month-long quest?
Regards,
Jordan White
-
Things I Have Learned About Emscripten
1 September 2015, by Multimedia Mike — Cirrus Retro
3 years ago, I released my Game Music Appreciation project, a website with a ludicrously uninspired title which allowed users a relatively frictionless method to experience a range of specialized music files related to old video games. However, the site required use of a special Chrome plugin. Ever since that initial release, my #1 most requested feature has been for a pure JavaScript version of the music player.
“Impossible!” I exclaimed. “There’s no way JS could ever run fast enough to run these CPU emulators and audio synthesizers in real time, and allow for the visualization that I demand!” Well, I’m pleased to report that I have proved myself wrong. I recently quietly launched a new site with what I hope is a catchier title, meant to evoke a cloud-based retro-music-as-a-service product: Cirrus Retro. Right now, it’s basically the same as the old site, but without the wonky Chrome-specific technology.
Along the way, I’ve learned a few things about using Emscripten that I thought might be useful to share with other people who wish to embark on a similar journey. This is geared more towards someone who has a stronger low-level background (such as C/C++) vs. high-level (like JavaScript).
General Goals
Do you want to cross-compile an entire desktop application, one that relies on an extensive GUI toolkit? That might be difficult (though I believe there is a path for porting Qt code directly with Emscripten). Your better wager might be to abstract out the core logic and processes of the program and then create a new web UI to access them.
Do you want to compile a game that basically just paints stuff to a 2D canvas? You’re in luck! Emscripten has a porting path for SDL. Make a version of your C/C++ software that targets SDL (generally not a tall order) and then compile that with Emscripten.
Do you just want to cross-compile some functionality that lives in a library? That’s what I’ve done with the Cirrus Retro project. For this, plan to compile the library into a JS file that exports some public functions that other, higher-level, native JS (i.e., JS written by a human and not a computer) will invoke.
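As a concrete illustration of that last path, and only as a sketch (the file and function names here are hypothetical), the compile line looks something like this, listing the exported C functions by their underscore-prefixed names:
emcc gme_player.c -o gme_player.js -s EXPORTED_FUNCTIONS="['_load_file','_render_audio']"
The hand-written JS then reaches those functions through the generated module object, much like the player._malloc() call in the memory example below.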
Memory Levels
When porting C/C++ software to JavaScript using Emscripten, you have to think on 2 different levels. Or perhaps you need to force JavaScript into a low-level C lens, especially if you want to write native JS code that will interact with Emscripten-compiled code. This often means somehow allocating chunks of memory via JS and passing them to the Emscripten-compiled functions. And you wouldn’t believe the type of gymnastics you need to execute to get native JS and Emscripten-compiled JS to cooperate.
“Emscripten: Pointers and Pointers” is the best (and, really, ONLY) explanation I could find for understanding the basic mechanics of this process, at least when I started this journey. However, there’s a mistake in the explanation that left me confused for a little while, and I’m at a loss to contact the author (doesn’t anyone post a simple email address anymore?).
Per the best of my understanding, Emscripten allocates a large JS array and calls that the memory space that the compiled C/C++ code is allowed to operate in. A pointer in C/C++ code will just be an index into that mighty array. Really, that’s not too far off from how a low-level program process is supposed to view memory: as a flat array.
Eventually, I just learned to cargo-cult my way through the memory allocation process. Here’s the JS code for allocating an Emscripten-compatible byte buffer, taken from my test harness (more on that later):
var musicBuffer = fs.readFileSync(testSpec['filename']);
var musicBufferBytes = new Uint8Array(musicBuffer);
var bytesMalloc = player._malloc(musicBufferBytes.length);
var bytes = new Uint8Array(player.HEAPU8.buffer, bytesMalloc, musicBufferBytes.length);
bytes.set(new Uint8Array(musicBufferBytes.buffer));
So, read the array of bytes from some input source, create a Uint8Array from the bytes, use the Emscripten _malloc() function to allocate enough bytes from the Emscripten memory array for the input bytes, then create a new array… then copy the bytes…
You know what? It’s late and I can’t remember how it works exactly, but it does. It has been a few months since I touched that code (been fighting with front-end website tech since then). You write that memory allocation code enough times and it begins to make sense, and then you hope you don’t have to write it too many more times.
Multithreading
You can’t port multithreaded code to JS via Emscripten. JavaScript has no notion of threads! If you don’t understand the computer science behind this limitation, a more thorough explanation is beyond the scope of this post. But trust me, I’ve thought about it a lot. In fact, the official Emscripten literature states that you should be able to port most any C/C++ code as long as 1) none of the code is proprietary (i.e., all the raw source is available); and 2) there are no threads.
Yes, I read about the experimental pthreads support added to Emscripten recently. Don’t get too excited; that won’t be ready and widespread for a long time to come as it relies on a new browser API. In the meantime, figure out how to make your multithreaded C/C++ code run in a single thread if you want it to run in a browser.
Printing Facility
Eventually, getting software to work boils down to debugging, and the most primitive tool in many a programmer’s toolbox is the humble print statement. A print statement allows you to inspect a piece of a program’s state at key junctures. Eventually, when you try to cross-compile C/C++ code to JS using Emscripten, something is not going to work correctly in the generated JS “object code” and you need to understand what. You’ll be pleading for a method of just inspecting one variable deep in the original C/C++ code.
I came up with this simple printf-workalike called emprintf():
#ifndef EMPRINTF_H
#define EMPRINTF_H

#include <stdio.h>
#include <stdarg.h>
#include <emscripten.h>

#define MAX_MSG_LEN 1000

/* NOTE: Don't pass format strings that contain single quote (') or newline
 * characters. */
static void emprintf(const char *format, ...)
{
    char msg[MAX_MSG_LEN];
    char consoleMsg[MAX_MSG_LEN + 16];
    va_list args;

    /* create the string */
    va_start(args, format);
    vsnprintf(msg, MAX_MSG_LEN, format, args);
    va_end(args);

    /* wrap the string in a console.log('') statement */
    snprintf(consoleMsg, MAX_MSG_LEN + 16, "console.log('%s')", msg);

    /* send the final string to the JavaScript console */
    emscripten_run_script(consoleMsg);
}

#endif /* EMPRINTF_H */
Put it in a file called “emprintf.h”. Include it into any C/C++ file where you need debugging visibility, use emprintf() as a replacement for printf() and the output will magically show up on the browser’s JavaScript debug console. Heed the comments and don’t put any single quotes or newlines in strings, and keep it under 1000 characters. I didn’t say it was perfect, but it has helped me a lot in my Emscripten adventures.
Optimization Levels
Remember to turn on optimization when compiling. I have empirically found that optimizing for size (-Os) leads to the best performance all around, in addition to having the smallest size. Just be sure to specify some optimization level. If you don’t, the default is -O0, which offers horrible performance when running in JS.
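For example (the file names here are placeholders), the flag goes on the emcc line like any other compiler option:
emcc -Os gme_player.c -o gme_player.js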
Static Compression For HTTP Delivery
JavaScript code compresses pretty efficiently, even after it has been optimized for size using -Os. I routinely see compression ratios between 3.5:1 and 5:1 using gzip.
Web servers in this day and age are supposed to be smart enough to detect when a requesting web browser can accept gzip-compressed data and do the compression on the fly. They’re even supposed to be smart enough to cache compressed output so the same content is not recompressed for each request. I would have to set up a series of tests to establish whether either of the foregoing assertions is correct and I can’t be bothered. Instead, I took it into my own hands. The trick is to pre-compress the JS files and then instruct the webserver to serve these files with a ‘Content-Type’ of ‘application/javascript’ and a ‘Content-Encoding’ of ‘gzip’.
- Compress your large Emscripten-built JS files with ‘gzip’: ‘gzip compiled-code.js’
- Rename them from extension .js.gz to .jsgz
- Tell the webserver to deliver .jsgz files with the correct Content-Type and Content-Encoding headers
To do that last step with Apache, specify these lines:
AddType application/javascript jsgz
AddEncoding gzip jsgz
They belong in either a directory’s .htaccess file or in the sitewide configuration (/etc/apache2/mods-available/mime.conf works on my setup).
Build System and Build Time Optimization
Oh goodie, build systems! I had a very specific manner in which I wanted to build my JS modules using Emscripten. Can I possibly coerce any of the many popular build systems to do this? It has been a few months since I worked on this problem specifically but I seem to recall that the build systems I tried to use would freak out at the prospect of compiling stuff to a final binary target of .js.
I had high hopes for Bazel, which Google released while I was developing Cirrus Retro. Surely, this is software that has been battle-tested in the harshest conditions of one of the most prominent software-developing companies in the world, needing to take into account the most bizarre corner cases and still build efficiently and correctly every time. And I have little doubt that it fulfills the order. Similarly, I’m confident that Google also has a team of no fewer than 100 or so people dedicated to developing and supporting the project within the organization. When you only have, at best, 1-2 hours per night to work on projects like this, you prefer not to fight with such cutting-edge technology, and after losing 2 or 3 nights trying to make a go of Bazel, I eventually put it aside.
I also tried to use Autotools. It failed horribly for me, mostly for my own carelessness and lack of early-project source control.
After that, it was strictly vanilla makefiles with no real dependency management. But you know what helps in these cases? ccache! Or at least, it would if it didn’t fail with Emscripten.
Quick tip: ccache has trouble with LLVM unless you set the CCACHE_CPP2 environment variable (e.g.: “export CCACHE_CPP2=1”). I don’t remember the specifics, but it magically fixes things. Then, the lazy build process becomes “make clean && make”.
Testing
If you have never used Node.js, testing Emscripten-compiled JS code might be a good opportunity to start. I was able to use Node.js to great effect for testing the individually-compiled music player modules, wiring up a series of invocations using Python for a broader test suite (wouldn’t want to go too deep down the JS rabbit hole, after all).
Be advised that Node.js doesn’t enjoy the same kind of JIT optimizations that the browser engines leverage. Thus, in the case of time-critical code like, say, an audio synthesis library, the code might not run in real time. But as long as it produces the correct bitwise waveform, that’s good enough for continuous integration.
Also, if you have largely been a low-level programmer for your whole career and are generally unfamiliar with the world of single-threaded, event-driven, callback-oriented programming, you might be in for a bit of a shock. When I wanted to learn how to read the contents of a file in Node.js, this is the first tutorial I found on the matter. I thought the code presented was a parody of bad coding style:
var fs = require("fs");
var fileName = "foo.txt";

fs.exists(fileName, function(exists) {
    if (exists) {
        fs.stat(fileName, function(error, stats) {
            fs.open(fileName, "r", function(error, fd) {
                var buffer = new Buffer(stats.size);

                fs.read(fd, buffer, 0, buffer.length, null, function(error, bytesRead, buffer) {
                    var data = buffer.toString("utf8", 0, buffer.length);

                    console.log(data);
                    fs.close(fd);
                });
            });
        });
    }
});

Apparently, this kind of thing doesn’t raise an eyebrow in the JS world.
Now, I understand and respect the JS programming model. But this was seriously frustrating when I first encountered it because a simple script like the one I was trying to write just has an ordered list of tasks to complete. When it asks for bytes from a file, it really has nothing better to do than to wait for the answer.
Thankfully, it turns out that Node’s fs module includes synchronous versions of the various file access functions. So it’s all good.
Conclusion
I’m sure I missed or underexplained some things. But if other brave souls are interested in dipping their toes in the waters of Emscripten, I hope these tips will come in handy.