
Media (91)
-
Richard Stallman et le logiciel libre
19 October 2011, by
Updated: May 2013
Language: French
Type: Text
-
Stereo master soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
Elephants Dream - Cover of the soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
#7 Ambience
16 October 2011, by
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
Other articles (84)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out.
-
Support for all types of media
10 April 2011
Unlike many other software packages and modern document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others); audio (MP3, Ogg, Wav and others); video (Avi, MP4, Ogv, mpg, mov, wmv and others); or textual content, code and other documents (Open Office, Microsoft Office (spreadsheet, presentation), web (HTML, CSS), LaTeX, Google Earth) (...)
-
MediaSPIP v0.2
21 June 2013, by
MediaSPIP 0.2 is the first stable MediaSPIP release.
Its official release date is June 21, 2013, and it is announced here.
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
On other sites (7973)
-
Recording a video using MediaRecorder
21 July 2016, by Cédric Portmann
I am currently using the TextureFromCameraActivity from Grafika to record a video in square (1:1) resolution. To do this I modified the GLES20.glViewport call so that the video is shifted to the top and appears square. Now I would like to record this square view using MediaRecorder, or at least record the camera at its normal resolution and then crop the result using FFmpeg. However, I get the same error over and over again and I can't figure out why.
The error I get:
start called in an invalid state: 4
And yes, I added all the necessary permissions:
android.permission.WRITE_EXTERNAL_STORAGE, android.permission.CAMERA,
android.permission.RECORD_VIDEO, android.permission.RECORD_AUDIO,
android.permission.STORAGE, android.permission.READ_EXTERNAL_STORAGE
Here is the modified code:
https://github.com/google/grafika
Thanks for your help :D
package com.android.grafika;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.opengl.GLES20;
import android.opengl.Matrix;
import android.os.Bundle;
import android.os.Environment;
import android.os.Handler;
import android.os.Looper;
import android.os.Message;
import android.util.Log;
import android.view.MotionEvent;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.widget.Button;
import android.widget.SeekBar;
import android.widget.TextView;
import android.app.Activity;
import android.widget.Toast;
import com.android.grafika.gles.Drawable2d;
import com.android.grafika.gles.EglCore;
import com.android.grafika.gles.GlUtil;
import com.android.grafika.gles.Sprite2d;
import com.android.grafika.gles.Texture2dProgram;
import com.android.grafika.gles.WindowSurface;
import java.io.File;
import java.io.IOException;
import java.lang.ref.WeakReference;
public class TextureFromCameraActivity extends Activity implements View.OnClickListener, SurfaceHolder.Callback,
SeekBar.OnSeekBarChangeListener {
private static final int DEFAULT_ZOOM_PERCENT = 0; // 0-100
private static final int DEFAULT_SIZE_PERCENT = 80; // 0-100
private static final int DEFAULT_ROTATE_PERCENT = 75; // 0-100
// Requested values; actual may differ.
private static final int REQ_CAMERA_WIDTH = 720;
private static final int REQ_CAMERA_HEIGHT = 720;
private static final int REQ_CAMERA_FPS = 30;
// The holder for our SurfaceView. The Surface can outlive the Activity (e.g. when
// the screen is turned off and back on with the power button).
//
// This becomes non-null after the surfaceCreated() callback is called, and gets set
// to null when surfaceDestroyed() is called.
private static SurfaceHolder sSurfaceHolder;
// Thread that handles rendering and controls the camera. Started in onResume(),
// stopped in onPause().
private RenderThread mRenderThread;
// Receives messages from renderer thread.
private MainHandler mHandler;
// User controls.
private SeekBar mZoomBar;
private SeekBar mSizeBar;
private SeekBar mRotateBar;
// These values are passed to us by the camera/render thread, and displayed in the UI.
// We could also just peek at the values in the RenderThread object, but we'd need to
// synchronize access carefully.
private int mCameraPreviewWidth, mCameraPreviewHeight;
private float mCameraPreviewFps;
private int mRectWidth, mRectHeight;
private int mZoomWidth, mZoomHeight;
private int mRotateDeg;
SurfaceHolder sh;
MediaRecorder recorder;
SurfaceHolder holder;
boolean recording = false;
public static final String TAG = "VIDEOCAPTURE";
private static final File OUTPUT_DIR = Environment.getExternalStorageDirectory();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
recorder = new MediaRecorder();
setContentView(R.layout.activity_texture_from_camera);
mHandler = new MainHandler(this);
SurfaceView cameraView = (SurfaceView) findViewById(R.id.cameraOnTexture_surfaceView);
sh = cameraView.getHolder();
cameraView.setClickable(true);// make the surface view clickable
sh.addCallback(this);
//prepareRecorder();
mZoomBar = (SeekBar) findViewById(R.id.tfcZoom_seekbar);
mSizeBar = (SeekBar) findViewById(R.id.tfcSize_seekbar);
mRotateBar = (SeekBar) findViewById(R.id.tfcRotate_seekbar);
mZoomBar.setProgress(DEFAULT_ZOOM_PERCENT);
mSizeBar.setProgress(DEFAULT_SIZE_PERCENT);
mRotateBar.setProgress(DEFAULT_ROTATE_PERCENT);
mZoomBar.setOnSeekBarChangeListener(this);
mSizeBar.setOnSeekBarChangeListener(this);
mRotateBar.setOnSeekBarChangeListener(this);
Button record_btn = (Button)findViewById(R.id.button);
record_btn.setOnClickListener(this);
initRecorder();
updateControls();
}
@Override
protected void onResume() {
Log.d(TAG, "onResume BEGIN");
super.onResume();
mRenderThread = new RenderThread(mHandler);
mRenderThread.setName("TexFromCam Render");
mRenderThread.start();
mRenderThread.waitUntilReady();
RenderHandler rh = mRenderThread.getHandler();
rh.sendZoomValue(mZoomBar.getProgress());
rh.sendSizeValue(mSizeBar.getProgress());
rh.sendRotateValue(mRotateBar.getProgress());
if (sSurfaceHolder != null) {
Log.d(TAG, "Sending previous surface");
rh.sendSurfaceAvailable(sSurfaceHolder, false);
} else {
Log.d(TAG, "No previous surface");
}
Log.d(TAG, "onResume END");
}
@Override
protected void onPause() {
Log.d(TAG, "onPause BEGIN");
super.onPause();
RenderHandler rh = mRenderThread.getHandler();
rh.sendShutdown();
try {
mRenderThread.join();
} catch (InterruptedException ie) {
// not expected
throw new RuntimeException("join was interrupted", ie);
}
mRenderThread = null;
Log.d(TAG, "onPause END");
}
@Override // SurfaceHolder.Callback
public void surfaceCreated(SurfaceHolder holder) {
Log.d(TAG, "surfaceCreated holder=" + holder + " (static=" + sSurfaceHolder + ")");
if (sSurfaceHolder != null) {
throw new RuntimeException("sSurfaceHolder is already set");
}
sSurfaceHolder = holder;
if (mRenderThread != null) {
// Normal case -- render thread is running, tell it about the new surface.
RenderHandler rh = mRenderThread.getHandler();
rh.sendSurfaceAvailable(holder, true);
} else {
// Sometimes see this on 4.4.x N5: power off, power on, unlock, with device in
// landscape and a lock screen that requires portrait. The surface-created
// message is showing up after onPause().
//
// Chances are good that the surface will be destroyed before the activity is
// unpaused, but we track it anyway. If the activity is un-paused and we start
// the RenderThread, the SurfaceHolder will be passed in right after the thread
// is created.
Log.d(TAG, "render thread not running");
}
recorder.setPreviewDisplay(holder.getSurface());
}
@Override // SurfaceHolder.Callback
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
Log.d(TAG, "surfaceChanged fmt=" + format + " size=" + width + "x" + height +
" holder=" + holder);
if (mRenderThread != null) {
RenderHandler rh = mRenderThread.getHandler();
rh.sendSurfaceChanged(format, width, height);
} else {
Log.d(TAG, "Ignoring surfaceChanged");
return;
}
}
@Override // SurfaceHolder.Callback
public void surfaceDestroyed(SurfaceHolder holder) {
// In theory we should tell the RenderThread that the surface has been destroyed.
if (mRenderThread != null) {
RenderHandler rh = mRenderThread.getHandler();
rh.sendSurfaceDestroyed();
}
Log.d(TAG, "surfaceDestroyed holder=" + holder);
sSurfaceHolder = null;
}
@Override // SeekBar.OnSeekBarChangeListener
public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
if (mRenderThread == null) {
// Could happen if we programmatically update the values after setting a listener
// but before starting the thread. Also, easy to cause this by scrubbing the seek
// bar with one finger then tapping "recents" with another.
Log.w(TAG, "Ignoring onProgressChanged received w/o RT running");
return;
}
RenderHandler rh = mRenderThread.getHandler();
// "progress" ranges from 0 to 100
if (seekBar == mZoomBar) {
//Log.v(TAG, "zoom: " + progress);
rh.sendZoomValue(progress);
} else if (seekBar == mSizeBar) {
//Log.v(TAG, "size: " + progress);
rh.sendSizeValue(progress);
} else if (seekBar == mRotateBar) {
//Log.v(TAG, "rotate: " + progress);
rh.sendRotateValue(progress);
} else {
throw new RuntimeException("unknown seek bar");
}
// If we're getting preview frames quickly enough we don't really need this, but
// we don't want to have chunky-looking resize movement if the camera is slow.
// OTOH, if we get the updates too quickly (60fps camera?), this could jam us
// up and cause us to run behind. So use with caution.
rh.sendRedraw();
}
@Override // SeekBar.OnSeekBarChangeListener
public void onStartTrackingTouch(SeekBar seekBar) {}
@Override // SeekBar.OnSeekBarChangeListener
public void onStopTrackingTouch(SeekBar seekBar) {}
@Override
/**
* Handles any touch events that aren't grabbed by one of the controls.
*/
public boolean onTouchEvent(MotionEvent e) {
float x = e.getX();
float y = e.getY();
switch (e.getAction()) {
case MotionEvent.ACTION_MOVE:
case MotionEvent.ACTION_DOWN:
//Log.v(TAG, "onTouchEvent act=" + e.getAction() + " x=" + x + " y=" + y);
if (mRenderThread != null) {
RenderHandler rh = mRenderThread.getHandler();
rh.sendPosition((int) x, (int) y);
// Forcing a redraw can cause sluggish-looking behavior if the touch
// events arrive quickly.
//rh.sendRedraw();
}
break;
default:
break;
}
return true;
}
/**
* Updates the current state of the controls.
*/
private void updateControls() {
String str = getString(R.string.tfcCameraParams, mCameraPreviewWidth,
mCameraPreviewHeight, mCameraPreviewFps);
TextView tv = (TextView) findViewById(R.id.tfcCameraParams_text);
tv.setText(str);
str = getString(R.string.tfcRectSize, mRectWidth, mRectHeight);
tv = (TextView) findViewById(R.id.tfcRectSize_text);
tv.setText(str);
str = getString(R.string.tfcZoomArea, mZoomWidth, mZoomHeight);
tv = (TextView) findViewById(R.id.tfcZoomArea_text);
tv.setText(str);
}
@Override
public void onClick(View view) {
if (recording) {
recorder.stop();
recording = false;
// Let's initRecorder so we can record again
initRecorder();
prepareRecorder();
} else {
recording = true;
recorder.start();
}
}
private void initRecorder() {
recorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
CamcorderProfile cpHigh = CamcorderProfile
.get(CamcorderProfile.QUALITY_HIGH);
recorder.setProfile(cpHigh);
String path = Environment.getExternalStorageDirectory() + File.separator
+ Environment.DIRECTORY_DCIM + File.separator + "AlphaRun";
recorder.setOutputFile(path);
recorder.setMaxDuration(50000); // 50 seconds
recorder.setMaxFileSize(5000000); // Approximately 5 megabytes
}
private void prepareRecorder() {
try {
recorder.prepare();
} catch (IllegalStateException e) {
e.printStackTrace();
finish();
} catch (IOException e) {
e.printStackTrace();
finish();
}
}
/**
* Thread that handles all rendering and camera operations.
*/
private static class RenderThread extends Thread implements
SurfaceTexture.OnFrameAvailableListener {
// Object must be created on render thread to get correct Looper, but is used from
// UI thread, so we need to declare it volatile to ensure the UI thread sees a fully
// constructed object.
private volatile RenderHandler mHandler;
// Used to wait for the thread to start.
private Object mStartLock = new Object();
private boolean mReady = false;
private MainHandler mMainHandler;
private Camera mCamera;
private int mCameraPreviewWidth, mCameraPreviewHeight;
private EglCore mEglCore;
private WindowSurface mWindowSurface;
private int mWindowSurfaceWidth;
private int mWindowSurfaceHeight;
// Receives the output from the camera preview.
private SurfaceTexture mCameraTexture;
// Orthographic projection matrix.
private float[] mDisplayProjectionMatrix = new float[16];
private Texture2dProgram mTexProgram;
private final ScaledDrawable2d mRectDrawable =
new ScaledDrawable2d(Drawable2d.Prefab.RECTANGLE);
private final Sprite2d mRect = new Sprite2d(mRectDrawable);
private int mZoomPercent = DEFAULT_ZOOM_PERCENT;
private int mSizePercent = DEFAULT_SIZE_PERCENT;
private int mRotatePercent = DEFAULT_ROTATE_PERCENT;
private float mPosX, mPosY;
/**
* Constructor. Pass in the MainHandler, which allows us to send stuff back to the
* Activity.
*/
public RenderThread(MainHandler handler) {
mMainHandler = handler;
}
/**
* Thread entry point.
*/
@Override
public void run() {
Looper.prepare();
// We need to create the Handler before reporting ready.
mHandler = new RenderHandler(this);
synchronized (mStartLock) {
mReady = true;
mStartLock.notify(); // signal waitUntilReady()
}
// Prepare EGL and open the camera before we start handling messages.
mEglCore = new EglCore(null, 0);
openCamera(REQ_CAMERA_WIDTH, REQ_CAMERA_HEIGHT, REQ_CAMERA_FPS);
Looper.loop();
Log.d(TAG, "looper quit");
releaseCamera();
releaseGl();
mEglCore.release();
synchronized (mStartLock) {
mReady = false;
}
}
/**
* Waits until the render thread is ready to receive messages.
* <p>
* Call from the UI thread.
*/
public void waitUntilReady() {
synchronized (mStartLock) {
while (!mReady) {
try {
mStartLock.wait();
} catch (InterruptedException ie) { /* not expected */ }
}
}
}
/**
* Shuts everything down.
*/
private void shutdown() {
Log.d(TAG, "shutdown");
Looper.myLooper().quit();
}
/**
* Returns the render thread's Handler. This may be called from any thread.
*/
public RenderHandler getHandler() {
return mHandler;
}
/**
* Handles the surface-created callback from SurfaceView. Prepares GLES and the Surface.
*/
private void surfaceAvailable(SurfaceHolder holder, boolean newSurface) {
Surface surface = holder.getSurface();
mWindowSurface = new WindowSurface(mEglCore, surface, false);
mWindowSurface.makeCurrent();
// Create and configure the SurfaceTexture, which will receive frames from the
// camera. We set the textured rect's program to render from it.
mTexProgram = new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_EXT);
int textureId = mTexProgram.createTextureObject();
mCameraTexture = new SurfaceTexture(textureId);
mRect.setTexture(textureId);
if (!newSurface) {
// This Surface was established on a previous run, so no surfaceChanged()
// message is forthcoming. Finish the surface setup now.
//
// We could also just call this unconditionally, and perhaps do an unnecessary
// bit of reallocating if a surface-changed message arrives.
mWindowSurfaceWidth = mWindowSurface.getWidth();
mWindowSurfaceHeight = mWindowSurface.getWidth();
finishSurfaceSetup();
}
mCameraTexture.setOnFrameAvailableListener(this);
}
/**
* Releases most of the GL resources we currently hold (anything allocated by
* surfaceAvailable()).
* <p>
* Does not release EglCore.
*/
private void releaseGl() {
GlUtil.checkGlError("releaseGl start");
if (mWindowSurface != null) {
mWindowSurface.release();
mWindowSurface = null;
}
if (mTexProgram != null) {
mTexProgram.release();
mTexProgram = null;
}
GlUtil.checkGlError("releaseGl done");
mEglCore.makeNothingCurrent();
}
/**
* Handles the surfaceChanged message.
* <p>
* We always receive surfaceChanged() after surfaceCreated(), but surfaceAvailable()
* could also be called with a Surface created on a previous run. So this may not
* be called.
*/
private void surfaceChanged(int width, int height) {
Log.d(TAG, "RenderThread surfaceChanged " + width + "x" + height);
mWindowSurfaceWidth = width;
mWindowSurfaceHeight = width;
finishSurfaceSetup();
}
/**
* Handles the surfaceDestroyed message.
*/
private void surfaceDestroyed() {
// In practice this never appears to be called -- the activity is always paused
// before the surface is destroyed. In theory it could be called though.
Log.d(TAG, "RenderThread surfaceDestroyed");
releaseGl();
}
/**
* Sets up anything that depends on the window size.
* <p>
* Open the camera (to set mCameraAspectRatio) before calling here.
*/
private void finishSurfaceSetup() {
int width = mWindowSurfaceWidth;
int height = mWindowSurfaceHeight;
Log.d(TAG, "finishSurfaceSetup size=" + width + "x" + height +
" camera=" + mCameraPreviewWidth + "x" + mCameraPreviewHeight);
// Viewport shifted up by 700 px (modified from Grafika's original full-window viewport so the visible area appears square).
GLES20.glViewport(0, 700, width, height);
// Simple orthographic projection, with (0,0) in lower-left corner.
Matrix.orthoM(mDisplayProjectionMatrix, 0, 0, width, 0, height, -1, 1);
// Default position is center of screen.
mPosX = width / 2.0f;
mPosY = height / 2.0f;
updateGeometry();
// Ready to go, start the camera.
Log.d(TAG, "starting camera preview");
try {
mCamera.setPreviewTexture(mCameraTexture);
} catch (IOException ioe) {
throw new RuntimeException(ioe);
}
mCamera.startPreview();
}
/**
* Updates the geometry of mRect, based on the size of the window and the current
* values set by the UI.
*/
private void updateGeometry() {
int width = mWindowSurfaceWidth;
int height = mWindowSurfaceHeight;
int smallDim = Math.min(width, height);
// Max scale is a bit larger than the screen, so we can show over-size.
float scaled = smallDim * (mSizePercent / 100.0f) * 1.25f;
float cameraAspect = (float) mCameraPreviewWidth / mCameraPreviewHeight;
int newWidth = Math.round(scaled * cameraAspect);
int newHeight = Math.round(scaled);
float zoomFactor = 1.0f - (mZoomPercent / 100.0f);
int rotAngle = Math.round(360 * (mRotatePercent / 100.0f));
mRect.setScale(newWidth, newHeight);
mRect.setPosition(mPosX, mPosY);
mRect.setRotation(rotAngle);
mRectDrawable.setScale(zoomFactor);
mMainHandler.sendRectSize(newWidth, newHeight);
mMainHandler.sendZoomArea(Math.round(mCameraPreviewWidth * zoomFactor),
Math.round(mCameraPreviewHeight * zoomFactor));
mMainHandler.sendRotateDeg(rotAngle);
}
@Override // SurfaceTexture.OnFrameAvailableListener; runs on arbitrary thread
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
mHandler.sendFrameAvailable();
}
/**
* Handles incoming frame of data from the camera.
*/
private void frameAvailable() {
mCameraTexture.updateTexImage();
draw();
}
/**
* Draws the scene and submits the buffer.
*/
private void draw() {
GlUtil.checkGlError("draw start");
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
mRect.draw(mTexProgram, mDisplayProjectionMatrix);
mWindowSurface.swapBuffers();
GlUtil.checkGlError("draw done");
}
/**
* Opens a camera, and attempts to establish preview mode at the specified width
* and height with a fixed frame rate.
* <p>
* Sets mCameraPreviewWidth / mCameraPreviewHeight.
*/
private void openCamera(int desiredWidth, int desiredHeight, int desiredFps) {
if (mCamera != null) {
throw new RuntimeException("camera already initialized");
}
Camera.CameraInfo info = new Camera.CameraInfo();
// Try to find a back-facing camera.
int numCameras = Camera.getNumberOfCameras();
for (int i = 0; i < numCameras; i++) {
Camera.getCameraInfo(i, info);
if (info.facing == Camera.CameraInfo.CAMERA_FACING_BACK) {
mCamera = Camera.open(i);
break;
}
}
if (mCamera == null) {
Log.d(TAG, "No front-facing camera found; opening default");
mCamera = Camera.open(); // opens first back-facing camera
}
if (mCamera == null) {
throw new RuntimeException("Unable to open camera");
}
Camera.Parameters parms = mCamera.getParameters();
CameraUtils.choosePreviewSize(parms, desiredWidth, desiredHeight);
parms.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
// Try to set the frame rate to a constant value.
int thousandFps = CameraUtils.chooseFixedPreviewFps(parms, desiredFps * 1000);
// Give the camera a hint that we're recording video. This can have a big
// impact on frame rate.
parms.setRecordingHint(true);
mCamera.setParameters(parms);
int[] fpsRange = new int[2];
Camera.Size mCameraPreviewSize = parms.getPreviewSize();
parms.getPreviewFpsRange(fpsRange);
String previewFacts = mCameraPreviewSize.width + "x" + mCameraPreviewSize.height;
if (fpsRange[0] == fpsRange[1]) {
previewFacts += " @" + (fpsRange[0] / 1000.0) + "fps";
} else {
previewFacts += " @[" + (fpsRange[0] / 1000.0) +
" - " + (fpsRange[1] / 1000.0) + "] fps";
}
Log.i(TAG, "Camera config: " + previewFacts);
mCameraPreviewWidth = mCameraPreviewSize.width;
mCameraPreviewHeight = mCameraPreviewSize.height;
mMainHandler.sendCameraParams(mCameraPreviewWidth, mCameraPreviewHeight,
thousandFps / 1000.0f);
}
/**
* Stops camera preview, and releases the camera to the system.
*/
private void releaseCamera() {
if (mCamera != null) {
mCamera.stopPreview();
mCamera.release();
mCamera = null;
Log.d(TAG, "releaseCamera -- done");
}
}
}
}
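A note on the error quoted above: "start called in an invalid state: 4" is what MediaRecorder throws when start() is called from a state other than Prepared. In the activity as posted, initRecorder() configures the recorder in onCreate() but the prepareRecorder() call there is commented out, so the first tap on the record button reaches recorder.start() on an unprepared recorder. The snippet below is only a minimal sketch of the documented configure/prepare/start order, not a drop-in fix for this activity; the method name, output file name and error handling are placeholders added for illustration.

// Hedged sketch of the MediaRecorder call order (configure, prepare, start).
// Method name, output path and error handling are illustrative placeholders.
private void startCameraRecording(Surface previewSurface) {
    MediaRecorder rec = new MediaRecorder();
    rec.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
    rec.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
    rec.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    // setOutputFile() expects a concrete file, not a directory.
    File out = new File(Environment.getExternalStoragePublicDirectory(
            Environment.DIRECTORY_DCIM), "AlphaRun.mp4");
    rec.setOutputFile(out.getAbsolutePath());
    rec.setPreviewDisplay(previewSurface);
    try {
        rec.prepare(); // must succeed before start(), otherwise
        rec.start();   // "start called in an invalid state" is thrown
    } catch (IOException | IllegalStateException e) {
        rec.release();
        throw new RuntimeException("MediaRecorder setup failed", e);
    }
}

Note also that with VideoSource.DEFAULT or CAMERA, MediaRecorder captures the raw camera frames, not the GLES output drawn by the render thread, so even a correctly prepared recorder will not record the shifted square viewport; recording at the camera's native size and cropping afterwards (for example with FFmpeg's crop=w:h:x:y filter, as the question already suggests) may be the simpler route.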
-
Logic and lawyers
22 May 2013, by Mans — Law and liberty
Reading about various patent litigation cases, I am struck by the frequency with which common logical fallacies such as the Appeal to Consequences are committed. We shall look at a couple of recent examples.
In conjunction with the Federal Circuit ruling in CLS Bank v. Alice Corp., Judge Moore, joined by three others, filed a dissenting opinion wherein we find the following:
I am concerned that the current interpretation of § 101, and in particular the abstract idea exception, is causing a free fall in the patent system. [...] And let’s be clear: if all of these claims, including the system claims, are not patent-eligible, this case is the death of hundreds of thousands of patents [...].
A footnote adds:
If the reasoning of Judge Lourie’s opinion were adopted, it would decimate the electronics and software industries. [...] There has never been a case which could do more damage to the patent system than this one.
From the above, I get the impression that Moore is primarily concerned with protecting the system and maintaining the status quo, and less with ruling in line with the logical consequences of statute and case law. Furthermore, her argument rests on the premise that a weaker patent system would “decimate the industries,” a notion supported by little evidence yet presented by Moore as an obvious truth. In fact, research exists suggesting that many important innovations are never actually patented. Let us also not overlook the fact that European companies do not appear to be suffering from the much weaker patent protection for software afforded there.
Judge Moore’s reasoning can be summarised in three steps:
- Ruling this way could be disruptive to the patent system.
- The industry relies on patents.
- Therefore we must not rule this way.
Not only does she commit the aforementioned logical fallacy, she does so by way of invalid arguments.
The second example of such fallacious reasoning comes from the Supreme Court ruling in Bowman v. Monsanto:
We have always drawn the boundaries of the exhaustion doctrine to exclude that activity [copying], so that the patentee retains an undiminished right to prohibit others from making the thing his patent protects. [...] That is because, once again, if simple copying were a protected use, a patent would plummet in value after the first sale of the first item containing the invention. The undiluted patent monopoly, it might be said, would extend not for 20 years (as the Patent Act promises), but for only one transaction. And that would result in less incentive for innovation than Congress wanted. Hence our repeated insistence that exhaustion applies only to the particular item sold, and not to reproductions.
Here we find the same pattern repeated. The aim of the court appears to have been ensuring the continued validity of this class of patents, not reaching a logical conclusion regarding the question of infringement. Once again, we can break the reasoning down into three steps :
- A non-infringement ruling would weaken the patent.
- Weaker patents would provide less incentive for innovation.
- Therefore we must rule infringement.
As in the first example, the argument presented in step two is at best questionable, and no supporting evidence is provided.
These are, unfortunately, not the only examples of such fallacies; one might even describe them as ubiquitous. Does a law education not include any material on logical reasoning? Ought it not? While we can never hope to find any kind of universal truth on which to base our laws, we should at least strive to make our system logically consistent. If we do not, notions such as fairness and justice lose their meanings.
-
Video Recording on Android from Frames using javaCV
23 June 2016, by jawad bin zafar
I am using this library, javaCV, to record video on Android. They have provided a sample VideoRecording activity, but there is some bug and I could not figure out what I am doing wrong or what is missing that causes it.
package org.bytedeco.javacv.recordactivity;
import android.app.Activity;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.PowerManager;
import android.util.Log;
import android.view.Display;
import android.view.KeyEvent;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.LinearLayout;
import android.widget.RelativeLayout;
import java.io.IOException;
import java.nio.ShortBuffer;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import static org.bytedeco.javacpp.opencv_core.*;
public class RecordActivity extends Activity implements OnClickListener {
private final static String CLASS_LABEL = "RecordActivity";
private final static String LOG_TAG = CLASS_LABEL;
private PowerManager.WakeLock mWakeLock;
private String ffmpeg_link = "/mnt/sdcard/stream.flv";
long startTime = 0;
boolean recording = false;
private volatile FFmpegFrameRecorder recorder;
private boolean isPreviewOn = false;
private int sampleAudioRateInHz = 44100;
private int imageWidth = 320;
private int imageHeight = 240;
private int frameRate = 30;
/* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;
/* video data getting thread */
private Camera cameraDevice;
private CameraView cameraView;
private IplImage yuvIplimage = null;
/* layout setting */
private final int bg_screen_bx = 232;
private final int bg_screen_by = 128;
private final int bg_screen_width = 700;
private final int bg_screen_height = 500;
private final int bg_width = 1123;
private final int bg_height = 715;
private final int live_width = 640;
private final int live_height = 480;
private int screenWidth, screenHeight;
private Button btnRecorderControl;
/** The number of seconds in the continuous record loop (or 0 to disable loop). */
final int RECORD_LENGTH = 10;
IplImage[] images;
long[] timestamps;
ShortBuffer[] samples;
int imagesIndex, samplesIndex;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
setContentView(R.layout.main);
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL);
mWakeLock.acquire();
initLayout();
}
@Override
protected void onResume() {
super.onResume();
if (mWakeLock == null) {
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL);
mWakeLock.acquire();
}
}
@Override
protected void onPause() {
super.onPause();
if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}
@Override
protected void onDestroy() {
super.onDestroy();
recording = false;
if (cameraView != null) {
cameraView.stopPreview();
}
if(cameraDevice != null) {
cameraDevice.stopPreview();
cameraDevice.release();
cameraDevice = null;
}
if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}
private void initLayout() {
/* get size of screen */
Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
screenWidth = display.getWidth();
screenHeight = display.getHeight();
RelativeLayout.LayoutParams layoutParam = null;
LayoutInflater myInflate = null;
myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
RelativeLayout topLayout = new RelativeLayout(this);
setContentView(topLayout);
LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.main, null);
layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
topLayout.addView(preViewLayout, layoutParam);
/* add control button: start and stop */
btnRecorderControl = (Button) findViewById(R.id.recorder_control);
btnRecorderControl.setText("Start");
btnRecorderControl.setOnClickListener(this);
/* add camera view */
int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
int prev_rw, prev_rh;
if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
prev_rh = display_height_d;
prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
} else {
prev_rw = display_width_d;
prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
}
layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);
cameraDevice = Camera.open();
Log.i(LOG_TAG, "cameara open");
cameraView = new CameraView(this, cameraDevice);
topLayout.addView(cameraView, layoutParam);
Log.i(LOG_TAG, "cameara preview start: OK");
}
//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
private void initRecorder() {
Log.w(LOG_TAG,"init recorder");
if (RECORD_LENGTH > 0) {
imagesIndex = 0;
images = new IplImage[RECORD_LENGTH * frameRate];
timestamps = new long[images.length];
for (int i = 0; i < images.length; i++) {
images[i] = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
timestamps[i] = -1;
}
} else if (yuvIplimage == null) {
yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
Log.i(LOG_TAG, "create yuvIplimage");
}
Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
recorder.setFormat("flv");
recorder.setSampleRate(sampleAudioRateInHz);
// Set in the surface changed method
recorder.setFrameRate(frameRate);
Log.i(LOG_TAG, "recorder initialize success");
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
runAudioThread = true;
}
public void startRecording() {
initRecorder();
try {
recorder.start();
startTime = System.currentTimeMillis();
recording = true;
audioThread.start();
} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
}
public void stopRecording() {
runAudioThread = false;
try {
audioThread.join();
} catch (InterruptedException e) {
e.printStackTrace();
}
audioRecordRunnable = null;
audioThread = null;
if (recorder != null && recording) {
if (RECORD_LENGTH > 0) {
Log.v(LOG_TAG,"Writing frames");
try {
int firstIndex = imagesIndex % samples.length;
int lastIndex = (imagesIndex - 1) % images.length;
if (imagesIndex <= images.length) {
firstIndex = 0;
lastIndex = imagesIndex - 1;
}
if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
startTime = 0;
}
if (lastIndex < firstIndex) {
lastIndex += images.length;
}
for (int i = firstIndex; i <= lastIndex; i++) {
long t = timestamps[i % timestamps.length] - startTime;
if (t >= 0) {
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(images[i % images.length]);
}
}
firstIndex = samplesIndex % samples.length;
lastIndex = (samplesIndex - 1) % samples.length;
if (samplesIndex <= samples.length) {
firstIndex = 0;
lastIndex = samplesIndex - 1;
}
if (lastIndex < firstIndex) {
lastIndex += samples.length;
}
for (int i = firstIndex; i <= lastIndex; i++) {
recorder.record(samples[i % samples.length]);
}
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG,e.getMessage());
e.printStackTrace();
}
}
recording = false;
Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
recorder = null;
}
}
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
if (recording) {
stopRecording();
}
finish();
return true;
}
return super.onKeyDown(keyCode, event);
}
//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {
@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
// Audio
int bufferSize;
ShortBuffer audioData;
int bufferReadResult;
bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
if (RECORD_LENGTH > 0) {
samplesIndex = 0;
samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
for (int i = 0; i < samples.length; i++) {
samples[i] = ShortBuffer.allocate(bufferSize);
}
} else {
audioData = ShortBuffer.allocate(bufferSize);
}
Log.d(LOG_TAG, "audioRecord.startRecording()");
audioRecord.startRecording();
/* ffmpeg_audio encoding loop */
while (runAudioThread) {
if (RECORD_LENGTH > 0) {
audioData = samples[samplesIndex++ % samples.length];
audioData.position(0).limit(0);
}
//Log.v(LOG_TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
audioData.limit(bufferReadResult);
if (bufferReadResult > 0) {
Log.v(LOG_TAG,"bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
// Why? Good question...
if (recording) {
if (RECORD_LENGTH <= 0) try {
recorder.record(audioData);
//Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG,e.getMessage());
e.printStackTrace();
}
}
}
}
Log.v(LOG_TAG,"AudioThread Finished, release audioRecord");
/* encoding finish, release recorder */
if (audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(LOG_TAG,"audioRecord released");
}
}
}
//---------------------------------------------
// camera thread, gets and encodes video data
//---------------------------------------------
class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {
private SurfaceHolder mHolder;
private Camera mCamera;
public CameraView(Context context, Camera camera) {
super(context);
Log.w("camera","camera view");
mCamera = camera;
mHolder = getHolder();
mHolder.addCallback(CameraView.this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
mCamera.setPreviewCallback(CameraView.this);
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
stopPreview();
mCamera.setPreviewDisplay(holder);
} catch (IOException exception) {
mCamera.release();
mCamera = null;
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
Log.v(LOG_TAG,"Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);
Camera.Parameters camParams = mCamera.getParameters();
camParams.setPreviewSize(imageWidth, imageHeight);
Log.v(LOG_TAG,"Preview Framerate: " + camParams.getPreviewFrameRate());
camParams.setPreviewFrameRate(frameRate);
mCamera.setParameters(camParams);
startPreview();
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
try {
mHolder.addCallback(null);
mCamera.setPreviewCallback(null);
} catch (RuntimeException e) {
// The camera has probably just been released, ignore.
}
}
public void startPreview() {
if (!isPreviewOn && mCamera != null) {
isPreviewOn = true;
mCamera.startPreview();
}
}
public void stopPreview() {
if (isPreviewOn && mCamera != null) {
isPreviewOn = false;
mCamera.stopPreview();
}
}
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
startTime = System.currentTimeMillis();
return;
}
if (RECORD_LENGTH > 0) {
int i = imagesIndex++ % images.length;
yuvIplimage = images[i];
timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
}
/* get video data */
if (yuvIplimage != null && recording) {
yuvIplimage.getByteBuffer().put(data);
if (RECORD_LENGTH <= 0) try {
Log.v(LOG_TAG,"Writing Frame");
long t = 1000 * (System.currentTimeMillis() - startTime);
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(yuvIplimage);
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG,e.getMessage());
e.printStackTrace();
}
}
}
}
@Override
public void onClick(View v) {
if (!recording) {
startRecording();
Log.w(LOG_TAG, "Start Button Pushed");
btnRecorderControl.setText("Stop");
} else {
// This will trigger the audio recording loop to stop and then set isRecorderStart = false;
stopRecording();
Log.w(LOG_TAG, "Stop Button Pushed");
btnRecorderControl.setText("Start");
}
}
}
I am getting this error:
The type org.bytedeco.javacpp.avutil$AVFrame cannot be resolved. It is indirectly referenced from required .class files
at the following line in the above code:
if (RECORD_LENGTH <= 0) try {
recorder.record(audioData);
//Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
}
That is, at recorder.record(audioData). I don’t know what I am doing wrong here. I am new to JavaCV. Any help will be appreciated.
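For context on the error quoted above: "cannot be resolved. It is indirectly referenced from required .class files" is a compile-time classpath message rather than a bug in the activity itself. FFmpegFrameRecorder references org.bytedeco.javacpp types (avutil comes from the FFmpeg preset), so besides javacv.jar the matching javacpp and ffmpeg preset jars for the target platform have to be on the build path. The snippet below is only a hedged smoke test built from calls that already appear in the activity above (FFmpegFrameRecorder, IplImage); the wrapper class, file path and dimensions are placeholders added for illustration and are not part of the original question.

// Minimal FFmpegFrameRecorder smoke test, assuming the javacv, javacpp and
// ffmpeg-preset jars are all on the classpath. Path and sizes are placeholders.
import org.bytedeco.javacv.FFmpegFrameRecorder;
import static org.bytedeco.javacpp.opencv_core.*;

public class RecorderSmokeTest {
    public static void main(String[] args) throws Exception {
        FFmpegFrameRecorder recorder =
                new FFmpegFrameRecorder("/sdcard/smoke-test.flv", 320, 240, 1);
        recorder.setFormat("flv");
        recorder.setFrameRate(30);
        recorder.setSampleRate(44100);
        recorder.start();
        // Write one second of blank frames; the 2-channel 8-bit layout matches
        // the yuvIplimage buffers used by RecordActivity above.
        IplImage frame = IplImage.create(320, 240, IPL_DEPTH_8U, 2);
        for (int i = 0; i < 30; i++) {
            recorder.record(frame);
        }
        recorder.stop();
        recorder.release();
    }
}

If this compiles, the build path is complete and the original activity should no longer trip over the missing avutil class; if it does not, the missing piece is most likely the platform-specific javacpp/ffmpeg jars (for an Android build, typically the android-arm variants) rather than the Java code.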