
Media (91)
- Richard Stallman et le logiciel libre (19 October 2011; updated May 2013; language: French; type: text)
- Stereo master soundtrack (17 October 2011; updated October 2011; language: English; type: audio)
- Elephants Dream - Cover of the soundtrack (17 October 2011; updated October 2011; language: English; type: image)
- #7 Ambience (16 October 2011; updated June 2015; language: English; type: audio)
- #6 Teaser Music (16 October 2011; updated February 2013; language: English; type: audio)
- #5 End Title (16 October 2011; updated February 2013; language: English; type: audio)
Other articles (79)
-
Improving the base version
13 September 2013
Nicer multiple selection
The Chosen plugin improves the ergonomics of multiple-select fields; compare the two images that follow.
To use it, enable the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-select lists (...)
-
Custom menus
14 November 2010
MediaSPIP uses the Menus plugin to manage several configurable navigation menus.
This lets channel administrators configure those menus in detail.
Menus created at site initialisation
By default, three menus are created automatically when the site is initialised: the main menu; identifier: barrenav; this menu is generally inserted at the top of the page after the header block, and its identifier keeps it compatible with Zpip-based templates; (...)
-
Authorisations overridden by plugins
27 April 2010
MediaSPIP core
autoriser_auteur_modifier() so that visitors can edit their own information on the authors page
On other sites (7707)
-
Android studio + OpenCV + FFmpeg
11 July 2018, by t0m

I have a problem with code that works only on a Genymotion device running Android 4.1.1; on a Genymotion device with Android 5.0.1 and on a real Huawei Honor 4C (Android 4.4.2) it does not work.
I imported OpenCV 3.1 into Android Studio following: https://stackoverflow.com/a/27421494/4244605
I added JavaCV with FFmpeg from: https://github.com/bytedeco/javacv

Android Studio 1.5.1
minSdkVersion 15
compileSdkVersion 23

The code is only for testing.
OpenCVCameraActivity.java:
import android.app.Activity;
import android.hardware.Camera;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.view.MotionEvent;
import android.view.SubMenu;
import android.view.SurfaceView;
import android.view.View;
import android.view.WindowManager;
import android.widget.Toast;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Mat;
import java.io.File;
import java.nio.ShortBuffer;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;
import java.util.ListIterator;
@SuppressWarnings("ALL")
public class OpenCVCameraActivity extends Activity implements
CameraBridgeViewBase.CvCameraViewListener2,
View.OnTouchListener {
//name of activity, for DEBUGGING
private static final String TAG = OpenCVCameraActivity.class.getSimpleName();
private OpenCVCameraPreview mOpenCvCameraView;
private List<Camera.Size> mResolutionList;
private MenuItem[] mEffectMenuItems;
private SubMenu mColorEffectsMenu;
private MenuItem[] mResolutionMenuItems;
private SubMenu mResolutionMenu;
private static long frameCounter = 0;
long startTime = 0;
private Mat edgesMat;
boolean recording = false;
private int sampleAudioRateInHz = 44100;
private int imageWidth = 1920;
private int imageHeight = 1080;
private int frameRate = 30;
private Frame yuvImage = null;
private File ffmpeg_link;
private FFmpegFrameRecorder recorder;
/* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;
ShortBuffer[] samples;
private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
@Override
public void onManagerConnected(int status) {
switch (status) {
case LoaderCallbackInterface.SUCCESS:
Log.i(TAG, "OpenCV loaded successfully");
mOpenCvCameraView.enableView();
mOpenCvCameraView.setOnTouchListener(OpenCVCameraActivity.this);
break;
default:
super.onManagerConnected(status);
break;
}
}
};
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if(Static.DEBUG) Log.i(TAG, "onCreate()");
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
try {
setContentView(R.layout.activity_opencv);
mOpenCvCameraView = (OpenCVCameraPreview) findViewById(R.id.openCVCameraPreview);
mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
mOpenCvCameraView.setCvCameraViewListener(this);
//mOpenCvCameraView.enableFpsMeter();
ffmpeg_link = new File(Environment.getExternalStorageDirectory(), "stream.mp4");
} catch (Exception e){
e.printStackTrace();
}
}
@Override
protected void onRestart() {
if (Static.DEBUG) Log.i(TAG, "onRestart()");
super.onRestart();
}
@Override
protected void onStart() {
if (Static.DEBUG) Log.i(TAG, "onStart()");
super.onStart();
}
@Override
protected void onResume() {
if (Static.DEBUG) Log.i(TAG, "onResume()");
super.onResume();
if (!OpenCVLoader.initDebug()) {
Log.i(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_11, this, mLoaderCallback);
} else {
Log.i(TAG, "OpenCV library found inside package. Using it!");
mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
super.onCreateOptionsMenu(menu);
List<String> effects = mOpenCvCameraView.getEffectList();
if (effects == null) {
Log.e(TAG, "Color effects are not supported by device!");
return true;
}
mColorEffectsMenu = menu.addSubMenu("Color Effect");
mEffectMenuItems = new MenuItem[effects.size()];
int idx = 0;
ListIterator<String> effectItr = effects.listIterator();
while(effectItr.hasNext()) {
String element = effectItr.next();
mEffectMenuItems[idx] = mColorEffectsMenu.add(1, idx, Menu.NONE, element);
idx++;
}
mResolutionMenu = menu.addSubMenu("Resolution");
mResolutionList = mOpenCvCameraView.getResolutionList();
mResolutionMenuItems = new MenuItem[mResolutionList.size()];
ListIterator<Camera.Size> resolutionItr = mResolutionList.listIterator();
idx = 0;
while(resolutionItr.hasNext()) {
Camera.Size element = resolutionItr.next();
mResolutionMenuItems[idx] = mResolutionMenu.add(2, idx, Menu.NONE,
Integer.valueOf(element.width).toString() + "x" + Integer.valueOf(element.height).toString());
idx++;
}
return true;
}
@Override
protected void onPause() {
if (Static.DEBUG) Log.i(TAG, "onPause()");
super.onPause();
if (mOpenCvCameraView != null)
mOpenCvCameraView.disableView();
}
@Override
protected void onStop() {
if (Static.DEBUG) Log.i(TAG, "onStop()");
super.onStop();
}
@Override
protected void onDestroy() {
if (Static.DEBUG) Log.i(TAG, "onDestroy()");
super.onDestroy();
if (mOpenCvCameraView != null)
mOpenCvCameraView.disableView();
}
public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
++frameCounter;
//Log.i(TAG, "Frame number: "+frameCounter);
return inputFrame.rgba();
}
@Override
public void onCameraViewStarted(int width, int height) {
edgesMat = new Mat();
}
@Override
public void onCameraViewStopped() {
if (edgesMat != null)
edgesMat.release();
edgesMat = null;
}
public boolean onOptionsItemSelected(MenuItem item) {
Log.i(TAG, "called onOptionsItemSelected; selected item: " + item);
if (item.getGroupId() == 1)
{
mOpenCvCameraView.setEffect((String) item.getTitle());
Toast.makeText(this, mOpenCvCameraView.getEffect(), Toast.LENGTH_SHORT).show();
} else if (item.getGroupId() == 2) {
int id = item.getItemId();
Camera.Size resolution = mResolutionList.get(id);
mOpenCvCameraView.setResolution(resolution);
resolution = mOpenCvCameraView.getResolution();
String caption = Integer.valueOf(resolution.width).toString() + "x" + Integer.valueOf(resolution.height).toString();
Toast.makeText(this, caption, Toast.LENGTH_SHORT).show();
}
return true;
}
@Override
public boolean onTouch(View v, MotionEvent event) {
Log.i(TAG,"onTouch event");
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss");
String currentDateandTime = sdf.format(new Date());
String fileName = Environment.getExternalStorageDirectory().getPath() +
"/sample_picture_" + currentDateandTime + ".jpg";
mOpenCvCameraView.takePicture(fileName);
Toast.makeText(this, fileName + " saved", Toast.LENGTH_SHORT).show();
return false;
}
/**
* Click to ImageButton to start recording.
*/
public void onClickBtnStartRecord2(View v) {
if (Static.DEBUG) Log.i(TAG, "onClickBtnStartRecord()");
if(!recording)
startRecording();
else
stopRecording();
}
private void startRecording() {
if (Static.DEBUG) Log.i(TAG, "startRecording()");
initRecorder();
try {
recorder.start();
startTime = System.currentTimeMillis();
recording = true;
audioThread.start();
} catch(FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
}
private void stopRecording() {
if (Static.DEBUG) Log.i(TAG, "stopRecording()");
runAudioThread = false;
try {
audioThread.join();
} catch(InterruptedException e) {
e.printStackTrace();
}
audioRecordRunnable = null;
audioThread = null;
if(recorder != null && recording) {
recording = false;
Log.v(TAG, "Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
} catch(FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
recorder = null;
}
}
//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
private void initRecorder() {
Log.w(TAG, "init recorder");
try {
if (yuvImage == null) {
yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
Log.i(TAG, "create yuvImage");
}
Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.getAbsolutePath());
Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.exists());
recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
recorder.setFormat("mp4");
recorder.setSampleRate(sampleAudioRateInHz);
// Set in the surface changed method
recorder.setFrameRate(frameRate);
Log.i(TAG, "recorder initialize success");
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
runAudioThread = true;
} catch (Exception e){
e.printStackTrace();
}
}
//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {
@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
// Audio
int bufferSize;
ShortBuffer audioData;
int bufferReadResult;
bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
audioData = ShortBuffer.allocate(bufferSize);
Log.d(TAG, "audioRecord.startRecording()");
audioRecord.startRecording();
/* ffmpeg_audio encoding loop */
while(runAudioThread) {
//Log.v(TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
audioData.limit(bufferReadResult);
if(bufferReadResult > 0) {
Log.v(TAG, "bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when this thread starts, it never gets set according to this if statement...!!!
// Why? Good question...
if(recording) {
try {
recorder.recordSamples(audioData);
//Log.v(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
} catch(FFmpegFrameRecorder.Exception e) {
Log.v(TAG, e.getMessage());
e.printStackTrace();
}
}
}
}
Log.v(TAG, "AudioThread Finished, release audioRecord");
/* encoding finish, release recorder */
if(audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(TAG, "audioRecord released");
}
}
}
}
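The audio thread above captures 16-bit PCM samples into a ShortBuffer before handing them to recordSamples(). When the same data is captured as raw bytes instead (as the second question's recorder does), it has to be reinterpreted as little-endian shorts. A minimal plain-JVM sketch of that reinterpretation; the class and method names are mine, not part of the question:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;

public class PcmConvert {
    // Reinterpret little-endian 16-bit PCM bytes as shorts,
    // the same wrap/order/asShortBuffer chain used in the question.
    static short[] bytesToShorts(byte[] pcm) {
        ShortBuffer sb = ByteBuffer.wrap(pcm).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
        short[] out = new short[sb.limit()];
        sb.get(out);
        return out;
    }

    public static void main(String[] args) {
        // Bytes {0x02, 0x01} little-endian encode 0x0102 = 258;
        // {0xFF, 0x7F} encode Short.MAX_VALUE = 32767.
        short[] s = bytesToShorts(new byte[]{0x02, 0x01, (byte) 0xFF, (byte) 0x7F});
        System.out.println(s[0] + " " + s[1]); // 258 32767
    }
}
```

Getting the byte order wrong here silently produces noise rather than an error, which is worth ruling out before blaming the encoder.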
OpenCVCameraPreview.java:
import android.content.Context;
import android.hardware.Camera;
import android.util.AttributeSet;
import android.util.Log;
import org.opencv.android.JavaCameraView;
import java.io.FileOutputStream;
import java.util.List;
public class OpenCVCameraPreview extends JavaCameraView implements Camera.PictureCallback {
private static final String TAG = OpenCVCameraPreview.class.getSimpleName();
private String mPictureFileName;
public OpenCVCameraPreview(Context context, AttributeSet attrs) {
super(context, attrs);
}
public List<String> getEffectList() {
return mCamera.getParameters().getSupportedColorEffects();
}
public boolean isEffectSupported() {
return (mCamera.getParameters().getColorEffect() != null);
}
public String getEffect() {
return mCamera.getParameters().getColorEffect();
}
public void setEffect(String effect) {
Camera.Parameters params = mCamera.getParameters();
params.setColorEffect(effect);
mCamera.setParameters(params);
}
public List getResolutionList() {
return mCamera.getParameters().getSupportedPreviewSizes();
}
public void setResolution(Camera.Size resolution) {
disconnectCamera();
mMaxHeight = resolution.height;
mMaxWidth = resolution.width;
connectCamera(getWidth(), getHeight());
}
public Camera.Size getResolution() {
return mCamera.getParameters().getPreviewSize();
}
public void takePicture(final String fileName) {
Log.i(TAG, "Taking picture");
this.mPictureFileName = fileName;
// Postview and jpeg are sent in the same buffers if the queue is not empty when performing a capture.
// Clear up buffers to avoid mCamera.takePicture to be stuck because of a memory issue
mCamera.setPreviewCallback(null);
// PictureCallback is implemented by the current class
mCamera.takePicture(null, null, this);
}
@Override
public void onPictureTaken(byte[] data, Camera camera) {
Log.i(TAG, "Saving a bitmap to file");
// The camera preview was automatically stopped. Start it again.
mCamera.startPreview();
mCamera.setPreviewCallback(this);
// Write the image in a file (in jpeg format)
try {
FileOutputStream fos = new FileOutputStream(mPictureFileName);
fos.write(data);
fos.close();
} catch (java.io.IOException e) {
Log.e("PictureDemo", "Exception in photoCallback", e);
}
}
}
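One detail worth noting in onPictureTaken() above: the FileOutputStream is closed manually, so an exception thrown during write() would leak the stream. A small plain-Java sketch of the same write using try-with-resources, which closes the stream on every path; the class and names are mine:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class JpegWriter {
    // Write raw image bytes to a file. try-with-resources guarantees
    // the stream is closed even if write() throws, unlike a manual close().
    static void writeBytes(String fileName, byte[] data) throws IOException {
        try (FileOutputStream fos = new FileOutputStream(fileName)) {
            fos.write(data);
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("sample_picture_", ".jpg");
        // 0xFF 0xD8 is the JPEG start-of-image marker.
        writeBytes(tmp.toString(), new byte[]{(byte) 0xFF, (byte) 0xD8});
        System.out.println(Files.size(tmp)); // 2
    }
}
```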
Gradle:
apply plugin: 'com.android.application'
android {
compileSdkVersion 23
buildToolsVersion "23.0.2"
defaultConfig {
applicationId "co.example.example"
minSdkVersion 15
targetSdkVersion 23
versionCode 1
versionName "1.0"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
}
}
packagingOptions {
exclude 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.properties'
exclude 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.xml'
exclude 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.properties'
exclude 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.xml'
}
}
repositories {
mavenCentral()
}
dependencies {
compile fileTree(include: ['*.jar'], dir: 'libs')
testCompile 'junit:junit:4.12'
compile 'com.android.support:appcompat-v7:23.1.1'
compile 'com.google.android.gms:play-services-appindexing:8.1.0'
compile group: 'org.bytedeco', name: 'javacv', version: '1.1'
compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-arm'
compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-x86'
compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-arm'
compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-x86'
compile project(':openCVLibrary310')
}

proguard-rules.pro
Edited by: link

jniLibs:
app/src/main/jniLibs: armeabi, armeabi-v7a, arm64-v8a, mips, mips64, x86, x86_64
Problem
02-19 11:57:37.684 1759-1759/ I/OpenCVCameraActivity: onClickBtnStartRecord()
02-19 11:57:37.684 1759-1759/ I/OpenCVCameraActivity: startRecording()
02-19 11:57:37.684 1759-1759/ W/OpenCVCameraActivity: init recorder
02-19 11:57:37.691 1759-1759/ I/OpenCVCameraActivity: create yuvImage
02-19 11:57:37.691 1759-1759/ I/OpenCVCameraActivity: ffmpeg_url: /storage/emulated/0/stream.mp4
02-19 11:57:37.696 1759-1759/ I/OpenCVCameraActivity: ffmpeg_url: false
02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x1d arg 0x18cc3
02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x6ffffffe arg 0x21c30
02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x6fffffff arg 0x1
02-19 11:57:37.838 1759-1759/co.example.example E/art: dlopen("/data/app/co.example.example-2/lib/x86/libjniavutil.so", RTLD_LAZY) failed: dlopen failed: cannot locate symbol "av_version_info" referenced by "libjniavutil.so"...
02-19 11:57:37.843 1759-1759/co.example.example I/art: Rejecting re-init on previously-failed class java.lang.Class
02-19 11:57:37.844 1759-1759/co.example.example E/AndroidRuntime: FATAL EXCEPTION: main
Process: co.example.example, PID: 1759
java.lang.IllegalStateException: Could not execute method of the activity
at android.view.View$1.onClick(View.java:4020)
at android.view.View.performClick(View.java:4780)
at android.view.View$PerformClick.run(View.java:19866)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:135)
at android.app.ActivityThread.main(ActivityThread.java:5254)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698)
Caused by: java.lang.reflect.InvocationTargetException
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at android.view.View$1.onClick(View.java:4015)
at android.view.View.performClick(View.java:4780)
at android.view.View$PerformClick.run(View.java:19866)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:135)
at android.app.ActivityThread.main(ActivityThread.java:5254)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698)
Caused by: java.lang.UnsatisfiedLinkError: org.bytedeco.javacpp.avutil
at java.lang.Class.classForName(Native Method)
at java.lang.Class.forName(Class.java:309)
at org.bytedeco.javacpp.Loader.load(Loader.java:413)
at org.bytedeco.javacpp.Loader.load(Loader.java:381)
at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1650)
at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149)
at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:129)
at co.example.example.OpenCVCameraActivity.initRecorder(OpenCVCameraActivity.java:320)
at co.example.example.OpenCVCameraActivity.startRecording(OpenCVCameraActivity.java:266)
at co.example.example.OpenCVCameraActivity.onClickBtnStartRecord2(OpenCVCameraActivity.java:259)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at android.view.View$1.onClick(View.java:4015)
at android.view.View.performClick(View.java:4780)
at android.view.View$PerformClick.run(View.java:19866)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:135)
at android.app.ActivityThread.main(ActivityThread.java:5254)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698)
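The fatal error in the log above is an UnsatisfiedLinkError raised the first time avutil is loaded (here with "cannot locate symbol av_version_info", which typically indicates mismatched native library versions on the device). One way to make such failures easier to diagnose is to probe the native libraries explicitly at startup and log the loader's message instead of crashing on first use. A hedged plain-Java sketch; the class name and the probed library name are illustrative only:

```java
public class NativeLoadCheck {
    // Try to load a native library by name; return false and log the
    // loader's message instead of letting the UnsatisfiedLinkError propagate.
    static boolean tryLoad(String libName) {
        try {
            System.loadLibrary(libName);
            return true;
        } catch (UnsatisfiedLinkError e) {
            System.err.println("Failed to load " + libName + ": " + e.getMessage());
            return false;
        }
    }

    public static void main(String[] args) {
        // On a device this might probe e.g. "jniavutil"; here we just
        // demonstrate the graceful-failure path with a bogus name.
        System.out.println(tryLoad("no_such_library_xyz123"));
    }
}
```

Logging the exact dlopen message per library narrows the problem down to one .so and one missing symbol, rather than a generic crash in a click handler.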
-
Is there a case where I cannot create a file if I run a lot of threads? (feat. FFmpeg, thread)
26 June 2018, by Junburg

I am building an app that lets the user sing a duet with a singer. We use a thread (mAudioPlayer) that plays the background music, a vAudioPlayer that plays the singer's voice, an mRecordThread that records the user's voice, and a thread that concatenates the resulting files (mp3ConcatThread).
While the user is recording, the singer's voice is muted and the background music keeps playing; when the user is not recording, the singer's voice is played. Each such section must be written to an mp3 file, and the pieces then merged into a single file. However, it often happens that files are not recorded or merged properly.
Audio processing is done with FFmpeg. I suspect the following error might be the cause.
06-26 21:37:11.084 13017-13017/com.softcode.kihnoplay I/Choreographer: Skipped 72 frames! The application may be doing too much work on its main thread.
Could this kind of error prevent a file from being generated?
If you know the answer to this question, please reply. Thank you.
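The "Skipped 72 frames" warning says heavy work (plausibly the file concatenation) is running on the main thread, and several threads writing and merging the same temp files can also race. One common pattern, not specific to this code, is to funnel every merge job through a single background executor so jobs run sequentially and off the UI thread. A minimal sketch under that assumption; class and method names are mine:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class MergeQueue {
    // A single-threaded executor serializes merge jobs: no two jobs
    // touch the output files at the same time, and none block the UI thread.
    private final ExecutorService worker = Executors.newSingleThreadExecutor();
    private final AtomicInteger completed = new AtomicInteger();

    void submitMerge(Runnable mergeJob) {
        worker.execute(() -> {
            mergeJob.run();            // e.g. one FFmpeg concat invocation
            completed.incrementAndGet();
        });
    }

    // Wait for all queued jobs to finish; returns how many completed.
    int awaitAll() throws InterruptedException {
        worker.shutdown();
        worker.awaitTermination(10, TimeUnit.SECONDS);
        return completed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        MergeQueue q = new MergeQueue();
        for (int i = 0; i < 5; i++) q.submitMerge(() -> {});
        System.out.println(q.awaitAll()); // 5
    }
}
```

With this shape, the touch handler only enqueues work; whether it fixes the missing-file symptom depends on where the races actually are.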
Below is the related code; because it is long, I have included only the parts that seem relevant. For more information, please leave a comment.

Record_Thread.java:
public class Record_Thread {
private static final String LOG_TAG = Record_Thread.class.getSimpleName();
private static final int SAMPLE_RATE = 44100;
private int bufferSize = 0;
private String currentOutFile = null;
private Context context;
byte RECORDER_BPP = 16;
public Record_Thread(Record_interface listener) {
mListener = listener;
Player.currentCreateFileName = SmcInfo.APPDIRPATH + "/ucc/" + Player.getCurrentTime(false);
currentOutFile = Player.currentCreateFileName + ".pcm";
}
public Record_Thread(Record_interface listener, Context context) {
mListener = listener;
RecordActivity.currentCreateFileName = SmcInfo.APPDIRPATH + "/ucc/" + RecordActivity.getCurrentTime(false);
currentOutFile = RecordActivity.currentCreateFileName + ".pcm";
this.context = context;
}
private boolean isSampleTranspo;
private boolean isRecording;
public boolean isSharding = false;
private Record_interface mListener;
private Thread mThread;
public boolean recording() {
return mThread != null;
}
public void setSampleTranspo(boolean booleanValue) {
this.isSampleTranspo = booleanValue;
}
public boolean getSampleTranspo() {
return this.isSampleTranspo;
}
long startpoint = 0;
boolean posWrite = false;
public void startRecording() {
if (mThread != null)
return;
isRecording = true;
mThread = new Thread(new Runnable() {
@Override
public void run() {
record();
}
});
mThread.start();
}
public void stopRecording() {
if (mThread == null)
return;
isRecording = false;
mThread = null;
posWrite = false;
startpoint = 0;
}
public void startFileWrite(long startpoint) {
this.startpoint = startpoint;
this.posWrite = true;
}
public void stopFileWrite() {
this.posWrite = false;
}
private void record() {
try {
Log.v(LOG_TAG, "Start");
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_AUDIO);
bufferSize = AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);
if (bufferSize == AudioRecord.ERROR || bufferSize == AudioRecord.ERROR_BAD_VALUE) {
bufferSize = SAMPLE_RATE * 2;
}
short[] audioBuffer = new short[bufferSize];
short[] audioZero = new short[bufferSize];
AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE, AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
if (record.getState() != AudioRecord.STATE_INITIALIZED) {
Log.e(LOG_TAG, "Audio Record can't initialize!");
return;
}
record.startRecording();
Log.v(LOG_TAG, "Start recording");
long shortsRead = 0;
int readsize = 0;
File tempFile = new File(currentOutFile);
if (tempFile.exists())
tempFile.delete();
FileOutputStream fos = new FileOutputStream(currentOutFile);
byte[] audiodata = new byte[bufferSize];
while (isRecording && record != null) {
readsize = record.read(audiodata, 0, audiodata.length);
if (AudioRecord.ERROR_INVALID_OPERATION != readsize && fos != null) {
try {
if (readsize > 0 && readsize <= audiodata.length) {
fos.write(audiodata, 0, readsize);//TypeCast.shortToByte(audioBuffer)
fos.flush();
}
} catch (Exception ex) {
Log.e("AudioRecorder", ex.getMessage());
}
}
ShortBuffer sb = ByteBuffer.wrap(audiodata).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
short[] samples = new short[sb.limit()];
sb.get(samples);
if (isSampleTranspo) {
mListener.onAudioDataReceived(samples);
} else {
mListener.onAudioDataReceived(audioZero);
}
if (posWrite) {
FileOutputStream pos = null;
try {
if (startpoint > 0) {
if (context instanceof RecordActivity) {
pos = new FileOutputStream(RecordActivity.currentCreateFileName.replaceAll("/ucc/", "/tmp/") + "_" + caltime(String.valueOf((int) (startpoint / 1000)), false) + "_uv.pcm", true); // append to the existing file
Log.d(LOG_TAG, "record: " + pos.toString());
} else {
pos = new FileOutputStream(Player.currentCreateFileName.replaceAll("/ucc/", "/tmp/") + "_" + caltime(String.valueOf((int) (startpoint / 1000)), false) + "_uv.pcm", true); // append to the existing file
}
}
if (pos != null) { // pos stays null when startpoint == 0
pos.write(audiodata);
pos.flush();
}
} catch (Exception e) {
e.printStackTrace();
} finally {
if (pos != null)
pos.close();
pos = null;
}
}
}
if (fos != null)
fos.close();
mListener.onRecordEnd();
record.stop();
record.release();
} catch (IOException e) {
Log.e("AudioRecorder", e.getMessage());
}
}
private String caltime(String sMillis, boolean timeFormat) {
double dMillis = 0;
int minutes = 0;
int seconds = 0;
int millis = 0;
String sTime;
try {
dMillis = Double.parseDouble(sMillis);
} catch (Exception e) {
System.out.println(e.getMessage());
}
seconds = (int) (dMillis / 1000) % 60;
millis = (int) (dMillis % 1000);
if (seconds > 0) {
minutes = (int) (dMillis / 1000 / 60) % 60;
if (minutes > 0) {
if (timeFormat)
sTime = String.format("%02d:%02d.%d", minutes, seconds, millis);
else
sTime = String.format("%02d%02d%d", minutes, seconds, millis);
} else {
if (timeFormat)
sTime = String.format("%02d:%02d.%d", 0, seconds, millis);
else
sTime = String.format("%02d%02d%d", 0, seconds, millis);
}
} else {
if (timeFormat)
sTime = String.format("%02d:%02d.%d", 0, 0, millis);
else
sTime = String.format("%02d%02d%d", 0, 0, millis);
}
return sTime;
}
}

RecordActivity.java:
public class RecordActivity extends AppCompatActivity implements Player_interface, SeekBar.OnSeekBarChangeListener {
private static final String TAG = "RecordActivity";
public Context context = this;
private LinearLayout recordLayout;
private RelativeLayout recordBtn, saveBtn;
private CircleImageView userImg, artistImg;
private TextView songTitleTxt, playTimeTxt, progressTimeTxt;
private BlurBitmap blurBitmap;
private SeekBar seekBar;
private ImageView micBg1, micBg2;
private String assPath;
private String ampPath;
private int deviceWidth, deviceHeight;
public static AssRenderView assView;
public static LinearLayout lyricsLayout;
public static int lyricsWidth, lyricsHeight, layoutWidth;
public static LinearLayout.LayoutParams assViewParams;
public static String currentCreateFileName = null;
public static String mrPath;
public static String voicePath;
private String recMusicPath;
Player_Thread mAudioPlayer = null, vAudioPlayer = null, testPlayer = null;
private Record_Thread mRecordThread;
public static Mp3Concat_Thread mMp3ConcatThread;
long lastDuration = 0L;
private boolean isSeekbarTouch = false;
private ArrayList<Long> combineList;
CNetProgressdialog createMp3Dialog;
int bufferSize = 7104;
int SAMPLE_RATE = 44100;
int RECORDER_SAMPLERATE = 44100;
byte RECORDER_BPP = 16;
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
removeDir(SmcInfo.APPDIRPATH + "/tmp");
setContentView(R.layout.activity_record_phone);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
Window window = getWindow();
window.addFlags(WindowManager.LayoutParams.FLAG_LAYOUT_NO_LIMITS);
window.addFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_NAVIGATION);
}
recordLayout = (LinearLayout) findViewById(R.id.record_layout);
userImg = (CircleImageView) findViewById(R.id.user_img);
artistImg = (CircleImageView) findViewById(R.id.artist_img);
songTitleTxt = (TextView) findViewById(R.id.song_title_txt);
progressTimeTxt = (TextView) findViewById(R.id.progress_time_txt);
playTimeTxt = (TextView) findViewById(R.id.play_time_txt);
recordBtn = (RelativeLayout) findViewById(R.id.record_btn);
saveBtn = (RelativeLayout) findViewById(R.id.save_btn);
seekBar = (SeekBar) findViewById(R.id.seek_bar);
micBg1 = (ImageView) findViewById(R.id.mic_bg_small);
micBg2 = (ImageView) findViewById(R.id.mic_bg_big);
createMp3Dialog = new CNetProgressdialog(this);
GradientDrawable drawable = new GradientDrawable();
drawable.setColors(new int[]{
Color.parseColor("#32c49b"),
Color.parseColor("#19b2c3")
});
Intent intent = getIntent();
final String artistImgPath = intent.getStringExtra("artistImgPath");
final String songTitle = intent.getStringExtra("songTitle");
assPath = intent.getStringExtra("assPath");
ampPath = intent.getStringExtra("ampPath");
String playTime = intent.getStringExtra("playTime");
blurBitmap = new BlurBitmap();
songTitleTxt.setText(songTitle);
playTimeTxt.setText(playTime);
final Bitmap artistImgBitmap = blurBitmap.toBitmap(artistImgPath);
final Bitmap userImgBitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.dummy_artist_2);
final Bitmap userBlurImg = blurBitmap.blurRenderScript(this, userImgBitmap, 25);
final Bitmap artistBlurImg = blurBitmap.blurRenderScript(this, artistImgBitmap, 25);
artistImg.setImageBitmap(artistImgBitmap);
userImg.setImageBitmap(userBlurImg);
drawable.setGradientType(GradientDrawable.LINEAR_GRADIENT);
drawable.setOrientation(GradientDrawable.Orientation.TOP_BOTTOM);
recordLayout.setBackground(drawable);
play(ampToMp3(ampPath));
mRecordThread = new Record_Thread(new Record_interface() {
@Override
public void onAudioDataReceived(short[] data) {
}
@Override
public void onRecordEnd() {
}
}, context);
mMp3ConcatThread = new Mp3Concat_Thread(new Mp3Concat_interface() {
@Override
public void onAudioDataReceived(short[] data) {
}
@Override
public void onRecordEnd() {
createMp3Dialog.dismiss();
startPrelisteningActivity(recMusicPath, songTitle);
}
}, this);
if (!mRecordThread.recording()) {
mRecordThread.startRecording();
}
final Animation animZoomIn = AnimationUtils.loadAnimation(this, R.anim.zoom_in);
final Animation animZoomOut = AnimationUtils.loadAnimation(this, R.anim.zoom_out);
final Animation animMic1 = AnimationUtils.loadAnimation(this, R.anim.bg_mic_anim_1_phone);
final Animation animMic2 = AnimationUtils.loadAnimation(this, R.anim.bg_mic_anim_2_phone);
artistImg.startAnimation(animZoomIn);
combineList = new ArrayList<Long>();
recordBtn.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View view, MotionEvent motionEvent) {
switch (motionEvent.getAction()) {
case MotionEvent.ACTION_DOWN: {
long currentDuration = vAudioPlayer.getCurrentDuration();
// start recording (when combineList size is even)
if (mRecordThread != null) {
if (combineList.size() % 2 == 0) {
mRecordThread.startFileWrite(currentDuration);
combineList.add(currentDuration);
}
vAudioPlayer.setSampleTranspo(true);
mRecordThread.setSampleTranspo(true);
}
}
micBg1.setVisibility(View.VISIBLE);
micBg2.setVisibility(View.VISIBLE);
micBg1.startAnimation(animMic1);
micBg2.startAnimation(animMic2);
userImg.setImageBitmap(userImgBitmap);
userImg.startAnimation(animZoomIn);
artistImg.setImageBitmap(artistBlurImg);
artistImg.startAnimation(animZoomOut);
break;
case MotionEvent.ACTION_UP: {
long currentDuration = vAudioPlayer.getCurrentDuration();
if (mRecordThread != null) {
if (combineList.size() % 2 == 1) {
mRecordThread.startRecording();
mRecordThread.stopFileWrite();
File waveFile = new File(RecordActivity.currentCreateFileName.replaceAll("/ucc/", "/tmp/")
+ "_" + caltime(combineList.get(combineList.size() - 1) / 1000, false) + "_uv.pcm");
if (waveFile.exists()) {
copyWaveFile(RecordActivity.currentCreateFileName.replaceAll("/ucc/", "/tmp/") + "_" + caltime(combineList.get(combineList.size() - 1) / 1000, false) + "_uv.pcm",
RecordActivity.currentCreateFileName.replaceAll("/ucc/", "/tmp/") + "_" + caltime(combineList.get(combineList.size() - 1) / 1000, false) + "_u0.wav");
Log.d(TAG, "onTouch: " + currentCreateFileName);
if (mMp3ConcatThread != null) {
mMp3ConcatThread.startCombine(null, 3333333333333333333L, combineList.get(combineList.size() - 1), currentDuration);
}
}
combineList.add(currentDuration);
Log.d(TAG, "onTouch: " + combineList.size());
if (combineList.size() == 2) {
mMp3ConcatThread.startCombine(null, 0, combineList.get(combineList.size() - 2), currentDuration);
} else {
mMp3ConcatThread.startCombine(null, combineList.get(combineList.size() - 3), combineList.get(combineList.size() - 2), currentDuration);
}
}
vAudioPlayer.setSampleTranspo(false);
mRecordThread.setSampleTranspo(false);
}
}
micBg1.setVisibility(View.GONE);
micBg2.setVisibility(View.GONE);
micBg1.clearAnimation();
micBg2.clearAnimation();
userImg.setImageBitmap(userBlurImg);
userImg.startAnimation(animZoomOut);
artistImg.setImageBitmap(artistImgBitmap);
artistImg.startAnimation(animZoomIn);
break;
}
return false;
}
});
saveBtn.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
createMp3Dialog.show();
vAudioPlayer.setSampleTranspo(true);
vAudioPlayer.setlistenerStop(true);
if (assView != null)
assView.Destroy();
if (lyricsLayout != null) {
lyricsLayout.removeAllViews();
}
seekBar.setProgress(0);
seekBar.setMax(100);
Log.d(TAG, "donep3: " + "done");
if (mMp3ConcatThread != null) {
try {
mMp3ConcatThread.startCombine(combineList, 7777777777777777777L, combineList.get(combineList.size() - 1), lastDuration);
} catch (ArrayIndexOutOfBoundsException e) {
Log.e(TAG, "No recorded segment to combine", e); // getMessage() alone discarded the error
finish();
}
}
releaseAudioPlayer();
recMusicPath = SmcInfo.APPDIRPATH + "/ucc/" + currentCreateFileName.substring(currentCreateFileName.lastIndexOf('/') + 1, currentCreateFileName.length()) + ".mp3";
}
});
DisplayMetrics displayMetrics = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(displayMetrics);
deviceWidth = displayMetrics.widthPixels;
deviceHeight = displayMetrics.heightPixels;
lyricsWidth = deviceWidth;
lyricsHeight = deviceHeight;
Log.d(TAG, "onCreate: " + lyricsWidth + "/" + lyricsHeight);
layoutWidth = lyricsWidth * 2 / 3;
int parentAssViewHeight = ((lyricsHeight * 50) / 91) - 2;
if (layoutWidth > parentAssViewHeight)
layoutWidth = (parentAssViewHeight * 8) / 10;
assViewParams = new LinearLayout.LayoutParams(new ViewGroup.LayoutParams(layoutWidth * 2, layoutWidth));
assViewParams.gravity = Gravity.CENTER;
lyricsLayout = (LinearLayout)
findViewById(R.id.lyrics_layout);
if (assView != null) {
assView.Destroy();
}
if (lyricsLayout != null) {
lyricsLayout.removeAllViews();
}
assView = new AssRenderView(getApplicationContext(), layoutWidth * 13 / 10, layoutWidth);
File assFile = new File(assPath);
if (assFile.exists()) {
assView.ReadASSFile(assFile.toString(), true, layoutWidth * 2, layoutWidth * 5 / 7);
}
lyricsLayout.addView(assView, assViewParams);
lyricsLayout.setGravity(Gravity.CENTER);
assView.ShowASS(true);
seekBar.setOnSeekBarChangeListener(this);
seekBar.setProgress(0);
seekBar.setMax(100);
}
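The ACTION_DOWN/ACTION_UP branches above drive all of the recording logic off the parity of combineList: a start timestamp is appended only when the list size is even, and the matching end timestamp only when it is odd, so the list always holds alternating start/end pairs. A standalone sketch of that bookkeeping (class and method names are hypothetical, not part of the app):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal model of the combineList bookkeeping in the onTouch handler:
// down() mirrors ACTION_DOWN, up() mirrors ACTION_UP.
class SegmentTracker {
    final List<Long> marks = new ArrayList<>();

    // called on ACTION_DOWN with the player's current position;
    // only opens a segment if none is currently open (even size)
    void down(long currentDuration) {
        if (marks.size() % 2 == 0) {
            marks.add(currentDuration);
        }
    }

    // called on ACTION_UP; closes the open segment, if any (odd size)
    void up(long currentDuration) {
        if (marks.size() % 2 == 1) {
            marks.add(currentDuration);
        }
    }

    // number of completed (start, end) pairs
    int segments() {
        return marks.size() / 2;
    }
}
```

Because `down()` is a no-op while a segment is open and `up()` is a no-op while none is, stray duplicate touch events cannot corrupt the pairing.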
private void startPrelisteningActivity(String recMusicPath, String songTitle) {
Intent intent = new Intent(RecordActivity.this, PrelisteningActivity.class);
intent.putExtra("recMusicPath", recMusicPath);
intent.putExtra("songTitle", songTitle);
startActivityForResult(intent, 1);
}
private String[] ampToMp3(String ampPath) {
String[] pathArray = new String[2];
try {
File ampFile = new File(ampPath);
String ampName = ampFile.getName();
byte[] bytes = new byte[(int) ampFile.length()];
// InputStream.read() may return fewer bytes than requested, and the original
// streams were never closed; readFully in try-with-resources fixes both
// (needs java.io.DataInputStream)
try (DataInputStream dis = new DataInputStream(
new BufferedInputStream(new FileInputStream(ampFile), 8 * 1024))) {
dis.readFully(bytes);
}
byte[] vocalbytes = AMPFileUtility.getByteData(bytes, "voice");
byte[] mrbytes = AMPFileUtility.getByteData(bytes, "mr");
// replaceAll() treats ".amp" as a regex where '.' matches any character; replace() is literal
voicePath = SmcInfo.APPDIRPATH + "/audio/" + ampName.replace(".amp", "") + "_voice.mp3";
mrPath = SmcInfo.APPDIRPATH + "/audio/" + ampName.replace(".amp", "") + "_mr.mp3";
BufferedOutputStream bosVocal = new BufferedOutputStream(new FileOutputStream(voicePath));
bosVocal.write(vocalbytes);
bosVocal.flush();
bosVocal.close();
BufferedOutputStream bosMr = new BufferedOutputStream(new FileOutputStream(mrPath));
bosMr.write(mrbytes);
bosMr.flush();
bosMr.close();
} catch (Exception e) {
Log.e(TAG, "ampToMp3 failed", e); // getMessage() alone discarded the error
}
pathArray[0] = voicePath;
pathArray[1] = mrPath;
return pathArray;
}
private void play(String[] pathArray) {
releaseAudioPlayer();
String voicePath = pathArray[0];
String mrPath = pathArray[1];
mAudioPlayer = new Player_Thread();
mAudioPlayer.setOnAudioStreamInterface(this);
mAudioPlayer.setUrlString(mrPath);
mAudioPlayer.setlistenerStop(true);
vAudioPlayer = new Player_Thread();
vAudioPlayer.setOnAudioStreamInterface(this);
vAudioPlayer.setUrlString(voicePath);
vAudioPlayer.setlistenerStop(false);
try {
mAudioPlayer.play();
vAudioPlayer.play();
} catch (IOException e) {
e.printStackTrace();
}
}
private void releaseAudioPlayer() {
if (mAudioPlayer != null) {
mAudioPlayer.stop();
mAudioPlayer.release();
mAudioPlayer = null;
}
if (vAudioPlayer != null) {
vAudioPlayer.stop();
vAudioPlayer.release();
vAudioPlayer = null;
}
if (mRecordThread != null) {
mRecordThread.stopRecording();
}
}
public static String getCurrentTime(boolean dateForm) {
SimpleDateFormat dateFormat;
if (dateForm)
dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS"); // SSS is the milliseconds field
else
dateFormat = new SimpleDateFormat("yyyyMMdd_HHmmssSSS");
Calendar calendar = Calendar.getInstance();
return dateFormat.format(calendar.getTime());
}
private String caltime(long sMillis, boolean timeFormat) {
// The original nested branches dropped the minutes whenever seconds == 0
// (e.g. exactly 60000 ms printed as 00:00.0); computing all three fields
// unconditionally gives the same output in every other case.
int minutes = (int) (sMillis / 1000 / 60) % 60;
int seconds = (int) (sMillis / 1000) % 60;
int millis = (int) (sMillis % 1000);
String sTime = timeFormat
? String.format("%02d:%02d.%d", minutes, seconds, millis)
: String.format("%02d%02d%d", minutes, seconds, millis);
Log.d(TAG, "caltime: " + sTime);
return sTime;
}
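caltime() formats a player position in milliseconds as mm:ss.millis for display and for the segment file names. A TimeUnit-based equivalent of the `timeFormat == true` path, as a hedged alternative: it assumes a non-negative input and additionally zero-pads the milliseconds to three digits (`%03d`), which the original `%d` did not guarantee:

```java
import java.util.concurrent.TimeUnit;

// Standalone sketch: mm:ss.SSS formatting via TimeUnit.
// Differs from caltime(..., true) only in zero-padding the millis field.
class TimeFmt {
    static String mmSsMillis(long ms) {
        long minutes = TimeUnit.MILLISECONDS.toMinutes(ms) % 60;
        long seconds = TimeUnit.MILLISECONDS.toSeconds(ms) % 60;
        long millis = ms % 1000;
        return String.format("%02d:%02d.%03d", minutes, seconds, millis);
    }
}
```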
public void copyWaveFile(String inFilename, String outFilename) {
FileInputStream in = null;
FileOutputStream out = null;
long totalAudioLen = 0;
long totalDataLen = totalAudioLen + 36;
long longSampleRate = SAMPLE_RATE;
int channels = 2; // byte writes were only reliable in mono; AudioFormat.CHANNEL_IN_MONO -> channels = 1, AudioFormat.CHANNEL_IN_STEREO -> channels = 2
long byteRate = RECORDER_BPP * SAMPLE_RATE * channels / 8;
try {
in = new FileInputStream(inFilename);
out = new FileOutputStream(outFilename);
byte[] data = new byte[bufferSize];
totalAudioLen = in.getChannel().size();
totalDataLen = totalAudioLen + 36;
AppLog.logString("File size: " + totalDataLen);
WriteWaveFileHeader(out, totalAudioLen, totalDataLen, longSampleRate, channels, byteRate);
int bytesRead;
while ((bytesRead = in.read(data)) != -1) {
out.write(data, 0, bytesRead); // write only the bytes actually read, not the whole buffer
}
in.close();
out.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
public void WriteWaveFileHeader(FileOutputStream out, long totalAudioLen, long totalDataLen, long longSampleRate, int channels, long byteRate) throws IOException {
// Standard 44-byte PCM WAV header; all multi-byte fields are little-endian
byte[] header = new byte[44];
// RIFF chunk descriptor
header[0] = 'R';
header[1] = 'I';
header[2] = 'F';
header[3] = 'F';
header[4] = (byte) (totalDataLen & 0xff); // RIFF chunk size = audio length + 36
header[5] = (byte) ((totalDataLen >> 8) & 0xff);
header[6] = (byte) ((totalDataLen >> 16) & 0xff);
header[7] = (byte) ((totalDataLen >> 24) & 0xff);
header[8] = 'W';
header[9] = 'A';
header[10] = 'V';
header[11] = 'E';
// "fmt " sub-chunk
header[12] = 'f';
header[13] = 'm';
header[14] = 't';
header[15] = ' ';
header[16] = 16; // fmt sub-chunk size: 16 for PCM
header[17] = 0;
header[18] = 0;
header[19] = 0;
header[20] = 1; // audio format: 1 = PCM
header[21] = 0;
header[22] = (byte) channels;
header[23] = 0;
header[24] = (byte) (longSampleRate & 0xff);
header[25] = (byte) ((longSampleRate >> 8) & 0xff);
header[26] = (byte) ((longSampleRate >> 16) & 0xff);
header[27] = (byte) ((longSampleRate >> 24) & 0xff);
header[28] = (byte) (byteRate & 0xff);
header[29] = (byte) ((byteRate >> 8) & 0xff);
header[30] = (byte) ((byteRate >> 16) & 0xff);
header[31] = (byte) ((byteRate >> 24) & 0xff);
header[32] = (byte) (channels * RECORDER_BPP / 8); // block align (was hardcoded 2 * 16 / 8, wrong for mono)
header[33] = 0;
header[34] = RECORDER_BPP; // bits per sample
header[35] = 0;
// "data" sub-chunk
header[36] = 'd';
header[37] = 'a';
header[38] = 't';
header[39] = 'a';
header[40] = (byte) (totalAudioLen & 0xff); // data sub-chunk size
header[41] = (byte) ((totalAudioLen >> 8) & 0xff);
header[42] = (byte) ((totalAudioLen >> 16) & 0xff);
header[43] = (byte) ((totalAudioLen >> 24) & 0xff);
out.write(header, 0, 44);
}
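The byte-by-byte construction above follows the canonical 44-byte PCM WAV layout. As a cross-check, the same header can be built with a little-endian ByteBuffer; this sketch is self-contained (independent of the class fields RECORDER_BPP and SAMPLE_RATE) and assumes plain PCM:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

// Standalone sketch: builds the same 44-byte PCM WAV header as
// WriteWaveFileHeader, using a little-endian ByteBuffer instead of
// manual byte shifts. Parameters mirror the method above.
class WavHeader {
    static byte[] build(long totalAudioLen, int sampleRate, int channels, int bitsPerSample) {
        long totalDataLen = totalAudioLen + 36;
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes(StandardCharsets.US_ASCII));
        b.putInt((int) totalDataLen);                        // RIFF chunk size
        b.put("WAVE".getBytes(StandardCharsets.US_ASCII));
        b.put("fmt ".getBytes(StandardCharsets.US_ASCII));
        b.putInt(16);                                        // fmt sub-chunk size (PCM)
        b.putShort((short) 1);                               // audio format: PCM
        b.putShort((short) channels);
        b.putInt(sampleRate);
        b.putInt(byteRate);
        b.putShort((short) (channels * bitsPerSample / 8));  // block align
        b.putShort((short) bitsPerSample);
        b.put("data".getBytes(StandardCharsets.US_ASCII));
        b.putInt((int) totalAudioLen);                       // data sub-chunk size
        return b.array();
    }
}
```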