
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (54)
-
Participating in its translation
10 April 2011
You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
To do so, the SPIP translation interface is used, where all of MediaSPIP's language modules are available. You just need to sign up to the translators' mailing list to ask for more information.
Currently MediaSPIP is only available in French and (...) -
The statuses of mutualisation instances
13 March 2010, by
For reasons of overall compatibility between the mutualisation management plugin and SPIP's original functions, instance statuses are the same as for any other object (articles, etc.); only their names in the interface differ slightly.
The different possible statuses are: prepa (requested), which corresponds to an instance requested by a user. If the site was already created in the past, it is switched to disabled mode. publie (validated), which corresponds to an instance validated by a (...) -
Enhancing its visual appearance
10 April 2011
MediaSPIP is based on a system of themes and templates (squelettes). The templates define where information is placed on the page, defining a specific use of the platform, while the themes define the overall graphic design.
Anyone can propose a new graphic theme or template and make it available to the community.
On other sites (9747)
-
FFmpeg command to compress a video in a short time on Android [on hold]
13 March 2018, by Vijendra patidar
I have tried many FFmpeg commands, but none of them compress the video properly; the command below works fine for trimming a video.
The trim command:
String[] complexCommand = {"-i", file.getPath(), "-ss", "" + startMs / 1000, "-to", "" + endMs / 1000, "-c", "copy", file1.getPath()};
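Trimming with "-c", "copy" only remuxes the selected range, so the file size barely changes; to actually compress, the video has to be re-encoded. A minimal sketch, assuming the bundled ffmpeg was built with libx264 (if not, "-vcodec", "mpeg4" with a bitrate such as "-b:v", "1M" can be substituted), trading quality for speed with a fast preset and a higher CRF:
String[] compressCommand = {"-y", "-i", file.getPath(), "-ss", "" + startMs / 1000, "-to", "" + endMs / 1000, "-vcodec", "libx264", "-preset", "ultrafast", "-crf", "28", "-acodec", "copy", file1.getPath()};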
-
Convert Audio Files using FFMPEG in android
25 April 2018, by Venkat Maddy
I am developing an Android application in which I have been asked to record calls, convert them to WAVE (.wav) format, and send them to a server.
I am able to record and save the audio on my phone successfully. The recorded file's MIME type is 3gpp, and I now want to convert this 3gpp file to WAV. I tried using FFmpeg but was unable to convert it. I am posting my code here; any help would be appreciated.
While converting the recorded file from AAC to WAV, I get the error below:
Exception while trying to run: [Ljava.lang.String;@64075a0
java.io.IOException: Cannot run program "/data/user/0/com.sms.example.example/files/ffmpeg": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:983)
at java.lang.Runtime.exec(Runtime.java:691)
at java.lang.Runtime.exec(Runtime.java:559)
at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
at android.os.AsyncTask$2.call(AsyncTask.java:305)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
at java.lang.Thread.run(Thread.java:761)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:133)
at java.lang.ProcessImpl.start(ProcessImpl.java:128)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:964)
Recorder Service
public class CallService extends Service {
//util object creation
NetworkUtils networkUtils;
SharedPref sharedPref;
MediaRecorder recorder;
File audiofile;
String name, phonenumber;
String audio_format;
public String Audio_Type;
int audioSource;
Context context;
private Handler handler;
Timer timer;
Boolean offHook = false, ringing = false;
Toast toast;
Boolean isOffHook = false;
private boolean recordstarted = false;
private static final String ACTION_IN = "android.intent.action.PHONE_STATE";
private static final String ACTION_OUT = "android.intent.action.NEW_OUTGOING_CALL";
private CallBr br_call;
public String file_name="", recordedFileName="",uploadedFileName="", base64_file="", statusMessage="";
public int statusCode, file_size=0, delay = 1000,period = 5000;
public FFmpeg fFmpeg;
@Override
public IBinder onBind(Intent arg0) {
// TODO Auto-generated method stub
return null;
}
@Override
public void onDestroy() {
Log.d("service", "destroy");
super.onDestroy();
}
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
sharedPref = new SharedPref(this);
networkUtils = new NetworkUtils(this);
final IntentFilter filter = new IntentFilter();
filter.addAction(ACTION_OUT);
filter.addAction(ACTION_IN);
this.br_call = new CallBr();
this.registerReceiver(this.br_call, filter);
fFmpeg = FFmpeg.getInstance(CallService.this);
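// Assumption (not in the original post): with the hiteshsondhi88 libffmpeg wrapper,
// the ffmpeg binary has to be installed into the app's files directory via loadBinary()
// before execute() can run; skipping this step typically produces the
// "Cannot run program .../files/ffmpeg: No such file or directory" error shown above.
try {
fFmpeg.loadBinary(new LoadBinaryResponseHandler() {
@Override
public void onFailure() {
Log.e("CallService", "ffmpeg binary could not be loaded on this device");
}
});
} catch (FFmpegNotSupportedException e) {
e.printStackTrace();
}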
Timer task = new Timer();
task.scheduleAtFixedRate(new TimerTask() {
@Override
public void run() {
if (networkUtils.checkConnection()){
}else{
// Toast.makeText(CallService.this, getResources().getString(R.string.no_connection), Toast.LENGTH_SHORT).show();
}
}
}, delay, period);
// if(terminate != null) {
// stopSelf();
// }
return START_STICKY;
}
public class CallBr extends BroadcastReceiver {
Bundle bundle;
String state;
String inCall, outCall;
public boolean wasRinging = false;
public File sampleDir;
@Override
public void onReceive(Context context, Intent intent) {
if (intent.getAction().equals(ACTION_IN)) {
if ((bundle = intent.getExtras()) != null) {
state = bundle.getString(TelephonyManager.EXTRA_STATE);
if (state.equals(TelephonyManager.EXTRA_STATE_RINGING)) {
inCall = bundle.getString(TelephonyManager.EXTRA_INCOMING_NUMBER);
wasRinging = true;
Toast.makeText(context, "IN : " + inCall, Toast.LENGTH_LONG).show();
} else if (state.equals(TelephonyManager.EXTRA_STATE_OFFHOOK)) {
if (wasRinging == true) {
Toast.makeText(context, "ANSWERED", Toast.LENGTH_LONG).show();
String out = new SimpleDateFormat("dd-MM-yyyy hh-mm-ss").format(new Date());
sampleDir = new File(Environment.getExternalStorageDirectory().getAbsolutePath(), "/OfficeRecordings/");
if (!sampleDir.exists()) {
sampleDir.mkdirs();
}
Log.d("TService", "onReceive: "+sampleDir);
file_name = "Incoming";
String path = Environment.getExternalStorageDirectory().getAbsolutePath();
// try {
// audiofile = File.createTempFile(file_name, ".wav", sampleDir);
// recordedFileName = sampleDir + String.valueOf(System.currentTimeMillis() + ".wav");
//
// } catch (IOException e) {
// e.printStackTrace();
// }
// recordedFileName = sampleDir+"_"+ file_name +String.valueOf(System.currentTimeMillis() + ".wav");
uploadedFileName = sharedPref.getStringValue("userId")+"_"+sharedPref.getStringValue("mobile_number_prefs")+"_" +file_name +String.valueOf(System.currentTimeMillis()+ ".aac");
recordedFileName = sampleDir+"/"+uploadedFileName;
recorder = new MediaRecorder();
// recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
// recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
// recorder.setOutputFile(audiofile.getAbsolutePath());
recorder.setOutputFile(recordedFileName);
try {
recorder.prepare();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
recorder.start();
recordstarted = true;
}
} else if (state.equals(TelephonyManager.EXTRA_STATE_IDLE)) {
wasRinging = false;
Toast.makeText(context, "REJECT || DISCO", Toast.LENGTH_LONG).show();
if (recordstarted) {
recorder.stop();
recordstarted = false;
if (recordedFileName.isEmpty()){
}else{
convertFileToBase64(context);
}
}
}
}
} else if (intent.getAction().equals(ACTION_OUT)) {
if ((bundle = intent.getExtras()) != null) {
outCall = intent.getStringExtra(Intent.EXTRA_PHONE_NUMBER);
Toast.makeText(context, "OUT : " + outCall, Toast.LENGTH_LONG).show();
sampleDir = new File(Environment.getExternalStorageDirectory().getAbsolutePath(), "/OfficeRecordings/");
if (!sampleDir.exists()) {
sampleDir.mkdirs();
}
Log.d("TService", "onReceive: "+sampleDir);
file_name = "Outgoing";
String path = Environment.getExternalStorageDirectory().getAbsolutePath();
uploadedFileName = sharedPref.getStringValue("userId")+"_"+sharedPref.getStringValue("mobile_number_prefs")+"_" +file_name +String.valueOf(System.currentTimeMillis() + ".amr");
recordedFileName = sampleDir+"/"+uploadedFileName;
// try {
// audiofile = File.createTempFile(file_name, ".wav", sampleDir);
// recordedFileName = audiofile.getName();
//
// } catch (IOException e) {
// e.printStackTrace();
// }
recorder = new MediaRecorder();
// recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
// recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.AMR_NB);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
recorder.setOutputFile(recordedFileName); // audiofile is never created in this branch, so write to the path built above
try {
recorder.prepare();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
recorder.start();
recordstarted = true;
if (state.equals(TelephonyManager.EXTRA_STATE_IDLE)) {
wasRinging = false;
Toast.makeText(context, "REJECT || DISCO", Toast.LENGTH_LONG).show();
if (recordstarted) {
recorder.stop();
recordstarted = false;
if (recordedFileName.isEmpty()){
}else{
convertFileToBase64(context);
}
}
}
}
}
// LocalBroadcastManager.getInstance(context).sendBroadcast(new Intent("callRecord"));
}
public void convertFileToBase64(Context context){
File uploadedFile = new File(recordedFileName);
Log.d("CallService", "convertFileToBase64: "+uploadedFile);
// Note: execute() expects only the ffmpeg arguments (no "ffmpeg" prefix) and no stray spaces inside the array elements.
String[] cmd = {"-i", recordedFileName, sampleDir + "/" + "sampleAudio.wav"};
execFFmpegBinary(cmd);
file_size = Integer.parseInt(String.valueOf(uploadedFile.length()/1024));
Log.d("CallService", "onReceive: "+file_size);
byte[] byteArray = new byte[1024*11];
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
// File uploadedFile = new File(sampleDir+"/"+recordedFileName);
Uri path = Uri.fromFile(uploadedFile);
// Uri path = Uri.fromFile(uploadedFile);
Log.d("CallService", "convertFileToBase64: "+path);
try {
InputStream inputStream = context.getContentResolver().openInputStream(path);
byte[] b = new byte[1024 * 11];
int bytesRead = 0;
while ((bytesRead = inputStream.read(b)) != -1) {
byteArrayOutputStream.write(b, 0, bytesRead);
}
byteArray = byteArrayOutputStream.toByteArray();
Log.e("Byte array", ">" + byteArray);
} catch (IOException e) {
e.printStackTrace();
}
base64_file = Base64.encodeToString(byteArray, Base64.DEFAULT);
Log.d("CallRecorder", "base64File: "+base64_file);
if (!(base64_file.isEmpty())){
Log.d("CallService", "convertFileToBase64: base64 not empty");
if (networkUtils.checkConnection()){
uploadRecordedFileToServer();
}else{
saveFileToLocalDB();
Toast.makeText(context, getString(R.string.no_connection), Toast.LENGTH_SHORT).show();
}
}
}
private void execFFmpegBinary(final String[] command) {
try {
fFmpeg.execute(command, new ExecuteBinaryResponseHandler() {
@Override
public void onFailure(String s) {
Log.d("CallService", "onFailure: "+s);
Toast.makeText(CallService.this, "Failed to convert", Toast.LENGTH_SHORT).show();
}
@Override
public void onSuccess(String s) {
Log.d("CallService", "onSuccess: "+s);
}
@Override
public void onProgress(String s) {
Log.d("CallService", "Started command : ffmpeg "+command);
Log.d("CallService", "progress : "+s);
// progressDialog.setMessage("Processing\n"+s);
}
@Override
public void onStart() {
Log.d("CallService", "Started command : ffmpeg " + command);
// progressDialog.setMessage("Processing...");
// progressDialog.show();
}
@Override
public void onFinish() {
Log.d("CallService", "Finished command : ffmpeg "+command);
// progressDialog.dismiss();
}
});
} catch (FFmpegCommandAlreadyRunningException e) {
// do nothing for now
}
}
}
}
Any help would be greatly appreciated. Thanks in advance!
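One more point worth noting: fFmpeg.execute() runs asynchronously, so convertFileToBase64() reads and uploads the original .aac/.amr recording before the WAV produced by ffmpeg exists. A minimal sketch, assuming the same library and fields as above, that only touches the converted file once conversion has succeeded (uploadWav() is a hypothetical placeholder for the existing base64/upload logic):
private void convertAndUpload(final Context context) {
    final String wavPath = sampleDir + "/" + "sampleAudio.wav";
    String[] cmd = {"-y", "-i", recordedFileName, wavPath};
    try {
        fFmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
            @Override
            public void onSuccess(String s) {
                // the WAV file only exists once this callback fires
                uploadWav(context, new File(wavPath));
            }
            @Override
            public void onFailure(String s) {
                Log.e("CallService", "conversion failed: " + s);
            }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        e.printStackTrace();
    }
}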
-
ffplay: Change filter options while playing audio
11 February 2018, by artha
I am trying to apply equalizer filters in ffplay while it is playing an audio file. Is that possible?
For example, start playing the audio with the command:
ffplay -i foo.wav
Then while foo.wav is playing change it to
ffplay -af "equalizer=f=1000:width_type=h:width=200:g=-10" -i foo.wav
without stopping the audio.
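ffplay has no built-in way to edit the filter graph from the keyboard once playback has started, so restarting it with the new -af string is the straightforward approach. A hedged workaround, assuming the FFmpeg build includes the zmq/azmq filters and the zmqsend helper from FFmpeg's tools directory, is to append azmq to the audio filter chain and push equalizer commands from another terminal while playback continues:
ffplay -af "equalizer=f=1000:width_type=h:width=200:g=0,azmq" -i foo.wav
Then, from a second terminal, something like:
echo Parsed_equalizer_0 g -10 | tools/zmqsend
The target name Parsed_equalizer_0 follows FFmpeg's automatic naming of unlabelled filter instances; check the exact name if the filter chain differs.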