
Advanced search
Other articles (28)
-
Publishing on MédiaSpip
13 June 2013: Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out. -
Adding notes and captions to images
7 February 2011: To add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, editing and deleting notes. By default, only the site's administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...) -
APPENDIX: Plugins used specifically for the farm
5 March 2010: The central/master site of the farm needs several additional plugins, compared with the channel sites, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (7257)
-
Converting mp4 to webm, ogg formats
24 August 2016, by user2943893: Currently I need to convert mp4 video to webm and ogg. To convert mp4 to webm I have used "ffmpeg.exe". I am running the following code to convert mp4 video to webm.
[DllImport("User32.dll")]
public static extern bool SetForegroundWindow(IntPtr hWnd);

public void mciConvertWavMP3(string fileName, bool waitFlag)
{
    string savepath = Server.MapPath(fileName);
    string destpath = Server.MapPath(fileName);
    string pworkingDir = Server.MapPath("~/ffmpeg/");
    // string outfile = "-b:a 16 --resample 24 -m j " + savepath + " " + savepath.Replace(".wav", ".mp3") + ""; //--- lame code
    // string outfile = "-b 192k -i " + savepath + " " + destpath.Replace(".mp4", ".webm");
    // string outfile = "ffmpeg -i " + savepath + " -acodec libvorbis -ac 2 -ab 96k -ar 44100 -b 345k -s 640x360 " + Server.MapPath("output-file.webm");
    string outfile = "ffmpeg -i \"test7.mp4\" -c:v libvpx -crf 10 -b:v 1M -c:a libvorbis \"" + Server.MapPath("output-file.webm") + "\"";
    // string outfile = "ffmpeg -i \"" + fileName + "\" -codec:v libvpx -quality good -cpu-used 0 -b:v 600k -qmin 10 -qmax 42 -maxrate 500k -bufsize 1000k -threads 2 -vf scale=-1:480 -an -pass 1 -f webm /dev/null";

    System.Diagnostics.ProcessStartInfo psi = new System.Diagnostics.ProcessStartInfo();
    psi.FileName = pworkingDir + "ffmpeg.exe";
    psi.Arguments = outfile;
    psi.UseShellExecute = true;
    psi.CreateNoWindow = false;

    System.Diagnostics.Process p = System.Diagnostics.Process.Start(psi);
    Thread.Sleep(1000); // output.webm
    if (waitFlag)
    {
        p.WaitForExit(); // wait for exit of called application
    }
}

I kept my project folder on the D:/ drive.
When I run it from the command prompt it works fine, but when I run it through this code it does not.
The errors I am getting are of the kind:
"unable to find a suitable output format for 'ffmpeg'". So please, can anyone help solve this issue?
Thanks & regards
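A likely cause, offered as a sketch rather than a verified fix: ffmpeg treats any bare token that is not an option as an output URL, so when psi.Arguments itself begins with the word "ffmpeg" (the binary is already given by psi.FileName), ffmpeg tries to open an output called "ffmpeg" and fails with exactly that "unable to find a suitable output format" message. Dropping the leading token and passing full paths could look roughly like this (the test7.mp4 location is assumed):

// Sketch only: the arguments start with -i, because psi.FileName already points at ffmpeg.exe.
string inputPath = Server.MapPath("test7.mp4");            // assumed location of the source video
string outputPath = Server.MapPath("output-file.webm");
string args = "-i \"" + inputPath + "\" -c:v libvpx -crf 10 -b:v 1M -c:a libvorbis \"" + outputPath + "\"";

System.Diagnostics.ProcessStartInfo psi = new System.Diagnostics.ProcessStartInfo();
psi.FileName = pworkingDir + "ffmpeg.exe";   // pworkingDir as in the code above
psi.Arguments = args;
psi.UseShellExecute = false;                 // optional: lets you redirect ffmpeg's output for diagnostics
psi.CreateNoWindow = true;
System.Diagnostics.Process.Start(psi).WaitForExit();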
-
How to keep transparency when scaling a webm file with ffmpeg
5 October 2022, by Sonia Kidman: I'm using ffmpeg to scale my WEBM file with the command below:
ffmpeg -i in.webm -c:v libvpx -vf scale=100:100 out.webm
The output has the correct resolution, as I expected, but the transparency becomes a black background.
Could someone give me a solution for this?
Thank you so much.
Below is the log of the operation:



ffmpeg version 3.4 Copyright (c) 2000-2017 the FFmpeg developers
 built with gcc 7.2.0 (GCC)
 configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
 libavutil 55. 78.100 / 55. 78.100
 libavcodec 57.107.100 / 57.107.100
 libavformat 57. 83.100 / 57. 83.100
 libavdevice 57. 10.100 / 57. 10.100
 libavfilter 6.107.100 / 6.107.100
 libswscale 4. 8.100 / 4. 8.100
 libswresample 2. 9.100 / 2. 9.100
 libpostproc 54. 7.100 / 54. 7.100
Splitting the commandline.
Reading option '-v' ... matched as option 'v' (set logging level) with argument '56'.
Reading option '-i' ... matched as input url with argument 'in.webm'.
Reading option '-c:v' ... matched as option 'c' (codec name) with argument 'libvpx'.
Reading option '-vf' ... matched as option 'vf' (set video filters) with argument 'scale=320:240'.
Reading option 'out.webm' ... matched as output url.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option v (set logging level) with argument 56.
Successfully parsed a group of options.
Parsing a group of options: input url in.webm.
Successfully parsed a group of options.
Opening an input file: in.webm.
[NULL @ 000002387e6322a0] Opening 'in.webm' for reading
[file @ 000002387e632ea0] Setting default whitelist 'file,crypto'
Probing matroska,webm score:100 size:2048
Probing mp3 score:1 size:2048
[matroska,webm @ 000002387e6322a0] Format matroska,webm probed with size=2048 and score=100
st:0 removing common factor 1000000 from timebase
[matroska,webm @ 000002387e6322a0] Before avformat_find_stream_info() pos: 634 bytes read:32768 seeks:0 nb_streams:1
[matroska,webm @ 000002387e6322a0] All info found
[matroska,webm @ 000002387e6322a0] stream 0: start_time: 0.000 duration: -9223372036854776.000
[matroska,webm @ 000002387e6322a0] format: start_time: 0.000 duration: 0.400 bitrate=1432 kb/s
[matroska,webm @ 000002387e6322a0] After avformat_find_stream_info() pos: 34843 bytes read:65536 seeks:0 frames:1
Input #0, matroska,webm, from 'in.webm':
 Metadata:
 ENCODER : Lavf57.83.100
 Duration: 00:00:00.40, start: 0.000000, bitrate: 1432 kb/s
 Stream #0:0, 1, 1/1000: Video: vp8, 1 reference frame, yuv420p(progressive), 640x480, 0/1, SAR 1:1 DAR 4:3, 10 fps, 10 tbr, 1k tbn, 1k tbc (default)
 Metadata:
 alpha_mode : 1
 ENCODER : Lavc57.107.100 libvpx
 DURATION : 00:00:00.400000000
Successfully opened the file.
Parsing a group of options: output url out.webm.
Applying option c:v (codec name) with argument libvpx.
Applying option vf (set video filters) with argument scale=320:240.
Successfully parsed a group of options.
Opening an output file: out.webm.
[file @ 000002387e658b40] Setting default whitelist 'file,crypto'
Successfully opened the file.
detected 4 logical cores
Stream mapping:
 Stream #0:0 -> #0:0 (vp8 (native) -> vp8 (libvpx))
Press [q] to stop, [?] for help
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
 Last message repeated 4 times
[Parsed_scale_0 @ 000002387e718a60] Setting 'w' to value '320'
[Parsed_scale_0 @ 000002387e718a60] Setting 'h' to value '240'
[Parsed_scale_0 @ 000002387e718a60] Setting 'flags' to value 'bicubic'
[Parsed_scale_0 @ 000002387e718a60] w:320 h:240 flags:'bicubic' interl:0
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'video_size' to value '640x480'
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'pix_fmt' to value '0'
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'time_base' to value '1/1000'
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'pixel_aspect' to value '1/1'
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'sws_param' to value 'flags=2'
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'frame_rate' to value '10/1'
[graph 0 input from stream 0:0 @ 000002387e743b00] w:640 h:480 pixfmt:yuv420p tb:1/1000 fr:10/1 sar:1/1 sws_param:flags=2
[format @ 000002387e7fe1e0] compat: called with args=[yuv420p|yuva420p]
[format @ 000002387e7fe1e0] Setting 'pix_fmts' to value 'yuv420p|yuva420p'
[AVFilterGraph @ 000002387e634e60] query_formats: 4 queried, 3 merged, 0 already done, 0 delayed
[Parsed_scale_0 @ 000002387e718a60] w:640 h:480 fmt:yuv420p sar:1/1 -> w:320 h:240 fmt:yuv420p sar:1/1 flags:0x4
[libvpx @ 000002387e657fe0] v1.6.1
[libvpx @ 000002387e657fe0] --prefix=/Users/kyle/software/libvpx/win64/libvpx-1.6.1-win64 --target=x86_64-win64-gcc
[libvpx @ 000002387e657fe0] vpx_codec_enc_cfg
[libvpx @ 000002387e657fe0] generic settings
 g_usage: 0
 g_threads: 0
 g_profile: 0
 g_w: 320
 g_h: 240
 g_bit_depth: 8
 g_input_bit_depth: 8
 g_timebase: {1/30}
 g_error_resilient: 0
 g_pass: 0
 g_lag_in_frames: 0
[libvpx @ 000002387e657fe0] rate control settings
 rc_dropframe_thresh: 0
 rc_resize_allowed: 0
 rc_resize_up_thresh: 60
 rc_resize_down_thresh: 30
 rc_end_usage: 0
 rc_twopass_stats_in: 0000000000000000(0)
 rc_target_bitrate: 256
[libvpx @ 000002387e657fe0] quantizer settings
 rc_min_quantizer: 4
 rc_max_quantizer: 63
[libvpx @ 000002387e657fe0] bitrate tolerance
 rc_undershoot_pct: 100
 rc_overshoot_pct: 100
[libvpx @ 000002387e657fe0] decoder buffer model
 rc_buf_sz: 6000
 rc_buf_initial_sz: 4000
 rc_buf_optimal_sz: 5000
[libvpx @ 000002387e657fe0] 2 pass rate control settings
 rc_2pass_vbr_bias_pct: 50
 rc_2pass_vbr_minsection_pct: 0
 rc_2pass_vbr_maxsection_pct: 400
[libvpx @ 000002387e657fe0] keyframing settings
 kf_mode: 1
 kf_min_dist: 0
 kf_max_dist: 128
[libvpx @ 000002387e657fe0] 
[libvpx @ 000002387e657fe0] vpx_codec_enc_cfg
[libvpx @ 000002387e657fe0] generic settings
 g_usage: 0
 g_threads: 0
 g_profile: 0
 g_w: 320
 g_h: 240
 g_bit_depth: 8
 g_input_bit_depth: 8
 g_timebase: {1/10}
 g_error_resilient: 0
 g_pass: 0
 g_lag_in_frames: 25
[libvpx @ 000002387e657fe0] rate control settings
 rc_dropframe_thresh: 0
 rc_resize_allowed: 0
 rc_resize_up_thresh: 60
 rc_resize_down_thresh: 30
 rc_end_usage: 0
 rc_twopass_stats_in: 0000000000000000(0)
 rc_target_bitrate: 200
[libvpx @ 000002387e657fe0] quantizer settings
 rc_min_quantizer: 4
 rc_max_quantizer: 63
[libvpx @ 000002387e657fe0] bitrate tolerance
 rc_undershoot_pct: 100
 rc_overshoot_pct: 100
[libvpx @ 000002387e657fe0] decoder buffer model
 rc_buf_sz: 6000
 rc_buf_initial_sz: 4000
 rc_buf_optimal_sz: 5000
[libvpx @ 000002387e657fe0] 2 pass rate control settings
 rc_2pass_vbr_bias_pct: 50
 rc_2pass_vbr_minsection_pct: 0
 rc_2pass_vbr_maxsection_pct: 400
[libvpx @ 000002387e657fe0] keyframing settings
 kf_mode: 1
 kf_min_dist: 0
 kf_max_dist: 128
[libvpx @ 000002387e657fe0] 
[libvpx @ 000002387e657fe0] vpx_codec_control
[libvpx @ 000002387e657fe0] VP8E_SET_CPUUSED: 1
[libvpx @ 000002387e657fe0] VP8E_SET_ARNR_MAXFRAMES: 0
[libvpx @ 000002387e657fe0] VP8E_SET_ARNR_STRENGTH: 3
[libvpx @ 000002387e657fe0] VP8E_SET_ARNR_TYPE: 3
[libvpx @ 000002387e657fe0] VP8E_SET_NOISE_SENSITIVITY: 0
[libvpx @ 000002387e657fe0] VP8E_SET_TOKEN_PARTITIONS: 0
[libvpx @ 000002387e657fe0] VP8E_SET_STATIC_THRESHOLD: 0
[libvpx @ 000002387e657fe0] Using deadline: 1000000
Output #0, webm, to 'out.webm':
 Metadata:
 encoder : Lavf57.83.100
 Stream #0:0, 0, 1/1000: Video: vp8 (libvpx), 1 reference frame, yuv420p, 320x240 [SAR 1:1 DAR 4:3], 0/1, q=-1--1, 200 kb/s, 10 fps, 1k tbn, 10 tbc (default)
 Metadata:
 alpha_mode : 1
 DURATION : 00:00:00.400000000
 encoder : Lavc57.107.100 libvpx
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
Clipping frame in rate conversion by 0.000008
[webm @ 000002387e656880] get_metadata_duration returned: 400000
[webm @ 000002387e656880] Write early duration from metadata = 400
[webm @ 000002387e656880] Writing block at offset 3, size 11223, pts 0, dts 0, duration 100, keyframe 1
[webm @ 000002387e656880] Writing block at offset 11233, size 1288, pts 100, dts 100, duration 100, keyframe 0
[webm @ 000002387e656880] Writing block at offset 12528, size 1504, pts 200, dts 200, duration 100, keyframe 0
[webm @ 000002387e656880] Writing block at offset 14039, size 2481, pts 300, dts 300, duration 100, keyframe 0
[out_0_0 @ 000002387e743d60] EOF on sink link out_0_0:default.
No more output streams to write to, finishing.
[webm @ 000002387e656880] end duration = 400
[webm @ 000002387e656880] stream 0 end duration = 400
frame= 4 fps=0.0 q=0.0 Lsize= 17kB time=00:00:00.30 bitrate= 457.8kbits/s speed=4.45x 
video:16kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 4.413191%
Input file #0 (in.webm):
 Input stream #0:0 (video): 4 packets read (34992 bytes); 4 frames decoded; 
 Total: 4 packets (34992 bytes) demuxed
Output file #0 (out.webm):
 Output stream #0:0 (video): 4 frames encoded; 4 packets muxed (16496 bytes); 
 Total: 4 packets (16496 bytes) muxed
4 frames successfully decoded, 0 decoding errors
[AVIOContext @ 000002387e698c20] Statistics: 14 seeks, 10 writeouts
[AVIOContext @ 000002387cc773e0] Statistics: 71649 bytes read, 0 seeks
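A possible direction, based on the log above rather than a verified answer: the stream mapping line shows "vp8 (native) -> vp8 (libvpx)", and the scale filter negotiates yuv420p ("fmt:yuv420p ... -> ... fmt:yuv420p"), so the alpha plane is dropped before libvpx ever encodes. Decoding the input with libvpx (the native vp8 decoder ignores the alpha_mode side data) and forcing the alpha-capable pixel format might look like this:
ffmpeg -c:v libvpx -i in.webm -c:v libvpx -pix_fmt yuva420p -auto-alt-ref 0 -vf scale=100:100 out.webm
The -auto-alt-ref 0 option is only there as a precaution, since alt-ref frames and alpha encoding can conflict; it can be dropped if the encoder does not complain.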



-
Convert audio files using FFmpeg in Android
25 April 2018, by Venkat Maddy: I am developing an Android application where I've been asked to record calls, convert them to WAVE (.wav) format and send them to a server.
I am able to successfully record and save the audio on my phone. The recorded file's MIME type is 3gpp, and now I want to convert this 3gpp file to wav. I tried using FFmpeg but was unable to convert it. I am posting my code here; any help would be appreciated.
While converting the recorded file from aac to wav, I'm getting the error below:
Exception while trying to run: [Ljava.lang.String;@64075a0
java.io.IOException: Cannot run program "/data/user/0/com.sms.example.example/files/ffmpeg": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:983)
at java.lang.Runtime.exec(Runtime.java:691)
at java.lang.Runtime.exec(Runtime.java:559)
at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
at android.os.AsyncTask$2.call(AsyncTask.java:305)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
at java.lang.Thread.run(Thread.java:761)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:133)
at java.lang.ProcessImpl.start(ProcessImpl.java:128)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:964)
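The "No such file or directory" for /data/user/0/com.sms.example.example/files/ffmpeg suggests the library's bundled ffmpeg binary was never provisioned into the app's files directory. A minimal sketch, assuming the same com.github.hiteshsondhi88.libffmpeg library, of loading the binary once (for instance in onStartCommand) before any execute() call:

// Sketch: make sure the ffmpeg binary is extracted before running commands.
// Assumes the same fFmpeg instance used elsewhere in this service.
fFmpeg = FFmpeg.getInstance(CallService.this);
try {
    fFmpeg.loadBinary(new LoadBinaryResponseHandler() {
        @Override
        public void onFailure() {
            Log.d("CallService", "ffmpeg binary load failed (device/ABI not supported?)");
        }

        @Override
        public void onSuccess() {
            Log.d("CallService", "ffmpeg binary loaded");
        }
    });
} catch (FFmpegNotSupportedException e) {
    Log.d("CallService", "ffmpeg not supported on this device", e);
}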
Recorder Service
public class CallService extends Service {
    // util object creation
    NetworkUtils networkUtils;
    SharedPref sharedPref;
    MediaRecorder recorder;
    File audiofile;
    String name, phonenumber;
    String audio_format;
    public String Audio_Type;
    int audioSource;
    Context context;
    private Handler handler;
    Timer timer;
    Boolean offHook = false, ringing = false;
    Toast toast;
    Boolean isOffHook = false;
    private boolean recordstarted = false;
    private static final String ACTION_IN = "android.intent.action.PHONE_STATE";
    private static final String ACTION_OUT = "android.intent.action.NEW_OUTGOING_CALL";
    private CallBr br_call;
    public String file_name = "", recordedFileName = "", uploadedFileName = "", base64_file = "", statusMessage = "";
    public int statusCode, file_size = 0, delay = 1000, period = 5000;
    public FFmpeg fFmpeg;

    @Override
    public IBinder onBind(Intent arg0) {
        // TODO Auto-generated method stub
        return null;
    }

    @Override
    public void onDestroy() {
        Log.d("service", "destroy");
        super.onDestroy();
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        sharedPref = new SharedPref(this);
        networkUtils = new NetworkUtils(this);
        final IntentFilter filter = new IntentFilter();
        filter.addAction(ACTION_OUT);
        filter.addAction(ACTION_IN);
        this.br_call = new CallBr();
        this.registerReceiver(this.br_call, filter);
        fFmpeg = FFmpeg.getInstance(CallService.this);
        Timer task = new Timer();
        task.scheduleAtFixedRate(new TimerTask() {
            @Override
            public void run() {
                if (networkUtils.checkConnection()) {
                } else {
                    // Toast.makeText(CallService.this, getResources().getString(R.string.no_connection), Toast.LENGTH_SHORT).show();
                }
            }
        }, delay, period);
        // if(terminate != null) {
        //     stopSelf();
        // }
        return START_STICKY;
    }

    public class CallBr extends BroadcastReceiver {
        Bundle bundle;
        String state;
        String inCall, outCall;
        public boolean wasRinging = false;
        public File sampleDir;

        @Override
        public void onReceive(Context context, Intent intent) {
            if (intent.getAction().equals(ACTION_IN)) {
                if ((bundle = intent.getExtras()) != null) {
                    state = bundle.getString(TelephonyManager.EXTRA_STATE);
                    if (state.equals(TelephonyManager.EXTRA_STATE_RINGING)) {
                        inCall = bundle.getString(TelephonyManager.EXTRA_INCOMING_NUMBER);
                        wasRinging = true;
                        Toast.makeText(context, "IN : " + inCall, Toast.LENGTH_LONG).show();
                    } else if (state.equals(TelephonyManager.EXTRA_STATE_OFFHOOK)) {
                        if (wasRinging == true) {
                            Toast.makeText(context, "ANSWERED", Toast.LENGTH_LONG).show();
                            String out = new SimpleDateFormat("dd-MM-yyyy hh-mm-ss").format(new Date());
                            sampleDir = new File(Environment.getExternalStorageDirectory().getAbsolutePath(), "/OfficeRecordings/");
                            if (!sampleDir.exists()) {
                                sampleDir.mkdirs();
                            }
                            Log.d("TService", "onReceive: " + sampleDir);
                            file_name = "Incoming";
                            String path = Environment.getExternalStorageDirectory().getAbsolutePath();
                            // try {
                            //     audiofile = File.createTempFile(file_name, ".wav", sampleDir);
                            //     recordedFileName = sampleDir + String.valueOf(System.currentTimeMillis() + ".wav");
                            // } catch (IOException e) {
                            //     e.printStackTrace();
                            // }
                            // recordedFileName = sampleDir + "_" + file_name + String.valueOf(System.currentTimeMillis() + ".wav");
                            uploadedFileName = sharedPref.getStringValue("userId") + "_" + sharedPref.getStringValue("mobile_number_prefs") + "_" + file_name + String.valueOf(System.currentTimeMillis() + ".aac");
                            recordedFileName = sampleDir + "/" + uploadedFileName;
                            recorder = new MediaRecorder();
                            // recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
                            // recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
                            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
                            recorder.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS);
                            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
                            // recorder.setOutputFile(audiofile.getAbsolutePath());
                            recorder.setOutputFile(recordedFileName);
                            try {
                                recorder.prepare();
                            } catch (IllegalStateException e) {
                                e.printStackTrace();
                            } catch (IOException e) {
                                e.printStackTrace();
                            }
                            recorder.start();
                            recordstarted = true;
                        }
                    } else if (state.equals(TelephonyManager.EXTRA_STATE_IDLE)) {
                        wasRinging = false;
                        Toast.makeText(context, "REJECT || DISCO", Toast.LENGTH_LONG).show();
                        if (recordstarted) {
                            recorder.stop();
                            recordstarted = false;
                            if (recordedFileName.isEmpty()) {
                            } else {
                                convertFileToBase64(context);
                            }
                        }
                    }
                }
            } else if (intent.getAction().equals(ACTION_OUT)) {
                if ((bundle = intent.getExtras()) != null) {
                    outCall = intent.getStringExtra(Intent.EXTRA_PHONE_NUMBER);
                    Toast.makeText(context, "OUT : " + outCall, Toast.LENGTH_LONG).show();
                    sampleDir = new File(Environment.getExternalStorageDirectory().getAbsolutePath(), "/OfficeRecordings/");
                    if (!sampleDir.exists()) {
                        sampleDir.mkdirs();
                    }
                    Log.d("TService", "onReceive: " + sampleDir);
                    file_name = "Outgoing";
                    String path = Environment.getExternalStorageDirectory().getAbsolutePath();
                    uploadedFileName = sharedPref.getStringValue("userId") + "_" + sharedPref.getStringValue("mobile_number_prefs") + "_" + file_name + String.valueOf(System.currentTimeMillis() + ".amr");
                    recordedFileName = sampleDir + "/" + uploadedFileName;
                    // try {
                    //     audiofile = File.createTempFile(file_name, ".wav", sampleDir);
                    //     recordedFileName = audiofile.getName();
                    // } catch (IOException e) {
                    //     e.printStackTrace();
                    // }
                    recorder = new MediaRecorder();
                    // recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
                    // recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
                    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
                    recorder.setOutputFormat(MediaRecorder.OutputFormat.AMR_NB);
                    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
                    recorder.setOutputFile(audiofile.getAbsolutePath());
                    try {
                        recorder.prepare();
                    } catch (IllegalStateException e) {
                        e.printStackTrace();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    recorder.start();
                    recordstarted = true;
                    if (state.equals(TelephonyManager.EXTRA_STATE_IDLE)) {
                        wasRinging = false;
                        Toast.makeText(context, "REJECT || DISCO", Toast.LENGTH_LONG).show();
                        if (recordstarted) {
                            recorder.stop();
                            recordstarted = false;
                            if (recordedFileName.isEmpty()) {
                            } else {
                                convertFileToBase64(context);
                            }
                        }
                    }
                }
            }
            // LocalBroadcastManager.getInstance(context).sendBroadcast(new Intent("callRecord"));
        }

        public void convertFileToBase64(Context context) {
            File uploadedFile = new File(recordedFileName);
            Log.d("CallService", "convertFileToBase64: " + uploadedFile);
            String[] cmd = new String[4];
            cmd[0] = "ffmpeg ";
            cmd[1] = "-i ";
            cmd[2] = recordedFileName + " ";
            cmd[3] = " " + sampleDir + "/" + "sampleAudio.wav";
            execFFmpegBinary(cmd);
            file_size = Integer.parseInt(String.valueOf(uploadedFile.length() / 1024));
            Log.d("CallService", "onReceive: " + file_size);
            byte[] byteArray = new byte[1024 * 11];
            ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
            // File uploadedFile = new File(sampleDir + "/" + recordedFileName);
            Uri path = Uri.fromFile(uploadedFile);
            // Uri path = Uri.fromFile(uploadedFile);
            Log.d("CallService", "convertFileToBase64: " + path);
            try {
                InputStream inputStream = context.getContentResolver().openInputStream(path);
                byte[] b = new byte[1024 * 11];
                int bytesRead = 0;
                while ((bytesRead = inputStream.read(b)) != -1) {
                    byteArrayOutputStream.write(b, 0, bytesRead);
                }
                byteArray = byteArrayOutputStream.toByteArray();
                Log.e("Byte array", ">" + byteArray);
            } catch (IOException e) {
                e.printStackTrace();
            }
            base64_file = Base64.encodeToString(byteArray, Base64.DEFAULT);
            Log.d("CallRecorder", "base64File: " + base64_file);
            if (!(base64_file.isEmpty())) {
                Log.d("CallService", "convertFileToBase64: base64 not empty");
                if (networkUtils.checkConnection()) {
                    uploadRecordedFileToServer();
                } else {
                    saveFileToLocalDB();
                    Toast.makeText(context, getString(R.string.no_connection), Toast.LENGTH_SHORT).show();
                }
            }
        }

        private void execFFmpegBinary(final String[] command) {
            try {
                fFmpeg.execute(command, new ExecuteBinaryResponseHandler() {
                    @Override
                    public void onFailure(String s) {
                        Log.d("CallService", "onFailure: " + s);
                        Toast.makeText(CallService.this, "Failed to convert", Toast.LENGTH_SHORT).show();
                    }

                    @Override
                    public void onSuccess(String s) {
                        Log.d("CallService", "onSuccess: " + s);
                    }

                    @Override
                    public void onProgress(String s) {
                        Log.d("CallService", "Started command : ffmpeg " + command);
                        Log.d("CallService", "progress : " + s);
                        // progressDialog.setMessage("Processing\n" + s);
                    }

                    @Override
                    public void onStart() {
                        Log.d("CallService", "Started command : ffmpeg " + command);
                        // progressDialog.setMessage("Processing...");
                        // progressDialog.show();
                    }

                    @Override
                    public void onFinish() {
                        Log.d("CallService", "Finished command : ffmpeg " + command);
                        // progressDialog.dismiss();
                    }
                });
            } catch (FFmpegCommandAlreadyRunningException e) {
                // do nothing for now
            }
        }
    }
}

Any help would be really appreciated... Thanks in advance!
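Beyond the missing binary, the command array built in convertFileToBase64 would most likely still fail once ffmpeg does run: the library passes each array element as a single argument, so the leading "ffmpeg " element and the padding spaces end up inside the arguments themselves. A sketch of how the array might be built instead, keeping the sampleAudio.wav output path from the code above:

// Sketch only: arguments without a leading "ffmpeg" token and without padding spaces.
String wavFileName = sampleDir + "/" + "sampleAudio.wav";
String[] cmd = new String[]{
        "-y",                  // overwrite sampleAudio.wav if it already exists
        "-i", recordedFileName,
        wavFileName
};
execFFmpegBinary(cmd);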