
Other articles (53)
-
Multilang: improving the interface for multilingual blocks
18 February 2011, by
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
Once it is activated, a preconfiguration is set up automatically by MediaSPIP init so that the new feature is immediately operational. No configuration step is therefore required for this. -
Authorisations overridden by plugins
27 April 2010, by
Mediaspip core
autoriser_auteur_modifier() so that visitors can edit their information on the authors page -
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or later. If necessary, contact the administrator of your MédiaSpip to find out.
On other sites (10063)
-
AsyncTask publishProgress() does not update progress ffmpeg android
7 February 2014, by jay
I am using ffmpeg commands to process media files. In doInBackground() I start the process, and each time I read the duration and time values from its output I compute the progress from them and pass it to publishProgress(progress). When I tested on a Google Nexus (Android 4.4, KitKat) the progress dialog updates correctly, but on devices below Android 4.4 it does not: the dialog only jumps to completion in the blink of an eye after the process has finished. (A stand-alone sketch of the same Duration/time parsing follows the code listing below.)
Here is my code:

protected String doInBackground(String... params) {
    try {
        proc = mProcess.start();
    } catch (IOException e) {
        e.printStackTrace();
    }
    processDuration(proc.getErrorStream());
    // Wait for the process to exit
    int exitCode = 1; // assume error
    try {
        exitCode = proc.waitFor();
    } catch (InterruptedException e) {
        Log.e(TAG, "Process interrupted!", e);
    }
    onExit(exitCode);
    return null;
}

private void onExit(int exitCode) {
    Log.i("exit code >>>>>>>>..", "" + exitCode);
}

private void processDuration(InputStream errorStream) {
    Scanner sc = new Scanner(errorStream);
    // Find the total duration in ffmpeg's output
    Pattern durPattern = Pattern.compile("(?<=Duration: )[^,]*");
    String dur = sc.findWithinHorizon(durPattern, 0);
    if (dur == null) throw new RuntimeException("Could not parse duration.");
    String[] hms = dur.split(":");
    try {
        totalSecs = Integer.parseInt(hms[0]) * 3600 + Integer.parseInt(hms[1]) * 60 + Double.parseDouble(hms[2]);
        Log.i(" progress>>>>>>>>>>>>>", "" + totalSecs);
    } catch (NumberFormatException e) {
    }
    // Read each "time=" value and publish it as a percentage of the duration
    Pattern timePattern = Pattern.compile("(?<=time=)[\\d:.]*");
    String match;
    while (null != (match = sc.findWithinHorizon(timePattern, 0))) {
        hms = match.split(":");
        try {
            processedSecs = Integer.parseInt(hms[0]) * 3600 + Integer.parseInt(hms[1]) * 60 + Double.parseDouble(hms[2]);
        } catch (NumberFormatException e) {
        }
        progress = processedSecs / totalSecs;
        final int finalProgress = (int) (progress * 100);
        try {
            publishProgress("" + finalProgress);
            Thread.sleep(1000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
    publishProgress("" + 100);
}

protected void onPostExecute(String result) {
    super.onPostExecute(result);
    mProgressDialog.dismiss();
}

protected void onPreExecute() {
    super.onPreExecute();
    showDialog(DIALOG_DOWNLOAD_PROGRESS);
}

protected void onProgressUpdate(String... progress) {
    mProgressDialog.setProgress(Integer.parseInt(progress[0]));
    super.onProgressUpdate(progress);
}

public Dialog showDialog(int id) {
    switch (id) {
    case DIALOG_DOWNLOAD_PROGRESS:
        mProgressDialog = new ProgressDialog(context);
        mProgressDialog.setMessage("loading process..");
        mProgressDialog.setProgressStyle(ProgressDialog.STYLE_HORIZONTAL);
        mProgressDialog.setCancelable(false);
        mProgressDialog.setMax(100);
        mProgressDialog.setCanceledOnTouchOutside(false);
        mProgressDialog.show();
        return mProgressDialog;
    default:
        return null;
    }
}
}
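For comparison, here is a minimal, stand-alone sketch of the Duration/time parsing that the code above relies on: read ffmpeg's stderr line by line, capture the "Duration:" value once, and turn each "time=" value into a percentage that can be handed to publishProgress(). The class name FfmpegProgressParser and the line-based reading are illustrative assumptions, not part of the original player code.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FfmpegProgressParser {

    // Matches "Duration: 00:01:23.45" once and "time=00:00:10.00" on each progress line.
    private static final Pattern DURATION = Pattern.compile("Duration:\\s*(\\d+):(\\d+):([\\d.]+)");
    private static final Pattern TIME = Pattern.compile("time=(\\d+):(\\d+):([\\d.]+)");

    private double totalSecs = -1;

    // Converts the captured HH:MM:SS.xx groups into seconds.
    private static double toSeconds(Matcher m) {
        return Integer.parseInt(m.group(1)) * 3600
                + Integer.parseInt(m.group(2)) * 60
                + Double.parseDouble(m.group(3));
    }

    // Returns progress in the range 0..100, or -1 if this line carries no usable value yet.
    public int parseLine(String line) {
        Matcher d = DURATION.matcher(line);
        if (d.find()) {
            totalSecs = toSeconds(d);
            return -1;
        }
        Matcher t = TIME.matcher(line);
        if (totalSecs > 0 && t.find()) {
            return (int) (toSeconds(t) / totalSecs * 100);
        }
        return -1;
    }
}

Feeding each stderr line to parseLine() and publishing every non-negative return value keeps the dialog moving as ffmpeg reports progress.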
Thanks for your help; please help me out with this problem. -
Android. Problems with AudioTrack class. Sound sometimes lost
29 January 2014, by bukka.wh
I have found an open-source video player for Android that uses ffmpeg to decode video.
I have a problem with the audio: it sometimes plays with jerks, although the video picture is shown correctly. The basic idea of the player is that audio and video are decoded in two separate threads and then handed back in a third thread: the video picture is drawn on a SurfaceView and the decoded sound is passed as a byte array to an AudioTrack, which plays it. But sometimes the sound is lost or stutters. Can anyone give me a starting point for what to do (some basic concepts)? Maybe I should change the buffer size of the AudioTrack or add some flags to it (a sketch of the buffer-size idea appears after the native code below). Here is the piece of code where the AudioTrack is created:

private AudioTrack prepareAudioTrack(int sampleRateInHz, int numberOfChannels) {
    for (;;) {
        int channelConfig;
        if (numberOfChannels == 1) {
            channelConfig = AudioFormat.CHANNEL_OUT_MONO;
        } else if (numberOfChannels == 2) {
            channelConfig = AudioFormat.CHANNEL_OUT_STEREO;
        } else if (numberOfChannels == 3) {
            channelConfig = AudioFormat.CHANNEL_OUT_FRONT_CENTER
                    | AudioFormat.CHANNEL_OUT_FRONT_RIGHT
                    | AudioFormat.CHANNEL_OUT_FRONT_LEFT;
        } else if (numberOfChannels == 4) {
            channelConfig = AudioFormat.CHANNEL_OUT_QUAD;
        } else if (numberOfChannels == 5) {
            channelConfig = AudioFormat.CHANNEL_OUT_QUAD
                    | AudioFormat.CHANNEL_OUT_LOW_FREQUENCY;
        } else if (numberOfChannels == 6) {
            channelConfig = AudioFormat.CHANNEL_OUT_5POINT1;
        } else if (numberOfChannels == 8) {
            channelConfig = AudioFormat.CHANNEL_OUT_7POINT1;
        } else {
            channelConfig = AudioFormat.CHANNEL_OUT_STEREO;
        }
        try {
            Log.d("MyLog", "Creating Audio player");
            int minBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz,
                    channelConfig, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack audioTrack = new AudioTrack(
                    AudioManager.STREAM_MUSIC, sampleRateInHz,
                    channelConfig, AudioFormat.ENCODING_PCM_16BIT,
                    minBufferSize, AudioTrack.MODE_STREAM);
            return audioTrack;
        } catch (IllegalArgumentException e) {
            // Fall back to fewer channels if this configuration is not supported
            if (numberOfChannels > 2) {
                numberOfChannels = 2;
            } else if (numberOfChannels > 1) {
                numberOfChannels = 1;
            } else {
                throw e;
            }
        }
    }
}

And this is the piece of native code where the decoded sound bytes are written to the AudioTrack:
int player_write_audio(struct DecoderData *decoder_data, JNIEnv *env,
        int64_t pts, uint8_t *data, int data_size, int original_data_size) {
    struct Player *player = decoder_data->player;
    int stream_no = decoder_data->stream_no;
    int err = ERROR_NO_ERROR;
    int ret;
    AVCodecContext *c = player->input_codec_ctxs[stream_no];
    AVStream *stream = player->input_streams[stream_no];
    LOGI(10, "player_write_audio Writing audio frame")

    jbyteArray samples_byte_array = (*env)->NewByteArray(env, data_size);
    if (samples_byte_array == NULL) {
        err = -ERROR_NOT_CREATED_AUDIO_SAMPLE_BYTE_ARRAY;
        goto end;
    }

    if (pts != AV_NOPTS_VALUE) {
        player->audio_clock = av_rescale_q(pts, stream->time_base, AV_TIME_BASE_Q);
        LOGI(9, "player_write_audio - read from pts")
    } else {
        int64_t sample_time = original_data_size;
        sample_time *= 1000000ll;
        sample_time /= c->channels;
        sample_time /= c->sample_rate;
        sample_time /= av_get_bytes_per_sample(c->sample_fmt);
        player->audio_clock += sample_time;
        LOGI(9, "player_write_audio - added")
    }

    enum WaitFuncRet wait_ret = player_wait_for_frame(player,
            player->audio_clock + AUDIO_TIME_ADJUST_US, stream_no);
    if (wait_ret == WAIT_FUNC_RET_SKIP) {
        goto end;
    }

    LOGI(10, "player_write_audio Writing sample data")
    jbyte *jni_samples = (*env)->GetByteArrayElements(env, samples_byte_array, NULL);
    memcpy(jni_samples, data, data_size);
    (*env)->ReleaseByteArrayElements(env, samples_byte_array, jni_samples, 0);

    LOGI(10, "player_write_audio playing audio track");
    ret = (*env)->CallIntMethod(env, player->audio_track,
            player->audio_track_write_method, samples_byte_array, 0, data_size);
    jthrowable exc = (*env)->ExceptionOccurred(env);
    if (exc) {
        err = -ERROR_PLAYING_AUDIO;
        LOGE(3, "Could not write audio track: reason in exception");
        // TODO maybe release exc
        goto free_local_ref;
    }
    if (ret < 0) {
        err = -ERROR_PLAYING_AUDIO;
        LOGE(3, "Could not write audio track: reason: %d look in AudioTrack.write()", ret);
        goto free_local_ref;
    }

free_local_ref:
    LOGI(10, "player_write_audio releasing local ref");
    (*env)->DeleteLocalRef(env, samples_byte_array);

end:
    return err;
}
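On the buffer-size idea mentioned in the question: a common experiment is to give the AudioTrack several times the minimum buffer, so that short hiccups in the decoding or JNI path do not drain the track and cause audible jerks. The sketch below is only that, a sketch: the AudioTrackFactory class, the factor of 4 and the 64 KB floor are illustrative assumptions, not values taken from this player.

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public final class AudioTrackFactory {

    // Builds a streaming AudioTrack with extra headroom over the minimum buffer size.
    public static AudioTrack create(int sampleRateInHz, int channelConfig) {
        int minBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz,
                channelConfig, AudioFormat.ENCODING_PCM_16BIT);
        // Several times the minimum, with a floor, so brief scheduling stalls do not underrun.
        int bufferSize = Math.max(minBufferSize * 4, 64 * 1024);

        return new AudioTrack(AudioManager.STREAM_MUSIC, sampleRateInHz,
                channelConfig, AudioFormat.ENCODING_PCM_16BIT,
                bufferSize, AudioTrack.MODE_STREAM);
    }
}

If a larger buffer removes the stutter, the feeding path (the JNI write above) is probably not keeping the track filled; if it changes nothing, the timing around player_wait_for_frame() and the audio clock is worth a closer look.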
I would be pleased to receive any help. Thank you very much!
-
use AForge.Video.FFMPEG.dll to grab video thumbnail
5 April 2015, by Sean
I am using AForge.Video.FFMPEG.dll to grab video thumbnails on a website; the thumbnails will be shown on a web page or in other clients. I downloaded the latest version of the DLL and the related native libraries into the /bin folder, and copied them to system32 as well.
But when I compile the project, it tells me:

"Could not load file or assembly ’AForge.Video.FFMPEG.DLL’ or one of its dependencies. The specified module could not be found."

- I copied all the DLLs.
- All the DLLs are 32-bit, not x64.
- I saw another article describing how to load native C or managed C++ libraries in ASP.NET, but it does not work for me:
http://blogs.msdn.com/b/jorman/archive/2007/08/31/loading-c-assemblies-in-asp-net.aspx
- My development environment: Windows 2012 R2, VS2010.
- The compiler throws the above exception while the project is being compiled.
Thanks,
Sean
-