Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (100)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010

    To work properly, the central/master site of the farm needs several plugins in addition to those used by the channel sites: the Gestion de la mutualisation plugin; the inscription3 plugin, to handle registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

  • Frequent problems

    10 March 2010

    PHP with safe_mode enabled
    One of the main sources of problems comes from the PHP configuration, in particular from safe_mode being enabled.
    The solution is either to disable safe_mode or to place the script in a directory that Apache can access for the site.

  • Support for all types of media

    10 April 2011

    Unlike many modern document-sharing applications and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (OpenOffice, Microsoft Office (spreadsheet, presentation), web (HTML, CSS), LaTeX, Google Earth) (...)

On other sites (11057)

  • Error when running an ffmpeg command on Android

    19 February 2014, by user2830969

    I took the source code from https://github.com/JayH5/android-ffmpeg-cmdline and then edited it to make it simpler.

    public ProcessRunnable create() {
           if (inputPath == null || outputPath == null) {
               throw new IllegalStateException("Need an input and output filepath!");
           }  

           final List<String> cmd = new LinkedList<String>();

           cmd.add(mFfmpegPath);
           cmd.add("-i");
           cmd.add(inputPath);
           cmd.add("-s 240x160");

           cmd.add(outputPath);
           Log.w("Command", cmd.toString());
           final ProcessBuilder pb = new ProcessBuilder(cmd);
           return new ProcessRunnable(pb);
       }

    When I run the code, the result is a *.mp file of 0.00 KB.
    Please help me fix this.
    Thanks so much.
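    One plausible cause, assuming the command really is run through ProcessBuilder as above: ProcessBuilder never splits an argument on spaces, so cmd.add("-s 240x160") hands ffmpeg the single token "-s 240x160", which ffmpeg does not recognise as an option, and the conversion never runs. Below is only a sketch of the same create() method with each token added separately; the redirectErrorStream(true) call is an optional addition to make ffmpeg's error messages readable from the process.

    public ProcessRunnable create() {
           if (inputPath == null || outputPath == null) {
               throw new IllegalStateException("Need an input and output filepath!");
           }

           // Each ffmpeg token gets its own list element; ProcessBuilder passes
           // every element verbatim and does not split on whitespace.
           final List<String> cmd = new LinkedList<String>();
           cmd.add(mFfmpegPath);
           cmd.add("-i");
           cmd.add(inputPath);
           cmd.add("-s");       // option name ...
           cmd.add("240x160");  // ... and its value, as separate tokens
           cmd.add(outputPath);

           final ProcessBuilder pb = new ProcessBuilder(cmd);
           pb.redirectErrorStream(true); // merge stderr into stdout for easier logging
           return new ProcessRunnable(pb);
       }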

  • AsyncTask publishProgress() does not update progress (ffmpeg, Android)

    7 February 2014, by jay

    I am using ffmpeg commands to process media files. In the doInBackground() method I start the process, repeatedly read the duration and time values from its output, compute the progress from time and duration, and pass it to publishProgress(progress). When I tested on a Google Nexus (Android 4.4 KitKat) the progress dialog updates correctly, but this does not happen on devices below Android 4.4: it only updates, in the blink of an eye, after the process has completed.
    Here is my code:

    protected String doInBackground(String... params)  {
               // TODO Auto-generated method stub  
               try {
                   proc = mProcess.start();
               } catch (IOException e) {
                   // TODO Auto-generated catch block
                   e.printStackTrace();
               }
               processDuration(proc.getErrorStream());

               // Wait for process to exit
               int exitCode = 1; // Assume error
               try {
                   exitCode = proc.waitFor();
               } catch (InterruptedException e) {
                   Log.e(TAG, "Process interrupted!", e);
               }
               onExit(exitCode);          
               return null;            
           }

           private void onExit(int exitCode) {
               // TODO Auto-generated method stub
               Log.i("exit code >>>>>>>>..", ""+exitCode);
           }

           private void processDuration(InputStream errorStream) {
               // TODO Auto-generated method stub
               Scanner sc = new Scanner(errorStream);          
               // Find duration
               Pattern durPattern = Pattern.compile("(?<=Duration: )[^,]*");
               String dur = sc.findWithinHorizon(durPattern, 0);
               if (dur==null) throw new RuntimeException("Could not parse duration.");
               String[] hms = dur.split(":");
               try{
                   totalSecs= Integer.parseInt(hms[0]) * 3600 + Integer.parseInt(hms[1]) *  60 + Double.parseDouble(hms[2]);
                   Log.i(" progress>>>>>>>>>>>>>",""+totalSecs);
               }catch(NumberFormatException e){

               }
               Pattern timePattern = Pattern.compile("(?<=time=)[\\d:.]*");
               String match;
               while (null != (match = sc.findWithinHorizon(timePattern, 0))) {
                   hms = match.split(":");
                   try{
                       processedSecs= Integer.parseInt(hms[0]) * 3600 + Integer.parseInt(hms[1]) *  60 + Double.parseDouble(hms[2]);
                   }catch(NumberFormatException e){

                   }
                   progress = processedSecs / totalSecs;  
                   final int finalProgress=(int)(progress*100);
                   try {

                       publishProgress(""+finalProgress);

                       Thread.sleep(1000);
                   } catch (InterruptedException e) {
                       // TODO Auto-generated catch block
                       e.printStackTrace();
                   }

               }
               publishProgress(""+100);
           }

           protected void onPostExecute(String  result) {
               super.onPostExecute(result);                
               mProgressDialog.dismiss();

           }

           protected void onPreExecute() {
               super.onPreExecute();
               showDialog(DIALOG_DOWNLOAD_PROGRESS);              
           }

           protected void onProgressUpdate(String... progress) {
               mProgressDialog.setProgress(Integer.parseInt(progress[0]));
               super.onProgressUpdate(progress);
           }

           public Dialog showDialog(int id) {
           // TODO Auto-generated method stub
           switch (id) {
           case DIALOG_DOWNLOAD_PROGRESS:
               mProgressDialog = new ProgressDialog(context);
               mProgressDialog.setMessage("loading process..");
               mProgressDialog.setProgressStyle(ProgressDialog.STYLE_HORIZONTAL);
               mProgressDialog.setCancelable(false);
               mProgressDialog.setMax(100);
               mProgressDialog.setCanceledOnTouchOutside(false);
               mProgressDialog.show();
               return mProgressDialog;
           default:
               return null;
           }
       }
       }

    Thanks for your help.
    Please help me out with this problem.
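    For reference, the Duration/time parsing that the progress computation above relies on can be condensed into a self-contained helper. This is only a sketch of the approach already shown in the question (ffmpeg prints "Duration: HH:MM:SS.xx" once and then repeated "time=HH:MM:SS.xx" values on its error stream); the class and method names here are made up for illustration.

    import java.io.InputStream;
    import java.util.Scanner;
    import java.util.regex.Pattern;

    final class FfmpegProgress {

        interface ProgressListener {
            void onProgress(int percent);
        }

        // Reads ffmpeg's stderr and reports progress as 0..100.
        // What to do with each value (e.g. publishProgress) is left to the caller.
        static void follow(InputStream ffmpegStderr, ProgressListener listener) {
            Scanner sc = new Scanner(ffmpegStderr);

            String dur = sc.findWithinHorizon(Pattern.compile("(?<=Duration: )[^,]*"), 0);
            if (dur == null) throw new IllegalStateException("Could not parse duration");
            double totalSecs = toSeconds(dur);

            Pattern timePattern = Pattern.compile("(?<=time=)[\\d:.]*");
            String match;
            while ((match = sc.findWithinHorizon(timePattern, 0)) != null) {
                listener.onProgress((int) (toSeconds(match) / totalSecs * 100));
            }
            listener.onProgress(100);
        }

        // "HH:MM:SS.xx" -> seconds
        private static double toSeconds(String hms) {
            String[] p = hms.split(":");
            return Integer.parseInt(p[0]) * 3600
                    + Integer.parseInt(p[1]) * 60
                    + Double.parseDouble(p[2]);
        }
    }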

  • Android. Problems with AudioTrack class. Sound sometimes lost

    29 January 2014, by bukka.wh

    I have found an open-source video player for Android which uses ffmpeg to decode video.
    I have a problem with the audio: it sometimes plays with jerks, even though the video picture is shown fine. The basic idea of the player is that audio and video are decoded in two separate threads, and in a third thread they are passed back: the video picture is drawn on a SurfaceView and the sound is passed as a byte array to an AudioTrack, which plays it. But sometimes the sound is lost or plays with jerks. Can anyone give me a starting point for what to do (some basic concepts)? Maybe I should change the buffer size of the AudioTrack or add some flags to it. Here is the piece of code where the AudioTrack is created; a buffer-size variant is sketched right after it.

    private AudioTrack prepareAudioTrack(int sampleRateInHz,
           int numberOfChannels) {

       for (;;) {
           int channelConfig;
           if (numberOfChannels == 1) {
               channelConfig = AudioFormat.CHANNEL_OUT_MONO;
           } else if (numberOfChannels == 2) {
               channelConfig = AudioFormat.CHANNEL_OUT_STEREO;
           } else if (numberOfChannels == 3) {
               channelConfig = AudioFormat.CHANNEL_OUT_FRONT_CENTER
                       | AudioFormat.CHANNEL_OUT_FRONT_RIGHT
                       | AudioFormat.CHANNEL_OUT_FRONT_LEFT;
           } else if (numberOfChannels == 4) {
               channelConfig = AudioFormat.CHANNEL_OUT_QUAD;
           } else if (numberOfChannels == 5) {
               channelConfig = AudioFormat.CHANNEL_OUT_QUAD
                       | AudioFormat.CHANNEL_OUT_LOW_FREQUENCY;
           } else if (numberOfChannels == 6) {
               channelConfig = AudioFormat.CHANNEL_OUT_5POINT1;
           } else if (numberOfChannels == 8) {
               channelConfig = AudioFormat.CHANNEL_OUT_7POINT1;
           } else {
               channelConfig = AudioFormat.CHANNEL_OUT_STEREO;
           }
           try {
               Log.d("MyLog","Creating Audio player");
               int minBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz,
                       channelConfig, AudioFormat.ENCODING_PCM_16BIT);
               AudioTrack audioTrack = new AudioTrack(
                       AudioManager.STREAM_MUSIC, sampleRateInHz,
                       channelConfig, AudioFormat.ENCODING_PCM_16BIT,
                       minBufferSize, AudioTrack.MODE_STREAM);
               return audioTrack;
           } catch (IllegalArgumentException e) {
               if (numberOfChannels > 2) {
                   numberOfChannels = 2;
               } else if (numberOfChannels > 1) {
                   numberOfChannels = 1;
               } else {
                   throw e;
               }
           }
       }
    }
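    Regarding the buffer-size idea mentioned above: a common first experiment for streaming playback is to give the AudioTrack a buffer several times larger than the minimum, so that short scheduling hiccups do not starve it. The fragment below is only a hedged variant of the try-block inside prepareAudioTrack() above; the factor of 4 is an arbitrary illustration, not a value recommended by the player's authors.

               int minBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz,
                       channelConfig, AudioFormat.ENCODING_PCM_16BIT);
               // Assumption for illustration: 4x the minimum; tune per device/stream.
               int bufferSize = 4 * minBufferSize;
               AudioTrack audioTrack = new AudioTrack(
                       AudioManager.STREAM_MUSIC, sampleRateInHz,
                       channelConfig, AudioFormat.ENCODING_PCM_16BIT,
                       bufferSize, AudioTrack.MODE_STREAM);
               return audioTrack;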

    And this is the piece of native code where the sound bytes are written to the AudioTrack:

    int player_write_audio(struct DecoderData *decoder_data, JNIEnv *env,
       int64_t pts, uint8_t *data, int data_size, int original_data_size) {
    struct Player *player = decoder_data->player;
    int stream_no = decoder_data->stream_no;
    int err = ERROR_NO_ERROR;
    int ret;
    AVCodecContext * c = player->input_codec_ctxs[stream_no];
    AVStream *stream = player->input_streams[stream_no];
    LOGI(10, "player_write_audio Writing audio frame")

    jbyteArray samples_byte_array = (*env)->NewByteArray(env, data_size);
    if (samples_byte_array == NULL) {
       err = -ERROR_NOT_CREATED_AUDIO_SAMPLE_BYTE_ARRAY;
       goto end;
    }

    if (pts != AV_NOPTS_VALUE) {
       player->audio_clock = av_rescale_q(pts, stream->time_base, AV_TIME_BASE_Q);
       LOGI(9, "player_write_audio - read from pts")
    } else {
       int64_t sample_time = original_data_size;
       sample_time *= 1000000ll;
       sample_time /= c->channels;
       sample_time /= c->sample_rate;
       sample_time /= av_get_bytes_per_sample(c->sample_fmt);
       player->audio_clock += sample_time;
       LOGI(9, "player_write_audio - added")
    }
    enum WaitFuncRet wait_ret = player_wait_for_frame(player,
           player->audio_clock + AUDIO_TIME_ADJUST_US, stream_no);
    if (wait_ret == WAIT_FUNC_RET_SKIP) {
       goto end;
    }

    LOGI(10, "player_write_audio Writing sample data")

    jbyte *jni_samples = (*env)->GetByteArrayElements(env, samples_byte_array,
           NULL);
    memcpy(jni_samples, data, data_size);
    (*env)->ReleaseByteArrayElements(env, samples_byte_array, jni_samples, 0);

    LOGI(10, "player_write_audio playing audio track");
    ret = (*env)->CallIntMethod(env, player->audio_track,
           player->audio_track_write_method, samples_byte_array, 0, data_size);
    jthrowable exc = (*env)->ExceptionOccurred(env);
    if (exc) {
       err = -ERROR_PLAYING_AUDIO;
       LOGE(3, "Could not write audio track: reason in exception");
       // TODO maybe release exc
       goto free_local_ref;
    }
     if (ret < 0) {
       err = -ERROR_PLAYING_AUDIO;
       LOGE(3,
               "Could not write audio track: reason: %d look in AudioTrack.write()", ret);
       goto free_local_ref;
    }

    free_local_ref:
    LOGI(10, "player_write_audio releasing local ref");
    (*env)->DeleteLocalRef(env, samples_byte_array);

    end: return err;

    }

    I would appreciate any help. Thank you very much!