Advanced search

Media (0)

Word: - Tags -/organisation

No media matching your criteria is available on the site.

Other articles (54)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows, among other things: implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; and the creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

On other sites (9098)

  • AsyncTask publishProgress() does not update progress ffmpeg android

    7 February 2014, by jay

    I am using ffmpeg commands to process media files. In the doInBackground() method I start the process, read the duration and time values from its output each time, compute the progress from them, and pass it to publishProgress(progress). When I tested on a Google Nexus (Android 4.4, KitKat) the progress dialog updates correctly, but on devices below Android 4.4 it does not: it only jumps to completion, in the blink of an eye, after the process has finished.
    Here is my code:

    protected String doInBackground(String... params) {
        // Start the ffmpeg process and read its stderr for progress information
        try {
            proc = mProcess.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
        processDuration(proc.getErrorStream());

        // Wait for the process to exit
        int exitCode = 1; // assume error
        try {
            exitCode = proc.waitFor();
        } catch (InterruptedException e) {
            Log.e(TAG, "Process interrupted!", e);
        }
        onExit(exitCode);
        return null;
    }

    private void onExit(int exitCode) {
        Log.i(TAG, "exit code: " + exitCode);
    }

    private void processDuration(InputStream errorStream) {
        Scanner sc = new Scanner(errorStream);

        // Find the total duration in ffmpeg's stderr output
        Pattern durPattern = Pattern.compile("(?<=Duration: )[^,]*");
        String dur = sc.findWithinHorizon(durPattern, 0);
        if (dur == null) throw new RuntimeException("Could not parse duration.");
        String[] hms = dur.split(":");
        try {
            totalSecs = Integer.parseInt(hms[0]) * 3600 + Integer.parseInt(hms[1]) * 60 + Double.parseDouble(hms[2]);
            Log.i(TAG, "total seconds: " + totalSecs);
        } catch (NumberFormatException e) {
            // ignore a malformed duration value
        }

        // Read each "time=..." line and publish the percentage processed so far
        Pattern timePattern = Pattern.compile("(?<=time=)[\\d:.]*");
        String match;
        while ((match = sc.findWithinHorizon(timePattern, 0)) != null) {
            hms = match.split(":");
            try {
                processedSecs = Integer.parseInt(hms[0]) * 3600 + Integer.parseInt(hms[1]) * 60 + Double.parseDouble(hms[2]);
            } catch (NumberFormatException e) {
                // ignore a malformed time stamp
            }
            progress = processedSecs / totalSecs;
            final int finalProgress = (int) (progress * 100);
            try {
                publishProgress("" + finalProgress);
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        publishProgress("" + 100);
    }

    protected void onPostExecute(String result) {
        super.onPostExecute(result);
        mProgressDialog.dismiss();
    }

    protected void onPreExecute() {
        super.onPreExecute();
        showDialog(DIALOG_DOWNLOAD_PROGRESS);
    }

    protected void onProgressUpdate(String... progress) {
        mProgressDialog.setProgress(Integer.parseInt(progress[0]));
        super.onProgressUpdate(progress);
    }

    public Dialog showDialog(int id) {
        switch (id) {
        case DIALOG_DOWNLOAD_PROGRESS:
            mProgressDialog = new ProgressDialog(context);
            mProgressDialog.setMessage("loading process...");
            mProgressDialog.setProgressStyle(ProgressDialog.STYLE_HORIZONTAL);
            mProgressDialog.setCancelable(false);
            mProgressDialog.setMax(100);
            mProgressDialog.setCanceledOnTouchOutside(false);
            mProgressDialog.show();
            return mProgressDialog;
        default:
            return null;
        }
    }

    Please help me out with this problem. Thanks for your help.
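    For reference, the Duration/time bookkeeping performed above, written out as a minimal standalone sketch (Python is used here only to keep the example short; the stderr excerpt is invented for illustration and is not real output from this job):

    import re

    # Invented stderr excerpt; real ffmpeg output interleaves these lines
    # with codec and stream information.
    stderr_text = (
        "  Duration: 00:00:20.00, start: 0.000000, bitrate: 1205 kb/s\n"
        "frame=  120 fps= 30 q=28.0 size=     256kB time=00:00:05.00 bitrate= 419.4kbits/s\n"
        "frame=  360 fps= 30 q=28.0 size=     768kB time=00:00:15.00 bitrate= 419.4kbits/s\n"
    )

    def to_seconds(hms):
        # "hh:mm:ss.xx" -> seconds
        h, m, s = hms.split(":")
        return int(h) * 3600 + int(m) * 60 + float(s)

    total = to_seconds(re.search(r"Duration: ([^,]*)", stderr_text).group(1))
    for match in re.finditer(r"time=([\d:.]*)", stderr_text):
        done = to_seconds(match.group(1))
        print(f"progress: {int(done / total * 100)}%")   # prints 25% and then 75%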

  • H264 HW accelerated decoding in Android using stagefright library

    21 February 2014, by user3215358

    I'm trying to decode H.264 video in hardware using the Stagefright library.

    I have used an example from here. I am getting the decoded data in a MediaBuffer. To render MediaBuffer->data() I tried AwesomeLocalRenderer from AwesomePlayer.cpp.

    But the picture on the screen is distorted.

    Here is the link to the original and the corrupted picture.

    I also tried this from the example:

    sp<MetaData> metaData = mVideoBuffer->meta_data();
    int64_t timeUs = 0;
    metaData->findInt64(kKeyTime, &timeUs);
    native_window_set_buffers_timestamp(mNativeWindow.get(), timeUs * 1000);
    err = mNativeWindow->queueBuffer(mNativeWindow.get(),
            mVideoBuffer->graphicBuffer().get(), -1);

    But my native code crashes. I can't get a real picture; it is either corrupted or just a black screen.

    Please, I need your help. What am I doing wrong?

    Thanks in Advance.

  • How can I fix CalledProcessError in ffmpeg

    7 December 2020, by Mario

    I hope someone can help to troubleshoot this problem. I'm trying to save the loss plots out of Keras in the form of the following animation.

    [loss-plot animation]

    But I have been facing the following error, and ultimately I can't save the animation:


    MovieWriter stderr:
    [h264_v4l2m2m @ 0x55a67176f430] Could not find a valid device
    [h264_v4l2m2m @ 0x55a67176f430] can't configure encoder
    Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height

    ---------------------------------------------------------------------------
    BrokenPipeError                           Traceback (most recent call last)
    ~/anaconda3/envs/CR7/lib/python3.6/site-packages/matplotlib/animation.py in saving(self, fig, outfile, dpi, *args, **kwargs)
        229         try:
    --> 230             yield self
        231         finally:

    ~/anaconda3/envs/CR7/lib/python3.6/site-packages/matplotlib/animation.py in save(self, filename, writer, fps, dpi, codec, bitrate, extra_args, metadata, extra_anim, savefig_kwargs, progress_callback)
       1155                             frame_number += 1
    -> 1156                     writer.grab_frame(**savefig_kwargs)
       1157

    ~/anaconda3/envs/CR7/lib/python3.6/site-packages/matplotlib/animation.py in grab_frame(self, **savefig_kwargs)
        383         self.fig.savefig(self._frame_sink(), format=self.frame_format,
    --> 384                          dpi=self.dpi, **savefig_kwargs)
        385

    ~/anaconda3/envs/CR7/lib/python3.6/site-packages/matplotlib/figure.py in savefig(self, fname, transparent, **kwargs)
       2179
    -> 2180         self.canvas.print_figure(fname, **kwargs)
       2181

    ~/anaconda3/envs/CR7/lib/python3.6/site-packages/matplotlib/backend_bases.py in print_figure(self, filename, dpi, facecolor, edgecolor, orientation, format, bbox_inches, **kwargs)
       2081                     bbox_inches_restore=_bbox_inches_restore,
    -> 2082                     **kwargs)
       2083             finally:

    ~/anaconda3/envs/CR7/lib/python3.6/site-packages/matplotlib/backends/backend_agg.py in print_raw(self, filename_or_obj, *args, **kwargs)
        445                 cbook.open_file_cm(filename_or_obj, "wb") as fh:
    --> 446             fh.write(renderer._renderer.buffer_rgba())
        447

    BrokenPipeError: [Errno 32] Broken pipe

    During handling of the above exception, another exception occurred:

    CalledProcessError                        Traceback (most recent call last)
     in <module>
         17 print(f'{model_type.upper()} Train Time: {Timer} sec')
         18
    ---> 19 create_loss_animation(model_type, hist.history['loss'], hist.history['val_loss'], epoch)
         20
         21 evaluate(model, trainX, trainY, testX, testY, scores_train, scores_test)

     in create_loss_animation(model_type, loss_hist, val_loss_hist, epoch)
         34
         35     ani = matplotlib.animation.FuncAnimation(fig, animate, fargs=(l1, l2, loss, val_loss, title), repeat=True, interval=1000, repeat_delay=1000)
    ---> 36     ani.save(f'loss_animation_{model_type}_oneDataset.mp4', writer=writer)

    ~/anaconda3/envs/CR7/lib/python3.6/site-packages/matplotlib/animation.py in save(self, filename, writer, fps, dpi, codec, bitrate, extra_args, metadata, extra_anim, savefig_kwargs, progress_callback)
       1154                             progress_callback(frame_number, total_frames)
       1155                             frame_number += 1
    -> 1156                     writer.grab_frame(**savefig_kwargs)
       1157
       1158         # Reconnect signal for first draw if necessary

    ~/anaconda3/envs/CR7/lib/python3.6/contextlib.py in __exit__(self, type, value, traceback)
         97                 value = type()
         98             try:
    ---> 99                 self.gen.throw(type, value, traceback)
        100             except StopIteration as exc:
        101                 # Suppress StopIteration *unless* it's the same exception that

    ~/anaconda3/envs/CR7/lib/python3.6/site-packages/matplotlib/animation.py in saving(self, fig, outfile, dpi, *args, **kwargs)
        230             yield self
        231         finally:
    --> 232             self.finish()
        233
        234

    ~/anaconda3/envs/CR7/lib/python3.6/site-packages/matplotlib/animation.py in finish(self)
        365     def finish(self):
        366         '''Finish any processing for writing the movie.'''
    --> 367         self.cleanup()
        368
        369     def grab_frame(self, **savefig_kwargs):

    ~/anaconda3/envs/CR7/lib/python3.6/site-packages/matplotlib/animation.py in cleanup(self)
        409         if self._proc.returncode:
        410             raise subprocess.CalledProcessError(
    --> 411                 self._proc.returncode, self._proc.args, out, err)
        412
        413     @classmethod

    CalledProcessError: Command '['/usr/bin/ffmpeg', '-f', 'rawvideo', '-vcodec', 'rawvideo', '-s', '720x720', '-pix_fmt',
    'rgba', '-r', '5', '-loglevel', 'error', '-i', 'pipe:', '-vcodec', 'h264', '-pix_fmt', 'yuv420p', '-b', '800k', '-y',
    'loss_animation_CNN_oneDataset.mp4']' returned non-zero exit status 1.


    I tried to ignore the error as suggested in this answer, but that does not seem to be the case here. I also checked a similar case, but its answer (getting a static git binary) does not apply to me either, since I am not specifically converting PNG to MP4!


    My code is as follows:


    import numpy as np
    import pandas as pd
    import matplotlib
    import matplotlib.animation as animation
    import matplotlib.pyplot as plt

    plt.rcParams['animation.ffmpeg_path'] = '/usr/bin/ffmpeg'

    def animate(i, data1, data2, line1, line2):
        # reveal the first i+1 points of each series
        temp1 = data1.iloc[:int(i + 1)]
        temp2 = data2.iloc[:int(i + 1)]

        line1.set_data(temp1.index, temp1.value)
        line2.set_data(temp2.index, temp2.value)

        return (line1, line2)

    def create_loss_animation(model_type, data1, data2):
        fig = plt.figure()
        plt.title(f'Loss on Train & Test', fontsize=25)
        plt.xlabel('Epoch', fontsize=20)
        plt.ylabel('Loss MSE for Sx-Sy & Sxy', fontsize=20)
        plt.xlim(min(data1.index.min(), data2.index.min()), max(data1.index.max(), data2.index.max()))
        plt.ylim(min(data1.value.min(), data2.value.min()), max(data1.value.max(), data2.value.max()))

        l1, = plt.plot([], [], 'o-', label='Train Loss', color='b', markevery=[-1])
        l2, = plt.plot([], [], 'o-', label='Test Loss', color='r', markevery=[-1])
        plt.legend(loc='center right', fontsize='xx-large')

        Writer = animation.writers['ffmpeg']
        writer = Writer(fps=5, bitrate=1800)

        ani = matplotlib.animation.FuncAnimation(fig, animate, fargs=(data1, data2, l1, l2), repeat=True, interval=1000, repeat_delay=1000)
        ani.save(f'{model_type}.mp4', writer=writer)

    # create datasets
    x = np.linspace(0, 150, 50)
    y1 = 41 * np.exp(-x / 20)
    y2 = 35 * np.exp(-x / 50)

    my_data_number_1 = pd.DataFrame({'x': x, 'value': y1}).set_index('x')
    my_data_number_2 = pd.DataFrame({'x': x, 'value': y2}).set_index('x')

    create_loss_animation('test', my_data_number_1, my_data_number_2)

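    The MovieWriter stderr at the top of the traceback shows ffmpeg mapping -vcodec h264 to the h264_v4l2m2m hardware encoder and failing to find a V4L2 device before the pipe breaks inside matplotlib. Purely as a point of comparison (a minimal sketch, not the asker's setup), the snippet below builds the same kind of ffmpeg writer but pins an explicit codec; 'libx264' is an assumption for illustration and only helps if that encoder appears in the output of ffmpeg -encoders on the machine in question:

    import numpy as np
    import matplotlib
    matplotlib.use('Agg')                  # headless backend; the figure is only saved to a file
    import matplotlib.animation as animation
    import matplotlib.pyplot as plt

    fig, ax = plt.subplots()
    line, = ax.plot([], [], 'o-')
    ax.set_xlim(0, 49)
    ax.set_ylim(0, 1)

    x = np.arange(50)
    y = np.exp(-x / 20.0)                  # stand-in "loss" curve

    def update(i):
        # reveal one more point of the curve per frame
        line.set_data(x[:i + 1], y[:i + 1])
        return (line,)

    # codec='libx264' is an assumption; use whatever `ffmpeg -encoders` actually lists locally
    writer = animation.FFMpegWriter(fps=5, codec='libx264', bitrate=1800)
    ani = animation.FuncAnimation(fig, update, frames=len(x), interval=200)
    ani.save('toy_loss_animation.mp4', writer=writer)

    If the hardware encoder really is the culprit, the same codec= (or extra_args=['-vcodec', 'libx264']) argument can be passed to the writer built via animation.writers['ffmpeg'] in the code above.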