Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (111)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all of the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all of the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to make other modifications (...)

  • Improvements to the base version

    13 September 2013

    A nicer multiple-selection widget
    The Chosen plugin improves the ergonomics of multiple-selection fields. See the following two images for a comparison.
    To use it, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-selection lists (...)

On other sites (9846)

  • Why is ffmpeg so fast?

    8 July 2017, by jx.xu

    I have written an FFmpeg-based C++ program that converts YUV to RGB using libswscale, similar to the official example.
    I simply copied and modified the official example and built it with Visual Studio 2017 on Windows 10. However, the time performance is much slower than the ffmpeg.exe executable: 36 seconds versus 12 seconds.
    I already know that FFmpeg uses optimization techniques such as SIMD instructions, but in my profiling the bottleneck is disk I/O writing, which takes at least two thirds of the time.
    I then developed a concurrent version in which a dedicated thread handles all I/O work, but the situation does not seem to improve. It is worth noting that I use the Boost C++ libraries for multi-threading and asynchronous events.

    So I would like to know how I can modify my program using the FFmpeg libraries, or whether the performance gap with ffmpeg.exe simply cannot be closed.

    As requested in the answers, I am posting my code. The compiler is MSVC in VS2017, and I enable full optimization (/Ox).

    To supplement my main question: I ran another plain disk I/O test that merely copies a file of the same size. Surprisingly, plain sequential disk I/O takes 28 seconds, while my conversion program takes 36 seconds in total... Does anyone know how ffmpeg can finish the same job in only 12 seconds? It must use some optimization technique, such as random disk I/O or memory buffer reuse?

    #include "stdafx.h"
    #define __STDC_CONSTANT_MACROS
    extern "C" {
    #include <libavutil></libavutil>imgutils.h>
    #include <libavutil></libavutil>parseutils.h>
    #include <libswscale></libswscale>swscale.h>
    }

    #ifdef _WIN64
    #pragma comment(lib, "avformat.lib")
    #pragma comment(lib, "avcodec.lib")
    #pragma comment(lib, "avutil.lib")
    #pragma comment(lib, "swscale.lib")
    #endif

    #include <common></common>cite.hpp>   // just include headers of c++ STL/Boost

    int main(int argc, char **argv)
    {
       chrono::duration<double> period;
       auto pIn = fopen("G:/Panorama/UHD/originalVideos/DrivingInCountry_3840x1920_30fps_8bit_420_erp.yuv", "rb");
       auto time_mark = chrono::steady_clock::now();

       int src_w = 3840, src_h = 1920, dst_w, dst_h;
       enum AVPixelFormat src_pix_fmt = AV_PIX_FMT_YUV420P, dst_pix_fmt = AV_PIX_FMT_RGB24;
       const char *dst_filename = "G:/Panorama/UHD/originalVideos/out.rgb";
       const char *dst_size = "3840x1920";
       FILE *dst_file;
       int dst_bufsize;
       struct SwsContext *sws_ctx;
       int i, ret;
       if (av_parse_video_size(&amp;dst_w, &amp;dst_h, dst_size) &lt; 0) {
           fprintf(stderr,
               "Invalid size '%s', must be in the form WxH or a valid size abbreviation\n",
               dst_size);
           exit(1);
       }
       dst_file = fopen(dst_filename, "wb");
       if (!dst_file) {
           fprintf(stderr, "Could not open destination file %s\n", dst_filename);
           exit(1);
       }
       /* create scaling context */
       sws_ctx = sws_getContext(src_w, src_h, src_pix_fmt,
           dst_w, dst_h, dst_pix_fmt,
           SWS_BILINEAR, NULL, NULL, NULL);
       if (!sws_ctx) {
           fprintf(stderr,
               "Impossible to create scale context for the conversion "
               "fmt:%s s:%dx%d -> fmt:%s s:%dx%d\n",
               av_get_pix_fmt_name(src_pix_fmt), src_w, src_h,
               av_get_pix_fmt_name(dst_pix_fmt), dst_w, dst_h);
           ret = AVERROR(EINVAL);
           exit(1);
       }
       io_service srv;   // Boost.Asio class
       auto work = make_shared(srv);
       thread t{ bind(&amp;io_service::run,&amp;srv) }; // I/O worker thread
       vector> result;
       /* utilize function class so that lambda can capture itself */
       function)> recursion;  
       recursion = [&amp;](int left, unique_future<bool> writable)  
       {
           if (left &lt;= 0)
           {
               writable.wait();
               return;
           }
           uint8_t *src_data[4], *dst_data[4];
           int src_linesize[4], dst_linesize[4];
           /* promise-future pair used for thread synchronizing so that the file part is written in the correct sequence  */
           promise<bool> sync;
           /* allocate source and destination image buffers */
           if ((ret = av_image_alloc(src_data, src_linesize,
               src_w, src_h, src_pix_fmt, 16)) &lt; 0) {
               fprintf(stderr, "Could not allocate source image\n");
           }
           /* buffer is going to be written to rawvideo file, no alignment */
           if ((ret = av_image_alloc(dst_data, dst_linesize,
               dst_w, dst_h, dst_pix_fmt, 1)) &lt; 0) {
               fprintf(stderr, "Could not allocate destination image\n");
           }
           dst_bufsize = ret;
           fread(src_data[0], src_h*src_w, 1, pIn);
           fread(src_data[1], src_h*src_w / 4, 1, pIn);
           fread(src_data[2], src_h*src_w / 4, 1, pIn);
           result.push_back(async([&amp;] {
               /* convert to destination format */
               sws_scale(sws_ctx, (const uint8_t * const*)src_data,
                   src_linesize, 0, src_h, dst_data, dst_linesize);
               if (left>0)
               {
                   assert(writable.get() == true);
                   srv.post([=]
                   {
                       /* write scaled image to file */
                       fwrite(dst_data[0], 1, dst_bufsize, dst_file);
                       av_freep((void*)&amp;dst_data[0]);
                   });
               }
               sync.set_value(true);
               av_freep(&amp;src_data[0]);
           }));
           recursion(left - 1, sync.get_future());
       };

       promise<bool> root;
       root.set_value(true);
       recursion(300, root.get_future());  // .yuv file only has 300 frames
       wait_for_all(result.begin(), result.end()); // wait for all unique_future to callback
       work.reset();    // io_service::work releses
       srv.stop();      // io_service stops
       t.join();        // I/O thread joins
       period = steady_clock::now() - time_mark;  // calculate valid time
       fprintf(stderr, "Scaling succeeded. Play the output file with the command:\n"
           "ffplay -f rawvideo -pix_fmt %s -video_size %dx%d %s\n",
           av_get_pix_fmt_name(dst_pix_fmt), dst_w, dst_h, dst_filename);
       cout &lt;&lt; "period" &lt;&lt; et &lt;&lt; period &lt;&lt; en;
    end:
       fclose(dst_file);
       //  av_freep(&amp;src_data[0]);
       //  av_freep(&amp;dst_data[0]);
       sws_freeContext(sws_ctx);
       return ret &lt; 0;
    }
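
    A detail worth noting in the code above: av_image_alloc and av_freep run for every one of the 300 frames, while the question itself already suspects "memory buffer reuse" as one of FFmpeg's tricks. The following is only a minimal, synchronous sketch of the same 3840x1920 YUV420P to RGB24 conversion with both images allocated once and reused across frames; it is not the poster's threaded design and not a claim about what ffmpeg.exe actually does internally. The paths, sizes, pixel formats and 300-frame count are taken from the question; everything else is an illustrative assumption.

     extern "C" {
     #include <libavutil/imgutils.h>
     #include <libswscale/swscale.h>
     }
     #include <cstdio>

     int main()
     {
         /* sizes, formats, paths and frame count taken from the question above */
         const int src_w = 3840, src_h = 1920, frames = 300;
         const enum AVPixelFormat src_fmt = AV_PIX_FMT_YUV420P, dst_fmt = AV_PIX_FMT_RGB24;
         FILE *in  = fopen("G:/Panorama/UHD/originalVideos/DrivingInCountry_3840x1920_30fps_8bit_420_erp.yuv", "rb");
         FILE *out = fopen("G:/Panorama/UHD/originalVideos/out.rgb", "wb");
         if (!in || !out)
             return 1;

         SwsContext *sws = sws_getContext(src_w, src_h, src_fmt,
                                          src_w, src_h, dst_fmt,
                                          SWS_BILINEAR, NULL, NULL, NULL);
         if (!sws)
             return 1;

         uint8_t *src_data[4], *dst_data[4];
         int src_linesize[4], dst_linesize[4];
         /* allocate both images once, outside the frame loop, and reuse them */
         if (av_image_alloc(src_data, src_linesize, src_w, src_h, src_fmt, 16) < 0)
             return 1;
         const int dst_bufsize = av_image_alloc(dst_data, dst_linesize, src_w, src_h, dst_fmt, 1);
         if (dst_bufsize < 0)
             return 1;

         const size_t luma = (size_t)src_w * src_h, chroma = luma / 4;
         for (int f = 0; f < frames; ++f) {
             /* one planar YUV420P frame: Y plane, then two quarter-size chroma planes */
             if (fread(src_data[0], 1, luma,   in) != luma ||
                 fread(src_data[1], 1, chroma, in) != chroma ||
                 fread(src_data[2], 1, chroma, in) != chroma)
                 break;                                    /* stop at end of file */
             sws_scale(sws, (const uint8_t * const *)src_data, src_linesize,
                       0, src_h, dst_data, dst_linesize);
             fwrite(dst_data[0], 1, dst_bufsize, out);     /* one large sequential write per frame */
         }

         av_freep(&src_data[0]);
         av_freep(&dst_data[0]);
         sws_freeContext(sws);
         fclose(in);
         fclose(out);
         return 0;
     }

    This only removes the repeated per-frame allocations; it does not reduce the roughly 6.6 GB of RGB24 data that still has to be written, so whether it closes the gap to ffmpeg.exe would need to be measured.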
  • Downloading a video from YouTube and converting it to MP4

    24 March 2019, by Ilyes GHOMRANI

    I created a script that downloads a YouTube video and extracts an image from it at regular intervals.

    screenshotvideo

     import os
     import time
     import isodate   # parses the ISO 8601 duration string passed as fullduration

     def screenshotvideo(url, interval, id, fullduration, title, quality):
         # take one screenshot of the video every `interval` seconds
         interval = int(interval)
         parsed_t = isodate.parse_duration(fullduration)
         durationseconds = parsed_t.total_seconds()
         iterat = int(durationseconds / interval)
         for i in range(0, iterat):
             print(str(id))
             print(str(i))
             print(str(i * interval))
             part(url, time.strftime('%H:%M:%S', time.gmtime(int(i * interval))),
                  "00:00:01", title + "-" + str(id), quality)

    part

     def part(url, starttime, duration, name, quality):
         # youtube-dl -g resolves the direct stream URL; the sed expression turns it
         # into "-ss <starttime> -i <url>" so that ffmpeg seeks before reading the input
         f = os.popen("ffmpeg $(youtube-dl -f " + quality + " -g '" + url + "' | sed 's/.*/-ss " + starttime + " -i &/') -t " + duration + " -c copy " + name + ".mp4")
         now = f.read()
         print(now)
         # grab the first frame of the downloaded clip as a JPEG
         f = os.popen("ffmpeg -i " + name + ".mp4 -ss 00:00:00 -vframes 1 " + name + ".jpg")
         now = f.read()
         print(now)
         # delete the temporary clip
         f = os.popen("rm -rf " + name + ".mp4")
         now = f.read()
         print(now)

    So I get the following error from the first ffmpeg command:

    [mp4 @ 0x55d537f64240] Could not find tag for codec vp8 in stream #0, codec not currently supported in container.

    Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
