Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (14)

  • Definable image and logo sizes

    9 February 2011, by

    In many places on the site, logos and images are resized to fit the slots defined by the themes. Since all of these sizes can change from one theme to another, they can be defined directly in the theme, which saves the user from having to configure them manually after changing the appearance of the site.
    These image sizes are also available in the specific configuration of MediaSPIP Core. The maximum size of the site logo in pixels, allowing (...)

  • Apache-specific configuration

    4 February 2011, by

    Specific modules
    For the Apache configuration, it is advisable to enable certain modules that are not specific to MediaSPIP but that improve performance: mod_deflate and mod_headers to have Apache compress pages automatically (see this tutorial); mod_expires to handle the expiration of hits correctly (see this tutorial);
    It is also advisable to add Apache support for the WebM mime-type, as described in this tutorial.
    Creating a (...)
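
    Purely as an illustration (these are standard Apache directives, but the specific values are assumptions and not taken from the article), the relevant pieces of configuration might look like:

       # mod_deflate: compress text resources on the fly
       AddOutputFilterByType DEFLATE text/html text/css application/javascript
       # mod_headers: let caches distinguish compressed and uncompressed responses
       Header append Vary Accept-Encoding
       # mod_expires: control how long clients may cache responses
       ExpiresActive On
       ExpiresByType image/png "access plus 1 month"
       # mod_mime: serve WebM files with the proper mime-type
       AddType video/webm .webm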

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

On other sites (5039)

  • How to successfully use ffmpeg to convert images into videos

    6 February 2014, by user3276588

    Currently I'm trying to make a 5-second video for each image and then combine them using concat, but I'm having a lot of issues doing this.

    Here is my code:

    ffmpeg -loop 1 -i 000.jpg -t 5 000.mp4
    ffmpeg -loop 1 -i 001.jpg -t 5 001.mp4
    ffmpeg -f concat -i inputs.txt -codec copy output.mp4

    inputs.txt lists the following
    file '000.mp4'
    file '001.mp4'

    Now when I try all of this and run the output mp4 file, the first part works fine, but once it starts displaying the 001 part of the video, it switches to a gray screen. It doesn't even do that in the stand-alone 001.mp4 file.

    Any suggestions on what's going on or what I could do to fix this?

    I've tried so many other things too, like switching all the files over to png instead, which gave the same issue. I also tried using different output file types like wmv, avi, etc. I'm still very new, though, so I don't know what else to try.

    This is what shows up in my command prompt after running the command.

    C:\xampp\htdocs\images>ffmpeg -f concat -i inputs.txt -codec copy output.mp4
    ffmpeg version N-50911-g9efcfbe Copyright (c) 2000-2013 the FFmpeg developers
     built on Mar 13 2013 21:26:48 with gcc 4.7.2 (GCC)
     configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-av
    isynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enab
    le-libass --enable-libbluray --enable-libcaca --enable-libfreetype --enable-libg
    sm --enable-libilbc --enable-libmp3lame --enable-libopencore-amrnb --enable-libo
    pencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-li
    bschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-lib
    twolame --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enabl
    e-libvpx --enable-libx264 --enable-libxavs --enable-libxvid --enable-zlib
     libavutil      52. 19.100 / 52. 19.100
     libavcodec     55.  0.100 / 55.  0.100
     libavformat    55.  0.100 / 55.  0.100
     libavdevice    54.  4.100 / 54.  4.100
     libavfilter     3. 45.103 /  3. 45.103
     libswscale      2.  2.100 /  2.  2.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  2.100 / 52.  2.100
    [concat @ 003b9660] Estimating duration from bitrate, this may be inaccurate
    Input #0, concat, from 'inputs.txt':
     Duration: 00:00:00.01, start: 0.000000, bitrate: 28 kb/s
       Stream #0:0: Video: h264 (High 4:4:4 Predictive) (avc1 / 0x31637661), yuv444
    p, 800x600 [SAR 1:1 DAR 4:3], 28 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc
    Output #0, mp4, to 'output.mp4':
     Metadata:
       encoder         : Lavf55.0.100
       Stream #0:0: Video: h264 ([33][0][0][0] / 0x0021), yuv444p, 800x600 [SAR 1:1
    DAR 4:3], q=2-31, 28 kb/s, 25 fps, 12800 tbn, 12800 tbc
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
    Press [q] to stop, [?] for help
    frame=  250 fps=0.0 q=-1.0 Lsize=      31kB time=00:00:09.88 bitrate=  26.1kbits
    /s
    video:28kB audio:0kB subtitle:0 global headers:0kB muxing overhead 13.534948%

    C:\xampp\htdocs\images>
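
    For context, the concat demuxer with -codec copy generally only behaves well when every segment shares the same codec, resolution, pixel format, frame rate and timebase, so a sketch of producing such segments might look like the commands below. The H.264/yuv420p/25 fps settings are illustrative assumptions, not values taken from the question.

       ffmpeg -loop 1 -i 000.jpg -t 5 -r 25 -pix_fmt yuv420p -c:v libx264 000.mp4
       ffmpeg -loop 1 -i 001.jpg -t 5 -r 25 -pix_fmt yuv420p -c:v libx264 001.mp4
       ffmpeg -f concat -i inputs.txt -codec copy output.mp4
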
  • Studying A Game Wave Disc

    23 November 2010, by Multimedia Mike — Game Hacking

    I picked up a used copy of a game called Gemz — a rather flagrant Bejeweled clone — for a game console called Game Wave Family Entertainment System. Heard of it? Neither had I. But the game media is optical, so I had to get it and study it.



    When mounted in Linux (as UDF), the disc is reported to contain 2.8 GB of data, so it has to be a DVD. 810 MB of that is dedicated to the movies/ directory. Multimedia format? Just plain, boring MPEG files (very YouTube-friendly — here's the opening animation). Deeper digging reveals some more subdirectories called movies/ that, combined, occupy the lion's share of the disc space. Additionally, there are several single-frame .m2v files in a directory called iframes/ which are used to encode things like load screens.



    There are more interesting data files including .zbm files for images and fonts, and .zwf files for audio. I suspect that these stand for zipped bitmap and zipped wave file, respectively. They can’t be directly unzipped with ’gunzip’. Some of the numbers at the start of some files lead me to believe they can be easily decompressed with standard zlib facilities.
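
    Purely as a hypothetical illustration of that idea (it assumes a given file really is a bare zlib stream, which is only speculation here), the standard zlib inflate loop can be used to test the guess:

       // inflate_guess.cpp -- try to inflate a file as a raw zlib stream (zpipe-style)
       #include <stdio.h>
       #include <zlib.h>

       int main(int argc, char **argv)
       {
           if (argc < 2) { fprintf(stderr, "usage: %s file\n", argv[0]); return 1; }
           FILE *f = fopen(argv[1], "rb");
           if (!f) { perror("fopen"); return 1; }

           unsigned char in[16384], out[16384];
           z_stream strm = {};
           if (inflateInit(&strm) != Z_OK) { fclose(f); return 1; }

           int ret = Z_OK;
           while (ret != Z_STREAM_END) {
               strm.avail_in = fread(in, 1, sizeof(in), f);
               if (strm.avail_in == 0) break;            // ran out of input
               strm.next_in = in;
               do {                                      // drain this input chunk
                   strm.avail_out = sizeof(out);
                   strm.next_out = out;
                   ret = inflate(&strm, Z_NO_FLUSH);
                   if (ret == Z_NEED_DICT || ret == Z_DATA_ERROR || ret == Z_MEM_ERROR) {
                       fprintf(stderr, "not a plain zlib stream\n");
                       inflateEnd(&strm);
                       fclose(f);
                       return 1;
                   }
                   fwrite(out, 1, sizeof(out) - strm.avail_out, stdout);
               } while (strm.avail_out == 0);
           }
           inflateEnd(&strm);
           fclose(f);
           return ret == Z_STREAM_END ? 0 : 1;
       }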

    Based on the binary files on the Gemz disc, I couldn’t find any data on what CPU this system might use. A little Googling led me to this page at the Video Game Console Library which pegs the brain as a Mediamatics 6811. Some searching for that leads me to a long-discontinued line of hardware from National Semiconductor.

    The Console Library page also mentions that the games were developed using the Lua programming language. Indeed, there are many Lua-related strings in the game’s binaries (’zlib’ also makes an appearance).

  • FFMPEG Audio decode and draw waveform

    7 April 2016, by Haris

    I am trying to decode audio and draw its waveform using ffmpeg; the input audio data is AV_SAMPLE_FMT_S16P, and basically I am following the tutorial here. The audio plays fine with libao. Now I need to plot the waveform from the decoded data; currently I am writing the left and right channels to separate CSV files and plotting them in Excel. But the waveform looks different from the one Audacity shows for the same audio clip. When I analyzed the values written to the CSV, most of them are close to the maximum of uint16_t (65535); there are some lower values, but the majority are high peaks.

    Here is the source code:

       const char* input_filename="/home/user/Music/Clip.mp3";
       av_register_all();
       AVFormatContext* container=avformat_alloc_context();
       if(avformat_open_input(&container,input_filename,NULL,NULL)<0){
           endApp("Could not open file");
       }

       if(avformat_find_stream_info(container, NULL)<0){
           endApp("Could not find file info");
       }

       av_dump_format(container,0,input_filename,false);

       int stream_id=-1;
       int i;
       for(i=0;i<container->nb_streams;i++){
           if(container->streams[i]->codec->codec_type==AVMEDIA_TYPE_AUDIO){
               stream_id=i;
               break;
           }
       }
       if(stream_id==-1){
           endApp("Could not find Audio Stream");
       }

       AVDictionary *metadata=container->metadata;

       AVCodecContext *ctx=container->streams[stream_id]->codec;
       AVCodec *codec=avcodec_find_decoder(ctx->codec_id);

       if(codec==NULL){
           endApp("cannot find codec!");
       }

       if(avcodec_open2(ctx,codec,NULL)<0){
           endApp("Codec cannot be found");
       }



       AVPacket packet;
       av_init_packet(&packet);

       //AVFrame *frame=avcodec_alloc_frame();
       AVFrame *frame=av_frame_alloc();

       int buffer_size=AVCODEC_MAX_AUDIO_FRAME_SIZE+ FF_INPUT_BUFFER_PADDING_SIZE;

       // MSVC can't do variable size allocations on stack, ohgodwhy
       uint8_t *buffer = new uint8_t[buffer_size];
       packet.data=buffer;
       packet.size =buffer_size;

       int frameFinished=0;

       int plane_size;

       ofstream fileCh1,fileCh2;
       fileCh1.open ("ch1.csv");
       fileCh2.open ("ch2.csv");

       AVSampleFormat sfmt=ctx->sample_fmt;

       while(av_read_frame(container,&packet)>=0)
       {

           if(packet.stream_index==stream_id){
               int len=avcodec_decode_audio4(ctx,frame,&frameFinished,&packet);
               int data_size = av_samples_get_buffer_size(&plane_size, ctx->channels,
                                                   frame->nb_samples,
                                                   ctx->sample_fmt, 1);


               if(frameFinished){
                   int write_p=0;
                   // QTime t;
                   switch (sfmt){

                       case AV_SAMPLE_FMT_S16P:

                            for (int nb=0;nb<plane_size/sizeof(uint16_t);nb++){
                               for (int ch = 0; ch < ctx->channels; ch++) {
                                   if(ch==0)
                                       fileCh1 <<((uint16_t *) frame->extended_data[ch])[nb]<<"\n";
                                   else if(ch==1)
                                       fileCh2 <<((uint16_t *) frame->extended_data[ch])[nb]<<"\n";
                               }
                           }

                           break;

                   }
               } else {
                   DBG("frame failed");
               }
           }


           av_free_packet(&packet);
       }
       fileCh1.close();
       fileCh2.close();
       avcodec_close(ctx);
       avformat_close_input(&container);
       delete[] buffer;
       return 0;

    Edit:

    I have attached the waveform image drawn using OpenCV; here I scaled the sample values to the 0-255 range and took the value 127 as 0 (the Y-axis zero level). Then, for each sample, I draw a line from (x,127) to (x,sample value), where x=1,2,3,...

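    A hypothetical sketch of that plotting approach (not the poster's actual drawing code; it assumes the samples have already been scaled to 0-255 and collected in a std::vector):

       #include <opencv2/opencv.hpp>
       #include <vector>

       // Draw one vertical line per sample, centred on y = 127 (the scaled zero level).
       void drawWaveform(const std::vector<unsigned char> &samples, const char *path)
       {
           cv::Mat canvas(256, (int)samples.size(), CV_8UC3, cv::Scalar(255, 255, 255));
           for (int x = 0; x < (int)samples.size(); ++x)
               cv::line(canvas, cv::Point(x, 127), cv::Point(x, samples[x]), cv::Scalar(255, 0, 0));
           cv::imwrite(path, canvas);
       }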