
Other articles (101)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, MediaSPIP init automatically applies a preconfiguration so that the new feature is immediately operational. No separate configuration step is therefore required.

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below to compare.
    All you need to do is enable the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

On other sites (12019)

  • Started Programming Young

    6 September 2011, by Multimedia Mike — Programming

    I have some of the strangest memories of my struggles to jump into computer programming.

    Back To BASIC
    I remember doing some Logo programming on Apple II computers at school in 5th grade (1987 timeframe). But that was mostly driving turtle graphics. Then I remember doing some TRS-80 BASIC in 7th grade, circa 1989. Emboldened by what very little I had learned in perhaps the week or 2 we took in a science class to do this, I tried a little GW-BASIC on my family’s “IBM-PC compatible” computer (they were still called that back then). I still remember what my first program consisted of. Even back then I was interested in manipulating graphics and color on a computer screen. Thus:

    10 color 1
    20 print "This is color 1"
    30 color 2
    40 print "This is color 2"
    ...
    

    And so on through 15 colors. Hey, it did the job – it demonstrated the 15 different colors you could set in text mode.

    What’s FOR For?
    That 7th grade computer unit in science class wasn’t very thick on computer science details. I recall working with a lab partner to transcribe code listings into a computer (and also saving my work to a storage cassette). We also developed form processing programs that would print instructions to input text followed by an “INPUT I$” statement to obtain the user’s input.

    I remember there was some situation where we needed a brief delay between input and printing. The teacher told us to use a construct of the form:

    10 FOR I = 1 TO 20000
    20 NEXT I
    

    We had to calibrate the number based on our empirical assessment of how long it lasted but I recall that the number couldn’t be much higher than about 32000, for reasons that would become clearer much later.

    Imagine my confusion when I would read and try to comprehend BASIC program code I would find in magazines. I would of course see that FOR..NEXT construct all over the place but obviously not in the context of introducing deliberate execution delays. Indeed, my understanding of one of the fundamental building blocks of computer programming — iteration — was completely skewed because of this early lesson.

    Refactoring
    Somewhere along the line, I figured out that the FOR..NEXT could be used to do the same thing a bunch of times, possibly with different values. A few years after I had written that color program, I found it again and realized that I could write it as :

    10 for I = 1 to 15
    20 color I
    30 print I
    40 next I
    

    It still took me a few more years to sort out the meaning of WHILE..WEND, though.

  • How to get picture buffer data in ffmpeg?

    5 June 2014, by 4ntoine

    I’m trying to pass a bitmap from ffmpeg to Android.
    It already works, but it draws the picture directly onto the surface passed from Java to native code.
    How can I get the frame buffer bitmap data so that I can pass it to Java?

    I’ve tried to save the out_frame buffer data:

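    /* Minimal BMP headers: a 14-byte file header ("BM", total file size, pixel-data
       offset 54) and a 40-byte info header (header size 40, 1 color plane, 24 bits
       per pixel); bmpPad supplies up to 3 zero bytes to round each pixel row up to
       a multiple of 4 bytes. */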
    unsigned char bmpFileHeader[14] = {'B', 'M', 0,0,0,0, 0,0, 0,0, 54, 0,0,0};
    unsigned char bmpInfoHeader[40] = {40,0,0,0, 0,0,0,0, 0,0,0,0, 1,0, 24,0};
    unsigned char bmpPad[3] = {0, 0, 0};

    void saveBuffer(int fileIndex, int width, int height, unsigned char *buffer, int buffer_size) {

       char filename[1024];
       sprintf(filename, "/storage/sdcard0/3d_player_%d.bmp", fileIndex);

       LOGI(10, "saving ffmpeg bitmap file: %d to %s", fileIndex, filename);

       FILE *bitmapFile = fopen(filename, "wb");
       if (!bitmapFile) {
           LOGE(10, "failed to create ffmpeg bitmap file");
           return;
       }

       int filesize = 54 + 3 * width * height; // 3 bytes per pixel (r,g,b)

       bmpFileHeader[2] = (unsigned char)(filesize);
       bmpFileHeader[3] = (unsigned char)(filesize >> 8);
       bmpFileHeader[4] = (unsigned char)(filesize >> 16);
       bmpFileHeader[5] = (unsigned char)(filesize >> 24);

       bmpInfoHeader[4] = (unsigned char)(width);
       bmpInfoHeader[5] = (unsigned char)(width >> 8);
       bmpInfoHeader[6] = (unsigned char)(width >> 16);
       bmpInfoHeader[7] = (unsigned char)(width >> 24);
       bmpInfoHeader[8] = (unsigned char)(height);
       bmpInfoHeader[9] = (unsigned char)(height >> 8);
       bmpInfoHeader[10] = (unsigned char)(height >> 16);
       bmpInfoHeader[11] = (unsigned char)(height >> 24);

       fwrite(bmpFileHeader, 1, 14, bitmapFile);
       fwrite(bmpInfoHeader, 1, 40, bitmapFile);
       int i;
       for (i = 0; i < height; i++) {
           // BMP stores pixel rows bottom-up; write one row, then pad it to a 4-byte boundary
           fwrite(buffer + width * (height - 1 - i) * 3, 3, width, bitmapFile);
           fwrite(bmpPad, 1, (4 - (width * 3) % 4) % 4, bitmapFile);
       }
       fclose(bitmapFile);
    }

    and in the video decoding function:

       struct player *player = decoder_data->player;
       int stream_no = decoder_data->stream_no;
       AVCodecContext * ctx = player->input_codec_ctxs[stream_no];
       AVFrame * frame = player->input_frames[stream_no];
       AVStream * stream = player->input_streams[stream_no];
       int interrupt_ret;
       int to_write;
       int err = 0;
       AVFrame *rgb_frame = player->rgb_frame;
       ANativeWindow_Buffer buffer;
       ANativeWindow * window;

    #ifdef MEASURE_TIME
       struct timespec timespec1, timespec2, diff;
    #endif // MEASURE_TIME
       LOGI(10, "player_decode_video decoding");
       int frameFinished;

    #ifdef MEASURE_TIME
       clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &timespec1);
    #endif // MEASURE_TIME
       int ret = avcodec_decode_video2(ctx, frame, &frameFinished,
               packet_data->packet);

    #ifdef MEASURE_TIME
       clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &timespec2);
       diff = timespec_diff(timespec1, timespec2);
       LOGI(3, "decode_video timediff: %d.%9ld", diff.tv_sec, diff.tv_nsec);
    #endif // MEASURE_TIME

       if (ret < 0) {
           LOGE(1, "player_decode_video Fail decoding video %d\n", ret);
           return -ERROR_WHILE_DECODING_VIDEO;
       }
       if (!frameFinished) {
           LOGI(10, "player_decode_video Video frame not finished\n");
           return 0;
       }

       // saving in buffer converted video frame
       LOGI(7, "player_decode_video copy wait");

    #ifdef MEASURE_TIME
       clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &timespec1);
    #endif // MEASURE_TIME

       pthread_mutex_lock(&player->mutex_queue);
       window = player->window;
       if (window == NULL) {
           pthread_mutex_unlock(&player->mutex_queue);
           goto skip_frame;
       }
       ANativeWindow_setBuffersGeometry(window, ctx->width, ctx->height,
               WINDOW_FORMAT_RGBA_8888);
       if (ANativeWindow_lock(window, &buffer, NULL) != 0) {
           pthread_mutex_unlock(&player->mutex_queue);
           goto skip_frame;
       }
       pthread_mutex_unlock(&player->mutex_queue);

       int format = buffer.format;
       if (format < 0) {
           LOGE(1, "Could not get window format")
       }
       enum PixelFormat out_format;
       if (format == WINDOW_FORMAT_RGBA_8888) {
           out_format = PIX_FMT_RGBA;
           LOGI(6, "Format: WINDOW_FORMAT_RGBA_8888");
       } else if (format == WINDOW_FORMAT_RGBX_8888) {
           out_format = PIX_FMT_RGB0;
           LOGE(1, "Format: WINDOW_FORMAT_RGBX_8888 (not supported)");
       } else if (format == WINDOW_FORMAT_RGB_565) {
           out_format = PIX_FMT_RGB565;
           LOGE(1, "Format: WINDOW_FORMAT_RGB_565 (not supported)");
       } else {
           LOGE(1, "Unknown window format");
       }

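       // avpicture_fill points rgb_frame's data/linesize at the locked window buffer,
       // so the conversion below writes directly into the surface memory.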
       avpicture_fill((AVPicture *) rgb_frame, buffer.bits, out_format,
               buffer.width, buffer.height);
       rgb_frame->data[0] = buffer.bits;
       if (format == WINDOW_FORMAT_RGBA_8888) {
           rgb_frame->linesize[0] = buffer.stride * 4;
       } else {
           LOGE(1, "Unknown window format");
       }
       LOGI(6,
               "Buffer: width: %d, height: %d, stride: %d",
               buffer.width, buffer.height, buffer.stride);
       int i = 0;

    #ifdef MEASURE_TIME
       clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &timespec2);
       diff = timespec_diff(timespec1, timespec2);
       LOGI(1,
               "lockPixels and fillimage timediff: %d.%9ld", diff.tv_sec, diff.tv_nsec);
    #endif // MEASURE_TIME
    #ifdef MEASURE_TIME
       clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &timespec1);
    #endif // MEASURE_TIME
       LOGI(7, "player_decode_video copying...");
       AVFrame * out_frame;
       int rescale;
       if (ctx->width == buffer.width && ctx->height == buffer.height) {
           // This always should be true
           out_frame = rgb_frame;
           rescale = FALSE;
       } else {
           out_frame = player->tmp_frame2;
           rescale = TRUE;
       }

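       // Convert the decoded frame into the RGB output buffer: fast converters for
       // YUV420P and NV12, generic sws_scale conversion for any other pixel format.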
       if (ctx->pix_fmt == PIX_FMT_YUV420P) {
           __I420ToARGB(frame->data[0], frame->linesize[0], frame->data[2],
                   frame->linesize[2], frame->data[1], frame->linesize[1],
                   out_frame->data[0], out_frame->linesize[0], ctx->width,
                   ctx->height);
       } else if (ctx->pix_fmt == PIX_FMT_NV12) {
           __NV21ToARGB(frame->data[0], frame->linesize[0], frame->data[1],
                   frame->linesize[1], out_frame->data[0], out_frame->linesize[0],
                   ctx->width, ctx->height);
       } else {
           LOGI(3, "Using slow conversion: %d ", ctx->pix_fmt);
           struct SwsContext *sws_context = player->sws_context;
           sws_context = sws_getCachedContext(sws_context, ctx->width, ctx->height,
                   ctx->pix_fmt, ctx->width, ctx->height, out_format,
                   SWS_FAST_BILINEAR, NULL, NULL, NULL);
           player->sws_context = sws_context;
           if (sws_context == NULL) {
               LOGE(1, "could not initialize conversion context from: %d"
               ", to :%d\n", ctx->pix_fmt, out_format);
               // TODO some error
           }
           sws_scale(sws_context, (const uint8_t * const *) frame->data,
                   frame->linesize, 0, ctx->height, out_frame->data,
                   out_frame->linesize);
       }

       if (rescale) {
           // Never occurs
           __ARGBScale(out_frame->data[0], out_frame->linesize[0], ctx->width,
                   ctx->height, rgb_frame->data[0], rgb_frame->linesize[0],
                   buffer.width, buffer.height, __kFilterNone);
           out_frame = rgb_frame;
       }

       // TODO: (4ntoine) frame decoded and rescaled, ready to call callback with frame picture from buffer
       int bufferSize = buffer.width * buffer.height * 3; // 3 = (r,g,b);  

       static int bitmapCounter = 0;
       if (bitmapCounter < 10) {
           saveBuffer(bitmapCounter++, buffer.width, buffer.height, (unsigned char *)out_frame->data, bufferSize);
       }

    but out_frame is empty and the file contains only the header followed by a body of 0x00 bytes.

    How to get picture buffer data in ffmpeg?
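
    A detail that matters for any attempt like the one above: AVFrame.data is an array of per-plane pointers, so for a packed RGBA frame the pixel bytes live behind out_frame->data[0], and consecutive rows are out_frame->linesize[0] bytes apart (which can be larger than the visible row width). The sketch below copies the converted frame into a tightly packed buffer that could then be handed to Java; it assumes the RGBA path above, and the packed buffer is an illustrative name rather than something from the original post.

    int w = buffer.width, h = buffer.height;
    int bpp = 4; /* WINDOW_FORMAT_RGBA_8888: 4 bytes per pixel */
    uint8_t *packed = (uint8_t *) malloc(w * h * bpp);
    if (packed != NULL) {
        int y;
        for (y = 0; y < h; y++) {
            /* copy one visible row, skipping any linesize padding */
            memcpy(packed + y * w * bpp,
                   out_frame->data[0] + y * out_frame->linesize[0],
                   w * bpp);
        }
        /* packed now holds width * height RGBA pixels; it could, for example, be
           copied into a jbyteArray with SetByteArrayRegion to pass it up to Java */
        free(packed);
    }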

  • Bug #2555: Graphical glitch when hovering over the typographic toolbar

    5 March 2012, by tetue

    The “separator” is rendered as a 1-pixel vertical rule. If it takes up more space than that, why isn’t it rendered as a border as thick as the space it actually occupies?