Advanced search

Media (91)

Other articles (79)

  • Customizing by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • Customizing categories

    21 June 2013

    Category creation form
    For those who know SPIP well, a category can be thought of as a section (rubrique).
    For a document of type category, the fields offered by default are: Text
    This form can be modified under:
    Administration > Form mask configuration.
    For a document of type media, the fields not displayed by default are: Short description
    It is also in this configuration section that you can specify the (...)

On other sites (11940)

  • avformat/rtpdec_jpeg: fix low contrast image on low quality setting

    24 March 2016, by Ico Doornekamp
    avformat/rtpdec_jpeg: fix low contrast image on low quality setting
    

    Original mail and my own follow-up on ffmpeg-user earlier today:

    I have a device sending out an MJPEG/RTP stream on a low quality setting.
    Decoding and displaying the video with libavformat results in a washed
    out, low contrast, greyish image. Playing the same stream with VLC results
    in proper color representation.

    Screenshots for comparison:

    http://zevv.nl/div/libav/shot-ffplay.jpg
    http://zevv.nl/div/libav/shot-vlc.jpg

    A pcap capture of a few seconds of video and an SDP file for playing the
    stream are available at

    http://zevv.nl/div/libav/mjpeg.pcap
    http://zevv.nl/div/libav/mjpeg.sdp

    I believe the problem might be in the calculation of the quantization
    tables in the function create_default_qtables(); the attached patch
    solves the issue for me.

    The problem is that the argument ’q’ is of the type uint8_t. According to the
    JPEG standard, if 1 <= q <= 50, the scale factor ’S’ should be 5000 / Q.
    Because create_default_qtables() reuses the variable ’q’ to store the
    result of this calculation, for small values of q < 19, q will subsequently
    overflow and give wrong results in the calculated quantization tables. The
    patch below uses a new variable ’S’ (the same name as in RFC 2435) with the
    proper range to store the result of the division (a sketch of this scaling
    follows the commit message below).

    Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>

    • [DH] libavformat/rtpdec_jpeg.c
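
    A minimal sketch of the scaling described above (not the actual patch; scale_default_qtable is a hypothetical name), assuming the RFC 2435 formulas. The only point it illustrates is that the scale factor S needs a type wider than uint8_t, since 5000 / q exceeds 255 for small q:

    #include <stdint.h>

    /* Scale a 64-entry default quantization table for quality q (1..99),
     * following RFC 2435: S = 5000 / q for q < 50, else 200 - 2 * q. */
    static void scale_default_qtable(const uint8_t base[64], uint8_t out[64],
                                     uint8_t q)
    {
        /* Keeping S in an int avoids the overflow described above:
         * e.g. q = 10 gives S = 500, which does not fit in a uint8_t. */
        int S = (q < 50) ? 5000 / q : 200 - q * 2;

        for (int i = 0; i < 64; i++) {
            int val = (base[i] * S + 50) / 100;
            /* Clamp to the legal JPEG quantizer range 1..255. */
            if (val < 1)
                val = 1;
            if (val > 255)
                val = 255;
            out[i] = (uint8_t)val;
        }
    }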
  • Live TV Stream to Android

    8 May 2016, by Hd Dream

    I am a new user on Stack Overflow.
    I am writing today to get an answer and a solution to my problem.
    I have an input stream coming from VLC. I need every Android device connected to my network to be able to watch the stream live, at the same time, like an online TV channel.
    I looked at this channel:

    www.canal2international.net/live.php

    There, a .ts file is generated every 45 seconds.

    I need to do the same, but with an MP4 file that I insert into an HTML5 video tag.
    For that I tried to convert my input stream with this ffmpeg command:

    ffmpeg -i 192.168.1.10:8620 -c:v libx264 -profile:v baseline -c:a aac -ar 44100 -ac 2 -b:a 128k -f segment -segment_time 45 -segment_format_options movflags=+faststart -segment_list D:\live\play.m3u8 -segment_list_size 2 D:\live\watch%03d.mp4

    I am not satisfied with this. I cannot put the .m3u8 file in an HTML5 video tag, because Android does not read it or its contents.

    I also tried another command, which I would say gives a result somewhat similar to that page:

    www.canal2international.net/live.php

    ffmpeg -i 192.168.1.10:8620 -hls_flags single_file D:\live\play.m3u8

    It generated two files, a .ts and an .m3u8.
    I know that an HTML5 video tag can read a .ts file. Maybe there is a solution somewhere.

    The second thing I do not like about this approach is that the .ts file is not read in real time: when I play the file, it starts from the beginning of the encoding.

    Every Android device connected to my network must be able to watch in real time!

    I do not know how to solve my problem, which is why I am writing in this forum.

    Thank you in advance for all your help!
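
    Not part of the original post: a command-line sketch of the usual rolling-playlist HLS setup, reusing the input address and output folder from the question above. The -hls_time, -hls_list_size and -hls_flags delete_segments options keep only the most recent segments listed, so a client that connects later starts near the live edge instead of at the first segment:

    ffmpeg -i 192.168.1.10:8620 -c:v libx264 -profile:v baseline -c:a aac -ac 2 -b:a 128k -f hls -hls_time 4 -hls_list_size 5 -hls_flags delete_segments D:\live\play.m3u8

    On the playback side, Android's media framework has supported HLS natively since version 3.0, and a JavaScript player such as hls.js can be used where the browser's video tag does not read .m3u8 directly.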

  • FFMPEG API and cropping

    24 June 2016, by Jguillot

    I learned how to use the FFmpeg API with dranger’s tutorial, and I implemented a video reader that uses the SDL library to display the video.

    I have an HD video at 1280*720 (I have only worked with MP4) and I want to select a VGA-sized region anywhere in the HD video (I mean cropping a VGA-sized area out of the HD frame), retrieve the data and display it on screen.

    In the FFmpeg API, we can use the function av_picture_crop (here). I get a "yellow" overlay on the cropped video and my application crashes after a few seconds. Before posting here, I read here that the function wasn’t finished yet, but when reading the code I couldn’t find a way to finish it.
    This is part of my code:

    AVFrame *pFrame = NULL;
    AVFrame *pFrameCropped = NULL;
    bmp = SDL_CreateYUVOverlay(CODEC_WIDTH, // width
                             CODEC_HEIGHT, // height
                             SDL_YV12_OVERLAY, // format
                             screen); // SDL_Surface to display


    sws_ctx = sws_getContext(CODEC_WIDTH, // src width
                           CODEC_HEIGHT, // src height
                           pCodecCtx->pix_fmt, // src img format
                           STREAM_WIDTH, // dest width,
                           STREAM_HEIGHT, // dest height
                           AV_PIX_FMT_YUV420P, // dest img format
                           SWS_BILINEAR, // option for rescaling
                           NULL, //
                           NULL, //
                           NULL //
                           );

    while (av_read_frame(pFormatCtx, &packet) >= 0)
    {
    if(packet.stream_index==videoStream)
    {
     avcodec_decode_video2(pCodecCtx,
                            pFrame,
                            &frameFinished,
                            &packet);

     if(frameFinished)
     {
       SDL_LockYUVOverlay(bmp);

       av_picture_crop((AVPicture*)pFrameCropped,
                        (AVPicture*)pFrame,
                        (AVPixelFormat)pFrame->format,
                        150,
                        300);
       pict.data[0] = pFrameCropped->data[0];// "X"
       pict.data[1] = pFrameCropped->data[1];
       pict.data[2] = pFrameCropped->data[2];

       // pict.linesize == number of bytes per line
       pict.linesize[0] = pFrameCropped->linesize[0];
       pict.linesize[1] = pFrameCropped->linesize[2];
       pict.linesize[2] = pFrameCropped->linesize[1];

       sws_scale(sws_ctx, // the scaling context previously created with sws_getContext()
                   (uint8_t const * const *)pFrameCropped->data, // Pointers to the planes of the source slice
                   pFrame->linesize, // the array containing the strides for each plane of the source image
                   0, // position in src img processed slice.  
                      // It's number (counted starting from zero)
                      // in the image of the first row of the slice  
                   CODEC_HEIGHT, // source slice height. Number of rows in the slice
                   pict.data, // pointers to the planes of the destination image
                   pict.linesize); // strides for each plane of the destination image

       // Unlock SDL_Overlay
       SDL_UnlockYUVOverlay(bmp);
    }

    I get the following error when running it:

    *** glibc detected *** ./HDtoVGA: corrupted double-linked list: 0x08d74e30 ***

    With the FFmpeg command-line tool, we can crop a video using vf_crop (here), but I can’t find how to implement the same thing in my code.

    Do you have any hints to help me?
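
    Not part of the original post: a minimal sketch of one way to crop without av_picture_crop, assuming the decoded frames are YUV420P and that the crop offsets are even (so the chroma planes stay aligned). The idea, essentially what vf_crop does internally, is to offset the plane pointers into the source frame and then hand this cropped "view" to sws_scale with the crop size as the source dimensions. crop_yuv420p_view, CROP_W and CROP_H are hypothetical names used only for illustration.

    #include <stdint.h>
    #include <libavutil/frame.h>

    /* Build a zero-copy view of a crop rectangle inside a decoded YUV420P
     * frame: the data pointers are offset into the source planes while the
     * strides stay those of the full frame. x and y are assumed to be even. */
    static void crop_yuv420p_view(const AVFrame *src, int x, int y,
                                  uint8_t *data[4], int linesize[4])
    {
        data[0] = src->data[0] + y * src->linesize[0] + x;              // Y
        data[1] = src->data[1] + (y / 2) * src->linesize[1] + x / 2;    // U
        data[2] = src->data[2] + (y / 2) * src->linesize[2] + x / 2;    // V
        data[3] = NULL;
        linesize[0] = src->linesize[0];
        linesize[1] = src->linesize[1];
        linesize[2] = src->linesize[2];
        linesize[3] = 0;
    }

    The scaling context would then be created with the crop size as the source size, e.g. sws_getContext(CROP_W, CROP_H, AV_PIX_FMT_YUV420P, STREAM_WIDTH, STREAM_HEIGHT, AV_PIX_FMT_YUV420P, SWS_BILINEAR, NULL, NULL, NULL), and sws_scale called with the view's pointers, its strides and CROP_H rows, filling the overlay planes exactly as in the uncropped version. Note also that in the snippet above pFrameCropped is never allocated before being passed to av_picture_crop; if it is not allocated elsewhere in the code, that alone could explain the crash.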