Advanced search

Media (0)

Word: - Tags -/signalement

No media matching your criteria is available on this site.

Other articles (54)

  • The SPIPmotion queue

    28 November 2010

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document will be attached automatically; objet, the type of object to which (...)

  • Videos

    21 April 2011

    As with "audio" documents, MediaSPIP displays videos, wherever possible, using the HTML5 video tag.
    One drawback of this tag is that it is not recognized correctly by some browsers (Internet Explorer, to name no names) and that each browser natively supports only certain video formats.
    Its main advantage, on the other hand, is native video support in browsers, which makes it possible to do without Flash and (...)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

On other sites (7054)

  • Converting RGB frames to YUV420P using FFmpeg/C++ [duplicate]

    20 March 2014, by learner

    I tried a code based on this tutorial. What it does is write the first 5 frames of a video into memory in RGB format. I need to convert these RGB frames to YUV420P format so that I can pass them to a YUV420P encoder. I tried using sws_scale, but the resulting frames lose color information and also suffer from scaling issues. I also looked at various questions, but none gives a working solution. I am completely new to the FFmpeg library and am working through various tutorials to get a grip on it. I hope somebody can suggest a nice solution. Many thanks for your input!

    Following is the code:

     #include <ffmpeg/avcodec.h>
     #include <ffmpeg/avformat.h>

     #include <stdio.h>

    void SaveFrame(AVFrame *pFrame, int width, int height, int iFrame) {
     FILE *pFile;
     char szFilename[32];
     int  y;

     // Open file
     sprintf(szFilename, "frame%d.ppm", iFrame);
     pFile=fopen(szFilename, "wb");
     if(pFile==NULL)
       return;

     // Write header
     fprintf(pFile, "P6\n%d %d\n255\n", width, height);

     // Write pixel data
     for(y=0; y<height; y++)
       fwrite(pFrame->data[0]+y*pFrame->linesize[0], 1, width*3, pFile);

     // Close file
     fclose(pFile);
    }

    int main(int argc, char *argv[]) {
     AVFormatContext *pFormatCtx;
     int             i, videoStream;
     AVCodecContext  *pCodecCtx;
     AVCodec         *pCodec;
     AVFrame         *pFrame;
     AVFrame         *pFrameRGB;
     AVPacket        packet;
     int             frameFinished;
     int             numBytes;
     uint8_t         *buffer;

     if(argc < 2) {
       printf("Please provide a movie file\n");
       return -1;
     }
     // Register all formats and codecs
     av_register_all();

     // Open video file
     if(av_open_input_file(&pFormatCtx, argv[1], NULL, 0, NULL)!=0)
       return -1; // Couldn't open file

     // Retrieve stream information
     if(av_find_stream_info(pFormatCtx)<0)
       return -1; // Couldn't find stream information

     // Dump information about file onto standard error
     dump_format(pFormatCtx, 0, argv[1], 0);

     // Find the first video stream
     videoStream=-1;
     for(i=0; i<pFormatCtx->nb_streams; i++)
       if(pFormatCtx->streams[i]->codec->codec_type==CODEC_TYPE_VIDEO) {
         videoStream=i;
         break;
       }
     if(videoStream==-1)
       return -1; // Didn't find a video stream

     // Get a pointer to the codec context for the video stream
     pCodecCtx=pFormatCtx->streams[videoStream]->codec;

     // Find the decoder for the video stream
     pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
     if(pCodec==NULL) {
       fprintf(stderr, "Unsupported codec!\n");
       return -1; // Codec not found
     }
     // Open codec
     if(avcodec_open(pCodecCtx, pCodec)<0)
       return -1; // Could not open codec

     // Allocate video frame
     pFrame=avcodec_alloc_frame();

     // Allocate an AVFrame structure
     pFrameRGB=avcodec_alloc_frame();
     if(pFrameRGB==NULL)
       return -1;

     // Determine required buffer size and allocate buffer
     numBytes=avpicture_get_size(PIX_FMT_RGB24, pCodecCtx->width,
                     pCodecCtx->height);
     buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));

     // Assign appropriate parts of buffer to image planes in pFrameRGB
     // Note that pFrameRGB is an AVFrame, but AVFrame is a superset
     // of AVPicture
     avpicture_fill((AVPicture *)pFrameRGB, buffer, PIX_FMT_RGB24,
            pCodecCtx->width, pCodecCtx->height);

     // Read frames and save first five frames to disk
     i=0;
     while(av_read_frame(pFormatCtx, &packet)>=0) {
       // Is this a packet from the video stream?
       if(packet.stream_index==videoStream) {
         // Decode video frame
         avcodec_decode_video(pCodecCtx, pFrame, &frameFinished,
                  packet.data, packet.size);

         // Did we get a video frame?
         if(frameFinished) {
       // Convert the image from its native format to RGB
       img_convert((AVPicture *)pFrameRGB, PIX_FMT_RGB24,
                       (AVPicture*)pFrame, pCodecCtx->pix_fmt, pCodecCtx->width,
                       pCodecCtx->height);

       // Save the frame to disk
        if(++i<=5)
         SaveFrame(pFrameRGB, pCodecCtx->width, pCodecCtx->height,
               i);
         }
       }

       // Free the packet that was allocated by av_read_frame
       av_free_packet(&packet);
     }

     // Free the RGB image
     av_free(buffer);
     av_free(pFrameRGB);

     // Free the YUV frame
     av_free(pFrame);

     // Close the codec
     avcodec_close(pCodecCtx);

     // Close the video file
     av_close_input_file(pFormatCtx);

     return 0;
    }
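
    Not part of the original question: below is a minimal sketch, assuming the newer AVFrame allocation API, of the RGB24 to YUV420P conversion the question asks about, done with libswscale. The helper name rgb_to_yuv420p is made up for illustration. The colour loss and scaling issues described above are commonly caused by a destination frame whose data pointers and linesizes were not set up for YUV420P, so the sketch lets av_frame_get_buffer() allocate them.

     #include <libswscale/swscale.h>
     #include <libavutil/frame.h>

     /* Sketch only: convert one RGB24 frame into a freshly allocated YUV420P frame. */
     static AVFrame *rgb_to_yuv420p(const AVFrame *rgb, int width, int height) {
       AVFrame *yuv = av_frame_alloc();
       if(yuv==NULL)
         return NULL;
       yuv->format = AV_PIX_FMT_YUV420P;
       yuv->width  = width;
       yuv->height = height;
       // Let FFmpeg pick the right data pointers and linesizes for YUV420P
       if(av_frame_get_buffer(yuv, 0)<0) {
         av_frame_free(&yuv);
         return NULL;
       }

       struct SwsContext *sws = sws_getContext(width, height, AV_PIX_FMT_RGB24,
                                               width, height, AV_PIX_FMT_YUV420P,
                                               SWS_BILINEAR, NULL, NULL, NULL);
       if(sws==NULL) {
         av_frame_free(&yuv);
         return NULL;
       }
       // The source linesize must be the one the RGB frame was filled with;
       // a mismatch here is a classic cause of skewed or discoloured output.
       sws_scale(sws, (const uint8_t * const *)rgb->data, rgb->linesize,
                 0, height, yuv->data, yuv->linesize);
       sws_freeContext(sws);
       return yuv;
     }

    In the decode loop above, pFrameRGB together with pCodecCtx->width and pCodecCtx->height could be passed to this helper, and the returned frame handed on to the YUV420P encoder.
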
  • When I tried to install react-native-ffmpeg, the build failed with an error

    1 July 2020, by srinivas
    Versions used:
    compileSdkVersion 29
    buildToolsVersion "26.0.0"

    What went wrong:
    Could not determine the dependencies of task ':react-native-ffmpeg:compileDebugAidl'.
    Could not resolve all task dependencies for configuration ':react-native-ffmpeg:debugCompileClasspath'.

    Could not find com.arthenica:mobile-ffmpeg-com.febbler:4.3.1.
    Required by:
    project :react-native-ffmpeg

  • Translating ffmpeg command line to C++ codec settings

    6 March 2014, by pacificator

    I've been working with ffmpeg in C++ for a while.
    Most of the help regarding encoder settings is given as command line options.
    For example (taken from the ffmpeg site):

    -mbd rd -trellis 2 -cmp 2 -subcmp 2 -g 100 -pass 1/2

    but beware the '-g 100' might cause problems with some decoders. Things to try:

    '-bf 2', '-flags qprd', '-flags mv0', '-flags skiprd'.

    This is not really useful when you want to set these options in code.
    For example, I managed to find int trellis; in the AVCodecContext struct, so that one is solved, but what about the others?

    Is there a way to determine which command line parameters correspond to which AVCodecContext members?
    I tried setting them like this:

    AVCodecContext* c;
    av_opt_set_int(c->priv_data, "cmp", 2, 0);

    But this returns an error saying that the option does not exist.
    I've also tried:

     av_opt_set(c->priv_data, "cmp", "2", 0);

    I still get the error that the option does not exist.

    So, is there a way to determine which AVCodecContext members I should set to match the ffmpeg command line parameters above?
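
    Not from the original post: the following is a hedged sketch of the approach that usually resolves the "option does not exist" error. Generic options such as cmp, subcmp, mbd, trellis, g and bf are declared on AVCodecContext itself (libavcodec's options_table.h maps the command line names onto the struct fields), so they are set on an allocated context rather than on priv_data, which only exposes encoder-private options; note also that the context in the snippet above is never allocated. The encoder choice AV_CODEC_ID_MPEG4 is an arbitrary assumption for illustration.

     #include <libavcodec/avcodec.h>
     #include <libavutil/opt.h>

     /* Sketch only: allocate a codec context and apply command-line-style options to it. */
     static AVCodecContext *make_encoder_ctx(void) {
       const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MPEG4); /* arbitrary encoder choice */
       if(codec==NULL)
         return NULL;

       AVCodecContext *c = avcodec_alloc_context3(codec); /* must be allocated before setting options */
       if(c==NULL)
         return NULL;

       /* Same names as on the command line; AV_OPT_SEARCH_CHILDREN makes the same
          call also search priv_data, so encoder-private options can be set this way too. */
       av_opt_set(c, "mbd", "rd", AV_OPT_SEARCH_CHILDREN);
       av_opt_set_int(c, "trellis", 2, AV_OPT_SEARCH_CHILDREN);
       av_opt_set_int(c, "cmp", 2, AV_OPT_SEARCH_CHILDREN);
       av_opt_set_int(c, "subcmp", 2, AV_OPT_SEARCH_CHILDREN);
       av_opt_set_int(c, "g", 100, AV_OPT_SEARCH_CHILDREN);
       av_opt_set_int(c, "bf", 2, AV_OPT_SEARCH_CHILDREN);
       return c;
     }

    With this, the priv_data call from the question is unnecessary for these particular options; priv_data remains the right target for options that belong to one specific encoder, such as libx264's preset.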