Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (97)

  • Customise by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Authorisations overridden by plugins

    27 April 2010

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • Support for all media types

    10 April 2011

    Unlike many other software packages and modern document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

On other sites (5422)

  • Unable to mux two live streams from webcam using ffmpeg in linux

    24 October 2013, by user2902884

    We are trying to multiplex two live streams from webcams into an output file using ffmpeg on Linux, as follows:

    ffmpeg -i "http://10.41.2.57:8090/webcam.asf" -i "http://10.41.2.49:8090/webcam.asf" -map 0:0 -map 1:0 -t 60 "/home/../MuxLiveStream1.flv"

    ffmpeg stops responding after a while, and the console shows the following output:

    FFmpeg version SVN-r0.5.9-4:0.5.9-0ubuntu0.10.04.3, Copyright (c) 2000-2009 Fabrice Bellard, et al.
     configuration: --extra-version=4:0.5.9-0ubuntu0.10.04.3 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libgsm --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --disable-stripping --disable-vhook --enable-runtime-cpudetect --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libdc1394 --enable-shared --disable-static
     libavutil     49.15. 0 / 49.15. 0
     libavcodec    52.20. 1 / 52.20. 1
     libavformat   52.31. 0 / 52.31. 0
     libavdevice   52. 1. 0 / 52. 1. 0
     libavfilter    0. 4. 0 /  0. 4. 0
     libswscale     0. 7. 1 /  0. 7. 1
     libpostproc   51. 2. 0 / 51. 2. 0
     built on Jan 24 2013 19:42:59, gcc: 4.4.3

    Seems stream 0 codec frame rate differs from container frame rate: 1000.00 (1000/1) -> 2.00 (2/1)
    Input #0, flv, from 'http:10.41.2.154:8090/webcam.flv':
     Duration: 00:00:00.00, start: 1587016.579000, bitrate: 200 kb/s
       Stream #0.0: Video: flv, yuv420p, 320x240, 200 kb/s, 2 tbr, 1k tbn, 1k tbc

    Seems stream 0 codec frame rate differs from container frame rate: 1000.00 (1000/1) -> 2.00 (2/1)
    Input #1, flv, from 'http:10.41.2.57:8090/webcam.flv':
     Duration: 00:00:00.00, start: 1587230.806000, bitrate: 200 kb/s
       Stream #1.0: Video: flv, yuv420p, 320x240, 200 kb/s, 2 tbr, 1k tbn, 1k tbc
    Number of stream maps must match number of output streams

    Is there a mistake in the command, or is there anything else that needs to be done?
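
    The last line of the log, "Number of stream maps must match number of output streams", points at the immediate problem: with -map 0:0 -map 1:0 the output would need two video streams, and the FLV container only carries a single video stream. As a rough sketch (not a confirmed fix for this setup), assuming a newer ffmpeg build than the 0.5 shown in the log and assuming the goal is to show both feeds side by side, the two inputs can be combined into one video stream with -filter_complex before muxing:

    ffmpeg -i "http://10.41.2.57:8090/webcam.asf" -i "http://10.41.2.49:8090/webcam.asf" \
        -filter_complex "[0:v]pad=2*iw:ih[left];[left][1:v]overlay=W/2:0[v]" \
        -map "[v]" -t 60 "/home/../MuxLiveStream1.flv"

    If the two feeds should instead stay separate streams, they can be mapped as-is into a container that allows several video tracks, for example Matroska, rather than FLV.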

  • PHP and FFmpeg : works on terminal but ffmpeg error when in shell_exec

    25 January 2014, by user2128983

    I'm in front of a problem I can't solve !

    I try to convert videos using the ffmpeg tool. When I paste the command on the server terminal (ssh root), it works fine. But, when I try in my PHP script using shell_exec, it returns the error "Error while opening encoder ...".

    Here is the command I use:

    ffmpeg -y -i VIDEOIN.mov -preset medium -strict -2 -c:v libx264 -b:v 1000k -pass 1 -f mp4 VIDEOOUT.mp4 null 1>logfile.txt 2>&1 &

    And my PHP script:

    $dir = 'videos/'.$uuid;
    $out = $dir.'video.mp4';
    $log = $dir.'transcoding_log.txt';
    $preset = 'medium';
    $bitrate = 1000;

    $cmd = "ffmpeg -y -i ".$in." -preset ".$preset." -strict -2 -c:v libx264 -b:v ".$bitrate."k -pass 1 -f mp4 ".$out." null 1>".$log." 2>&1 &";
    @file_put_contents($dir.'cmd.log', $cmd); // to debug
    shell_exec($cmd);

    Here is the log file created by the PHP script:

    ffmpeg version 1.0.8 Copyright (c) 2000-2013 the FFmpeg developers
    built on Sep 12 2013 11:57:09 with gcc 4.7 (Debian 4.7.2-5)
    configuration: --prefix=/usr --extra-cflags='-g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security ' --extra-ldflags='-Wl,-z,relro' --cc='ccache cc' --enable-shared --enable-libmp3lame --enable-gpl --enable-nonfree --enable-libvorbis --enable-pthreads --enable-libfaac --enable-libxvid --enable-postproc --enable-x11grab --enable-libgsm --enable-libtheora --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libx264 --enable-libspeex --enable-nonfree --disable-stripping --enable-libvpx --enable-libschroedinger --disable-encoder=libschroedinger --enable-version3 --enable-libopenjpeg --enable-librtmp --enable-avfilter --enable-libfreetype --enable-libvo-aacenc --disable-decoder=amrnb --enable-libvo-amrwbenc --enable-libaacplus --libdir=/usr/lib/x86_64-linux-gnu --disable-vda --enable-libbluray --enable-libcdio --enable-gnutls --enable-frei0r --enable-openssl --enable-libass --enable-libopus --enable-fontconfig --enable-libfdk-aac --enable-libdc1394 --disable-altivec --dis
     libavutil      51. 73.101 / 51. 73.101
     libavcodec     54. 59.100 / 54. 59.100
     libavformat    54. 29.104 / 54. 29.104
     libavdevice    54.  2.101 / 54.  2.101
     libavfilter     3. 17.100 /  3. 17.100
     libswscale      2.  1.101 /  2.  1.101
     libswresample   0. 15.100 /  0. 15.100
     libpostproc    52.  0.100 / 52.  0.100
    Input #0, matroska,webm, from 'videos/18352678af96ecb5/origin.webm':
     Metadata:
       creation_time   : 2010-05-20 08:21:12
     Duration: 00:00:32.48, start: 0.000000, bitrate: 533 kb/s
       Stream #0:0(eng): Video: vp8, yuv420p, 640x360, SAR 1:1 DAR 16:9, 25 fps, 25 tbr, 1k tbn, 1k tbc (default)
       Stream #0:1(eng): Audio: vorbis, 44100 Hz, mono, s16 (default)
    [libx264 @ 0x19c05a0] using SAR=1/1
    [libx264 @ 0x19c05a0] using cpu capabilities: MMX2 SSE2Fast SSEMisalign LZCNT
    [libx264 @ 0x19c05a0] ratecontrol_init: can't open stats file
    Output #0, mp4, to 'videos/18352678af96ecb5/video.mp4':
       Stream #0:0(eng): Video: h264, yuv420p, 640x360 [SAR 1:1 DAR 16:9], q=-1--1, pass 1,     90k tbn, 25 tbc (default)
       Stream #0:1(eng): Audio: none, 44100 Hz, mono, s16, 128 kb/s (default)
    Stream mapping:
     Stream #0:0 -> #0:0 (vp8 -> libx264)
     Stream #0:1 -> #0:1 (vorbis -> libfaac)
    Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as     bit_rate, rate, width or height

    EDIT 1:

    I compiled ffmpeg from the latest revision, but the problem still occurs.
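
    One line in the log suggests a possible cause (a hedged reading, not a verified diagnosis): just before the encoder error, libx264 reports "ratecontrol_init: can't open stats file". With -pass 1, x264 has to write its statistics file, by default ffmpeg2pass-0.log in the current working directory, and a command launched through shell_exec runs as the web server user, often in a directory that user cannot write to. A minimal sketch of the same command with the stats file sent to an explicitly writable location via the standard -passlogfile option (the path is a placeholder):

    ffmpeg -y -i VIDEOIN.mov -preset medium -strict -2 -c:v libx264 -b:v 1000k \
        -pass 1 -passlogfile /path/to/writable/ffmpeg2pass \
        -f mp4 VIDEOOUT.mp4 1>logfile.txt 2>&1

    In a typical two-pass workflow the first pass is also written to /dev/null rather than to the final file, with the real VIDEOOUT.mp4 produced by a second run using -pass 2.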

  • The C++11 Thread Timer is not working

    26 August 2017, by Gathros

    I’m trying to make a video player using SDL2 and the FFmpeg API. The video is being decoded and I can display an image on screen. I can also play audio, but I’m not doing that here (I know it works; I’ve tried it).

    My problem is that I can’t update the image when it should be updated. I’m able to get the timestamps and work out the delay, then send it to a thread that should request a window update once that time has elapsed. But all that happens is that the images flash on the screen with no delay. I have even set the delay to 1 second and the images still flash, after 1 second of a blank window.

    Here is my code:

    extern "C"{
       //FFmpeg libraries
       #include <libavcodec></libavcodec>avcodec.h>
       #include <libavformat></libavformat>avformat.h>
       #include <libswscale></libswscale>swscale.h>

       //SDL2 libraries
       #include <sdl2></sdl2>SDL.h>
    }
    // compatibility with newer API
    #if LIBAVCODEC_VERSION_INT &lt; AV_VERSION_INT(55,28,1)
    #define av_frame_alloc avcodec_alloc_frame
    #define av_frame_free avcodec_free_frame
    #endif

    //C++ libraries
    #include <cstdio>
    #include <chrono>
    #include <thread>
    #include <atomic>
    #include <mutex>
    #include

    typedef struct PacketQueue {
       AVPacketList                *first_pkt, *last_pkt;
       std::mutex                  mutex;
       std::condition_variable     convar;
    } PacketQueue;

    std::atomic<bool>           quitting, decoded;
    std::atomic<int64_t>        delay; // delay between window updates, read as microseconds by the timer thread
    Uint32                      Update_Window;

    int packet_queue_put(PacketQueue *q, AVPacket *pkt){
       AVPacketList *pkt1;
       if(av_dup_packet(pkt) < 0){
           return -1;
       }
       pkt1 = (AVPacketList*) av_malloc(sizeof(AVPacketList));
       if(!pkt1){
           return -1;
       }
       pkt1->pkt = *pkt;
       pkt1->next = NULL;

       std::lock_guard<std::mutex> lock(q->mutex);

       if (!q->last_pkt){
           q->first_pkt = pkt1;
       }else{
           q->last_pkt->next = pkt1;
       }
       q->last_pkt = pkt1;
       q->convar.notify_all();
       return 0;
    }

    static int packet_queue_get(PacketQueue *q, AVPacket *pkt, int block){
       AVPacketList *pkt1;
       int ret;

       std::unique_lock<std::mutex> lk(q->mutex);
       while(1){
           if(quitting){
               ret = -1;
               break;
           }

           pkt1 = q->first_pkt;
           if(pkt1){
               q->first_pkt = pkt1->next;
               if(!q->first_pkt){
                   q->last_pkt = NULL;
               }
               *pkt = pkt1->pkt;
               av_free(pkt1);
               ret = 1;
               break;
           }else if(decoded){
               ret = 0;
               quitting = true;
               break;
           }else if(block){
               q->convar.wait_for(lk, std::chrono::microseconds(50));
           }else {
               ret = 0;
               break;
           }
       }
       return ret;
    }

    void UpdateEventQueue(){
       SDL_Event event;
       SDL_zero(event);
       event.type = Update_Window;
       SDL_PushEvent(&amp;event);
    }

    void VideoTimerThreadFunc(){
       UpdateEventQueue();

       while(!quitting){
           if(delay == 0){
               std::this_thread::sleep_for(std::chrono::milliseconds(1));
           }else {
               std::this_thread::sleep_for(std::chrono::microseconds(delay));
               UpdateEventQueue();
           }
       }
    }

    int main(int argc, char *argv[]){
       AVFormatContext*                FormatCtx = nullptr;
       AVCodecContext*                 CodecCtxOrig = nullptr;
       AVCodecContext*                 CodecCtx = nullptr;
       AVCodec*                        Codec = nullptr;
       int                             videoStream;
       AVFrame*                        Frame = nullptr;
       AVPacket                        packet;
       struct SwsContext*              SwsCtx = nullptr;

       PacketQueue                     videoq;
       int                             frameFinished;
       int64_t                         last_pts = 0;
       const AVRational                ms = {1, 1000};

       SDL_Event                       event;
       SDL_Window*                     screen;
       SDL_Renderer*                   renderer;
       SDL_Texture*                    texture;
       std::shared_ptr<Uint8>          yPlane, uPlane, vPlane;
       int                             uvPitch;

       if (argc != 2) {
           fprintf(stderr, "Usage: %s <file>\n", argv[0]);
           return -1;
       }

       // Register all formats and codecs
       av_register_all();

       // Initialise SDL2
       if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER)) {
           fprintf(stderr, "Couldn't initialise SDL - %s\n", SDL_GetError());
           return -1;
       }

       // Setting things up
       quitting = false;
       decoded = false;
       delay = 0;
       Update_Window = SDL_RegisterEvents(1);
       memset(&amp;videoq, 0, sizeof(PacketQueue));

       // Open video file
       if(avformat_open_input(&amp;FormatCtx, argv[1], NULL, NULL) != 0){
           fprintf(stderr, "Couldn't open file\n");        
           return -1; // Couldn't open file
       }

       // Retrieve stream information
       if(avformat_find_stream_info(FormatCtx, NULL) < 0){
           fprintf(stderr, "Couldn't find stream information\n");

           // Close the video file
           avformat_close_input(&FormatCtx);

           return -1; // Couldn't find stream information
       }

       // Find the video stream
       videoStream = av_find_best_stream(FormatCtx, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
       if(videoStream &lt; 0){
           fprintf(stderr, "Couldn't find video stream\n");

           // Close the video file
           avformat_close_input(&amp;FormatCtx);

           return -1; // Didn't find a video stream
       }

       // Get a pointer to the codec context for the video stream
       CodecCtxOrig = FormatCtx->streams[videoStream]->codec;

       // Find the decoder for the video stream
       Codec = avcodec_find_decoder(CodecCtxOrig->codec_id);
       if(Codec == NULL){
           fprintf(stderr, "Unsupported codec\n");

           // Close the codec
           avcodec_close(CodecCtxOrig);

           // Close the video file
           avformat_close_input(&amp;FormatCtx);

           return -1; // Codec not found
       }

       // Copy context
       CodecCtx = avcodec_alloc_context3(Codec);
       if(avcodec_copy_context(CodecCtx, CodecCtxOrig) != 0){
           fprintf(stderr, "Couldn't copy codec context");

           // Close the codec
           avcodec_close(CodecCtxOrig);

           // Close the video file
           avformat_close_input(&amp;FormatCtx);

           return -1; // Error copying codec context
       }

       // Open codec
       if(avcodec_open2(CodecCtx, Codec, NULL) < 0){
           fprintf(stderr, "Couldn't open codec\n");

           // Close the codec
           avcodec_close(CodecCtx);
           avcodec_close(CodecCtxOrig);

           // Close the video file
           avformat_close_input(&amp;FormatCtx);
           return -1; // Could not open codec
       }

       // Allocate video frame
       Frame = av_frame_alloc();

       // Make a screen to put our video
       screen = SDL_CreateWindow("Video Player", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, CodecCtx->width, CodecCtx->height, 0);
       if(!screen){
           fprintf(stderr, "SDL: could not create window - exiting\n");
           quitting = true;

           // Clean up SDL2
           SDL_Quit();

           // Free the YUV frame
           av_frame_free(&amp;Frame);

           // Close the codec
           avcodec_close(CodecCtx);
           avcodec_close(CodecCtxOrig);

           // Close the video file
           avformat_close_input(&amp;FormatCtx);

           return -1;
       }

       renderer = SDL_CreateRenderer(screen, -1, 0);
       if(!renderer){
           fprintf(stderr, "SDL: could not create renderer - exiting\n");
           quitting = true;

           // Clean up SDL2
           SDL_DestroyWindow(screen);
           SDL_Quit();

           // Free the YUV frame
           av_frame_free(&amp;Frame);

           // Close the codec
           avcodec_close(CodecCtx);
           avcodec_close(CodecCtxOrig);

           // Close the video file
           avformat_close_input(&amp;FormatCtx);
           return -1;
       }

       // Allocate a place to put our YUV image on that screen
       texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_YV12, SDL_TEXTUREACCESS_STREAMING, CodecCtx->width, CodecCtx->height);
       if(!texture){
           fprintf(stderr, "SDL: could not create texture - exiting\n");
           quitting = true;

           // Clean up SDL2
           SDL_DestroyRenderer(renderer);
           SDL_DestroyWindow(screen);
           SDL_Quit();

           // Free the YUV frame
           av_frame_free(&amp;Frame);

           // Close the codec
           avcodec_close(CodecCtx);
           avcodec_close(CodecCtxOrig);

           // Close the video file
           avformat_close_input(&amp;FormatCtx);
           return -1;
       }

       // Initialise SWS context for software scaling
       SwsCtx = sws_getContext(CodecCtx->width, CodecCtx->height, CodecCtx->pix_fmt,
                   CodecCtx->width, CodecCtx->height, PIX_FMT_YUV420P, SWS_BILINEAR, NULL, NULL, NULL);
       if(!SwsCtx){
           fprintf(stderr, "Couldn't create sws context\n");
           quitting = true;

           // Clean up SDL2
           SDL_DestroyTexture(texture);
           SDL_DestroyRenderer(renderer);
           SDL_DestroyWindow(screen);
           SDL_Quit();

           // Free the YUV frame
           av_frame_free(&amp;Frame);

           // Close the codec
           avcodec_close(CodecCtx);
           avcodec_close(CodecCtxOrig);

           // Close the video file
           avformat_close_input(&amp;FormatCtx);
           return -1;
       }

       // set up YV12 pixel array (12 bits per pixel)
       yPlane = std::shared_ptr<Uint8>((Uint8 *)::operator new (CodecCtx->width * CodecCtx->height, std::nothrow));
       uPlane = std::shared_ptr<Uint8>((Uint8 *)::operator new (CodecCtx->width * CodecCtx->height / 4, std::nothrow));
       vPlane = std::shared_ptr<Uint8>((Uint8 *)::operator new (CodecCtx->width * CodecCtx->height / 4, std::nothrow));
       uvPitch = CodecCtx->width / 2;

       if (!yPlane || !uPlane || !vPlane) {
           fprintf(stderr, "Could not allocate pixel buffers - exiting\n");
           quitting = true;

           // Clean up SDL2
           SDL_DestroyTexture(texture);
           SDL_DestroyRenderer(renderer);
           SDL_DestroyWindow(screen);
           SDL_Quit();

           // Free the YUV frame
           av_frame_free(&amp;Frame);

           // Close the codec
           avcodec_close(CodecCtx);
           avcodec_close(CodecCtxOrig);

           // Close the video file
           avformat_close_input(&amp;FormatCtx);
           return -1;
       }

       std::thread VideoTimer (VideoTimerThreadFunc);

       while (!quitting) {
           // Check for more packets
           if(av_read_frame(FormatCtx, &packet) >= 0){
               // Check what stream it belongs to
               if (packet.stream_index == videoStream) {
                   packet_queue_put(&videoq, &packet);
               }else{
                   // Free the packet that was allocated by av_read_frame
                   av_free_packet(&packet);
               }
           }else {
               decoded = true;
           }

           SDL_PollEvent(&amp;event);

           if(event.type == Update_Window){
               // Getting packet
               if(packet_queue_get(&amp;videoq, &amp;packet, 0)){
                   // Decode video frame
                   avcodec_decode_video2(CodecCtx, Frame, &amp;frameFinished, &amp;packet);

                   // Did we get a video frame?
                   if (frameFinished) {
                       AVPicture pict;
                       pict.data[0] = yPlane.get();
                       pict.data[1] = uPlane.get();
                       pict.data[2] = vPlane.get();
                       pict.linesize[0] = CodecCtx->width;
                       pict.linesize[1] = uvPitch;
                       pict.linesize[2] = uvPitch;

                       // Convert the image into YUV format that SDL uses
                       sws_scale(SwsCtx, (uint8_t const * const *) Frame->data, Frame->linesize, 0, CodecCtx->height, pict.data, pict.linesize);

                       SDL_UpdateYUVTexture(texture, NULL, yPlane.get(), CodecCtx->width, uPlane.get(), uvPitch, vPlane.get(), uvPitch);

                       SDL_RenderClear(renderer);
                       SDL_RenderCopy(renderer, texture, NULL, NULL);
                       SDL_RenderPresent(renderer);

                       // Calculating delay
                       delay = av_rescale_q(packet.dts, CodecCtx->time_base, ms) - last_pts;
                       last_pts = av_rescale_q(packet.dts, CodecCtx->time_base, ms);
                   }else{
                       //UpdateEventQueue();
                       delay = 1;
                   }

                   // Free the packet that was allocated by av_read_frame
                   av_free_packet(&amp;packet);

               }else{
                   //UpdateEventQueue();
               }
           }

           switch (event.type) {
               case SDL_QUIT:
                   quitting = true;
                   break;

               default:
                   break;
           }
       }

       VideoTimer.join();

       //SDL2 clean up
       SDL_DestroyTexture(texture);
       SDL_DestroyRenderer(renderer);
       SDL_DestroyWindow(screen);
       SDL_Quit();

       // Free the YUV frame
       av_frame_free(&amp;Frame);

       // Free Sws
       sws_freeContext(SwsCtx);

       // Close the codec
       avcodec_close(CodecCtx);
       avcodec_close(CodecCtxOrig);

       // Close the video file
       avformat_close_input(&amp;FormatCtx);

       return 0;
    }
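    Two details in the listing may explain the flashing described above (a hedged reading of the posted code, not a confirmed diagnosis). First, SDL_PollEvent() returns 0 and leaves event untouched when the queue is empty, so a stale Update_Window event keeps re-triggering the render path on every pass of the loop. Second, the delay is computed with CodecCtx->time_base and rescaled to milliseconds ({1, 1000}), while packet timestamps are expressed in the stream's time_base and the timer thread consumes the value as std::chrono::microseconds. A minimal sketch of the adjusted event handling in the main loop, reusing the names from the code above (us and stream_tb are new names introduced for illustration and would be declared before the while (!quitting) loop):

    // Rescale packet timestamps from the stream's time base to microseconds,
    // and only react to events actually returned by SDL_PollEvent().
    const AVRational us = {1, 1000000};
    const AVRational stream_tb = FormatCtx->streams[videoStream]->time_base;

    while (SDL_PollEvent(&event)) {
        if (event.type == Update_Window) {
            // ... decode and render exactly as in the original loop ...
            delay = av_rescale_q(packet.dts, stream_tb, us) - last_pts;
            last_pts = av_rescale_q(packet.dts, stream_tb, us);
        } else if (event.type == SDL_QUIT) {
            quitting = true;
        }
    }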