
Media (1)

Keyword: - Tags -/musée

Other articles (70)

  • Improving the base version

    13 September 2013

    A nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the following two images to compare.
    To do this, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Emballe médias: what is it for?

    4 February 2011

    This plugin is designed to manage sites that publish documents of all types.
    It creates "media" items; that is, a "media" item is a SPIP article created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a so-called "media" article;

  • Managing the farm

    2 March 2010

    The farm as a whole is managed by "super admins".
    Certain settings can be adjusted to balance the needs of the different channels.
    To begin with, it uses the "Gestion de mutualisation" plugin

On other sites (8595)

  • C++ RTSP Stream recording timebase wrong ffmpeg

    29 July 2021, by Tom Baires

    We fetch a camera stream over RTSP (Axis Q1755, H.264), and after a random amount of time we start recording the stream to a file. When I check the resulting file with ffprobe, I realise that the start time is not the start time of the recording but the time elapsed since we started fetching the stream. My video is 5 seconds long and its start time should be 0, but the actual start time is at second 20. This leads to an incorrect timebase. (FFmpeg version 3.3.3)

    I tried to repair it, but now I sometimes get the error about timebase 1/180000... I marked my changes with the comment (timebase repair logic).

    (image: timebase of the video)

    ffprobe output (without the timebase repair logic). The start is greater than the duration; it should be 0, not 20.080000:

Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'C:\testvideo.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf57.71.100
  Duration: 00:00:05.04, start: 20.080000, bitrate: 2675 kb/s
    Stream #0:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuvj420p(pc, bt709), 800x450 [SAR 1:1 DAR 16:9], 2670 kb/s, 50.20 fps, 50 tbr, 180k tbn, 360k tbc (default)
    Metadata:
      handler_name    : VideoHandler

    Error after adding the timebase repair logic:

[mpeg4 @ 00000178fcb12b40] timebase 1/180000 not supported by MPEG 4
standard, the maximum admitted value for the timebase denominator is 65535
Could not open codec 'mpeg4': Unspecified error
Cannot start recording

    Here are some parts of my code.

    Start logic:

    bool RecordingStreamGrabber::start()
{
    CORE_LOG_INFO(m_logger, "Started RecordingStreamGrabber");
    if (m_thread == NULL)
    {
        if (this->prepareInputStream())
        {
            m_run = true;
            m_thread = new std::thread(RecordingStreamGrabber::run, this);
            return true;
        }
        CORE_LOG_ERROR(m_logger, "Error starting RecordingStreamGrabber");
        return false;
    }
    return false;
}


    


    Prepare input/output stream:

    bool RecordingStreamGrabber::prepareInputStream()
{
    CORE_LOG_INFO(m_logger, "Preparing Inputstream for recording: " << m_url);

    m_ifmtctx = avformat_alloc_context();
    interrupt_recording_nostop = false;
    interrupt_recording_timeout = m_timeout; // Timeout in milliseconds
    interrupt_recording_starttime = GetTickCount();
    m_ifmtctx->interrupt_callback = interrupt_timeout_cb;

    if (avformat_open_input(&m_ifmtctx, m_url.c_str(), NULL, NULL) != 0)
    {
        m_ifmtctx = NULL;
        CORE_LOG_ERROR(m_logger, "Error opening recording URL: " << m_url);
        return false;
    }

    interrupt_recording_nostop = true;

    if (avformat_find_stream_info(m_ifmtctx, NULL) < 0)
    {
        CORE_LOG_ERROR(m_logger, "Error finding stream in URL: " << m_url);
        avformat_close_input(&m_ifmtctx);
        m_ifmtctx = NULL;
        return false;
    }

    //search for the first video stream
    m_stream_index = -1;
    for (unsigned int i = 0; i < m_ifmtctx->nb_streams && m_stream_index == -1; i++)
    {
        m_iccx = m_ifmtctx->streams[i]->codec;
        if (m_iccx->codec_type == AVMEDIA_TYPE_VIDEO)
        {
            m_istream = m_ifmtctx->streams[i];
            m_stream_index = i;
        }
    }

    if (m_stream_index == -1)
    {
        CORE_LOG_ERROR(m_logger, "Could not find video stream in URL: " << m_url);
        avformat_close_input(&m_ifmtctx);
        m_ifmtctx = NULL;
        return false;
    }

    return true;
}

bool RecordingStreamGrabber::prepareOutputStream()
{
    if (m_ofmtctx)
    {
        CORE_LOG_DEBUG(m_logger, "Close outputfile: " << m_targetfile);
        avformat_free_context(m_ofmtctx);
        m_ofmtctx = NULL;
    }

    m_ofmt = av_guess_format(NULL, m_targetfile.c_str(), NULL);

    m_ofmtctx = avformat_alloc_context();

    m_ofmtctx->oformat = m_ofmt;

    if (avio_open2(&m_ofmtctx->pb, m_targetfile.c_str(), AVIO_FLAG_WRITE, NULL, NULL) != 0)
    {
        avformat_free_context(m_ofmtctx);
        m_ofmtctx = NULL;
        CORE_LOG_ERROR(m_logger, "Error opening outputfile: " << m_targetfile);
        return false;
    }

    m_ostream = avformat_new_stream(m_ofmtctx, NULL);

    avcodec_copy_context(m_ostream->codec, m_iccx);

    m_ostream->sample_aspect_ratio.num = m_iccx->sample_aspect_ratio.num;
    m_ostream->sample_aspect_ratio.den = m_iccx->sample_aspect_ratio.den;

    /* time base: this is the fundamental unit of time (in seconds) in terms
    of which frame timestamps are represented. for fixed-fps content,
    timebase should be 1/framerate and timestamp increments should be
    identically 1. */
    m_ostream->time_base.num = m_iccx->time_base.num;
    m_ostream->time_base.den = m_iccx->time_base.den;

    avformat_write_header(m_ofmtctx, NULL);

#ifdef WIN32
    sprintf_s(m_ofmtctx->filename, sizeof(m_ofmtctx->filename), "%s", m_targetfile.c_str());
#else
    snprintf(m_ofmtctx->filename, sizeof(m_ofmtctx->filename), "%s", m_targetfile.c_str());
#endif
    return true;
}
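
    A possible way around the "timebase 1/180000 not supported by MPEG 4" error is to give the output stream (and its codec context) a timebase whose denominator stays within the MPEG-4 limit of 65535 instead of copying the 180k RTSP timebase verbatim. The following is only a sketch of that idea against the old FFmpeg 3.x API used above; deriving the timebase from the input stream's average frame rate is my assumption, not something from the original code.

// Sketch only: pick an output timebase that the MPEG-4 standard accepts
// (denominator <= 65535), e.g. 1/framerate, instead of the RTSP 1/180000.
// Uses the same members as prepareOutputStream() above.
AVRational src_tb     = m_iccx->time_base;          // may be 1/180000 for RTSP H.264
AVRational frame_rate = m_istream->avg_frame_rate;  // e.g. 50/1

if (src_tb.den > 65535 && frame_rate.num > 0)
{
    m_ostream->time_base        = av_inv_q(frame_rate);    // 1/fps
    m_ostream->codec->time_base = m_ostream->time_base;    // old pre-codecpar field
}
else
{
    m_ostream->time_base = src_tb;
}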


    


    Recording logic:

    void RecordingStreamGrabber::run(RecordingStreamGrabber *_this)
{
    AVPacket packet;
    av_init_packet(&packet);
    int i = 0;
    bool startFrame = true;
    bool keyFrame = false;
    int64_t pts, dts;
    _this->m_tailWritten = true;
    while (_this->m_run)
    {
        if (av_read_frame(_this->m_ifmtctx, &packet) >= 0)
        {
            if ((packet.flags & AV_PKT_FLAG_KEY) == AV_PKT_FLAG_KEY)
            {
                keyFrame = true;
                CORE_LOG_DEBUG(_this->m_logger, "Detected key frame: " << i << "");
            }

            if (_this->m_record)
            {
                if (packet.stream_index == _this->m_stream_index)
                {
                    packet.stream_index = _this->m_ostream->id;

                    if (_this->m_tailWritten == false || keyFrame == true)
                    {
                        //#####################################
                        //timebase repair logic (my changes)
                        //#####################################
                        AVStream *in_stream;
                        AVStream *out_stream;

                        in_stream = _this->m_ifmtctx->streams[packet.stream_index];
                        out_stream = _this->m_ofmtctx->streams[packet.stream_index];

                        if (startFrame)
                        {
                            pts = packet.pts;
                            dts = packet.dts;
                            startFrame = false;
                        }
                        packet.pts -= pts;
                        packet.dts -= dts;

                        packet.pts = av_rescale_q_rnd(packet.pts, in_stream->time_base, out_stream->time_base, (AVRounding)((int)AV_ROUND_NEAR_INF | (int)AV_ROUND_PASS_MINMAX));
                        packet.dts = av_rescale_q_rnd(packet.dts, in_stream->time_base, out_stream->time_base, (AVRounding)((int)AV_ROUND_NEAR_INF | (int)AV_ROUND_PASS_MINMAX));

                        packet.duration = av_rescale_q(packet.duration, in_stream->time_base, out_stream->time_base);
                        //end of my changes
                        
                        av_interleaved_write_frame(_this->m_ofmtctx, &packet);
                        _this->m_tailWritten = false;
                    }
                }
            }
            else if (_this->m_ofmtctx)
            {
                if (_this->m_tailWritten == false)
                {
                    av_write_trailer(_this->m_ofmtctx);
                    avio_close(_this->m_ofmtctx->pb);
                }
                avformat_free_context(_this->m_ofmtctx);
                _this->m_tailWritten = true;
                _this->m_ofmtctx = NULL;
                startFrame = true;
            }
            i++;
            keyFrame = false;
        }
        av_free_packet(&packet);
        av_init_packet(&packet);
    }
    if (_this->m_record)
    {
        av_write_trailer(_this->m_ofmtctx);
        avio_close(_this->m_ofmtctx->pb);
    }
    if (_this->m_ofmtctx)
    {
        avformat_free_context(_this->m_ofmtctx);
        _this->m_ofmtctx = NULL;
    }
    _this->m_record = false;

    avformat_close_input(&_this->m_ifmtctx);
    _this->m_ifmtctx = NULL;
}
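
    For reference, here is a minimal sketch of how the timestamp handling in the "timebase repair logic" block could be written so that the output starts at 0. Note that the original block looks the streams up with packet.stream_index after it has already been overwritten with the output stream id; the sketch below instead assumes the saved input video stream index (m_stream_index) for the input side and a single output stream at index 0, and it subtracts the first packet's timestamps after rescaling. This is only an illustration of the idea, not the poster's code.

// Sketch only (see assumptions above): rescale into the output timebase first,
// then shift so the first written packet starts at pts/dts 0.
AVStream *in_stream  = _this->m_ifmtctx->streams[_this->m_stream_index]; // input video stream
AVStream *out_stream = _this->m_ofmtctx->streams[0];                     // single output stream

packet.pts = av_rescale_q_rnd(packet.pts, in_stream->time_base, out_stream->time_base,
                              (AVRounding)((int)AV_ROUND_NEAR_INF | (int)AV_ROUND_PASS_MINMAX));
packet.dts = av_rescale_q_rnd(packet.dts, in_stream->time_base, out_stream->time_base,
                              (AVRounding)((int)AV_ROUND_NEAR_INF | (int)AV_ROUND_PASS_MINMAX));
packet.duration = av_rescale_q(packet.duration, in_stream->time_base, out_stream->time_base);

if (startFrame)            // remember the offset of the first recorded packet
{
    pts = packet.pts;
    dts = packet.dts;
    startFrame = false;
}
packet.pts -= pts;         // shift so the recording starts at 0
packet.dts -= dts;

av_interleaved_write_frame(_this->m_ofmtctx, &packet);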


    


    Another question: how can I check whether my project uses the deprecated API (regarding the answer from @berak)?
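
    One possible way to check, which is my assumption rather than something from the post: the deprecated parts of the FFmpeg headers are tagged with attribute_deprecated, so building with deprecation warnings enabled (the GCC/Clang default, -Wdeprecated-declarations) lists every call site that still uses the old API. For example:

// Hypothetical minimal example: compiled against FFmpeg 3.x/4.x headers, this emits
// deprecation warnings for AVStream::codec and av_free_packet, pointing at the exact
// lines that still rely on the old API.
extern "C" {
#include <libavformat/avformat.h>
}

void check_deprecated(AVFormatContext *ctx)
{
    AVCodecContext *cc = ctx->streams[0]->codec;  // warning: 'codec' is deprecated
    (void)cc;

    AVPacket pkt;
    av_init_packet(&pkt);
    av_free_packet(&pkt);                         // warning: 'av_free_packet' is deprecated
}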

    


  • Use FFmpeg to Save Live CCTV Video Streams that Have the Wrong FPS Encoded, Are Published as Video Clips instead of Frames, and Have Non-negligible Frame Loss

    6 March 2023, by Crear

    I want to use the FFmpeg command line to archive a live CCTV video stream (no audio) from the Newark Citizen Virtual Patrol (https://cvp.newarkpublicsafety.org) for traffic analysis. Previously (I'm just a noob with these commands) I was using
    os.system('ffmpeg -t 24:00:00 -i '+address+' -hide_banner -c:v copy -s 640x360 -segment_time 00:15:00 -f segment -strftime 1 -reset_timestamps 1 "'+OutPath+camera_location+'_%Y-%m-%d-%H-%M-%S.mp4"')
    to archive the videos every day and segment them into 15-minute-long chunks.

    


    However, there are several issues:

    1. The FPS read from the video stream is lower than the real frame rate. For example, the real rate is 12 but the decoded result says 8, so every time it generates a 15-minute-long video, only about 10 or 11 minutes have passed in the real world.

    2. Because of unstable frame loss, the FPS is not a stable value either. Therefore, when I manually set the FPS, the video usually ends up with the wrong length, and sometimes when the stream freezes it keeps waiting because the 15-minute-long video is not finished yet. Something I noticed is that it may generate a 15-minute-long video that contains both night and day, starting at perhaps 2 AM but ending at 8 AM.

    3. The live CCTV video stream is not delivered frame by frame but video clip by video clip. Therefore, when I set -use_wallclock_as_timestamps to 1, the video plays the short clip ultra-fast and then freezes for the rest of the time until the next clip arrives.
    


    The only thing I can think of is to re-distribute the frames evenly between the timestamp at which the current video clip was received and the timestamp at which the prior video clip was received. What parameters can help FFmpeg fix the FPS and archive correctly? I am using FFmpeg to save the video instead of using OpenCV to decode the frames and then encode a video, because we have a huge number of cameras and our legacy Xeon processor has trouble encoding so many frames simultaneously.
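
    For what it's worth, here is a rough sketch of that re-distribution idea written against the FFmpeg C API (my own illustration; the clip collection and the wallclock bookkeeping around it, e.g. recording av_gettime() when each clip arrives, are assumed and not part of the original post):

// Sketch of spreading one received clip's frames evenly across the wallclock interval
// [t_prev_us, t_now_us) before writing them, so the archive plays back in real time.
#include <vector>
extern "C" {
#include <libavformat/avformat.h>
}

static void spread_clip_timestamps(std::vector<AVPacket *> &clip,
                                   int64_t t_prev_us, int64_t t_now_us,
                                   AVStream *out_stream)
{
    if (clip.empty() || t_now_us <= t_prev_us)
        return;

    const AVRational microseconds = {1, 1000000};
    const int64_t step = (t_now_us - t_prev_us) / (int64_t)clip.size();

    for (size_t i = 0; i < clip.size(); ++i)
    {
        const int64_t ts_us = t_prev_us + (int64_t)i * step;     // evenly spaced
        clip[i]->pts      = av_rescale_q(ts_us, microseconds, out_stream->time_base);
        clip[i]->dts      = clip[i]->pts;                        // assumes no B-frames
        clip[i]->duration = av_rescale_q(step, microseconds, out_stream->time_base);
    }
}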

    


    Any suggestion is appreciated!

    


  • Use FFmpeg to rotate the video based on its metadata? Why does Android put the wrong metadata?

    22 July 2015, by Mauro Valvano

    I have a website where users upload videos.
    For my website I also have an Android application.
    The website creates a thumbnail of each uploaded video (whether it comes from the browser or from Android).

    The problem is that for normal videos everything is OK, but for Android my videos are rotated by 90°.

    I think Android has a bug, because when I inspect the video's metadata with FFmpeg, for a normally recorded video I get a rotate=90 value, and for a video rotated by 90° there is nothing in the metadata (and the thumbnail is correct).

    Why?

    This is an image of a normally recorded video (with the phone in portrait mode).

    (image: FFmpeg showing the rotate metadata of a normally recorded video)

    Anyway, since the metadata is in the video, can I create the thumbnail based on the metadata's rotate value, without first extracting the metadata and then using ffmpeg to rotate the video by that value?

    Thank you in advance.
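
    For reference, a minimal sketch (my own assumption, not from the post) of how the rotation could be read programmatically with the FFmpeg C API, from either the classic "rotate" metadata tag or the display-matrix side data, before deciding how to generate the thumbnail:

// Hypothetical helper: returns the rotation (in degrees) of the first video stream,
// from the "rotate" stream tag or, if absent, from the display-matrix side data.
extern "C" {
#include <libavformat/avformat.h>
#include <libavutil/display.h>
}
#include <cstdlib>

static double get_rotation(const char *path)
{
    AVFormatContext *fmt = NULL;
    double angle = 0.0;

    if (avformat_open_input(&fmt, path, NULL, NULL) != 0)
        return angle;

    if (avformat_find_stream_info(fmt, NULL) >= 0)
    {
        int idx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
        if (idx >= 0)
        {
            AVStream *st = fmt->streams[idx];
            AVDictionaryEntry *tag = av_dict_get(st->metadata, "rotate", NULL, 0);
            if (tag)
                angle = atof(tag->value);                    // e.g. "90"
            else
            {
                uint8_t *dm = av_stream_get_side_data(st, AV_PKT_DATA_DISPLAYMATRIX, NULL);
                if (dm)
                    angle = -av_display_rotation_get((const int32_t *)dm);
            }
        }
    }
    avformat_close_input(&fmt);
    return angle;
}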