Other articles (74)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in the standalone version.
    To get a working installation, all software dependencies must be installed manually on the server.
    If you want to use this archive for an installation in farm mode, you will also need to carry out other modifications (...)

  • Customizing by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

On other sites (12865)

  • Converting DAV to MP4 and OGG

    27 March 2015, by mackowiakp

    I want to prepare a web page containing footage from security camera recorders. Each recorder transmits video files in DAV format, so each film is converted to MP4 by a script using the following syntax:

    ffmpeg -y -i movie.dav -vcodec libx264 -crf 24 movie.mp4

    So I included the following entry in my HTML5 code:

    <video width="320" height="240">
     <source src="movie.mp4" type="video/mp4">
    </video>

    It works correctly in Chrome but not in Firefox. For Firefox it is necessary to add a link to an OGG file, so the correct HTML5 markup for both browsers should look like this:

    <video width="320" height="240">
     <source src="movie.mp4" type="video/mp4">
     <source src="movie.ogg" type="video/ogg">
    </video>

    Can anybody help me with the correct ffmpeg syntax to create the OGG file?
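
    For reference, an Ogg counterpart of the MP4 command above could look roughly like this; it is a sketch rather than something taken from the thread, relying on the libtheora and libvorbis encoders that the build shown below has enabled (-q:v sets Theora quality on a 0-10 scale, higher is better):

    ffmpeg -y -i movie.dav -vcodec libtheora -q:v 7 -acodec libvorbis movie.ogg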

    Console output after adding the -movflags +faststart option:

    [maciek@piotr MMM]$ ../ffmpeg-2.4.2-64bit-static/ffmpeg -movflags +faststart -y -i   04.24.23-04.24.38\[M\]\[@0\]\[0\].dav -vcodec libx264 -crf 24 10.mp4
    ffmpeg version 2.4.2-   http://johnvansickle.com/ffmpeg/    Copyright (c) 2000-2014 the FFmpeg developers
     built on Oct  9 2014 07:24:56 with gcc 4.8 (Debian 4.8.3-11)
     configuration: --enable-gpl --enable-version3 --disable-shared --disable-debug --enable-runtime-cpudetect --enable-libmp3lame --enable-libx264 --enable-libx265 --enable-libwebp --enable-libspeex --enable-libvorbis --enable-libvpx --enable-libfreetype --enable-fontconfig --enable-libxvid --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libtheora --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-gray --enable-libopenjpeg --enable-libopus --disable-ffserver --enable-libass --enable-gnutls --cc=gcc-4.8
     libavutil      54.  7.100 / 54.  7.100
     libavcodec     56.  1.100 / 56.  1.100
     libavformat    56.  4.101 / 56.  4.101
     libavdevice    56.  0.100 / 56.  0.100
     libavfilter     5.  1.100 /  5.  1.100
     libswscale      3.  0.100 /  3.  0.100
     libswresample   1.  1.100 /  1.  1.100
     libpostproc    53.  0.100 / 53.  0.100
    Option movflags not found.
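
    A probable explanation for the "Option movflags not found" message, not stated in the thread: ffmpeg applies an option to the file that follows it, so written before -i the flag is matched against the DAV demuxer, which has no movflags option; it belongs to the mov/mp4 muxer and should sit between the input and the output file, roughly:

    ffmpeg -y -i 04.24.23-04.24.38\[M\]\[@0\]\[0\].dav -vcodec libx264 -crf 24 -movflags +faststart 10.mp4
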
  • Cannot retrieve duration of webm file using ffmpeg

    10 June 2016, by maximus 69

    I am trying to retrieve the duration of a .webm video file that I recorded using the video.js plugin (https://github.com/collab-project/videojs-record).

    I am using ffmpeg to get the duration, but it always returns N/A:

    ffprobe version 3.0.2-static http://johnvansickle.com/ffmpeg/  Copyright (c) 2007-2016 the FFmpeg developers
     built with gcc 5.3.1 (Debian 5.3.1-16) 20160424
     configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --enable-libmp3lame --enable-libx264 --enable-libx265 --enable-libwebp --enable-libspeex --enable-libvorbis --enable-libvpx --enable-libfreetype --enable-fontconfig --enable-libxvid --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libtheora --enable-libvo-amrwbenc --enable-gray --enable-libopenjpeg --enable-libopus --enable-libass --enable-gnutls --enable-libvidstab --enable-libsoxr --enable-frei0r --enable-libfribidi --disable-indev=sndio --disable-outdev=sndio --enable-librtmp --enable-libmfx --enable-libzimg --cc=gcc
     libavutil      55. 17.103 / 55. 17.103
     libavcodec     57. 24.102 / 57. 24.102
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 31.100 /  6. 31.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, matroska,webm, from '/home/demoss/public_html/source/bALRt9I697PVQh4zglou.webm':
     Metadata:
       encoder         : Chrome
     Duration: N/A, start: 0.000000, bitrate: N/A
       Stream #0:0(eng): Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 30 fps, 30 tbr, 1k tbn, 1k tbc (default)
       Stream #0:1(eng): Audio: opus, 48000 Hz, mono, fltp (default)
    duration=N/A

    I don’t think the video is corrupted, as it plays in my VideoJS player without issues.

    Any pointers as to why it’s not returning the duration would be greatly appreciated. I wouldn’t mind using an alternative library either.

    Thanks
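
    A likely cause, not confirmed in the post: WebM files produced by Chrome’s MediaRecorder are written as a live stream, so the container never receives a duration (or cues) element, which is why ffprobe prints N/A. Remuxing with stream copy normally rewrites the header with the real duration; a sketch, where fixed.webm is just a hypothetical output name:

    # remux only, no re-encoding
    ffmpeg -i bALRt9I697PVQh4zglou.webm -c copy fixed.webm
    # the remuxed copy should now report its duration
    ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1 fixed.webm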

  • ffmpeg H264 Encode Frame at a time for network streaming

    4 February 2017, by Richard Harrow

    I’m working on a remote desktop application and would like to send encoded H264 packets over TCP, using ffmpeg for the encoding. However, I couldn’t find useful information for the particular case of encoding just one frame (already in YUV444) and getting the packet.

    I have several issues. The first was that

    avcodec_encode_video2

    was not blocking: I found that most of the time you get the "delayed" frames at the end. However, since this is real-time streaming, the solution was:

    av_opt_set(mCodecContext->priv_data, "tune", "zerolatency", 0);

    Now I get the frame, but there are several issues: it takes a while and, even worse, the result is a gray video with garbage pixels. My codec context configuration:

    m_pCodecCtx->bit_rate = 8000000;
    m_pCodecCtx->codec_id = AV_CODEC_ID_H264;
    m_pCodecCtx->codec_type = AVMEDIA_TYPE_VIDEO;
    m_pCodecCtx->width = 1920;
    m_pCodecCtx->height = 1080;
    m_pCodecCtx->pix_fmt = AV_PIX_FMT_YUV444P;
    m_pCodecCtx->time_base.num = 1;
    m_pCodecCtx->time_base.den = 25;
    m_pCodecCtx->gop_size = 1;
    m_pCodecCtx->keyint_min = 1;
    m_pCodecCtx->i_quant_factor = float(0.71);
    m_pCodecCtx->b_frame_strategy = 20;
    m_pCodecCtx->qcompress = (float)0.6;
    m_pCodecCtx->qmax = 51;
    m_pCodecCtx->qmin = 20;
    m_pCodecCtx->max_qdiff = 4;
    m_pCodecCtx->refs = 4;
    m_pCodecCtx->max_b_frames = 1;
    m_pCodecCtx->thread_count = 1;

    I would like to know how this could be done: how do I set the "I frames", and what would be optimal for a "one frame at a time" encoding? Also, I’m not concerned with quality right now; it just needs to be fast enough (under 16 ms).

    For the encoding part:

    nres = avcodec_encode_video2(m_pCodecCtx, &packet, m_pFrame, &framefinished);

    if (nres < 0) {
        qDebug() << "error encoding: " << nres << endl;
    }

    if (framefinished) {
        m_pFrame->pts++;
        ofstream vidout("video.h264", ios::app);
        if (vidout.good()) {
            vidout.write((const char*)&packet.data[0], packet.size);
        }
        vidout.close();

        av_packet_unref(&packet);
    }

    I’m not using a container, just a raw file; ffplay plays raw files if the packets are right, and that’s my main issue. I’m planning to send the packets over TCP and decode them on the client. Any help would be greatly appreciated.
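
    Below is a minimal sketch of the low-latency settings plus the per-frame keyframe request, reusing the names from the question (m_pCodecCtx, m_pFrame) and the same old avcodec_encode_video2 API; m_frameIndex is a hypothetical frame counter, and this is an illustration of the idea rather than a verified fix for the gray output:

    // Every frame becomes a keyframe; B-frames are disabled because they add
    // latency and conflict with the zerolatency tune.
    m_pCodecCtx->gop_size = 1;
    m_pCodecCtx->keyint_min = 1;
    m_pCodecCtx->max_b_frames = 0;
    av_opt_set(m_pCodecCtx->priv_data, "preset", "ultrafast", 0); // fastest x264 preset
    av_opt_set(m_pCodecCtx->priv_data, "tune", "zerolatency", 0); // no encoder-side frame delay

    // Per frame: a monotonically increasing pts and an explicit I-frame request.
    m_pFrame->pts = m_frameIndex++;            // hypothetical counter owned by the caller
    m_pFrame->pict_type = AV_PICTURE_TYPE_I;   // ask x264 to emit a keyframe for this input

    AVPacket packet;
    av_init_packet(&packet);
    packet.data = NULL;   // let the encoder allocate the output buffer
    packet.size = 0;

    int got_packet = 0;
    int nres = avcodec_encode_video2(m_pCodecCtx, &packet, m_pFrame, &got_packet);
    if (nres == 0 && got_packet) {
        // packet.data / packet.size is a complete Annex-B H.264 access unit,
        // ready to be written to the TCP socket (or to the raw .h264 file).
        av_packet_unref(&packet);
    }

    One more point worth checking: AV_PIX_FMT_YUV444P makes libx264 produce a High 4:4:4 stream, which not every decoder handles; AV_PIX_FMT_YUV420P is the more common choice for real-time streaming.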