Advanced search

Media (0)

Keyword: - Tags -/tags

No media matching your criteria is available on the site.

Other articles (99)

  • Improving the base version

    13 September 2013

    A nicer multiple select
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
    To use it, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • The plugin: mutualisation management

    2 March 2010

    The mutualisation management plugin makes it possible to manage the various MediaSPIP channels from a master site. Its purpose is to provide a pure SPIP solution to replace the old one.
    Basic installation
    Install the SPIP files on the server.
    Then add the "mutualisation" plugin at the root of the site, as described here.
    Customise the central mes_options.php file as you wish. As an example, here is the one used by the mediaspip.net platform:
    <?php (...)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

On other sites (11854)

  • FFMPEG MKV Causing Errors in DASH JS

    24 June 2018, by Mike

    I'm getting the following browser errors (in all browsers) using Dash JS when transcoding an MKV file:

    ERROR DOMException: Failed to read the 'buffered' property from 'SourceBuffer': This SourceBuffer has been removed from the parent media source.

    and...

    dash.all.min.js:26 Uncaught (in promise) DOMException: Failed to load because no supported source was found.

    What's weird is that I have no issues when I transcode an MP4 file. I'm using FFmpeg in conjunction with Bento4 to build MPEG-DASH and HLS files for my video player.
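
    For reference, the Bento4 step of that pipeline is wired roughly as follows; this is a sketch rather than my exact invocation, and the fragmented file name and output directory are placeholders:

    # fragment the transcoded MP4, then have Bento4 package it for MPEG-DASH (with HLS playlists)
    mp4fragment ffmpeg_1920_1080_3000.mp4 fragmented_1920_1080_3000.mp4
    mp4dash --hls -o dash_output fragmented_1920_1080_3000.mp4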

    To single out FFmpeg, I transcoded a video that gave me errors on my test server on my local machine (where it works) and then started the Bento4 process on that file. Doing that, I had no issues and everything played just fine.

    I have removed FFMPEG and reinstalled it multiple times and I always get the same result. I’m sure I screwed something up on my server, but for the life of me I can’t seem to figure out where to start with fixing the issue.
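
    As a starting point, comparing what ffprobe reports for the MKV source and for an MP4 that works should help narrow things down; this is just a sketch and the file names are placeholders:

    # inspect container and stream details of both sources
    ffprobe -v error -show_format -show_streams input.mkv
    ffprobe -v error -show_format -show_streams input.mp4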

    FFMPEG Version

    ffmpeg version N-91321-ge85c608 Copyright (c) 2000-2018 the FFmpeg developers

    built with gcc 4.8.5 (GCC) 20150623 (Red Hat 4.8.5-28)

    configuration:
    --prefix=/root/ffmpeg_build
    --pkg-config-flags=--static
    --extra-cflags=-I/root/ffmpeg_build/include
    --extra-ldflags=-L/root/ffmpeg_build/lib
    --extra-libs=-lpthread
    --extra-libs=-lm
    --bindir=/root/bin
    --enable-gpl
    --enable-libfdk_aac
    --enable-libfreetype
    --enable-libmp3lame
    --enable-libopus
    --enable-libvorbis
    --enable-libtheora
    --enable-libx264
    --enable-nonfree
    libavutil      56. 18.102 / 56. 18.102
    libavcodec     58. 20.102 / 58. 20.102
    libavformat    58. 17.100 / 58. 17.100
    libavdevice    58.  4.101 / 58.  4.101
    libavfilter     7. 25.100 /  7. 25.100
    libswscale      5.  2.100 /  5.  2.100
    libswresample   3.  2.100 /  3.  2.100
    libpostproc    55.  2.100 / 55.  2.100

    FFMPEG Command

    ffmpeg
    -i ${DIRECTORY}/${INPUT_FILE}
    -progress ${DIRECTORY}/transcode.log
    -s 1920x1080
    -c:v libx264
    -b:v 3000k
    -c:a aac
    -b:a 32k
    -minrate 3000k
    -maxrate 3000k
    -bufsize 6000k
    -g 96
    -keyint_min 96
    -sc_threshold 0
    -profile:v high
    -flags +cgop
    -movflags faststart
    -preset ultrafast
    -pix_fmt yuv420p
    ${DIRECTORY}/ffmpeg_1920_1080_3000.mp4 &> ${DIRECTORY}/ffmpeg.log

    Also, I get no errors, and if I access the output files directly, they play just fine.

    I’m sure I’m not including all the information needed to troubleshoot this, so let me know if there is better information I can provide.

    What would cause FFmpeg to transcode MP4 but not MKV?

    EDIT
    One last thing: I converted the MKV to an MP4, then used the above command, and it worked. It's as if MP4 to MP4 is fine, but MKV to MP4 is broken.
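
    For reference, a minimal sketch of that MKV-to-MP4 conversion step (placeholder file names, and assuming the MKV streams are MP4-compatible so a plain remux is enough) would be:

    # remux the MKV into an MP4 container, copying the streams without re-encoding
    ffmpeg -i input.mkv -c copy -movflags +faststart intermediate.mp4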

  • FFmpeg sws_scale crash at certain resolution

    23 May 2016, by Tamás Szobonya

    I'm having a weird issue with sws_scale. The problem is that at certain resolutions I get an "Access violation reading location" exception. Resolutions like 1920x1080 and 1600x900 work, but 1280x720 doesn't. This happens in C++/CLI code that is called from C#. Every project is an x64 build (not Any CPU) on Windows 7 x64.

    C++/CLI code:

    void FFmpegWrapper::Codec::E(int width, int height, IntPtr dataIn, [Out] IntPtr %dataOut)
    {
       int ret;
       AVFrame *f, *fIn, *fOut;
       f = av_frame_alloc();
       fIn = av_frame_alloc();
       fOut = av_frame_alloc();

       // input frame: RGB24, same size as the incoming bitmap
       fIn->format = AV_PIX_FMT_RGB24;
       fIn->width = width;
       fIn->height = height;
       ret = av_image_alloc(fIn->data, fIn->linesize, width, height, AV_PIX_FMT_RGB24, 32);

       // intermediate frame: YUV420P, target of the first conversion
       f->format = AV_PIX_FMT_YUV420P;
       f->width = width;
       f->height = height;
       ret = av_image_alloc(f->data, f->linesize, width, height, AV_PIX_FMT_YUV420P, 32);

       // output frame: back to RGB24 for the round trip
       fOut->format = AV_PIX_FMT_RGB24;
       fOut->width = width;
       fOut->height = height;
       ret = av_image_alloc(fOut->data, fOut->linesize, width, height, AV_PIX_FMT_RGB24, 32);


       // point the input frame at the caller-supplied bitmap data
       uint8_t *data = (uint8_t*)dataIn.ToPointer();
       fIn->data[0] = data;

       //with or without struct no difference
       /*struct */SwsContext *convertCtx = sws_getContext(width, height, AV_PIX_FMT_RGB24, width, height, AV_PIX_FMT_YUV420P, 0, NULL, NULL, NULL);

       // CRASH here
       sws_scale(convertCtx, fIn->data, fIn->linesize, 0, height, f->data, f->linesize);

       convertCtx = sws_getContext(width, height, AV_PIX_FMT_YUV420P, width, height, AV_PIX_FMT_RGB24, 0, NULL, NULL, NULL);

       sws_scale(convertCtx, f->data, f->linesize, 0, height, fOut->data, fOut->linesize);

       dataOut = (IntPtr)fIn->data[0];

    }

    And it's called from C# like this:

    FFmpegWrapper.Codec test = new FFmpegWrapper.Codec();

    Bitmap image = new Bitmap(w, h, PixelFormat.Format24bppRgb);

    // Get a screenshot from the desktop
    Screen.Capture(w, h, image, PixelFormat.Format24bppRgb);

    Rectangle rec = new Rectangle(0, 0, image.Width, image.Height);
    BitmapData bitmapData = image.LockBits(rec, ImageLockMode.ReadWrite, image.PixelFormat);

    IntPtr ptr = bitmapData.Scan0;

    IntPtr testptr1;

    test.E(w, h, ptr, out testptr1);

    // We never reach this with 1280x720 resolution
    Bitmap bmp = new Bitmap(w, h, w * 3, PixelFormat.Format24bppRgb, testptr1);

    bmp.Save(@"H:\sajt1.bmp", ImageFormat.Bmp);

    What I don't understand is how it can work at certain resolutions and crash at others.
    I'm using the 20160512-git-cd244fa-win64 build of ffmpeg.

    Edit:
    It seems that changing AV_PIX_FMT_RGB24 to AV_PIX_FMT_BGR24 fixes it, but I'm not sure why. I know that .NET stores the pixels as BGR, but why does the wrong format crash it? And only at some resolutions?

  • ffmpeg live stream transcoding. A/V sync issues on fast camera movement

    21 August 2020, by Kelsnare

    1. I create a webrtc peer connection with my server (only stun).

    2. Using pion webrtc for the server.

    3. I write the received RTP packets as VP8 and opus streams, as described here, to two pipes (the writers; created with os.Pipe() in golang).

    4. The read ends of these two pipes are received by ffmpeg as inputs (via exec.Command.ExtraFiles) for transcoding using libx264 and aac into a single stream. The command:

    ffmpeg -re -i pipe:3 -re -r pipe:4 -c:a aac -af aresample=48000 -c:v libx264 -x264-params keyint=48:min-keyint=24 -profile:v main -preset ultrafast -tune zerolatency -crf 20 -fflags genpts -avoid_negative_ts make_zero -vsync vfr -map 0:0,0:0 -map 1:0,0:0 -f matroska -strict -2 pipe:5

    5. The above command outputs to a pipe (:5), the read end of which is taken as input by the following:

    ffmpeg -hide_banner -y -re -i pipe:3 -sn -vf scale=-1:'min(ih,360)' -c:v libx264 -pix_fmt yuv420p -ca aac -b:a 128k -b:v 1400k -maxrate 1498k -bufsize 2100k -hls_time 1 -hls_playlist_type event -hls_base_url /workdir/streamID/360p -hls_segment_filename /workdir/streamID/360p/360_%%03d.ts -f hls /workdir/streamID/360p.m3u8

    6. This works fine as long as there are no movements of my webcam. The moment that happens, the video speed suddenly increases for a split second and an audio delay is introduced. This delay keeps increasing each time I shake my webcam.

    The first command in point 4 above, if written to a file separately, is absolutely fine in terms of a/v sync, even with vigorous camera shaking. The weird audio delay appears only when transcoding for HLS output, irrespective of whether I'm actually viewing it live or playing it back later.
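
    For clarity, that write-to-file test is just the first command with the final pipe replaced by a file path (the output path here is a placeholder):

    ffmpeg -re -i pipe:3 -re -r pipe:4 -c:a aac -af aresample=48000 -c:v libx264 -x264-params keyint=48:min-keyint=24 -profile:v main -preset ultrafast -tune zerolatency -crf 20 -fflags genpts -avoid_negative_ts make_zero -vsync vfr -map 0:0,0:0 -map 1:0,0:0 -f matroska -strict -2 /workdir/streamID/test.mkv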


    This is my first time working with ffmpeg/hls/webrtc, so it would be really helpful if I could be pointed in the right direction, at least to be able to debug this or even know why it happens. Any and all help is greatly appreciated.
