Other articles (70)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to make other modifications (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to make other manual (...)

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the following two images for a comparison.
    To use it, enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)
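
    For context, select[multiple] is an ordinary CSS attribute selector; a hypothetical illustration of the kind of form field it would match (markup invented for this example, not taken from the plugin):

    <select multiple name="tags[]">
      <option>ogv</option>
      <option>webm</option>
    </select>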

On other sites (11102)

  • arm: Update the var2 functions to the new signature

    29 May 2017, by Martin Storsjö
    arm: Update the var2 functions to the new signature

    The existing functions could easily be used by just calling them
    twice; this would give the following cycle numbers from checkasm:

                    Cortex A7     A8     A9    A53
    var2_8x8_c:        7302    5342   5050   4400
    var2_8x8_neon:     2645    1612   1932   1715
    var2_8x16_c:      14300   10528  10020   8637
    var2_8x16_neon:    5127    2695   3217   2651

    However, by merging both passes into the same function, we get the
    following speedup:
    var2_8x8_neon:     2312    1190   1389   1300
    var2_8x16_neon:    4862    2130   2293   2422

    • [DH] common/arm/pixel-a.S
    • [DH] common/arm/pixel.h
    • [DH] common/pixel.c
  • What is the substitute for the ffmpeg video filter "between" for audio filters?

    18 January 2018, by Hekimen

    I want to make a short video preview from a long video with audio, but I have trouble selecting audio stream segments at specific timestamps.
    I am using this option for segmenting the video:

    -filter_complex "[0:v]select='between(t,216,220.5)+between(t,432,436.5)+between(t,648,652.5)+between(t,864,868.5)+between(t,1080,1084.5)+between(t,1296,1300.5)+between(t,1512,1516.5)+between(t,1728,1732.5)+between(t,1944,1948.5)+between(t,2160,2164.5)'[outv]"

    and my question is how to rewrite this option for the audio stream, because when I use just -filter without selecting any stream I get the error: Cannot connect video filter to audio input. (A possible audio counterpart is sketched below, after the version listing.)

    I am using this ffmpeg version:

    ffmpeg version 3.3.3-static http://johnvansickle.com/ffmpeg/  Copyright (c) 2000-2017 the FFmpeg developers
    built with gcc 6.4.0 (Debian 6.4.0-2) 20170724
    configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc-6 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gray --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopus --enable-librtmp --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libxvid
    libavutil      55. 58.100 / 55. 58.100
    libavcodec     57. 89.100 / 57. 89.100
    libavformat    57. 71.100 / 57. 71.100
    libavdevice    57.  6.100 / 57.  6.100
    libavfilter     6. 82.100 /  6. 82.100
    libswscale      4.  6.100 /  4.  6.100
    libswresample   2.  7.100 /  2.  7.100
    libpostproc    54.  5.100 / 54.  5.100
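
    For reference, a sketch of a possible audio counterpart (an assumption based on ffmpeg's documented select/aselect filters, not a command tested against this input): the audio equivalent of select is aselect, which accepts the same between(t,...) expressions, and asetpts=N/SR/TB (with setpts=N/FRAME_RATE/TB on the video side) regenerates continuous timestamps after the selection. With the timestamp list shortened for readability:

    -filter_complex "[0:v]select='between(t,216,220.5)+between(t,432,436.5)',setpts=N/FRAME_RATE/TB[outv];[0:a]aselect='between(t,216,220.5)+between(t,432,436.5)',asetpts=N/SR/TB[outa]" -map "[outv]" -map "[outa]"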
    
  • Unity : Converting Texture2D to YUV420P and sending with UDP using FFmpeg

    22 June 2018, by potu1304

    In my Unity game each frame is rendered into a texture and then put together into a video using FFmpeg. Now my question is whether I am doing this right, because avcodec_send_frame throws an exception every time.
    I am pretty sure that I am doing something wrong, or in the wrong order, or simply missing something.

    Here is the code for capturing the texture:

    void Update() {
           //StartCoroutine(CaptureFrame());

           if (rt == null)
           {
               rect = new Rect(0, 0, captureWidth, captureHeight);
               rt = new RenderTexture(captureWidth, captureHeight, 24);
               frame = new Texture2D(captureWidth, captureHeight, TextureFormat.RGB24, false);
           }

           Camera camera = this.GetComponent<Camera>(); // NOTE: added because there was no reference to camera in original script; must add this script to Camera
           camera.targetTexture = rt;
           camera.Render();

           RenderTexture.active = rt;
           frame.ReadPixels(rect, 0, 0);
           frame.Apply();

           camera.targetTexture = null;
           RenderTexture.active = null;

           byte[] fileData = null;
           fileData = frame.GetRawTextureData();
           encoding(fileData, fileData.Length);

       }

    And here is the code for encoding and sending the byte data:

    private unsafe void encoding(byte[] bytes, int size)
       {
           Debug.Log("Encoding...");
           AVCodec* codec;
           codec = ffmpeg.avcodec_find_encoder(AVCodecID.AV_CODEC_ID_H264);
           int ret, got_output = 0;

           AVCodecContext* codecContext = null;
           codecContext = ffmpeg.avcodec_alloc_context3(codec);
           codecContext->bit_rate = 400000;
           codecContext->width = captureWidth;
           codecContext->height = captureHeight;
           //codecContext->time_base.den = 25;
           //codecContext->time_base.num = 1;

           AVRational timeBase = new AVRational();
           timeBase.num = 1;
           timeBase.den = 25;
           codecContext->time_base = timeBase;
           //AVStream* videoAVStream = null;
           //videoAVStream->time_base = timeBase;



           AVRational frameRate = new AVRational();
           frameRate.num = 25;
           frameRate.den = 1;
           codecContext->framerate = frameRate;

           codecContext->gop_size = 10;
           codecContext->max_b_frames = 1;
           codecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;

           AVFrame* inputFrame;
           inputFrame = ffmpeg.av_frame_alloc();
           inputFrame->format = (int)codecContext->pix_fmt;
           inputFrame->width = captureWidth;
           inputFrame->height = captureHeight;
           inputFrame->linesize[0] = inputFrame->width;

           AVPixelFormat dst_pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P, src_pix_fmt = AVPixelFormat.AV_PIX_FMT_RGBA;
           int src_w = 1920, src_h = 1080, dst_w = 1920, dst_h = 1080;
           SwsContext* sws_ctx;

           GCHandle pinned = GCHandle.Alloc(bytes, GCHandleType.Pinned);
           IntPtr address = pinned.AddrOfPinnedObject();

           sbyte** inputData = (sbyte**)address;
           sws_ctx = ffmpeg.sws_getContext(src_w, src_h, src_pix_fmt,
                                dst_w, dst_h, dst_pix_fmt,
                                0, null, null, null);

           fixed (int* lineSize = new int[1])
           {
               lineSize[0] = 4 * captureHeight;
               // Convert RGBA to YUV420P
               ffmpeg.sws_scale(sws_ctx, inputData, lineSize, 0, codecContext->width, inputFrame->extended_data, inputFrame->linesize);
           }

           inputFrame->pts = counter++;

           if (ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0)
               throw new ApplicationException("Error sending a frame for encoding!");

           AVPacket pkt;
           pkt = new AVPacket();
           //pkt.data = inData;
           AVPacket* packet = &pkt;
           ffmpeg.av_init_packet(packet);

           Debug.Log("pkt.size " + pkt.size);
           pinned.Free();
           AVDictionary* options = null;
           ffmpeg.av_dict_set(&options, "pkt_size", "1300", 0);
           ffmpeg.av_dict_set(&options, "buffer_size", "65535", 0);
           AVIOContext* server = null;
           ffmpeg.avio_open2(&server, "udp://192.168.0.1:1111", ffmpeg.AVIO_FLAG_WRITE, null, &options);
           Debug.Log("encoded");
           ret = ffmpeg.avcodec_encode_video2(codecContext, &pkt, inputFrame, &got_output);
           ffmpeg.avio_write(server, pkt.data, pkt.size);
           ffmpeg.av_free_packet(&pkt);
           pkt.data = null;
           pkt.size = 0;
       }

    And every time I start the game

     if (ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0)
               throw new ApplicationException("Error sending a frame for encoding!");

    throws the exception.
    Any help in fixing the issue would be greatly appreciated :)
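
    For what it is worth, a guess at the likely cause (an assumption from reading the snippet, not a verified fix): the codec context is never opened with avcodec_open2() before avcodec_send_frame() is called, and the frame's data buffers are never allocated, so sws_scale has nowhere to write the converted YUV planes. A minimal sketch of the missing setup, in the same FFmpeg.AutoGen style as the question:

       // After configuring codecContext (width, height, time_base, pix_fmt),
       // the context must be opened before any frame is sent to it:
       if (ffmpeg.avcodec_open2(codecContext, codec, null) < 0)
           throw new ApplicationException("Could not open codec!");

       AVFrame* inputFrame = ffmpeg.av_frame_alloc();
       inputFrame->format = (int)AVPixelFormat.AV_PIX_FMT_YUV420P;
       inputFrame->width = captureWidth;
       inputFrame->height = captureHeight;

       // Allocate the frame's data/linesize buffers; without this call the
       // planes are null and sws_scale / avcodec_send_frame cannot work.
       if (ffmpeg.av_frame_get_buffer(inputFrame, 32) < 0)
           throw new ApplicationException("Could not allocate frame buffers!");

    Two smaller points, also assumptions: the source stride passed to sws_scale should be the number of bytes per row (4 * captureWidth for RGBA; note the texture is actually created as RGB24, i.e. 3 bytes per pixel), not 4 * captureHeight; and allocating a new codec context on every Update() leaks memory, so the setup above would normally be done once and reused for every frame.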