
Other articles (20)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; and the creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Other interesting software

    13 April 2011

    We don’t claim to be the only ones doing what we do, and certainly not the best; we simply try to do it well and keep getting better.
    The following list covers software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to resemble.
    We don’t know them well and haven’t tried them, but you can take a peek.
    Videopress
    Website: http://videopress.com/
    License: GNU/GPL v2
    Source code: (...)

On other sites (6959)

  • How can I get decoded frames that I sent to the GPU back from the GPU in RGB?

    21 May 2024, by Владислав Сапожник

    I use ffmpeg.av_hwframe_transfer_data to send decoded frames to the GPU, but I cannot get them back in a usable format. I have tried changing my shaders and using av_hwframe_transfer_get_formats, but it is not working.
    My code:

```csharp
public class Program
{
    private static bool _readingComplete = false;
    private static bool _decodingComplete = false;
    private static readonly object _lock = new object();
    private static Queue<AVPacket> packets = new Queue<AVPacket>();
    private static readonly object _fileLock = new object();
    public static MyWindow myWindow;
    public static SKBitmap myBitmap;

    public static async unsafe Task Main(string[] args)
    {
        FFmpegBinariesHelper.RegisterFFmpegBinaries();
        DynamicallyLoadedBindings.Initialize();
        Console.WriteLine($"FFmpeg version info: {ffmpeg.av_version_info()}");

        Directory.Delete("frames", true);
        Directory.CreateDirectory("frames");

        var url = "rtsp://admin:123456@192.168.1.12:554/stream0?username=admin&password=E10ADC3949BA59ABBE56E057F20";

        AVDictionary* opts = null;
        ffmpeg.av_dict_set(&opts, "-rtsp_transport", "tcp", 0);

        var vsr = new VideoStreamReader(url, opts);
        var vsd = new VideoStreamDecoder(*vsr.GetCodecParameters(), AVHWDeviceType.AV_HWDEVICE_TYPE_D3D11VA);

        Task readerTask = Task.Factory.StartNew(() => ReadPackets(vsr), TaskCreationOptions.LongRunning);
        Task decoderTask = Task.Factory.StartNew(() => DecodeFrames(vsd), TaskCreationOptions.LongRunning);

        var nativeWindowSettings = new NativeWindowSettings()
        {
            ClientSize = new Vector2i(800, 600),
            Title = "My first OpenTK program!"
        };

        using (var myWindow = new MyWindow(GameWindowSettings.Default, nativeWindowSettings))
        {
            myWindow.Run();
        }
    }

    private static unsafe void ReadPackets(VideoStreamReader vsr)
    {
        while (!_readingComplete)
        {
            vsr.TryReadNextPacket(out var packet);
            lock (_lock)
            {
                packets.Enqueue(packet);
            }
        }

        _readingComplete = true;
    }

    private static unsafe void DecodeFrames(VideoStreamDecoder vsd)
    {
        Console.WriteLine($"codec name: {vsd.CodecName}");

        //var sourceSize = vsd.FrameSize;
        //var sourcePixelFormat = vsd.PixelFormat;
        //var destinationSize = sourceSize;
        //var destinationPixelFormat = AVPixelFormat.AV_PIX_FMT_RGBA;
        //using var vfc = new VideoFrameConverter(sourceSize, sourcePixelFormat, destinationSize, destinationPixelFormat);

        var frameNumber = 0;

        while (true)
        {
            AVPacket packet;
            lock (_lock)
            {
                if (packets.Count == 0)
                {
                    if (_readingComplete)
                    {
                        break;
                    }
                    else
                    {
                        continue;
                    }
                }
                packet = packets.Dequeue();
            }

            vsd.TryDecodeNextFrame(out var frame, packet);
            //var convertedFrame = vfc.Convert(frame);

            //var bitmap = new SKBitmap(convertedFrame.width, convertedFrame.height, SKColorType.Bgra8888, SKAlphaType.Opaque);
            //bitmap.InstallPixels(new SKImageInfo(convertedFrame.width, convertedFrame.height, SKColorType.Bgra8888, SKAlphaType.Opaque), (IntPtr)convertedFrame.data[0]);
            //myBitmap = bitmap;
            var bitmap = new SKBitmap(frame.width, frame.height, SKColorType.Bgra8888, SKAlphaType.Opaque);
            bitmap.InstallPixels(new SKImageInfo(frame.width, frame.height, SKColorType.Bgra8888, SKAlphaType.Opaque), (IntPtr)frame.data[0]);
            myBitmap = bitmap;

            Console.WriteLine($"frame: {frameNumber}");
            frameNumber++;
        }

        _decodingComplete = true;
    }

    //private static unsafe void WriteFrame(AVFrame convertedFrame, int frameNumber)
    //{
    //    var imageInfo = new SKImageInfo(convertedFrame.width, convertedFrame.height, SKColorType.Bgra8888, SKAlphaType.Opaque);
    //    using var bitmap = new SKBitmap();
    //    bitmap.InstallPixels(imageInfo, (IntPtr)convertedFrame.data[0]);

    //    string filePath;
    //    lock (_fileLock)
    //    {
    //        filePath = $"frames/frame.{frameNumber:D8}.jpg";
    //    }

    //    using var stream = File.Create(filePath);
    //    bitmap.Encode(stream, SKEncodedImageFormat.Jpeg, 90);
    //}
}
```

```csharp
using OpenTK.Graphics.OpenGL4;
using OpenTK.Mathematics;
using OpenTK.Windowing.Common;
using OpenTK.Windowing.Desktop;
using OpenTK.Windowing.GraphicsLibraryFramework;
using SkiaSharp;

namespace OpenTKTask;

public class MyWindow : GameWindow
{
    private Shader shader;
    private int vertexBufferHandle;
    private int elementBufferHandle;
    private int vertexArrayHandle;
    private int texture;

    //float[] vertices =
    //{
    //    1.0f, 1.0f, 0.0f, 1.0f, 0.0f,
    //    1.0f, -1.0f, 0.0f, 1.0f, 1.0f,
    //    -1.0f, -1.0f, 0.0f, 0.0f, 0.0f,
    //    -1.0f, 1.0f, 0.0f, 0.0f, 1.0f
    //};

    float[] vertices =
    {
        //Position         | Texture coordinates
         1.0f,  1.0f, 0.0f, 1.0f, 0.0f, // top right
         1.0f, -1.0f, 0.0f, 1.0f, 1.0f, // bottom right
        -1.0f, -1.0f, 0.0f, 0.0f, 1.0f, // bottom left
        -1.0f,  1.0f, 0.0f, 0.0f, 0.0f  // top left
    };

    uint[] indices =
    {
        0, 1, 3,
        1, 2, 3
    };

    float[] texCoords =
    {
        0.0f, 0.0f,
        1.0f, 0.0f,
        0.5f, 1.0f,
    };

    public MyWindow(GameWindowSettings gameWindowSettings, NativeWindowSettings nativeWindowSettings) : base(gameWindowSettings, nativeWindowSettings)
    {
        this.CenterWindow(new Vector2i(1280, 760));
    }

    protected override void OnResize(ResizeEventArgs e)
    {
        GL.Viewport(0, 0, e.Width, e.Height);
        base.OnResize(e);
    }

    protected override void OnLoad()
    {
        base.OnLoad();

        shader = new Shader("C:\\Users\\1\\Desktop\\7h3_C0d3r\\OpenTKTask\\vertexShader.vert", "C:\\Users\\1\\Desktop\\7h3_C0d3r\\OpenTKTask\\fragShader.frag");
        shader.Use();

        vertexArrayHandle = GL.GenVertexArray();
        GL.BindVertexArray(vertexArrayHandle);

        vertexBufferHandle = GL.GenBuffer();
        GL.BindBuffer(BufferTarget.ArrayBuffer, vertexBufferHandle);
        GL.BufferData(BufferTarget.ArrayBuffer, vertices.Length * sizeof(float), vertices, BufferUsageHint.StaticDraw);

        elementBufferHandle = GL.GenBuffer();
        GL.BindBuffer(BufferTarget.ElementArrayBuffer, elementBufferHandle);
        GL.BufferData(BufferTarget.ElementArrayBuffer, indices.Length * sizeof(uint), indices, BufferUsageHint.StaticDraw);

        var positionLocation = shader.GetAttribLocation("aPosition");
        GL.VertexAttribPointer(positionLocation, 3, VertexAttribPointerType.Float, false, 5 * sizeof(float), 0);
        GL.EnableVertexAttribArray(positionLocation);

        var texCoordLocation = shader.GetAttribLocation("aTexCoord");
        GL.VertexAttribPointer(texCoordLocation, 2, VertexAttribPointerType.Float, false, 5 * sizeof(float), 3 * sizeof(float));
        GL.EnableVertexAttribArray(texCoordLocation);

        texture = GL.GenTexture();
        GL.BindTexture(TextureTarget.Texture2D, texture);

        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapS, (int)TextureWrapMode.Repeat);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapT, (int)TextureWrapMode.Repeat);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Linear);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Linear);

        GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba, 800, 600, 0, PixelFormat.Bgra, PixelType.UnsignedByte, IntPtr.Zero);
        GL.BindTexture(TextureTarget.Texture2D, 0);
    }

    protected override void OnUnload()
    {
        base.OnUnload();
        GL.DeleteBuffer(vertexBufferHandle);
        GL.DeleteVertexArray(vertexArrayHandle);
        GL.DeleteProgram(shader.Handle);
        GL.DeleteTexture(texture);
    }

    protected override void OnUpdateFrame(FrameEventArgs args)
    {
        base.OnUpdateFrame(args);
    }

    protected override void OnRenderFrame(FrameEventArgs args)
    {
        base.OnRenderFrame(args);

        UpdateTexture(Program.myBitmap);

        GL.Clear(ClearBufferMask.ColorBufferBit);

        GL.BindTexture(TextureTarget.Texture2D, texture);

        shader.Use();

        GL.BindVertexArray(vertexArrayHandle);

        GL.DrawElements(PrimitiveType.Triangles, indices.Length, DrawElementsType.UnsignedInt, 0);

        SwapBuffers();
    }

    public void UpdateTexture(SKBitmap bitmap)
    {
        GL.BindTexture(TextureTarget.Texture2D, texture);

        if (bitmap != null)
        {
            //byte[] pixels = bitmap.Bytes;
            GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba, bitmap.Width, bitmap.Height, 0, PixelFormat.Bgra, PixelType.UnsignedByte, 0);
        }
    }

    public SKBitmap LoadBitmap(string path)
    {
        using (var stream = File.OpenRead(path))
        {
            return SKBitmap.Decode(stream);
        }
    }
}
```

```csharp
using FFmpeg.AutoGen;
using System.Drawing;

namespace OpenTKTask;

public sealed unsafe class VideoStreamDecoder : IDisposable
{
    private readonly AVCodecContext* _pCodecContext;
    private readonly AVPacket* _pPacket;
    private readonly AVFrame* _pFrame;
    private readonly AVFrame* _receivedFrame;
    private AVFrame* _pSwFrame;
    private AVBufferRef* _pHWDeviceCtx;

    public VideoStreamDecoder(AVCodecParameters parameter, AVHWDeviceType HWDeviceType = AVHWDeviceType.AV_HWDEVICE_TYPE_D3D12VA)
    {
        _receivedFrame = ffmpeg.av_frame_alloc();

        AVCodec* codec = ffmpeg.avcodec_find_decoder(parameter.codec_id);
        if (codec == null)
            throw new InvalidOperationException("Codec not found.");
        _pCodecContext = ffmpeg.avcodec_alloc_context3(codec);

        ffmpeg.av_hwdevice_ctx_create(&_pCodecContext->hw_device_ctx, HWDeviceType, null, null, 0).ThrowExceptionIfError();

        ffmpeg.avcodec_parameters_to_context(_pCodecContext, &parameter)
            .ThrowExceptionIfError();
        ffmpeg.avcodec_open2(_pCodecContext, codec, null).ThrowExceptionIfError();

        CodecName = ffmpeg.avcodec_get_name(codec->id);
        FrameSize = new Size(_pCodecContext->width, _pCodecContext->height);
        PixelFormat = _pCodecContext->pix_fmt;

        _pFrame = ffmpeg.av_frame_alloc();
        _pPacket = ffmpeg.av_packet_alloc();
    }

    public string CodecName { get; }
    public Size FrameSize { get; }
    public AVPixelFormat PixelFormat { get; }

    public bool TryDecodeNextFrame(out AVFrame frame, AVPacket packet)
    {
        ffmpeg.av_frame_unref(_pFrame);
        ffmpeg.av_frame_unref(_receivedFrame);
        int error;

        do
        {
            ffmpeg.avcodec_send_packet(_pCodecContext, &packet).ThrowExceptionIfError();
            error = ffmpeg.avcodec_receive_frame(_pCodecContext, _pFrame);
        } while (error == ffmpeg.AVERROR(ffmpeg.EAGAIN));

        error.ThrowExceptionIfError();

        if (_pCodecContext->hw_device_ctx != null)
        {
            ffmpeg.av_hwframe_transfer_data(_receivedFrame, _pFrame, 0).ThrowExceptionIfError();
            Console.WriteLine((AVPixelFormat)171);
            frame = *_receivedFrame; // AV_PIX_FMT_NV11
            //Console.WriteLine((AVPixelFormat)frame.format);
        }
        else
            frame = *_pFrame; // AV_PIX_FMT_NV11
        //Console.WriteLine((AVPixelFormat)frame.format);
        return true;
    }

    public void Dispose()
    {
        var pFrame = _pFrame;
        ffmpeg.av_frame_free(&pFrame);

        var pCodecContext = _pCodecContext;
        ffmpeg.avcodec_free_context(&pCodecContext);

        if (_pHWDeviceCtx != null)
        {
            var pHWDeviceCtx = _pHWDeviceCtx;
            ffmpeg.av_buffer_unref(&pHWDeviceCtx);
        }

        if (_pSwFrame != null)
        {
            var pSwFrame = _pSwFrame;
            ffmpeg.av_frame_free(&pSwFrame);
        }
    }
}
```
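    A note on why installing `data[0]` straight into a Bgra8888 `SKBitmap` shows garbage: for D3D11VA H.264 decoding, the software frame filled by `av_hwframe_transfer_data` is typically NV12, not BGRA — a full-resolution luma plane in `data[0]` and interleaved U/V byte pairs at half resolution in `data[1]`. The frame still needs a pixel-format conversion (e.g. the commented-out `VideoFrameConverter`/`sws_scale` path) before it can be treated as BGRA. The NV12 layout itself can be sketched in Python (illustrative only; the plane arithmetic is the point, not any FFmpeg API):

```python
def nv12_plane_sizes(width: int, height: int):
    """Byte sizes of the two planes of a tightly packed NV12 frame."""
    y_size = width * height          # one luma byte per pixel
    uv_size = width * (height // 2)  # interleaved U,V pairs at half resolution
    return y_size, uv_size

def split_nv12(buf: bytes, width: int, height: int):
    """Split a packed NV12 buffer into separate Y, U and V byte strings."""
    y_size, uv_size = nv12_plane_sizes(width, height)
    y = buf[:y_size]
    uv = buf[y_size:y_size + uv_size]
    u = uv[0::2]  # even bytes are U
    v = uv[1::2]  # odd bytes are V
    return y, u, v
```

    For a 1280x720 frame this gives a 921600-byte Y plane and a 460800-byte UV plane, i.e. 1.5 bytes per pixel rather than the 4 bytes per pixel that Bgra8888 expects.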


  • Render YUV frame using OpenTK [closed]

    20 May 2024, by dima2012 terminator

    I'm trying to render a YUV AVFrame that I get from a camera using OpenTK: I create a rectangle and try to apply a texture to it, but it doesn't work.

    Here is my window class:

```csharp
using OpenTK.Graphics.Egl;
using OpenTK.Graphics.OpenGL4;
using OpenTK.Windowing.Common;
using OpenTK.Windowing.Desktop;
using OpenTK.Windowing.GraphicsLibraryFramework;
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace myFFmpeg
{
    public class CameraWindow : GameWindow
    {
        private int vertexBufferHandle;
        private int elementBufferHandle;
        private int vertexArrayHandle;
        private int frameNumber = 0;
        private int yTex, uTex, vTex;

        Shader shader;
        Texture texture;

        float[] vertices =
        {
            //Position         | Texture coordinates
             0.5f,  0.5f, 0.0f, 1.0f, 0.0f, // top right
             0.5f, -0.5f, 0.0f, 1.0f, 1.0f, // bottom right
            -0.5f, -0.5f, 0.0f, 0.0f, 1.0f, // bottom left
            -0.5f,  0.5f, 0.0f, 0.0f, 0.0f  // top left
        };

        private uint[] indices =
        {
            0, 1, 3,   // first triangle
            1, 2, 3    // second triangle
        };

        public CameraWindow(string title) : base(GameWindowSettings.Default, new NativeWindowSettings() { ClientSize = (1280, 720), Title = title }) { UpdateFrequency = 25; }

        protected override void OnUpdateFrame(FrameEventArgs e)
        {
            base.OnUpdateFrame(e);
        }

        protected override void OnLoad()
        {
            GL.ClearColor(0.5f, 0.3f, 0.3f, 1.0f);

            shader = new Shader(@"..\..\shader.vert", @"..\..\shader.frag");
            texture = new Texture();

            elementBufferHandle = GL.GenBuffer();
            GL.BindBuffer(BufferTarget.ElementArrayBuffer, elementBufferHandle);
            GL.BufferData(BufferTarget.ElementArrayBuffer, indices.Length * sizeof(uint), indices, BufferUsageHint.StaticDraw);

            vertexBufferHandle = GL.GenBuffer();
            GL.BindBuffer(BufferTarget.ArrayBuffer, vertexBufferHandle);
            GL.BufferData(BufferTarget.ArrayBuffer, vertices.Length * sizeof(float), vertices, BufferUsageHint.StaticDraw);

            GL.BindBuffer(BufferTarget.ArrayBuffer, 0);

            vertexArrayHandle = GL.GenVertexArray();
            GL.BindVertexArray(vertexArrayHandle);

            GL.BindBuffer(BufferTarget.ArrayBuffer, vertexBufferHandle);
            GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, 5 * sizeof(float), 0);
            GL.EnableVertexAttribArray(0);

            int vertexShader = GL.CreateShader(ShaderType.VertexShader);
            GL.ShaderSource(vertexShader, @"..\..\shader.vert");
            GL.CompileShader(vertexShader);

            int fragmentShader = GL.CreateShader(ShaderType.FragmentShader);
            GL.ShaderSource(fragmentShader, @"..\..\shader.frag");
            GL.CompileShader(fragmentShader);

            int shaderProgram = GL.CreateProgram();
            GL.AttachShader(shaderProgram, vertexShader);
            GL.AttachShader(shaderProgram, fragmentShader);
            GL.LinkProgram(shaderProgram);

            int vertexPosLocation = GL.GetAttribLocation(shaderProgram, "vertexPos");
            GL.EnableVertexAttribArray(vertexPosLocation);
            GL.VertexAttribPointer(vertexPosLocation, 2, VertexAttribPointerType.Float, false, 4 * sizeof(float), 0);

            int texCoordLocation = GL.GetAttribLocation(shaderProgram, "texCoord");
            GL.EnableVertexAttribArray(texCoordLocation);
            GL.VertexAttribPointer(texCoordLocation, 2, VertexAttribPointerType.Float, false, 4 * sizeof(float), 2 * sizeof(float));

            GL.UseProgram(shaderProgram);

            GL.ActiveTexture(TextureUnit.Texture0);
            GL.BindTexture(TextureTarget.Texture2D, yTex);
            GL.Uniform1(GL.GetUniformLocation(shaderProgram, "yTex"), 0);

            GL.ActiveTexture(TextureUnit.Texture1);
            GL.BindTexture(TextureTarget.Texture2D, uTex);
            GL.Uniform1(GL.GetUniformLocation(shaderProgram, "uTex"), 1);

            GL.ActiveTexture(TextureUnit.Texture2);
            GL.BindTexture(TextureTarget.Texture2D, vTex);
            GL.Uniform1(GL.GetUniformLocation(shaderProgram, "vTex"), 2);

            GL.BindVertexArray(0);
            //code

            base.OnLoad();
        }

        protected override void OnUnload()
        {
            GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
            GL.DeleteBuffer(vertexBufferHandle);
            GL.UseProgram(0);
            shader.Dispose();

            //code

            base.OnUnload();
        }

        protected override void OnRenderFrame(FrameEventArgs e)
        {
            GL.Clear(ClearBufferMask.ColorBufferBit);

            shader.Use();
            texture.Use(frameNumber++);

            GL.BindVertexArray(vertexArrayHandle);

            GL.DrawElements(PrimitiveType.Triangles, indices.Length, DrawElementsType.UnsignedInt, indices);

            Context.SwapBuffers();

            base.OnRenderFrame(e);
        }

        protected override void OnFramebufferResize(FramebufferResizeEventArgs e)
        {
            base.OnFramebufferResize(e);

            GL.Viewport(0, 0, e.Width, e.Height);
        }
    }
}
```


    And my texture class:

```csharp
using System;
using OpenTK;
using OpenTK.Graphics.OpenGL4;
using SkiaSharp;
using FFmpeg;
using SkiaSharp.Internals;
using StbImageSharp;
using FFmpeg.AutoGen;
using System.Threading;

namespace myFFmpeg
{
    public class Texture
    {
        int Handle, yTex, uTex, vTex;

        Program program = new Program();

        public Texture()
        {
            Handle = GL.GenTexture();
        }

        public unsafe void Use(int frameNumber)
        {
            GL.BindTexture(TextureTarget.Texture2D, Handle);

            // Generate textures only once (outside the loop)
            if (yTex == 0)
            {
                GL.GenTextures(1, out yTex);
            }
            if (uTex == 0)
            {
                GL.GenTextures(1, out uTex);
            }
            if (vTex == 0)
            {
                GL.GenTextures(1, out vTex);
            }

            // Bind textures to specific units before rendering each frame
            GL.ActiveTexture(TextureUnit.Texture0);
            GL.BindTexture(TextureTarget.Texture2D, yTex);
            GL.ActiveTexture(TextureUnit.Texture1);
            GL.BindTexture(TextureTarget.Texture2D, uTex);
            GL.ActiveTexture(TextureUnit.Texture2);

            // Update textures with new frame data from FFmpeg
            AVFrame frame = program.getFrame();
            int width = frame.width;
            int height = frame.height;

            Console.BackgroundColor = ConsoleColor.White;
            Console.ForegroundColor = ConsoleColor.Black;
            Console.WriteLine((AVPixelFormat)frame.format);
            Console.BackgroundColor = ConsoleColor.Black;

            // Assuming YUV data is stored in separate planes (Y, U, V)
            GL.BindTexture(TextureTarget.Texture2D, yTex);
            GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Luminance, width, height, 0, PixelFormat.Luminance, PixelType.UnsignedByte, (IntPtr)frame.data[0]);
            GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Linear);
            GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Linear);

            GL.BindTexture(TextureTarget.Texture2D, uTex);
            GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Luminance, width / 2, height / 2, 0, PixelFormat.Luminance, PixelType.UnsignedByte, (IntPtr)frame.data[1]);
            GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Linear);
            GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Linear);

            GL.BindTexture(TextureTarget.Texture2D, vTex);
            GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Luminance, width / 2, height / 2, 0, PixelFormat.Luminance, PixelType.UnsignedByte, (IntPtr)frame.data[2]);
            GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Linear);
            GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Linear);
        }
    }
}
```
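    One pitfall with uploads like these: FFmpeg planes often carry per-row padding, so `frame.linesize[i]` can be larger than the plane width, while `TexImage2D` (at the default `GL_UNPACK_ALIGNMENT`) expects tightly packed rows; uploading padded planes as if they were packed skews or shears the image. The row-copy that removes the padding can be sketched in Python (illustrative; `pack_plane` is a hypothetical helper name, and in the C# code the same effect is usually achieved by setting `GL.PixelStore(PixelStoreParameter.UnpackRowLength, ...)` or copying row by row):

```python
def pack_plane(plane: bytes, linesize: int, width: int, height: int) -> bytes:
    """Drop per-row padding so rows are tightly packed for a TexImage2D upload."""
    # Each source row occupies `linesize` bytes but only the first `width` are pixels.
    return b"".join(plane[r * linesize : r * linesize + width] for r in range(height))
```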


    And my shader class:

```csharp
using OpenTK.Graphics.OpenGL4;
using System;
using System.IO;

namespace myFFmpeg
{
    public class Shader : IDisposable
    {
        public int Handle { get; private set; }

        public Shader(string vertexPath, string fragmentPath)
        {
            string vertexShaderSource = File.ReadAllText(vertexPath);
            string fragmentShaderSource = File.ReadAllText(fragmentPath);

            int vertexShader = GL.CreateShader(ShaderType.VertexShader);
            GL.ShaderSource(vertexShader, vertexShaderSource);
            GL.CompileShader(vertexShader);
            CheckShaderCompilation(vertexShader);

            int fragmentShader = GL.CreateShader(ShaderType.FragmentShader);
            GL.ShaderSource(fragmentShader, fragmentShaderSource);
            GL.CompileShader(fragmentShader);
            CheckShaderCompilation(fragmentShader);

            Handle = GL.CreateProgram();
            GL.AttachShader(Handle, vertexShader);
            GL.AttachShader(Handle, fragmentShader);
            GL.LinkProgram(Handle);
            CheckProgramLinking(Handle);

            GL.DetachShader(Handle, vertexShader);
            GL.DetachShader(Handle, fragmentShader);
            GL.DeleteShader(vertexShader);
            GL.DeleteShader(fragmentShader);
        }

        public void Use()
        {
            GL.UseProgram(Handle);
        }

        public int GetAttribLocation(string attribName)
        {
            return GL.GetAttribLocation(Handle, attribName);
        }

        public int GetUniformLocation(string uniformName)
        {
            return GL.GetUniformLocation(Handle, uniformName);
        }

        private void CheckShaderCompilation(int shader)
        {
            GL.GetShader(shader, ShaderParameter.CompileStatus, out int success);
            if (success == 0)
            {
                string infoLog = GL.GetShaderInfoLog(shader);
                throw new InvalidOperationException($"Shader compilation failed: {infoLog}");
            }
        }

        private void CheckProgramLinking(int program)
        {
            GL.GetProgram(program, GetProgramParameterName.LinkStatus, out int success);
            if (success == 0)
            {
                string infoLog = GL.GetProgramInfoLog(program);
                throw new InvalidOperationException($"Program linking failed: {infoLog}");
            }
        }

        public void Dispose()
        {
            GL.DeleteProgram(Handle);
        }
    }
}
```


    Vertex shader:

```glsl
#version 330 core
layout(location = 0) in vec3 vertexPos;
layout(location = 1) in vec2 texCoord;

out vec2 TexCoord;

void main()
{
    gl_Position = vec4(vertexPos, 1.0);
    TexCoord = texCoord;
}
```


    Fragment shader:

```glsl
#version 330 core
in vec2 TexCoord;
out vec4 color;

uniform sampler2D yTex;
uniform sampler2D uTex;
uniform sampler2D vTex;

void main()
{
  float y = texture(yTex, TexCoord).r;
  float u = texture(uTex, TexCoord).r - 0.5;
  float v = texture(vTex, TexCoord).r - 0.5;

  // YUV to RGB conversion (BT.709)
  float r = y + 1.5714 * v;
  float g = y - 0.6486 * u - 0.3918 * v;
  float b = y + 1.8556 * u;

  color = vec4(r, g, b, 1.0);
}
```
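    The matrix in this shader is easy to sanity-check on the CPU. Note that the commonly cited full-range BT.709 coefficients are r = y + 1.5748·v, g = y − 0.1873·u − 0.4681·v, b = y + 1.8556·u, so the `g` line above (−0.6486·u − 0.3918·v) is worth double-checking against whichever standard the camera uses. A Python sketch of the standard matrix (illustrative; not part of the asker's code):

```python
def yuv_to_rgb_bt709(y: float, u: float, v: float):
    """Full-range BT.709 YUV -> RGB; u and v are already centered on 0, all values in 0..1."""
    r = y + 1.5748 * v
    g = y - 0.1873 * u - 0.4681 * v
    b = y + 1.8556 * u
    return (r, g, b)
```

    A quick invariant: with u = v = 0 the output must be a neutral grey equal to y, which catches most transposed-coefficient mistakes.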


    I can provide more code if needed.

    I tried changing shaders, changing textures, and getting the frame with ffmpeg.av_hwframe_transfer_data(_receivedFrame, _pFrame, 0), with no luck.

  • Error: Output format mp4 is not available

    12 April 2024, by alpaca pwaa

    I'm using fluent-ffmpeg in my Next.js application. I'm trying to process a video and pipe it in a specified format to my S3 bucket, but it keeps failing: the terminal throws "Error: Output format mp4 is not available". I have already checked my FFmpeg build with `ffmpeg -formats` and confirmed that it supports muxing and demuxing MP4 files, and I've tried solutions from other forums, but none of them work for me.

```typescript
createVideo: privateProcedure
    .input(
      z.object({
        fileId: z.string(),
      })
    )
    .mutation(async ({ ctx, input }) => {
      const { getUser } = getKindeServerSession();
      const user = await getUser();

      if (!user || !user.id || !user.email) {
        throw new TRPCError({ code: "UNAUTHORIZED" });
      }

      const dbUser = await db.user.findFirst({
        where: {
          id: user.id,
        },
      });

      if (!dbUser) {
        throw new TRPCError({
          code: "UNAUTHORIZED",
          message: "User not found in the database.",
        });
      }

      const putObjectCommand = new PutObjectCommand({
        Bucket: process.env.AWS_BUCKET_NAME!,
        Key: generateFileName(),
      });

      const s3 = new S3Client({
        region: process.env.AWS_BUCKET_REGION!,
        credentials: {
          accessKeyId: process.env.AWS_ACCESS_KEY!,
          secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
        },
      });

      const singedURL = await getSignedUrl(s3, putObjectCommand, {
        expiresIn: 60,
      });

      const ffmpeg = require("fluent-ffmpeg");
      const passthroughStream = new PassThrough();

      ffmpeg({ source: "./template1.mp4" })
        .on("end", async () => {
          console.log("Job done");
          await uploadToS3(passthroughStream);
        })
        .on("error", (error: string) => {
          console.error("Error:", error);
          throw new Error("Error processing video");
        })
        .videoFilter({
          filter: "drawtext",
          options: {
            text: "hi",
            fontsize: 24,
            fontcolor: "white",
            x: "(w-text_w)/2",
            y: "(h-text_h)/2",
            box: 1,
            boxcolor: "black@0.5",
            boxborderw: 5,
            fontfile: "/Windows/fonts/calibri.ttf",
          },
        })
        .videoCodec("libx264")
        .outputFormat("mp4")
        .outputOptions(["-movflags frag_keyframe+empty_moov"])
        .pipe(passthroughStream, { end: true });

      const uploadToS3 = async (stream: PassThrough) => {
        const upload = new Upload({
          client: s3,
          params: {
            Bucket: process.env.AWS_BUCKET_NAME!,
            Key: generateFileName(),
            Body: stream,
          },
        });
        await upload.done();
      };

      await new Promise((resolve, reject) => {
        passthroughStream.on("end", resolve);
        passthroughStream.on("error", reject);
      });

      const createdVideo = await db.video.create({
        data: {
          name: "Test Name",
          url: singedURL.split("?")[0],
          key: singedURL,
          fileId: input.fileId,
        },
      });

      return createdVideo;
    }),
```
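    For context on why piping MP4 is fragile: the mp4 muxer normally needs a seekable output so it can write the moov atom at the end, so writing to a pipe only works when the fragmented-MP4 flags actually reach ffmpeg along with an explicit `-f mp4` (a pipe has no file extension to infer the format from). The fluent-ffmpeg chain above corresponds roughly to the command line built below, sketched in Python for illustration (the argument values are taken from the code above; this is not fluent-ffmpeg's internal implementation):

```python
def build_ffmpeg_args(src: str, font: str) -> list[str]:
    """Approximate CLI equivalent of the fluent-ffmpeg chain: drawtext, libx264, fragmented MP4 to stdout."""
    drawtext = (
        "drawtext=text=hi:fontsize=24:fontcolor=white:"
        "x=(w-text_w)/2:y=(h-text_h)/2:box=1:boxcolor=black@0.5:"
        f"boxborderw=5:fontfile={font}"
    )
    return [
        "ffmpeg", "-i", src,
        "-vf", drawtext,
        "-c:v", "libx264",
        "-movflags", "frag_keyframe+empty_moov",  # lets mp4 be written to a non-seekable pipe
        "-f", "mp4",                              # explicit muxer, since pipe:1 has no extension
        "pipe:1",
    ]
```

    Note that the report in the log below shows ffmpeg being invoked with only `-filter:v ... -report pipe:1` — no `-f mp4` and no `-movflags` — which is consistent with the output options never reaching the process and explains the "Output format mp4 is not available"-style failure on the pipe.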


    Here's the ffmpeg log.


    ffmpeg started on 2024-04-11 at 20:58:56
Report written to "ffmpeg-20240411-205856.log"
Log level: 48
Command line:
"C:\\ProgramData\\chocolatey\\lib\\ffmpeg-full\\tools\\ffmpeg\\bin\\ffmpeg.exe" -i ./template1.mp4 -filter:v "drawtext=text=hi:fontsize=24:fontcolor=white:x=(w-text_w)/2:y=(h-text_h)/2:box=1:boxcolor=black@0.5:boxborderw=5:fontfile=/Windows/fonts/calibri.ttf" -report pipe:1
ffmpeg version 7.0-full_build-www.gyan.dev Copyright (c) 2000-2024 the FFmpeg developers
  built with gcc 13.2.0 (Rev5, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libxevd --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxeve --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-dxva2 --enable-d3d11va --enable-d3d12va --enable-f
  libavutil      59.  8.100 / 59.  8.100
  libavcodec     61.  3.100 / 61.  3.100
  libavformat    61.  1.100 / 61.  1.100
  libavdevice    61.  1.100 / 61.  1.100
  libavfilter    10.  1.100 / 10.  1.100
  libswscale      8.  1.100 /  8.  1.100
  libswresample   5.  1.100 /  5.  1.100
  libpostproc    58.  1.100 / 58.  1.100
Splitting the commandline.
Reading option '-i' ... matched as input url with argument './template1.mp4'.
Reading option '-filter:v' ... matched as option 'filter' (apply specified filters to audio/video) with argument 'drawtext=text=hi:fontsize=24:fontcolor=white:x=(w-text_w)/2:y=(h-text_h)/2:box=1:boxcolor=black@0.5:boxborderw=5:fontfile=/Windows/fonts/calibri.ttf'.
Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
Reading option 'pipe:1' ... matched as output url.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option report (generate a report) with argument 1.
Successfully parsed a group of options.
Parsing a group of options: input url ./template1.mp4.
Successfully parsed a group of options.
Opening an input file: ./template1.mp4.
[AVFormatContext @ 00000262cd0888c0] Opening './template1.mp4' for reading
[file @ 00000262cd0a94c0] Setting default whitelist 'file,crypto,data'
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000262cd0888c0] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000262cd0888c0] ISO: File Type Major Brand: isom
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000262cd0888c0] Unknown dref type 0x206c7275 size 12
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000262cd0888c0] Processing st: 0, edit list 0 - media time: 1024, duration: 126981
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000262cd0888c0] Offset DTS by 1024 to make first pts zero.
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000262cd0888c0] Setting codecpar->delay to 2 for stream st: 0
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000262cd0888c0] Before avformat_find_stream_info() pos: 6965 bytes read:32768 seeks:0 nb_streams:1
[h264 @ 00000262cd0bb140] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 00000262cd0bb140] Decoding VUI
[h264 @ 00000262cd0bb140] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 00000262cd0bb140] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 00000262cd0bb140] Decoding VUI
[h264 @ 00000262cd0bb140] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 00000262cd0bb140] Decoding VUI
[h264 @ 00000262cd0bb140] nal_unit_type: 6(SEI), nal_ref_idc: 0
[h264 @ 00000262cd0bb140] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 00000262cd0bb140] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 00000262cd0bb140] nal_unit_type: 5(IDR), nal_ref_idc: 3
[h264 @ 00000262cd0bb140] Decoding VUI
[h264 @ 00000262cd0bb140] Format yuv420p chosen by get_format().
[h264 @ 00000262cd0bb140] Reinit context to 1088x1920, pix_fmt: yuv420p
[h264 @ 00000262cd0bb140] no picture
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000262cd0888c0] All info found
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000262cd0888c0] After avformat_find_stream_info() pos: 82242 bytes read:82242 seeks:0 frames:1
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from './template1.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf58.76.100
  Duration: 00:00:08.27, start: 0.000000, bitrate: 3720 kb/s
  Stream #0:0[0x1](und), 1, 1/15360: Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709, progressive), 1080x1920, 3714 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
      Metadata:
        handler_name    : VideoHandler
        vendor_id       : [0][0][0][0]
Successfully opened the file.
Parsing a group of options: output url pipe:1.
Applying option filter:v (apply specified filters to audio/video) with argument drawtext=text=hi:fontsize=24:fontcolor=white:x=(w-text_w)/2:y=(h-text_h)/2:box=1:boxcolor=black@0.5:boxborderw=5:fontfile=/Windows/fonts/calibri.ttf.
Successfully parsed a group of options.
Opening an output file: pipe:1.
[AVFormatContext @ 00000262cd0b2240] Unable to choose an output format for 'pipe:1'; use a standard extension for the filename or specify the format manually.
[out#0 @ 00000262cd0bb300] Error initializing the muxer for pipe:1: Invalid argument
Error opening output file pipe:1.
Error opening output files: Invalid argument
[AVIOContext @ 00000262cd0a9580] Statistics: 82242 bytes read, 0 seeks


    I should be able to stream the processed video to my S3 bucket, but it keeps throwing "Error: Error: Output format mp4 is not available".
