Medias (1)

Tag: intégration

Other articles (94)

  • Customising by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation options: adding a logo; adding a banner; adding a background image;

  • Submitting improvements and additional plugins

    10 April 2011

    If you have developed a new extension that adds one or more features useful to MediaSPIP, let us know, and its inclusion in the official distribution will be considered.
    You can use the development mailing list to announce it or to ask for help with writing the plugin. Since MediaSPIP is based on SPIP, you can also use SPIP's SPIP-zone mailing list to (...)

  • Emballe médias: what is it for?

    4 February 2011

    This plugin manages sites that publish documents of all types.
    It creates "medias", that is: a "media" is an article, in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a given "media" article;

On other websites (11446)

  • offloading to ffmpeg via named pipes in c#/dotnet core

    1 April 2022, by bep

    I tried to break this down to the base elements, so I hope this is clear. I want to take in a network stream; it may be one-way, or it may be a protocol that requires two-way communication, such as RTMP during its handshake.

    


    I want to pass that stream straight through to a spawned ffmpeg process. I then want to capture the output of ffmpeg; in this example I just want to pipe it out to a file. The file is not my end goal, but for simplicity, if I can get that far I think I'll be OK.

    



    


    I want the code to be as plain as possible and offload the core processing to ffmpeg. If I ask ffmpeg to output a WebRTC stream, a file, whatever, I just want to capture that. ffmpeg shouldn't be used directly, just indirectly via IncomingConnectionHandler.

    


    Only other component is OBS, which I am using to create the RTMP stream coming in.

    


    As things stand now, running this results in the following error, which I'm a little unclear on. I don't feel like I'm causing concurrent reads at any point.

    


    System.InvalidOperationException: Concurrent reads are not allowed
         at Medallion.Shell.Throw`1.If(Boolean condition, String message)
         at Medallion.Shell.Streams.Pipe.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, TimeSpan timeout, CancellationToken cancellationToken)
         at Medallion.Shell.Streams.Pipe.PipeOutputStream.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
         at System.IO.Stream.ReadAsync(Memory`1 buffer, CancellationToken cancellationToken)
         at System.IO.StreamReader.ReadBufferAsync(CancellationToken cancellationToken)
         at System.IO.StreamReader.ReadLineAsyncInternal()
         at Medallion.Shell.Streams.MergedLinesEnumerable.GetEnumeratorInternal()+MoveNext()
         at System.String.Join(String separator, IEnumerable`1 values)
         at VideoIngest.IncomingRtmpConnectionHandler.OnConnectedAsync(ConnectionContext connection) in Program.cs:line 55
         at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Infrastructure.KestrelConnection`1.ExecuteAsync()


    


    Code:

    


    namespace VideoIngest
    {
        public class IncomingRtmpConnectionHandler : ConnectionHandler
        {
            private readonly ILogger<IncomingRtmpConnectionHandler> logger;

            public IncomingRtmpConnectionHandler(ILogger<IncomingRtmpConnectionHandler> logger)
            {
                this.logger = logger;
            }

            public override async Task OnConnectedAsync(ConnectionContext connection)
            {
                logger?.LogInformation("connection started");

                var outputFileName = @"C:\Temp\bunny.mp4";

                var rtmpPassthroughPipeName = Guid.NewGuid().ToString();
                var cmdPath = @"C:\Opt\ffmpeg\bin\ffmpeg.exe";
                var cmdArgs = $"-i pipe:{rtmpPassthroughPipeName} -preset slow -c copy -f mp4 -y pipe:1";

                var cancellationToken = connection.ConnectionClosed;
                var rtmpStream = connection.Transport;

                using (var outputStream = new FileStream(outputFileName, FileMode.Create))
                using (var cmd = Command.Run(cmdPath, options: o => { o.StartInfo(i => i.Arguments = cmdArgs); o.CancellationToken(cancellationToken); }))
                {
                    // create a pipe to pass the RTMP data straight to FFMPEG.
                    // This code should be dumb to proto etc being used
                    var ffmpegPassthroughStream = new NamedPipeServerStream(rtmpPassthroughPipeName, PipeDirection.InOut, 10, PipeTransmissionMode.Byte, System.IO.Pipes.PipeOptions.Asynchronous);

                    // take the network stream and pass data to/from ffmpeg process
                    var fromFfmpegTask = ffmpegPassthroughStream.CopyToAsync(rtmpStream.Output.AsStream(), cancellationToken);
                    var toFfmpegTask = rtmpStream.Input.AsStream().CopyToAsync(ffmpegPassthroughStream, cancellationToken);

                    // take the ffmpeg process output (not stdout) into target file
                    var outputTask = cmd.StandardOutput.PipeToAsync(outputStream);

                    while (!outputTask.IsCompleted && !outputTask.IsCanceled)
                    {
                        var errs = cmd.GetOutputAndErrorLines();
                        logger.LogInformation(string.Join(Environment.NewLine, errs));

                        await Task.Delay(1000);
                    }

                    CommandResult result = cmd.Result;

                    if (result != null && result.Success)
                    {
                        logger.LogInformation("Created file");
                    }
                    else
                    {
                        logger.LogError(result.StandardError);
                    }
                }

                logger?.LogInformation("connection closed");
            }
        }

        public class Startup
        {
            public void ConfigureServices(IServiceCollection services) { }

            public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
            {
                app.Run(async (context) =>
                {
                    var log = context.RequestServices.GetRequiredService<ILogger<Startup>>();
                    await context.Response.WriteAsync("Hello World!");
                });
            }
        }

        public class Program
        {
            public static void Main(string[] args)
            {
                CreateHostBuilder(args).Build().Run();
            }

            public static IWebHostBuilder CreateHostBuilder(string[] args) =>
                WebHost
                    .CreateDefaultBuilder(args)
                    .ConfigureServices(services =>
                    {
                        services.AddLogging(options =>
                        {
                            options.AddDebug().AddConsole().SetMinimumLevel(LogLevel.Information);
                        });
                    })
                    .UseKestrel(options =>
                    {
                        options.ListenAnyIP(15666, builder =>
                        {
                            builder.UseConnectionHandler<IncomingRtmpConnectionHandler>();
                        });

                        options.ListenLocalhost(5000);

                        // HTTPS 5001
                        options.ListenLocalhost(5001, builder =>
                        {
                            builder.UseHttps();
                        });
                    })
                    .UseStartup<Startup>();
        }
    }


    Questions:

    1. Is this a valid approach; do you see any fundamental issues?
    2. Is the pipe naming correct; is the convention just pipe:someName?
    3. Any ideas on what specifically may be causing the "Concurrent reads are not allowed"?
    4. If #3 is solved, does the rest of this seem valid?
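Regarding the pipe-naming question: my understanding is that ffmpeg's pipe: protocol addresses numbered file descriptors (pipe:0 for stdin, pipe:1 for stdout), not named pipes, and a Windows named pipe is opened by its path (\\.\pipe\<name>), so -i pipe:{guid} would not connect to the NamedPipeServerStream in the code above. The pass-through architecture itself can be sketched with a POSIX FIFO, using cat as a stand-in for ffmpeg; all paths and data here are illustrative.

```shell
# Pass-through sketch: a FIFO stands in for the named pipe, `cat` for ffmpeg.
pipe=$(mktemp -u)   # illustrative path; ffmpeg would open this by name
out=$(mktemp)
mkfifo "$pipe"
# "ffmpeg" side: read the FIFO, write the output file (cf. ffmpeg -i <pipe> ... out.mp4)
cat "$pipe" > "$out" &
# network side: copy the incoming stream into the FIFO
printf 'stream-bytes' > "$pipe"
wait
cat "$out"          # prints: stream-bytes
rm -f "$pipe" "$out"
```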

  • bad audio mic recording quality with ffmpeg compared to sox

    1 July 2021, by user2355330

    I am contacting you because, after three days of searching, I am stuck on a really simple point.


    I want to record the sound of my mic on MacOS using ffmpeg.


    I managed to do it using the following command:


    ffmpeg -f avfoundation -audio_device_index 2 -i "none:-" -c:a pcm_s32l alexspeaking.wav -y -loglevel debug


    The issue is that each time I am speaking, there are cracks and pops in the sound...


    I tried sox and it gave me a perfect, crystal-clear sound, and I have no idea why... Below is the output of the sox command:


    sox -t coreaudio "G935 Gaming Headset" toto.wav -V6
    sox:      SoX v
    time:     Nov 15 2020 01:06:02
    uname:    Darwin MacBook-Pro.local 20.5.0 Darwin Kernel Version 20.5.0: Sat May  8 05:10:33 PDT 2021; root:xnu-7195.121.3~9/RELEASE_X86_64 x86_64
    compiler: gcc Apple LLVM 12.0.0 (clang-1200.0.32.27)
    arch:     1288 48 88 L
    sox INFO coreaudio: Found Audio Device "DELL U2721DE"
    sox INFO coreaudio: Found Audio Device "G935 Gaming "
    sox DBUG coreaudio: audio device did not accept 2 channels. Use 1 channels instead.
    sox DBUG coreaudio: audio device did not accept 44100 sample rate. Use 48000 instead.
    Input File     : 'G935 Gaming Headset' (coreaudio)
    Channels       : 1
    Sample Rate    : 48000
    Precision      : 32-bit
    Sample Encoding: 32-bit Signed Integer PCM
    Endian Type    : little
    Reverse Nibbles: no
    Reverse Bits   : no
    sox INFO sox: Overwriting `toto.wav'
    sox DBUG wav: Writing Wave file: Microsoft PCM format, 1 channel, 48000 samp/sec
    sox DBUG wav:         192000 byte/sec, 4 block align, 32 bits/samp
    Output File    : 'toto.wav'
    Channels       : 1
    Sample Rate    : 48000
    Precision      : 32-bit
    Sample Encoding: 32-bit Signed Integer PCM
    Endian Type    : little
    Reverse Nibbles: no
    Reverse Bits   : no
    Comment        : 'Processed by SoX'
    sox DBUG effects: sox_add_effect: extending effects table, new size = 8
    sox INFO sox: effects chain: input        48000Hz  1 channels (multi) 32 bits unknown length
    sox INFO sox: effects chain: output       48000Hz  1 channels (multi) 32 bits unknown length
    sox DBUG sox: start-up time = 0.051332
    In:0.00% 00:00:07.13 [00:00:00.00] Out:340k  [      |      ]        Clip:0    ^C
    sox DBUG input: output buffer still held 2048 samples; dropped.
    Aborted.
    sox DBUG wav: Finished writing Wave file, 1359872 data bytes 339968 samples


    I am pretty sure the issue is linked to the way the encoding is done and the params I used with ffmpeg, but I can't seem to grasp which ones I must use.


    Any ideas from the ffmpeg experts here?
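The sox log above shows the device negotiating 48 kHz, mono, 32-bit little-endian PCM, while the ffmpeg command forces neither a sample rate nor a channel count, so a resample to a mismatched rate is one plausible source of the crackle. A hypothetical variant that pins the same parameters and enlarges the input thread queue (a common fix for capture drops) might look like the sketch below; the device index and filename come from the question, and pcm_s32le is ffmpeg's encoder name for that sample format (the question's pcm_s32l appears to be a typo). Untested on this hardware.

```shell
ffmpeg -f avfoundation -thread_queue_size 4096 \
       -audio_device_index 2 -i "none:-" \
       -ar 48000 -ac 1 -c:a pcm_s32le \
       alexspeaking.wav -y
```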


  • Mixed Reality WebRTC without Signalling Server

    25 May 2021, by SilverLife

    I am trying to find a way which allows me to use Mixed Reality WebRTC (link to git-repo) without a signalling server.
    In detail, I want to create an sdp file from my ffmpeg video sender and use this sdp description in my Unity project to bypass the signalling process and receive the ffmpeg video stream.
    Is there a way of doing so with Mixed Reality WebRTC? I was already searching for the line of code where the sdp file is created within MR WebRTC, but I didn't find it.


    I am relatively new to this topic and I am not sure if this works at all, but since ffmpeg is not directly compatible with WebRTC, I was thinking that this might be the most promising approach.
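For reference, the kind of session description ffmpeg writes (via its -sdp_file option) for a plain RTP stream looks roughly like the sketch below; the address, port, and payload type are illustrative. Note that a WebRTC peer additionally expects ICE candidate and DTLS fingerprint attributes that this plain-RTP SDP does not carry, which is one reason feeding it straight into an MR WebRTC peer connection is unlikely to work without extra plumbing.

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
m=video 5004 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1
```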
