
Media (1)

Keyword: - Tags -/iphone

Other articles (111)

  • Automatic installation script for MediaSPIP

    25 April 2011, by

    To work around installation difficulties, mainly due to server-side software dependencies, an "all-in-one" bash installation script was created to make this step easier on a server running a compatible Linux distribution.
    To use it, you must have SSH access to your server and a "root" account, which will make it possible to install the dependencies. Contact your hosting provider if you do not have these.
    The documentation on how to use the installation script (...)

  • Adding user-specific information and other author-related behaviour changes

    12 April 2011, by

    The simplest way to add information to authors is to install the Inscription3 plugin. It also makes it possible to change certain user-related behaviours (refer to its documentation for more information).
    It is also possible to add fields to authors by installing the plugins "champs extras 2" and "Interface pour champs extras".

  • What exactly does this script do?

    18 January 2011, by

    This script is written in bash, so it can easily be used on any server.
    It is only compatible with a specific list of distributions (see List of compatible distributions).
    Installing MediaSPIP's dependencies
    Its main role is to install all the software dependencies required on the server side, namely:
    The basic tools needed to install the rest of the dependencies; the development tools: build-essential (via APT from the official repositories); (...)

On other sites (8992)

  • Python subprocess. Linux vs Windows

    23 May 2021, by Chris

    I wonder if someone can help explain what is happening?

    


    I run two subprocesses, one for ffprobe and one for ffmpeg.

    


    popen = subprocess.Popen(ffprobecmd, stderr=subprocess.PIPE, shell=True)


    


    And

    


    popen = subprocess.Popen(ffmpegcmd, shell=True, stdout=subprocess.PIPE)


    


    On both Windows and Linux the ffprobe command fires, finishes and disappears from Task Manager/htop. But only on Windows does the same happen to ffmpeg; on Linux the process remains visible in htop...
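
    For reference, a minimal sketch for telling apart a child that is genuinely still running from one that has exited but not been reaped (my guess at what might be lingering in htop), using only the standard subprocess API:

import subprocess

# stand-in command; in the real code this is the ffmpegcmd string shown below
ffmpegcmd = "ffmpeg -version"

popen = subprocess.Popen(ffmpegcmd, shell=True, stdout=subprocess.PIPE)
popen.stdout.read()  # drain the pipe so the child can finish

if popen.poll() is None:
    print("ffmpeg is still running")
else:
    # the child has exited; wait() reaps it so it no longer shows up
    # as <defunct> (a zombie) in htop
    print("ffmpeg exited with return code", popen.wait())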

    



    


    Can anyone explain what is going on, whether it matters, and how I can stop it from happening, please?

    


    EDIT: Here are the commands...

    


    ffprobecmd = 'ffprobe' + \
' -user_agent "' + request.headers['User-Agent'] + '"' + \
' -headers "Referer: ' + request.headers['Referer'] + '"' + \
' -timeout "5000000"' + \
' -v error -select_streams v -show_entries stream=height -of default=nw=1:nk=1' + \
' -i "' + request.url + '"'


    


    and

    


    ffmpegcmd = 'ffmpeg' + \
' -re' + \
' -user_agent "' + r.headers['User-Agent'] + '"' + \
' -headers "Referer: ' + r.headers['Referer'] + '"' + \
' -timeout "10"' + \
' -i "' + r.url + '"' + \
' -c copy' + \
' -f mpegts' + \
' pipe:'
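
    As an aside, the same invocation can be built as an argument list, which removes the shell quoting and the extra shell process that shell=True introduces; this is just a sketch, and build_ffmpeg_cmd is a helper name I made up here:

import subprocess

def build_ffmpeg_cmd(user_agent, referer, url):
    """Same ffmpeg invocation as above, expressed as an argument list."""
    return [
        "ffmpeg",
        "-re",
        "-user_agent", user_agent,
        "-headers", "Referer: " + referer,
        "-timeout", "10",
        "-i", url,
        "-c", "copy",
        "-f", "mpegts",
        "pipe:",
    ]

# usage (no shell=True needed when passing a list):
# popen = subprocess.Popen(build_ffmpeg_cmd(ua, referer, url), stdout=subprocess.PIPE)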


    


    EDIT: Here is an example that behaves as described...

    


    import flask
from flask import Response
import subprocess

app = flask.Flask(__name__)

@app.route('/', methods=['GET'])
def go():
    def stream(ffmpegcmd):
        popen = subprocess.Popen(ffmpegcmd, stdout=subprocess.PIPE, shell=True)
        try:
            for stdout_line in iter(popen.stdout.readline, ""):
                yield stdout_line
        except GeneratorExit:
            raise

    url = "https://bitdash-a.akamaihd.net/content/MI201109210084_1/m3u8s/f08e80da-bf1d-4e3d-8899-f0f6155f6efa.m3u8"

    ffmpegcmd = 'ffmpeg' + \
                ' -re' + \
                ' -timeout "10"' + \
                ' -i "' + url + '"' + \
                ' -c copy' + \
                ' -f mpegts' + \
                ' pipe:'
    return Response(stream(ffmpegcmd))

if __name__ == '__main__':
    app.run(host= '0.0.0.0', port=5000)
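
    For completeness, one variation of the stream() generator I am considering is to terminate and reap the child when the generator is closed (for example when the client disconnects); this is only a sketch, not a confirmed fix:

def stream(ffmpegcmd):
    popen = subprocess.Popen(ffmpegcmd, stdout=subprocess.PIPE, shell=True)
    try:
        # b"" sentinel because stdout is a binary pipe here
        for stdout_line in iter(popen.stdout.readline, b""):
            yield stdout_line
    finally:
        # runs on normal EOF and on GeneratorExit (client gone):
        # ask the child to stop, then reap it so nothing lingers in htop.
        # Caveat: with shell=True the signal goes to the shell process,
        # not necessarily to ffmpeg itself; passing the command as a list
        # avoids that indirection.
        popen.terminate()
        popen.wait()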


    


  • Low latency video streaming on android

    17 May 2021, by Louis Blenner

    I'd like to be able to stream the video from my webcam to an Android app with a latency below 500ms, on my local network.

    


    To capture and send the video over the network, I use ffmpeg.

    


    ffmpeg -f v4l2 -i /dev/video0 -preset ultrafast -tune zerolatency -vcodec libx264 -an -vf format=yuv420p -f mpegts  udp://192.168.1.155:5000


    


    This command takes the webcam as input, converts it, and sends it to a device using the mpegts protocol.

    


    I am able to read the video on another PC with a latency below 500 ms, using commands like

    


    gst-launch-1.0 -v udpsrc port=5000 ! video/mpegts ! tsdemux ! h264parse ! avdec_h264 ! fpsdisplaysink sync=false


    


    or

    


    mpv udp://0.0.0.0:5000 --no-cache --untimed --no-demuxer-thread --video-sync=audio --vd-lavc-threads=1 


    


    So it is possible to have this range of latency.
    I'd like to have the same thing on Android.

    


    Here are my tries to do that.

    


    ExoPlayer

    


    After looking at the different players available in Android Studio, it seems like ExoPlayer is the go-to choice.
    I tried the different options indicated in the live-streaming documentation, but I always end up with a stream that takes seconds to start and has a latency of several seconds.
    I tried adding a Button to seek to the default position of the window, but it results in several seconds of loading.

    


    DefaultExtractorsFactory extractorsFactory =
                new DefaultExtractorsFactory()
                        .setTsExtractorFlags(DefaultTsPayloadReaderFactory.FLAG_IGNORE_AAC_STREAM);

        player = new SimpleExoPlayer.Builder(this)
                .setMediaSourceFactory(
                        new DefaultMediaSourceFactory(this, extractorsFactory))
                .setLoadControl(new DefaultLoadControl.Builder()
                        .setBufferDurationsMs(DefaultLoadControl.DEFAULT_MIN_BUFFER_MS, DefaultLoadControl.DEFAULT_MAX_BUFFER_MS, 200, 200)
                        .build())
                .build();
        MyPlayerView playerView = findViewById(R.id.player_view);
        // Bind the player to the view.
        playerView.setPlayer(player);
        // Build the media item.
        MediaItem mediaItem = new MediaItem.Builder()
                .setUri(Uri.parse("udp://0.0.0.0:5000"))
                .setLiveMaxOffsetMs(500)
                .setLiveTargetOffsetMs(0)
                .setLiveMinOffsetMs(0)
                .build();
        // Set the media item to be played.
        player.setMediaItem(mediaItem);
        // Prepare the player.
        player.setPlayWhenReady(true);
        player.prepare();
        //player.seekToDefaultPosition();


    


    This issue is about the same problem, and the conclusion was that ExoPlayer is not a good fit for this use case.

    


    


    I'll be honest, ultra low-latency like this isn't ExoPlayer's main use-case

    


    


    VLC

    


    Another try was to use the VLC library.
    But I was unable to get the same low-latency stream with VLC as with the two previous examples.
    I tried changing VLC's preferences to stream as fast as possible.

    


    Input/Codecs -> x264 preset: ultrafast - zerolatency
Input/Codecs -> Access Module: UDP input
Input/Codecs -> Clock Jitter: 500
Audio: disable audio


    


    I also tried reducing the different buffers.
    However, I still have a latency of more than one second with that.

    


    GStreamer

    


    Another try was to create a React Native project in order to use the different players available here.
    One player that seemed promising was react-native-gstreamer, because it uses GStreamer, which is able to stream with low latency (the gst-launch command).
    But the library is now outdated.

    


    Question

    


    There were other tries, but none were successful.
    Is there a problem with one of my approaches?
    And if not, is there a player on Android (that I missed) which can achieve a low-latency stream like GStreamer or mpv on Linux?

    


  • offloading to ffmpeg via named pipes in c#/dotnet core

    1 April 2022, by bep

    I tried to break this down to the basic elements, so I hope this is clear. I want to take in a network stream; it may be one-way, or it may be a protocol that requires two-way communication, such as RTMP during its handshake.

    


    I want to pass that stream straight through to a spawned FFMPEG process. I then want to capture the output of FFMPEG; in this example I just want to pipe it out to a file. The file is not my end goal, but for simplicity, if I can get that far I think I'll be OK.

    



    


    I want the code to be as plain as possible and to offload the core processing to FFMPEG. If I ask FFMPEG to output a WebRTC stream, a file, whatever, I just want to capture that. FFMPEG shouldn't be used directly, only indirectly via the IncomingConnectionHandler.

    


    The only other component is OBS, which I am using to create the incoming RTMP stream.

    


    As things stand now, running this results in the following error, which I'm a little unclear on. I don't feel like I'm causing concurrent reads at any point.

    


    System.InvalidOperationException: Concurrent reads are not allowed
         at Medallion.Shell.Throw`1.If(Boolean condition, String message)
         at Medallion.Shell.Streams.Pipe.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, TimeSpan timeout, CancellationToken cancellationToken)
         at Medallion.Shell.Streams.Pipe.PipeOutputStream.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
         at System.IO.Stream.ReadAsync(Memory`1 buffer, CancellationToken cancellationToken)
         at System.IO.StreamReader.ReadBufferAsync(CancellationToken cancellationToken)
         at System.IO.StreamReader.ReadLineAsyncInternal()
         at Medallion.Shell.Streams.MergedLinesEnumerable.GetEnumeratorInternal()+MoveNext()
         at System.String.Join(String separator, IEnumerable`1 values)
         at VideoIngest.IncomingRtmpConnectionHandler.OnConnectedAsync(ConnectionContext connection) in Program.cs:line 55
         at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Infrastructure.KestrelConnection`1.ExecuteAsync()


    


    Code:

    


namespace VideoIngest
{
    public class IncomingRtmpConnectionHandler : ConnectionHandler
    {
        private readonly ILogger<IncomingRtmpConnectionHandler> logger;

        public IncomingRtmpConnectionHandler(ILogger<IncomingRtmpConnectionHandler> logger)
        {
            this.logger = logger;
        }

        public override async Task OnConnectedAsync(ConnectionContext connection)
        {
            logger?.LogInformation("connection started");

            var outputFileName = @"C:\Temp\bunny.mp4";

            var rtmpPassthroughPipeName = Guid.NewGuid().ToString();
            var cmdPath = @"C:\Opt\ffmpeg\bin\ffmpeg.exe";
            var cmdArgs = $"-i pipe:{rtmpPassthroughPipeName} -preset slow -c copy -f mp4 -y pipe:1";

            var cancellationToken = connection.ConnectionClosed;
            var rtmpStream = connection.Transport;

            using (var outputStream = new FileStream(outputFileName, FileMode.Create))
            using (var cmd = Command.Run(cmdPath, options: o => { o.StartInfo(i => i.Arguments = cmdArgs); o.CancellationToken(cancellationToken); }))
            {
                // create a pipe to pass the RTMP data straight to FFMPEG. This code should be dumb to proto etc being used
                var ffmpegPassthroughStream = new NamedPipeServerStream(rtmpPassthroughPipeName, PipeDirection.InOut, 10, PipeTransmissionMode.Byte, System.IO.Pipes.PipeOptions.Asynchronous);

                // take the network stream and pass data to/from ffmpeg process
                var fromFfmpegTask = ffmpegPassthroughStream.CopyToAsync(rtmpStream.Output.AsStream(), cancellationToken);
                var toFfmpegTask = rtmpStream.Input.AsStream().CopyToAsync(ffmpegPassthroughStream, cancellationToken);

                // take the ffmpeg process output (not stdout) into target file
                var outputTask = cmd.StandardOutput.PipeToAsync(outputStream);

                while (!outputTask.IsCompleted && !outputTask.IsCanceled)
                {
                    var errs = cmd.GetOutputAndErrorLines();
                    logger.LogInformation(string.Join(Environment.NewLine, errs));

                    await Task.Delay(1000);
                }

                CommandResult result = cmd.Result;

                if (result != null && result.Success)
                {
                    logger.LogInformation("Created file");
                }
                else
                {
                    logger.LogError(result.StandardError);
                }
            }

            logger?.LogInformation("connection closed");
        }
    }

    public class Startup
    {
        public void ConfigureServices(IServiceCollection services) { }

        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            app.Run(async (context) =>
            {
                var log = context.RequestServices.GetRequiredService>();
                await context.Response.WriteAsync("Hello World!");
            });
        }
    }

    public class Program
    {
        public static void Main(string[] args)
        {
            CreateHostBuilder(args).Build().Run();
        }

        public static IWebHostBuilder CreateHostBuilder(string[] args) =>
            WebHost
                .CreateDefaultBuilder(args)
                .ConfigureServices(services =>
                {
                    services.AddLogging(options =>
                    {
                        options.AddDebug().AddConsole().SetMinimumLevel(LogLevel.Information);
                    });
                })
                .UseKestrel(options =>
                {
                    options.ListenAnyIP(15666, builder =>
                    {
                        builder.UseConnectionHandler<IncomingRtmpConnectionHandler>();
                    });

                    options.ListenLocalhost(5000);

                    // HTTPS 5001
                    options.ListenLocalhost(5001, builder =>
                    {
                        builder.UseHttps();
                    });
                })
                .UseStartup<Startup>();
    }
}

    Questions:

      1. Is this a valid approach, do you see any fundamental issues?
      2. Is the pipe naming correct, is the convention just pipe:someName?
      3. Any ideas on what specifically may be causing the "Concurrent reads are not allowed"?
      4. If #3 is solved, does the rest of this seem valid?