
Other articles (42)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Images

    15 May 2013
  • Mediabox: opening images in the maximum space available to the user

    8 February 2011

    Image display is constrained by the width allotted by the site's design (which depends on the theme in use), so images are shown at a reduced size. To make use of all the space available on the user's screen, a feature can be added that displays the image in a multimedia box overlaid on the rest of the content.
    To do this, the "Mediabox" plugin must be installed.
    Configuring the multimedia box
    As soon as (...)

On other sites (7243)

  • avformat/matroskaenc: Reset cur_master_element when discarding master

    16 June 2022, by Andreas Rheinhardt
    avformat/matroskaenc: Reset cur_master_element when discarding master
    

    Before this patch the muxer writes an invalid file
    (namely one in which the Projection master is a child of
    the Colour element) if the following conditions are met:
    a) The stream contains AVMasteringDisplayMetadata without primaries
    and luminance (i.e. useless AVMasteringDisplayMetadata).
    b) The stream contains AV_PKT_DATA_SPHERICAL side data.
    c) All the colour elements of the stream are equal to default
    (i.e. unknown).
    Fortunately these conditions are very unlikely to be met.

    Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@outlook.com>

    • [DH] libavformat/matroskaenc.c
  • Flutter chewie video player shows incorrect duration for HLS stream

    29 May 2022, by magackame

    I'm using SRS and ffmpeg to create an HLS video stream.
    I run SRS with this docker command:

    docker run --rm -it -p 1935:1935 -p 1985:1985 -p 8080:8080 ossrs/srs:4 ./objs/srs -c conf/docker.conf

    And then publish a stream with this ffmpeg command:

    ffmpeg -re -i video.mp4 -c copy -f flv rtmp://localhost/live/livestream

    To play the video in Flutter I use the Chewie video player (chewie: ^1.3.2 in pubspec.yaml) and this widget code:

    import 'package:video_player/video_player.dart';
    import 'package:chewie/chewie.dart';
    import 'package:flutter/material.dart';

    class RoomView extends StatefulWidget {
      const RoomView({Key? key}) : super(key: key);

      @override
      _RoomViewState createState() => _RoomViewState();
    }

    class _RoomViewState extends State<RoomView> {
      late VideoPlayerController _videoController;
      late ChewieController _controller;

      @override
      void dispose() {
        _controller.dispose();
        _videoController.dispose();

        super.dispose();
      }

      @override
      void initState() {
        super.initState();

        _videoController = VideoPlayerController.network('http://localhost/live/livestream.m3u8')
          ..initialize().then((_) {
            setState(() {});
          });

        _controller = ChewieController(videoPlayerController: _videoController);
      }

      @override
      Widget build(BuildContext context) {
        return AspectRatio(
            aspectRatio:
                _controller.aspectRatio == null ? 16 / 9 : _controller.aspectRatio!,
            child: Chewie(controller: _controller));
      }
    }

    The video plays fine and seeking with the playback bar also works, but the video duration is incorrect. I tried streaming videos of different durations (a two-minute one and a twenty-minute one), tried mp4 and mkv formats as the source, and tried mpegts as the output container (instead of flv), but all of them showed a duration of either one minute or sometimes 10 seconds, and the play bar overflows when it passes that limit (showing something like 2:11/1:05).
    When playing some public HLS streams (https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8) the video duration is shown correctly, so I guess the problem is either SRS or ffmpeg.
    The question is: what am I doing wrong, what parameters should I use for ffmpeg or SRS, and what other options could I use to provide a working HLS stream for the player?

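    A likely explanation (my assumption, not confirmed in the thread): HLS players derive the duration of a live stream from the segments currently listed in the media playlist, and a live playlist without #EXT-X-ENDLIST only advertises a sliding window of recent segments (in SRS the window size is governed by the hls_window and hls_fragment settings), so the reported duration is the window length, not the length of the source video. A minimal sketch of the playlist arithmetic a player performs:

    ```java
    import java.util.List;

    public class HlsPlaylistDuration {
        // Sum the #EXTINF durations in an HLS media playlist.
        // For a live playlist (no #EXT-X-ENDLIST) this is only the
        // sliding window the server currently advertises.
        static double totalSeconds(List<String> playlistLines) {
            double total = 0.0;
            for (String line : playlistLines) {
                if (line.startsWith("#EXTINF:")) {
                    // "#EXTINF:10.0," -> 10.0
                    String value = line.substring("#EXTINF:".length());
                    int comma = value.indexOf(',');
                    if (comma >= 0) {
                        value = value.substring(0, comma);
                    }
                    total += Double.parseDouble(value);
                }
            }
            return total;
        }

        public static void main(String[] args) {
            // Hypothetical sliding-window playlist: 6 segments of 10 s each
            List<String> playlist = List.of(
                    "#EXTM3U",
                    "#EXT-X-VERSION:3",
                    "#EXT-X-TARGETDURATION:10",
                    "#EXT-X-MEDIA-SEQUENCE:42",
                    "#EXTINF:10.0,", "seg-42.ts",
                    "#EXTINF:10.0,", "seg-43.ts",
                    "#EXTINF:10.0,", "seg-44.ts",
                    "#EXTINF:10.0,", "seg-45.ts",
                    "#EXTINF:10.0,", "seg-46.ts",
                    "#EXTINF:10.0,", "seg-47.ts");
            // ~60 s regardless of how long the source video is
            System.out.println(totalSeconds(playlist));
        }
    }
    ```

    The public Akamai stream, by contrast, is a VOD playlist that lists every segment and ends with #EXT-X-ENDLIST, which would explain why its duration displays correctly.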

  • Decode audio using ffmpeg (packet-by-packet) in Java

    27 May 2022, by quad478

    In my application, I receive an audio stream from an IP camera via RTP using Netty.
    The stream from the IP camera comes in the G.711 mu-law format; I would like to transcode it to AAC.
    I can't use files for this task as it's a live stream, and each packet needs to be decoded and delivered to the client (browser) immediately.
    For this task, I wanted to use an ffmpeg child process: when connecting to the camera, create an ffmpeg process and send each received packet to its stdin, then read the decoded packet from its stdout.
    Here is the command I run ffmpeg with:

    ffmpeg.exe -f mulaw -re -i - -f adts -

    I'm not sure whether -re should be used here, but without this option ffmpeg outputs the decoding result only after stdin is closed and the process exits.
    The problem is that I don't get anything on stdout after sending a packet to ffmpeg.

    Decoder code:

    package ru.ngslab.insentry.web.video.protocols.rtp;

    import io.netty.buffer.ByteBuf;
    import io.netty.channel.ChannelHandlerContext;
    import io.netty.handler.codec.MessageToMessageDecoder;

    import java.io.Closeable;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class RtpFfmpegDecoder extends MessageToMessageDecoder<RtpPacket> implements Closeable {

        private final Process ffmegProcess;
        private final OutputStream ffmpegOutput;
        private final InputStream ffmegInput;
        private final ExecutorService ffmpegInputReaderService = Executors.newSingleThreadExecutor();

        public RtpFfmpegDecoder() {

            //Start ffmpeg process
            ProcessBuilder ffmpegBuilder = new ProcessBuilder("ffmpeg.exe", "-f", "mulaw",
                    "-re", "-i", "-", "-f", "adts", "-").redirectError(ProcessBuilder.Redirect.INHERIT);
            try {
                ffmegProcess = ffmpegBuilder.start();
                ffmpegOutput = ffmegProcess.getOutputStream();
                ffmegInput = ffmegProcess.getInputStream();
            } catch (IOException e) {
                throw new IllegalStateException(e);
            }
        }

        @Override
        protected void decode(ChannelHandlerContext channelHandlerContext, RtpPacket rtpPacket, List<Object> list) throws Exception {

            //start reading ffmpeg output in another thread
            Future<byte[]> future = ffmpegInputReaderService.submit(this::startReadFFmpegOutput);

            //write rtp-packet bytes to ffmpeg's stdin
            ByteBuf data = rtpPacket.getData();
            byte[] rtpData = new byte[data.readableBytes()];
            data.getBytes(data.readerIndex(), rtpData);
            ffmpegOutput.write(rtpData);
            ffmpegOutput.flush();

            //waiting here for the decoding result from ffmpeg
            //blocks here
            byte[] result = future.get();
            //then process result...
        }

        private byte[] startReadFFmpegOutput() {
            try {
                //I don't know how many bytes to expect here, for test purposes I use 1024
                var bytes = new byte[1024];
                ffmegInput.read(bytes);
                return bytes;
            } catch (IOException e) {
                throw new IllegalStateException(e);
            }
        }

        @Override
        public void close() throws IOException {
            //Close streams code...
        }
    }

    This doesn't work because ffmpeg doesn't send anything after receiving the packet.
    There are no errors in the log and no output data; it just waits for the result here:

    byte[] result = future.get();

    Normally, ffmpeg only produces output after stdin is closed and the process stops.
    Does ffmpeg perhaps need to be run with some special parameters so that it outputs each received packet at once?
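    One pattern worth trying (a sketch, not a verified fix): instead of submitting a fixed 1024-byte read per packet, run a single long-lived thread that continuously drains ffmpeg's stdout into a queue, so whatever ffmpeg flushes becomes available as soon as it is written. On the ffmpeg side, the muxer option -flush_packets 1 forces output flushing, and dropping -re avoids throttling input to real time. The StreamPump name and the queue-based hand-off below are my own illustration, not part of the original code:

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class StreamPump implements Runnable {
        private final InputStream source;
        private final BlockingQueue<byte[]> chunks;

        public StreamPump(InputStream source, BlockingQueue<byte[]> chunks) {
            this.source = source;
            this.chunks = chunks;
        }

        @Override
        public void run() {
            byte[] buffer = new byte[4096];
            try {
                int n;
                // read() returns as soon as *some* bytes are available,
                // so each flushed ffmpeg write surfaces here without waiting for EOF
                while ((n = source.read(buffer)) != -1) {
                    byte[] chunk = new byte[n];
                    System.arraycopy(buffer, 0, chunk, 0, n);
                    chunks.put(chunk);
                }
            } catch (IOException | InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }

        public static void main(String[] args) throws InterruptedException {
            // Demo with an in-memory stream standing in for ffmpeg's stdout
            BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();
            InputStream fakeFfmpegOut = new ByteArrayInputStream("adts-bytes".getBytes());
            Thread pump = new Thread(new StreamPump(fakeFfmpegOut, queue));
            pump.start();
            pump.join();
            System.out.println(new String(queue.take()));
        }
    }
    ```

    In the decoder above, decode() would then enqueue the packet to stdin and a separate consumer would take chunks off the queue, rather than blocking on a per-packet Future.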

    I would be very grateful for any help.