
Other articles (42)
-
Keeping control of your media in your hands
13 April 2011
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible, and development is based on expanding the (...)
-
Images
15 May 2013
-
Mediabox: opening images in the maximum space available to the user
8 February 2011
Image viewing is restricted by the width allowed by the site's design (which depends on the theme in use), so images are displayed at a reduced size. To take advantage of all the space available on the user's screen, a feature can be added that displays the image in a multimedia box appearing above the rest of the content.
To do this, the "Mediabox" plugin must be installed.
Configuring the multimedia box
As soon as (...)
On other sites (7243)
-
avformat/matroskaenc: Reset cur_master_element when discarding master
16 June 2022, by Andreas Rheinhardt
avformat/matroskaenc: Reset cur_master_element when discarding master
Before this patch the muxer writes an invalid file
(namely one in which the Projection master is a child of
the Colour element) if the following conditions are met:
a) The stream contains AVMasteringDisplayMetadata without primaries
and luminance (i.e. useless AVMasteringDisplayMetadata).
b) The stream contains AV_PKT_DATA_SPHERICAL side data.
c) All the colour elements of the stream are equal to default
(i.e. unknown).
Fortunately these conditions are very unlikely to be met.

Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@outlook.com>
-
Flutter chewie video player shows incorrect duration for HLS stream
29 May 2022, by magackame
I'm using SRS and ffmpeg to create an HLS video stream.

I run SRS using this Docker command:

docker run --rm -it -p 1935:1935 -p 1985:1985 -p 8080:8080 ossrs/srs:4 ./objs/srs -c conf/docker.conf



And then publish a stream using this ffmpeg command:


ffmpeg -re -i video.mp4 -c copy -f flv rtmp://localhost/live/livestream



To play back the video in Flutter I use the Chewie video player (chewie: ^1.3.2 in pubspec.yaml) and this widget code:

import 'package:video_player/video_player.dart';
import 'package:chewie/chewie.dart';
import 'package:flutter/material.dart';

class RoomView extends StatefulWidget {
  const RoomView({Key? key}) : super(key: key);

  @override
  _RoomViewState createState() => _RoomViewState();
}

class _RoomViewState extends State<RoomView> {
  late VideoPlayerController _videoController;
  late ChewieController _controller;

  @override
  void dispose() {
    _controller.dispose();
    _videoController.dispose();

    super.dispose();
  }

  @override
  void initState() {
    super.initState();

    _videoController =
        VideoPlayerController.network('http://localhost/live/livestream.m3u8')
          ..initialize().then((_) {
            setState(() {});
          });

    _controller = ChewieController(videoPlayerController: _videoController);
  }

  @override
  Widget build(BuildContext context) {
    return AspectRatio(
        aspectRatio:
            _controller.aspectRatio == null ? 16 / 9 : _controller.aspectRatio!,
        child: Chewie(controller: _controller));
  }
}


The video plays fine and seeking with the playback bar also works, but the reported video duration is incorrect. I tried streaming videos of different durations (a two-minute one and a twenty-minute one), tried mp4 and mkv as source formats, and tried streaming with mpegts as the output container (instead of flv), but all of them showed a duration of either one minute or sometimes 10 seconds, and the playback bar overflows when it passes that limit (showing something like 2:11/1:05).

When playing some public HLS streams (https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8) the video duration is shown correctly, so I guess the problem is with either SRS or ffmpeg.

The question is: what am I doing wrong, what parameters should I use for ffmpeg or SRS, and what other options could I use to provide a working HLS stream for the player?
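
A note on the symptom above: for a live HLS playlist (one without an #EXT-X-ENDLIST tag), players typically report the combined duration of the segments currently listed in the sliding playlist window, not the length of the source file. In SRS that window is controlled by the hls_fragment and hls_window vhost directives; the snippet below is an illustrative sketch rather than the shipped conf/docker.conf, but the directive names are real:

# Sketch of the relevant SRS vhost settings (values are examples):
# hls_fragment - target segment length in seconds
# hls_window   - total seconds of segments kept in the live playlist
vhost __defaultVhost__ {
    hls {
        enabled         on;
        hls_fragment    10;
        hls_window      60;
    }
}

With a 60-second window the player would show roughly one minute of seekable duration no matter how long the source video is, which matches the behaviour described above.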

-
Decode audio using ffmpeg (packet-by-packet) in Java
27 May 2022, by quad478
In my application, I receive an audio stream from an IP camera via RTP using Netty.
The stream from the IP camera comes in "G.711 mulaw" format, and I would like to transcode it to "AAC".
I can't use files in this task as it's a live stream, and each packet needs to be decoded and delivered to the client (browser) immediately.
For this task, I wanted to use an ffmpeg child process:
when connecting to the camera, create an ffmpeg process, send each received packet to its stdin, and then receive the transcoded packet from its stdout.
Here is the command I run ffmpeg with:


"ffmpeg.exe -f mulaw -re -i - -f adts -"



I'm not sure if "-re" should be used here, but without this option, ffmpeg outputs the decode result only after stdin is closed and the process exits.
The problem is that I don't get anything on stdout after sending the packet to ffmpeg.


Decoder code:


package ru.ngslab.insentry.web.video.protocols.rtp;

import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelHandlerContext;
import io.netty.handler.codec.MessageToMessageDecoder;

import java.io.Closeable;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class RtpFfmpegDecoder extends MessageToMessageDecoder<RtpPacket> implements Closeable {

    private final Process ffmpegProcess;
    private final OutputStream ffmpegOutput;
    private final InputStream ffmpegInput;
    private final ExecutorService ffmpegInputReaderService = Executors.newSingleThreadExecutor();

    public RtpFfmpegDecoder() {

        // Start the ffmpeg process: raw mulaw on stdin, ADTS/AAC on stdout
        ProcessBuilder ffmpegBuilder = new ProcessBuilder("ffmpeg.exe", "-f", "mulaw",
                "-re", "-i", "-", "-f", "adts", "-").redirectError(ProcessBuilder.Redirect.INHERIT);
        try {
            ffmpegProcess = ffmpegBuilder.start();
            ffmpegOutput = ffmpegProcess.getOutputStream();
            ffmpegInput = ffmpegProcess.getInputStream();
        } catch (IOException e) {
            throw new IllegalStateException(e);
        }
    }

    @Override
    protected void decode(ChannelHandlerContext channelHandlerContext, RtpPacket rtpPacket,
                          List<Object> out) throws Exception {

        // Start reading ffmpeg's output in another thread
        Future<byte[]> future = ffmpegInputReaderService.submit(this::startReadFFmpegOutput);

        // Write the RTP packet bytes to ffmpeg's stdin
        ByteBuf data = rtpPacket.getData();
        byte[] rtpData = new byte[data.readableBytes()];
        data.getBytes(data.readerIndex(), rtpData);
        ffmpegOutput.write(rtpData);
        ffmpegOutput.flush();

        // Waiting here for the decoding result from ffmpeg
        // (blocks here)
        byte[] result = future.get();
        // ...then process the result
    }

    private byte[] startReadFFmpegOutput() {
        try {
            // I don't know how many bytes to expect here; 1024 is used for
            // test purposes, and read() reports how many bytes are valid
            var bytes = new byte[1024];
            int read = ffmpegInput.read(bytes);
            if (read == -1) {
                throw new IllegalStateException("ffmpeg closed its stdout");
            }
            return Arrays.copyOf(bytes, read);
        } catch (IOException e) {
            throw new IllegalStateException(e);
        }
    }

    @Override
    public void close() throws IOException {
        // Close streams code...
    }
}


This doesn't work because ffmpeg doesn't send anything back after receiving the packet.
There are no errors in the log and no output data; the code just waits for the result here:


byte[] result = future.get();



Normally, ffmpeg only outputs after stdin is closed and the process stops.
Maybe it is necessary to run ffmpeg with some special parameters so that it outputs each received packet at once?
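
Part of the delay is inherent to the transcode itself: the AAC encoder consumes 1024 samples per frame, which is 128 ms of 8 kHz audio, so ffmpeg cannot produce one output packet per 20 ms G.711 packet no matter what. The rest is muxer buffering, which real ffmpeg options can reduce. A sketch of an adjusted command (the -ar 8000 -ac 1 values assume standard 8 kHz mono G.711; -re is dropped because the camera already delivers the stream in real time):

ffmpeg -fflags nobuffer -f mulaw -ar 8000 -ac 1 -i - -c:a aac -flush_packets 1 -f adts -

Here -flush_packets 1 asks the ADTS muxer to flush after every packet instead of accumulating output.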


I would be very grateful for any help
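
One structural issue, separate from the ffmpeg flags: pairing one stdout read with each stdin write assumes ffmpeg replies packet-for-packet, which it never guarantees. A more robust pattern is to drain stdout on a single long-lived thread and forward whatever arrives. A minimal sketch (the onAacChunk callback is a hypothetical placeholder, not part of the original code):

import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;
import java.util.function.Consumer;

public final class FfmpegStdoutPump {

    // Continuously drains ffmpeg's stdout, handing each chunk to the
    // caller as soon as it arrives, so reads are decoupled from writes.
    public static Thread start(InputStream ffmpegStdout, Consumer<byte[]> onAacChunk) {
        Thread pump = new Thread(() -> {
            byte[] buf = new byte[4096];
            int n;
            try {
                // read() blocks until ffmpeg flushes data; n is the number
                // of valid bytes and may be smaller than the buffer
                while ((n = ffmpegStdout.read(buf)) != -1) {
                    onAacChunk.accept(Arrays.copyOf(buf, n));
                }
            } catch (IOException e) {
                // the pipe broke or ffmpeg exited; stop pumping
            }
        }, "ffmpeg-stdout-pump");
        pump.setDaemon(true);
        pump.start();
        return pump;
    }
}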