
Media (1)
-
The Pirate Bay from Belgium
1 April 2013, by
Updated: April 2013
Language: French
Type: Image
Other articles (28)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is version 0.2 or later. If needed, contact the administrator of your MediaSPIP to find out. -
Selection of projects using MediaSPIP
2 May 2011, by
The examples below are representative elements of specific uses of MediaSPIP for particular projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an internet access point, training, and innovative projects in the field of information and communication technologies, as well as website hosting. It plays a unique and prominent role in the Brest (France) area and at the national level, among the half-dozen such associations. Its members (...) -
Selection of projects using MediaSPIP
29 April 2011, by
The examples below are representative elements of specific uses of MediaSPIP for certain projects.
Do you think you have a "remarkable" site built with MediaSPIP? Let us know here.
MediaSPIP farm @ Infini
The Infini association develops hospitality activities, an internet access point, training, innovative projects in the field of Information and Communication Technologies, and website hosting. It plays a unique role in this field (...)
On other sites (3586)
-
How do I use FFmpeg to fetch audio from a local network and decode it to PCM?
26 May 2020, by Yousef Alaqra
Currently, I have a Node.js server connected to a specific IP address on the local network (the source of the audio) to receive the audio using the VBAN protocol. VBAN basically uses UDP to send audio over the local network.

Node.js implementation:
const http = require("http").createServer();   // HTTP server for socket.io
const io = require("socket.io")(http);
const dgram = require("dgram");
const vban = require("./vban");                // VBAN packet parser (path assumed)

const PORT = 6980;               // UDP port the VBAN source sends to
const HOST = "192.168.1.230";    // local interface to bind for VBAN packets

http.listen(3000, () => {
  console.log("Server running on port 3000");
});

io.on("connection", (socket) => {
  console.log("a user connected");
  socket.on("disconnect", () => {
    console.log("user disconnected");
  });
});

// Bind the UDP socket once at startup, not inside the "connection" handler:
// binding per connection would try to re-bind the same address for every client.
const server = dgram.createSocket("udp4");

server.on("listening", () => {
  const address = server.address();
  console.log("server host", address.address);
  console.log("server port", address.port);
});

server.on("message", (message, remote) => {
  // Parse the VBAN packet and broadcast the PCM payload to all connected clients
  const audioData = vban.ProcessPacket(message);
  io.emit("audio", audioData);
});

server.bind({ address: HOST, port: PORT, exclusive: false });




Once the server receives a packet from the local network, it processes it and then emits the processed data to the client using socket.io.



An example of the processed audio data emitted from the socket and received in the Angular client:



audio {
  format: {
    channels: 2,
    sampleRate: 44100,
    interleaved: true,
    float: false,
    signed: true,
    bitDepth: 16,
    byteOrder: 'LE'
  },
  sampleRate: 44100,
  buffer: <Buffer 2e 00 ce ff 3d bd 44 b6 48 c3 32 d3 31 d4 30 dd 38 34 e5 1d c6 25 ... 974 more bytes>,
  channels: 2
}



On the client side (Angular), after receiving a packet via socket.io-client, an AudioContext is used to decode the audio and play it:



playAudio(audioData) {
  let audioCtx = new AudioContext();
  let count = 0;
  let offset = 0;
  let msInput = 1000;
  let msToBuffer = Math.max(50, msInput);
  let bufferX = 0;
  let audioBuffer;
  let prevFormat = {};
  let source;

  // NOTE: all of this state is local, so it is reset on every call,
  // and no PCM samples are ever copied into audioBuffer.
  if (!audioBuffer || Object.keys(audioData.format).some((key) => prevFormat[key] !== audioData.format[key])) {
    prevFormat = audioData.format;
    bufferX = Math.ceil(((msToBuffer / 1000) * audioData.sampleRate) / audioData.samples);
    if (bufferX < 3) {
      bufferX = 3;
    }
    audioBuffer = audioCtx.createBuffer(audioData.channels, audioData.samples * bufferX, audioData.sampleRate);
    if (source) {
      source.disconnect();
    }
    source = audioCtx.createBufferSource();
    console.log("source", source);
    source.connect(audioCtx.destination);
    source.loop = true;
    source.buffer = audioBuffer;
    source.start();
  }
}




Regardless, the audio isn't playing on the client side; something is wrong, and this isn't the correct implementation.
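For contrast, a minimal sketch of what playing one emitted packet could look like, assuming the interleaved signed 16-bit PCM described by the format object above; the deinterleavePcm16 and playPacket names are my own, not part of the original code:

```javascript
// Convert interleaved signed 16-bit PCM into per-channel Float32 planes,
// which is the sample layout the Web Audio API's AudioBuffer expects.
function deinterleavePcm16(int16, channels) {
  const frames = int16.length / channels;
  const planes = [];
  for (let ch = 0; ch < channels; ch++) {
    const plane = new Float32Array(frames);
    for (let i = 0; i < frames; i++) {
      plane[i] = int16[i * channels + ch] / 32768; // normalise to [-1, 1)
    }
    planes.push(plane);
  }
  return planes;
}

// Browser-side sketch: schedule one packet's worth of audio on an AudioContext.
function playPacket(audioCtx, audioData) {
  const pcm = new Int16Array(
    audioData.buffer.buffer,
    audioData.buffer.byteOffset,
    audioData.buffer.byteLength / 2
  );
  const planes = deinterleavePcm16(pcm, audioData.channels);
  const audioBuffer = audioCtx.createBuffer(
    audioData.channels,
    planes[0].length,
    audioData.sampleRate
  );
  planes.forEach((plane, ch) => audioBuffer.copyToChannel(plane, ch));
  const source = audioCtx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(audioCtx.destination);
  source.start();
}
```

Playing packets back-to-back this way will still glitch at packet boundaries, which is part of why a server-side FFmpeg pipeline is the simpler route.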



Brad mentioned in the comments below that I can implement this correctly and with less complexity using an FFmpeg child process, and I'm very interested to know how to fetch the audio locally using FFmpeg.


-
NodeMediaServer MP4: change default video resolution of 540p
11 August 2021, by Farooq Zaman
I have set up an NGINX RTMP server, and the purpose is to store videos streamed from mobile devices in MP4 format for later analysis. Although the mobile devices stream video in 720p, NodeMediaServer always stores the video in 540p. How can I change this behaviour? The NodeMediaServer configuration follows:


const nodeMediaServerConfig = {
  rtmp: {
    port: 1936,
    chunk_size: 60000,
    gop_cache: true,
    ping: 60,
    ping_timeout: 10,
  },
  http: {
    port: 8000,
    mediaroot: './media',
    allow_origin: '*',
  },
  trans: {
    ffmpeg: '/usr/bin/ffmpeg',
    tasks: [
      {
        app: 'live',
        vcParam: [
          '-c:v', 'libx264',
          '-vf', 'scale=720:-1',
          '-b:v', '2800k',
          '-bufsize', '4200k',
          '-preset', 'fast',
        ],
        ac: 'aac',
        acParam: ['-b:a', '128k', '-ar', 48000],
        mp4: true,
        mp4Flags: '[movflags=faststart]',
      },
    ],
  },
};



Any help in this matter is highly appreciated.
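A side note on the vcParam above, offered as an assumption rather than a confirmed diagnosis of the 540p output: in FFmpeg's scale filter the first value is the output width, so scale=720:-1 produces a frame 720 pixels wide (405 pixels tall for 16:9 input), not 720p. If the goal is to preserve a 720-pixel height, the filter would be written the other way around:

```javascript
// Hypothetical corrected transcode arguments: fix the HEIGHT at 720 and let
// FFmpeg derive the width; -2 keeps it divisible by 2, as libx264 requires.
const vcParam = [
  '-c:v', 'libx264',
  '-vf', 'scale=-2:720',   // height 720, width computed from the aspect ratio
  '-b:v', '2800k',
  '-bufsize', '4200k',
  '-preset', 'fast',
];
module.exports = { vcParam };
```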


-
MPlayer not playing HTTP video stream for a specific type of content from the same source
2 August 2017, by Joel
Implementation overview
Before I dive into the question, I need to establish the context from the start.
I am currently implementing a cloud gaming solution utilising the following :
- Nvidia Capture SDK
- Nvidia Video Codec SDK
- FFmpeg
- MPlayer
The Nvidia Capture SDK is used to produce a shim layer (via DXGI.dll) that intercepts and captures DirectX frames so they can be passed to the Nvidia Video Codec SDK and encoded as H.264 video. All of this is done within DXGI.dll.
I then pass the encoded video to FFmpeg. FFmpeg acts as an HTTP server that broadcasts the video stream for MPlayer to play.
Problem
I am running an Unreal Engine 4 game called "Epic Survival Game Series". The Nvidia Capture SDK’s shim layer kicks in when the game starts, and FFmpeg launches the HTTP server to start streaming. However, when I start MPlayer to receive the stream, MPlayer stops at the following message, and nothing happens after that:
libavformat version 57.72.101 (internal)
Stream not seekable!
H264-ES file format detected
The thing is, when I play the same stream using ffplay, it works without any issue. And this is not the only quirk: when I launch a different Unreal Engine 4 game called "First Person Shooter Template", MPlayer can play that video as well. Also, if I modify the Survival Game to load directly into the game level, skipping the menu, MPlayer is also able to play the video.
Using FFmpeg to write the video to a file instead of streaming it also works, no matter which game or whether I load into the menu or the game level.
This is very strange and I do not have any idea why this is the case. Any ideas ?
Edit: One strange quirk I forgot to mention is that MPlayer does manage to play the video on very rare occasions - maybe once every 10-20 tries or so.
Implementation Details
Additional details of how certain parts are implemented.
(1) For the Nvidia Capture SDK, I use the DXIFRShim example provided in the SDK.
(2) For the Nvidia Video Codec SDK, I use the NvEncoder example provided in the SDK.
(3) The FFmpeg command I use is this:
ffmpeg -i - -listen 1 -threads 1 -vcodec copy -preset ultrafast -an -tune zerolatency -f h264 http://address:port
The encoded frames from the Nvidia Video Codec SDK are piped to FFmpeg.
(4) The MPlayer command I use is this:
mplayer -quiet -vo gl -nosound -benchmark http://address:port
Things I’ve tried
I suspect MPlayer to be the cause, so I’ve only played around with MPlayer parameters:
mplayer http://address:port
mplayer -fps 30 -vo gl -nosound -benchmark http://address:port
mplayer -fps 30 -screenw 720 -screenh 1280 -vo gl -nosound -benchmark http://address:port
mplayer -fps 30 -vo directx -nosound -benchmark http://address:port
mplayer -fps 30 -vo null -nosound -benchmark http://address:port
None of these worked.
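One hedged addition that is not in the original post: since MPlayer only recognises the input as a raw H.264 elementary stream after probing (the "H264-ES file format detected" line), forcing that demuxer explicitly, together with a small cache for the non-seekable HTTP input, is worth trying; I have not verified it against this exact setup:

```shell
# Force the raw-H.264 elementary-stream demuxer instead of relying on probing,
# and give the demuxer a small cache since the HTTP input is not seekable.
mplayer -demuxer h264es -fps 30 -cache 512 -vo gl -nosound http://address:port
```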