
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (65)
-
Sites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP. You can of course add your own using the form at the bottom of the page.
-
Changing the publication date
21 June 2013, by
How do you change the publication date of a media item?
You first need to add a "Publication date" field to the relevant form mask:
Administer > Form mask configuration > Select "A media item"
Under "Fields to add", check "Publication date"
Click "Save" at the bottom of the page
-
Customising with your own logo, banner or background image
5 September 2013, by
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
On other sites (9095)
-
Save video to disk from WebRTC MediaStream in Node
27 November 2020, by SAGBO Aimé
I'm building an app where the user connects to the server over WebRTC (I'm using the simple-peer library on both the server and the client to set up the peer-to-peer connection).
Once the client and the server are connected, the client app streams the user's camera and microphone to the server.


Now, I want to save the streamed data to the filesystem server-side as an MP4 video file.


I've heard that ffmpeg and fluent-ffmpeg can achieve this, but I don't know how to use them.


- Server-side code to set up the peer connection:

const Peer = require("simple-peer");
const wrtc = require("wrtc");

const peer = new Peer({ initiator: false, wrtc: wrtc, trickle: false });

peer.on("error", (err: any) => console.log("error", err));

 peer.on("signal", (data: any) => {
 if (data.type === "offer" || data.type === "answer")
 dispatchMessage(JSON.stringify(data));
 // if (data.renegotiate || data.transceiverRequest) return;
 });

 peer.on("connect", () => {
 console.log("CONNECTED");
 peer.send(JSON.stringify("HELLO DEER PEER FROM SERVER"));
 });

 peer.on("data", (data: any) => {
 console.log("data: ", data);
 });

 peer.on("stream", (stream: MediaStream) => {
 console.log("-------Stream received", stream);
 });

 peer.on("track", (track: MediaStreamTrack) => {
 console.log("-------trackEvent:", track);
 });



- Client-side code:

const stream = await window.navigator.mediaDevices.getUserMedia({
  video: { width: { ideal: 4096 }, height: { ideal: 2160 } },
  audio: true,
});

const p = new SimplePeer({
  initiator: isInitiator,
  trickle: false
});

stream.getTracks().forEach(track => p.addTrack(
  track,
  stream
));

// Here I set up the listeners for the peer connection
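For the MP4-saving step described above, one possible approach (not confirmed by the question) is to tap the incoming video track with node-webrtc's nonstandard RTCVideoSink and pipe the raw I420 frames into an ffmpeg child process that encodes them to MP4. The sketch below is only an illustration under those assumptions: the frame rate, the output path and the omission of audio (which would need an RTCAudioSink and a second input) are all placeholders.

// Sketch only: assumes node-webrtc's nonstandard RTCVideoSink and an ffmpeg binary on PATH.
const { spawn } = require("child_process");
const { RTCVideoSink } = require("wrtc").nonstandard;

peer.on("track", (track) => {
  if (track.kind !== "video") return;

  const sink = new RTCVideoSink(track);
  let ffmpeg = null;

  sink.onframe = ({ frame }) => {
    // Start ffmpeg lazily, once the first frame tells us the resolution.
    if (!ffmpeg) {
      ffmpeg = spawn("ffmpeg", [
        "-y",
        "-f", "rawvideo",
        "-pix_fmt", "yuv420p",
        "-s", `${frame.width}x${frame.height}`,
        "-r", "30",            // assumed frame rate
        "-i", "-",             // raw I420 frames arrive on stdin
        "-c:v", "libx264",
        "recording.mp4",       // placeholder output path
      ]);
    }
    // frame.data holds the I420 pixel data for one frame.
    ffmpeg.stdin.write(Buffer.from(frame.data));
  };

  peer.on("close", () => {
    sink.stop();
    if (ffmpeg) ffmpeg.stdin.end(); // let ffmpeg finalize the MP4
  });
});

fluent-ffmpeg can wrap the same arguments if its API is preferred over a raw spawn.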



-
ffmpeg legitimate decoding errors
20 July 2017, by Gideon Oduro
My issue is as follows: I'm sending H.264-encoded video, captured with WebRTC, over a WebSocket. The idea is to perform server-side analysis and object tracking.
navigator.mediaDevices.getUserMedia(constraint).then((stream) => {
  isVideoElement(target, stream)
  mediaRecorder = recorder(stream, {mimeType: 'video/webm; codecs=H264'})
  mediaRecorder.ondataavailable = (blob) => socket.send(blob.data)
  mediaRecorder.start(2000)
})

On the server side, the data is received as a ByteBuffer:
override fun handleBinaryMessage(session: WebSocketSession, msg: BinaryMessage) {
  analysis(msg.payload)
}

I'm using the following resources (resource_1, resource_2) to try to convert my ByteBuffer to an OpenCV frame:
fun startPreview(data: ByteBuffer) {
  avcodec_register_all()
  val pack = avcodec.AVPacket()
  pack.data(BytePointer(data))
  avcodec.av_init_packet(pack)
  val videoData = BytePointer(data)
  val codec = avcodec.avcodec_find_decoder(avcodec.AV_CODEC_ID_H264)
  val videoCodecContext = avcodec.avcodec_alloc_context3(codec)
  videoCodecContext.width(1280)
  videoCodecContext.height(720)
  videoCodecContext.pix_fmt(avutil.AV_PIX_FMT_YUV420P)
  videoCodecContext.codec_type(avutil.AVMEDIA_TYPE_VIDEO)
  videoCodecContext.extradata(videoData)
  videoCodecContext.extradata_size(data.capacity())
  videoCodecContext.flags2(videoCodecContext.flags2() or avcodec.CODEC_FLAG2_CHUNKS)
  avcodec.avcodec_open2(videoCodecContext, codec, null as PointerPointer<*>?)
  val decodedFrameLength = avcodec.avcodec_receive_frame(videoCodecContext, avutil.AVFrame())
  println(decodedFrameLength)
}

decodedFrameLength then comes back as -35, indicating a decoding error, and I can't figure out how to proceed from here.
-
No audio output using FFmpeg
26 March 2022, by John Mergene Arellano
I am having a problem with my live stream output. I am streaming from a mobile app to a Node.js server and on to RTMP. The video of the live stream works, but the audio does not: there is no audio output at all.


On the client side, I send the stream using the Socket.IO library. I capture the video and audio with the getUserMedia API.


navigator.mediaDevices.getUserMedia(constraints).then((stream) => {
  window.videoStream = video.srcObject = stream;
  let mediaRecorder = new MediaRecorder(stream, {
    videoBitsPerSecond: 3 * 1024 * 1024
  });
  mediaRecorder.addEventListener('dataavailable', (e) => {
    let data = e.data;
    socket.emit('live', data);
  });
  mediaRecorder.start(1000);
});
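As an aside not shown in the question, it is worth verifying on the client that the captured stream really contains an audio track and that the recorder is asked for a container/codec combination that carries audio. A minimal sketch, where the codec string is an assumption:

// Sketch only: check for an audio track and request an audio-capable mimeType.
console.log('audio tracks:', stream.getAudioTracks().length); // should be >= 1

const preferred = 'video/webm;codecs=vp8,opus';
const options = { videoBitsPerSecond: 3 * 1024 * 1024 };
if (MediaRecorder.isTypeSupported(preferred)) {
  options.mimeType = preferred;   // explicitly ask for Opus audio inside WebM
}
let mediaRecorder = new MediaRecorder(stream, options);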



My server then receives the chunks and writes them to FFmpeg's stdin.


client.on('live', (stream) => {
  if (ffmpeg)
    ffmpeg.stdin.write(stream);
});



I tried watching the live video in VLC media player. There is a 5-second delay and no audio output.


Please see below for the FFmpeg options I used:


ffmpeg = this.CHILD_PROCESS.spawn("ffmpeg", [
 '-f',
 'lavfi',
 '-i', 'anullsrc',
 '-i','-',
 '-c:v', 'libx264', '-preset', 'veryfast', '-tune', 'zerolatency',
 '-c:a', 'aac', '-ar', '44100', '-b:a', '64k',
 '-y', //force to overwrite
 '-use_wallclock_as_timestamps', '1', // used for audio sync
 '-async', '1', // used for audio sync
 '-bufsize', '1000',
 '-f',
 'flv',
 `rtmp://127.0.0.1:1935/live/stream` ]);



What is wrong with my setup? I need to fix the command so that the live stream outputs both video and audio.


I also tried streaming to YouTube's RTMP endpoint, but there was still no audio. I am expecting the output to carry both the video and the audio captured with the getUserMedia API.


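A likely culprit, though this is an assumption rather than something stated in the question, is the silent `anullsrc` input: with two inputs and no `-map` options, ffmpeg can select the silent lavfi stream as the audio track instead of the audio arriving on stdin. A sketch of the spawn call with the dummy input removed and the mapping made explicit (flags untested for this exact setup):

// Sketch only: take both video and audio from the WebM chunks on stdin,
// and drop the silent anullsrc input entirely.
const ffmpeg = this.CHILD_PROCESS.spawn("ffmpeg", [
  "-i", "-",                      // MediaRecorder chunks piped in by the socket handler
  "-map", "0:v", "-map", "0:a",   // explicitly keep the real video and audio streams
  "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
  "-c:a", "aac", "-ar", "44100", "-b:a", "64k",
  "-f", "flv",
  "rtmp://127.0.0.1:1935/live/stream",
]);

If the stream still has no audio after this change, the chunks themselves probably contain no audio track, which is what the client-side check sketched earlier would reveal.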