
Media (1)
-
SPIP - plugins - embed code - Exemple
2 September 2013
Updated: September 2013
Language: French
Type: Image
Other articles (55)
-
Media-specific libraries and software
10 December 2010
For correct and optimal operation, several things need to be taken into account.
After installing apache2, mysql and php5, it is important to install the other required software, whose installation is described in the related links: a set of multimedia libraries (x264, libtheora, libvpx) used for encoding and decoding video and audio, so as to support as many file types as possible (see this tutorial); FFMpeg with the maximum number of decoders and (...)
-
Authorisations overridden by plugins
27 April 2010
Mediaspip core:
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (9366)
-
How to Use Analytics & Reports for Marketing, Sales & More
28 September 2023, by Erin — Analytics Tips
-
Conversion Rate Optimisation Statistics for 2024 and Beyond
21 November 2023, by Erin — Analytics Tips
-
How to Stream RTP (IP camera) Into React App setup
10 November 2024, by sharon2469
I am trying to transfer a live broadcast from an IP camera, or any other broadcast coming from an RTP/RTSP source, into my React application. It MUST be live.


My setup at the moment is:


IP camera -> (RTP) -> FFmpeg -> (UDP) -> Server (Node.js) -> (WebRTC) -> React app


In the current situation there is almost no delay, but there are some things here that I can't avoid and can't understand why, so here are my questions:


1) First, is this setup even correct, and is it the only way to stream RTP video into a web app?


2) Is it possible to avoid re-encoding the stream? The RTP transmission already arrives as H.264, so I shouldn't really need to execute the following command:


return spawn('ffmpeg', [
 '-re', // Read input at its native frame rate (important for live streaming)
 '-probesize', '32', // Set probing size to 32 bytes (32 is the minimum)
 '-analyzeduration', '1000000', // Analyze up to 1 second of input (value in microseconds)
 '-c:v', 'h264', // Video codec of input video
 '-i', 'rtp://238.0.0.2:48888', // Input stream URL
 '-map', '0:v?', // Select video from input stream
 '-c:v', 'libx264', // Video codec of output stream
 '-preset', 'ultrafast', // Faster encoding for lower latency
 '-tune', 'zerolatency', // Optimize for zero latency
 // '-s', '768x480', // Adjust the resolution (experiment with values)
 '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL
]);



As you can see, this command re-encodes to libx264. But if I give FFmpeg the parameter '-c:v', 'copy' instead of '-c:v', 'libx264', FFmpeg throws an error saying that it doesn't know how to encode h264 and only knows libx264. Basically, I want to stop the re-encoding because there is really no need for it; the stream is already encoded as H.264. Are there any recommendations here?
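For context, FFmpeg only accepts 'copy' as an output option (after '-i'); on the input side it expects a real decoder such as 'h264', and 'libx264' is the corresponding encoder, which may be where the error comes from. Below is a minimal, untested sketch of a pass-through variant of the command above; it reuses spawn and udpPort from the server code further down, the name createFFmpegCopyProcess is purely illustrative, and it assumes the camera's H.264 stream can simply be repacketized:

const createFFmpegCopyProcess = () => spawn('ffmpeg', [
 '-probesize', '32', // Minimal probing for lower startup latency
 '-analyzeduration', '1000000', // Analyze up to 1 second of input (value in microseconds)
 '-i', 'rtp://238.0.0.2:48888', // Input stream URL
 '-map', '0:v?', // Select the video stream if present
 '-c:v', 'copy', // Stream copy: no decode, no re-encode
 '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL (repacketized H.264)
]);

With stream copy the input-side '-c:v h264' is unnecessary (nothing is decoded), and '-re' adds nothing for a live source that is already paced in real time.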


3) I thought about dropping FFmpeg completely, but the RTP packets arrive at a size of 1200+ bytes while WebRTC is limited to about 1280 bytes. Is there a way to handle these obstacles without damaging the video, and is it worth going down that path? I guess the whole jitter buffer story comes into play here as well.
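If FFmpeg stays in the chain, one option for the packet size issue is to ask it to packetize the output at roughly 1200 bytes, which is the payload size WebRTC stacks commonly assume; this relies on the pkt_size option of FFmpeg's RTP/UDP protocols, so treat the sketch below as something to verify against your build:

// Cap the size of FFmpeg's outgoing RTP packets so they fit the MTU WebRTC expects.
// Only the output URL of the spawn() call shown above changes.
const rtpOut = `rtp://127.0.0.1:${udpPort}?pkt_size=1200`;
// ... '-f', 'rtp', rtpOut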


This is my server-side code (this is just test code):


import {
 MediaStreamTrack,
 randomPort,
 RTCPeerConnection,
 RTCRtpCodecParameters,
 RtpPacket,
} from 'werift'
import {Server} from "ws";
import {createSocket} from "dgram";
import {spawn} from "child_process";
import LoggerFactory from "./logger/loggerFactory";

//

const log = LoggerFactory.getLogger('ServerMedia')

// Websocket server -> WebRTC
const serverPort = 8888
const server = new Server({port: serverPort});
log.info(`Server Media started on port: ${serverPort}`);

// UDP server -> ffmpeg
const udpPort = 48888
const udp = createSocket("udp4");
// udp.bind(udpPort, () => {
// udp.addMembership("238.0.0.2");
// })
udp.bind(udpPort)
log.info(`UDP port: ${udpPort}`)


const createFFmpegProcess = () => {
 log.info(`Start ffmpeg process`)
 return spawn('ffmpeg', [
 '-re', // Read input at its native frame rate (important for live streaming)
 '-probesize', '32', // Set probing size to 32 bytes (32 is the minimum)
 '-analyzeduration', '1000000', // Analyze up to 1 second of input (value in microseconds)
 '-c:v', 'h264', // Video codec of input video
 '-i', 'rtp://238.0.0.2:48888', // Input stream URL
 '-map', '0:v?', // Select video from input stream
 '-c:v', 'libx264', // Video codec of output stream
 '-preset', 'ultrafast', // Faster encoding for lower latency
 '-tune', 'zerolatency', // Optimize for zero latency
 // '-s', '768x480', // Adjust the resolution (experiment with values)
 '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL
 ]);

}

// ffmpeg is spawned once at startup; the client can restart it via the 'resetFFMPEG' WebSocket message.
let ffmpegProcess = createFFmpegProcess();


const attachFFmpegListeners = () => {
 // Capture standard output and print it
 ffmpegProcess.stdout.on('data', (data) => {
 log.info(`FFMPEG process stdout: ${data}`);
 });

 // Capture standard error and print it
 ffmpegProcess.stderr.on('data', (data) => {
 console.error(`ffmpeg stderr: ${data}`);
 });

 // Listen for the exit event
 ffmpegProcess.on('exit', (code, signal) => {
 if (code !== null) {
 log.info(`ffmpeg process exited with code ${code}`);
 } else if (signal !== null) {
 log.info(`ffmpeg process killed with signal ${signal}`);
 }
 });
};


attachFFmpegListeners();


// For each WebSocket client: create a peer connection with H.264 pre-configured, attach the video track, and send an SDP offer.
server.on("connection", async (socket) => {
 const payloadType = 96; // Dynamic RTP payload type used for H.264 in the SDP offer/answer exchange
 // Create a peer connection with the codec parameters set in advance.
 const pc = new RTCPeerConnection({
 codecs: {
 audio: [],
 video: [
 new RTCRtpCodecParameters({
 mimeType: "video/H264",
 clockRate: 90000, // 90000 is the default value for H264
 payloadType: payloadType,
 }),
 ],
 },
 });

 const track = new MediaStreamTrack({kind: "video"});


 // Forward each RTP packet received from ffmpeg into the WebRTC track, rewriting the payload type to match the offer.
 udp.on("message", (data) => {
 console.log(data)
 const rtp = RtpPacket.deSerialize(data);
 rtp.header.payloadType = payloadType;
 track.writeRtp(rtp);
 });

 udp.on("error", (err) => {
 console.log(err)

 });

 udp.on("close", () => {
 console.log("close")
 });

 pc.addTransceiver(track, {direction: "sendonly"});

 await pc.setLocalDescription(await pc.createOffer());
 const sdp = JSON.stringify(pc.localDescription);
 socket.send(sdp);

 socket.on("message", (data: any) => {
 if (data.toString() === 'resetFFMPEG') {
 ffmpegProcess.kill('SIGINT');
 log.info(`FFMPEG process killed`)
 setTimeout(() => {
 ffmpegProcess = createFFmpegProcess();
 attachFFmpegListeners();
 }, 5000)
 } else {
 pc.setRemoteDescription(JSON.parse(data));
 }
 });
});



And this is the frontend:





 
 
 <code class="echappe-js"><script&#xA; crossorigin&#xA; src="https://unpkg.com/react@16/umd/react.development.js"&#xA; ></script>

<script&#xA; crossorigin&#xA; src="https://unpkg.com/react-dom@16/umd/react-dom.development.js"&#xA; ></script>

<script&#xA; crossorigin&#xA; src="https://cdnjs.cloudflare.com/ajax/libs/babel-core/5.8.34/browser.min.js"&#xA; ></script>

<script src="https://cdn.jsdelivr.net/npm/babel-regenerator-runtime@6.5.0/runtime.min.js"></script>








<script type="text/babel">&#xA; let rtc;&#xA;&#xA; const App = () => {&#xA; const [log, setLog] = React.useState([]);&#xA; const videoRef = React.useRef();&#xA; const socket = new WebSocket("ws://localhost:8888");&#xA; const [peer, setPeer] = React.useState(null); // Add state to keep track of the peer connection&#xA;&#xA; React.useEffect(() => {&#xA; (async () => {&#xA; await new Promise((r) => (socket.onopen = r));&#xA; console.log("open websocket");&#xA;&#xA; const handleOffer = async (offer) => {&#xA; console.log("new offer", offer.sdp);&#xA;&#xA; const updatedPeer = new RTCPeerConnection({&#xA; iceServers: [],&#xA; sdpSemantics: "unified-plan",&#xA; });&#xA;&#xA; updatedPeer.onicecandidate = ({ candidate }) => {&#xA; if (!candidate) {&#xA; const sdp = JSON.stringify(updatedPeer.localDescription);&#xA; console.log(sdp);&#xA; socket.send(sdp);&#xA; }&#xA; };&#xA;&#xA; updatedPeer.oniceconnectionstatechange = () => {&#xA; console.log(&#xA; "oniceconnectionstatechange",&#xA; updatedPeer.iceConnectionState&#xA; );&#xA; };&#xA;&#xA; updatedPeer.ontrack = (e) => {&#xA; console.log("ontrack", e);&#xA; videoRef.current.srcObject = e.streams[0];&#xA; };&#xA;&#xA; await updatedPeer.setRemoteDescription(offer);&#xA; const answer = await updatedPeer.createAnswer();&#xA; await updatedPeer.setLocalDescription(answer);&#xA;&#xA; setPeer(updatedPeer);&#xA; };&#xA;&#xA; socket.onmessage = (ev) => {&#xA; const data = JSON.parse(ev.data);&#xA; if (data.type === "offer") {&#xA; handleOffer(data);&#xA; } else if (data.type === "resetFFMPEG") {&#xA; // Handle the resetFFMPEG message&#xA; console.log("FFmpeg reset requested");&#xA; }&#xA; };&#xA; })();&#xA; }, []); // Added socket as a dependency to the useEffect hook&#xA;&#xA; const sendRequestToResetFFmpeg = () => {&#xA; socket.send("resetFFMPEG");&#xA; };&#xA;&#xA; return (&#xA; <div>&#xA; Video: &#xA; <video ref={videoRef} autoPlay muted />&#xA; <button onClick={() => sendRequestToResetFFmpeg()}>Reset FFMPEG</button>&#xA; </div>&#xA; );&#xA; };&#xA;&#xA; ReactDOM.render(<App />, document.getElementById("app1"));&#xA;</script>