Advanced search

Media (91)

Other articles (112)

  • User profiles

    12 April 2011

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
    The user can reach the profile editor from their author page; a "Modifier votre profil" link in the navigation is (...)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language. Otherwise, it becomes greyed out in the configuration and (...)

  • XMP PHP

    13 May 2011

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, or XMP, an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use in the Semantic Web.
    XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)

On other sites (10873)

  • How to Stream RTP (IP camera) Into React App setup

    10 November 2024, by sharon2469

    I am trying to relay a live broadcast from an IP camera, or any other RTP/RTSP source, to my React application. BUT it MUST BE LIVE.

    My setup at the moment is:

    IP Camera -> (RTP) -> FFmpeg -> (UDP) -> Server (Node.js) -> (WebRTC) -> React app

    In the current situation there is almost no delay, but there are some things here that I cannot avoid and do not understand why, so here are my questions:

    1) First, is this setup even correct, and is it the only way to stream RTP video into a web app?

    2) Is it possible to avoid re-encoding the stream? The RTP transmission already arrives as H.264, so I should not really need to run the following command:

    return spawn('ffmpeg', [
        '-re',                              // Read input at its native frame rate; important for live streaming
        '-probesize', '32',                 // Set probing size to 32 bytes (32 is the minimum)
        '-analyzeduration', '1000000',      // Analyze up to 1 second of input
        '-c:v', 'h264',                     // Video codec of the input stream
        '-i', 'rtp://238.0.0.2:48888',      // Input stream URL
        '-map', '0:v?',                     // Select video from the input stream
        '-c:v', 'libx264',                  // Video codec of the output stream
        '-preset', 'ultrafast',             // Faster encoding for lower latency
        '-tune', 'zerolatency',             // Optimize for zero latency
        // '-s', '768x480',                 // Adjust the resolution (experiment with values)
        '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL
    ]);

    As you can see, this command re-encodes to libx264. But if I pass FFmpeg the parameter '-c:v', 'copy' instead of '-c:v', 'libx264', FFmpeg throws an error saying it does not know how to encode h264 and only knows libx264. Basically, I want to stop the re-encoding because there is really no need for it: the stream is already encoded as H.264. Are there any recommendations?

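    For what it's worth, a stream-copy variant might look like the sketch below. It is untested and assumes the camera really does send H.264, and that an SDP file describing the multicast stream is available, since FFmpeg generally needs one to depacketize a bare RTP input; the camera.sdp name here is a hypothetical stand-in. Note also that -c:v placed before -i selects a decoder, so 'copy' only makes sense as an output option, after the -i.

    return spawn('ffmpeg', [
        '-protocol_whitelist', 'file,udp,rtp',    // allow reading a local SDP file plus RTP over UDP
        '-i', 'camera.sdp',                       // hypothetical SDP describing rtp://238.0.0.2:48888
        '-map', '0:v?',                           // select the video stream if present
        '-c:v', 'copy',                           // pass the H.264 bitstream through without re-encoding
        '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // same RTP output as before
    ]);
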
    3) I thought about giving up FFmpeg completely, but the RTP packets arrive at sizes of 1200+ bytes while WebRTC is limited to 1280 bytes. Is there a way to handle these oversized packets without damaging the video, and is it worth going down that road? I guess the whole jitter-buffer story comes into play here as well.

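    On the packet-size point: rather than repacketizing by hand, FFmpeg's RTP output accepts a pkt_size URL option that caps the size of each outgoing datagram, which should keep packets under the usual WebRTC limit. A hedged one-line change to the output URL of the command above (the 1200 value is an assumption, not something from the original setup):

    '-f', 'rtp', `rtp://127.0.0.1:${udpPort}?pkt_size=1200` // cap RTP packets at roughly 1200 bytes
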
    This is my server-side code (THIS IS JUST TEST CODE):

import {
    MediaStreamTrack,
    randomPort,
    RTCPeerConnection,
    RTCRtpCodecParameters,
    RtpPacket,
} from 'werift'
import {Server} from "ws";
import {createSocket} from "dgram";
import {spawn} from "child_process";
import LoggerFactory from "./logger/loggerFactory";

//

const log = LoggerFactory.getLogger('ServerMedia')

// Websocket server -> WebRTC
const serverPort = 8888
const server = new Server({port: serverPort});
log.info(`Server Media started on port: ${serverPort}`);

// UDP server -> ffmpeg
const udpPort = 48888
const udp = createSocket("udp4");
// udp.bind(udpPort, () => {
//     udp.addMembership("238.0.0.2");
// })
udp.bind(udpPort)
log.info(`UDP port: ${udpPort}`)


const createFFmpegProcess = () => {
    log.info(`Start ffmpeg process`)
    return spawn('ffmpeg', [
        '-re',                              // Read input at its native frame rate; important for live streaming
        '-probesize', '32',                 // Set probing size to 32 bytes (32 is the minimum)
        '-analyzeduration', '1000000',      // Analyze up to 1 second of input
        '-c:v', 'h264',                     // Video codec of the input stream
        '-i', 'rtp://238.0.0.2:48888',      // Input stream URL
        '-map', '0:v?',                     // Select video from the input stream
        '-c:v', 'libx264',                  // Video codec of the output stream
        '-preset', 'ultrafast',             // Faster encoding for lower latency
        '-tune', 'zerolatency',             // Optimize for zero latency
        // '-s', '768x480',                 // Adjust the resolution (experiment with values)
        '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL
    ]);

}

let ffmpegProcess = createFFmpegProcess();


const attachFFmpegListeners = () => {
    // Capture standard output and print it
    ffmpegProcess.stdout.on('data', (data) => {
        log.info(`FFMPEG process stdout: ${data}`);
    });

    // Capture standard error and print it
    ffmpegProcess.stderr.on('data', (data) => {
        console.error(`ffmpeg stderr: ${data}`);
    });

    // Listen for the exit event
    ffmpegProcess.on('exit', (code, signal) => {
        if (code !== null) {
            log.info(`ffmpeg process exited with code ${code}`);
        } else if (signal !== null) {
            log.info(`ffmpeg process killed with signal ${signal}`);
        }
    });
};


attachFFmpegListeners();


server.on("connection", async (socket) => {
    const payloadType = 96; // It is a numerical value that is assigned to each codec in the SDP offer/answer exchange -> for H264
    // Create a peer connection with the codec parameters set in advance.
    const pc = new RTCPeerConnection({
        codecs: {
            audio: [],
            video: [
                new RTCRtpCodecParameters({
                    mimeType: "video/H264",
                    clockRate: 90000, // 90000 is the default value for H264
                    payloadType: payloadType,
                }),
            ],
        },
    });

    const track = new MediaStreamTrack({kind: "video"});


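    // Note: this registers another "message" listener on the shared UDP socket for
    // every WebSocket connection; with more than one client it should move outside this handler.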
    udp.on("message", (data) => {
        console.log(data)
        const rtp = RtpPacket.deSerialize(data);
        rtp.header.payloadType = payloadType;
        track.writeRtp(rtp);
    });

    udp.on("error", (err) => {
        console.log(err)

    });

    udp.on("close", () => {
        console.log("close")
    });

    pc.addTransceiver(track, {direction: "sendonly"});

    await pc.setLocalDescription(await pc.createOffer());
    const sdp = JSON.stringify(pc.localDescription);
    socket.send(sdp);

    socket.on("message", (data: any) => {
        if (data.toString() === 'resetFFMPEG') {
            ffmpegProcess.kill('SIGINT');
            log.info(`FFMPEG process killed`)
            setTimeout(() => {
                ffmpegProcess = createFFmpegProcess();
                attachFFmpegListeners();
            }, 5000)
        } else {
            pc.setRemoteDescription(JSON.parse(data));
        }
    });
});

    And this is the frontend:

<script
        crossorigin
        src="https://unpkg.com/react@16/umd/react.development.js"
></script>
<script
        crossorigin
        src="https://unpkg.com/react-dom@16/umd/react-dom.development.js"
></script>
<script
        crossorigin
        src="https://cdnjs.cloudflare.com/ajax/libs/babel-core/5.8.34/browser.min.js"
></script>
<script src="https://cdn.jsdelivr.net/npm/babel-regenerator-runtime@6.5.0/runtime.min.js"></script>

<script type="text/babel">
    let rtc;

    const App = () => {
        const [log, setLog] = React.useState([]);
        const videoRef = React.useRef();
        const socket = new WebSocket("ws://localhost:8888");
        const [peer, setPeer] = React.useState(null); // Add state to keep track of the peer connection

        React.useEffect(() => {
            (async () => {
                await new Promise((r) => (socket.onopen = r));
                console.log("open websocket");

                const handleOffer = async (offer) => {
                    console.log("new offer", offer.sdp);

                    const updatedPeer = new RTCPeerConnection({
                        iceServers: [],
                        sdpSemantics: "unified-plan",
                    });

                    updatedPeer.onicecandidate = ({ candidate }) => {
                        if (!candidate) {
                            const sdp = JSON.stringify(updatedPeer.localDescription);
                            console.log(sdp);
                            socket.send(sdp);
                        }
                    };

                    updatedPeer.oniceconnectionstatechange = () => {
                        console.log(
                            "oniceconnectionstatechange",
                            updatedPeer.iceConnectionState
                        );
                    };

                    updatedPeer.ontrack = (e) => {
                        console.log("ontrack", e);
                        videoRef.current.srcObject = e.streams[0];
                    };

                    await updatedPeer.setRemoteDescription(offer);
                    const answer = await updatedPeer.createAnswer();
                    await updatedPeer.setLocalDescription(answer);

                    setPeer(updatedPeer);
                };

                socket.onmessage = (ev) => {
                    const data = JSON.parse(ev.data);
                    if (data.type === "offer") {
                        handleOffer(data);
                    } else if (data.type === "resetFFMPEG") {
                        // Handle the resetFFMPEG message
                        console.log("FFmpeg reset requested");
                    }
                };
            })();
        }, []);

        const sendRequestToResetFFmpeg = () => {
            socket.send("resetFFMPEG");
        };

        return (
            <div>
                Video:
                <video ref={videoRef} autoPlay muted />
                <button onClick={() => sendRequestToResetFFmpeg()}>Reset FFMPEG</button>
            </div>
        );
    };

    ReactDOM.render(<App />, document.getElementById("app1"));
</script>

  • Using ffserver to stream older IP cam MJPEG to RTSP

    26 May 2016, by tmar89

    I have an older Sony IP camera that has an MJPEG stream. I need to connect it to an NVR that only takes ONVIF or RTSP, and I am trying to use ffserver and ffmpeg to convert the MJPEG stream to RTSP, but it is not working. Does anyone have an idea of what I may be doing wrong? The attempted playback showed an error about an unsupported protocol.
    Here is my ffserver config:

    Port 8090
    RTSPPort 5544
    BindAddress 0.0.0.0
    RTSPBindAddress 0.0.0.0
    MaxClients 100
    MaxBandwidth 10000

    <Feed feed27.ffm>
    File /tmp/feed27.ffm
    FileMaxSize 5M
    ACL allow 127.0.0.1
    </Feed>

    <Stream stream27.mpg>
    Format rtp
    Feed feed27.ffm
    NoAudio
    VideoCodec mjpeg
    VideoFrameRate 30
    VideoSize 736x480
    </Stream>

    And here is the ffmpeg command I am using:

    [tm@tele ffserver-rtsp]# ffmpeg -f mjpeg -r 30 -s 736x480 -i http://[CAMIP]/image http://localhost:8090/feed27.ffm
       FFmpeg version 0.6.5, Copyright (c) 2000-2010 the FFmpeg developers
         built on Jan 29 2012 17:52:15 with gcc 4.4.5 20110214 (Red Hat 4.4.5-6)
         configuration: --prefix=/usr --libdir=/usr/lib64 --shlibdir=/usr/lib64 --mandir=/usr/share/man --incdir=/usr/include --disable-avisynth --extra-cflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -fPIC' --enable-avfilter --enable-avfilter-lavf --enable-libdc1394 --enable-libdirac --enable-libfaac --enable-libfaad --enable-libfaadbin --enable-libgsm --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libx264 --enable-gpl --enable-nonfree --enable-postproc --enable-pthreads --enable-shared --enable-swscale --enable-vdpau --enable-version3 --enable-x11grab
         libavutil     50.15. 1 / 50.15. 1
         libavcodec    52.72. 2 / 52.72. 2
         libavformat   52.64. 2 / 52.64. 2
         libavdevice   52. 2. 0 / 52. 2. 0
         libavfilter    1.19. 0 /  1.19. 0
         libswscale     0.11. 0 /  0.11. 0
         libpostproc   51. 2. 0 / 51. 2. 0
       [mjpeg @ 0x1ece670]Estimating duration from bitrate, this may be inaccurate
       Input #0, mjpeg, from 'http://[CAMIP]/image':
         Duration: N/A, bitrate: N/A
           Stream #0.0: Video: mjpeg, yuvj422p, 736x480, 30 fps, 30 tbr, 1200k tbn, 30 tbc
       Output #0, ffm, to 'http://localhost:8090/feed27.ffm':
         Metadata:
           encoder         : Lavf52.64.2
           Stream #0.0: Video: mjpeg, yuvj420p, 736x480, q=2-31, 200 kb/s, 1000k tbn, 30 tbc
       Stream mapping:
         Stream #0.0 -> #0.0
       Press [q] to stop encoding
       [mjpeg @ 0x222d110]rc buffer underflow
       frame=  640 fps= 17 q=31.4 size=   12884kB time=21.33 bitrate=4947.5kbits/s

    When I use VLC to open the stream, it cannot be found:

    Your input can't be opened:
       VLC is unable to open the MRL 'rtsp://localhost:5544/stream27.mpg'. Check the log for details.

    Finally, using ffplay on the same machine:

    [tm@tele tmp]# ffplay rtsp://localhost:5544/stream27.sdp
    FFplay version 0.6.5, Copyright (c) 2003-2010 the FFmpeg developers
     built on Jan 29 2012 17:52:15 with gcc 4.4.5 20110214 (Red Hat 4.4.5-6)
     configuration: --prefix=/usr --libdir=/usr/lib64 --shlibdir=/usr/lib64 --mandir=/usr/share/man --incdir=/usr/include --disable-avisynth --extra-cflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -fPIC' --enable-avfilter --enable-avfilter-lavf --enable-libdc1394 --enable-libdirac --enable-libfaac --enable-libfaad --enable-libfaadbin --enable-libgsm --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libx264 --enable-gpl --enable-nonfree --enable-postproc --enable-pthreads --enable-shared --enable-swscale --enable-vdpau --enable-version3 --enable-x11grab
     libavutil     50.15. 1 / 50.15. 1
     libavcodec    52.72. 2 / 52.72. 2
     libavformat   52.64. 2 / 52.64. 2
     libavdevice   52. 2. 0 / 52. 2. 0
     libavfilter    1.19. 0 /  1.19. 0
     libswscale     0.11. 0 /  0.11. 0
     libpostproc   51. 2. 0 / 51. 2. 0
    ALSA lib pulse.c:229:(pulse_connect) PulseAudio: Unable to connect: Connection refused

    rtsp://localhost:5544/stream27.sdp: Protocol not supported

    And here is the log from ffserver:

    127.0.0.1:5000 - - "PLAY stream27.mpg/streamid=0 RTP/UDP"
    [rtp @ 0x721dc0]Unsupported codec 8
    127.0.0.1:0 - - "PLAY stream27.mpg/streamid=0 RTP/TCP"
    [rtp @ 0x728cb0]Unsupported codec 8
    127.0.0.1 - - [SETUP] "rtsp://localhost:5544/stream27.mpg/streamid=0 RTSP/1.0" 200 641
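
    One observation, offered tentatively: in FFmpeg builds of this vintage the RTP muxer has no payload format for MJPEG, and codec id 8 is MJPEG in that source tree, which is what the "Unsupported codec 8" lines are complaining about. The stream therefore cannot leave over RTSP/RTP without transcoding. A hedged sketch of a <Stream> block that transcodes to MPEG-4 part 2, a codec that RTP muxer does support (the stream name is taken from the log above; the bitrate is an assumed value):

    <Stream stream27.mpg>
    Format rtp
    Feed feed27.ffm
    NoAudio
    VideoCodec mpeg4
    VideoBitRate 2000
    VideoFrameRate 30
    VideoSize 736x480
    </Stream>
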
  • How to stop ffmpeg when there's no incoming rtmp stream

    5 July 2016, by M. Irich

    I use ffmpeg together with nginx-rtmp.
    The problem is that ffmpeg does not exit when the stream has finished.

    I use the following command:

    ffmpeg  -i 'rtmp://localhost:443/live/test' -loglevel debug  -c:a libfdk_aac -b:a 192k -c:v libx264 -profile baseline -preset superfast -tune zerolatency -b:v 2500k -maxrate 4500k -minrate 1500k -bufsize 9000k -keyint_min 15 -g 15 -f dash -use_timeline 1 -use_template 1 -min_seg_duration 5000 -y /tmp/dash/test/test.mpd

    but even when the stream is not running, ffmpeg still does not finish the process and keeps waiting for the RTMP stream:

    Successfully parsed a group of options.
    Opening an input file: rtmp://localhost:443/live/test.
    [rtmp @ 0x2ba2160] No default whitelist set
    [tcp @ 0x2ba2720] No default whitelist set
    [rtmp @ 0x2ba2160] Handshaking...
    [rtmp @ 0x2ba2160] Type answer 3
    [rtmp @ 0x2ba2160] Server version 13.14.10.13
    [rtmp @ 0x2ba2160] Proto = rtmp, path = /live/test, app = live, fname = test
    [rtmp @ 0x2ba2160] Server bandwidth = 5000000
    [rtmp @ 0x2ba2160] Client bandwidth = 5000000
    [rtmp @ 0x2ba2160] New incoming chunk size = 4096
    [rtmp @ 0x2ba2160] Creating stream...
    [rtmp @ 0x2ba2160] Sending play command for 'test'

    Is it possible to limit this waiting time to several seconds?

    Sorry for any possible mistakes; English is not my native language.
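
    A possible lead, offered as an untested sketch: FFmpeg has a generic -rw_timeout protocol option (in microseconds, placed before the input) that makes a stalled network read error out instead of blocking forever, so the process should exit once the RTMP source goes quiet. The 5-second value below is an arbitrary example:

    ffmpeg -rw_timeout 5000000 -i 'rtmp://localhost:443/live/test' -loglevel debug \
        -c:a libfdk_aac -b:a 192k -c:v libx264 -profile:v baseline -preset superfast \
        -tune zerolatency -b:v 2500k -maxrate 4500k -minrate 1500k -bufsize 9000k \
        -keyint_min 15 -g 15 -f dash -use_timeline 1 -use_template 1 \
        -min_seg_duration 5000 -y /tmp/dash/test/test.mpd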