
Other articles (26)

  • Support for all media types

    10 April 2011

    Unlike many programs and other modern document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether of type: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

  • Accepted formats

    28 January 2010, by

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)
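
    As a quick illustration of the commands above, here is a small sketch, in the same Node/TypeScript style as the examples further down this page, of checking whether the local ffmpeg build knows a given codec; the helper name and the substring check are assumptions made for illustration only:

    import { execFile } from "child_process";

    // Hypothetical helper: scan the output of `ffmpeg -codecs` for a codec name
    // such as "h264" or "theora".
    const hasCodec = (codec: string): Promise<boolean> =>
        new Promise((resolve, reject) => {
            execFile("ffmpeg", ["-codecs"], (err, stdout) => {
                if (err) return reject(err);
                // Each codec is listed on its own line, e.g. " DEV.LS h264 ...".
                resolve(stdout.split("\n").some((line) => line.includes(` ${codec} `)));
            });
        });

    hasCodec("h264").then((ok) => console.log("h264 supported:", ok));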

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

On other sites (6720)

  • opus: add a native Opus encoder

    11 February 2017, by Rostislav Pehlivanov
    opus: add a native Opus encoder
    

    This marks the first time anyone has written an Opus encoder without
    using any libopus code. The aim of the encoder is to prove how far
    the format can go by writing the craziest encoder for it.

    Right now the encoder is basic: it only supports CBR encoding. However,
    internally every single feature the CELT layer has is implemented
    (except the pitch pre-filter, which needs to work well with the rest of
    whatever gets implemented). Psychoacoustic and rate control systems are
    under development.

    The encoder takes in frames of 120 samples and depending on the value of
    opus_delay the plan is to use the extra buffered frames as lookahead.
    Right now the encoder will pick the nearest largest legal frame size and
    won’t use the lookahead, but that’ll change once there’s a
    psychoacoustic system.

    Even though it's a pretty basic encoder, it's already outperforming
    any other native encoder FFmpeg has by a huge amount.

    The PVQ search algorithm is faster and more accurate than libopus’s
    algorithm so the encoder’s performance is close to that of libopus
    at zero complexity (libopus has more SIMD).
    The algorithm might be ported to libopus or other codecs using PVQ in
    the future.

    The encoder still has a few minor bugs, like desyncs at ultra low
    bitrates (below 9kbps with 20ms frames).

    Signed-off-by: Rostislav Pehlivanov <atomnuker@gmail.com>

    • [DH] configure
    • [DH] libavcodec/Makefile
    • [DH] libavcodec/allcodecs.c
    • [DH] libavcodec/opus_celt.h
    • [DH] libavcodec/opus_pvq.c
    • [DH] libavcodec/opus_pvq.h
    • [DH] libavcodec/opusenc.c
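
    To give a concrete feel for the encoder this commit adds, here is a minimal invocation sketch, wrapped in a Node spawn call in the style of the other examples on this page; the -strict experimental flag and the 64k bitrate are assumptions, and -c:a opus (as opposed to -c:a libopus) is what selects the native encoder:

    import { spawn } from "child_process";

    // Sketch only: encode a WAV file with FFmpeg's native Opus encoder.
    // The encoder is CBR-only at this point, so just a target bitrate is set.
    const encode = spawn("ffmpeg", [
        "-i", "input.wav",         // any audio input ffmpeg can decode
        "-c:a", "opus",            // the native encoder added by this commit
        "-strict", "experimental", // assumed: the encoder may still be marked experimental
        "-b:a", "64k",             // assumed CBR target bitrate
        "output.opus",
    ]);

    encode.stderr.on("data", (d) => process.stderr.write(d));
    encode.on("exit", (code) => console.log(`ffmpeg exited with code ${code}`));
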
  • MEAN stack express.js video uploader/converter

    5 January 2017, by MattJ

    The idea is a social site where people can upload their videos. I am planning to use multer for uploading (limiting by size and by mimetype). Then, for performance and mostly storage purposes, I want to use fluent-ffmpeg to convert each video to mp4 format and store it somewhere on the server with a reference in mongodb. Since I do not want the user to wait while the whole process is done, I plan to separate it into two main parts:

    1. Uploading
    2. Converting and storing.

    The user uploads the file, and then a separate node process (using node-schedule) runs a check every minute or so, converts all files in the directory, and afterwards adds the references to mongodb. So what do you guys think? What is the best approach performance-wise?
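
    A minimal sketch of the scheduled conversion step described above, assuming fluent-ffmpeg plus node-schedule as proposed in the question; the directory names and the MongoDB insert are placeholders:

    import ffmpeg from "fluent-ffmpeg";
    import schedule from "node-schedule";
    import { readdir } from "fs/promises";
    import path from "path";

    // Hypothetical directories; adjust to the real upload/output locations.
    const UPLOAD_DIR = "/data/uploads";
    const CONVERTED_DIR = "/data/converted";

    // Run every minute and convert whatever multer has dropped into the
    // upload directory in the meantime.
    schedule.scheduleJob("*/1 * * * *", async () => {
        const files = await readdir(UPLOAD_DIR);
        for (const file of files) {
            const input = path.join(UPLOAD_DIR, file);
            const output = path.join(CONVERTED_DIR, `${path.parse(file).name}.mp4`);

            ffmpeg(input)
                .videoCodec("libx264")
                .audioCodec("aac")
                .format("mp4")
                .on("end", () => {
                    // TODO: insert a reference to `output` into MongoDB here,
                    // then remove or archive the original upload.
                    console.log(`converted ${file}`);
                })
                .on("error", (err) => console.error(`failed to convert ${file}`, err))
                .save(output);
        }
    });

    One caveat with a fixed one-minute schedule: a conversion that takes longer than a minute can be picked up twice, so in practice each file should be moved or marked as in-progress before conversion starts.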

  • How to Stream RTP (IP camera) Into React App setup

    10 November 2024, by sharon2469

    I am trying to transfer a live broadcast from an IP camera, or any other broadcast coming from an RTP/RTSP source, to my React application, and it absolutely must be live.


    My setup at the moment is:


    IP Camera -> (RTP) -> FFmpeg -> (udp) -> Server(nodeJs) -> (WebRTC) -> React app


    In the current situation there is almost no delay, but there are a few things here that I can't avoid and don't understand, so here are my questions:


    1) First, is this setup even correct, and is it the only way to stream RTP video into a web app?


    2) Is it possible to avoid re-encoding the stream? The RTP transmission already comes in H.264, so I shouldn't really need to run the following command:


    return spawn('ffmpeg', [
        '-re',                              // Read input at its native frame rate. Important for live-streaming
        '-probesize', '32',                 // Set probing size to 32 bytes (32 is minimum)
        '-analyzeduration', '1000000',      // An input duration of 1 second
        '-c:v', 'h264',                     // Video codec of input video
        '-i', 'rtp://238.0.0.2:48888',      // Input stream URL
        '-map', '0:v?',                     // Select video from input stream
        '-c:v', 'libx264',                  // Video codec of output stream
        '-preset', 'ultrafast',             // Faster encoding for lower latency
        '-tune', 'zerolatency',             // Optimize for zero latency
        // '-s', '768x480',                 // Adjust the resolution (experiment with values)
        '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL
    ]);


    As you can see, in this command I re-encode to libx264. But if I pass FFmpeg the parameter '-c:v', 'copy' instead of '-c:v', 'libx264', then FFmpeg throws an error saying it doesn't know how to encode h264 and only knows libx264. Basically, I want to stop the re-encoding because there is really no need for it: the stream is already encoded as H.264. Are there any recommendations?
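
    For comparison, here is a sketch of the same spawn call with stream copy instead of re-encoding. In ffmpeg, options placed before -i apply to the input and options placed after it apply to the output, so '-c:v', 'copy' has to sit after the input URL; the input-side '-c:v', 'h264' and '-re' then become unnecessary, since nothing is decoded and the source is already live. Whether the copied stream plays end to end still depends on the camera's actual H.264 settings:

    return spawn('ffmpeg', [
        '-probesize', '32',                       // keep probing minimal for low latency
        '-analyzeduration', '1000000',
        '-i', 'rtp://238.0.0.2:48888',            // input stream URL
        '-map', '0:v?',                           // select video from the input stream
        '-c:v', 'copy',                           // output option: pass the H.264 stream through untouched
        '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // output stream URL
    ]);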


    3) I thought about dropping FFmpeg completely, but the RTP packets arrive at sizes of 1200+ bytes while WebRTC is limited to about 1280 bytes. Is there a way to handle this without damaging the video, and is it worth going down that road? I guess the whole jitter-buffer story comes into play here as well.


    This is my server-side code (this is just test code):


    import {
        MediaStreamTrack,
        randomPort,
        RTCPeerConnection,
        RTCRtpCodecParameters,
        RtpPacket,
    } from 'werift'
    import {Server} from "ws";
    import {createSocket} from "dgram";
    import {spawn} from "child_process";
    import LoggerFactory from "./logger/loggerFactory";

    //

    const log = LoggerFactory.getLogger('ServerMedia')

    // Websocket server -> WebRTC
    const serverPort = 8888
    const server = new Server({port: serverPort});
    log.info(`Server Media start on port: ${serverPort}`);

    // UDP server -> ffmpeg
    const udpPort = 48888
    const udp = createSocket("udp4");
    // udp.bind(udpPort, () => {
    //     udp.addMembership("238.0.0.2");
    // })
    udp.bind(udpPort)
    log.info(`UDP port: ${udpPort}`)


    const createFFmpegProcess = () => {
        log.info(`Start ffmpeg process`)
        return spawn('ffmpeg', [
            '-re',                              // Read input at its native frame rate. Important for live-streaming
            '-probesize', '32',                 // Set probing size to 32 bytes (32 is minimum)
            '-analyzeduration', '1000000',      // An input duration of 1 second
            '-c:v', 'h264',                     // Video codec of input video
            '-i', 'rtp://238.0.0.2:48888',      // Input stream URL
            '-map', '0:v?',                     // Select video from input stream
            '-c:v', 'libx264',                  // Video codec of output stream
            '-preset', 'ultrafast',             // Faster encoding for lower latency
            '-tune', 'zerolatency',             // Optimize for zero latency
            // '-s', '768x480',                 // Adjust the resolution (experiment with values)
            '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL
        ]);
    }

    let ffmpegProcess = createFFmpegProcess();


    const attachFFmpegListeners = () => {
        // Capture standard output and print it
        ffmpegProcess.stdout.on('data', (data) => {
            log.info(`FFMPEG process stdout: ${data}`);
        });

        // Capture standard error and print it
        ffmpegProcess.stderr.on('data', (data) => {
            console.error(`ffmpeg stderr: ${data}`);
        });

        // Listen for the exit event
        ffmpegProcess.on('exit', (code, signal) => {
            if (code !== null) {
                log.info(`ffmpeg process exited with code ${code}`);
            } else if (signal !== null) {
                log.info(`ffmpeg process killed with signal ${signal}`);
            }
        });
    };


    attachFFmpegListeners();


    server.on("connection", async (socket) => {
        const payloadType = 96; // It is a numerical value that is assigned to each codec in the SDP offer/answer exchange -> for H264
        // Create a peer connection with the codec parameters set in advance.
        const pc = new RTCPeerConnection({
            codecs: {
                audio: [],
                video: [
                    new RTCRtpCodecParameters({
                        mimeType: "video/H264",
                        clockRate: 90000, // 90000 is the default value for H264
                        payloadType: payloadType,
                    }),
                ],
            },
        });

        const track = new MediaStreamTrack({kind: "video"});


        udp.on("message", (data) => {
            console.log(data)
            const rtp = RtpPacket.deSerialize(data);
            rtp.header.payloadType = payloadType;
            track.writeRtp(rtp);
        });

        udp.on("error", (err) => {
            console.log(err)
        });

        udp.on("close", () => {
            console.log("close")
        });

        pc.addTransceiver(track, {direction: "sendonly"});

        await pc.setLocalDescription(await pc.createOffer());
        const sdp = JSON.stringify(pc.localDescription);
        socket.send(sdp);

        socket.on("message", (data: any) => {
            if (data.toString() === 'resetFFMPEG') {
                ffmpegProcess.kill('SIGINT');
                log.info(`FFMPEG process killed`)
                setTimeout(() => {
                    ffmpegProcess = createFFmpegProcess();
                    attachFFmpegListeners();
                }, 5000)
            } else {
                pc.setRemoteDescription(JSON.parse(data));
            }
        });
    });


    And this is the frontend:


    <script
            crossorigin
            src="https://unpkg.com/react@16/umd/react.development.js"
    ></script>
    <script
            crossorigin
            src="https://unpkg.com/react-dom@16/umd/react-dom.development.js"
    ></script>
    <script
            crossorigin
            src="https://cdnjs.cloudflare.com/ajax/libs/babel-core/5.8.34/browser.min.js"
    ></script>
    <script src="https://cdn.jsdelivr.net/npm/babel-regenerator-runtime@6.5.0/runtime.min.js"></script>


    <script type="text/babel">
        let rtc;

        const App = () => {
            const [log, setLog] = React.useState([]);
            const videoRef = React.useRef();
            const socket = new WebSocket("ws://localhost:8888");
            const [peer, setPeer] = React.useState(null); // Add state to keep track of the peer connection

            React.useEffect(() => {
                (async () => {
                    await new Promise((r) => (socket.onopen = r));
                    console.log("open websocket");

                    const handleOffer = async (offer) => {
                        console.log("new offer", offer.sdp);

                        const updatedPeer = new RTCPeerConnection({
                            iceServers: [],
                            sdpSemantics: "unified-plan",
                        });

                        updatedPeer.onicecandidate = ({ candidate }) => {
                            if (!candidate) {
                                const sdp = JSON.stringify(updatedPeer.localDescription);
                                console.log(sdp);
                                socket.send(sdp);
                            }
                        };

                        updatedPeer.oniceconnectionstatechange = () => {
                            console.log(
                                "oniceconnectionstatechange",
                                updatedPeer.iceConnectionState
                            );
                        };

                        updatedPeer.ontrack = (e) => {
                            console.log("ontrack", e);
                            videoRef.current.srcObject = e.streams[0];
                        };

                        await updatedPeer.setRemoteDescription(offer);
                        const answer = await updatedPeer.createAnswer();
                        await updatedPeer.setLocalDescription(answer);

                        setPeer(updatedPeer);
                    };

                    socket.onmessage = (ev) => {
                        const data = JSON.parse(ev.data);
                        if (data.type === "offer") {
                            handleOffer(data);
                        } else if (data.type === "resetFFMPEG") {
                            // Handle the resetFFMPEG message
                            console.log("FFmpeg reset requested");
                        }
                    };
                })();
            }, []); // Added socket as a dependency to the useEffect hook

            const sendRequestToResetFFmpeg = () => {
                socket.send("resetFFMPEG");
            };

            return (
                <div>
                    Video:
                    <video ref={videoRef} autoPlay muted />
                    <button onClick={() => sendRequestToResetFFmpeg()}>Reset FFMPEG</button>
                </div>
            );
        };

        ReactDOM.render(<App />, document.getElementById("app1"));
    </script>
