Other articles (28)

  • Making files available

    14 April 2011

    By default, when it is initialized, MediaSPIP does not allow visitors to download files, whether they are originals or the result of transformation or encoding. It only allows them to be viewed.
    However, it is possible, and easy, to give visitors access to these documents in various forms.
    All of this happens in the skeleton's configuration page. You need to go to the channel's administration area and choose, in the navigation, (...)

  • Authorizations overridden by plugins

    27 April 2010

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

  • Retrieving information from the master site when installing an instance

    26 November 2010

    Purpose
    On the main site, a shared (mutualized) instance is defined by several things: the data in the spip_mutus table; its logo; its main author (id_admin in the spip_mutus table, matching an id_auteur in the spip_auteurs table), who will be the only one able to definitively create the mutualized instance;
    It can therefore be quite sensible to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...)

On other sites (4768)

  • How to resize an mp4 video and reduce frame rate while keeping quality?

    1 December 2019, by Jules

    I'm trying to resize (keeping quality) and reduce the frame rate to 30; I've seen various commands, but I'm having difficulty.

    This seems to resize nicely:

    ffmpeg -i final-video.mp4 -aspect 886:1920 -c copy final-resized.mp4

    I've also seen -r 30 and -filter:v fps=fps=30.

    But neither seems to work in conjunction with the resize command.

    I've seen posts like this one:
    Re-sampling H264 video to reduce frame rate while maintaining high image quality
    But that approach takes a long time.
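
    Note that -aspect together with -c copy only rewrites the display-aspect-ratio metadata; it does not rescale any pixels. A pattern worth trying (a sketch, not tested here; the CRF and preset values are just example quality settings) is to chain the scale and fps filters and re-encode, since a stream copy cannot change frame size or rate:

        ffmpeg -i final-video.mp4 -vf "scale=886:1920,fps=30" -c:v libx264 -crf 18 -preset slow -c:a copy final-resized.mp4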

  • Stream h264 to javafx possibly using javacv/ffmpeg

    4 October 2018, by cagney

    I'm really stuck on getting a video stream to play in a JavaFX project.

    — Short version:

    I'm streaming h264/AVCC-flavor video from an Android phone to a desktop computer. However, JavaFX doesn't have an easy solution for displaying a stream, so I'm attempting to use JavaCV/FFmpeg to make this work. However, I am getting errors from FFmpeg.

    1) Is there a better way to display streaming video in JavaFX?

    2) Do you have a sample project or a good tutorial for the JavaCV FFmpegFrameGrabber?

    3) I think I may be missing some small detail in my code, but I'm not sure what it would be.

    — Longer version:

    1) On the Android end I'm getting video using MediaRecorder. In order to get the SPS/PPS info, I record and save a small movie to the device and then parse the SPS and PPS data.

    2) Next, on the Android side, I split up the NALUs to meet the MTU requirement and send them over a UDP connection to my desktop.

    3) On my desktop I reassemble the NALUs (or trash them if they lose data) and feed them into an input stream that I pass to the FFmpegFrameGrabber constructor; a sketch of that plumbing is shown below.
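
    For reference, a minimal sketch of that desktop-side plumbing, under stated assumptions: NaluPipe and its buffer size are hypothetical, and the grabber is assumed to be opened with format "h264" (a raw Annex B elementary stream, which needs start-code-prefixed SPS and PPS before the first IDR slice):

        import java.io.IOException;
        import java.io.PipedInputStream;
        import java.io.PipedOutputStream;

        // Hypothetical helper: collects reassembled NALUs and exposes them as an
        // Annex B byte stream that FFmpegFrameGrabber can read.
        public class NaluPipe {

            private static final byte[] START_CODE = {0x00, 0x00, 0x00, 0x01};

            private final PipedOutputStream out = new PipedOutputStream();
            private final PipedInputStream in;

            public NaluPipe(byte[] sps, byte[] pps) throws IOException {
                in = new PipedInputStream(out, 1 << 16); // 64 KiB buffer (arbitrary)
                // The raw h264 demuxer expects SPS and PPS ahead of the first IDR slice.
                writeNalu(sps);
                writeNalu(pps);
            }

            public synchronized void writeNalu(byte[] nalu) throws IOException {
                out.write(START_CODE);
                out.write(nalu);
            }

            // Hand this to new FFmpegFrameGrabber(inputStream), with setFormat("h264").
            public PipedInputStream getInputStream() {
                return in;
            }
        }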

    — The code and logs:

    The errors are long and numerous, depending on the flavor I feed it. Here are two separate examples, which are usually repeated at great length:

    [h264 @ 0000020225907a40] non-existing PPS 0 referenced
    [h264 @ 0000020225907a40] decode_slice_header error
    [h264 @ 0000020225907a40] no frame!

    [h264 @ 00000163d8637a40] illegal aspect ratio
    [h264 @ 00000163d8637a40] pps_id 3412 out of range
    [AVBSFContext @ 00000163e28a0e00] Invalid NAL unit 0, skipping.

    !! One big caveat that I am aware of is that I am not feeding FFmpeg the timestamps
    I created on the Android device. I think it should still show distorted images without this, though.

    Because I have spent all day guessing and trying, I have several "flavors" of data I have pushed through. I am only showing the first section of each NAL; I believe that, if it were correct, this would at least show a garbage image, as long as my SPS and PPS are right.

    sps: 67 80 80 1E E9 01 68 22 FD C0 36 85 09 A8
    pps: 68 06 06 E2

    Below is Annex B style.
    These were each prefixed with either 00 00 01 or 00 00 00 01.

    Debug transfer 65 B8 40 0B E5 B8 7B 80 5B 85
    Debug transfer 41 E2 20 7A 74 34 3B D6 BE FA
    Debug transfer 41 E4 40 2F 01 E0 0C 06 EE 91
    Debug transfer 41 E6 60 3E A1 20 5A 02 3C 6D
    Debug transfer 41 E8 80 13 B0 B9 82 C3 03 F4
    Debug transfer 41 EC C0 1B A3 0C 28 F1 B0 C8
    Debug transfer 41 EE E0 1F CE 07 30 EE 05 06
    Debug transfer 41 F1 00 08 ED 80 9C 20 09 73
    Debug transfer 41 F3 20 09 E9 00 86 60 21 C3
    VideoDecoderaddPacket type: 24
    Debug transfer 67 80 80 1E E9 01 68 22 FD C0
    Debug transfer 68 06 06 E2
    Debug transfer 65 B8 20 00 9F 80 78 00 12 8A
    Debug transfer 41 E2 20 09 F0 1E 40 7B 0C E0
    Debug transfer 41 E4 40 09 F0 29 30 D6 00 AE
    Debug transfer 41 E6 60 09 F1 48 31 80 99 40
    [h264 @ 000001c771617a40] non-existing PPS 0 referenced

    Here I tried AVCC style. You can see the first line is the combination of the SPS and PPS, followed by an IDR and then repeated non-IDR NALs.

    Debug transfer 18 00 0E 67 80 80 1E E9 01 68
    Debug transfer 00 02 4A 8F 65 B8 20 00 9F C5
    Debug transfer 00 02 2F DA 41 E2 20 09 E8 0F
    Debug transfer 00 02 2C 34 41 E4 40 09 F4 20
    Debug transfer 00 02 4D 92 41 E6 60 09 FC 2B
    Debug transfer 00 02 47 02 41 E8 80 09 F0 72
    Debug transfer 00 02 52 50 41 EA A0 09 EC 0F
    Debug transfer 00 02 58 8A 41 EC C0 09 FC 6F
    Debug transfer 00 02 55 F9 41 EE E0 09 FC 6E
    Debug transfer 00 02 4D 79 41 F1 00 09 F0 3E
    Debug transfer 00 02 4D B6 41 F3 20 09 E8 64

    The following class is where I try to get JavaCV/FFmpeg to show the video. I don't think it's an ideal solution and am researching CanvasFrame as a replacement for the ImageView.

    import java.awt.image.BufferedImage;
    import java.io.InputStream;

    import javafx.application.Platform;
    import javafx.embed.swing.SwingFXUtils;
    import javafx.scene.image.ImageView;

    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.javacv.FrameGrabber;
    import org.bytedeco.javacv.Java2DFrameConverter;

    public class ImageDecoder {

        private final static String TAG = "ImageDecoder ";

        private ImageDecoder() {
        }

        public static void streamImageToImageView(
                final ImageView view,
                final InputStream inputStream,
                final String format,
                final int frameRate,
                final int bitrate,
                final String preset,
                final int numBuffers
        ) {
            System.out.println("Image Decoder Starting...");

            try (final FrameGrabber grabber = new FFmpegFrameGrabber(inputStream)) {

                final Java2DFrameConverter converter = new Java2DFrameConverter();

                // setFrameNumber(frameRate) was a bug: it requests a seek position,
                // not a capture rate.
                grabber.setFrameRate(frameRate);
                grabber.setFormat(format);
                grabber.setVideoBitrate(bitrate);
                grabber.setVideoOption("preset", preset);
                grabber.setNumBuffers(numBuffers);

                System.out.println("Image Decoder waiting on grabber.start...");
                grabber.start();   // ---- this call blocks until data arrives

                System.out.println("Image Decoder looping -- hit stop");
                while (!Thread.interrupted()) {
                    final Frame frame = grabber.grab();
                    if (frame != null) {
                        final BufferedImage bufferedImage = converter.convert(frame);
                        if (bufferedImage != null) {
                            // Hand the decoded image to the JavaFX application thread
                            Platform.runLater(() ->
                                    view.setImage(SwingFXUtils.toFXImage(bufferedImage, null)));
                        } else {
                            System.out.println("no buf im");
                        }
                    } else {
                        System.out.println("no fr");
                        Thread.currentThread().interrupt();
                    }
                }

            } catch (Exception e) {
                System.out.print(TAG + e);
            }
        }
    }
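
    For completeness, a hedged sketch of how this might be wired up, using the hypothetical NaluPipe from above. Since grabber.start() blocks until data arrives, the call must stay off the JavaFX application thread:

        // Hypothetical wiring: keep the blocking grab loop on a background thread.
        final NaluPipe pipe = new NaluPipe(sps, pps);
        final Thread decodeThread = new Thread(() ->
                ImageDecoder.streamImageToImageView(
                        imageView, pipe.getInputStream(),
                        "h264",        // raw Annex B elementary stream
                        30,            // nominal frame rate
                        4_000_000,     // bitrate hint
                        "ultrafast",   // preset
                        4),            // numBuffers
                "decode-thread");
        decodeThread.setDaemon(true);
        decodeThread.start();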

    Any help is greatly appreciated.

  • How to Stream RTP (IP camera) Into React App setup

    10 November 2024, by sharon2469

    I am trying to get a live broadcast from an IP camera, or any other broadcast coming from an RTP/RTSP source, into my React application, and it must stay live.

    My setup at the moment is:

    IP Camera -> (RTP) -> FFmpeg -> (udp) -> Server(nodeJs) -> (WebRTC) -> React app

    In the current situation there is almost no delay, but there are a few things here that I can't avoid and don't understand why, so here are my questions:

    1) First, is the setup even correct, and is this the only way to stream RTP video into a web app?

    2) Is it possible to avoid re-encoding the stream? The RTP transmission necessarily arrives in H.264, so I shouldn't really need to run the following command:

        return spawn('ffmpeg', [
            '-re',                              // Read input at its native frame rate Important for live-streaming
            '-probesize', '32',                 // Set probing size to 32 bytes (32 is minimum)
            '-analyzeduration', '1000000',      // An input duration of 1 second
            '-c:v', 'h264',                     // Video codec of input video
            '-i', 'rtp://238.0.0.2:48888',      // Input stream URL
            '-map', '0:v?',                     // Select video from input stream
            '-c:v', 'libx264',                  // Video codec of output stream
            '-preset', 'ultrafast',             // Faster encoding for lower latency
            '-tune', 'zerolatency',             // Optimize for zero latency
            // '-s', '768x480',                 // Adjust the resolution (experiment with values)
            '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL
        ]);

    As you can see, this command re-encodes to libx264. But if I give FFmpeg the parameter '-c:v', 'copy' instead of '-c:v', 'libx264', FFmpeg throws an error saying it doesn't know how to encode h264 and only knows libx264. Basically, I want to stop the re-encode because there is really no need for it; the stream is already encoded as H.264. Are there any recommendations here?
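
    For reference, a hedged sketch of the stream-copy variant: camera.sdp is a hypothetical SDP file describing the RTP session (ffmpeg generally wants one for raw RTP input), and with -c:v copy nothing is decoded, so the input-side '-c:v h264' is dropped:

        ffmpeg -protocol_whitelist file,udp,rtp -i camera.sdp -map 0:v -c:v copy -f rtp rtp://127.0.0.1:48888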

    3) I thought about giving up FFmpeg completely, but the RTP packets arrive at 1200+ bytes, while WebRTC is limited to about 1280 bytes per packet. Is there a way to manage this fragmentation without damaging the video, and is it worth going down that path? I guess the whole jitter-buffer story comes into play here as well.

    This is my server-side code (this is just test code):

    import {
    MediaStreamTrack,
    randomPort,
    RTCPeerConnection,
    RTCRtpCodecParameters,
    RtpPacket,
} from 'werift'
import {Server} from "ws";
import {createSocket} from "dgram";
import {spawn} from "child_process";
import LoggerFactory from "./logger/loggerFactory";

//

const log = LoggerFactory.getLogger('ServerMedia')

// Websocket server -> WebRTC
const serverPort = 8888
const server = new Server({port: serverPort});
log.info(`Server Media start on port: ${serverPort}`);

// UDP server -> ffmpeg
const udpPort = 48888
const udp = createSocket("udp4");
// udp.bind(udpPort, () => {
//     udp.addMembership("238.0.0.2");
// })
udp.bind(udpPort)
log.info(`UDP port: ${udpPort}`)


const createFFmpegProcess = () => {
    log.info(`Start ffmpeg process`)
    return spawn('ffmpeg', [
        '-re',                              // Read input at its native frame rate Important for live-streaming
        '-probesize', '32',                 // Set probing size to 32 bytes (32 is minimum)
        '-analyzeduration', '1000000',      // An input duration of 1 second
        '-c:v', 'h264',                     // Video codec of input video
        '-i', 'rtp://238.0.0.2:48888',      // Input stream URL
        '-map', '0:v?',                     // Select video from input stream
        '-c:v', 'libx264',                  // Video codec of output stream
        '-preset', 'ultrafast',             // Faster encoding for lower latency
        '-tune', 'zerolatency',             // Optimize for zero latency
        // '-s', '768x480',                    // Adjust the resolution (experiment with values)
        '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL
    ]);

}

let ffmpegProcess = createFFmpegProcess();


const attachFFmpegListeners = () => {
    // Capture standard output and print it
    ffmpegProcess.stdout.on('data', (data) => {
        log.info(`FFMPEG process stdout: ${data}`);
    });

    // Capture standard error and print it
    ffmpegProcess.stderr.on('data', (data) => {
        console.error(`ffmpeg stderr: ${data}`);
    });

    // Listen for the exit event
    ffmpegProcess.on('exit', (code, signal) => {
        if (code !== null) {
            log.info(`ffmpeg process exited with code ${code}`);
        } else if (signal !== null) {
            log.info(`ffmpeg process killed with signal ${signal}`);
        }
    });
};


attachFFmpegListeners();


server.on("connection", async (socket) => {
    const payloadType = 96; // It is a numerical value that is assigned to each codec in the SDP offer/answer exchange -> for H264
    // Create a peer connection with the codec parameters set in advance.
    const pc = new RTCPeerConnection({
        codecs: {
            audio: [],
            video: [
                new RTCRtpCodecParameters({
                    mimeType: "video/H264",
                    clockRate: 90000, // 90000 is the default value for H264
                    payloadType: payloadType,
                }),
            ],
        },
    });

    const track = new MediaStreamTrack({kind: "video"});


    udp.on("message", (data) => {
        console.log(data)
        const rtp = RtpPacket.deSerialize(data);
        rtp.header.payloadType = payloadType;
        track.writeRtp(rtp);
    });

    udp.on("error", (err) => {
        console.log(err)

    });

    udp.on("close", () => {
        console.log("close")
    });

    pc.addTransceiver(track, {direction: "sendonly"});

    await pc.setLocalDescription(await pc.createOffer());
    const sdp = JSON.stringify(pc.localDescription);
    socket.send(sdp);

    socket.on("message", (data: any) => {
        if (data.toString() === 'resetFFMPEG') {
            ffmpegProcess.kill('SIGINT');
            log.info(`FFMPEG process killed`)
            setTimeout(() => {
                ffmpegProcess = createFFmpegProcess();
                attachFFmpegListeners();
            }, 5000)
        } else {
            pc.setRemoteDescription(JSON.parse(data));
        }
    });
});

    And this is the frontend:

    <script crossorigin src="https://unpkg.com/react@16/umd/react.development.js"></script>
    <script crossorigin src="https://unpkg.com/react-dom@16/umd/react-dom.development.js"></script>
    <script crossorigin src="https://cdnjs.cloudflare.com/ajax/libs/babel-core/5.8.34/browser.min.js"></script>
    <script src="https://cdn.jsdelivr.net/npm/babel-regenerator-runtime@6.5.0/runtime.min.js"></script>

    <div id="app1"></div>

    <script type="text/babel">
        let rtc;

        const App = () => {
            const [log, setLog] = React.useState([]);
            const videoRef = React.useRef();
            const socket = new WebSocket("ws://localhost:8888");
            const [peer, setPeer] = React.useState(null); // Add state to keep track of the peer connection

            React.useEffect(() => {
                (async () => {
                    await new Promise((r) => (socket.onopen = r));
                    console.log("open websocket");

                    const handleOffer = async (offer) => {
                        console.log("new offer", offer.sdp);

                        const updatedPeer = new RTCPeerConnection({
                            iceServers: [],
                            sdpSemantics: "unified-plan",
                        });

                        updatedPeer.onicecandidate = ({ candidate }) => {
                            if (!candidate) {
                                const sdp = JSON.stringify(updatedPeer.localDescription);
                                console.log(sdp);
                                socket.send(sdp);
                            }
                        };

                        updatedPeer.oniceconnectionstatechange = () => {
                            console.log("oniceconnectionstatechange", updatedPeer.iceConnectionState);
                        };

                        updatedPeer.ontrack = (e) => {
                            console.log("ontrack", e);
                            videoRef.current.srcObject = e.streams[0];
                        };

                        await updatedPeer.setRemoteDescription(offer);
                        const answer = await updatedPeer.createAnswer();
                        await updatedPeer.setLocalDescription(answer);

                        setPeer(updatedPeer);
                    };

                    socket.onmessage = (ev) => {
                        const data = JSON.parse(ev.data);
                        if (data.type === "offer") {
                            handleOffer(data);
                        } else if (data.type === "resetFFMPEG") {
                            // Handle the resetFFMPEG message
                            console.log("FFmpeg reset requested");
                        }
                    };
                })();
            }, []);

            const sendRequestToResetFFmpeg = () => {
                socket.send("resetFFMPEG");
            };

            return (
                <div>
                    Video:
                    <video ref={videoRef} autoPlay muted />
                    <button onClick={() => sendRequestToResetFFmpeg()}>Reset FFMPEG</button>
                </div>
            );
        };

        ReactDOM.render(<App />, document.getElementById("app1"));
    </script>