
Other articles (65)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used as a fallback.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both on conventional computers (...)

  • HTML5 audio and video support

13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (9052)

  • Streaming through FFMpeg stdout

30 June 2014, by SandyEmerald

    I need to stream my generated images through ffmpeg via stdout. Is it possible? If yes, could you give me some ffmpeg examples, such as commands or general concepts?

    The final task is to create a streaming video server over RTP, so I can connect to it via VLC or another client. And I need to do it without using disk space, only through streams.

    Right now I have a Qt program which generates images. I know that I can write them to stdout using QDataStream.
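    In outline, this is possible because ffmpeg can read raw frames from stdin (`-i -`) and hand them straight to its RTP muxer, so nothing ever touches the disk. Below is a minimal Python sketch of that pipeline; the frame size, port, and URL are placeholder values, and a real Qt program would write its own image bytes instead of the blank frames used here.

```python
import subprocess

def rtp_command(width, height, fps, url, sdp_path="stream.sdp"):
    """Build an ffmpeg invocation that reads raw BGR frames from stdin
    and serves them as an RTP stream, writing the SDP file VLC can open."""
    return [
        "ffmpeg", "-y",
        "-f", "rawvideo", "-pix_fmt", "bgr24",   # raw frames arrive on stdin
        "-s", f"{width}x{height}", "-r", str(fps),
        "-i", "-",                               # "-" == read from stdin
        "-c:v", "libx264", "-pix_fmt", "yuv420p",
        "-preset", "ultrafast", "-tune", "zerolatency",
        "-sdp_file", sdp_path,                   # session description for clients
        "-f", "rtp",                             # RTP muxer, no file on disk
        url,
    ]

if __name__ == "__main__":
    cmd = rtp_command(320, 240, 25, "rtp://127.0.0.1:5004")
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    blank = bytes(320 * 240 * 3)                 # one black BGR frame
    for _ in range(250):                         # ~10 seconds at 25 fps
        proc.stdin.write(blank)
    proc.stdin.close()
    proc.wait()
```

    A client would then open the generated `stream.sdp` in VLC, since bare RTP carries no codec description of its own.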

  • How to use FFMPEG to send image via RTMP using ProcessBuilder

13 May 2022, by ljnoah

    I have a callback function that gives me frames as a bytes type, which I would like to pass to FFMPEG so it writes them to an rtmp URL. But I don't really have any experience with ffmpeg, and so far I have not been able to find an example of how to do it. Basically, I would like to know how I can take the byte array FrameData that holds the images I am getting, and hand it to ffmpeg via ProcessBuilder so it is streamed to a server.

    


     private byte[] FrameData = new byte[384 * 288 * 4];
    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
          @Override
          public void onFrame(final ByteBuffer frameData) {
              frameData.clear();
              frameData.get(FrameData, 0, frameData.capacity());
              ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y", "-f", "rawvideo", "-vcodec", "rawvideo", "-pix_fmt", "bgr24",
                    "-r", "25",
                    "-i", "-",
                    "-c:v", "libx264",
                    "-pix_fmt", "yuv420p",
                    "-preset", "ultrafast",
                    "-f", "flv",
                    "rtmp://192.168.0.13:1935/live/test");
              Log.e(TAG, "mIFrameCallback: onFrame------");
              try {
                  pb.inheritIO().start().waitFor();
              } catch (InterruptedException | IOException e) {
                  e.printStackTrace();
              }
          }
      };


    


    This callback gives me the frames from my camera on the fly and writes them to FrameData, which I can compress to a bitmap if needed. The current attempt isn't working, as I have no idea how to pass my byte array to ffmpeg so that the camera frames stored in the FrameData buffer are pushed via RTMP/RTSP to my server IP. In Python I would use a similar approach, like this:

    


     import subprocess
    import numpy as np

    fps = 25
    width = 224
    height = 224
    command = ['ffmpeg',  '-y',  '-f', 'rawvideo',  '-vcodec', 'rawvideo',  '-pix_fmt', 'bgr24',
               '-s', "{}x{}".format(width, height),
               '-r', str(fps),
               '-i', '-',
               '-c:v', 'libx264',
               '-pix_fmt', 'yuv420p',
               '-preset', 'ultrafast',
               '-f', 'flv',
               'rtmp://192.168.0.13:1935/live/test']
    p = subprocess.Popen(command, stdin=subprocess.PIPE)
    while True:
        # Dummy frame; a real application writes its own image data here.
        frame = np.random.randint(0, 255, size=(224, 224, 3), dtype=np.uint8)
        p.stdin.write(frame.tobytes())


    


    I really don't understand how to write my byte arrays to ffmpeg as I would in the Python example above. What I tried was this:

    


     private byte[] FrameData = new byte[384 * 288 * 4];
    String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @RequiresApi(api = Build.VERSION_CODES.O)
        @Override
        public void onFrame(final ByteBuffer frameData) {
            frameData.clear();
            frameData.get(FrameData, 0, frameData.capacity());
            ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y", "-f", "rawvideo", "-vcodec", "rawvideo", "-pix_fmt", "bgr24",
                    "-r", "25",
                    "-i", "-",
                    "-c:v", "libx264",
                    "-pix_fmt", "yuv420p",
                    "-preset", "ultrafast",
                    "-f", "flv",
                    "rtmp://192.168.0.13:1935/live/test");
            try {
                Log.e(TAG, "mIFrameCallback: onFrame------");
                // Show ffmpeg's own error output in the log; stdin stays a pipe.
                pb.redirectError(ProcessBuilder.Redirect.INHERIT);
                Log.e(TAG, "frame data check 1");
                Process p = pb.start();
                Log.e(TAG, "frame data check 2");
                p.getOutputStream().write(FrameData);
                p.getOutputStream().flush();
                Log.e(TAG, "frame data check 3");
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    };


    


    Unfortunately, no results whatsoever. It appears that the try/catch block is not executed, and I am not even sure whether this is the right way to send bytes over an RTMP stream.

    


    Edit: I have fixed the indicated issue with the ProcessBuilder being called twice and logged the calls made inside the code; it only gets as far as Log.e(TAG, "frame data check 1");. However, I am still not sure whether this is the right way to write an array of bytes to ffmpeg for RTMP streaming.
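    One likely reason the attempts above stall is that a new ffmpeg process is spawned for every single frame, while the working Python version starts ffmpeg once and keeps writing to the same pipe. The sketch below shows that long-lived-process pattern in Python; the same shape maps onto Java as a single ProcessBuilder.start() followed by repeated writes to Process.getOutputStream(). The 'cat' command is a stand-in so the sketch runs without ffmpeg installed.

```python
import subprocess

class FrameStreamer:
    """Start one long-lived child process and feed it raw frames on stdin.

    The point is the lifecycle: spawn once, write many times, close once --
    never one process per frame."""

    def __init__(self, command):
        self.proc = subprocess.Popen(
            command, stdin=subprocess.PIPE, stdout=subprocess.DEVNULL
        )

    def write_frame(self, frame_bytes):
        # Reuses the same pipe for every frame -- no per-frame process spawn.
        self.proc.stdin.write(frame_bytes)
        self.proc.stdin.flush()

    def close(self):
        # Closing stdin signals end-of-stream; then reap the child process.
        self.proc.stdin.close()
        return self.proc.wait()

if __name__ == "__main__":
    # With real ffmpeg the command would be the rawvideo->flv invocation above;
    # 'cat' stands in here so the sketch runs anywhere.
    s = FrameStreamer(["cat"])
    for _ in range(10):
        s.write_frame(bytes(384 * 288 * 3))
    s.close()
```

    In the Java callback this means creating the process outside onFrame (once, when streaming starts) and having each onFrame invocation only do the getOutputStream().write(...) call.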

    


  • How do video editors show a real-time preview of videos? [closed]

3 June 2024, by SWIK

    I am trying to create a simple video editor that combines two videos by layering one over the other. I can easily do that with ffmpeg, but I am not sure how to show a preview before rendering the final video. How do video editors display a preview without building the whole file? I am looking to create a React application.
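    Editors generally preview by decoding and compositing in real time rather than encoding a file: the same overlay operation is applied on the fly and only drawn to the screen. One way to check a composition locally, sketched here with placeholder file names, is to hand ffplay a lavfi filtergraph; in a browser-based React editor the analogous technique is layering two <video> elements with CSS, or drawing both onto a canvas each frame.

```python
def preview_command(top_video, bottom_video):
    """Build an ffplay invocation that composites two videos live with the
    overlay filter -- playback only, nothing is encoded to disk.

    File names are placeholders; any two local video files would do."""
    graph = (
        f"movie={bottom_video}[bg];"   # decode the background clip
        f"movie={top_video}[fg];"      # decode the clip to layer on top
        "[bg][fg]overlay=10:10"        # composite fg over bg at (10,10)
    )
    return ["ffplay", "-f", "lavfi", "-i", graph]
```

    Running the returned command plays the composited result immediately, which is close to what a desktop editor's preview pane does internally.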