
Media (1)

Word: - Tags -/wave

Other articles (71)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows, among other things: implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; and the creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Authorizations overridden by plugins

    27 April 2010

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

On other sites (6268)

  • How to use FFmpeg to send images via RTMP using ProcessBuilder

    13 May 2022, by ljnoah

    I have a callback function that delivers each frame as bytes, which I would like to pass to FFmpeg so that it is written to an RTMP URL. I don't really have any experience with ffmpeg, so I have not yet found an example of how to do this. Basically, I would like to know how I can take the byte array FrameData, which holds the incoming images, and hand it to ffmpeg via ProcessBuilder so that it is streamed to a server.

    


    private byte[] FrameData = new byte[384 * 288 * 4];
    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @Override
        public void onFrame(final ByteBuffer frameData) {
            // Copy the incoming frame into the reusable byte array
            frameData.clear();
            frameData.get(FrameData, 0, frameData.capacity());
            // "-vcodec" was missing its leading dash, and rawvideo input needs
            // an explicit frame size ("-s"); 384x288 matches the buffer above
            ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y",
                    "-f", "rawvideo", "-vcodec", "rawvideo", "-pix_fmt", "bgr24",
                    "-s", "384x288",
                    "-r", "25",
                    "-i", "-",
                    "-c:v", "libx264",
                    "-pix_fmt", "yuv420p",
                    "-preset", "ultrafast",
                    "-f", "flv",
                    "rtmp://192.168.0.13:1935/live/test");
            Log.e(TAG, "mIFrameCallback: onFrame------");
            try {
                pb.inheritIO().start().waitFor();
            } catch (InterruptedException | IOException e) {
                e.printStackTrace();
            }
        }
    };


    


    This callback gives me the frames from my camera on the fly and writes them to FrameData, which I can compress to a bitmap if needed. The current attempt isn't working: I don't know how to hand my byte array to ffmpeg so that the camera frames stored in the FrameData buffer are pushed via RTMP/RTSP to my server's IP. In Python I would use a similar approach like this:

    


    import subprocess

    import numpy as np

    fps = 25
    width = 224
    height = 224
    command = ['ffmpeg', '-y', '-f', 'rawvideo', '-vcodec', 'rawvideo', '-pix_fmt', 'bgr24',
               '-s', "{}x{}".format(width, height),
               '-r', str(fps),
               '-i', '-',
               '-c:v', 'libx264',
               '-pix_fmt', 'yuv420p',
               '-preset', 'ultrafast',
               '-f', 'flv',
               'rtmp://192.168.0.13:1935/live/test']
    p = subprocess.Popen(command, stdin=subprocess.PIPE)
    while True:
        # Dummy frame: random 8-bit BGR pixels of the expected size
        frame = np.random.randint(0, 255, size=(height, width, 3))
        frame = frame.astype(np.uint8)
        p.stdin.write(frame.tobytes())


    


    I really don't understand how to write my byte arrays to ffmpeg the way I would in the Python example above.
What I tried was this:

    


    private byte[] FrameData = new byte[384 * 288 * 4];
    String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @RequiresApi(api = Build.VERSION_CODES.O)
        @Override
        public void onFrame(final ByteBuffer frameData) {
            frameData.clear();
            frameData.get(FrameData, 0, frameData.capacity());
            // "-vcodec" dash and the "-s" frame size added, as above
            ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y",
                    "-f", "rawvideo", "-vcodec", "rawvideo", "-pix_fmt", "bgr24",
                    "-s", "384x288",
                    "-r", "25",
                    "-i", "-",
                    "-c:v", "libx264",
                    "-pix_fmt", "yuv420p",
                    "-preset", "ultrafast",
                    "-f", "flv",
                    "rtmp://192.168.0.13:1935/live/test");
            try {
                Log.e(TAG, "mIFrameCallback: onFrame------");
                // Note: the no-argument redirectInput()/redirectError() are
                // getters; they do not change where the process reads from
                pb.redirectInput();
                pb.redirectError();
                Log.e(TAG, "frame data check 1");
                Process p = pb.start();
                Log.e(TAG, "frame data check 2");
                p.getOutputStream().write(FrameData);
                p.getOutputStream().flush();
                Log.e(TAG, "frame data check 3");
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    };


    


    Unfortunately, no results whatsoever. It appears that the try/catch block is not executed, and I am not even sure whether this is the right way to send bytes over an RTMP stream.

    


    Edit: I have fixed the indicated issue with the ProcessBuilder being called twice and logged the calls made inside the code; it only gets as far as Log.e(TAG, "frame data check 1");. I am still not sure whether this is the right way to write an array of bytes to ffmpeg for RTMP streaming.
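    For reference, the Python example works because it starts a single ffmpeg process and then writes every frame to that process's stdin, whereas the Java code above spawns a fresh process per frame and never writes to its input. A minimal sketch of the equivalent long-lived approach, reusing the names (ffmpeg, FrameData, TAG) and the assumed 384x288 frame size from the snippets above, might look like this:

     // Sketch only: start ffmpeg once and keep its stdin open
     private OutputStream ffmpegStdin; // java.io.OutputStream

     private void startFfmpeg() throws IOException {
         ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y",
                 "-f", "rawvideo", "-vcodec", "rawvideo", "-pix_fmt", "bgr24",
                 "-s", "384x288", "-r", "25",
                 "-i", "-",
                 "-c:v", "libx264", "-pix_fmt", "yuv420p", "-preset", "ultrafast",
                 "-f", "flv", "rtmp://192.168.0.13:1935/live/test");
         pb.redirectErrorStream(true); // merge stderr so ffmpeg's own log is visible
         ffmpegStdin = pb.start().getOutputStream();
     }

     private final IFrameCallback mIFrameCallback = new IFrameCallback() {
         @Override
         public void onFrame(final ByteBuffer frameData) {
             frameData.clear();
             frameData.get(FrameData, 0, frameData.capacity());
             try {
                 // Same role as p.stdin.write(frame.tobytes()) in the Python example
                 ffmpegStdin.write(FrameData);
                 ffmpegStdin.flush();
             } catch (IOException e) {
                 Log.e(TAG, "writing frame to ffmpeg failed", e);
             }
         }
     };

    One more assumption worth flagging: a 384 * 288 * 4 buffer holds 4 bytes per pixel, so a 4-byte format such as "-pix_fmt", "bgra" may match the camera data better than bgr24; that detail depends on the camera library.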

    


  • How do video editors show real-time previews of videos? [closed]

    3 June 2024, by SWIK

    I am trying to create a simple video editor that combines two videos by layering one over the other. I can do this easily with ffmpeg, but I am not sure how to show a preview of the result before producing the final video. How do video editors display a preview without rendering the output? I am looking to build a React application.

    


  • What's wrong in this command for merging 2 videos using ffmpeg? [duplicate]

    8 July 2019, by vikram

    I am trying to merge 2 videos: the 1st video was created with a custom camera MediaRecorder (1280*720) and has an audio track, and the 2nd video was created by merging 2 images and has no audio track, but it won't work.

    String[] merge2video = { "-i", path1, "-i", path2,
            "-filter_complex",
            "[0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a]",
            "-map", "[v]", "-map", "[a]",
            "-strict", "experimental",
            "-vcodec", "libx264", "-preset", "ultrafast", "-crf", "30",
            output };

    I only want to merge the 2 videos.
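
    For what it's worth, concat=n=2:v=1:a=1 tells the concat filter that every segment contributes one video and one audio stream, so the command fails when the second input has no audio track. A hedged sketch of one common workaround is to add a silent anullsrc track as a third input; the "-t" duration, sample rate, and channel layout below are illustrative assumptions (they should match the second clip's length and the first clip's audio):

     String[] merge2video = { "-i", path1, "-i", path2,
             // hypothetical silent-audio input standing in for the 2nd video's audio
             "-f", "lavfi", "-t", "5",
             "-i", "anullsrc=channel_layout=stereo:sample_rate=44100",
             "-filter_complex",
             "[0:v][0:a][1:v][2:a] concat=n=2:v=1:a=1 [v][a]",
             "-map", "[v]", "-map", "[a]",
             "-vcodec", "libx264", "-preset", "ultrafast", "-crf", "30",
             output };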