Advanced search

Media (0)

No media matching your criteria is available on the site.

Other articles (97)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, MediaSPIP init automatically sets up a preconfiguration so that the new feature is immediately operational. It is therefore not necessary to go through a configuration step for this.

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010, by

    To work properly, the central/master site of the farm needs several plugins in addition to those used by the channels: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a pooled instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (7599)

  • What's wrong in this command for merging 2 videos using ffmpeg? [duplicate]

    8 July 2019, by vikram

    I am trying to merge 2 videos: the first is recorded with a custom camera MediaRecorder (1280*720) and has an audio track, and the second is created by merging 2 images and has no audio track, but it won't work.

          String[] merge2video = { "-i", path1, "-i", path2,
                  "-filter_complex", "[0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a]",
                  "-map", "[v]", "-map", "[a]",
                  "-strict", "experimental",
                  "-vcodec", "libx264", "-preset", "ultrafast", "-crf", "30", output};

    I only want to merge the 2 videos; a possible workaround is sketched below.
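
    A likely cause: concat=n=2:v=1:a=1 expects an audio stream from every segment, but the second clip has none. Below is a minimal sketch of one common workaround, assuming path1, path2 and output are defined as above and clip2Duration is a placeholder string holding the second clip's length in seconds: a silent anullsrc input is used as the audio of the second segment (both audio tracks must end up with the same sample rate and channel layout).

          // Hypothetical sketch: input 2 is silent audio used as segment 2's audio track.
          String[] mergeWithSilence = { "-i", path1,          // clip 1: video + audio
                  "-i", path2,                                // clip 2: video only
                  "-f", "lavfi", "-t", clip2Duration,         // bound the silence to clip 2's length
                  "-i", "anullsrc=channel_layout=stereo:sample_rate=44100",
                  "-filter_complex",
                  "[0:v][0:a][1:v][2:a]concat=n=2:v=1:a=1[v][a]",
                  "-map", "[v]", "-map", "[a]",
                  "-c:v", "libx264", "-preset", "ultrafast", "-crf", "30", output};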

  • How do video editors show a real-time preview of videos? [closed]

    3 June 2024, by SWIK

    I am trying to create a simple video editor that combines two videos by layering one over the other. I can easily do it with ffmpeg, but I am not sure how to show a preview of the result before producing the final video. How do video editors display a preview without rendering the output? I am looking to create a React application.

    


  • How to use FFMPEG to send images via RTMP using ProcessBuilder

    13 May 2022, by ljnoah

    I have a callback function that gives me frames as bytes, which I would like to pass to FFMPEG so it writes them to an RTMP URL. But I don't really have any experience with ffmpeg, so thus far I have not been able to find an example of how to do it. Basically, I would like to know how I can take the byte array FrameData, which holds the images I am getting, and write it to ffmpeg via ProcessBuilder so that it is streamed to a server.

    


     private byte[] FrameData = new byte[384 * 288 * 4];
     private final IFrameCallback mIFrameCallback = new IFrameCallback() {
         @Override
         public void onFrame(final ByteBuffer frameData) {
             frameData.clear();
             frameData.get(FrameData, 0, frameData.capacity());
             ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y", "-f", "rawvideo", "vcodec", "rawvideo", "-pix_fmt", "bgr24",
                     "-r", "25",
                     "-i", "-",
                     "-c:v", "libx264",
                     "-pix_fmt", "yuv420p",
                     "-preset", "ultrafast",
                     "-f", "flv",
                     "rtmp://192.168.0.13:1935/live/test");
             Log.e(TAG, "mIFrameCallback: onFrame------");
             try {
                 pb.inheritIO().start().waitFor();
             } catch (InterruptedException | IOException e) {
                 e.printStackTrace();
             }
         }
     };


    


    This callback gives me the frames from my camera on the fly and writes them to FrameData, which I can compress to a bitmap if needed. The current attempt isn't working, as I have no idea how to pass my byte array to ffmpeg so that the frames stored in the FrameData buffer are pushed via RTMP/RTSP to my server IP. In Python I would use a similar approach, like this:

    


    import subprocess
    import numpy as np
    fps = 25
    width = 224
    height = 224
    command = ['ffmpeg',  '-y',  '-f', 'rawvideo',  '-vcodec', 'rawvideo',  '-pix_fmt', 'bgr24',
               '-s', "{}x{}".format(width, height),
               '-r', str(fps),
               '-i', '-',
               '-c:v', 'libx264',
               '-pix_fmt', 'yuv420p',
               '-preset', 'ultrafast',
               '-f', 'flv',
               'rtmp://192.168.0.13:1935/live/test']
    p = subprocess.Popen(command, stdin=subprocess.PIPE)
    while(True):
        frame = np.random.randint([255], size=(224, 224, 3))
        frame = frame.astype(np.uint8)
        p.stdin.write(frame.tobytes())


    


    I really don't understand how to write my byte arrays to ffmpeg the way I do in the Python example above. What I tried was this:

    


     private byte[] FrameData = new byte[384 * 288 * 4];
    String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @RequiresApi(api = Build.VERSION_CODES.O)
        @Override
        public void onFrame(final ByteBuffer frameData) {
            frameData.clear();
            frameData.get(FrameData, 0, frameData.capacity());
            ProcessBuilder pb = new ProcessBuilder(ffmpeg ,  "-y",  "-f", "rawvideo",  "vcodec", "rawvideo",  "-pix_fmt", "bgr24",
                    "-r", "25",
                    "-i", "-",
                    "-c:v", "libx264",
                    "-pix_fmt", "yuv420p",
                    "-preset", "ultrafast",
                    "-f", "flv",
                    "rtmp://192.168.0.13:1935/live/test");
            try {
                Log.e(TAG, "mIFrameCallback: onFrame------");
                pb.redirectInput();
                pb.redirectError();
                Log.e(TAG, "frame data check 1");
                Process p = pb.start();
                Log.e(TAG, "frame data check 2");
                p.getOutputStream().write(FrameData);
                Log.e(TAG, "frame data check 3");
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    };


    


    Unfortunately, there are no results whatsoever. It appears that the try/catch block is not executed, and I am not even sure whether this is the right way to send bytes over an RTMP stream.

    


    Edit: I have fixed the indicated issue with the ProcessBuilder being called twice and logged the calls made inside the code; it only gets as far as Log.e(TAG, "frame data check 1");. However, I am still not sure whether this is the right way to write an array of bytes to ffmpeg for RTMP streaming.
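
    By analogy with the Python snippet above, one approach (a minimal sketch under assumptions, not a verified solution) is to start ffmpeg once, keep its stdin open, and write every frame to that stream, instead of building a new ProcessBuilder for every frame. The rawvideo input also needs an explicit frame size (-s WIDTHxHEIGHT), and "vcodec" in the code above is missing its leading dash. The class and method names below (FrameStreamer, pushFrame) are hypothetical.

     // Hypothetical sketch: launch ffmpeg once, then push raw frames to its stdin.
     import java.io.IOException;
     import java.io.OutputStream;

     public class FrameStreamer {
         private Process process;
         private OutputStream ffmpegStdin;

         // ffmpegPath: path to the ffmpeg binary; width/height must match the raw frame layout.
         public void start(String ffmpegPath, int width, int height) throws IOException {
             ProcessBuilder pb = new ProcessBuilder(ffmpegPath, "-y",
                     "-f", "rawvideo", "-pix_fmt", "bgra",      // assumes 4 bytes per pixel, as in the 384*288*4 buffer
                     "-s", width + "x" + height,                // rawvideo input needs an explicit size
                     "-r", "25",
                     "-i", "-",                                 // read frames from stdin
                     "-c:v", "libx264", "-pix_fmt", "yuv420p", "-preset", "ultrafast",
                     "-f", "flv", "rtmp://192.168.0.13:1935/live/test");
             pb.redirectErrorStream(true);                      // merge ffmpeg's log output into one stream
             process = pb.start();
             ffmpegStdin = process.getOutputStream();           // this is the child process's stdin
         }

         // Call this from IFrameCallback.onFrame after copying the ByteBuffer into a byte[].
         public void pushFrame(byte[] frameBytes) throws IOException {
             ffmpegStdin.write(frameBytes);
             ffmpegStdin.flush();
         }

         public void stop() throws IOException {
             ffmpegStdin.close();                               // EOF lets ffmpeg finish the stream
             process.destroy();
         }
     }

    With something along these lines, onFrame would only call pushFrame(FrameData), so ffmpeg keeps reading frames from the same pipe, just as p.stdin.write(frame.tobytes()) does in the Python version.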