Advanced search

Media (0)


No media matching your criteria is available on this site.

Other articles (57)

  • Participate in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so it can reach new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You just need to sign up to the translators' mailing list to ask for more information.
    At present, MediaSPIP is only available in French and (...)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, provided your MédiaSpip installation is at version 0.2 or higher. If in doubt, contact your MédiaSpip administrator to find out.

On other sites (8478)

  • How to use FFMPEG to send image via RTMP using ProcessBuilder

    13 May 2022, by ljnoah

    I have a callback function that gives me frames as bytes, which I would like to pass to FFmpeg so it can write them to an RTMP URL. I don't really have any experience with ffmpeg, and so far I have not been able to find an example of how to do this. Basically, I would like to know whether I can take the byte array FrameData, which holds the images I am getting, and write it to ffmpeg via ProcessBuilder so the frames are streamed to a server.

    


    private byte[] FrameData = new byte[384 * 288 * 4];

    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @Override
        public void onFrame(final ByteBuffer frameData) {
            // Copy the incoming frame into the reusable byte array
            frameData.clear();
            frameData.get(FrameData, 0, frameData.capacity());
            // Spawn ffmpeg reading raw video from stdin ("-i -") and pushing to the RTMP URL
            ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y",
                    "-f", "rawvideo", "-vcodec", "rawvideo", "-pix_fmt", "bgr24",
                    "-r", "25",
                    "-i", "-",
                    "-c:v", "libx264",
                    "-pix_fmt", "yuv420p",
                    "-preset", "ultrafast",
                    "-f", "flv",
                    "rtmp://192.168.0.13:1935/live/test");
            Log.e(TAG, "mIFrameCallback: onFrame------");
            try {
                pb.inheritIO().start().waitFor();
            } catch (InterruptedException | IOException e) {
                e.printStackTrace();
            }
        }
    };


    


    This callback gives me the frames from my camera on the fly and writes them to FrameData, which I can compress to a bitmap if needed. The current attempt isn't working, as I have no idea how to pass my byte array to ffmpeg so that the frames stored in the FrameData buffer are pushed via RTMP/RTSP to my server IP. In Python I would use a similar approach like this:

    


    import subprocess

    import numpy as np

    fps = 25
    width = 224
    height = 224

    # ffmpeg reads raw BGR frames from stdin ('-i -') and pushes them to the RTMP server
    command = ['ffmpeg', '-y', '-f', 'rawvideo', '-vcodec', 'rawvideo', '-pix_fmt', 'bgr24',
               '-s', "{}x{}".format(width, height),
               '-r', str(fps),
               '-i', '-',
               '-c:v', 'libx264',
               '-pix_fmt', 'yuv420p',
               '-preset', 'ultrafast',
               '-f', 'flv',
               'rtmp://192.168.0.13:1935/live/test']
    p = subprocess.Popen(command, stdin=subprocess.PIPE)
    while True:
        # Dummy frame: random noise with shape (height, width, channels)
        frame = np.random.randint(0, 255, size=(height, width, 3), dtype=np.uint8)
        p.stdin.write(frame.tobytes())


    


    I really don't understand how to write my byte arrays to ffmpeg the way this Python example does.
What I tried doing was this:

    


    private byte[] FrameData = new byte[384 * 288 * 4];
    String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);

    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @RequiresApi(api = Build.VERSION_CODES.O)
        @Override
        public void onFrame(final ByteBuffer frameData) {
            frameData.clear();
            frameData.get(FrameData, 0, frameData.capacity());
            ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y",
                    "-f", "rawvideo", "-vcodec", "rawvideo", "-pix_fmt", "bgr24",
                    "-r", "25",
                    "-i", "-",
                    "-c:v", "libx264",
                    "-pix_fmt", "yuv420p",
                    "-preset", "ultrafast",
                    "-f", "flv",
                    "rtmp://192.168.0.13:1935/live/test");
            try {
                Log.e(TAG, "mIFrameCallback: onFrame------");
                pb.redirectInput();
                pb.redirectError();
                Log.e(TAG, "frame data check 1");
                Process p = pb.start();
                Log.e(TAG, "frame data check 2");
                // Write the raw frame bytes to the ffmpeg process's stdin
                p.getOutputStream().write(FrameData);
                Log.e(TAG, "frame data check 3");
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    };


    


    Unfortunately, there are no results whatsoever. It appears that the try/catch block is never executed, and I am not even sure this is the right way to send bytes over an RTMP stream.

    


    Edit: I have fixed the indicated issue with the ProcessBuilder being called twice, and logged the calls made inside the code; it only gets as far as Log.e(TAG, "frame data check 1");. I am still not sure whether this is the right way to write an array of bytes to ffmpeg for RTMP streaming.
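
    For reference, a pattern that mirrors the Python example is to start ffmpeg once, keep its stdin open, and write each frame to it, rather than spawning a new process per frame as both attempts above do. Below is a minimal, untested sketch along those lines, reusing ffmpeg and FrameData from the snippets above (it also needs java.io.OutputStream and java.io.IOException imports). The "-s", "384x288" and "-pix_fmt", "rgba" values are assumptions inferred from the 384 * 288 * 4 buffer size: the rawvideo demuxer needs an explicit frame size, and bgr24 would mean 3 bytes per pixel, not 4.

    private Process ffmpegProcess;
    private OutputStream ffmpegStdin;

    // Start ffmpeg once; it keeps running and reads raw frames from its stdin.
    private void startFfmpeg() throws IOException {
        ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-y",
                "-f", "rawvideo",
                "-pix_fmt", "rgba",   // assumption: 4 bytes per pixel, matching 384 * 288 * 4
                "-s", "384x288",      // assumption: frame size; rawvideo input requires one
                "-r", "25",
                "-i", "-",            // "-" = read the frames from stdin
                "-c:v", "libx264",
                "-pix_fmt", "yuv420p",
                "-preset", "ultrafast",
                "-f", "flv",
                "rtmp://192.168.0.13:1935/live/test");
        pb.redirectErrorStream(true);
        ffmpegProcess = pb.start();
        ffmpegStdin = ffmpegProcess.getOutputStream();
    }

    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @Override
        public void onFrame(final ByteBuffer frameData) {
            frameData.clear();
            frameData.get(FrameData, 0, frameData.capacity());
            try {
                // Stream this frame; do not waitFor() here, ffmpeg must keep running
                ffmpegStdin.write(FrameData);
                ffmpegStdin.flush();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    };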

    


  • ffmpeg : md5 of m3u8 playlists generated from same input video with different segment durations (after applying video filter) don't match

    15 July 2020, by Saurabh P Bhandari

    Here are a few commands I am using to convert and resize a video in mp4 format to an m3u8 playlist.

    


    For a given input video (mp4 format), generate multiple video-only segments with a segment duration of 30s (the same command also extracts the audio track):

    


    ffmpeg -loglevel error -i input.mp4 -dn -sn -an -c:v copy -bsf:v h264_mp4toannexb -copyts -start_at_zero -f segment -segment_time 30 30%03d.mp4 -dn -sn -vn -c:a copy audio.aac


    


    Apply a video filter (in this case scaling) to each segment and convert it to m3u8 format:

    


    ls 30*.mp4 | parallel 'ffmpeg -loglevel error -i {} -vf scale=-2:144 -hls_list_size 0 {}.m3u8'


    


    Store the list of generated m3u8 files in list.txt, one per line, in the format file 'segment-name.m3u8':

    


    for f in 30*.m3u8; do echo "file '$f'" >> list.txt; done


    


    Using the concat demuxer, combine all the segment files (which are in m3u8 format) and the audio to get one final m3u8 playlist pointing to segments with a duration of 10s:

    


    ffmpeg -loglevel error -f concat -i list.txt -i audio.aac -c copy -hls_list_size 0 -hls_time 10 output_30.m3u8


    



    


    I can change the segment duration in the first step from 30s to 60s and compare the md5 of the final m3u8 playlists generated in both cases using this command:

    


    ffmpeg -loglevel error -i <input m3u8 playlist> -f md5 -

    The md5 of the output files differ, i.e. the video streams of output_30.m3u8 and output_60.m3u8 are not the same.


    Can anyone elaborate on this?


    (I expected the md5 to be the same)
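
    If the aim is to compare the decoded frames rather than the muxed bytes, one option (a sketch, using the output names from the steps above) is ffmpeg's framemd5 muxer, which emits one hash per frame:

    ffmpeg -loglevel error -i output_30.m3u8 -map 0:v -f framemd5 md5_30.txt
    ffmpeg -loglevel error -i output_60.m3u8 -map 0:v -f framemd5 md5_60.txt
    diff md5_30.txt md5_60.txt

    If the per-frame hashes match while the whole-stream md5 differs, the difference comes from muxing (segment boundaries, timestamps) rather than from the pixels themselves.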


  • Streaming through FFMpeg stdout

    30 June 2014, by SandyEmerald

    I need to stream the images I am generating using ffmpeg, through stdout. Is this possible? If so, could you give me some ffmpeg examples, such as commands or general concepts?

    The final task is to create a streaming video server over RTP, so I can connect to it via VLC or another client. And I need to do it without using disk space, only through streams.

    Right now I have a Qt program which generates images. I know that I can write them to stdout using QDataStream.
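
    A minimal sketch of the kind of pipeline involved, assuming the generating program writes PPM images to stdout (my-image-generator is a placeholder, as are the address, port, and frame rate):

    # Hypothetical generator pipes PPM images into ffmpeg, which encodes and sends RTP.
    # -sdp_file writes the session description that a client such as VLC can open.
    my-image-generator | ffmpeg -f image2pipe -vcodec ppm -r 25 -i - \
        -c:v libx264 -preset ultrafast -f rtp -sdp_file stream.sdp rtp://127.0.0.1:1234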