Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to programmatically read an audio RTP stream using javacv and ffmpeg?

    21 May 2019, by Chris

    I am trying to read an audio RTP stream, produced by an ffmpeg command line, using JavaCV. I create a DatagramSocket that listens on a specified port, but I can't get the audio frames.

    I have tried different buffer types to play the audio through my speakers, but I get a stream of "Invalid return value 0 for stream protocol" error messages and no audio.

    I am running the following command to stream an audio file:

    ffmpeg -re -i /some/file.wav -ar 44100 -f mulaw -f rtp rtp://127.0.0.1:7780

    And here is an excerpt of my code so far:

    public class FrameGrabber implements Runnable {

    private static final TimeUnit SECONDS = TimeUnit.SECONDS;
    private InetAddress ipAddress;
    private DatagramSocket serverSocket;
    
    public FrameGrabber(Integer port) throws UnknownHostException, SocketException {
        super();
    
        this.ipAddress = InetAddress.getByName("192.168.44.18");
        serverSocket = new DatagramSocket(port, ipAddress);
    
    }
    
    public AudioFormat getAudioFormat() {
        float sampleRate = 44100.0F;
        // 8000,11025,16000,22050,44100
        int sampleSizeInBits = 16;
        // 8,16
        int channels = 1;
        // 1,2
        boolean signed = true;
        // true,false
        boolean bigEndian = false;
        // true,false
        return new AudioFormat(sampleRate, sampleSizeInBits, channels, signed, bigEndian);
    }
    
    @Override
    public void run() {
    
    
        byte[] buffer = new byte[2048];
        DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
    
        DataInputStream dis = new DataInputStream(new ByteArrayInputStream(packet.getData(), packet.getOffset(), packet.getLength()));
    
    
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(dis);
        grabber.setFormat("mulaw");
        grabber.setSampleRate((int) getAudioFormat().getSampleRate());
        grabber.setAudioChannels(getAudioFormat().getChannels());
    
        SourceDataLine soundLine = null;
    
    
        try {
            grabber.start();
    
    
            if (grabber.getSampleRate() > 0 && grabber.getAudioChannels() > 0) {
    
                AudioFormat audioFormat = new AudioFormat(grabber.getSampleRate(), 16, grabber.getAudioChannels(), true, true);
    
                DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
                soundLine = (SourceDataLine) AudioSystem.getLine(info);
                soundLine.open(audioFormat);
    
                soundLine.start();
            }
    
            ExecutorService executor = Executors.newSingleThreadExecutor();
    
    
            while (true) {
    
                try {
                    serverSocket.receive(packet);
                } catch (IOException e) {
                    e.printStackTrace();
                }
    
                Frame frame = grabber.grab();
    
                //if (frame == null) break;
    
    
                if (frame != null && frame.samples != null) {
    
                    ShortBuffer channelSamplesFloatBuffer = (ShortBuffer) frame.samples[0];
                    channelSamplesFloatBuffer.rewind();
    
                    ByteBuffer outBuffer = ByteBuffer.allocate(channelSamplesFloatBuffer.capacity() * 2);
                    float[] samples = new float[channelSamplesFloatBuffer.capacity()];
    
                    for (int i = 0; i < channelSamplesFloatBuffer.capacity(); i++) {
                        short val = channelSamplesFloatBuffer.get(i);
                        outBuffer.putShort(val);
                    }
    
                    if (soundLine == null) return;
                    try {
                        SourceDataLine finalSoundLine = soundLine;
                        executor.submit(() -> {
                            finalSoundLine.write(outBuffer.array(), 0, outBuffer.capacity());
                            outBuffer.clear();
                        }).get();
                    } catch (InterruptedException interruptedException) {
                        Thread.currentThread().interrupt();
                    }
                }
    
            }
    
            /*
            executor.shutdownNow();
            executor.awaitTermination(1, SECONDS);
    
            if (soundLine != null) {
                soundLine.stop();
            }
    
            grabber.stop();
            grabber.release();*/
    
        } catch (ExecutionException ex) {
            System.out.println("ExecutionException");
            ex.printStackTrace();
        } catch (org.bytedeco.javacv.FrameGrabber.Exception ex) {
            System.out.println("FrameGrabberException");
            ex.printStackTrace();
        } catch (LineUnavailableException ex) {
            System.out.println("LineUnavailableException");
            ex.printStackTrace();
        }/* catch (InterruptedException e) {
            System.out.println("InterruptedException");
            e.printStackTrace();
        }*/
    
    
    }
    
    public static void main(String[] args) throws SocketException, UnknownHostException {
        Runnable apRunnable = new FrameGrabber(7780);
        Thread ap = new Thread(apRunnable);
        ap.start();
    }
    

    }

    At this stage, I am trying to play the audio file through my speakers, but I am getting the following logs:

    Task :FrameGrabber.main()
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Input #0, mulaw, from 'java.io.DataInputStream@474e6cea':
      Duration: N/A, bitrate: 352 kb/s
        Stream #0:0: Audio: pcm_mulaw, 44100 Hz, 1 channels, s16, 352 kb/s
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    ...

    What am I doing wrong?

    Thanks in advance!
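    One likely cause, going by the log: the DataInputStream wraps a DatagramPacket buffer that has never been filled (no receive() has happened when the grabber starts), so FFmpeg's stream-protocol read callback immediately sees 0 bytes, which it reports as "Invalid return value 0 for stream protocol". A minimal, untested sketch of one way around this is to bridge incoming datagrams into the grabber through a pipe; the port, pipe size, and fixed 12-byte RTP header offset below are illustrative assumptions, not verified against this setup:

    import java.io.IOException;
    import java.io.PipedInputStream;
    import java.io.PipedOutputStream;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    import org.bytedeco.javacv.FFmpegFrameGrabber;

    public class RtpPipeSketch {

        // Fixed RTP header size; assumes no CSRC entries or header extensions.
        private static final int RTP_HEADER_SIZE = 12;

        public static void main(String[] args) throws Exception {
            DatagramSocket socket = new DatagramSocket(7780);
            PipedOutputStream sink = new PipedOutputStream();
            PipedInputStream source = new PipedInputStream(sink, 1 << 16);

            // Receiver thread: keeps feeding payload bytes so the grabber's
            // reads block on the pipe instead of hitting an instant EOF.
            Thread receiver = new Thread(() -> {
                byte[] buf = new byte[2048];
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                try {
                    while (!Thread.currentThread().isInterrupted()) {
                        socket.receive(packet);
                        int payload = packet.getLength() - RTP_HEADER_SIZE;
                        if (payload > 0) {
                            sink.write(packet.getData(),
                                       packet.getOffset() + RTP_HEADER_SIZE,
                                       payload);
                        }
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            });
            receiver.setDaemon(true);
            receiver.start();

            // Same grabber configuration as in the question, but reading from the pipe.
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(source);
            grabber.setFormat("mulaw");
            grabber.setSampleRate(44100);
            grabber.setAudioChannels(1);
            grabber.start();
            // ... grab() and play frames as in the original run() loop ...
        }
    }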

  • Advice on how to specify length of animated GPX video with ffmpeg/image2pipe

    21 May 2019, by Chris Olin

    I'm working on a personal project involving an action camera that records GPS data alongside video from an image sensor. I found an open source project on GitHub called 'trackanimation' that uses a colored marker to trace the GPX path on an OpenStreetMap overlay, but the project appears to be abandoned. I'm trying to sync the trackanimation video to the image sensor video, but when I use video editing software to slow the GPX video down to 1%, it still ends up shorter than the image sensor video. I've tried adjusting the ffmpeg command baked into make_video(), but still can't get the output video to be as long as I want it to be.

    I started digging into the library source to see how the video was being created and tried tweaking a couple of things, to no avail.

    import trackanimation
    from trackanimation.animation import AnimationTrack
    
    gpx_file = "Videos/20190516 unity ride #2.mp4.gpx"
    gpx_track = trackanimation.read_track(gpx_file)
    
    fig = AnimationTrack(df_points=gpx_track, dpi=300, bg_map=True, map_transparency=0.7)
    fig.make_video(output_file="Videos/1-11trackanimationtest.mp4", framerate=30, linewidth=1.0)
    
        def make_video(self, linewidth=0.5, output_file='video', framerate=5):
            cmdstring = ('ffmpeg',
                         '-y',
                         '-loglevel', 'quiet',
                         '-framerate', str(framerate),
                         '-f', 'image2pipe',
                         '-i', 'pipe:',
                         '-r', '25',
                         '-s', '1920x1080',
                         '-pix_fmt', 'yuv420p',
                         output_file + '.mp4'
                         )
    

    I expect to be able to linearly "slow" the GPX video by a dynamic factor based on its current length and the length I want it to be.
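    With image2pipe input, the output duration follows directly from the number of piped frames divided by the input -framerate; the output-side -r 25 only resamples by duplicating or dropping frames and does not change the duration. So rather than slowing the finished video afterwards, the input framerate can be computed from the frame count and the target length. A sketch of that arithmetic, assuming the frame count is known before rendering (the names and numbers are illustrative, not part of trackanimation's API):

    def framerate_for_duration(n_frames, target_seconds):
        """Input framerate at which n_frames of piped images span target_seconds."""
        return n_frames / float(target_seconds)

    # e.g. 3000 rendered map frames that must cover a 600 s sensor clip -> 5 fps
    framerate = framerate_for_duration(3000, 600)

    cmdstring = ('ffmpeg', '-y',
                 '-loglevel', 'quiet',
                 '-framerate', str(framerate),  # paces the piped images
                 '-f', 'image2pipe',
                 '-i', 'pipe:',
                 '-r', '25',                    # output rate: duplicates frames, duration unchanged
                 '-s', '1920x1080',
                 '-pix_fmt', 'yuv420p',
                 'output.mp4')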

  • I am building ffmpeg, but I always get an error after running make

    20 May 2019, by user10818319

    Configure parameters:

    ./configure --prefix=/usr/workspace/android --disable-static --enable-shared --disable-ffplay --disable-ffprobe --disable-htmlpages --disable-manpages --disable-podpages --disable-txtpages --arch=arm --target-os=linux --cross-prefix=/usr/workspace/ndk/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64/bin/arm-linux-androideabi- --sysroot=/usr/workspace/ndk/platforms/android-21/arch-arm --extra-cflags="-Os -fpic $ADDI_CFLAGS" --extra-ldflags="$ADDI_LDFLAGS"

    make execution error:

    In file included from ./libavformat/internal.h:24:0,
                     from libavdevice/alldevices.c:23:
    /usr/workspace/ndk/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64/lib/gcc/arm-linux-androideabi/4.9.x/include/stdint.h:9:26: fatal error: stdint.h: No such file or directory
     # include_next <stdint.h>
                               ^
    compilation terminated.
    make: *** [libavdevice/alldevices.o] Error 1
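    A hedged note on this failure mode: with the GCC 4.9 NDK toolchain and NDK releases that moved to unified headers (r15 and later), the compiler's include_next for stdint.h can fail because the per-platform sysroot no longer carries the C headers at compile time. One commonly cited workaround is to point the compiler at the unified headers explicitly; the exact NDK paths below are assumptions extrapolated from the paths in the question, and the remaining configure flags stay as before:

    ./configure ... \
        --sysroot=/usr/workspace/ndk/platforms/android-21/arch-arm \
        --extra-cflags="-Os -fpic -isysroot /usr/workspace/ndk/sysroot -I/usr/workspace/ndk/sysroot/usr/include/arm-linux-androideabi $ADDI_CFLAGS"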
    
  • Automatically generate a list of video files for ffmpeg trimming [duplicate]

    20 May 2019, by Don

    Is there an automatic way to input all of the mp4 file names into the ffmpeg command? I have about 60 small video files that I need to trim, and it is a pain to keep entering individual file names. The mylist.txt generation only works with concat.
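    A minimal shell sketch of one way to avoid typing each name: loop over the files and trim each one. The -ss/-to window and the output naming below are placeholders for illustration; -c copy trims without re-encoding, so cuts land on keyframes:

    for f in *.mp4; do
        ffmpeg -i "$f" -ss 00:00:05 -to 00:00:15 -c copy "trimmed_$f"
    done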

  • Get motion vector coordinates from next and previous frames during transcoding in FFmpeg

    20 May 2019, by Enock

    Can someone tell me how to get the motion vector values for the next and previous frames in FFmpeg during transcoding? I can use MpegEncContext* s->mv_table to get the motion vector values for the current picture, but I also need the MV values for the next and previous frames. How can I do this?
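    One documented route, which may or may not fit a pipeline built on MpegEncContext internals: FFmpeg can export per-frame motion vectors as frame side data when the decoder is opened with the flags2 option "+export_mvs" (the mechanism behind doc/examples/extract_mvs.c). Each AVMotionVector records its reference direction in its source field, negative for a past frame and positive for a future one, which covers vectors pointing into both previous and next frames. A minimal reading sketch:

    #include <stdio.h>
    #include <libavutil/frame.h>
    #include <libavutil/motion_vector.h>

    /* Assumes the decoder was opened with the option flags2=+export_mvs. */
    static void print_motion_vectors(const AVFrame *frame)
    {
        AVFrameSideData *sd =
            av_frame_get_side_data(frame, AV_FRAME_DATA_MOTION_VECTORS);
        if (!sd)
            return; /* no motion vectors exported for this frame */

        const AVMotionVector *mvs = (const AVMotionVector *) sd->data;
        size_t n = sd->size / sizeof(*mvs);
        for (size_t i = 0; i < n; i++) {
            /* source < 0: vector references a past frame; > 0: a future frame */
            printf("mv %zu: source=%d (%d,%d) -> (%d,%d)\n",
                   i, mvs[i].source,
                   mvs[i].src_x, mvs[i].src_y,
                   mvs[i].dst_x, mvs[i].dst_y);
        }
    }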