Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Streaming engines best practice in C

    20 November 2014, by ash

    LANG: C / ENV: Linux

    I am developing a streaming engine. At this point I can start, stop, and pause the stream, but seeking is the operation that is giving me a lot of headaches. I already asked a question here before and fixed some issues in the code based on the answers.

    I am using the lseek() function, passing the open streaming file descriptor as the first argument, and I am transmitting over UDP, something like the following code:

    transport_fd = open(tsfile, O_RDONLY);
    int offset = 1024;
    off_t offsetIndicator;
    if ((offsetIndicator = lseek(transport_fd, offset, SEEK_CUR)) < 0)
        printf("Error seeking\n");
    

    Whenever I try to seek while streaming, the streaming stops and the picture hangs.

    Is there anything I should pay attention to? E.g. should I call sleep() or nanosleep() after seeking into the file in order for the change to take effect?

    I couldn't find examples, papers, or related articles on best practices for such engines.

    EDIT:

    After testing, it seems the file continued to stream, but receiving devices on the network no longer picked up the stream. Calculating the time it took to finish, minus the seek time, the stream appears to have finished normally.

    CODE SNIPPET:

    while (!completed) 
    {
        while (/* Comparing conditions */ && !completed)
        { 
            if (seekLck == 1) // seekLck: flag set by the parent process to signal a seek request; initialized to 0
            {
                int offset = 1024;
                off_t offsetIndicator;
                if ((offsetIndicator=lseek(transport_fd, offset, SEEK_CUR))<0) 
                    printf("Error seeking\n");
                nanosleep(&nano_sleep_packet, 0); // tried sleeping after the seek to let the change take effect; it didn't help
                seekLck = 0;
            }   
            len = read(transport_fd, send_buf, packet_size);
            if(len < 0) {
                fprintf(stderr, "File read error \n");
                completed = 1;
            } 
            else if (len == 0)
            {
                fprintf(stderr, "Sent done\n");
                completed = 1;
            }
            else
            {
                sent = sendto(sockfdstr, send_buf, len, 0, (struct sockaddr *)&addr, sizeof(struct sockaddr_in));
                if(sent <= 0)
                {
                    perror("send(): error ");
                    completed = 1;
                }
            }
        }
        nanosleep(&nano_sleep_packet, 0);
    }
    close(transport_fd);
    close(sockfdstr);
    free(send_buf);
    printf("cleaning up\n");
    return 0;
    }
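    One thing worth checking (a guess, based on the `tsfile` name, that the file being streamed is MPEG-TS): lseek() with an arbitrary byte offset will usually land in the middle of a 188-byte transport packet, so receivers lose the 0x47 sync byte and stop rendering. A minimal sketch of a packet-aligned relative seek, where `seek_ts_aligned` is a hypothetical helper, not part of the code above:

    ```c
    #include <assert.h>
    #include <fcntl.h>
    #include <unistd.h>

    #define TS_PACKET_SIZE 188  /* an MPEG-TS packet is always 188 bytes */

    /* Hypothetical helper: move the file position by `delta` bytes relative
     * to the current position, then snap it down to the nearest packet
     * boundary so the next read() starts on a 0x47 sync byte.
     * Returns the new absolute offset, or -1 on error. */
    static off_t seek_ts_aligned(int fd, off_t delta)
    {
        off_t pos = lseek(fd, delta, SEEK_CUR);
        if (pos < 0)
            return -1;
        return lseek(fd, pos - (pos % TS_PACKET_SIZE), SEEK_SET);
    }
    ```

    Even with aligned packets, downstream decoders may still need a moment to resynchronize after a jump, since seeking introduces PCR and continuity-counter discontinuities.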
    
  • Dump WebRTC stream to a file

    20 November 2014, by Mondain

    I'd like to capture the audio and video from a WebRTC stream to a file, or to a pair of files if audio and video require their own individual files. The audio and video are not muxed together and are known to be available on a set of server UDP ports:

    Port   Encoding
    5000 - VP8 video
    5001 - RTCP (for video)
    5002 - Opus audio @48kHz 2 channels
    5003 - RTCP (for audio)
    

    The SDP file / data is not available and DTLS may be used.

    I would prefer to use avconv or ffmpeg to capture the stream, unless a better tool is suggested.

    Edit: I've found that this, as asked, will most likely not work. Until I hear otherwise, none of these tools support the initial DTLS handshake followed by data transmission over SRTP.
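    For reference, if these streams were plain RTP (i.e. no DTLS-SRTP, which the edit above rules out for this case), ffmpeg can usually consume them from a hand-written SDP describing the ports and codecs. A sketch, assuming the streams arrive on localhost; payload type numbers 96/97 are arbitrary dynamic values:

    ```
    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=WebRTC dump
    c=IN IP4 127.0.0.1
    t=0 0
    m=video 5000 RTP/AVP 96
    a=rtpmap:96 VP8/90000
    m=audio 5002 RTP/AVP 97
    a=rtpmap:97 opus/48000/2
    ```

    Saved as `stream.sdp`, something like `ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp -c copy out.mkv` would then record both tracks (newer ffmpeg builds require the `-protocol_whitelist` option for SDP input). Encrypted SRTP payloads would of course still come out as garbage.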

  • ObjC++ calling static library written in C

    20 November 2014, by onemach

    I have an iOS project utilizing ffmpeg (which is a pure C library) and OpenCV.

    Since I use the C++ interface of OpenCV, I write Objective-C++, i.e. a .mm file. But that file does not get along with ffmpeg: Xcode complains about undefined symbols at the link stage.

    I also use ffmpeg in another .m file, and that works fine, so I am sure the problem is the combination of a .mm file and a static library written in C.
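    For what it's worth, the classic cause of this symptom is C++ name mangling: a .mm file is compiled as Objective-C++, so the C library's declarations must be given C linkage explicitly, or the linker will look for mangled symbol names that do not exist in the C static library. A sketch of the usual include pattern in the .mm file (the header names are ffmpeg's standard ones; adjust to whichever libraries are actually used):

    ```cpp
    // Wrap the C headers so their symbols keep unmangled C names
    // when this translation unit is compiled as Objective-C++.
    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    }
    ```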

  • Facing issue with mp4Parser

    20 November 2014, by Rohit

    I am trying to trim a video with the help of mp4parser. The output video is trimmed, but just before the video finishes it gives the error "Sorry, can't play this video". The video I am making is H.264.

        public static void main(String args) throws IOException {
    
        Movie movie = new MovieCreator()
                .build(new RandomAccessFileIsoBufferWrapperImpl(
                        new File(
                                "/sdcard/Videos11/"+args+".mp4")));
    
        List<Track> tracks = movie.getTracks();
        Log.e("Tracks", "Following are the tracks: " + tracks);
        movie.setTracks(new LinkedList<Track>());
        // remove all tracks we will create new tracks from the old
    
        double startTime = 0.000;
        double endTime = 6.000;
    
        boolean timeCorrected = false;
    
        // Here we try to find a track that has sync samples. Since we can only
        // start decoding
        // at such a sample we SHOULD make sure that the start of the new
        // fragment is exactly
        // such a frame
        for (Track track : tracks) {
            if (track.getSyncSamples() != null
                    && track.getSyncSamples().length > 0) {
                if (timeCorrected) {
                    // This exception here could be a false positive in case we
                    // have multiple tracks
                    // with sync samples at exactly the same positions. E.g. a
                    // single movie containing
                    // multiple qualities of the same video (Microsoft Smooth
                    // Streaming file)
    
                    throw new RuntimeException(
                            "The startTime has already been corrected by another track with SyncSample. Not Supported.");
                }
                startTime = correctTimeToNextSyncSample(track, startTime);
                endTime = correctTimeToNextSyncSample(track, endTime);
                timeCorrected = true;
            }
        }
    
        for (Track track : tracks) {
            long currentSample = 0;
            double currentTime = 0;
            long startSample = -1;
            long endSample = -1;
    
            for (int i = 0; i < track.getDecodingTimeEntries().size(); i++) {
                TimeToSampleBox.Entry entry = track.getDecodingTimeEntries().get(i);
                for (int j = 0; j < entry.getCount(); j++) {
                    // entry.getDelta() is the amount of time the current sample
                    // covers.
    
                    if (currentTime <= startTime) {
                        // current sample is still before the new starttime
                        startSample = currentSample;
                    }
                    if (currentTime <= endTime) {
                        // current sample is after the new start time and still
                        // before the new endtime
                        endSample = currentSample;
                    } else {
                        // current sample is after the end of the cropped video
                        break;
                    }
                    currentTime += (double) entry.getDelta()
                            / (double) track.getTrackMetaData().getTimescale();
                    currentSample++;
                }
            }
            movie.addTrack(new CroppedTrack(track, startSample, endSample));
        }
    
        IsoFile out = new DefaultMp4Builder().build(movie);
    
        String filePath = "sdcard/test"+i+".mp4";
        i++;
        File f = new File(filePath);
        FileOutputStream fos = new FileOutputStream(f);
        BufferedOutputStream bos = new BufferedOutputStream(fos, 65535);
        out.getBox(new IsoOutputStream(bos)); // write through the buffered stream, not fos directly
        bos.flush();
        bos.close(); // also closes the underlying fos
    
    }
    
    private static double correctTimeToNextSyncSample(Track track,
            double cutHere) {
        double[] timeOfSyncSamples = new double[track.getSyncSamples().length];
        long currentSample = 0;
        double currentTime = 0;
        for (int i = 0; i < track.getDecodingTimeEntries().size(); i++) {
            TimeToSampleBox.Entry entry = track.getDecodingTimeEntries().get(i);
            for (int j = 0; j < entry.getCount(); j++) {
                if (Arrays.binarySearch(track.getSyncSamples(),
                        currentSample + 1) >= 0) {
                    // samples always start with 1 but we start with zero
                    // therefore +1
                    timeOfSyncSamples[Arrays.binarySearch(
                            track.getSyncSamples(), currentSample + 1)] = currentTime;
                }
                currentTime += (double) entry.getDelta()
                        / (double) track.getTrackMetaData().getTimescale();
                currentSample++;
            }
        }
        for (double timeOfSyncSample : timeOfSyncSamples) {
            if (timeOfSyncSample > cutHere) {
                return timeOfSyncSample;
            }
        }
        return timeOfSyncSamples[timeOfSyncSamples.length - 1];
    }
    

    I am making a 30-second video and want to keep the first 6 seconds. Trimming works, but at the end (between seconds 5 and 6) it shows "Can't play this video". Any help will be appreciated.

  • How to add filters to ffmpeg, mp

    20 November 2014, by Coderzelf

    I tried to use the ffmpeg 'mp' filter, but ffmpeg reports that there is no such filter.

    What I tried:

    ffmpeg -i input.avi -vf mp=ep2=xxx output.avi
    

    And I built ffmpeg from this link:

    http://ffmpeg.org/download.html#LinuxBuilds