Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Enabling mp4/mpeg4/avc support for Qt5 WebEngine on Linux

    12 April 2016, by Thomas233

    I installed Qt 5.4.1 x64 on Lubuntu and created an app which uses the new QtWebEngine.

    I'm trying to display an HTML5 page with that component, using the <video> tag. Everything works fine except when I try to play back an mp4 video: the video area remains black. It works if I use other video types like webm/ogg as the source.

    I know this is due to license restrictions, which is why mp4 is deactivated by default for Qt on Ubuntu/Linux.

    What is needed in Qt to activate mp4 playback, and what do I have to pay attention to regarding license terms (I read that statically linking the library is allowed)?

    I've already tried copying the x64 build of libffmpegsumo.so that ships with Chrome (2.2 MB) into the Qt directory under /plugins/webengine/, replacing the one that was already there (1.1 MB), but it had no effect. In Chrome, playback works fine, by the way.
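
    For reference, the route usually suggested is rebuilding the QtWebEngine module from source with proprietary codecs enabled, rather than swapping the library file. A minimal sketch, assuming a Qt 5 source checkout matching the installed version; the WEBENGINE_CONFIG flag is documented for later Qt 5 releases and may not apply verbatim to 5.4.1:

    # Hedged sketch: rebuild only the qtwebengine module with
    # proprietary codecs (h264/aac) switched on. The checkout path
    # is a placeholder.
    cd path/to/qt5/qtwebengine
    qmake WEBENGINE_CONFIG+=use_proprietary_codecs
    make
    make install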

    If you need more details like paths etc. please tell me.

    Thanks!

  • FFMPEG with Gradle Experimental Android plugin

    12 April 2016, by Spartan

    I extracted ffmpeg into the NDK's sources folder and compiled it there. For that I followed this: http://www.roman10.net/2013/08/18/how-to-build-ffmpeg-with-ndk-r9/

    That worked: I successfully generated the android folder with the arm/lib and arm/include files.

    After that I created one Android.mk file in $NDK/sources/ffmpeg/android/arm and one Android.mk in my Android project (the src/main/jni folder).
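
    For reference, a hedged sketch of what that module-side Android.mk ($NDK/sources/ffmpeg/android/arm/Android.mk) typically looks like in the roman10 layout; the versioned .so filename is an assumption, and one such block is needed per library:

    LOCAL_PATH := $(call my-dir)

    # One prebuilt block per ffmpeg library; repeat for libavformat,
    # libswscale and libavutil. LOCAL_EXPORT_C_INCLUDES publishes the
    # ffmpeg headers to every module that imports this one.
    include $(CLEAR_VARS)
    LOCAL_MODULE := libavcodec
    LOCAL_SRC_FILES := lib/libavcodec-57.so
    LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
    include $(PREBUILT_SHARED_LIBRARY)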

    My src/main/jni/Android.mk looks like this:

    LOCAL_PATH := $(call my-dir)
    
    include $(CLEAR_VARS)
    
    # JNI wrapper module, built against the prebuilt ffmpeg libraries
    LOCAL_MODULE    := tutorial01
    LOCAL_SRC_FILES := tutorial01.c
    LOCAL_LDLIBS := -llog -ljnigraphics -lz
    LOCAL_SHARED_LIBRARIES := libavformat libavcodec libswscale libavutil
    
    include $(BUILD_SHARED_LIBRARY)
    # resolves the Android.mk under $NDK/sources/ffmpeg-3.0.1/android/arm
    $(call import-module,ffmpeg-3.0.1/android/arm)
    

    Now I'm stuck here, trying to put these details into the build.gradle files as shown in this documentation: http://tools.android.com/tech-docs/new-build-system/gradle-experimental

    for example:

    model {
        repositories {
            libs(PrebuiltLibraries) {
                prebuilt {
                    headers.srcDir "path/to/headers"
                    binaries.withType(SharedLibraryBinary) {
                        sharedLibraryFile = file("lib/${targetPlatform.getName()}/prebuilt.so")
                    }
                }
            }
        }

        android.sources {
            main {
                jniLibs {
                    dependencies {
                        library "prebuilt"
                    }
                }
            }
        }
    }
    

    But I'm not able to translate my exact Android.mk (src/main/jni) into this form. Apart from that, I have one more Android.mk in my android/arm folder, so how can I call that makefile, or should I put those details into my build.gradle as above too?

    I tried the following, and the ndk-build command successfully generated jniLibs with the .so files, but I am getting libavcodec/avcodec.h: No such file or directory while building the project.

    android.sources {
        main {
            jni {
                source {
                    srcDirs = ['src/main/myJni']
                }
            }
            jniLibs {
                source {
                    srcDirs = ['src/main/libs']
                }
            }
        }
    }
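
    A hedged sketch of how the prebuilt ffmpeg libraries might be declared in the experimental DSL so that the headers are also found; the missing-header error above suggests the headers.srcDir part is what's absent. The entry name, paths, and .so location are assumptions based on the NDK layout described earlier, and one PrebuiltLibraries entry is needed per library:

    model {
        repositories {
            libs(PrebuiltLibraries) {
                // Repeat one entry each for libavformat, libavcodec,
                // libswscale and libavutil.
                libavcodec {
                    // Placeholder path to the ffmpeg headers built above.
                    headers.srcDir "path/to/ndk/sources/ffmpeg-3.0.1/android/arm/include"
                    binaries.withType(SharedLibraryBinary) {
                        sharedLibraryFile = file("src/main/libs/${targetPlatform.getName()}/libavcodec.so")
                    }
                }
            }
        }

        android.sources {
            main {
                jniLibs {
                    dependencies {
                        library "libavcodec"
                    }
                }
            }
        }
    }

    With the headers exported this way, the libavcodec/avcodec.h include should resolve at compile time.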
    
  • rtsp stream capturing

    12 April 2016, by ДМИТРИЙ МАЛИКОВ

    I'm looking for a universal way to dump an rtsp stream. I want to verify that a given rtsp stream is working well and that the server is sending watchable video.

    openRTSP

    At first, Google recommended the openRTSP tool to me.

     openRTSP -4 ${stream_link} > ${output_file}
    

    But the output video file dumped by that tool is not really correct. The video decoder (ffdec) returns many errors like "Failed to decode video packet" and "[h264] no frame!", which doesn't work for me.

    ffmpeg

    Then I tried to dump the rtsp stream with the ffmpeg tool.

    ffmpeg -loglevel debug -i "${stream_link}" -s 640x480 -vcodec copy -acodec copy -y ${output_file}
    

    But the streaming process was often interrupted by this error:

    Application provided invalid, non monotonically increasing dts to muxer in stream 0: 730672 >= 730672
    av_interleaved_write_frame(): Invalid argument
    

    I tried using -fflags igndts, but ffmpeg doesn't ignore these errors. That doesn't make much sense, because the error actually means the audio and video streams are sent asynchronously. The worst thing is that the file produced by the interrupted dump is not correct either. Ffdec returns these errors:

    ERROR [mov,mp4,m4a,3gp,3g2,mj2] moov atom not found
    ERROR [ffdec] av_open_input_file: Operation not permitted
    

    After a nice cup of googling I found that this is a really old bug in ffmpeg's muxer.
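
    For comparison, a hedged variant of the ffmpeg invocation above: forcing TCP transport and relaxing the timestamp checks sometimes works around the non-monotonic-dts failure. These are standard ffmpeg options, but whether they help depends on the stream; note also that -s 640x480 has no effect next to -vcodec copy, since stream copy bypasses scaling.

    # Hedged sketch: TCP transport avoids packet loss over UDP, and
    # +genpts/+igndts relax timestamp handling on the input side.
    ffmpeg -rtsp_transport tcp -fflags +genpts+igndts \
           -i "${stream_link}" -c:v copy -c:a copy -y ${output_file}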

    mplayer

    Then I tried to use mplayer with the LIVE555 lib.

    mplayer -noframedrop -dumpfile ${output_file} -dumpstream ${stream_link}
    

    But I got some errors too.

    Stream not seekable!
    Core dumped ;)
    

    Question

    I think I'm doing something wrong. It sounds really ridiculous that there would be no way to save an rtsp stream to a correct, playable video file.

    Maybe there are other tools which can help with this task? I will be grateful for advice about any kind of lib or language, but the process should be automatic and have a CLI.

    Refinements

    About 50% of the experiments were done on localhost with a vlc streamer emulating an rtsp broadcaster. Here is a manual which I tried to follow.

    I have a really fresh, latest ffmpeg with x264 support, which I installed by following that useful thread.

  • How to implement live camera plus WebGL overlay live streams

    12 April 2016, by zproxy

    The goal is to test a YouTube live stream configuration where:

    1. A Samsung Galaxy S6/S7 (Android, over wifi) streams its cameras (one or all)
    2. Chrome renders a WebGL 3D scene (static or realtime; WebRTC? Flash?) to be overlaid (localhost)
    3. ffmpeg mixes the available streams and upstreams to a YouTube live event

    What's needed to implement this configuration? Yes, this may require actual custom implementations of the Android app and the Chrome app in the end.
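
    For what it's worth, a hedged sketch of step 3 only, assuming the camera and the WebGL render each arrive as a local RTMP feed; the input URLs and the stream key are placeholders:

    # Hedged sketch: overlay the WebGL feed on the camera feed and
    # push the result to YouTube's RTMP ingest.
    ffmpeg -i rtmp://localhost/live/camera \
           -i rtmp://localhost/live/webgl \
           -filter_complex "[0:v][1:v]overlay=0:0[out]" \
           -map "[out]" -map 0:a? \
           -c:v libx264 -preset veryfast -b:v 2500k -g 60 \
           -c:a aac -b:a 128k \
           -f flv rtmp://a.rtmp.youtube.com/live2/STREAM_KEY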

  • Nginx exec_static reconnect

    12 April 2016, by Nacsster

    Recently I installed Nginx with the rtmp module and ffmpeg.

    I have a few commands like this in my nginx.conf:

    exec_static ffmpeg -i http://xxxx.m3u8 -threads 1 -c:v libx264 -profile:v baseline -b:v 370K -s 640x360 -f flv -c:a aac -ac 1 -strict -2 -b:a 56k rtmp://xx.xxx.xx.xx/mobile/nameofstream;

    It transcodes the input to a low bitrate and all works fine =)

    Here is the question: every time I restart my nginx server it executes the commands correctly, but (I think) if the stream (the m3u8 input) is down, ffmpeg doesn't wait for the input to come up again; it just gives up. I tried the -re option on ffmpeg but it doesn't work!

    Any help? Maybe I need some bash routine?
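
    A hedged sketch of that bash routine: exec_static can point at a wrapper script that simply restarts ffmpeg whenever it exits, so transcoding resumes once the m3u8 input comes back. The script path and sleep interval are arbitrary choices:

    #!/bin/bash
    # /usr/local/bin/restream.sh - retry loop around the transcode.
    # ffmpeg exits when the HLS input disappears; loop and try again.
    while true; do
        ffmpeg -i http://xxxx.m3u8 -threads 1 \
               -c:v libx264 -profile:v baseline -b:v 370K -s 640x360 \
               -c:a aac -ac 1 -strict -2 -b:a 56k \
               -f flv rtmp://xx.xxx.xx.xx/mobile/nameofstream
        sleep 5   # back off before retrying while the input is down
    done

    And in nginx.conf, the exec_static line would then call the wrapper instead of ffmpeg directly:

    exec_static /usr/local/bin/restream.sh;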