Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
Stream low-latency RTSP video to Android with ffmpeg
21 October 2014, by grzebyk

I am trying to stream live webcam video from an Ubuntu 12.04 PC to an Android device running KitKat. So far I have written an ffserver config file to receive an FFM feed and broadcast it over RTSP. I am able to watch the stream on another computer on the same LAN with ffplay.
How can I watch the stream on the Android device? The following code works well when the webcam image is streamed with VLC, but it does not work with ffmpeg:
public class MainActivity extends Activity
        implements MediaPlayer.OnPreparedListener, SurfaceHolder.Callback {

    final static String RTSP_URL = "rtsp://192.168.1.54:4424/test.sdp";

    private MediaPlayer _mediaPlayer;
    private SurfaceHolder _surfaceHolder;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Set up a full-screen black window.
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        Window window = getWindow();
        window.setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        window.setBackgroundDrawableResource(android.R.color.black);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        setContentView(R.layout.activity_main);
        // Configure the view that renders live video.
        // R.id.videoView is a simple SurfaceView element in the layout XML file.
        SurfaceView videoView = (SurfaceView) findViewById(R.id.videoView);
        _surfaceHolder = videoView.getHolder();
        _surfaceHolder.addCallback(this);
        _surfaceHolder.setFixedSize(320, 240);
    }

    @Override
    public void surfaceCreated(SurfaceHolder surfaceHolder) {
        _mediaPlayer = new MediaPlayer();
        _mediaPlayer.setDisplay(_surfaceHolder);
        Context context = getApplicationContext();
        Uri source = Uri.parse(RTSP_URL);
        try {
            // Specify the IP camera's URL and auth headers.
            _mediaPlayer.setDataSource(context, source);
            // Begin the process of setting up a video stream.
            _mediaPlayer.setOnPreparedListener(this);
            _mediaPlayer.prepareAsync();
        } catch (Exception e) {
            e.printStackTrace(); // don't swallow setup errors silently
        }
    }

    @Override
    public void onPrepared(MediaPlayer mediaPlayer) {
        _mediaPlayer.start();
    }

    // Required by SurfaceHolder.Callback; nothing to do here.
    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {}
}
My ffserver.config file:
HTTPPort 8090
RTSPBindAddress 0.0.0.0
RTSPPort 4424
MaxBandwidth 10000
CustomLog -

<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 20M
ACL allow 127.0.0.1
</Feed>

<Stream test.sdp>
Feed feed1.ffm
Format rtp
VideoCodec libx264
VideoSize 640x480
AVOptionVideo flags +global_header
AVOptionVideo me_range 16
AVOptionVideo qdiff 4
AVOptionVideo qmin 10
AVOptionVideo qmax 51
NoAudio
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>

(The <Feed> and <Stream> section tags were stripped by the page extraction; they are reconstructed here from the directives and from the test.sdp stream name used in the Android code above.)

I am starting the stream with this command:
ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -b:v 600k http://localhost:8090/feed1.ffm
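One thing worth checking on the encoding side: Android's stock MediaPlayer is, per its supported-formats documentation, only guaranteed to decode H.264 Baseline profile, while libx264 defaults to a higher profile. A hedged sketch of the same capture command forcing Baseline (this is a diagnostic variant, not a confirmed fix):

```shell
# Same v4l2 capture as above, but force H.264 Baseline profile,
# the only H.264 profile Android's MediaPlayer guarantees support for.
ffmpeg -f v4l2 -i /dev/video0 \
       -c:v libx264 -profile:v baseline -b:v 600k \
       http://localhost:8090/feed1.ffm
```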
-
Use FFmpeg to convert video to GIF in Android Studio
21 October 2014, by Donnie Ibiyemi

I am currently making a simple Android app that converts a video from the SD card into a GIF.
I have learnt that FFmpeg is the most efficient tool to handle the conversion, but I have no idea how to add FFmpeg to my Android Studio project.
Please kindly point me in the right direction.
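For context, once an FFmpeg binary (or a wrapper library that bundles one) is available to the app, the conversion itself is typically a single command. A hedged sketch of the kind of command such a wrapper would run; the file paths and filter parameters below are illustrative, not from the question:

```shell
# Convert a video to an animated GIF: drop to 10 fps and scale to
# 320 px wide (height auto-computed) to keep the GIF size manageable.
ffmpeg -i /sdcard/input.mp4 -vf "fps=10,scale=320:-1" /sdcard/output.gif
```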
-
How to convert a PCM file to an MLP file using FFmpeg on Windows?
21 October 2014, by user2877921

I have converted an .mp3 file to a .pcm file. Now I have added an encoder that converts it into an MLP (Meridian Lossless Packing) file. I need the Command Prompt command to convert the PCM file to an MLP file.
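One general point: a raw .pcm file carries no header, so ffmpeg must be told the sample format, sample rate, and channel count explicitly before it can encode the data. A hedged sketch; the s16le/44100/stereo parameters are assumptions that must match how the PCM was produced, and encoding to MLP requires an FFmpeg build whose MLP encoder is enabled (it is marked experimental in the builds that have it):

```shell
# Describe the headerless raw PCM input explicitly (signed 16-bit
# little-endian, 44.1 kHz, stereo are assumptions), then encode to MLP.
ffmpeg -f s16le -ar 44100 -ac 2 -i input.pcm -c:a mlp -strict experimental output.mlp
```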
-
Avconv / FFmpeg - could not find codec parameters
21 October 2014, by scottpaterson

I am trying to convert a simple SWF (one that accepts variables via FlashVars) to an MP4 on Ubuntu Linux.
I have been using Gnash to convert the SWF into a Raw video file:
dump-gnash -1 -D /path/output.raw@30 -P "Flashvars=content=textgoeshere" /path/input.swf
Then I am attempting to use either avconv or ffmpeg to convert the raw video file to an MP4:
ffmpeg -i /path/input.raw -c:v libx264 -f rawvideo -c copy -map 0 /path/output.mp4
or
avconv -i /path/input.raw -b 128k /path/output.mp4
Both give me the error:
Invalid data found when processing input
I am starting to wonder if there is something wrong with my original SWF file, so here are the files I am using for input. As you can see, the txt file holds the variables that the SWF reads.
http://scottpaterson.ca/files/example1.swf
http://scottpaterson.ca/files/example1.txt
How can I convert this into an MP4 video? Any help would be great, thanks in advance.
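That error is typical when ffmpeg is handed a headerless raw video file: with no container metadata it cannot infer the pixel format, frame size, or frame rate, so those have to be supplied as input options before -i. A hedged sketch; the rgb24/640x480/30 fps values below are guesses that must match whatever dump-gnash actually wrote:

```shell
# Tell ffmpeg how to interpret the headerless raw stream (all three
# input flags must match the dump), then encode to H.264 in MP4.
# yuv420p keeps the output widely playable.
ffmpeg -f rawvideo -pix_fmt rgb24 -s 640x480 -r 30 -i /path/input.raw \
       -c:v libx264 -pix_fmt yuv420p /path/output.mp4
```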
-
Android Chromecast MP4 by ffmpeg - shorter files play, some longer ones buffer but don't play
20 October 2014, by Robert Rowntree

Using the Android SDK for Chromecast, the Cast Companion Library (CCL), and a 'styled receiver' Chromecast app.
Video files created by a heroku/ffmpeg binary will play on every other client I tried (VLC on Ubuntu, ...), and will even play in a Chrome tab being cast to the Chromecast, but they will NOT play natively in Chromecast apps. They appear to load normally in the app's UI, and CCL logs show them buffering OK, but they never move from the buffering state to a play state. Debugging the receiver shows nothing unusual (Network tab or Console). They just buffer and don't play.
If a slightly shorter version of the same two ffmpeg input files is prepared and hosted, it plays fine at lengths under about 14 seconds, but a 16-second version will not play.
I used a number of Chromecast apps, including AllCast, to test this, and always got the same result.
Two Dropbox MP4 links created with the ffmpeg CLI are below. A 14-second version of the output MP4 plays fine in several Chromecast/Android apps, and a 16-second version fails.
File length 16 s - won't play in Chromecast apps.
ffprobe output on the files:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'tst-nonplay-shorter_14s.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf54.29.105
  Duration: 00:00:14.07, start: 0.046440, bitrate: 311 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p, 1080x614 [SAR 1535:1539 DAR 100:57], 199 kb/s, 1 fps, 1 tbr, 16384 tbn, 2 tbc
    Metadata:
      handler_name    : VideoHandler
    Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 22050 Hz, mono, s16, 110 kb/s
    Metadata:
      handler_name    : SoundHandler
And the file two seconds longer that won't play:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'tst-nonplay-shorter_16s.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf54.29.105
  Duration: 00:00:16.06, start: 0.046440, bitrate: 286 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p, 1080x614 [SAR 1535:1539 DAR 100:57], 174 kb/s, 1 fps, 1 tbr, 16384 tbn, 2 tbc
    Metadata:
      handler_name    : VideoHandler
    Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 22050 Hz, mono, s16, 110 kb/s
    Metadata:
      handler_name    : SoundHandler
I used 'qtfaststart' and checked for MOOV atom issues, and I don't think that's relevant.
I think I can limit recordings in my app to 14 seconds and avoid the problem that way, but I would like to understand what is wrong with the Chromecast GETs on MP4s generated by ffmpeg.
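One detail that stands out in the ffprobe output above is the 1 fps / 1 tbr video stream paired with the full-range yuvj420p pixel format; hardware decoders can behave unpredictably with such unusual timing and color-range metadata. A hedged experiment, not a confirmed fix: re-encode at a conventional frame rate and pixel format and see whether the 16-second file then plays:

```shell
# Re-encode at a standard 30 fps with the widely supported yuv420p
# pixel format; both the 1 fps timing and the full-range yuvj420p
# source are plausible culprits for a hardware decoder stalling.
ffmpeg -i tst-nonplay-shorter_16s.mp4 -r 30 -c:v libx264 -pix_fmt yuv420p \
       -c:a copy tst-retry-30fps.mp4
```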