Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
Capture raw video byte stream for real time transcoding
9 September 2012, by user1145905

I would like to achieve the following:
Set up a proxy server to handle video requests by clients (for now, say all video requests from any Android video client) from a remote video server like YouTube, Vimeo, etc. I don't have access to the video files being requested, hence the need for a proxy server. I have settled for Squid. This proxy should process the video signal/stream being passed from the remote server before relaying it back to the requesting client.
To achieve the above, I would either
1. Need to figure out the precise location (URL) of the video resource being requested, download it really fast, and modify it as I want before HTTP streaming it back to the client as the transcoding continues (simultaneously, with some latency)
2. Access the raw byte stream, pipe it into a transcoder (I'm thinking ffmpeg) and proceed with the streaming to client (also with some expected latency).
Option #2 seems tricky to do but lends more flexibility to the kind of transcoding I would like to perform. I would have to actually handle raw data/packets, but I don't know if ffmpeg takes such input.
In short, I'm looking for a solution to implement real-time transcoding of videos that I do not have direct access to from my proxy. Any suggestions on the tools or approaches I could use? I have also read about Gstreamer (but could not tell if it's applicable to my situation), and MPlayer/MEncoder.
And finally, a rather specific question: Are there any tools out there that, given a YouTube video URL, can download the byte stream for further processing? That is, something similar to the Chrome YouTube downloader but one that can be integrated with a server-side script?
Thanks for any pointers/suggestions!
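To make option #2 concrete, here is a rough sketch of the wiring I have in mind, assuming ffmpeg's pipe: protocol (it reads pipe:0 as stdin and writes pipe:1 to stdout); the codec, bitrate, and FLV-container choices below are placeholder assumptions, not requirements:

```python
import subprocess

def transcode_stream_cmd(bitrate="500k"):
    """Build an ffmpeg argv that reads a muxed byte stream on stdin
    (pipe:0) and writes a transcoded FLV stream on stdout (pipe:1).
    Codec/bitrate/container choices are illustrative assumptions."""
    return [
        "ffmpeg",
        "-i", "pipe:0",      # incoming raw byte stream arrives on stdin
        "-c:v", "libx264",   # re-encode the video track
        "-b:v", bitrate,
        "-f", "flv",         # FLV muxes well over a non-seekable pipe
        "pipe:1",            # transcoded stream leaves on stdout
    ]

# Wiring sketch (not run here): relay the downloaded bytes into
# proc.stdin while streaming proc.stdout back to the requesting client.
# proc = subprocess.Popen(transcode_stream_cmd(),
#                         stdin=subprocess.PIPE, stdout=subprocess.PIPE)
```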
-
FFMPEG Channel Mapping
8 September 2012, by user1654755

I have a 24-bit WAV file that consists of 8 channels. What I need to do is convert it into four 24-bit two-channel files, where the output WAVs are made up of source channels (1,2), (3,4), (5,6), (7,8).
Anyone have any thoughts on the best way to do that using FFmpeg?
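My current thought is a single FFmpeg invocation with one pan audio filter per output, sketched below in Python for clarity. The output names are placeholders; pan's channel indices are 0-based (so pairs (1,2)…(7,8) become (c0,c1)…(c6,c7)), and pcm_s24le keeps the 24-bit depth:

```python
def split_pairs_cmd(src="input8ch.wav"):
    """Build one ffmpeg argv that writes four stereo 24-bit WAVs,
    pairing source channels (0,1), (2,3), (4,5), (6,7).
    File names here are placeholder assumptions."""
    cmd = ["ffmpeg", "-i", src]
    for n in range(4):
        left, right = 2 * n, 2 * n + 1
        cmd += ["-af", f"pan=stereo|c0=c{left}|c1=c{right}",
                "-c:a", "pcm_s24le",      # preserve the 24-bit samples
                f"pair{n + 1}.wav"]       # placeholder output name
    return cmd
```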
-
Embedding Metadata to H.264 encoded file
7 September 2012, by kerim yucel

I am currently developing an application which produces certain metadata from the preview frames coming from the camera. I can see this metadata being produced properly and I have no problems here.
However, I have to embed this metadata into these frames of interest (the frames are processed by a native algorithm to produce the metadata). I am using ffmpeg with x264 to encode the frames to H.264. I have checked x264.h and some documentation but failed to find what I seek.
My question is: is there any unused portion of the H.264 syntax in which I can embed my metadata in the encoded frames?
I hope I was clear enough. Thanks in advance.
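From what I have read so far, H.264 does reserve room for arbitrary data: SEI NAL units (nal_unit_type 6) with payloadType 5, "user data unregistered", carry a 16-byte UUID followed by free-form bytes, and conforming decoders ignore SEI payloads they do not understand. A sketch of building such a NAL by hand (the byte layout follows the spec; interleaving it into the encoded stream is left to the caller):

```python
def sei_user_data_nal(uuid16: bytes, payload: bytes) -> bytes:
    """Wrap arbitrary metadata in an H.264 SEI 'user data unregistered'
    NAL unit (nal_unit_type 6, payloadType 5). uuid16 must be 16 bytes.
    Emulation-prevention bytes are inserted as the spec requires."""
    assert len(uuid16) == 16
    body = uuid16 + payload
    sei = bytearray([0x06, 0x05])       # NAL header (type 6), payloadType 5
    size = len(body)
    while size >= 255:                  # payloadSize is coded in 0xFF chunks
        sei.append(0xFF)
        size -= 255
    sei.append(size)
    sei += body
    sei.append(0x80)                    # rbsp_stop_one_bit + byte alignment
    # Escape any 0x00 0x00 0x0[0-3] sequence with an 0x03 emulation byte
    out = bytearray()
    zeros = 0
    for b in sei:
        if zeros >= 2 and b <= 3:
            out.append(0x03)
            zeros = 0
        out.append(b)
        zeros = zeros + 1 if b == 0 else 0
    return b"\x00\x00\x00\x01" + bytes(out)
```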
-
Remove the instance of a native library from an app using dlclose (Android NDK)
7 September 2012, by mirroredAbstraction

I have compiled the FFmpeg library using the NDK and use it to trim videos and get thumbnails from a video in my app, so basically I have ffmpeg.so and video-trimmer.so libraries up and running.
The problem, however, is strange: the trim or getThumbnail operations are successful, but only once, i.e. the first time, and the operations fail the second time. However, they succeed again the third time. I googled it and found two similar posts on SO related to my problem.
Interestingly, they suggest the same solution, and I am unable to solve the issue, being naive in the C programming language. Here is what I have done:
void Java_com_example_demo_natives_LibraryLoader_loadTrimmerLibrary(
        JNIEnv* env, jclass class, jstring libffmpeg_path, jstring inputFile,
        jstring outFile, jstring startTime, jstring length)
{
    const char* path;
    void* handle;
    int *(*Java_com_example_demo_natives_VideoTrimmer_trim)(JNIEnv *, jclass,
            jstring, jstring, jstring, jstring);

    path = (*env)->GetStringUTFChars(env, libffmpeg_path, 0);
    handle = dlopen(path, RTLD_LAZY);
    Java_com_example_demo_natives_VideoTrimmer_trim =
            dlsym(handle, "Java_com_example_demo_natives_VideoTrimmer_trim");
    (*Java_com_example_demo_natives_VideoTrimmer_trim)(env, class, inputFile,
            outFile, startTime, length);
    (*env)->ReleaseStringUTFChars(env, libffmpeg_path, path);
    dlclose(handle);
}
Despite calling dlclose, the library instance still exists in memory. What am I doing wrong here? I know the instance still exists because when I load the libraries again in some other activity, the error message says the library already exists in the CL.
I want to get rid of the instance of that library from memory, please help...
-
Mixing a FLV audio stream with a WAV background track, and converting to MP3 with SoX and FFmpeg
7 September 2012, by tubbo

I'm building a Flash-based recording application for a contracted web site. It streams the recorded voice (via SWF) to a Red5 server, then uses a combination of FFmpeg and SoX to mix the vocal audio with a lower-volume background music track. This all has to happen on demand, that is, when a user "saves" his or her vocal recording.
Here is an example command I will be running. Names have been changed to protect the innocent. The filenames describe their role in the final file:
sox --combine mix -p --no-show-progress --norm "|ffmpeg -i /usr/share/red5/webapps/audiorecorder/stream/SPOKEN_VOICE.flv -t wav pipe:1" /var/www/ufiles/music/BACKGROUND_MUSIC.wav - | ffmpeg -i pipe:1 /var/www/ufiles/recordings/COMPILED_AUDIO_RECORDING.mp3
When I run this command in the shell, this is what happens:
$ sox --combine mix -p --no-show-progress --norm "|ffmpeg -i audioStream_1321399534128_21.flv -ar 44100 -ac 2 -t wav pipe:1" wrong.wav - | ffmpeg -i pipe:1 ~/www/trauma101.com/compiled.mp3
ffmpeg version N-34884-g7575980, Copyright (c) 2000-2011 the FFmpeg developers
  built on Nov 15 2011 14:06:49 with gcc 4.4.5
  configuration: --enable-gpl --enable-version3 --enable-nonfree --enable-postproc --enable-libfaac --enable-libmp3lame --enable-libx264 --enable-x11grab --enable-libspeex
  libavutil    51. 25. 0 / 51. 25. 0
  libavcodec   53. 34. 0 / 53. 34. 0
  libavformat  53. 20. 0 / 53. 20. 0
  libavdevice  53.  4. 0 / 53.  4. 0
  libavfilter   2. 48. 1 /  2. 48. 1
  libswscale    2.  1. 0 /  2.  1. 0
  libpostproc  51.  2. 0 / 51.  2. 0
[libspeex @ 0x1e36b20] Missing Speex header, assuming defaults.
Input #0, flv, from 'audioStream_1321399534128_21.flv':
  Metadata:
    novideocodec  : 0
    server        : Red5 Server 1.0.0 RC2 Rev: 4295
    creationdate  : Tue Nov 15 15:25:41 PST 2011
    canSeekToEnd  : true
  Duration: 00:00:06.77, start: 0.000000, bitrate: 43 kb/s
    Stream #0:0: Audio: speex, 16000 Hz, 1 channels, s16
Invalid duration specification for t: wav
sox FAIL formats: can't open input pipe `|ffmpeg -i audioStream_1321399534128_21.flv -ar 44100 -ac 2 -t wav pipe:1': premature EOF
I think the issue stems from the FLV-to-WAV conversion in FFmpeg, and since it's being piped in, it causes the whole process to fail. I always get that duration warning, but when FFmpeg outputs to a .wav file and the SoX command is run separately, I can still get a WAV from SoX and convert that to MP3 manually. I'd like to do all this in one line, piping the data between applications.
What do I do?
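Rereading the error lines: in ffmpeg, -t is a duration option (hence "Invalid duration specification for t: wav"), while the format-forcing flag is -f; -t wav is SoX syntax that leaked into the ffmpeg half of the pipe. A sketch of the corrected command pieces (all paths are placeholders; note the final ffmpeg should read stdin as pipe:0, not pipe:1):

```python
def mix_to_mp3_cmds(voice_flv, music_wav, out_mp3):
    """Corrected pipeline pieces: ffmpeg's format flag is -f, not -t
    (-t means duration in ffmpeg, which is exactly the error reported).
    All file paths here are placeholder assumptions."""
    decode = f"|ffmpeg -i {voice_flv} -ar 44100 -ac 2 -f wav pipe:1"
    sox = ["sox", "--combine", "mix", "-p", "--no-show-progress", "--norm",
           decode, music_wav, "-"]
    # pipe:0 is stdin; the original command read pipe:1 (the stdout fd)
    encode = ["ffmpeg", "-f", "wav", "-i", "pipe:0", out_mp3]
    return sox, encode
```

So the one-liner would become: sox --combine mix -p --no-show-progress --norm "|ffmpeg -i SPOKEN_VOICE.flv -ar 44100 -ac 2 -f wav pipe:1" BACKGROUND_MUSIC.wav - | ffmpeg -f wav -i pipe:0 COMPILED_AUDIO_RECORDING.mp3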