
Media (1)

- The Pirate Bay from Belgium
  1 April 2013
  Updated: April 2013
  Language: French
  Type: Image
Other articles (13)

- Emballe médias: what is it for?
  4 February 2011
  This plugin manages sites that publish documents of all types. It creates "media" items: a "media" item is a SPIP article created automatically when a document is uploaded, whether it is audio, video, image or text; only a single document can be linked to a given "media" article; (...)

- Possible deployments
  31 January 2010
  Two kinds of deployment are possible, depending on two aspects: the installation method chosen (standalone or as a farm), and the expected number of daily encodes and the expected traffic.
  Video encoding is a heavy process that consumes a great deal of system resources (CPU and RAM), and this has to be taken into account. Such a system is therefore only feasible on one or more dedicated servers.
  Single-server version
  The single-server version uses only one (...)

- Managing creation and editing rights for objects
  8 February 2011
  By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, notably: writing content on the site, configurable in the form-template management; adding notes to articles; adding captions and annotations to images.
On other sites (3756)

- I have a log file with RTP packets: now what?
  9 May 2012, by Brannon
  I have a log file with RTP packets coming off of a black-box device. I also have a corresponding SDP file (RTSP DESCRIBE) for it. I need to convert this into some kind of playable video file. Can I pass these two files to FFmpeg or VLC or something else and have it mux that data into something playable?
  As an alternative plan, I can loop through the individual packets in code and do something with each one. However, there seem to be existing libraries for parsing this data, and doing it by hand looks like a large project. Is there some kind of video file format that is a fairly raw mix of SDP and RTP? Thanks for your time.
  Is there a way for FFmpeg or VLC to open an SDP file and then get their input packets through STDIN?
I generally use C#, but I could use C if necessary.
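For the SDP side of the question: ffmpeg and ffplay can usually open the .sdp file directly and then bind the RTP/RTCP ports it declares, so the packets arrive over UDP sockets rather than through STDIN. A minimal sketch, assuming the session description is saved as stream.sdp (newer FFmpeg builds require the -protocol_whitelist shown here; older ones open the SDP without it):

# Play the live session described by the SDP
ffplay -protocol_whitelist file,udp,rtp -i stream.sdp

# Or remux whatever arrives on those ports into a playable file without re-encoding
ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp -c copy capture.mp4

Note that this listens for live traffic; it does not replay packets from an existing log file, which would first have to be resent onto the network or rewritten as a raw elementary stream, as attempted below.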
Update 1: Here is my non-working code. I'm trying to get some kind of output that plays in ffplay, but I haven't had any luck yet; it gives me invalid-data errors. It does walk over all the data correctly as far as I can tell, and my output is nearly as big as my input (about 4 MB).
public class RtpPacket2
{
    public byte VersionPXCC;
    public byte MPT;
    public ushort Sequence; // length?
    public uint Timestamp;
    public uint Ssrc;

    public int Version { get { return VersionPXCC >> 6; } }
    public bool Padding { get { return (VersionPXCC & 32) > 0; } }
    public bool Extension { get { return (VersionPXCC & 16) > 0; } }
    public int CsrcCount { get { return VersionPXCC & 0xf; } } // ItemCount
    public bool Marker { get { return (MPT & 0x80) > 0; } }
    public int PayloadType { get { return MPT & 0x7f; } } // PacketType
}

static void Main(string[] args)
{
    if (args.Length != 2)
    {
        Console.WriteLine("Usage: <input rtp file> <output 3gp file>");
        return;
    }
    var inputFile = args[0];
    var outputFile = args[1];
    if (File.Exists(outputFile)) File.Delete(outputFile);

    // FROM the SDP: fmtp 96 profile-level-id=4D0014;packetization-mode=0
    var sps = Convert.FromBase64String("Z0LAHoiLUFge0IAAA4QAAK/IAQ=="); // BitConverter.ToString(sps): "67-42-C0-1E-88-8B-50-58-1E-D0-80-00-03-84-00-00-AF-C8-01"
    var pps = Convert.FromBase64String("aM44gA=="); // BitConverter.ToString(pps): "68-CE-38-80"
    var sep = new byte[] { 00, 00, 01 };
    var packet = new RtpPacket2();
    bool firstFrame = true;

    using (var input = File.OpenRead(inputFile))
    using (var reader = new BinaryReader(input))
    using (var output = File.OpenWrite(outputFile))
    {
        //output.Write(header, 0, header.Length);
        output.Write(sep, 0, sep.Length);
        output.Write(sps, 0, sps.Length);
        output.Write(sep, 0, sep.Length);
        output.Write(pps, 0, pps.Length);
        output.Write(sep, 0, sep.Length);

        while (input.Position < input.Length)
        {
            var size = reader.ReadInt16();
            packet.VersionPXCC = reader.ReadByte();
            packet.MPT = reader.ReadByte();
            packet.Sequence = reader.ReadUInt16();
            packet.Timestamp = reader.ReadUInt32();
            packet.Ssrc = reader.ReadUInt32();
            if (packet.PayloadType == 96)
            {
                if (packet.CsrcCount > 0 || packet.Extension) throw new NotImplementedException();
                var header0 = reader.ReadByte();
                var header1 = reader.ReadByte();

                var fragmentType = header0 & 0x1F; // should be 28 for video
                if (fragmentType != 28) // 28 for video?
                {
                    input.Position += size - 14;
                    continue;
                }
                var nalUnit = header0 & ~0x1F;
                var nalType = header1 & 0x1F;
                var start = (header1 & 0x80) > 0;
                var end = (header1 & 0x40) > 0;

                if (firstFrame)
                {
                    output.Write(sep, 0, sep.Length);
                    output.WriteByte((byte)(nalUnit | fragmentType));
                    firstFrame = false;
                }
                for (int i = 0; i < size - 14; i++)
                    output.WriteByte(reader.ReadByte());
                if (packet.Marker)
                    firstFrame = true;
            }
            else input.Position += size - 12;
        }
    }
}
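The loop above emits a raw Annex B style stream (start codes followed by the SPS, PPS and NAL payloads) rather than a real 3GP/MP4 file, so a player has to be told it is an elementary H.264 stream. A hedged way to sanity-check such a dump, assuming it was written to out.264:

# Force the raw H.264 (Annex B) demuxer instead of probing for a container
ffplay -f h264 out.264

# Or wrap the elementary stream into MP4 without re-encoding, to see whether the NAL units parse
ffmpeg -f h264 -i out.264 -c copy out.mp4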
- Recording video on Android using JavaCV (Updated 2014-02-17)
  25 September 2014, by Fabio Bergmann
  I'm trying to record a video on Android using the JavaCV lib. I need to record the video at 640x360.
  I have installed everything as described in the README.txt file and followed the example below:
https://code.google.com/p/javacv/source/browse/samples/RecordActivity.java
In this example, the video size is:
private int imageWidth = 320;
private int imageHeight = 240;
In my case, I need to record a video at 640x360 in H.264.
(UPDATE) I have reverted my code and kept it exactly like the example, changing only imageWidth and imageHeight to 640x360.
Now the recorded video looks like this image:
http://bergmann.net.br/img/screenshot_video_error.png
Here is my code:
import static com.googlecode.javacv.cpp.opencv_core.IPL_DEPTH_8U;
import java.io.IOException;
import java.nio.ShortBuffer;
import android.app.Activity;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.PowerManager;
import android.util.Log;
import android.view.Display;
import android.view.KeyEvent;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.LinearLayout;
import android.widget.RelativeLayout;
import com.autosonvideo.helpers.Helpers;
import com.autosonvideo.logic.CameraHelpers;
import com.googlecode.javacv.FFmpegFrameRecorder;
import com.googlecode.javacv.cpp.opencv_core.IplImage;
public class FFmpegRecordActivity extends Activity implements OnClickListener {
private final static String CLASS_LABEL = "RecordActivity";
private final static String LOG_TAG = CLASS_LABEL;
private PowerManager.WakeLock mWakeLock;
private String ffmpeg_link;
long startTime = 0;
boolean recording = false;
private volatile FFmpegFrameRecorder recorder;
private boolean isPreviewOn = false;
private int sampleAudioRateInHz = 44100;
private int imageWidth = 640;
private int imageHeight = 480;
private int finalImageWidth = 640;
private int finalImageHeight = 360;
private int frameRate = 30;
/* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;
/* video data getting thread */
private Camera cameraDevice;
private CameraView cameraView;
private IplImage yuvIplimage = null;
/* layout setting */
private final int bg_screen_bx = 232;
private final int bg_screen_by = 128;
private final int bg_screen_width = 700;
private final int bg_screen_height = 500;
private final int bg_width = 1123;
private final int bg_height = 715;
private final int live_width = 1280;
private final int live_height = 960;
private int screenWidth, screenHeight;
private Button btnRecorderControl;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
setContentView(R.layout.main);
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK,
CLASS_LABEL);
mWakeLock.acquire();
initLayout();
initRecorder();
}
@Override
protected void onResume() {
super.onResume();
if (mWakeLock == null) {
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK,
CLASS_LABEL);
mWakeLock.acquire();
}
}
@Override
protected void onPause() {
super.onPause();
if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}
@Override
protected void onDestroy() {
super.onDestroy();
recording = false;
if (cameraView != null) {
cameraView.stopPreview();
cameraDevice.release();
cameraDevice = null;
}
if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}
private void initLayout() {
/* get size of screen */
Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE))
.getDefaultDisplay();
screenWidth = display.getWidth();
screenHeight = display.getHeight();
RelativeLayout.LayoutParams layoutParam = null;
LayoutInflater myInflate = null;
myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
RelativeLayout topLayout = new RelativeLayout(this);
setContentView(topLayout);
LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(
R.layout.main, null);
layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
topLayout.addView(preViewLayout, layoutParam);
/* add control button: start and stop */
btnRecorderControl = (Button) findViewById(R.id.recorder_control);
btnRecorderControl.setText("Start");
btnRecorderControl.setOnClickListener(this);
/* add camera view */
int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
int prev_rw, prev_rh;
if (1.0 * display_width_d / display_height_d > 1.0 * live_width
/ live_height) {
prev_rh = display_height_d;
prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
} else {
prev_rw = display_width_d;
prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
}
layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);
cameraDevice = Camera.open();
Log.i(LOG_TAG, "cameara open");
cameraView = new CameraView(this, cameraDevice);
topLayout.addView(cameraView, layoutParam);
Log.i(LOG_TAG, "cameara preview start: OK");
}
// ---------------------------------------
// initialize ffmpeg_recorder
// ---------------------------------------
private void initRecorder() {
Log.w(LOG_TAG, "init recorder");
if (yuvIplimage == null) {
yuvIplimage = IplImage.create(finalImageWidth, finalImageHeight,
IPL_DEPTH_8U, 2);
Log.i(LOG_TAG, "create yuvIplimage");
}
ffmpeg_link = CameraHelpers.getOutputMediaFile(
CameraHelpers.MEDIA_TYPE_VIDEO).toString();
Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
recorder = new FFmpegFrameRecorder(ffmpeg_link, finalImageWidth,
finalImageHeight, 1);
recorder.setFormat("mp4");
recorder.setSampleRate(sampleAudioRateInHz);
// Set in the surface changed method
recorder.setFrameRate(frameRate);
Log.i(LOG_TAG, "recorder initialize success");
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
}
public void startRecording() {
try {
recorder.start();
startTime = System.currentTimeMillis();
recording = true;
audioThread.start();
} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
}
public void stopRecording() {
runAudioThread = false;
if (recorder != null && recording) {
recording = false;
Log.v(LOG_TAG,
"Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
recorder = null;
}
}
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
if (recording) {
stopRecording();
}
finish();
return true;
}
return super.onKeyDown(keyCode, event);
}
// ---------------------------------------------
// audio thread, gets and encodes audio data
// ---------------------------------------------
class AudioRecordRunnable implements Runnable {
@Override
public void run() {
android.os.Process
.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
// Audio
int bufferSize;
short[] audioData;
int bufferReadResult;
bufferSize = AudioRecord
.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
sampleAudioRateInHz, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, bufferSize);
audioData = new short[bufferSize];
Log.d(LOG_TAG, "audioRecord.startRecording()");
audioRecord.startRecording();
/* ffmpeg_audio encoding loop */
while (runAudioThread) {
// Log.v(LOG_TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData, 0,
audioData.length);
if (bufferReadResult > 0) {
Log.v(LOG_TAG, "bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when start this thread, it
// never get's set according to this if statement...!!!
// Why? Good question...
if (recording) {
try {
recorder.record(ShortBuffer.wrap(audioData, 0,
bufferReadResult));
// Log.v(LOG_TAG,"recording " + 1024*i + " to " +
// 1024*i+1024);
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG, e.getMessage());
e.printStackTrace();
}
}
}
}
Log.v(LOG_TAG, "AudioThread Finished, release audioRecord");
/* encoding finish, release recorder */
if (audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(LOG_TAG, "audioRecord released");
}
}
}
// ---------------------------------------------
// camera thread, gets and encodes video data
// ---------------------------------------------
class CameraView extends SurfaceView implements SurfaceHolder.Callback,
PreviewCallback {
private SurfaceHolder mHolder;
private Camera mCamera;
public CameraView(Context context, Camera camera) {
super(context);
Log.w("camera", "camera view");
mCamera = camera;
mHolder = getHolder();
mHolder.addCallback(CameraView.this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
mCamera.setPreviewCallback(CameraView.this);
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
stopPreview();
mCamera.setPreviewDisplay(holder);
} catch (IOException exception) {
mCamera.release();
mCamera = null;
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int width,
int height) {
Log.v(LOG_TAG, "Setting imageWidth: " + imageWidth
+ " imageHeight: " + imageHeight + " frameRate: "
+ frameRate);
Camera.Parameters camParams = mCamera.getParameters();
camParams.setPreviewSize(imageWidth, imageHeight);
Log.v(LOG_TAG,
"Preview Framerate: " + camParams.getPreviewFrameRate());
camParams.setPreviewFrameRate(frameRate);
mCamera.setParameters(camParams);
startPreview();
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
try {
mHolder.addCallback(null);
mCamera.setPreviewCallback(null);
} catch (RuntimeException e) {
// The camera has probably just been released, ignore.
}
}
public void startPreview() {
if (!isPreviewOn && mCamera != null) {
isPreviewOn = true;
mCamera.startPreview();
}
}
public void stopPreview() {
if (isPreviewOn && mCamera != null) {
isPreviewOn = false;
mCamera.stopPreview();
}
}
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
/* get video data */
if (yuvIplimage != null && recording) {
// yuvIplimage.getByteBuffer().put(data);
final int startY = 640 * (480 - 360) / 2;
final int lenY = 640 * 360;
yuvIplimage.getByteBuffer().put(data, startY, lenY);
final int startVU = 640 * 480 + 320 * 2 * (240 - 180) / 2;
final int lenVU = 320 * 180 * 2;
yuvIplimage.getByteBuffer().put(data, startVU, lenVU);
Log.v(LOG_TAG, "Writing Frame");
try {
long t = 1000 * (System.currentTimeMillis() - startTime);
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(yuvIplimage);
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG, e.getMessage());
e.printStackTrace();
}
}
}
}
@Override
public void onClick(View v) {
if (!recording) {
startRecording();
Log.w(LOG_TAG, "Start Button Pushed");
btnRecorderControl.setText("Stop");
} else {
// This will trigger the audio recording loop to stop and then set
// isRecorderStart = false;
stopRecording();
Log.w(LOG_TAG, "Stop Button Pushed");
btnRecorderControl.setText("Start");
}
}
}
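Before digging into the YUV handling, a hedged first check on a distorted recording like the screenshot above is to confirm what actually ended up in the container, since a mismatch between the camera's real preview size and the width/height handed to FFmpegFrameRecorder typically produces exactly this kind of skewed picture. Assuming the recorder wrote video.mp4:

# Inspect the recorded stream's codec, dimensions and pixel format
ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,width,height,pix_fmt video.mp4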
- Static Compilation of FFmpeg with x264 support
  17 February 2014, by b1izzard
  Based on the Tutorial, I tried to build the latest FFmpeg binary with x264 support for command-line execution using android-ndk-r9c.
  I built x264 using the script below:
#!/bin/bash
PREBUILT=/home/blizzard/bin/android-ndk-r9c/toolchains/arm-linux-androideabi-4.6/prebuilt
PLATFORM=/home/blizzard/bin/android-ndk-r9c/platforms/android-9/arch-arm
PREFIX=/home/blizzard/bin/android-ndk-r9c/sources/ffmpeg
./configure --prefix=$PREFIX \
--enable-static \
--enable-pic \
--disable-asm \
--disable-cli \
--host=arm-linux \
--cross-prefix=$PREBUILT/linux-x86_64/bin/arm-linux-androideabi- \
--sysroot=$PLATFORM
make
sudo make install
sudo ldconfig

To build FFmpeg, I used the script below:
#!/bin/bash
PLATFORM=/home/blizzard/bin/android-ndk-r9c/platforms/android-9/arch-arm
PREBUILT=/home/blizzard/bin/android-ndk-r9c/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64
PREFIX=/home/blizzard/bin/android-ndk-r9c/sources/ffmpeg
function build_one
{
./configure --target-os=linux --prefix=$PREFIX \
--arch=arm \
--cpu=cortex-a8 \
--enable-cross-compile \
--enable-runtime-cpudetect \
--disable-asm \
--enable-static \
--disable-shared \
--arch=arm \
--cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
--cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
--disable-stripping \
--nm=$PREBUILT/bin/arm-linux-androideabi-nm \
--sysroot=$PLATFORM \
--enable-nonfree \
--enable-version3 \
--disable-everything \
--enable-gpl \
--disable-doc \
--enable-avresample \
--enable-demuxer=rtsp \
--enable-muxer=rtsp \
--disable-ffplay \
--disable-ffserver \
--enable-ffmpeg \
--disable-ffprobe \
--enable-libx264 \
--enable-encoder=libx264 \
--enable-decoder=h264 \
--enable-protocol=rtp \
--enable-hwaccels \
--enable-zlib \
--disable-devices \
--disable-avdevice \
--extra-cflags="-I/home/blizzard/bin/android-ndk-r9c/sources/ffmpeg/include -fPIC -DANDROID -D__thumb__ -mthumb -Wfatal-errors -Wno-deprecated -mfloat-abi=softfp -marm -march=armv7-a" \
--extra-ldflags="-L/home/blizzard/bin/android-ndk-r9c/sources/ffmpeg/lib"
make -j4 install
$PREBUILT/bin/arm-linux-androideabi-ar d libavcodec/libavcodec.a inverse.o
$PREBUILT/bin/arm-linux-androideabi-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -L$PREFIX/lib -soname libffmpeg.so -shared -nostdlib -z noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg.so libavcodec/libavcodec.a libavfilter/libavfilter.a libavresample/libavresample.a libavformat/libavformat.a libavutil/libavutil.a libswscale/libswscale.a -lc -lm -lz -ldl -llog -lx264 --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/arm-linux-androideabi/4.6/libgcc.a
}
build_one

The output of the script is:
install prefix /home/blizzard/bin/android-ndk-r9c/sources/ffmpeg
source path .
C compiler /home/blizzard/bin/android-ndk-r9c/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64/bin/arm-linux-androideabi-gcc
C library bionic
host C compiler gcc
host C library glibc
ARCH arm (cortex-a8)
big-endian no
runtime cpu detection yes
debug symbols yes
strip symbols no
optimize for size no
optimizations yes
static yes
shared no
postprocessing support yes
new filter support yes
network support yes
threading support pthreads
safe bitstream reader yes
SDL support no
opencl enabled no
libzvbi enabled no
texi2html enabled yes
perl enabled yes
pod2man enabled yes
makeinfo enabled no
External libraries:
libx264 zlib
Enabled decoders:
h264
Enabled encoders:
libx264
Enabled hwaccels:
Enabled parsers:
Enabled demuxers:
asf mpegts rtsp
mov rm
Enabled muxers:
rtp rtsp
Enabled protocols:
http tcp udp
rtp
Enabled filters:
aformat format setpts
anull null trim
atrim
Enabled bsfs:
Enabled indevs:
Enabled outdevs:
License: nonfree and unredistributable
Creating config.mak, config.h, and doc/config.texi...
config.h is unchanged
libavutil/avconfig.h is unchanged
WARNING: /home/blizzard/bin/android-ndk-r9c/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64/bin/arm-linux-androideabi-pkg-config not found, library detection may fail.
INSTALL install-progs-yes
INSTALL ffmpeg
INSTALL presets/libvpx-1080p50_60.ffpreset
INSTALL presets/libvpx-1080p.ffpreset
INSTALL presets/libvpx-360p.ffpreset
INSTALL presets/libvpx-720p50_60.ffpreset
INSTALL presets/libvpx-720p.ffpreset
INSTALL presets/libx264-ipod320.ffpreset
INSTALL presets/libx264-ipod640.ffpreset
INSTALL doc/ffprobe.xsd
INSTALL doc/examples/decoding_encoding.c
INSTALL doc/examples/demuxing_decoding.c
INSTALL doc/examples/filtering_audio.c
INSTALL doc/examples/filtering_video.c
INSTALL doc/examples/metadata.c
INSTALL doc/examples/muxing.c
INSTALL doc/examples/remuxing.c
INSTALL doc/examples/resampling_audio.c
INSTALL doc/examples/scaling_video.c
INSTALL doc/examples/transcode_aac.c
INSTALL doc/examples/Makefile
INSTALL doc/examples/README
INSTALL libavfilter/libavfilter.a
INSTALL libavformat/libavformat.a
INSTALL presets/libvpx-1080p50_60.ffpreset
INSTALL presets/libvpx-1080p.ffpreset
INSTALL presets/libvpx-360p.ffpreset
INSTALL presets/libvpx-720p50_60.ffpreset
INSTALL presets/libvpx-720p.ffpreset
INSTALL presets/libx264-ipod320.ffpreset
INSTALL presets/libx264-ipod640.ffpreset
INSTALL doc/ffprobe.xsd
INSTALL doc/examples/decoding_encoding.c
INSTALL doc/examples/demuxing_decoding.c
INSTALL doc/examples/filtering_audio.c
INSTALL doc/examples/filtering_video.c
INSTALL doc/examples/metadata.c
INSTALL doc/examples/muxing.c
INSTALL doc/examples/remuxing.c
INSTALL doc/examples/resampling_audio.c
INSTALL doc/examples/scaling_video.c
INSTALL doc/examples/transcode_aac.c
INSTALL doc/examples/Makefile
INSTALL doc/examples/README
INSTALL libavresample/libavresample.a
INSTALL libavcodec/libavcodec.a
INSTALL libpostproc/libpostproc.a
INSTALL libswresample/libswresample.a
INSTALL libavutil/libavutil.a
INSTALL libavfilter/asrc_abuffer.h
INSTALL libavfilter/avcodec.h
INSTALL libavfilter/avfilter.h
INSTALL libavfilter/avfiltergraph.h
INSTALL libavfilter/buffersink.h
INSTALL libavfilter/buffersrc.h
INSTALL libavfilter/version.h
INSTALL libavformat/avformat.h
INSTALL libavformat/avio.h
INSTALL libavformat/version.h
INSTALL libavfilter/libavfilter.pc
INSTALL libavformat/libavformat.pc
INSTALL libavresample/avresample.h
INSTALL libavresample/version.h
INSTALL libavresample/libavresample.pc
INSTALL libavcodec/avcodec.h
INSTALL libavcodec/avfft.h
INSTALL libavcodec/dxva2.h
INSTALL libavcodec/old_codec_ids.h
INSTALL libavcodec/vaapi.h
INSTALL libavcodec/vda.h
INSTALL libavcodec/vdpau.h
INSTALL libavcodec/version.h
INSTALL libavcodec/xvmc.h
INSTALL libavcodec/libavcodec.pc
INSTALL libpostproc/postprocess.h
INSTALL libpostproc/version.h
INSTALL libpostproc/libpostproc.pc
INSTALL libswresample/swresample.h
INSTALL libswresample/version.h
INSTALL libswscale/swscale.h
INSTALL libswscale/version.h
INSTALL libavutil/adler32.h
INSTALL libavutil/aes.h
INSTALL libavutil/attributes.h
INSTALL libavutil/audio_fifo.h
INSTALL libavutil/audioconvert.h
INSTALL libavutil/avassert.h
INSTALL libavutil/avstring.h
INSTALL libavutil/avutil.h
INSTALL libavutil/base64.h
INSTALL libavutil/blowfish.h
INSTALL libavutil/bprint.h
INSTALL libavutil/bswap.h
INSTALL libavutil/buffer.h
INSTALL libavutil/channel_layout.h
INSTALL libavutil/common.h
INSTALL libavutil/cpu.h
INSTALL libavutil/crc.h
INSTALL libavutil/error.h
INSTALL libavutil/eval.h
INSTALL libavutil/fifo.h
INSTALL libavutil/file.h
INSTALL libavutil/frame.h
INSTALL libavutil/hmac.h
INSTALL libavutil/imgutils.h
INSTALL libavutil/intfloat.h
INSTALL libavutil/intfloat_readwrite.h
INSTALL libavutil/intreadwrite.h
INSTALL libavutil/lfg.h
INSTALL libavutil/log.h
INSTALL libavutil/macros.h
INSTALL libavutil/mathematics.h
INSTALL libavutil/md5.h
INSTALL libavutil/mem.h
INSTALL libavutil/murmur3.h
INSTALL libavutil/dict.h
INSTALL libavutil/old_pix_fmts.h
INSTALL libavutil/opt.h
INSTALL libavutil/parseutils.h
INSTALL libavutil/pixdesc.h
INSTALL libavutil/pixfmt.h
INSTALL libavutil/random_seed.h
INSTALL libavutil/rational.h
INSTALL libavutil/ripemd.h
INSTALL libavutil/samplefmt.h
INSTALL libavutil/sha.h
INSTALL libavutil/sha512.h
INSTALL libavutil/stereo3d.h
INSTALL libavutil/time.h
INSTALL libavutil/timecode.h
INSTALL libavutil/timestamp.h
INSTALL libavutil/version.h
INSTALL libavutil/xtea.h
INSTALL libavutil/avconfig.h
INSTALL libavutil/ffversion.h
INSTALL libswresample/libswresample.pc
INSTALL libswscale/libswscale.pc
INSTALL libavutil/libavutil.pc
When I tried to execute the following command using the resulting FFmpeg binary in an Android application:
ffmpeg -y -loop 1 -r 30 -i myImage.jpg -b:v "4096k" -vf "scale=640:480" -t 3 result.mp4
LogCat displays the following message
02-17 15:42:45.683: D/(14511): *******Starting FFMPEG
02-17 15:42:45.683: D/(14511): ***ffmpeg version N-60108-gda25a65 Copyright (c) 2000-2014 the FFmpeg developers***
02-17 15:42:45.693: D/(14511): *** built on Feb 17 2014 15:35:20 with gcc 4.6 (GCC) 20120106 (prerelease)***
02-17 15:42:45.693: D/(14511): *** configuration: --target-os=linux --prefix=/home/blizzard/bin/android-ndk-r9c/sources/ffmpeg --arch=arm --cpu=cortex-a8 --enable-cross-compile --enable-runtime-cpudetect --disable-asm --enable-static --disable-shared --arch=arm --cc=/home/blizzard/bin/android-ndk-r9c/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64/bin/arm-linux-androideabi-gcc --cross-prefix=/home/blizzard/bin/android-ndk-r9c/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64/bin/arm-linux-androideabi- --disable-stripping --nm=/home/blizzard/bin/android-ndk-r9c/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64/bin/arm-linux-androideabi-nm --sysroot=/home/blizzard/bin/android-ndk-r9c/platforms/android-9/arch-arm --enable-nonfree --enable-version3 --disable-everything --enable-gpl --disable-doc --enable-avresample --enable-demuxer=rtsp --enable-muxer=rtsp --disable-ffplay --disable-ffserver --enable-ffmpeg --disable-ffprobe --enable-libx264 --enable-encoder=libx264 --enable-decoder=h264 --enable-protocol=rtp --enable-hwaccels --enable-zlib --disable-devices --disable-avdevice --extra-cflags='-I/home/blizzard/bin/android-ndk-r9c/sources/ffmpeg/include -fPIC -DANDROID -D__thumb__ -mthumb -Wfatal-errors -Wno-deprecated -mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=armv7-a' --extra-ldflags=-L/home/blizzard/bin/android-ndk-r9c/sources/ffmpeg/lib***
02-17 15:42:45.693: D/(14511): *** libavutil 52. 63.100 / 52. 63.100***
02-17 15:42:45.693: D/(14511): *** libavcodec 55. 49.100 / 55. 49.100***
02-17 15:42:45.693: D/(14511): *** libavformat 55. 25.101 / 55. 25.101***
02-17 15:42:45.693: D/(14511): *** libavfilter 4. 1.100 / 4. 1.100***
02-17 15:42:45.693: D/(14511): *** libavresample 1. 1. 0 / 1. 1. 0***
02-17 15:42:45.693: D/(14511): *** libswscale 2. 5.101 / 2. 5.101***
02-17 15:42:45.693: D/(14511): *** libswresample 0. 17.104 / 0. 17.104***
02-17 15:42:45.693: D/(14511): *** libpostproc 52. 3.100 / 52. 3.100***
02-17 15:42:45.693: D/(14511): ***Unrecognized option 'loop'.***
02-17 15:42:45.693: D/(14511): ***Error splitting the argument list: Option not found***
02-17 15:42:45.693: D/(14511): ****ending FFMPEG****
Can anybody help me build the latest version of FFmpeg, or can you share the latest FFmpeg binary for Android?
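A hedged note on the error itself: in FFmpeg, -loop is a private option of the image2 demuxer, and the configure call above passes --disable-everything without re-enabling an image demuxer, a JPEG decoder, an MP4 muxer, the file protocol or the scale filter (none of them appear in the configure output above), so a JPEG-to-MP4 command like the one shown has nothing to run on and the option lookup fails. A sketch of extra flags that would likely need to be added to the ./configure line in build_one for that command (not verified against this exact tree):

--enable-demuxer=image2 \
--enable-decoder=mjpeg \
--enable-muxer=mp4 \
--enable-protocol=file \
--enable-filter=scale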