
Media (91)
-
GetID3 - Additional buttons
9 April 2013, by
Updated: April 2013
Language: French
Type: Image
-
Core Media Video
4 April 2013, by
Updated: June 2013
Language: French
Type: Video
-
The pirate bay from Belgium
1 April 2013, by
Updated: April 2013
Language: French
Type: Image
-
Ogg detection bug
22 March 2013, by
Updated: April 2013
Language: French
Type: Video
-
Example of action buttons for a collaborative collection
27 February 2013, by
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013, by
Updated: February 2013
Language: English
Type: Image
Other articles (88)
-
Customise by adding your logo, banner or background image
5 September 2013, by
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP site to find out.
-
Support for all types of media
10 April 2011
Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and other documents (OpenOffice, Microsoft Office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)
On other sites (6912)
-
Encoding images into a movie file
5 April 2014, by RuAware
I am trying to save JPGs into a movie. I have tried jcodec, and although my S3 plays the result fine, other devices do not, including VLC and Windows Media.
I have just spent most of the day playing with MediaCodec. Although the required SDK level is high, it will help people on Jelly Bean and above. But I cannot work out how to get the files to the encoder and then write the output file.
Ideally I want to support down to SDK 9/8.
Has anyone got any code they can share, either to get MediaCodec to work or another option? If you say ffmpeg, I'd love to, but my JNI knowledge is non-existent and I would need a very good guide.
Code for MediaCodec so far:
public class EncodeAndMux extends AsyncTask<Integer, Void, Boolean> {
private static int bitRate = 2000000;
private static int MAX_INPUT = 100000;
private static String mimeType = "video/avc";
private int frameRate = 15;
private int colorFormat;
private int stride = 1;
private int sliceHeight = 2;
private MediaCodec encoder = null;
private MediaFormat inputFormat;
private MediaCodecInfo codecInfo = null;
private MediaMuxer muxer;
private boolean mMuxerStarted = false;
private int mTrackIndex = 0;
private long presentationTime = 0;
private Paint bmpPaint;
private static int WAITTIME = 10000;
private static String TAG = "ENCODE";
private ArrayList<String> mFilePaths;
private String mPath;
private EncodeListener mListener;
private int width = 320;
private int height = 240;
private double mSpeed = 1;
public EncodeAndMux(ArrayList<String> filePaths, String savePath) {
mFilePaths = filePaths;
mPath = savePath;
// Create paint to draw BMP
bmpPaint = new Paint();
bmpPaint.setAntiAlias(true);
bmpPaint.setFilterBitmap(true);
bmpPaint.setDither(true);
}
public void setListner(EncodeListener listener) {
mListener = listener;
}
// set the speed, how many frames a second
public void setSpeed(int speed) {
mSpeed = speed;
}
public double getSpeed() {
return mSpeed;
}
private long computePresentationTime(int frameIndex) {
final long ONE_SECOND = 1000000;
return (long) (frameIndex * (ONE_SECOND / mSpeed));
}
public interface EncodeListener {
public void finished();
public void errored();
}
@TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR2)
@Override
protected Boolean doInBackground(Integer... params) {
try {
muxer = new MediaMuxer(mPath, OutputFormat.MUXER_OUTPUT_MPEG_4);
} catch (Exception e){
e.printStackTrace();
}
// Find a codec that supports the mime type
int numCodecs = MediaCodecList.getCodecCount();
for (int i = 0; i < numCodecs && codecInfo == null; i++) {
MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
if (!info.isEncoder()) {
continue;
}
String[] types = info.getSupportedTypes();
boolean found = false;
for (int j = 0; j < types.length && !found; j++) {
if (types[j].equals(mimeType))
found = true;
}
if (!found)
continue;
codecInfo = info;
}
for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
if (!info.isEncoder()) {
continue;
}
String[] types = info.getSupportedTypes();
for (int j = 0; j < types.length; ++j) {
if (!types[j].equals(mimeType))
continue;
MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(types[j]);
for (int k = 0; k < caps.profileLevels.length; k++) {
if (caps.profileLevels[k].profile == MediaCodecInfo.CodecProfileLevel.AVCProfileHigh && caps.profileLevels[k].level == MediaCodecInfo.CodecProfileLevel.AVCLevel4) {
codecInfo = info;
}
}
}
}
Log.d(TAG, "Found " + codecInfo.getName() + " supporting " + mimeType);
MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);
for (int i = 0; i < capabilities.colorFormats.length && colorFormat == 0; i++) {
int format = capabilities.colorFormats[i];
switch (format) {
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
colorFormat = format;
break;
}
}
Log.d(TAG, "Using color format " + colorFormat);
// Determine width, height and slice sizes
if (codecInfo.getName().equals("OMX.TI.DUCATI1.VIDEO.H264E")) {
// This codec doesn't support a width not a multiple of 16,
// so round down.
width &= ~15;
}
stride = width;
sliceHeight = height;
if (codecInfo.getName().startsWith("OMX.Nvidia.")) {
stride = (stride + 15) / 16 * 16;
sliceHeight = (sliceHeight + 15) / 16 * 16;
}
inputFormat = MediaFormat.createVideoFormat(mimeType, width, height);
inputFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
inputFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
inputFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
inputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
// inputFormat.setInteger("stride", stride);
// inputFormat.setInteger("slice-height", sliceHeight);
inputFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, MAX_INPUT);
encoder = MediaCodec.createByCodecName(codecInfo.getName());
encoder.configure(inputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();
ByteBuffer[] inputBuffers = encoder.getInputBuffers();
ByteBuffer[] outputBuffers = encoder.getOutputBuffers();
int inputBufferIndex= -1, outputBufferIndex= -1;
BufferInfo info = new BufferInfo();
for (int i = 0; i < mFilePaths.size(); i++) {
// use decode sample to calculate inSample size and then resize
Bitmap bitmapIn = Images.decodeSampledBitmapFromPath(mFilePaths.get(i), width, height);
// Create blank bitmap
Bitmap bitmap = Bitmap.createBitmap(width, height, Config.ARGB_8888);
// Center scaled image
Canvas canvas = new Canvas(bitmap);
canvas.drawBitmap(bitmapIn,(bitmap.getWidth()/2)-(bitmapIn.getWidth()/2),(bitmap.getHeight()/2)-(bitmapIn.getHeight()/2), bmpPaint);
Log.d(TAG, "Bitmap width: " + bitmapIn.getWidth() + " height: " + bitmapIn.getHeight() + " WIDTH: " + width + " HEIGHT: " + height);
byte[] dat = getNV12(width, height, bitmap);
bitmap.recycle();
// Exception occurred on this below line in Emulator, LINE No. 182//**
inputBufferIndex = encoder.dequeueInputBuffer(WAITTIME);
Log.i("DAT", "Size= "+dat.length);
if(inputBufferIndex >= 0){
int samplesiz= dat.length;
inputBuffers[inputBufferIndex].put(dat);
presentationTime = computePresentationTime(i);
if (i == mFilePaths.size() - 1) { // last index is size() - 1; signal end of stream here
encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
Log.i(TAG, "Last Frame");
} else {
encoder.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
}
while(true) {
outputBufferIndex = encoder.dequeueOutputBuffer(info, WAITTIME);
Log.i("BATA", "outputBufferIndex="+outputBufferIndex);
if (outputBufferIndex >= 0) {
ByteBuffer encodedData = outputBuffers[outputBufferIndex];
if (encodedData == null) {
throw new RuntimeException("encoderOutputBuffer " + outputBufferIndex +
" was null");
}
if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
// The codec config data was pulled out and fed to the muxer when we got
// the INFO_OUTPUT_FORMAT_CHANGED status. Ignore it.
Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
info.size = 0;
}
if (info.size != 0) {
if (!mMuxerStarted) {
throw new RuntimeException("muxer hasn't started");
}
// adjust the ByteBuffer values to match BufferInfo (not needed?)
encodedData.position(info.offset);
encodedData.limit(info.offset + info.size);
muxer.writeSampleData(mTrackIndex, encodedData, info);
Log.d(TAG, "sent " + info.size + " bytes to muxer");
}
encoder.releaseOutputBuffer(outputBufferIndex, false);
inputBuffers[inputBufferIndex].clear();
outputBuffers[outputBufferIndex].clear();
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
break; // out of while
}
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
// Subsequent data will conform to new format.
MediaFormat opmediaformat = encoder.getOutputFormat();
if (!mMuxerStarted) {
mTrackIndex = muxer.addTrack(opmediaformat);
muxer.start();
mMuxerStarted = true;
}
Log.i(TAG, "op_buf_format_changed: " + opmediaformat);
} else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = encoder.getOutputBuffers();
Log.d(TAG, "Output Buffer changed " + outputBuffers);
} else if(outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
// No Data, break out
break;
} else {
// Unexpected State, ignore it
Log.d(TAG, "Unexpected State " + outputBufferIndex);
}
}
}
}
if (encoder != null) {
encoder.flush();
encoder.stop();
encoder.release();
encoder = null;
}
if (muxer != null) {
muxer.stop();
muxer.release();
muxer = null;
}
return true;
};
@Override
protected void onPostExecute(Boolean result) {
if (result) {
if (mListener != null)
mListener.finished();
} else {
if (mListener != null)
mListener.errored();
}
super.onPostExecute(result);
}
byte [] getNV12(int inputWidth, int inputHeight, Bitmap scaled) {
int [] argb = new int[inputWidth * inputHeight];
scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
byte [] yuv = new byte[inputWidth*inputHeight*3/2];
encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
scaled.recycle();
return yuv;
}
void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
final int frameSize = width * height;
int yIndex = 0;
int uvIndex = frameSize;
int a, R, G, B, Y, U, V;
int index = 0;
for (int j = 0; j < height; j++) {
for (int i = 0; i < width; i++) {
a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
R = (argb[index] & 0xff0000) >> 16;
G = (argb[index] & 0xff00) >> 8;
B = (argb[index] & 0xff) >> 0;
// well known RGB to YVU algorithm
Y = ( ( 66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
V = ( ( -38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
U = ( ( 112 * R - 94 * G - 18 * B + 128) >> 8) + 128;
yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
if (j % 2 == 0 && index % 2 == 0) {
yuv420sp[uvIndex++] = (byte)((V<0) ? 0 : ((V > 255) ? 255 : V));
yuv420sp[uvIndex++] = (byte)((U<0) ? 0 : ((U > 255) ? 255 : U));
}
index ++;
}
}
}
}
This has now been tested on 4 of my devices and works fine. Is there a way to:
1/ Calculate MAX_INPUT (too high and it crashes on the N7 II; I don't want that happening once released)
2/ Offer an API 16 solution?
3/ Do I need stride and slice height?
Thanks
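On question 1: for raw YUV input, a natural floor for KEY_MAX_INPUT_SIZE is one uncompressed frame, width * height * 3 / 2 bytes; at 320x240 that is 115,200 bytes, already larger than the posted MAX_INPUT of 100,000, and that mismatch is one plausible source of device-specific crashes. On the colour conversion: the constants in encodeYUV420SP are the usual BT.601 ones, but the variables named U and V are swapped relative to the standard definitions (the bytes still come out in Cb-then-Cr order, i.e. the NV12 layout that COLOR_FormatYUV420SemiPlanar expects on many devices). Below is a minimal standalone sketch of the same conversion with the names straightened out, using only the JDK so the maths can be checked off-device; ArgbToNv12 is a hypothetical helper name, not part of any API.

```java
// Standalone BT.601 ARGB -> YUV420 semi-planar (NV12: full Y plane,
// then interleaved Cb/Cr, one pair per 2x2 block). Pure JDK, no Android classes.
public class ArgbToNv12 {
    static byte clamp(int x) { return (byte) Math.max(0, Math.min(255, x)); }

    public static byte[] convert(int[] argb, int width, int height) {
        byte[] yuv = new byte[width * height * 3 / 2];
        int yIndex = 0, uvIndex = width * height, index = 0;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++, index++) {
                int r = (argb[index] >> 16) & 0xff;
                int g = (argb[index] >> 8) & 0xff;
                int b = argb[index] & 0xff;
                // BT.601 limited-range coefficients, fixed-point with rounding.
                int y  = (( 66 * r + 129 * g +  25 * b + 128) >> 8) + 16;
                int cb = ((-38 * r -  74 * g + 112 * b + 128) >> 8) + 128;
                int cr = ((112 * r -  94 * g -  18 * b + 128) >> 8) + 128;
                yuv[yIndex++] = clamp(y);
                if (j % 2 == 0 && i % 2 == 0) { // subsample chroma 2x2
                    yuv[uvIndex++] = clamp(cb);
                    yuv[uvIndex++] = clamp(cr);
                }
            }
        }
        return yuv;
    }

    public static void main(String[] args) {
        int[] white = {0xFFFFFFFF, 0xFFFFFFFF, 0xFFFFFFFF, 0xFFFFFFFF};
        byte[] yuv = convert(white, 2, 2);
        // White maps to Y = 235, Cb = Cr = 128 in BT.601 limited range.
        System.out.println((yuv[0] & 0xff) + " " + (yuv[4] & 0xff) + " " + (yuv[5] & 0xff));
    }
}
```

A quick sanity check like the white-pixel case above (or red, which should land near Y=82, Cb=90, Cr=240) catches wiring mistakes before anything reaches the encoder.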
-
Encode video in reverse?
7 March 2017, by bob
Does anyone know if it is possible to encode a video using ffmpeg in reverse? (So the resulting video plays in reverse?)
I think I can do it by generating an image for each frame (so a folder of images labelled 1.jpg, 2.jpg, etc.), then writing a script to rename the images in reverse order, and re-encoding the video from these files.
Does anyone know of a quicker way?
This is an FLV video.
Thank you
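For what it's worth, recent ffmpeg builds ship `reverse` (video) and `areverse` (audio) filters, e.g. `ffmpeg -i in.flv -vf reverse -af areverse out.flv`; they buffer the entire clip in memory, so they suit short videos. Failing that, the renaming step described above can be sketched as a small standalone program; the file-name convention `1.jpg` … `n.jpg` and the class name ReverseFrames are assumptions for illustration.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Reverse an extracted frame sequence 1.jpg .. n.jpg in place,
// so that re-encoding the sequence yields a reversed video.
public class ReverseFrames {
    public static void reverse(Path dir, int n) throws IOException {
        // Two passes via temporary names so no frame is overwritten mid-swap.
        for (int i = 1; i <= n; i++) {
            Files.move(dir.resolve(i + ".jpg"), dir.resolve((n - i + 1) + ".jpg.tmp"));
        }
        for (int i = 1; i <= n; i++) {
            Files.move(dir.resolve(i + ".jpg.tmp"), dir.resolve(i + ".jpg"));
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("frames");
        for (int i = 1; i <= 3; i++) {
            Files.write(dir.resolve(i + ".jpg"), ("frame" + i).getBytes());
        }
        reverse(dir, 3);
        // 1.jpg now holds what was frame 3.
        System.out.println(new String(Files.readAllBytes(dir.resolve("1.jpg"))));
    }
}
```

After renaming, re-encode with something like `ffmpeg -framerate 25 -i %d.jpg out.mp4`; note the image round-trip re-compresses each frame, which the filter approach avoids.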
-
Stream H264 To Android Using FFMPEG
26 February 2013, by GroovyDotCom
I'm trying to stream a .ts file containing H.264 and AAC as an RTP stream to an Android device.
I tried:
.\ffmpeg -fflags +genpts -re -i 1.ts -vcodec copy -an -f rtp rtp://127.0.0.1:10000 -vn -acodec copy -f rtp rtp://127.0.0.1:20000 -newaudio
FFMPEG displays what should be in your SDP file; I copied this into an SDP file and tried playing it with VLC and FFPLAY. VLC plays the audio but just gives errors about bad NAL unit types for the video. FFPLAY doesn't play anything.
My best guess is that the FFMPEG H.264 RTP implementation is broken, or at least that it doesn't work in video pass-through mode (i.e. using -vcodec copy).
I need a fix for FFMPEG or an alternative simple open-source solution. I don't want to install FFMPEG in my Android client.
Thanks.