
Other articles (36)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out.
-
Accepted formats
28 January 2010
The following commands provide information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use:
h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
m4v: raw MPEG-4 video format
flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263
Theora
wmv:
Possible output video formats
To begin with, we (...)
-
Adding notes and captions to images
7 February 2011
To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area in order to change the rights to create, modify and delete notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...)
On other sites (8797)
-
flv : Ammon’s changes migrated from 0.6.0 - I believe for the android broadcaster.
25 August 2011, by Joseph Wecker
-
Raw audio stream of Android device mic to libgsm_ms file
2 September 2011, by MANN
I'm capturing an audio stream from the Android device mic, which I want to convert into a libgsm_ms encoded file, but without saving it to a file or converting it into some xyz format (a time and CPU utilization issue) and then reconverting it to the libgsm_ms standard. Please let me know how to achieve this.
Is it possible to feed this stream directly to ffmpeg?
-
Decoding audio via Android using FFMpeg
31 October 2013, by Rob Haupt
I can play Wav files using the below code without issues. When trying to play the exact same media in Mp3 format I only get garbled junk. I believe I am fundamentally misunderstanding how the avcodec_decode_audio3 function works.
Since the Wav file contains PCM data, once it is decoded it can go straight to the AudioTrack.write function. There must be some additional step to get Mp3s to work like this. I don't know what I'm missing, but I've been pulling my hair out for a week now. (See the note after the log output at the end of this post.)
Java Code
package com.rohaupt.RRD2;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.os.SystemClock;
public class player extends Activity
{
    private AudioTrack track;
    private FileOutputStream os;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        createEngine();
        MediaPlayer mp = new MediaPlayer();
        mp.start();
        int bufSize = AudioTrack.getMinBufferSize(32000,
                AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT);
        track = new AudioTrack(AudioManager.STREAM_MUSIC,
                32000,
                AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT,
                bufSize,
                AudioTrack.MODE_STREAM);
        byte[] bytes = new byte[bufSize];
        try {
            os = new FileOutputStream("/sdcard/a.out", false);
        } catch (FileNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        String result = loadFile("/sdcard/a.mp3", bytes);
        try {
            os.close();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }

    void playSound(byte[] buf, int size) {
        //android.util.Log.v("ROHAUPT", "RAH Playing");
        if (track.getPlayState() != AudioTrack.PLAYSTATE_PLAYING)
            track.play();
        track.write(buf, 0, size);
        try {
            os.write(buf, 0, size);
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }

    private native void createEngine();
    private native String loadFile(String file, byte[] array);

    /** Load jni .so on initialization */
    static {
        System.loadLibrary("avutil");
        System.loadLibrary("avcore");
        System.loadLibrary("avcodec");
        System.loadLibrary("avformat");
        System.loadLibrary("avdevice");
        System.loadLibrary("swscale");
        System.loadLibrary("avfilter");
        System.loadLibrary("ffmpeg");
    }
}
C Code
#include <jni.h>
#include <string.h>
#include <android/log.h>
#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"

#define DEBUG_TAG "ROHAUPT"

void Java_com_rohaupt_RRD2_player_createEngine(JNIEnv* env, jclass clazz)
{
    avcodec_init();
    av_register_all();
}

jstring Java_com_rohaupt_RRD2_player_loadFile(JNIEnv* env, jobject obj, jstring file, jbyteArray array)
{
    jboolean isCopy;
    int i;
    int audioStream = -1;
    int res;
    int decoded = 0;
    int out_size;
    AVFormatContext *pFormatCtx;
    AVCodecContext *aCodecCtx;
    AVCodecContext *c = NULL;
    AVCodec *aCodec;
    AVPacket packet;
    jclass cls = (*env)->GetObjectClass(env, obj);
    jmethodID play = (*env)->GetMethodID(env, cls, "playSound", "([BI)V"); //At the begining of your main function
    const char * szfile = (*env)->GetStringUTFChars(env, file, &isCopy);
    int16_t * pAudioBuffer = (int16_t *) av_malloc (AVCODEC_MAX_AUDIO_FRAME_SIZE*2+FF_INPUT_BUFFER_PADDING_SIZE);
    int16_t * outBuffer = (int16_t *) av_malloc (AVCODEC_MAX_AUDIO_FRAME_SIZE*2+FF_INPUT_BUFFER_PADDING_SIZE);

    __android_log_print(ANDROID_LOG_INFO, DEBUG_TAG, "RAH28 Starting");
    res = av_open_input_file(&pFormatCtx, szfile, NULL, 0, NULL);
    if (res != 0)
    {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH opening input failed with result: [%d]", res);
        return file;
    }

    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH getting stream info");
    res = av_find_stream_info(pFormatCtx);
    if (res < 0)
    {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH getting stream info failed with result: [%d]", res);
        return file;
    }

    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH getting audio stream");
    for (i = 0; i < pFormatCtx->nb_streams; i++) {
        if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO &&
            audioStream < 0) {
            audioStream = i;
        }
    }
    if (audioStream == -1)
    {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH couldn't find audio stream");
        return file;
    }
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio stream found with result: [%d]", res);

    aCodecCtx = pFormatCtx->streams[audioStream]->codec;
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec info loaded");
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec info [%d]", aCodecCtx->codec_id);
    aCodec = avcodec_find_decoder(aCodecCtx->codec_id);
    if (!aCodec) {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec unsupported");
        return file;
    }
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec info found");

    res = avcodec_open(aCodecCtx, aCodec);
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec loaded [%d] [%d]", aCodecCtx->sample_fmt, res);
    //c=avcodec_alloc_context();
    av_init_packet(&packet);
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH channels [%d] sample rate [%d] sample format [%d]", aCodecCtx->channels, aCodecCtx->sample_rate, aCodecCtx->sample_fmt);

    int x, y;
    x = 0; y = 0;
    while (av_read_frame(pFormatCtx, &packet) >= 0) {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH frame read: [%d] [%d]", x++, y);
        if (aCodecCtx->codec_type == AVMEDIA_TYPE_AUDIO) {
            __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio ready");
            int data_size = AVCODEC_MAX_AUDIO_FRAME_SIZE*2+FF_INPUT_BUFFER_PADDING_SIZE;
            int size = packet.size;
            y = 0;
            decoded = 0;
            __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH packet size: [%d]", size);
            while (size > 0) {
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH decoding: [%d] [%d]", x, y++);
                int len = avcodec_decode_audio3(aCodecCtx, pAudioBuffer, &data_size, &packet);
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 1 size [%d] len [%d] data_size [%d] out_size [%d]", size, len, data_size, out_size);
                jbyte *bytes = (*env)->GetByteArrayElements(env, array, NULL);
                memcpy(bytes + decoded, pAudioBuffer, len); //
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 2");
                (*env)->ReleaseByteArrayElements(env, array, bytes, 0);
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 3");
                (*env)->CallVoidMethod(env, obj, play, array, len);
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 4");
                size -= len;
                decoded += len;
            }
            av_free_packet(&packet);
        }
    }

    // Close the video file
    av_close_input_file(pFormatCtx);
    //__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH Finished Running result: [%d]", res);
    (*env)->ReleaseStringUTFChars(env, file, szfile);
    return file;
}
To add some detail: when calling this function with a Wav file I get the following log data
I/ROHAUPT ( 227): RAH28 Starting
D/ROHAUPT ( 227): RAH getting stream info
D/ROHAUPT ( 227): RAH getting audio stream
D/ROHAUPT ( 227): RAH audio stream found with result: [0]
D/ROHAUPT ( 227): RAH audio codec info loaded
D/ROHAUPT ( 227): RAH audio codec info [65536]
D/ROHAUPT ( 227): RAH audio codec info found
D/ROHAUPT ( 227): RAH audio codec loaded [1] [0]
D/ROHAUPT ( 227): RAH channels [2] sample rate [32000] sample format [1]
D/ROHAUPT ( 227): RAH frame read: [0] [0]
D/ROHAUPT ( 227): RAH audio ready
D/ROHAUPT ( 227): RAH packet size: [4096]
D/ROHAUPT ( 227): RAH decoding: [1] [0]
D/ROHAUPT ( 227): RAH 1 size [4096] len [4096] data_size [4096] out_size [0]
D/ROHAUPT ( 227): RAH 2
D/ROHAUPT ( 227): RAH 3
D/ROHAUPT ( 227): RAH 4
D/ROHAUPT ( 227): RAH frame read: [1] [1]
D/ROHAUPT ( 227): RAH audio ready
...
D/ROHAUPT ( 227): RAH frame read: [924] [1]
D/ROHAUPT ( 227): RAH audio ready
D/ROHAUPT ( 227): RAH packet size: [4096]
D/ROHAUPT ( 227): RAH decoding: [925] [0]
D/ROHAUPT ( 227): RAH 1 size [4096] len [4096] data_size [4096] out_size [0]
D/ROHAUPT ( 227): RAH 2
D/ROHAUPT ( 227): RAH 3
D/ROHAUPT ( 227): RAH 4
D/ROHAUPT ( 227): RAH frame read: [925] [1]
D/ROHAUPT ( 227): RAH audio ready
D/ROHAUPT ( 227): RAH packet size: [3584]
D/ROHAUPT ( 227): RAH decoding: [926] [0]
D/ROHAUPT ( 227): RAH 1 size [3584] len [3584] data_size [3584] out_size [0]
D/ROHAUPT ( 227): RAH 2
D/ROHAUPT ( 227): RAH 3
D/ROHAUPT ( 227): RAH 4
When calling with an Mp3 file I get the following
I/ROHAUPT ( 280): RAH28 Starting
D/ROHAUPT ( 280): RAH getting stream info
D/ROHAUPT ( 280): RAH getting audio stream
D/ROHAUPT ( 280): RAH audio stream found with result: [0]
D/ROHAUPT ( 280): RAH audio codec info loaded
D/ROHAUPT ( 280): RAH audio codec info [86017]
D/ROHAUPT ( 280): RAH audio codec info found
D/ROHAUPT ( 280): RAH audio codec loaded [1] [0]
D/ROHAUPT ( 280): RAH channels [2] sample rate [32000] sample format [1]
D/ROHAUPT ( 280): RAH frame read: [0] [0]
D/ROHAUPT ( 280): RAH audio ready
D/ROHAUPT ( 280): RAH packet size: [432]
D/ROHAUPT ( 280): RAH decoding: [1] [0]
D/ROHAUPT ( 280): RAH 1 size [432] len [432] data_size [4608] out_size [0]
D/ROHAUPT ( 280): RAH 2
...
D/ROHAUPT ( 280): RAH frame read: [822] [1]
D/ROHAUPT ( 280): RAH audio ready
D/ROHAUPT ( 280): RAH packet size: [432]
D/ROHAUPT ( 280): RAH decoding: [823] [0]
D/ROHAUPT ( 280): RAH 1 size [432] len [432] data_size [4608] out_size [0]
D/ROHAUPT ( 280): RAH 2
D/ROHAUPT ( 280): RAH 3
D/ROHAUPT ( 280): RAH 4
D/ROHAUPT ( 280): RAH frame read: [823] [1]
D/ROHAUPT ( 280): RAH audio ready
D/ROHAUPT ( 280): RAH packet size: [432]
D/ROHAUPT ( 280): RAH decoding: [824] [0]
D/ROHAUPT ( 280): RAH 1 size [432] len [432] data_size [4608] out_size [0]
D/ROHAUPT ( 280): RAH 2
D/ROHAUPT ( 280): RAH 3
D/ROHAUPT ( 280): RAH 4
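A likely explanation, for reference: with the legacy avcodec_decode_audio3() API, the return value (len above) is the number of compressed bytes consumed from the packet, while data_size is an in/out argument that must be reset to the capacity of the output buffer before every call and comes back holding the number of decoded PCM bytes written to it. For the Wav file the two coincide (len 4096, data_size 4096), but for the Mp3 file a 432-byte packet decodes to 4608 bytes of PCM, so copying and playing only len bytes would produce exactly this kind of garbled output. A minimal sketch of the usual decode loop follows; play_pcm() is a hypothetical stand-in for the JNI call back into playSound().
/* Sketch only: decode loop with the legacy avcodec_decode_audio3() API.
   play_pcm() is a hypothetical placeholder for the JNI call into playSound(). */
while (av_read_frame(pFormatCtx, &packet) >= 0) {
    if (packet.stream_index == audioStream) {
        AVPacket pkt_tmp = packet;           /* local copy, so the original packet can be freed intact */
        while (pkt_tmp.size > 0) {
            /* in/out: set to the output buffer capacity before every call,
               comes back as the number of decoded PCM bytes */
            int data_size = AVCODEC_MAX_AUDIO_FRAME_SIZE;
            int len = avcodec_decode_audio3(aCodecCtx, pAudioBuffer, &data_size, &pkt_tmp);
            if (len < 0)
                break;                       /* decode error: skip the rest of this packet */
            if (data_size > 0)
                play_pcm((uint8_t *) pAudioBuffer, data_size);  /* hand over data_size bytes, not len */
            pkt_tmp.data += len;             /* len is the number of compressed bytes consumed */
            pkt_tmp.size -= len;
        }
    }
    av_free_packet(&packet);
}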