
Media (91)
-
DJ Z-trip - Victory Lap : The Obama Mix Pt. 2
15 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Matmos - Action at a Distance
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
DJ Dolores - Oslodum 2004 (includes (cc) sample of “Oslodum” by Gilberto Gil)
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Danger Mouse & Jemini - What U Sittin’ On ? (starring Cee Lo and Tha Alkaholiks)
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Cornelius - Wataridori 2
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
The Rapture - Sister Saviour (Blackstrobe Remix)
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
Other articles (55)
-
(De)activating features (plugins)
18 February 2011, by
To manage the addition and removal of extra features (or plugins), MediaSPIP uses SVP as of version 0.2.
SVP makes it easy to activate plugins from the MediaSPIP configuration area.
To access it, simply go to the configuration area and then to the "Gestion des plugins" (plugin management) page.
By default, MediaSPIP ships with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work perfectly with each (...)
-
Enabling visitor registration
12 April 2011, by
It is also possible to enable visitor registration, which lets anyone open an account on the channel in question by themselves, for example in the context of open projects.
To do so, simply go to the site configuration area and choose the "Gestion des utilisateurs" (user management) submenu. The first form shown corresponds to this feature.
By default, MediaSPIP created during its initialization a menu item in the top menu of the page leading (...)
-
MediaSPIP: changing the rights for creating objects and publishing them definitively
11 November 2010, by
By default, MediaSPIP allows 5 types of objects to be created.
Also by default, the rights to create these objects and to publish them definitively are reserved for administrators, but they can of course be configured by the webmasters.
These rights are locked down for several reasons: because allowing publication must be the webmaster's decision rather than a platform-wide default, and because having an account can also serve other purposes, (...)
On other sites (10843)
-
Revision e278673c8e: Moved vp8dx_get_raw_frame() call to vp8_get_frame(). This change is necessary for
19 September 2012, by Scott LaVarnway
Changed Paths: Modify /vp8/vp8_dx_iface.c, Modify /vpxdec.c
Moved vp8dx_get_raw_frame() call to vp8_get_frame(). This change is necessary for the frame-based multithreading implementation. Since the postproc occurs in this call, vpxdec was modified to time around vpx_codec_get_frame(). Change-Id: (...)
-
Android UnsatisfiedLinkError using ffmpeg library
5 April 2012, by Bombastic
I'm trying to use the FFMPEG library in my Android application, like this example here: AFPlayer. I have a built and running version of that application and everything works fine, but when I tried to implement the same logic in my own application it crashes every time I start my media player. All I did was copy and paste all the files from afplayer's folders into my app (not sure if I have to do something else or if that's even the right way, but it's my first touch with the Android NDK and I'm not really sure how to do these things). Here is how I'm trying to start my DRadioMediaPlayer:

player = new DRadioMediaPlayer();
//player.setOnPreparedListener(this);
//player.setOnCompletionListener(this);
//player.setOnErrorListener(this);
//player.setOnBufferingUpdateListener(this);
//player.setAudioStreamType(AudioManager.STREAM_MUSIC);
try
{
    Uri source = Uri.parse(playlist.getCurrentSource().toString());
    player.setDataSource(source);
    //player.setDataSource(playlist.getCurrentSource().toString());
    player.prepare();
    playbackState = DRadioPlayerService.PLAYBACK_STATE_BUFFERING;
}

This is how I load the library:
static {
    System.loadLibrary("player");
}

and here is the exception which I'm getting:
04-05 17:04:37.478: W/dalvikvm(686): No implementation found for native Lcom/nimasystems/android/radio2/DRadioMediaPlayer;.n_createEngine ()V
04-05 17:04:37.478: D/AndroidRuntime(686): Shutting down VM
04-05 17:04:37.478: W/dalvikvm(686): threadid=1: thread exiting with uncaught exception (group=0x400259f8)
04-05 17:04:37.478: E/AndroidRuntime(686): FATAL EXCEPTION: main
04-05 17:04:37.478: E/AndroidRuntime(686): java.lang.UnsatisfiedLinkError: n_createEngine
04-05 17:04:37.478: E/AndroidRuntime(686): at com.nimasystems.android.radio2.DRadioMediaPlayer.n_createEngine(Native Method)
04-05 17:04:37.478: E/AndroidRuntime(686): at com.nimasystems.android.radio2.DRadioMediaPlayer.<init>(DRadioMediaPlayer.java:68)
04-05 17:04:37.478: E/AndroidRuntime(686): at com.nimasystems.android.player.service.DRadioPlayerService.initPlayer(DRadioPlayerService.java:577)
04-05 17:04:37.478: E/AndroidRuntime(686): at com.nimasystems.android.player.service.DRadioPlayerService.onStart(DRadioPlayerService.java:568)
04-05 17:04:37.478: E/AndroidRuntime(686): at android.app.Service.onStartCommand(Service.java:420)
04-05 17:04:37.478: E/AndroidRuntime(686): at android.app.ActivityThread.handleServiceArgs(ActivityThread.java:3267)
04-05 17:04:37.478: E/AndroidRuntime(686): at android.app.ActivityThread.access$3600(ActivityThread.java:135)
04-05 17:04:37.478: E/AndroidRuntime(686): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2211)
04-05 17:04:37.478: E/AndroidRuntime(686): at android.os.Handler.dispatchMessage(Handler.java:99)
04-05 17:04:37.478: E/AndroidRuntime(686): at android.os.Looper.loop(Looper.java:144)
04-05 17:04:37.478: E/AndroidRuntime(686): at android.app.ActivityThread.main(ActivityThread.java:4937)
04-05 17:04:37.478: E/AndroidRuntime(686): at java.lang.reflect.Method.invokeNative(Native Method)
04-05 17:04:37.478: E/AndroidRuntime(686): at java.lang.reflect.Method.invoke(Method.java:521)
04-05 17:04:37.478: E/AndroidRuntime(686): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:858)
04-05 17:04:37.478: E/AndroidRuntime(686): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)
04-05 17:04:37.478: E/AndroidRuntime(686): at dalvik.system.NativeStart.main(Native Method)
and here is the line where the exception is actually thrown:

public DRadioMediaPlayer() {
    Log.d(TAG, "Create new MediaPlayer");
    n_createEngine(); // here
}

public native void n_createEngine();

Any suggestions on what I can actually do to fix this problem? I've read all the questions with a similar issue here, but none of them worked for me.
Thanks in advance!
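
For context, the "No implementation found for native ..." warning right before the crash means Dalvik could not find a JNI symbol in the loaded library matching n_createEngine. A minimal sketch of the native-side function the runtime expects, assuming only the class and method names from the stack trace (and that System.loadLibrary("player") loads libplayer.so), would be:

#include <jni.h>
#include <android/log.h>

/* JNI name mangling: Java_<package with dots as underscores>_<Class>_<method>,
 * where an underscore inside the Java method name is escaped as "_1",
 * so n_createEngine maps to n_1createEngine. */
JNIEXPORT void JNICALL
Java_com_nimasystems_android_radio2_DRadioMediaPlayer_n_1createEngine(JNIEnv *env, jobject thiz)
{
    /* hypothetical body: initialize the ffmpeg/player engine here */
    __android_log_print(ANDROID_LOG_DEBUG, "player", "n_createEngine called");
}

If the library exports the function under any other name, or the natives are meant to be bound with RegisterNatives in JNI_OnLoad and that registration never runs, the first call to n_createEngine() fails with exactly this UnsatisfiedLinkError.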
-
Setting up audio queue for ffmpeg rtsp stream playing
20 November 2011, by illew
I'm working on an RTSP streaming (AAC format) client for iOS using ffmpeg. Right now I can only say my app is workable, but the streamed sound is very noisy and even a little distorted, far worse than when it's played by VLC or MPlayer.
The stream is read by av_read_frame() and decoded by avcodec_decode_audio3(). Then I just send the decoded raw audio to an Audio Queue.
When decoding a local AAC file with my app, the sound doesn't seem noisy at all. I know the initial encoding would dramatically affect the result, but at least I should try to make it sound like the other streaming clients...
Many parts of my implementation/modification actually came from trial and error. I believe I'm doing something wrong in setting up the Audio Queue, and in the callback function for filling the audio buffers.
Any hints, suggestions or help are greatly appreciated.
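
For reference, the read/decode path described above, using the pre-1.0 FFmpeg API that avcodec_decode_audio3() implies, would look roughly like the sketch below; fmt_ctx, codec_ctx, audio_stream_index and push_to_audio_ring() are assumed names, not taken from the question's code:

// Minimal sketch, old FFmpeg API (avcodec_decode_audio3 has since been removed).
AVPacket pkt;
int16_t samples[AVCODEC_MAX_AUDIO_FRAME_SIZE / sizeof(int16_t)];

while (av_read_frame(fmt_ctx, &pkt) >= 0) {
    if (pkt.stream_index == audio_stream_index) {
        int out_size = sizeof(samples);   // in: buffer capacity in bytes, out: decoded bytes
        int used = avcodec_decode_audio3(codec_ctx, samples, &out_size, &pkt);
        if (used >= 0 && out_size > 0) {
            // hand the raw PCM (plus pkt.dts) to the ring buffer that
            // getNextAudio() below drains for the Audio Queue
            push_to_audio_ring(samples, out_size, pkt.dts);
        }
        // a full implementation would keep decoding until the whole
        // packet is consumed, since used can be smaller than pkt.size
    }
    av_free_packet(&pkt);
}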
// — info of test materials dumped by av_dump_format() —
Metadata:
    title : /demo/test.3gp
Duration: 00:00:30.11, start: 0.000000, bitrate: N/A
    Stream #0:0: Audio: aac, 32000 Hz, stereo, s16
aac Advanced Audio Coding

// — the Audio Queue setup procedure —
- (void) startPlayback
{
OSStatus err = 0;
if(playState.playing) return;
playState.started = false;
if(!playState.queue)
{
UInt32 bufferSize;
playState.format.mSampleRate = _av->audio.sample_rate;
playState.format.mFormatID = kAudioFormatLinearPCM;
playState.format.mFormatFlags = kAudioFormatFlagsCanonical;
playState.format.mChannelsPerFrame = _av->audio.channels_per_frame;
playState.format.mBytesPerPacket = sizeof(AudioSampleType) *_av->audio.channels_per_frame;
playState.format.mBytesPerFrame = sizeof(AudioSampleType) *_av->audio.channels_per_frame;
playState.format.mBitsPerChannel = 8 * sizeof(AudioSampleType);
playState.format.mFramesPerPacket = 1;
playState.format.mReserved = 0;
pauseStart = 0;
DeriveBufferSize(playState.format,playState.format.mBytesPerPacket,BUFFER_DURATION,&bufferSize,&numPacketsToRead);
err= AudioQueueNewOutput(&playState.format, aqCallback, &playState, NULL, kCFRunLoopCommonModes, 0, &playState.queue);
if(err != 0)
{
printf("AQHandler.m startPlayback: Error creating new AudioQueue: %d \n", (int)err);
}
for(int i = 0 ; i < NUM_BUFFERS ; i ++)
{
err = AudioQueueAllocateBufferWithPacketDescriptions(playState.queue, bufferSize, numPacketsToRead , &playState.buffers[i]);
if(err != 0)
printf("AQHandler.m startPlayback: Error allocating buffer %d", i);
fillAudioBuffer(&playState,playState.queue, playState.buffers[i]);
}
}
startTime = mu_currentTimeInMicros();
err=AudioQueueStart(playState.queue, NULL);
if(err)
{
char sErr[4];
printf("AQHandler.m startPlayback: Could not start queue %ld %s.", err, FormatError(sErr,err));
playState.playing = NO;
}
else
{
AudioSessionSetActive(true);
playState.playing = YES;
}
}

// — callback for filling audio buffer —
static int ct = 0;
static void fillAudioBuffer(void *info,AudioQueueRef queue, AudioQueueBufferRef buffer)
{
int lengthCopied = INT32_MAX;
int dts= 0;
int isDone = 0;
buffer->mAudioDataByteSize = 0;
buffer->mPacketDescriptionCount = 0;
OSStatus err = 0;
AudioTimeStamp bufferStartTime;
AudioQueueGetCurrentTime(queue, NULL, &bufferStartTime, NULL);
PlayState *ps = (PlayState *)info;
if (!ps->started)
ps->started = true;
while(buffer->mPacketDescriptionCount < numPacketsToRead && lengthCopied > 0)
{
lengthCopied = getNextAudio(_av,
buffer->mAudioDataBytesCapacity-buffer->mAudioDataByteSize,
(uint8_t*)buffer->mAudioData + buffer->mAudioDataByteSize,
&dts,&isDone);
ct+= lengthCopied;
if(lengthCopied < 0 || isDone)
{
printf("nothing to read....\n\n");
PlayState *ps = (PlayState *)info;
ps->finished = true;
ps->started = false;
break;
}
if(aqStartDts < 0) aqStartDts = dts;
if(buffer->mPacketDescriptionCount ==0)
{
bufferStartTime.mFlags = kAudioTimeStampSampleTimeValid;
bufferStartTime.mSampleTime = (Float64)(dts-aqStartDts);//* _av->audio.frame_size;
if (bufferStartTime.mSampleTime <0 )
bufferStartTime.mSampleTime = 0;
printf("AQHandler.m fillAudioBuffer: DTS for %x: %lf time base: %lf StartDTS: %d\n",
(unsigned int)buffer,
bufferStartTime.mSampleTime,
_av->audio.time_base,
aqStartDts);
}
buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mStartOffset = buffer->mAudioDataByteSize;
buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mDataByteSize = lengthCopied;
buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mVariableFramesInPacket = 0;
buffer->mPacketDescriptionCount++;
buffer->mAudioDataByteSize += lengthCopied;
}
int audioBufferCount, audioBufferTotal, videoBufferCount, videoBufferTotal;
bufferCheck(_av,&videoBufferCount, &videoBufferTotal, &audioBufferCount, &audioBufferTotal);
if(buffer->mAudioDataByteSize)
{
err = AudioQueueEnqueueBufferWithParameters(queue, buffer, 0, NULL, 0, 0, 0, NULL, &bufferStartTime, NULL);
if(err)
{
char sErr[10];
printf("AQHandler.m fillAudioBuffer: Could not enqueue buffer 0x%x: %d %s.", buffer, err, FormatError(sErr, err));
}
}
}
int getNextAudio(video_data_t* vInst, int maxlength, uint8_t* buf, int* pts, int* isDone)
{
struct video_context_t *ctx = vInst->context;
int datalength = 0;
while(ctx->audio_ring.lock || (ctx->audio_ring.count <= 0 && ((ctx->play_state & STATE_DIE) != STATE_DIE)))
{
if (ctx->play_state & STATE_EOF) return -1;
usleep(100);
}
*pts = 0;
ctx->audio_ring.lock = kLocked;
if(ctx->audio_ring.count>0 && maxlength > ctx->audio_buffer[ctx->audio_ring.read].size)
{
memcpy(buf, ctx->audio_buffer[ctx->audio_ring.read].data,ctx->audio_buffer[ctx->audio_ring.read].size);
*pts = ctx->audio_buffer[ctx->audio_ring.read].pts;
datalength = ctx->audio_buffer[ctx->audio_ring.read].size;
ctx->audio_ring.read++;
ctx->audio_ring.read %= ABUF_SIZE;
ctx->audio_ring.count--;
}
ctx->audio_ring.lock = kUnlocked;
if((ctx->play_state & STATE_EOF) == STATE_EOF && ctx->audio_ring.count == 0) *isDone = 1;
return datalength;
}
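
One helper the code above calls but does not show is DeriveBufferSize(). A plain-C sketch modelled on the helper in Apple's Audio Queue Services Programming Guide (the 0x50000/0x4000 limits are Apple's example values, not taken from the question) looks like this:

static void DeriveBufferSize(AudioStreamBasicDescription ASBDesc,
                             UInt32  maxPacketSize,
                             Float64 seconds,
                             UInt32  *outBufferSize,
                             UInt32  *outNumPacketsToRead)
{
    static const int maxBufferSize = 0x50000;   // 320 KB upper bound
    static const int minBufferSize = 0x4000;    // 16 KB lower bound

    if (ASBDesc.mFramesPerPacket != 0) {
        // enough packets to cover `seconds` of audio
        Float64 numPacketsForTime = ASBDesc.mSampleRate / ASBDesc.mFramesPerPacket * seconds;
        *outBufferSize = numPacketsForTime * maxPacketSize;
    } else {
        // packet duration unknown: fall back to one large buffer
        *outBufferSize = maxBufferSize > maxPacketSize ? maxBufferSize : maxPacketSize;
    }

    if (*outBufferSize > maxBufferSize && *outBufferSize > maxPacketSize)
        *outBufferSize = maxBufferSize;
    else if (*outBufferSize < minBufferSize)
        *outBufferSize = minBufferSize;

    *outNumPacketsToRead = *outBufferSize / maxPacketSize;
}

For the linear-PCM format built in startPlayback (mFramesPerPacket = 1 and a constant mBytesPerPacket), this simply makes each buffer hold about BUFFER_DURATION seconds of samples and sets numPacketsToRead to match.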