
Other articles (99)
-
MediaSPIP 0.1 Beta version
25 April 2011. MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here contains only the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
-
Encoding and transformation into formats readable on the Internet
10 April 2011. MediaSPIP transforms and re-encodes uploaded documents in order to make them readable on the Internet and automatically usable without any intervention from the content creator.
Videos are automatically encoded into the formats supported by HTML5: MP4, OGV and WebM. The "MP4" version is also used for the fallback Flash player needed by older browsers.
Audio documents are likewise re-encoded into the two formats usable with HTML5: MP3 and Ogg. The "MP3" version (...)
On other sites (10003)
-
Decoding audio via Android using FFMpeg
31 October 2013, by Rob Haupt. I can play WAV files using the code below without issues. When trying to play the exact same media in MP3 format I only get garbled junk. I believe I am fundamentally misunderstanding how the avcodec_decode_audio3 function works.
Since the WAV file contains PCM data, when it is decoded it can go straight to the AudioTrack.write function. There must be some additional step to get MP3s to work like this. I don't know what I'm missing, but I've been pulling my hair out for a week now.
Java Code
package com.rohaupt.RRD2;

import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.os.SystemClock;

public class player extends Activity
{
    private AudioTrack track;
    private FileOutputStream os;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        createEngine();
        MediaPlayer mp = new MediaPlayer();
        mp.start();
        int bufSize = AudioTrack.getMinBufferSize(32000,
                AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT);
        track = new AudioTrack(AudioManager.STREAM_MUSIC,
                32000,
                AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT,
                bufSize,
                AudioTrack.MODE_STREAM);
        byte[] bytes = new byte[bufSize];
        try {
            os = new FileOutputStream("/sdcard/a.out", false);
        } catch (FileNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        String result = loadFile("/sdcard/a.mp3", bytes);
        try {
            os.close();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }

    void playSound(byte[] buf, int size) {
        //android.util.Log.v("ROHAUPT", "RAH Playing");
        if (track.getPlayState() != AudioTrack.PLAYSTATE_PLAYING)
            track.play();
        track.write(buf, 0, size);
        try {
            os.write(buf, 0, size);
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }

    private native void createEngine();
    private native String loadFile(String file, byte[] array);

    /** Load jni .so on initialization */
    static {
        System.loadLibrary("avutil");
        System.loadLibrary("avcore");
        System.loadLibrary("avcodec");
        System.loadLibrary("avformat");
        System.loadLibrary("avdevice");
        System.loadLibrary("swscale");
        System.loadLibrary("avfilter");
        System.loadLibrary("ffmpeg");
    }
}

C Code
#include <jni.h>
#include <string.h>
#include <stdio.h>
#include <android/log.h>
#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"

#define DEBUG_TAG "ROHAUPT"

void Java_com_rohaupt_RRD2_player_createEngine(JNIEnv* env, jclass clazz)
{
    avcodec_init();
    av_register_all();
}
jstring Java_com_rohaupt_RRD2_player_loadFile(JNIEnv* env, jobject obj, jstring file, jbyteArray array)
{
    jboolean isCopy;
    int i;
    int audioStream = -1;
    int res;
    int decoded = 0;
    int out_size;
    AVFormatContext *pFormatCtx;
    AVCodecContext *aCodecCtx;
    AVCodecContext *c = NULL;
    AVCodec *aCodec;
    AVPacket packet;
    jclass cls = (*env)->GetObjectClass(env, obj);
    jmethodID play = (*env)->GetMethodID(env, cls, "playSound", "([BI)V"); // at the beginning of your main function
    const char *szfile = (*env)->GetStringUTFChars(env, file, &isCopy);
    int16_t *pAudioBuffer = (int16_t *) av_malloc(AVCODEC_MAX_AUDIO_FRAME_SIZE * 2 + FF_INPUT_BUFFER_PADDING_SIZE);
    int16_t *outBuffer = (int16_t *) av_malloc(AVCODEC_MAX_AUDIO_FRAME_SIZE * 2 + FF_INPUT_BUFFER_PADDING_SIZE);

    __android_log_print(ANDROID_LOG_INFO, DEBUG_TAG, "RAH28 Starting");
    res = av_open_input_file(&pFormatCtx, szfile, NULL, 0, NULL);
    if (res != 0)
    {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH opening input failed with result: [%d]", res);
        return file;
    }

    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH getting stream info");
    res = av_find_stream_info(pFormatCtx);
    if (res < 0)
    {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH getting stream info failed with result: [%d]", res);
        return file;
    }

    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH getting audio stream");
    for (i = 0; i < pFormatCtx->nb_streams; i++) {
        if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO &&
            audioStream < 0) {
            audioStream = i;
        }
    }
    if (audioStream == -1)
    {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH couldn't find audio stream");
        return file;
    }
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio stream found with result: [%d]", res);

    aCodecCtx = pFormatCtx->streams[audioStream]->codec;
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec info loaded");
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec info [%d]", aCodecCtx->codec_id);

    aCodec = avcodec_find_decoder(aCodecCtx->codec_id);
    if (!aCodec) {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec unsupported");
        return file;
    }
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec info found");

    res = avcodec_open(aCodecCtx, aCodec);
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec loaded [%d] [%d]", aCodecCtx->sample_fmt, res);
    //c = avcodec_alloc_context();
    av_init_packet(&packet);
    __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH channels [%d] sample rate [%d] sample format [%d]", aCodecCtx->channels, aCodecCtx->sample_rate, aCodecCtx->sample_fmt);

    int x, y;
    x = 0; y = 0;
    while (av_read_frame(pFormatCtx, &packet) >= 0) {
        __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH frame read: [%d] [%d]", x++, y);
        if (aCodecCtx->codec_type == AVMEDIA_TYPE_AUDIO) {
            __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio ready");
            int data_size = AVCODEC_MAX_AUDIO_FRAME_SIZE * 2 + FF_INPUT_BUFFER_PADDING_SIZE;
            int size = packet.size;
            y = 0;
            decoded = 0;
            __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH packet size: [%d]", size);
            while (size > 0) {
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH decoding: [%d] [%d]", x, y++);
                int len = avcodec_decode_audio3(aCodecCtx, pAudioBuffer, &data_size, &packet);
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 1 size [%d] len [%d] data_size [%d] out_size [%d]", size, len, data_size, out_size);
                jbyte *bytes = (*env)->GetByteArrayElements(env, array, NULL);
                memcpy(bytes + decoded, pAudioBuffer, len);
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 2");
                (*env)->ReleaseByteArrayElements(env, array, bytes, 0);
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 3");
                (*env)->CallVoidMethod(env, obj, play, array, len);
                __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 4");
                size -= len;
                decoded += len;
            }
            av_free_packet(&packet);
        }
    }

    // Close the input file
    av_close_input_file(pFormatCtx);
    //__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH Finished Running result: [%d]", res);
    (*env)->ReleaseStringUTFChars(env, file, szfile);
    return file;
}

To add some detail: when calling this function with a WAV file, I get the following log data.
I/ROHAUPT ( 227): RAH28 Starting
D/ROHAUPT ( 227): RAH getting stream info
D/ROHAUPT ( 227): RAH getting audio stream
D/ROHAUPT ( 227): RAH audio stream found with result: [0]
D/ROHAUPT ( 227): RAH audio codec info loaded
D/ROHAUPT ( 227): RAH audio codec info [65536]
D/ROHAUPT ( 227): RAH audio codec info found
D/ROHAUPT ( 227): RAH audio codec loaded [1] [0]
D/ROHAUPT ( 227): RAH channels [2] sample rate [32000] sample format [1]
D/ROHAUPT ( 227): RAH frame read: [0] [0]
D/ROHAUPT ( 227): RAH audio ready
D/ROHAUPT ( 227): RAH packet size: [4096]
D/ROHAUPT ( 227): RAH decoding: [1] [0]
D/ROHAUPT ( 227): RAH 1 size [4096] len [4096] data_size [4096] out_size [0]
D/ROHAUPT ( 227): RAH 2
D/ROHAUPT ( 227): RAH 3
D/ROHAUPT ( 227): RAH 4
D/ROHAUPT ( 227): RAH frame read: [1] [1]
D/ROHAUPT ( 227): RAH audio ready
...
D/ROHAUPT ( 227): RAH frame read: [924] [1]
D/ROHAUPT ( 227): RAH audio ready
D/ROHAUPT ( 227): RAH packet size: [4096]
D/ROHAUPT ( 227): RAH decoding: [925] [0]
D/ROHAUPT ( 227): RAH 1 size [4096] len [4096] data_size [4096] out_size [0]
D/ROHAUPT ( 227): RAH 2
D/ROHAUPT ( 227): RAH 3
D/ROHAUPT ( 227): RAH 4
D/ROHAUPT ( 227): RAH frame read: [925] [1]
D/ROHAUPT ( 227): RAH audio ready
D/ROHAUPT ( 227): RAH packet size: [3584]
D/ROHAUPT ( 227): RAH decoding: [926] [0]
D/ROHAUPT ( 227): RAH 1 size [3584] len [3584] data_size [3584] out_size [0]
D/ROHAUPT ( 227): RAH 2
D/ROHAUPT ( 227): RAH 3
D/ROHAUPT ( 227): RAH 4
When calling with an MP3 file I get the following:
I/ROHAUPT ( 280): RAH28 Starting
D/ROHAUPT ( 280): RAH getting stream info
D/ROHAUPT ( 280): RAH getting audio stream
D/ROHAUPT ( 280): RAH audio stream found with result: [0]
D/ROHAUPT ( 280): RAH audio codec info loaded
D/ROHAUPT ( 280): RAH audio codec info [86017]
D/ROHAUPT ( 280): RAH audio codec info found
D/ROHAUPT ( 280): RAH audio codec loaded [1] [0]
D/ROHAUPT ( 280): RAH channels [2] sample rate [32000] sample format [1]
D/ROHAUPT ( 280): RAH frame read: [0] [0]
D/ROHAUPT ( 280): RAH audio ready
D/ROHAUPT ( 280): RAH packet size: [432]
D/ROHAUPT ( 280): RAH decoding: [1] [0]
D/ROHAUPT ( 280): RAH 1 size [432] len [432] data_size [4608] out_size [0]
D/ROHAUPT ( 280): RAH 2
...
D/ROHAUPT ( 280): RAH frame read: [822] [1]
D/ROHAUPT ( 280): RAH audio ready
D/ROHAUPT ( 280): RAH packet size: [432]
D/ROHAUPT ( 280): RAH decoding: [823] [0]
D/ROHAUPT ( 280): RAH 1 size [432] len [432] data_size [4608] out_size [0]
D/ROHAUPT ( 280): RAH 2
D/ROHAUPT ( 280): RAH 3
D/ROHAUPT ( 280): RAH 4
D/ROHAUPT ( 280): RAH frame read: [823] [1]
D/ROHAUPT ( 280): RAH audio ready
D/ROHAUPT ( 280): RAH packet size: [432]
D/ROHAUPT ( 280): RAH decoding: [824] [0]
D/ROHAUPT ( 280): RAH 1 size [432] len [432] data_size [4608] out_size [0]
D/ROHAUPT ( 280): RAH 2
D/ROHAUPT ( 280): RAH 3
D/ROHAUPT ( 280): RAH 4
-
WebRTC predictions for 2016
17 February 2016, by silvia. I wrote these predictions in the first week of January and meant to publish them as encouragement to think about where WebRTC still needs some work. I'd like to be able to compare the state of WebRTC in the browser a year from now. Therefore, without further ado, here are my thoughts.
WebRTC Browser support
I’m quite optimistic when it comes to browser support for WebRTC. We have seen Edge bring in initial support last year and Apple looking to hire engineers to implement WebRTC. My prediction is that we will see the following developments in 2016:
- Edge will become interoperable with Chrome and Firefox, i.e. it will publish VP8/VP9 and H.264/H.265 support
- Firefox of course continues to support both VP8/VP9 and H.264/H.265
- Chrome will follow the spec and implement H.264/H.265 support (to add to their already existing VP8/VP9 support)
- Safari will enter the WebRTC space but only with H.264/H.265 support
Codec Observations
With Edge and Safari entering the WebRTC space, there will be a larger focus on H.264/H.265. It will help with creating interoperability between the browsers.
However, since there are so many flavours of H.264/H.265, I expect that when different browsers are used at different endpoints, we will get poor quality video calls because of having to negotiate a common denominator. Certainly, baseline will work interoperably, but better encoding quality and lower bandwidth will only be achieved if all endpoints use the same browser.
Thus, we will get to the funny situation where we buy ourselves interoperability at the cost of video quality and bandwidth. I’d call that a “degree of interoperability” and not the best possible outcome.
I’m going to go out on a limb and say that, at this stage, Google will strongly consider improving the case for VP8/VP9 by improving its bandwidth adaptability: I think they will buy themselves some SVC capability and make VP9 the best quality codec for live video conferencing. Thus, when Safari eventually follows the standard and also implements VP8/VP9 support, the interoperability win of H.264/H.265 will turn out to be only temporary, overshadowed by the vastly better video quality of VP9.
The Enterprise Boundary
Like all video conferencing technology, WebRTC has a hard time dealing with the corporate boundary: firewalls and proxies get in the way of setting up video connections from within an enterprise to people outside.
The telco world has come up with the concept of the SBC (session border controller). SBCs come packed with functionality to deal with security, signalling protocol translation, Quality of Service policing, regulatory requirements, statistics, billing, and even media services like transcoding.
SBCs are total overkill for a world where a large number of Web applications simply want to add a WebRTC feature – probably mostly to provide a video or audio customer support service, but it could be a live training session with call-in, or an interest group conference call.
We cannot install a custom SBC solution for every WebRTC service provider in every enterprise. That’s like saying we need a custom Web proxy for every Web server. It doesn’t scale.
Cloud services thrive on their ability to sell directly to an individual in an organisation on their credit card without that individual having to ask their IT department to put special rules in place. WebRTC will not make progress in the corporate environment unless this is fixed.
We need a solution that allows all WebRTC services to get through an enterprise firewall and enterprise proxy. I think the WebRTC standards have done pretty well with firewalls and connecting to a TURN server on port 443 will do the trick most of the time. But enterprise proxies are the next frontier.
What it takes is some kind of media packet forwarding service that sits on the firewall or in a proxy and allows WebRTC media packets through – maybe with some configuration that is necessary in the browsers or the Web app to add this service as another type of TURN server.
I don’t have a full understanding of the problems involved, but I think such a solution is vital before WebRTC can go mainstream. I expect that this year we will see some clever people coming up with a solution for this and a new type of product will be born and rolled out to enterprises around the world.
Summary
So these are my predictions. In summary, they address the key areas where I think WebRTC still has to make progress : interoperability between browsers, video quality at low bitrates, and the enterprise boundary. I’m really curious to see where we stand with these a year from now.
—
It’s worth mentioning Philipp Hancke’s tweet reply to my post :
https://datatracker.ietf.org/doc/draft-ietf-rtcweb-return/ … — we saw some clever people come up with a solution already. Now it needs to be implemented
-
Web Analytics Reports : 10 Key Types and How to Use Them
29 January 2024, by Erin