
Advanced search
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (61)
-
Participate in its translation
10 April 2011
You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
To do so, we use the SPIP translation interface, where all of MediaSPIP's language modules are available. You simply need to subscribe to the translators' mailing list to ask for more information.
Currently, MediaSPIP is only available in French and (...) -
(De)Activating features (plugins)
18 February 2011
To manage adding and removing extra features (plugins), MediaSPIP relies on SVP as of version 0.2.
SVP makes it easy to activate plugins from the MediaSPIP configuration area.
To access it, go to the configuration area and open the "Plugin management" page.
By default, MediaSPIP ships with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work seamlessly with each (...) -
The farm's regular Cron tasks
1 December 2010
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance in the farm on a regular basis. Combined with a system Cron on the central site of the farm, it generates regular visits to the various sites and keeps the tasks of rarely visited sites from falling too (...)
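As a rough illustration (the host names and the cron endpoint path below are hypothetical placeholders, not MediaSPIP's actual internals), a "super cron" of this kind boils down to requesting each instance's cron-trigger URL on a schedule, so that even rarely visited sites get their queued tasks run:

```python
# Hypothetical sketch of a farm-wide "super cron": visit each instance's
# cron-trigger URL so rarely visited sites still run their scheduled tasks.
# The instance list and endpoint path are made-up placeholders.
from urllib.request import urlopen
from urllib.error import URLError

INSTANCES = ["site-a.example.org", "site-b.example.org"]

def cron_urls(instances, endpoint="spip.php?action=cron"):
    """Build the cron-trigger URL for every instance of the farm."""
    return ["https://%s/%s" % (host, endpoint) for host in instances]

def run_super_cron(instances):
    """Request each instance's cron URL; a plain GET is enough to
    simulate a visit and wake up that instance's task queue."""
    for url in cron_urls(instances):
        try:
            urlopen(url, timeout=10).read(0)
        except URLError:
            pass  # an unreachable instance should not block the others
```

A system cron on the central site would then call `run_super_cron(INSTANCES)` once a minute.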
On other sites (9422)
-
Can you "stream" images to ffmpeg to construct a video, instead of saving them to disk?
8 July 2016, by Brandon
My work recently involves programmatically making videos. In Python, the typical workflow looks something like this:
import subprocess, Image, ImageDraw

for i in range(frames_per_second * video_duration_seconds):
    img = createFrame(i)
    img.save("%07d.png" % i)

subprocess.call(["ffmpeg", "-y", "-r", str(frames_per_second), "-i", "%07d.png",
                 "-vcodec", "mpeg4", "-qscale", "5", "-r", str(frames_per_second),
                 "video.avi"])
This workflow creates an image for each frame in the video and saves it to disk. After all the images have been saved, ffmpeg is called to construct a video from them.
Saving the images to disk (not the creation of the images in memory) consumes the majority of the cycles here, and does not appear to be necessary. Is there some way to perform the same function, but without saving the images to disk? So ffmpeg would be called, and the images would be constructed and fed to it immediately after being created.
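One common approach (a sketch under assumptions, not the asker's code: the frame size, frame rate, and the flat-gray placeholder frames are invented for illustration) is to start ffmpeg reading raw frames from stdin with `-i -`, and write each frame to the subprocess pipe instead of to a file:

```python
# Sketch: stream raw RGB frames to ffmpeg over stdin instead of saving
# PNGs to disk. WIDTH/HEIGHT/FPS and the gradient "frames" are placeholders.
import subprocess

WIDTH, HEIGHT, FPS = 320, 240, 24

def ffmpeg_pipe_cmd(width, height, fps, out="video.avi"):
    """ffmpeg invocation that reads raw RGB24 frames from stdin ('-i -')."""
    return ["ffmpeg", "-y",
            "-f", "rawvideo", "-pix_fmt", "rgb24",
            "-s", "%dx%d" % (width, height), "-r", str(fps),
            "-i", "-",                      # stdin instead of %07d.png
            "-vcodec", "mpeg4", "-qscale", "5", out]

def make_frame(i, width=WIDTH, height=HEIGHT):
    """Placeholder frame: a flat gray level that varies with i."""
    return bytes([i % 256]) * (width * height * 3)

if __name__ == "__main__":
    proc = subprocess.Popen(ffmpeg_pipe_cmd(WIDTH, HEIGHT, FPS),
                            stdin=subprocess.PIPE)
    for i in range(FPS * 2):                # two seconds of video
        proc.stdin.write(make_frame(i))     # no intermediate files on disk
    proc.stdin.close()
    proc.wait()
```

With PIL images you could instead keep PNG encoding by using `-f image2pipe -vcodec png -i -` and calling `img.save(proc.stdin, "PNG")` for each frame.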
-
I wonder why logcat says "NO SUCH A FILE OR DIRECTORY(2)"
22 September 2013, by autoexebat
I want to play audio on Android with ffmpeg.
But when I run this project, an error occurs. What should I do?
Java side:
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.os.SystemClock;

public class FFmpegBasic extends Activity
{
    private AudioTrack track;
    private FileOutputStream os;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        createEngine();
        int bufSize = AudioTrack.getMinBufferSize(44100,
                AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT);
        track = new AudioTrack(AudioManager.STREAM_MUSIC,
                44100,
                AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT,
                bufSize,
                AudioTrack.MODE_STREAM);
        byte[] bytes = new byte[bufSize];
        try {
            os = new FileOutputStream("/sdcard/a.out", false);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        String result = "/mnt/sdcard/Wildlife.mp3";
        loadFile(result, bytes);
        try {
            os.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    void playSound(byte[] buf, int size) {
        if (track.getPlayState() != AudioTrack.PLAYSTATE_PLAYING)
            track.play();
        track.write(buf, 0, size);
        try {
            os.write(buf, 0, size);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private native void createEngine();
    // Returns int so the native side can report its error codes.
    private native int loadFile(String file, byte[] array);

    /** Load the JNI .so libraries on initialization. */
    static {
        System.loadLibrary("avutil");
        System.loadLibrary("avcore");
        System.loadLibrary("avcodec");
        System.loadLibrary("avformat");
        System.loadLibrary("avdevice");
        System.loadLibrary("swscale");
        System.loadLibrary("avfilter");
        System.loadLibrary("ffmpeg");
        System.loadLibrary("basicplayer");
    }
}
C side:
#include <jni.h>
#include <string.h>
#include <stdio.h>
#include <android/log.h>
#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"
void Java_net_jbong_FFmpegBasic_FFmpegBasic_createEngine(JNIEnv* env, jclass clazz)
{
    //avcodec_init();
    av_register_all();
}

jint Java_net_jbong_FFmpegBasic_FFmpegBasic_loadFile(JNIEnv* env, jobject obj, jstring file, jbyteArray array)
{
    AVFormatContext *gFormatCtx = NULL;
    AVCodecContext *gAudioCodecCtx = NULL;
    AVCodec *gAudioCodec = NULL;
    int gAudioStreamIdx = -1;
    char *gAudioBuffer = NULL;
    int i, outsize = 0;
    AVPacket packet;
    const char *str;
    str = (*env)->GetStringUTFChars(env, file, NULL);
    jclass cls = (*env)->GetObjectClass(env, obj);
    jmethodID play = (*env)->GetMethodID(env, cls, "playSound", "([BI)V");
    if (gFormatCtx != NULL)
        return -1;
    if (av_open_input_file(&gFormatCtx, str, NULL, 0, NULL) != 0)
        return -2;
    if (av_find_stream_info(gFormatCtx) < 0)
        return -3;
    for (i = 0; i < gFormatCtx->nb_streams; i++)
    {
        if (gFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO)
        {
            gAudioStreamIdx = i;
            break;
        }
    }
    if (gAudioStreamIdx == -1)
        return -4;
    gAudioCodecCtx = gFormatCtx->streams[gAudioStreamIdx]->codec;
    gAudioCodec = avcodec_find_decoder(gAudioCodecCtx->codec_id);
    if (gAudioCodec == NULL)
        return -5;
    if (avcodec_open(gAudioCodecCtx, gAudioCodec) < 0)
        return -6;
    gAudioBuffer = (char *)av_malloc(AVCODEC_MAX_AUDIO_FRAME_SIZE * 2);
    int decode = 0;
    while (av_read_frame(gFormatCtx, &packet) >= 0)
    {
        if (gFormatCtx->streams[packet.stream_index]->codec->codec_type == AVMEDIA_TYPE_AUDIO)
        {
            int data_size = AVCODEC_MAX_AUDIO_FRAME_SIZE * 8;
            gAudioBuffer = (char *)av_malloc(data_size);
            int size = packet.size;
            while (size > 0)
            {
                int len = avcodec_decode_audio3(gAudioCodecCtx,
                        (short *)gAudioBuffer, &data_size, &packet);
                if (data_size > 0)
                {
                    jbyte *bytes = (*env)->GetByteArrayElements(env, array, NULL);
                    memcpy(bytes + decode, (int16_t *)gAudioBuffer, size);
                    (*env)->ReleaseByteArrayElements(env, array, bytes, 0);
                    (*env)->CallVoidMethod(env, obj, play, array, data_size);
                    decode += size;
                    size -= len;
                }
            }
        }
        av_free_packet(&packet);
    }
    av_close_input_file(gFormatCtx);
    return 0;
}
Why does my Android logcat show me this message?
"error opening trace file: No such file or directory (2)" -
Trying to compile x264 and ffmpeg for iPhone - "missing required architecture arm in file"
11 February 2013, by jtrim
I'm trying to compile x264 for use in an iPhone application. I see there are instructions on how to compile ffmpeg for use on the platform here: http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-October/076618.html , but I can't seem to find anything this complete for compiling x264 on the iPhone. I've found this source tree: http://gitorious.org/x264-arm that seems to have support for the ARM platform.
Here is my config line:
./configure --cross-prefix=/usr/bin/ --host=arm-apple-darwin10 --extra-cflags="-B /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.2.sdk/usr/lib/ -I /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.2.sdk/usr/lib/"
...and inside
configure
I'm using the gas-preprocessor script (first link above) as my assembler: gas-preprocessor.pl gcc
When I start compiling, it chunks away for a little while, then it spits out these warnings and a huge list of undefined symbols:
ld: warning: option -s is obsolete and being ignored
ld: warning: -force_cpusubtype_ALL will become unsupported for ARM architectures
ld: warning: in /usr/lib/crt1.o, missing required architecture arm in file
ld: warning: in /usr/X11R6/lib/libX11.dylib, missing required architecture arm in file
ld: warning: in /usr/lib/libm.dylib, missing required architecture arm in file
ld: warning: in /usr/lib/libpthread.dylib, missing required architecture arm in file
ld: warning: in /usr/lib/libgcc_s.1.dylib, missing required architecture arm in file
ld: warning: in /usr/lib/libSystem.dylib, missing required architecture arm in file
Undefined symbols:
My guess would be that the problem has to do with the "missing required architecture arm in file" warning... any ideas?