
Other articles (101)
-
HTML5 audio and video support
13 April 2011 — MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
Contribute to translation
13 April 2011 — You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows it to spread to new linguistic communities.
To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...) -
Prerequisites for installation
31 January 2010 — Preamble
This article is not intended to detail the installation of these software packages, but rather to provide information on their specific configuration.
First of all, SPIPMotion, like MediaSPIP, is designed to run on Debian-type Linux distributions or their derivatives (Ubuntu, etc.). The documentation on this site therefore refers to these distributions. It is also possible to use it on other Linux distributions, but correct operation cannot be guaranteed.
It (...)
On other sites (6660)
-
ffmpeg with VDA on OS X
13 May 2014, by user2618420 — I am trying to enable hardware decoding in a project that previously decoded with ffmpeg in software.
ffmpeg has support for hardware-accelerated decoding.
Here is the documentation that I followed: https://github.com/dilaroga/ffmpeg-vda/wiki/FFmpeg-vda-usage
My code:
enum AVPixelFormat myGetFormatCallback(struct AVCodecContext *ctx, const enum AVPixelFormat *fmt)
{
    struct vda_context *vda_ctx;
    vda_ctx = (struct vda_context *)malloc(sizeof(vda_context));
    vda_ctx->decoder = NULL;
    vda_ctx->width = ctx->width;
    vda_ctx->height = ctx->height;
    vda_ctx->format = 'avc1';
    vda_ctx->use_ref_buffer = 1;
    switch (ctx->pix_fmt) {
        case PIX_FMT_UYVY422: {
            vda_ctx->cv_pix_fmt_type = '2vuy';
            break;
        }
        case PIX_FMT_YUYV422: {
            vda_ctx->cv_pix_fmt_type = 'yuvs';
            break;
        }
        case PIX_FMT_NV12: {
            vda_ctx->cv_pix_fmt_type = '420v';
            break;
        }
        case PIX_FMT_YUV420P: {
            vda_ctx->cv_pix_fmt_type = 'y420';
            break;
        }
        default: {
            av_log(ctx, AV_LOG_ERROR, "Unsupported pixel format: %d\n", ctx->pix_fmt);
            Logger::debug(LOG_LEVEL_ERROR, "Unsupported pixel format: %d", ctx->pix_fmt);
            throw AbstractException("Unsupported pixel format");
        }
    }
    int status = ff_vda_create_decoder(vda_ctx, (unsigned char *)ctx->extradata, ctx->extradata_size);
    if (status) {
        Logger::debug(LOG_LEVEL_ERROR, "Error create VDA decoder");
        throw AbstractException("Error create VDA decoder");
    } else {
        ctx->hwaccel_context = vda_ctx;
    }
    return ctx->pix_fmt;
}

static void release_vda_context(void *opaque, uint8_t *data)
{
    vda_buffer_context *vda_context = (vda_buffer_context *)opaque;
    av_free(vda_context);
}

int myGetBufferCallback(struct AVCodecContext *s, AVFrame *av_frame, int flags)
{
    vda_buffer_context *vda_context = (vda_buffer_context *)av_mallocz(sizeof(*vda_context));
    AVBufferRef *buffer = av_buffer_create(NULL, 0, release_vda_context, vda_context, 0);
    if (!vda_context || !buffer)
    {
        av_free(vda_context);
        return -1;
    }
    av_frame->buf[0] = buffer;
    av_frame->data[0] = (uint8_t *)1;
    return 0;
}

static void release_buffer(struct AVCodecContext *opaque, AVFrame *pic)
{
    vda_buffer_context *context = (vda_buffer_context *)opaque;
    CVPixelBufferUnlockBaseAddress(context->cv_buffer, 0);
    CVPixelBufferRelease(context->cv_buffer);
    av_free(context);
}

main() {
    // init ff context
    if (avcodec_open2(mCodecContext, mCodec, NULL) < 0)
        throw AbstractException("Unable to open codec");
    mCodecContext->get_format = myGetFormatCallback;
    mCodecContext->get_buffer2 = myGetBufferCallback;
    mCodecContext->release_buffer = release_buffer;
}

But myGetFormatCallback is never called, and the program crashes when myGetBufferCallback is invoked.
Why is myGetFormatCallback not called? What is wrong? -
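A likely cause, going by the code above: in main() the callbacks are assigned only after avcodec_open2() has returned, but when the stream carries MP4-style extradata the H.264 decoder parses the SPS during avcodec_open2() and consults get_format right there, so a callback installed afterwards is never seen. A minimal reordering sketch (pseudocode using the same names as the question; not a verified fix):

```
// install the callbacks BEFORE opening the codec
mCodecContext->get_format     = myGetFormatCallback;
mCodecContext->get_buffer2    = myGetBufferCallback;
mCodecContext->release_buffer = release_buffer;
if (avcodec_open2(mCodecContext, mCodec, NULL) < 0)
    throw AbstractException("Unable to open codec");
```

With get_format registered before the open, the VDA decoder context can be created during format negotiation, so hwaccel_context is valid by the time myGetBufferCallback runs.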
Extract audio from video in mp3 format using android-ffmpeg-library
28 May 2014, by user2870161 — I want to extract audio from any type of video file and save it in MP3 format using android-ffmpeg-library.
Most of my work is done: I can create a WAV file using this code, but the problem is that when I create an MP3 file it only produces a 0 KB file on the SD card.
I hope I've made myself clear, and thanks for taking the time to read this.
if (inputPath == null || outputPath == null) {
    throw new IllegalStateException("Need an input and output filepath!");
}
final List<String> cmd = new LinkedList<String>();
String baseDir = Environment.getExternalStorageDirectory().getAbsolutePath();
String fileName = "3.mp4";
String fileName1 = "2.mp3";
String path = baseDir + "/" + fileName;
String path1 = baseDir + "/" + fileName1;
File f = new File(path);
if (f.exists()) {
    System.out.println("File existed");
} else {
    System.out.println("File not found!");
}
cmd.add(mFfmpegPath);
cmd.add("-i");
cmd.add(path);
cmd.add("-vn");
cmd.add("-acodec");
cmd.add("copy");
cmd.add(path1);
final ProcessBuilder pb = new ProcessBuilder(cmd);
return new ProcessRunnable(pb);
-
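The 0 KB output is most likely caused by "-acodec copy": that copies the source's audio stream (typically AAC) unchanged, and an AAC stream cannot be muxed into an .mp3 file, so ffmpeg aborts and leaves an empty file. Producing MP3 requires re-encoding with an MP3 encoder such as libmp3lame, assuming the bundled ffmpeg binary was built with it. A sketch of the corrected argument list, written as a small Python helper so it can be checked in isolation (the helper name and bitrate are illustrative):

```python
def build_mp3_extract_cmd(ffmpeg_path, src, dst):
    """Build an ffmpeg argument list that extracts audio as real MP3.

    "-acodec copy" would copy the source's (usually AAC) audio stream,
    which cannot live in an .mp3 container -- ffmpeg then fails and
    leaves a 0 KB file.  Re-encoding with libmp3lame avoids that,
    assuming the ffmpeg build includes libmp3lame support.
    """
    return [ffmpeg_path, "-i", src,
            "-vn",                    # drop the video stream
            "-acodec", "libmp3lame",  # re-encode the audio as MP3
            "-ab", "192k",            # an illustrative MP3 bitrate
            dst]

print(build_mp3_extract_cmd("ffmpeg", "/sdcard/3.mp4", "/sdcard/2.mp3"))
```

In the Java code above, the same change is replacing cmd.add("copy") with cmd.add("libmp3lame").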
Transcode Adobe Media Encoder live stream using FFMPEG in node.js
23 June 2014, by user2757842 — I am using Adobe Media Live Encoder to send a stream up to Adobe Media Server. What I would like to do is take that stream and transcode it to a local file on my machine in a format other than .f4m.
Here is the code I have so far; it is built with FFmpeg inside a node.js app:
var ffmpeg = require('fluent-ffmpeg');
var fs = require('fs');

// make sure you set the correct path to your video file
var inStream = fs.createReadStream('rtmp://localhost/livepkgr/livestream live=1'); // this is where the streams are stored
var command = new ffmpeg({ source: inStream });

// Set the path to where FFmpeg is installed
command.setFfmpegPath("C:\\Users\\Jay\\Documents\\FFMPEG\\bin\\ffmpeg.exe");

command
    // set the size
    .withSize('100%')
    // set fps
    .withFps(24)
    // set output format to force
    .toFormat('ismv')
    // setup event handlers
    .on('end', function() {
        console.log('file has been converted successfully');
    })
    .on('error', function(err) {
        console.log('an error happened: ' + err.message);
    })
    // save to file <-- the new file I want -->
    .saveToFile('rtmp://localhost/livepkgr/livestream1'); // this is where I want to store the newly converted stream

I have it working with a local file, but when I try it with my live stream I get this error:
events.js:72
        throw er; // Unhandled 'error' event
              ^
Error: ENOENT, open 'C:\Users\Jay\workspace\FFMPEGtest\rtmp:\localhost\livepkgr\livestream live=1'

Has anyone seen this before?
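The ENOENT path in the error shows what went wrong: fs.createReadStream() only reads local files, so it resolved the RTMP URL against the working directory. ffmpeg has its own RTMP client, so the URL should be handed to ffmpeg directly (in fluent-ffmpeg, pass the URL string as the source instead of wrapping it in a file stream), and the destination of .saveToFile() should be a local path rather than another rtmp:// URL. The equivalent ffmpeg invocation, sketched as a small Python helper so the argument list can be checked in isolation (helper name and output path are illustrative):

```python
def build_rtmp_transcode_cmd(ffmpeg_path, rtmp_url, out_path):
    """Sketch of ffmpeg arguments for transcoding a live RTMP stream.

    The RTMP URL goes straight to -i: ffmpeg opens the stream itself,
    so no file-system read stream is involved, and the destination is
    a local file path.
    """
    return [ffmpeg_path,
            "-i", rtmp_url,  # ffmpeg opens the RTMP stream directly
            "-r", "24",      # 24 fps, matching .withFps(24) above
            "-f", "ismv",    # force the ismv container, as in .toFormat('ismv')
            out_path]        # a local file, not another rtmp:// URL

cmd = build_rtmp_transcode_cmd(
    "C:\\Users\\Jay\\Documents\\FFMPEG\\bin\\ffmpeg.exe",
    "rtmp://localhost/livepkgr/livestream live=1",
    "C:\\Users\\Jay\\output.ismv")
```

Whether the " live=1" suffix is honored depends on the RTMP support compiled into the ffmpeg build, so treat that part as an assumption carried over from the question.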