
Media (1)
-
SWFUpload Process
6 September 2011, by
Updated: September 2011
Language: French
Type: Text
Other articles (97)
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011, by
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects / individuals, rapid deployment of multiple unique sites, and creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
-
Writing a news item
21 June 2013, by
Present the changes in your MediaSPIP, or news about your projects, using the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the news item creation form.
News item creation form: for a document of the "news item" type, the default fields are: Publication date (customize the publication date) (...)
On other sites (13000)
-
Passing streams from Fluent-ffmpeg to Google Cloud Storage
31 October 2019, by Emilio Faria
Is there a way to pass a stream from Fluent-ffmpeg to Google Cloud Storage? I'm trying to allow the user to upload any kind of media (audio or video), and I want to convert it to FLAC before uploading it to GCS.
I'm using a few middlewares on my route, such as:
routes.post(
  '/upload',
  multer.single('audio'),
  ConvertController.convert,
  UploadController.upload,
  FileController.save,
  (req, res, next) => res.send('ok')
);
I was able to stream from Multer to Fluent-ffmpeg and save to a file using this code in ConvertController:
async convert(req, res, next) {
  ffmpeg(streamifier.createReadStream(req.file.buffer))
    .format('flac')
    .output('outputfile.flac')
    .audioChannels(1)
    .on('progress', function(progress) {
      console.log(progress);
    })
    .run();
}
But I would like to use .pipe() to pass it to UploadController, where I would then upload to GCS:
class UploadController {
  async upload(req, res, next) {
    const gcsHelpers = require('../helpers/google-cloud-storage');
    const { storage } = gcsHelpers;

    const DEFAULT_BUCKET_NAME = 'my-bucket-name';
    const bucketName = DEFAULT_BUCKET_NAME;
    const bucket = storage.bucket(bucketName);
    const fileName = `test.flac`;
    const newFile = bucket.file(fileName);

    // createWriteStream() returns the writable stream the handlers belong to
    const writeStream = newFile.createWriteStream({
      metadata: {
        contentType: req.file.mimetype
      }
    });
    writeStream.on('error', err => {
      throw err;
    });
    writeStream.on('finish', () => console.log('finished'));
  }
}
The problem is that I cannot find anywhere explaining how I can pass a stream down to the next middleware.
Is it possible?
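One possible approach, sketched here rather than taken from the question: fluent-ffmpeg's pipe() method, called with no argument, returns a PassThrough stream, and that stream can be handed to the next middleware on the request object and piped into the GCS write stream. The property name req.convertedStream is invented for illustration, and the bucket name, file name, and storage helper are reused from the question.

const ffmpeg = require('fluent-ffmpeg');
const streamifier = require('streamifier');
const { storage } = require('../helpers/google-cloud-storage'); // helper path from the question

class ConvertController {
  // Convert the upload to FLAC and expose the resulting stream to the next
  // middleware on the request object (req.convertedStream is a made-up name).
  static convert(req, res, next) {
    const command = ffmpeg(streamifier.createReadStream(req.file.buffer))
      .audioChannels(1)
      .format('flac')
      .on('error', next);
    // pipe() with no argument returns a PassThrough stream fed by ffmpeg's output
    req.convertedStream = command.pipe();
    next();
  }
}

class UploadController {
  // Pipe the converted stream straight into Google Cloud Storage.
  static upload(req, res, next) {
    const bucket = storage.bucket('my-bucket-name'); // bucket name from the question
    const gcsFile = bucket.file('test.flac');
    req.convertedStream
      .pipe(gcsFile.createWriteStream({ metadata: { contentType: 'audio/flac' } }))
      .on('error', next)
      .on('finish', () => next());
  }
}

One caveat worth verifying: because ffmpeg cannot seek back in a piped output, the FLAC header it writes will not contain the final duration, which is usually harmless but may matter to downstream consumers.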
-
FFmpeg doesn't work on Android 10, goes straight to onFailure(String message) with an empty message
18 January 2020, by nolanic
I'm using FFmpeg in one of my projects for video compression. On Android 10 (Google Pixel 3a), it goes straight to onFailure(String message) with an empty message for any command sent for execution.
So I have api 'com.writingminds:FFmpegAndroid:0.3.2' specified in my app Gradle file,
and the android.permission.WRITE_EXTERNAL_STORAGE permission is specified in the manifest.
So I do:
InitializationCallback initializationCallback = new InitializationCallback();
try {
    FFmpeg.getInstance(context).loadBinary(initializationCallback);
} catch (FFmpegNotSupportedException e) {
    initializationCallback.onFailure();
    initializationCallback.onFinish();
}
It initializes just fine, no problems here.
Later:
void getData(File inputFile) {
    // inputFile points to: /storage/emulated/0/Android/data/{package_name}/files/temp_files/temp_1.mp4
    String[] cmd = ("-i " + inputFile.getAbsolutePath()).split(" ");
    try {
        FFmpeg.getInstance(App.instance).execute(cmd, this);
    } catch (FFmpegCommandAlreadyRunningException e) {
        throw new Error(e);
    }
}
@Override
public void onStart() {
    // This method is called
}

@Override
public void onSuccess(String message) {
    // This method is NOT called
    extractAvailableData(message);
}

@Override
public void onProgress(String message) {
    // This method is NOT called
    extractAvailableData(message);
}

@Override
public void onFailure(String message) {
    // This method is called and the message is empty
    extractAvailableData(message);
}

@Override
public void onFinish() {
    // This method is called
}

If I do something like:
String command = "-i ***/file1.mp4 -map 0:v -map 0:a -preset ultrafast -s:v 750:350 ***/file2.mp4";
// file2.mp4 is a non-existent file at this point
// (***) is just a replacement for the full path of the file, to keep things shorter here
String[] cmd = command.split(" ");
try {
    FFmpeg.getInstance(App.instance).execute(cmd, this);
} catch (FFmpegCommandAlreadyRunningException e) {
    throw new Error(e);
}
This gives the same result: no video conversion, just a call to onFailure("Nothing").
Even if I do:
String[] cmd = {"-version"};
try {
    FFmpeg.getInstance(App.instance).execute(cmd, this);
} catch (FFmpegCommandAlreadyRunningException e) {
    throw new Error(e);
}
I get nothing, no output at all.
So far I have encountered this issue only on Android 10; it works fine on other devices.
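A note that is not part of the original question: starting with Android 10, an app that targets API level 29 can no longer execute binaries stored in its own writable data directory, which is how com.writingminds:FFmpegAndroid runs its bundled ffmpeg executable, and this restriction is a commonly reported cause of an immediate, empty onFailure() callback. A frequently suggested workaround is to keep targetSdkVersion at 28, or to switch to a distribution that loads FFmpeg as a native library rather than a separate executable. A rough build.gradle sketch of the first option:

android {
    defaultConfig {
        // Staying on targetSdkVersion 28 keeps the pre-Android-10 behaviour
        // that still allows executing the ffmpeg binary this library copies
        // into the app's data directory.
        targetSdkVersion 28
    }
}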
-
ffmpeg include issue - some functions are missing
27 September 2019, by Thomas Ayoub
I am trying to follow and adapt this example to convert a video with FFmpeg, but some functions seem to be missing, such as:
int avcodec_open(AVCodecContext *avctx, AVCodec *codec)
When I go into the documentation to see where it comes from, I find it in the file libavcodec/avcodec.h, which is included in my program with #include "libavcodec/avcodec.h" (at the top of my .h file).
Given this, I don't understand why Qt throws this error:
../../Dev/Joker/libs/PhVideo/PhVideoEncoder.cpp:360:6: error: use of undeclared identifier 'avcodec_open'
if (avcodec_open(c, codec) < 0) {
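For context, and not stated in the question: avcodec_open() was deprecated and eventually removed from libavcodec, and current FFmpeg releases provide avcodec_open2() instead, so code adapted from older examples typically needs a call along these lines (a sketch assuming c and codec are set up as in the example being followed):

// avcodec_open2() replaced avcodec_open(); the third argument takes an optional
// AVDictionary of codec-private options and may be NULL.
if (avcodec_open2(c, codec, NULL) < 0) {
    // handle the error the same way the surrounding example does
}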