
Media (3)
-
Example of action buttons for a collaborative collection
27 February 2013, by
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013, by
Updated: February 2013
Language: English
Type: Image
-
Collections - Quick creation form
19 February 2013, by
Updated: February 2013
Language: French
Type: Image
Other articles (62)
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Contribute to a better visual interface
13 April 2011
MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.
-
Submit enhancements and plugins
13 April 2011
If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know and its integration into the core MediaSPIP functionality will be considered.
You can use the development discussion list to request help with creating a plugin. Since MediaSPIP is based on SPIP, you can also use the SPIP discussion list, SPIP-Zone.
On other sites (7176)
-
ffmpeg: Make output file duration that of the file with the longest length [closed]
27 February 2024, by losercantcode
I'm going crazy trying to figure this out.


Essentially, I'm trying to apply sidechain compression using ffmpeg (for the sake of speed and automation). In this example, a2.mp3 is the full-length track and a1.wav is a short, 5-second spoken-word clip. The code currently applies the sidechain compression well, but the output file only has the length of the spoken-word clip, so 5 seconds. Is it at all possible to make the output match a2.mp3's length instead? Switching the input order is not possible, as that would apply the compression to the wrong track.


Essentially, I'm trying to layer the spoken-word file on top of the audio track without having to pre-pad the file. I'd rather this all happen in a single command in one program, if that's possible. I'm interested to see whether this is at all possible within ffmpeg.


ffmpeg -i a2.mp3 -i a1.wav -filter_complex "[1:a]asplit=2[sc][mix];[0:a][sc]sidechaincompress=threshold=0.000976563:ratio=10:level_sc=0.015625:release=100:attack=10[compr];[compr][mix]amix=duration=longest[aout]" -map "[aout]" output.wav



a1.wav is mono, a2.mp3 is stereo.


Attempted to implement amix=duration=longest to no avail. Researched similar problems on Stack Exchange, but they gave unrelated answers.
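
A possible approach (not from the thread, so treat it as an untested sketch): two-input filters like sidechaincompress stop producing output once the shorter input runs out, so by the time amix sees the streams there are only 5 seconds of compressed audio, and duration=longest has nothing longer to extend to. Padding the short clip with apad before splitting it keeps the sidechain alive for the whole track, and amix=duration=first then ends the mix at the compressed track's length:

ffmpeg -i a2.mp3 -i a1.wav -filter_complex "[1:a]apad,asplit=2[sc][mix];[0:a][sc]sidechaincompress=threshold=0.000976563:ratio=10:level_sc=0.015625:release=100:attack=10[compr];[compr][mix]amix=duration=first[aout]" -map "[aout]" output.wav

Here apad (with no arguments) pads a1.wav with silence indefinitely, so the compressed stream [compr] runs for the full length of a2.mp3; duration=first makes amix stop there instead of waiting on the padded stream.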


-
FFmpeg error with ffmpeg.FS("readfile", "output.mp4"). Trying to get ffmpeg to work in the React app
21 June 2024, by Paul Tham
const stackVideos = useCallback(
  async (video1) => {
    try {
      console.log("Fetching video2 from storage...");
      const video2Ref = ref(storage, "video2.mp4");
      const video2Url = await getDownloadURL(video2Ref);
      const video2Blob = await (await fetch(video2Url)).blob();

      console.log("Writing video1 to FFmpeg FS...");
      await ffmpeg.FS("writeFile", "video1.mp4", await fetchFile(video1));

      console.log("Writing video2 to FFmpeg FS...");
      await ffmpeg.FS("writeFile", "video2.mp4", await fetchFile(video2Blob));

      console.log("Files in FFmpeg FS after write:");
      const files = await ffmpeg.FS("readdir", "/");
      console.log(files);

      const { start, end } = inputs[0];
      const startSeconds = new Date(`1970-01-01T${start}Z`).getTime() / 1000;
      const endSeconds = new Date(`1970-01-01T${end}Z`).getTime() / 1000;
      const duration = endSeconds - startSeconds;

      console.log("Running FFmpeg command...");
      await ffmpeg.run(
        "-i",
        "video1.mp4",
        "-ss",
        startSeconds.toString(),
        "-t",
        duration.toString(),
        "-i",
        "video2.mp4",
        "-filter_complex",
        "[0:v]scale=1080:-1[v1];[1:v]scale=-1:1920/2[v2scaled];[v2scaled]crop=1080:1920/2[v2cropped];[v1][v2cropped]vstack=inputs=2,scale=1080:1920[vid]",
        "-map",
        "[vid]",
        "-map",
        "0:a",
        "-c:v",
        "libx264",
        "-crf",
        "23",
        "-preset",
        "veryfast",
        "-shortest",
        "output1.mp4"
      );

      console.log("Files in FFmpeg FS after run:");
      const filesAfterRun = await ffmpeg.FS("readdir", "/");
      console.log(filesAfterRun);

      console.log("Reading output1.mp4 from FFmpeg FS...");
      const data = await ffmpeg.FS("readfile", "output1.mp4");
      console.log("after the FS readfile");
      const url = URL.createObjectURL(
        new Blob([data.buffer], { type: "video/mp4" })
      );
      setStackedVideo(url);
      setOutputFileReady(true); // Mark output file as ready
    } catch (err) {
      console.error("FFmpeg error output:", err);
      setError(`FFmpeg run error: ${err.message}`);
      setIsProcessing(false);
    }
  },
  [inputs]
);



My error seems to stem from this line:


const data = await ffmpeg.FS("readfile", "output1.mp4");



Looking at the ffmpeg.wasm documentation, I thought the names of some of the functions had changed, but when I changed them it seemed like the new functions were not recognised. Sometimes this also gives me other errors, like ones from worker.js, which I don't understand well enough to debug myself.
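
One thing worth checking (an observation on the snippet, not an answer from the thread): in the 0.11.x ffmpeg.wasm API, ffmpeg.FS proxies Emscripten's FS module, and those method names are camelCase and case-sensitive. The snippet uses "writeFile" for writing but "readfile" for reading; the read call would need to be "readFile":

// "readfile" is not a method on Emscripten's FS object, so the call throws;
// the camelCase name matches the "writeFile" calls that already work.
const data = ffmpeg.FS("readFile", "output1.mp4");
const url = URL.createObjectURL(new Blob([data.buffer], { type: "video/mp4" }));

FS calls are synchronous, so the await on them is harmless but unnecessary.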




-
Runtime error when linking ffmpeg libraries in Qt Creator
6 July 2012, by dxthegreat
I'm quite new around here, but I hear that if you want a question answered, Stack Overflow is the place to ask it >.<. So I hope my question isn't so trivial that everyone will get annoyed at my lack of research (I've tried googling for two days already D= no progress!).
I've also asked this question in the Qt forums, but I figured I'd ask here too. So...
For the last few days I've been fiddling around with OpenGL and the like, trying to write a video player.
However, when I try to import the ffmpeg libraries (avcodec, avformat, avutil, etc.), an error occurs at runtime (the program compiles fine). When compiled and run in debug mode, the error message gives me only a memory address and error code 135 (DLL not found).
This error occurs when I include a function from those libraries in my code (e.g. av_register_all()) and it occurs regardless of whether the function is actually called.
So I'm thinking that I'm doing something wrong when linking these libraries.
I'm currently using:
Windows Vista (32-bit),
Qt Creator 2.4.1 based on Qt 4.7.4 (32-bit),
Zeranoe's FFmpeg build git-3233ad4 (2012-06-30)
My .pro file consists of:
QT += core gui opengl
TARGET = test
TEMPLATE = app
SOURCES += main.cpp\
mainwindow.cpp \
glwidget.cpp
HEADERS += mainwindow.h \
glwidget.h \
FORMS += mainwindow.ui
LIBS += -L"$$_PRO_FILE_PWD_/libraries/ffmpeg/libs/" -lavcodec -lavformat -lavutil
INCLUDEPATH += libraries/ffmpeg/includes
I've tried many variations of the LIBS += line and checked my file path many times. However, the DLL not found error occurs in all of these variations =(.
Is there something I'm forgetting when doing these includes?
Thanks in advance >.<,
(young and naive) aspiring dev
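
A note on the likely cause (not from the thread, so treat it as a guess): runtime error 135 is Windows' STATUS_DLL_NOT_FOUND, meaning linking succeeded against the import libraries but the loader could not find the matching runtime DLLs when the program started. Zeranoe's shared builds ship those DLLs (avcodec-*.dll, avformat-*.dll, avutil-*.dll, and their dependencies) in a bin folder; they need to sit next to the built .exe or on PATH. A minimal sketch, with paths assumed:

:: copy the FFmpeg runtime DLLs next to the compiled executable
copy "C:\path\to\ffmpeg\bin\*.dll" "C:\path\to\build\debug\"

This would also explain why merely referencing av_register_all() triggers the error without the function being called: the reference makes the linker record the DLL dependency, and the loader resolves all recorded DLLs at startup.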