
Media (3)
-
Example of action buttons for a collaborative collection
27 February 2013, by
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013, by
Updated: February 2013
Language: English
Type: Image
-
Collections - Quick creation form
19 February 2013, by
Updated: February 2013
Language: French
Type: Image
Other articles (50)
-
The SPIPmotion queue
28 November 2010, by
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document should automatically be attached; objet, the type of object to which (...)
-
Other interesting software
13 April 2011, by
We don't claim to be the only ones doing what we do, and certainly don't claim to be the best; we simply try to do it well and to keep improving.
The following list presents software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to emulate.
We don't know these projects well and haven't tried them, but you can take a look.
Videopress
Website: http://videopress.com/
License: GNU/GPL v2
Source code: (...)
-
Customize by adding your logo, banner or background image
5 September 2013, by
Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
On other sites (5239)
-
Is there any way to change the file FPS in the browser with JavaScript, or prepare a WAV converter for 60 FPS videos?
16 November 2020, by SZtyro
I'm making a web application that stores short audio files cut from large video files. The user uploads an .mp4 file and chooses the sound length, and here's a little trick: cutting the audio can only be done in the backend (correct me if I'm wrong), and sending 700 MB of data is not a good option, so I use the code below to decode the audio data from the .mp4 and then send it with start and stop params. The backend (Node.js) uses FFmpeg to cut the audio and saves it.


This part works, but I realised that audio decoded from a 60 FPS video doesn't sound good (not terrible, but totally useless in my app). My goal is to avoid third-party apps, especially desktop ones (like Audacity), and to let the user cut the relevant part of the audio from any mp4 video. Is there any way to convert a 60 FPS video to a 30 FPS video (ArrayBuffer) in the browser and then decode the audio?


fileInput.onchange = event => {
  this.file = event.target["files"][0];
  // .mp4 file selected by the user
  this.fileURL = URL.createObjectURL(this.file);

  let baseAudioContext = new AudioContext();
  this.file.arrayBuffer().then(buff => {
    // decode the audio track of the uploaded file
    baseAudioContext.decodeAudioData(buff,
      success => {
        console.log(success);
        this.bufferToWave(success, 0, success.length);
      },
      err => console.log(err));
  });
}

bufferToWave(abuffer, offset, len) {

  var numOfChan = abuffer.numberOfChannels,
      length = len * numOfChan * 2 + 44,
      buffer = new ArrayBuffer(length),
      view = new DataView(buffer),
      channels = [], i, sample,
      pos = 0;

  // write WAVE header
  setUint32(0x46464952); // "RIFF"
  setUint32(length - 8); // file length - 8
  setUint32(0x45564157); // "WAVE"

  setUint32(0x20746d66); // "fmt " chunk
  setUint32(16); // length = 16
  setUint16(1); // PCM (uncompressed)
  setUint16(numOfChan);
  setUint32(abuffer.sampleRate);
  setUint32(abuffer.sampleRate * 2 * numOfChan); // avg. bytes/sec
  setUint16(numOfChan * 2); // block-align
  setUint16(16); // 16-bit (hardcoded in this demo)

  setUint32(0x61746164); // "data" - chunk
  setUint32(length - pos - 4); // chunk length

  // write interleaved data
  for (i = 0; i < abuffer.numberOfChannels; i++)
    channels.push(abuffer.getChannelData(i));

  while (pos < length) {
    for (i = 0; i < numOfChan; i++) { // interleave channels
      sample = Math.max(-1, Math.min(1, channels[i][offset])); // clamp
      sample = (0.5 + sample < 0 ? sample * 32768 : sample * 32767) | 0; // scale to 16-bit signed int
      view.setInt16(pos, sample, true); // update data chunk
      pos += 2;
    }
    offset++; // next source sample
  }

  // create Blob
  //return (URL || webkitURL).createObjectURL(new Blob([buffer], { type: "audio/wav" }));
  var u = (URL || webkitURL).createObjectURL(new Blob([buffer], { type: "audio/wav" }));

  // temporary part: download the file to check quality
  // (the sound is already broken at this point, so no need to show backend code)
  const a = document.createElement('a');
  a.style.display = 'none';
  a.href = u;
  a.download = "out.wav"; // filename for the downloaded test file
  document.body.appendChild(a);
  a.click();

  function setUint16(data) {
    view.setUint16(pos, data, true);
    pos += 2;
  }

  function setUint32(data) {
    view.setUint32(pos, data, true);
    pos += 4;
  }
}



-
ffmpeg extract audio from streaming mp4
3 September 2016, by user6791548
I am trying to extract audio from a streaming mp4. It succeeds with
ffmpeg -i http://www.sample-videos.com/video/mp4/720/big_buck_bunny_720p_1mb.mp4 out2.mp3
But it fails on a particular Facebook streaming mp4 video.
It throws the error:
out2.mp3: command not found
I suspect that ffmpeg forces me to have a file extension?
-
(Java) ImageIO.write() freezes after 540 iterations
14 October 2019, by the4naves
I'm currently trying to write a simple (non-realtime) audio visualizer, but I'm running into a very weird problem.
After about 515-545 iterations of ImageIO.write(), the program freezes; no exceptions, the program doesn't crash, just nothing.
Before you read the code below, here's a quick rundown of it to make it a bit easier to understand:
1. Load data from the program arguments.
2. Start ffmpeg with its input set to stdin.
3. In a loop, continually render an image, then send it to ffmpeg with ImageIO.write().
Other info:
I've verified that ImageIO.write() IS actually the problem by placing a print before and after the try/catch statement.
My first thought was a memory leak, but that doesn't seem to be the case, at least according to Runtime.getRuntime().freeMemory().
The number of iterations needed to freeze the program is inconsistent, though it has always been in the range of 515-540.
Rendering a video with fewer frames works flawlessly.
My previous solution was to write all images to a folder, then run ffmpeg on them all at once. This worked fine for any number of frames, but used upwards of a gigabyte of storage for a fairly small test video, which is unacceptable for the small program I'm trying to create.
As a final note, I realize the code probably has a ton of optimization/general problems; I haven't had the time to try to get it to work well, as it isn't even fully working yet. With that said, if you see any obvious problems, feel free to include them in your answer if you have the time; it'll certainly save me some. Thanks.
(A hedged guess at one possible cause is sketched after the code below.)
import java.awt.Graphics;
import java.awt.image.BufferedImage;
import java.io.BufferedInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.OutputStream;
import javax.imageio.ImageIO;

public class Run {
    public static BufferedImage render = new BufferedImage(1920, 1080, BufferedImage.TYPE_INT_RGB);
    public static Graphics buffer = render.getGraphics();
    public static int frame = 0;
    public static double[] bins = new double[3200];
    public static double[] previousBins = new double[3200];
    public static File audio;
    public static BufferedImage background;
    public static OutputStream output;

    public static void main(String[] args) {
        // start ffmpeg reading JPEG frames from stdin (pipe:0)
        try {
            String command = "cd /Users/admin/Desktop/render ; ffmpeg -r 60 -blocksize 150000 -i pipe:0 -i " + args[1].replaceAll(" ", "\\\\ ") + " final.mp4";
            System.out.println(command);
            ProcessBuilder processBuilder = new ProcessBuilder("bash", "-c", command);
            Process process = processBuilder.start();
            output = process.getOutputStream();
        } catch (IOException e) {
            e.printStackTrace();
        }

        // load the background image
        try {
            background = ImageIO.read(new File(args[0]));
        } catch (IOException e) {
            System.out.println("File not found");
        }

        // read the whole audio file into memory
        audio = new File(args[1]);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BufferedInputStream in = null;
        try {
            in = new BufferedInputStream(new FileInputStream(audio));
        } catch (FileNotFoundException e1) {
            e1.printStackTrace();
        }
        try {
            out.flush();
            int read;
            byte[] buff = new byte[3200];
            while ((read = in.read(buff)) > 0) {
                out.write(buff, 0, read);
            }
        } catch (IOException e1) {
            e1.printStackTrace();
        }
        byte[] audioBytes = out.toByteArray();

        System.out.println("Calculating length");
        int frames = 600;
        System.out.println("Rendering images");

        // render each frame and pipe it to ffmpeg as a JPEG
        int[] data = new int[3200];
        int index = 0;
        while (frame < frames) {
            for (int i = 0; i < 3200; i++) {
                data[i] = (short) (audioBytes[index + 2] << 16 | audioBytes[index + 1] << 8 | audioBytes[index] & 0xff);
                index += 3;
            }
            index -= 4800;
            fourier(data, bins);
            renderImage();
            try {
                ImageIO.write(render, "jpg", output);
            } catch (IOException e) {
                e.printStackTrace();
            }
            frame++;
            if (frame % 20 == 0) {
                System.out.println(frame + "/" + frames + " frames completed");
            }
        }

        System.out.println("Render completed");
        System.out.println("Optimizing file size");
        System.out.println("Optimization complete");
        System.out.println("Program complete");
        System.exit(0);
    }

    // draw the background and the frequency bars for the current frame
    public static void renderImage() {
        buffer.drawImage(background, 0, 0, null);
        for (int i = 0; i < 110; i++) {
            int height = (int) ((bins[i] + previousBins[i]) / Short.MAX_VALUE);
            buffer.fillRect(15 * i + 20, 800 - height, 10, (int) (height * 1.2));
        }
        System.arraycopy(bins, 0, previousBins, 0, 3200);
    }

    // naive discrete Fourier transform: out[k] = magnitude of frequency bin k
    public static void fourier(int[] inReal, double[] out) {
        for (int k = 0; k < inReal.length; k++) {
            double real = 0.0;
            double imaginary = 0.0;
            for (int t = 0; t < inReal.length; t++) {
                real += inReal[t] * Math.cos(2 * Math.PI * t * k / inReal.length);
                imaginary -= inReal[t] * Math.sin(2 * Math.PI * t * k / inReal.length);
            }
            out[k] = Math.sqrt(Math.pow(real, 2) + Math.pow(imaginary, 2));
        }
    }
}
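
A hedged guess at the freeze, not a confirmed answer: the code above pipes frames into ffmpeg's stdin but never reads ffmpeg's stdout or stderr. ffmpeg prints progress information to stderr, and once the OS pipe buffer (typically around 64 KB) fills up, ffmpeg blocks on that write, stops draining its stdin, and the next ImageIO.write() into the pipe blocks as well, which would look exactly like a silent freeze after a few hundred frames. The minimal sketch below shows one way to rule this out; the redirectErrorStream call and the stream-draining thread are additions for illustration, not part of the question's code, and the argument handling is simplified.

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class FfmpegPipeSketch {
    public static void main(String[] args) throws IOException {
        // Same idea as the question's command, simplified: ffmpeg reads frames from stdin.
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-r", "60", "-i", "pipe:0", "-i", args[0], "final.mp4");

        // Merge stderr into stdout so there is only one stream to drain.
        pb.redirectErrorStream(true);
        Process process = pb.start();

        // Drain ffmpeg's output in a background thread; if nobody reads it,
        // the pipe buffer fills up and ffmpeg (and then the frame writer) blocks.
        InputStream ffmpegOutput = process.getInputStream();
        Thread drainer = new Thread(() -> {
            try {
                byte[] buf = new byte[8192];
                while (ffmpegOutput.read(buf) != -1) {
                    // discard (or log) ffmpeg's progress output
                }
            } catch (IOException ignored) {
            }
        });
        drainer.setDaemon(true);
        drainer.start();

        OutputStream framePipe = process.getOutputStream();
        // ... write frames here with ImageIO.write(render, "jpg", framePipe) ...
        framePipe.close(); // closing stdin lets ffmpeg finish writing the file
    }
}

On Java 9 and later, pb.redirectOutput(ProcessBuilder.Redirect.DISCARD) together with pb.redirectError(ProcessBuilder.Redirect.DISCARD) discards the child's output without needing a separate thread.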