
Media (1)
-
GetID3 - File information block
9 April 2013
Updated: May 2013
Language: French
Type: Image
Other articles (62)
-
Sites built with MediaSPIP
2 May 2011
This page presents some of the sites running MediaSPIP. You can of course add your own using the form at the bottom of the page. -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer flash player is used as a fallback.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and audio both on conventional computers (...) -
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (7870)
-
Save video to disk from WebRTC MediaStream in Node
27 November 2020, by SAGBO Aimé
I'm building an app where the user can connect to the server through WebRTC (I'm using the simple-peer library both server-side and client-side to set up the peer-to-peer connection).
Once the client and the server are connected, the client app streams the user's camera and microphone to the server.


Now, I want to save the streamed data to the filesystem server-side as an MP4 video file.


I've heard about ffmpeg and fluent-ffmpeg for this, but I don't know how to use them (possible approaches are sketched after the code below).


- Server-side code to set up the peer connection



const Peer = require("simple-peer");
const wrtc = require("wrtc");

const peer = new Peer({ initiator: false, wrtc: wrtc, trickle: false });

peer.on("error", (err: any) => console.log("error", err));

 peer.on("signal", (data: any) => {
 if (data.type === "offer" || data.type === "answer")
 dispatchMessage(JSON.stringify(data));
 // if (data.renegotiate || data.transceiverRequest) return;
 });

 peer.on("connect", () => {
 console.log("CONNECTED");
 peer.send(JSON.stringify("HELLO DEER PEER FROM SERVER"));
 });

 peer.on("data", (data: any) => {
 console.log("data: ", data);
 });

 peer.on("stream", (stream: MediaStream) => {
 console.log("-------Stream received", stream);
 });

 peer.on("track", (track: MediaStreamTrack) => {
 console.log("-------trackEvent:", track);
 });
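
One possible way to answer the save-to-disk part (a sketch, not the only approach): node-webrtc exposes nonstandard sinks that surface the decoded frames of an incoming track, and those raw frames can be piped into an ffmpeg child process. The frame rate, encoder flags, and output name below are illustrative assumptions, and audio would need a parallel RTCAudioSink pipeline.

// Sketch: pipe incoming video frames into ffmpeg to produce an MP4.
// Assumes wrtc's nonstandard RTCVideoSink API and an ffmpeg binary on PATH.
const { spawn } = require("child_process");
const { RTCVideoSink } = wrtc.nonstandard;

peer.on("track", (track: MediaStreamTrack) => {
  if (track.kind !== "video") return;
  const sink = new RTCVideoSink(track);
  let ffmpeg: any = null;

  sink.onframe = ({ frame }: any) => {
    // Frames arrive as raw I420 (yuv420p) data with width/height attached.
    if (!ffmpeg) {
      ffmpeg = spawn("ffmpeg", [
        "-f", "rawvideo",
        "-pix_fmt", "yuv420p",
        "-s", `${frame.width}x${frame.height}`,
        "-r", "30",            // assumed frame rate
        "-i", "-",             // read raw frames from stdin
        "-c:v", "libx264",
        "recording.mp4",       // illustrative output name
      ]);
    }
    ffmpeg.stdin.write(frame.data);
  };

  track.onended = () => {
    sink.stop();
    if (ffmpeg) ffmpeg.stdin.end(); // lets ffmpeg finalize the file
  };
});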



- Client-side code




const stream = await window.navigator.mediaDevices.getUserMedia({
 video: { width: { ideal: 4096 }, height: { ideal: 2160 }},
 audio: true,
});

const p = new SimplePeer({
 initiator: isInitiator, 
 trickle: false 
});

stream.getTracks().forEach(track => p.addTrack(
 track, 
 stream 
));

// Here I set up the listeners for the peer connection
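
Back on the server, once a recording exists on disk, fluent-ffmpeg (which the question mentions) wraps the ffmpeg CLI for conversions such as webm to MP4. A hedged sketch; the file names are illustrative:

// Sketch: convert a saved webm recording to MP4 with fluent-ffmpeg.
// Requires the ffmpeg binary to be installed and reachable on PATH.
const ffmpegCmd = require("fluent-ffmpeg");

ffmpegCmd("recording.webm")       // illustrative input name
  .videoCodec("libx264")
  .audioCodec("aac")
  .on("end", () => console.log("MP4 written"))
  .on("error", (err: any) => console.error("conversion failed:", err))
  .save("recording.mp4");         // illustrative output name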



-
Capturing audio data (using javascript) and uploading on a server as MP3
4 September 2018, by Michel
Following a number of resources on the internet, I am trying to build a simple web page where I can record something (my voice), then make an MP3 file out of the recording, and finally upload that file to a server.
At this point I can do the recording and also play it back, but I haven't gone as far as uploading; it seems I cannot even make an MP3 file locally.
Can someone tell me what I am doing wrong, or doing in the wrong order? Below is all the code I have at this point.
<div>
  <h2>Audio record and playback</h2>
  <p>
    <button id="startRecord"><h3>Start</h3></button>
    <button id="stopRecord" disabled><h3>Stop</h3></button>
    <audio id="player" controls></audio>
    <a id="audioDownload"></a>
  </p>
</div>
<script>
var player = document.getElementById('player');

var handleSuccess = function(stream) {
  rec = new MediaRecorder(stream);

  rec.ondataavailable = e => {
    audioChunks.push(e.data);
    if (rec.state == "inactive") {
      let blob = new Blob(audioChunks, {type: 'audio/x-mpeg-3'});
      player.src = URL.createObjectURL(blob);
      player.controls = true;
      player.autoplay = true;
      // audioDownload.href = player.src;
      // audioDownload.download = 'sound.data';
      // audioDownload.innerHTML = 'Download';
      mp3Build();
    }
  }

  player.src = stream;
};

navigator.mediaDevices.getUserMedia({audio: true/*, video: false */})
  .then(handleSuccess);

startRecord.onclick = e => {
  startRecord.disabled = true;
  stopRecord.disabled = false;
  audioChunks = [];
  rec.start();
}

stopRecord.onclick = e => {
  startRecord.disabled = false;
  stopRecord.disabled = true;
  rec.stop();
}

var ffmpeg = require('ffmpeg');

function mp3Build() {
  try {
    var process = new ffmpeg('sound.data');
    process.then(function (audio) {
      // Callback mode.
      audio.fnExtractSoundToMP3('sound.mp3', function (error, file) {
        if (!error) {
          console.log('Audio file: ' + file);
          audioDownload.href = player.src;
          audioDownload.download = 'sound.mp3';
          audioDownload.innerHTML = 'Download';
        } else {
          console.log('Error-fnExtractSoundToMP3: ' + error);
        }
      });
    }, function (err) {
      console.log('Error: ' + err);
    });
  } catch (e) {
    console.log(e.code);
    console.log(e.msg);
  }
}
</script>
When I try to investigate what is happening using the debugger in the Web Console, on the line:
var process = new ffmpeg('sound.data');
I get this message:
Paused on exception
TypeError: ffmpeg is not a constructor.
And on the line:
var ffmpeg = require('ffmpeg');
I get this message:
Paused on exception
ReferenceError: require is not defined.
Besides, when I watch the expression ffmpeg, I can see:
ffmpeg: undefined
After some further investigation, using browserify, I use the following code:
<div>
  <h2>Audio record and playback</h2>
  <p>
    <button id="startRecord"><h3>Start</h3></button>
    <button id="stopRecord" disabled><h3>Stop</h3></button>
    <audio id="player" controls></audio>
    <a id="audioDownload"></a>
  </p>
</div>
<script src='http://stackoverflow.com/feeds/tag/bundle.js'></script>
<script>
var player = document.getElementById('player');

var handleSuccess = function(stream) {
  rec = new MediaRecorder(stream);

  rec.ondataavailable = e => {
    if (rec.state == "inactive") {
      let blob = new Blob(audioChunks, {type: 'audio/x-mpeg-3'});
      //player.src = URL.createObjectURL(blob);
      //player.srcObject = URL.createObjectURL(blob);
      //player.srcObject = blob;
      player.srcObject = stream;
      player.controls = true;
      player.autoplay = true;
      // audioDownload.href = player.src;
      // audioDownload.download = 'sound.data';
      // audioDownload.innerHTML = 'Download';
      mp3Build();
    }
  }

  //player.src = stream;
  player.srcObject = stream;
};

navigator.mediaDevices.getUserMedia({audio: true/*, video: false */})
  .then(handleSuccess);

startRecord.onclick = e => {
  startRecord.disabled = true;
  stopRecord.disabled = false;
  audioChunks = [];
  rec.start();
}

stopRecord.onclick = e => {
  startRecord.disabled = false;
  stopRecord.disabled = true;
  rec.stop();
}

var ffmpeg = require('ffmpeg');

function mp3Build() {
  try {
    var process = new ffmpeg('sound.data');
    process.then(function (audio) {
      // Callback mode.
      audio.fnExtractSoundToMP3('sound.mp3', function (error, file) {
        if (!error) {
          console.log('Audio file: ' + file);
          //audioDownload.href = player.src;
          audioDownload.href = player.srcObject;
          audioDownload.download = 'sound.mp3';
          audioDownload.innerHTML = 'Download';
        } else {
          console.log('Error-fnExtractSoundToMP3: ' + error);
        }
      });
    }, function (err) {
      console.log('Error: ' + err);
    });
  } catch (e) {
    console.log(e.code);
    console.log(e.msg);
  }
}
</script>
That solved the problem of:
the expression ffmpeg being undefined.
But the playback is no longer working. I may not be doing the right thing with player.srcObject, and maybe some other things too.
When I use this line:
player.srcObject = URL.createObjectURL(blob);
I get this message:
Paused on exception
TypeError: Value being assigned to HTMLMediaElement.srcObject is not an object.
And when I use this line:
player.srcObject = blob;
I get this message:
Paused on exception
TypeError: Value being assigned to HTMLMediaElement.srcObject does not implement interface MediaStream.
Finally, if I use this:
player.srcObject = stream;
I do not get any error message, but the voice recording still does not work.
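
For reference, a minimal sketch of the usual pattern, assuming the same player/startRecord/stopRecord element ids: srcObject only accepts a MediaStream (use it for live monitoring), while a finished recording is played back by assigning a blob URL to player.src. Note that labelling the Blob audio/x-mpeg-3 does not transcode it; MediaRecorder produces webm or ogg, and MP3 conversion would have to happen server-side (for example with ffmpeg). The /upload endpoint and field names below are illustrative.

<script>
// Sketch: record, play back, and upload the finished recording.
let rec;
let audioChunks = [];

navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
  rec = new MediaRecorder(stream);
  rec.ondataavailable = e => audioChunks.push(e.data);
  rec.onstop = () => {
    // MediaRecorder emits webm/ogg; use the recorder's own mime type.
    const blob = new Blob(audioChunks, { type: rec.mimeType });
    player.src = URL.createObjectURL(blob);   // src, not srcObject, for a Blob
    const form = new FormData();
    form.append('audio', blob, 'recording.webm');     // illustrative names
    fetch('/upload', { method: 'POST', body: form })  // illustrative endpoint
      .then(res => console.log('upload status:', res.status))
      .catch(err => console.error('upload failed:', err));
  };
});

startRecord.onclick = () => { audioChunks = []; rec.start(); };
stopRecord.onclick = () => rec.stop();
</script>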
-
Encountered an exception of ffmpeg.wasm can only run one command at a time
2 March 2023, by Itay113
I want to make a video chat using ffmpeg.wasm (I know the standard is WebRTC, but my assignment is to do this with ffmpeg.wasm and a server connecting the 2 clients). When running the following code I get an "ffmpeg.wasm can only run one command at a time" exception on the ffmpegWorker.run line.


import { useEffect } from "react";
import { createFFmpeg, fetchFile } from "@ffmpeg/ffmpeg";

function App() {
  const ffmpegWorker = createFFmpeg({
    log: true
  })

  async function initFFmpeg() {
    await ffmpegWorker.load();
  }

  async function transcode(webcamData) {
    const name = 'record.webm';
    await ffmpegWorker.FS('writeFile', name, await fetchFile(webcamData));
    ffmpegWorker.run('-i', name, '-preset', 'ultrafast', '-c:v', 'h264', '-crf', '28', '-b:v', '0', '-row-mt', '1', '-f', 'mp4', 'output.mp4')
      .then(() => {
        const data = ffmpegWorker.FS('readFile', 'output.mp4');

        const video = document.getElementById('output-video');
        video.src = URL.createObjectURL(new Blob([data.buffer], { type: 'video/mp4' }));
        ffmpegWorker.FS('unlink', 'output.mp4');
      })
  }

  function requestMedia() {
    const webcam = document.getElementById('webcam');
    const chunks = []
    navigator.mediaDevices.getUserMedia({ video: true, audio: true })
      .then(async (stream) => {
        webcam.srcObject = stream;
        await webcam.play();
        const mediaRecorder = new MediaRecorder(stream);
        mediaRecorder.start(0);
        mediaRecorder.onstop = function(e) {
          stream.stop();
        }
        mediaRecorder.ondataavailable = async function(e) {
          chunks.push(e.data);
          await transcode(new Uint8Array(await (new Blob(chunks)).arrayBuffer()));
        }
      })
  }

  useEffect(() => {
    requestMedia();
  }, [])

  return (
    <div className="App">
      <div>
        <video id="webcam" width="320px" height="180px"></video>
        <video id="output-video" width="320px" height="180px"></video>
      </div>
    </div>
  );
}



I have tried messing around with the timeslice argument to the MediaRecorder start method, but it didn't help.
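
The exception occurs because mediaRecorder.start(0) delivers a dataavailable event for every chunk, so transcode fires again while the previous ffmpegWorker.run is still executing, and ffmpeg.wasm rejects concurrent commands. One possible workaround, sketched below under the assumption that the rest of the component stays as above: serialize the calls through a promise chain so each run starts only after the previous one settles (and make sure ffmpegWorker.load() has completed before the first run).

// Sketch: serialize ffmpeg.wasm commands so only one runs at a time.
let queue = Promise.resolve();

function enqueueTranscode(webcamData) {
  // Chain the next job onto the previous one; catch so one failure
  // does not wedge the whole queue.
  queue = queue
    .then(() => transcode(webcamData))
    .catch(err => console.error('transcode failed:', err));
  return queue;
}

// In ondataavailable, call enqueueTranscode(...) instead of transcode(...),
// and await ffmpegWorker.run(...) inside transcode so the chain actually
// waits for each command to finish.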