
Other articles (98)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...) -
Multilang: improving the interface for multilingual blocks
18 February 2011
Multilang is an additional plugin that is not activated by default when MediaSPIP is initialized.
After activation, a preconfiguration is automatically set up by MediaSPIP init, making the new feature operational right away. There is therefore no need to go through a configuration step for this. -
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (15435)
-
Saving an audio blob into the backend or Azure as an mp3 file using ffmpeg
2 June 2021, by Anne
I have an ASP.NET Web Forms application, and I am using JavaScript and navigator.mediaDevices.getUserMedia to record an audio message.
This message has to be loaded into Azure once recorded.


So far, I have:
Two buttons, start and stop, to record the audio blob


At the end of the process, I am trying to use ffmpeg to save the blob as a file in a folder in my application, so that I can then load the file into Azure (I have the code ready for this part).
Or, ideally, save directly to Azure.
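

Getting the blob into a folder on the server does not actually need ffmpeg on the client side: the blob can be posted as-is and converted or forwarded to Azure from there. A minimal sketch, assuming a hypothetical handler URL (/SaveAudio.ashx below is a placeholder, not something from the question), is:

// Minimal sketch: POST the recorded blob to the server with fetch + FormData.
// /SaveAudio.ashx is a placeholder endpoint (e.g. an ASP.NET generic handler).
function uploadToServer(blob) {
  const form = new FormData();
  form.append("audio", blob, "recording.webm");  // third argument is the file name the server sees

  return fetch("/SaveAudio.ashx", {
    method: "POST",
    body: form                                   // the browser sets the multipart headers itself
  }).then(response => {
    if (!response.ok) {
      throw new Error("Upload failed: " + response.status);
    }
    return response;
  });
}

// usage, e.g. from the existing sendData(blob) in place of require("ffmpeg"):
// uploadToServer(blob).then(() => console.log("uploaded")).catch(console.error);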


I have installed ffmpeg in my application using NuGet packages (I tried the Xabe FFmpeg downloader and Accord Video FFMPEG). However, ffmpeg is not recognised when I run the function sendData() and I get this error:
Uncaught Error: Module name "ffmpeg" has not been loaded yet for context: _. Use require([])
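
A note on that error (my reading, not something stated in the question): require() here is RequireJS running in the browser, and its synchronous form only works for modules that are already loaded; the asynchronous form the message suggests looks like the lines below. More fundamentally, the "ffmpeg" npm package being required appears to be a Node.js wrapper around a server-side ffmpeg executable, so it would not do the conversion inside the browser even if the module loaded.

// The asynchronous RequireJS form the error message is pointing at (illustration only):
require(["ffmpeg"], function (ffmpeg) {
  // the module would be available here once RequireJS has loaded it...
});
// ...but the npm "ffmpeg" package wraps a server-side ffmpeg executable,
// so it cannot run inside the browser page in any case.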


My questions are:


- How can I install ffmpeg in an ASP.NET webform and register it on the page?
- Is there another way to save an audio blob to Azure? (See the sketch after this list.)
- Is it possible to save the audio chunks into a memory stream that I can later upload into Azure?
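
On the second and third questions, one route that avoids writing a file on the web server at all is to upload the blob (or a blob assembled in memory from the chunks) straight to Azure Blob Storage over its REST API, using a SAS URL issued by the server. A browser-side sketch, assuming a hypothetical endpoint that returns such a SAS URL (the SAS generation itself is not shown and is not from the question):

// Minimal sketch: PUT the recorded blob directly into Azure Blob Storage.
// /GetUploadSasUrl.ashx is a placeholder endpoint returning a SAS URL of the form
// https://<account>.blob.core.windows.net/<container>/recording.webm?<sas-token>
async function uploadToAzure(blob) {
  const sasUrl = await fetch("/GetUploadSasUrl.ashx").then(r => r.text());

  const response = await fetch(sasUrl, {
    method: "PUT",
    headers: {
      "x-ms-blob-type": "BlockBlob",             // required by the Put Blob REST operation
      "Content-Type": blob.type || "application/octet-stream"
    },
    body: blob
  });

  if (!response.ok) {
    throw new Error("Azure upload failed: " + response.status);
  }
}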








Thank you for your help




<code class="echappe-js"><script>&#xA; navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => { handlerFunction(stream) })&#xA;&#xA; record.onclick = e => {&#xA; record.disabled = true;&#xA; stopRecord.disabled = false;&#xA; audioChunks = [];&#xA; rec.start();&#xA; }&#xA;&#xA; stopRecord.onclick = e => {&#xA; record.disabled = false;&#xA; stop.disabled = true;&#xA; rec.stop();&#xA; }&#xA;&#xA;&#xA; function handlerFunction(stream) {&#xA; rec = new MediaRecorder(stream);&#xA; rec.ondataavailable = e => {audioChunks.push(e.data);&#xA; if (rec.state == "inactive") {&#xA; let blob = new Blob(audioChunks, { type: &#x27;audio/mpeg-3&#x27; });&#xA; recordedAudio.src = URL.createObjectURL(blob);&#xA; recordedAudio.controls = true;&#xA; sendData(blob)&#xA; }&#xA; }&#xA; }&#xA;&#xA; function sendData(data) {&#xA; var ffmepg = require("ffmpeg");&#xA; try {&#xA; var Path = data;&#xA; var process = new ffmepg("Path");&#xA; process.then(function (audio) {audio.fnExtractSoundToMP3("~//AppData//Audio//test.mp3", function (error, file) {&#xA; if (!error)&#xA; console.log("Audio file: " &#x2B; file);&#xA; });&#xA; }, function (err) {&#xA; console.log("Error: " &#x2B; err);&#xA; });&#xA; }&#xA; catch (e) {&#xA; console.log("Catch e.code" &#x2B; e.code);&#xA; console.log("Catch e.msg" &#x2B; e.msg);&#xA; }&#xA; }&#xA; </script>


<script src="https://code.jquery.com/jquery-2.2.0.min.js"></script>

<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js"></script>





 
 
 
 
 <code class="echappe-js"><script src='http://stackoverflow.com/Scripts/require.js'></script>















 











-
Video streaming to YouTube using JavaScript and Java
28 September 2020, by user1597121
I'm trying to stream live video from a user's browser to YouTube Live. I already have the following working:


- Capture video from the webcam using navigator.mediaDevices.getUserMedia
- Send video data to the server via WebSocket by periodically invoking this function:






function getFrame(video)
{
 var canvas = document.createElement('canvas');
 canvas.width = video.videoWidth;
 canvas.height = video.videoHeight;
 canvas.getContext('2d').drawImage(video, 0, 0);

 return canvas.toDataURL('image/png', 1);
}



- Creating a live broadcast and stream on YouTube via their API and receiving the RTMP info where they expect the video stream to be sent.




This is where I seem to be stuck. I'm not sure how to send the video data from my Java server to YouTube's RTMP endpoint. I've looked into using Red5 or ffmpeg, but haven't been able to find an example where the data is continually being sent via WebSocket. Rather, there's always some "stream" that is being redirected to YouTube, coming in on a dedicated port, or perhaps from a pre-recorded video file.
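
One pattern that matches this setup (a sketch under assumptions, not a tested solution): spawn ffmpeg as a child process with its input set to stdin, and write every frame received over the WebSocket into that stdin, letting ffmpeg encode and push an FLV stream to YouTube's RTMP ingest URL. The sketch below is in Node.js to keep it short; the same shape works from a Java server with ProcessBuilder and the process's OutputStream. STREAM_KEY, the WebSocket port and the 30 fps input rate are placeholders.

// Sketch: receive base64 PNG frames over WebSocket, pipe them into ffmpeg's stdin,
// and let ffmpeg push the encoded stream to YouTube's RTMP ingest.
// Assumes an ffmpeg binary on PATH and the "ws" npm package.
const { spawn } = require("child_process");
const WebSocket = require("ws");

const rtmpUrl = "rtmp://a.rtmp.youtube.com/live2/STREAM_KEY";  // STREAM_KEY is a placeholder

const ffmpeg = spawn("ffmpeg", [
  "-f", "image2pipe",        // demux a sequence of images arriving on stdin
  "-framerate", "30",        // assumed input rate; match how often getFrame() is sent
  "-i", "pipe:0",
  "-f", "lavfi",
  "-i", "anullsrc=channel_layout=stereo:sample_rate=44100",  // silent audio (YouTube expects an audio track)
  "-c:v", "libx264",
  "-pix_fmt", "yuv420p",
  "-c:a", "aac",
  "-shortest",               // stop when the image stream ends
  "-f", "flv",               // RTMP carries FLV
  rtmpUrl
]);
ffmpeg.stderr.on("data", d => process.stderr.write(d));

const wss = new WebSocket.Server({ port: 8081 });
wss.on("connection", socket => {
  socket.on("message", message => {
    // the browser sends canvas.toDataURL("image/png"), i.e. "data:image/png;base64,...."
    const base64 = message.toString().split(",")[1];
    ffmpeg.stdin.write(Buffer.from(base64, "base64"));
  });
  socket.on("close", () => ffmpeg.stdin.end());
});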


I have very limited knowledge of how video streaming works, so that's presumably making things more difficult than they should be. I'd really appreciate some help with getting this figured out. Thank you!


-
Create a timelapse video using the MediaRecorder API (and ffmpeg?)
24 August 2022, by The Blind Hawk
Summary


I have a version of my code already working on Chrome and Edge (Mac, Windows and Android), but I need some fixes for it to work on iOS (Safari/Chrome).

My objective is to record around 25 minutes and download a timelapse version of the recording.

Final product requirements:

speed: 3 fps
length: ~25 s

(I need to record one frame every 20 seconds for 25 minutes: 25 minutes = 1500 s, so one frame every 20 s gives 75 frames, which at 3 fps plays back in 25 s.)



this.secondStream settings:


this.secondStream = await navigator.mediaDevices.getUserMedia({
 audio: false,
 video: {width: 430, height: 430, facingMode: "user"}
});



My code for iOS so far:


startIOSVideoRecording: function() {
 console.log("setting up recorder");
 var self = this;
 this.data = [];

 if (MediaRecorder.isTypeSupported('video/mp4')) {
 // IOS does not support webm, so I will be using mp4
 var options = {mimeType: 'video/mp4', videoBitsPerSecond : 1000000};
 } else {
 console.log("ERROR: mp4 is not supported, trying to default to webm");
 var options = {mimeType: 'video/webm'};
 }
 console.log("options settings:");
 console.log(options);

 this.recorder = new MediaRecorder(this.secondStream, options);

 this.recorder.ondataavailable = function(evt) {
 if (evt.data && evt.data.size > 0) {
 self.data.push(evt.data);
 console.log('chunk size: ' + evt.data.size);
 }
 }

 this.recorder.onstop = function(evt) {
 console.log('recorder stopping');
 var blob = new Blob(self.data, {type: "video/mp4"});
 self.download(blob, "mp4");
 self.sendMail(blob);
 }

 console.log("finished setup, starting")
 this.recorder.start(1200);

 function sleep(ms) { return new Promise(resolve => setTimeout(resolve, ms));}

 async function looper() {
 // I am trying to pick one second every 20 more or less
 await sleep(500);
 self.recorder.pause();
 await sleep(18000);
 self.recorder.resume();
 looper();
 }
 looper();
 },



Issues


Only one call to getUserMedia()


I am already using this.secondStream elsewhere, and I need its settings to stay as they are for that other functionality.

On Chrome and Edge, I could just call getUserMedia() again with different settings and the issue would be solved, but on iOS calling getUserMedia() a second time kills the first stream.

The settings that I was planning to use (they work for Chrome and Edge):

navigator.mediaDevices.getUserMedia({
 audio: false,
 video: { 
 width: 360, height: 240, facingMode: "user", 
 frameRate: { min:0, ideal: 0.05, max:0.1 } 
 },
});



The timelapse library I am using does not support mp4 (ffmpeg as an alternative?)


I am apparently forced to use mp4 on iOS, but that prevents me from using the library I was relying on, so I need an alternative.

I am thinking of using ffmpeg, but cannot find any documentation on making it interact with the blob before the download.

I do not want to edit the video after downloading it; I want to download the already edited version, so no terminal commands.
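
One way to run ffmpeg against the blob entirely in the browser, before anything is downloaded, is ffmpeg.wasm (@ffmpeg/ffmpeg). A minimal sketch using its older createFFmpeg API (0.11.x); the 50x speed-up factor and file names are placeholders, and memory limits on iOS Safari may be a constraint for long recordings:

// Sketch: turn the recorded mp4 blob into a sped-up timelapse in the browser
// with ffmpeg.wasm, then hand back a new Blob ready for download().
// Assumes the @ffmpeg/ffmpeg 0.11.x API (createFFmpeg / fetchFile).
import { createFFmpeg, fetchFile } from "@ffmpeg/ffmpeg";

const ffmpeg = createFFmpeg({ log: true });

async function makeTimelapse(recordedBlob) {
  if (!ffmpeg.isLoaded()) {
    await ffmpeg.load();                       // downloads the wasm core on first use
  }

  // write the recording into ffmpeg's in-memory file system
  ffmpeg.FS("writeFile", "input.mp4", await fetchFile(recordedBlob));

  // speed the video up 50x and force a 3 fps output (placeholder values)
  await ffmpeg.run(
    "-i", "input.mp4",
    "-vf", "setpts=PTS/50",
    "-r", "3",
    "-an",                                     // drop audio
    "output.mp4"
  );

  const data = ffmpeg.FS("readFile", "output.mp4");
  return new Blob([data.buffer], { type: "video/mp4" });
}

// usage inside recorder.onstop, instead of the timelapse library:
// makeTimelapse(blob).then(timelapseBlob => self.download(timelapseBlob, "mp4"));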

MediaRecorder pause and resume are not ideal


On Chrome and Edge I would keep one frame every 20 seconds by setting the frameRate to 0.05, but this does not seem to work on iOS, for two reasons.

The first is related to the first issue: I cannot change the settings of getUserMedia() without destroying the initial stream in the first place.

And even after changing the settings, it seems that setting the frame rate below 1 is not supported on iOS. Maybe I wrote something else wrong, but I was not able to open the downloaded file.

I therefore tried relying on pausing and resuming the MediaRecorder, but this brings two further issues:

I am currently saving 1 second of video every 20 seconds rather than 1 frame every 20 seconds, and I cannot find any workaround.

Pause and resume take a little time, which makes the code unreliable: I sometimes capture 2 seconds out of 20 instead of 1, and I have no guarantee that the loop actually runs every 20 seconds (it might be 18, it might be 25).
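
A possible way around both the sub-1 fps limit and the pause/resume drift (a sketch under assumptions, not verified on iOS): draw one frame from the existing this.secondStream onto a canvas every 20 seconds and record the canvas stream instead of the camera stream. canvas.captureStream(0) produces a track that only emits a frame when requestFrame() is called, so the recording contains exactly one frame per tick. Whether iOS Safari honours a frame rate of 0 here is something to test; sourceVideo is assumed to be a video element already playing this.secondStream.

// Sketch: record exactly one frame every 20 seconds by drawing the existing
// stream onto a canvas and recording the canvas stream instead of the camera.
function startFrameByFrameRecording(sourceVideo, onBlob) {
  const canvas = document.createElement("canvas");
  canvas.width = 430;                               // mirrors the secondStream settings above
  canvas.height = 430;
  const ctx = canvas.getContext("2d");

  const canvasStream = canvas.captureStream(0);     // 0 fps: frames are only pushed on requestFrame()
  const track = canvasStream.getVideoTracks()[0];

  const recorder = new MediaRecorder(canvasStream, { mimeType: "video/mp4" }); // mp4 as in the iOS code above
  const chunks = [];
  recorder.ondataavailable = e => { if (e.data.size > 0) chunks.push(e.data); };
  recorder.onstop = () => onBlob(new Blob(chunks, { type: "video/mp4" }));
  recorder.start();

  const timer = setInterval(() => {
    ctx.drawImage(sourceVideo, 0, 0, canvas.width, canvas.height);
    if (typeof track.requestFrame === "function") {
      track.requestFrame();                         // push exactly one frame into the recording
    } else if (typeof canvasStream.requestFrame === "function") {
      canvasStream.requestFrame();                  // some browsers expose it on the stream instead
    }
  }, 20000);                                        // one frame every 20 s

  // stop after ~25 minutes
  setTimeout(() => { clearInterval(timer); recorder.stop(); }, 25 * 60 * 1000);
}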

My working code for other platforms


This is my code for the other platforms, hope it helps !

Quick note: you will need to give it a bit of time between setup and start.

The timelapse library is here


 setupVideoRecording: function() {
 let video = { 
 width: 360, height: 240, facingMode: "user", 
 frameRate: { min:0, ideal: 0.05, max:0.1 } 
 };
 navigator.mediaDevices.getUserMedia({
 audio: false,
 video: video,
 }).then((stream) => {
 // this is a video element
 const recVideo = document.getElementById('self-recorder');
 recVideo.muted = true;
 recVideo.autoplay = true;
 recVideo.srcObject = stream;
 recVideo.play();
 });
 },

 startVideoRecording: function() {
 console.log("setting up recorder");
 var self = this;
 this.data = [];

 var video = document.getElementById('self-recorder');

 if (MediaRecorder.isTypeSupported('video/webm; codecs=vp9')) {
 var options = {mimeType: 'video/webm; codecs=vp9'};
 } else if (MediaRecorder.isTypeSupported('video/webm')) {
 var options = {mimeType: 'video/webm'};
 }
 console.log("options settings:");
 console.log(options);

 this.recorder = new MediaRecorder(video.captureStream(), options);

 this.recorder.ondataavailable = function(evt) {
 self.data.push(evt.data);
 console.log('chunk size: ' + evt.data.size);
 }

 this.recorder.onstop = function(evt) {
 console.log('recorder stopping');
 timelapse(self.data, 3, function(blob) {
 self.download(blob, "webm");
 });
 }

 console.log("finished setup, starting");
 this.recorder.start(40000);
 }