
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (45)
-
Customising categories
21 June 2013
Category creation form. For those who know SPIP well, a category can be thought of as a section (rubrique).
For a document of type category, the fields offered by default are: Text.
This form can be modified under:
Administration > Configuration des masques de formulaire.
For a document of type media, the fields not displayed by default are: Short description (Descriptif rapide).
It is also in this configuration section that you can specify the (...)
-
Contribute to a better visual interface
13 April 2011
MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.
-
Libraries and binaries specific to video and audio processing
31 January 2010
The following software and libraries are used by SPIPmotion in one way or another.
Required binaries: FFMpeg: the main encoder, which transcodes almost all video and audio file types into formats readable on the Internet (see this tutorial for its installation); Oggz-tools: tools for inspecting ogg files; Mediainfo: retrieves information from most video and audio formats;
Complementary, optional binaries: flvtool2: (...)
On other sites (6879)
-
Saving an audio blob into the backend or Azure as an mp3 file using ffmpeg
2 June 2021, by Anne
I have an ASP.NET WebForms application, and I am using JavaScript and navigator.mediaDevices.getUserMedia to record an audio message.
This message has to be uploaded to Azure once it is recorded.

So far, I have:
two buttons, start and stop, to record the audio blob.

At the end of the process, I am trying to use ffmpeg to save the blob as a file into a folder in my application; I can then upload the file to Azure (I have the code ready for that part).
Or, ideally, save it directly to Azure.

I have installed ffmpeg in my application using NuGet packages (I tried the Xabe FFmpeg downloader and Accord Video FFMPEG), however ffmpeg is not recognised when I run the function sendData() and I get this error:
Uncaught Error: Module name "ffmpeg" has not been loaded yet for context: _. Use require([])


My questions are:

- How can I install ffmpeg in an ASP.NET WebForms application and register it on the page?
- Is there another way to save an audio blob to Azure? (see the sketch after the code below)
- Is it possible to save the audio chunks into a memory stream that I can later upload to Azure?




Thank you for your help




<code class="echappe-js"><script>&#xA; navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => { handlerFunction(stream) })&#xA;&#xA; record.onclick = e => {&#xA; record.disabled = true;&#xA; stopRecord.disabled = false;&#xA; audioChunks = [];&#xA; rec.start();&#xA; }&#xA;&#xA; stopRecord.onclick = e => {&#xA; record.disabled = false;&#xA; stop.disabled = true;&#xA; rec.stop();&#xA; }&#xA;&#xA;&#xA; function handlerFunction(stream) {&#xA; rec = new MediaRecorder(stream);&#xA; rec.ondataavailable = e => {audioChunks.push(e.data);&#xA; if (rec.state == "inactive") {&#xA; let blob = new Blob(audioChunks, { type: &#x27;audio/mpeg-3&#x27; });&#xA; recordedAudio.src = URL.createObjectURL(blob);&#xA; recordedAudio.controls = true;&#xA; sendData(blob)&#xA; }&#xA; }&#xA; }&#xA;&#xA; function sendData(data) {&#xA; var ffmepg = require("ffmpeg");&#xA; try {&#xA; var Path = data;&#xA; var process = new ffmepg("Path");&#xA; process.then(function (audio) {audio.fnExtractSoundToMP3("~//AppData//Audio//test.mp3", function (error, file) {&#xA; if (!error)&#xA; console.log("Audio file: " &#x2B; file);&#xA; });&#xA; }, function (err) {&#xA; console.log("Error: " &#x2B; err);&#xA; });&#xA; }&#xA; catch (e) {&#xA; console.log("Catch e.code" &#x2B; e.code);&#xA; console.log("Catch e.msg" &#x2B; e.msg);&#xA; }&#xA; }&#xA; </script>


<script src="https://code.jquery.com/jquery-2.2.0.min.js"></script>

<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js"></script>





 
 
 
 
 <code class="echappe-js"><script src='http://stackoverflow.com/Scripts/require.js'></script>




-
How to convert n images into a video slideshow with a transition on each image using flutter_ffmpeg?
26 April 2023, by Shruti Ramnandan Sharma
I'm trying to convert images into a video slideshow with some transitions. I have used the flutter_ffmpeg plugin; with it I have created a slideshow from only 3 images, but I want to use n images and couldn't find a solution for that (see the sketch after the code below). When I tried to add more images it threw an error.


I'll add some pieces of code below:


- Instance of FlutterFFmpeg:




final FlutterFFmpeg _flutterFFmpeg = FlutterFFmpeg(); 



- Picking multiple images from the gallery and applying the command to convert the images into a video:




void mergeImageVideo() async {
  final ImagePicker picker = ImagePicker();
  final images = await picker.pickMultiImage();

  var im1 = images[0].path;
  var im2 = images[1].path;
  var im3 = images[2].path;

  //-shortest
  openLoadingDialog(context, "Creating video");
  generateFilePath("ImageVideoMerge").then((outputFile) {
    var backgroundColor = 'FFF44336';

    String ffmpegCommand = VideoUtil.generateImagesToVideoScript(
        im1, im2, im3, outputFile, 'mpeg4', '', backgroundColor);

    print("ffmpegCommand==> $ffmpegCommand");
    _flutterFFmpeg.execute(ffmpegCommand).then((v) {
      if (v == 0) {
        var file = File(outputFile);
        Navigator.pop(context);
        showSnackBar(context, "Images Added Successfully", Colors.black);
        updateVideoPlayer(file);
      } else {
        Navigator.pop(context);
      }
    }).then((executionId) {});
  });
}




- Method that builds the ffmpeg command (images into a video with a scale transition):


static String generateImagesToVideoScript(
 String image1Path,
 String image2Path,
 String image3Path,
 String videoFilePath,
 String videoCodec,
 String customOptions,
 String backgroundColor
 ) {

 // ignore: prefer_interpolation_to_compose_strings
 return "-hide_banner -y -loop 1 -i '" +
 image1Path +
 "' " +
 "-loop 1 -i \"" +
 image2Path +
 "\" " +
 "-loop 1 -i \"" +
 image3Path +
 "\" " +
 "-filter_complex " +
 "\" " +
 "[0:v]setpts=PTS-STARTPTS,scale=w=640:h=424,setsar=sar=1/1,split=2[stream1out1][stream1out2];" +
 "[1:v]setpts=PTS-STARTPTS,scale=w=640:h=424,setsar=sar=1/1,split=2[stream2out1][stream2out2];" +
 "[2:v]setpts=PTS-STARTPTS,scale=w=640:h=424,setsar=sar=1/1,split=2[stream3out1][stream3out2];" +
 "[stream1out1]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=3,select=lte(n\\,90)[stream1overlaid];" +
 "[stream1out2]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=1,select=lte(n\\,30)[stream1ending];" +
 "[stream2out1]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=2,select=lte(n\\,60)[stream2overlaid];" +
 "[stream2out2]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=1,select=lte(n\\,30),split=2[stream2starting][stream2ending];" +
 "[stream3out1]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=2,select=lte(n\\,60)[stream3overlaid];" +
 "[stream3out2]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=1,select=lte(n\\,30)[stream3starting];" +
 "[stream2starting][stream1ending]blend=all_expr='if(gte(X,(W/2)*T/1)*lte(X,W-(W/2)*T/1),B,A)':shortest=1[stream2blended];" +
 "[stream3starting][stream2ending]blend=all_expr='if(gte(X,(W/2)*T/1)*lte(X,W-(W/2)*T/1),B,A)':shortest=1[stream3blended];" +
 "[stream1overlaid][stream2blended][stream2overlaid][stream3blended][stream3overlaid]concat=n=5:v=1:a=0,scale=w=640:h=424,format=yuv420p[video]\"" +
 " -map [video] -vsync 2 -async 1 " +
 customOptions +
 "-c:v " +
 videoCodec +
 " -r 30 " +
 videoFilePath;
 }
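flutter_ffmpeg executes the whole command as a single string, so supporting n images is mostly a matter of building that string in a loop instead of hard-coding three inputs. Below is a minimal sketch of that construction, written in JavaScript only for compactness; the same loop ports almost line-for-line to a Dart helper. It is not the question's original filter graph: it swaps the hand-built pad/blend/concat chain for ffmpeg's xfade filter, and the 640x424 frame size, 3-second clip length and 1-second fade are assumed values.

// Minimal sketch (assumes at least 2 images, each shown for clipDur seconds,
// all frames scaled to 640x424): build an ffmpeg command string for n images
// joined with crossfades via the xfade filter.
function buildSlideshowCommand(imagePaths, outputPath, clipDur = 3, fadeDur = 1) {
  // One looped image input per path, each limited to clipDur seconds.
  const inputs = imagePaths
    .map((p) => `-loop 1 -t ${clipDur} -framerate 30 -i "${p}"`)
    .join(' ');

  // Normalise every input to the same size, SAR, frame rate and pixel format,
  // which xfade requires.
  const scaled = imagePaths
    .map((_, i) => `[${i}:v]scale=640:424,setsar=1,fps=30,format=yuv420p[v${i}]`)
    .join(';');

  // Chain xfade pairwise; each fade starts fadeDur seconds before the end of
  // the accumulated stream, i.e. at offset i * (clipDur - fadeDur).
  let chain = '';
  let last = '[v0]';
  for (let i = 1; i < imagePaths.length; i++) {
    const out = i === imagePaths.length - 1 ? '[video]' : `[x${i}]`;
    chain += `;${last}[v${i}]xfade=transition=fade:duration=${fadeDur}:offset=${i * (clipDur - fadeDur)}${out}`;
    last = `[x${i}]`;
  }

  return `-y ${inputs} -filter_complex "${scaled}${chain}" ` +
         `-map "[video]" -c:v mpeg4 -r 30 "${outputPath}"`;
}

The resulting string can then be passed to _flutterFFmpeg.execute() exactly like the generateImagesToVideoScript output above.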




-
send browser camera to rtmp server
22 June 2022, by anamul
I am using Python on the backend and JavaScript on the frontend.

I am using ffmpeg to push video to an RTMP server:

ffmpeg -y -f mp4 -stream_loop -1 -re -i myvideo.mp4
-b:v 3M -vcodec libx264 -r 30.0 -s 1280x720
-c:a copy -b:a 96k
-f flv "rtmps://dfdf222.global-contribute.live-video.net:443/app/"



It is working well, but I cannot figure out how to push my browser camera to the RTMP server, i.e. from JavaScript.


I am trying it like this:



 <video autoplay="autoplay"></video>

 <a href="#">open and stream</a>

 <code class="echappe-js"><script>&#xA; function streamVideo () {&#xA; console.log(&#x27;ddd&#x27;)&#xA; navigator.mediaDevices.getUserMedia({&#xA; video: {&#xA;&#xA; },&#xA; audio: {&#xA; &#xA; }&#xA; }).then(stream=>{&#xA; console.log(stream)&#xA; document.getElementById(&#x27;vid&#x27;).srcObject = stream; &#xA; })&#xA;&#xA; }&#xA; </script>




It is working, but I have not found any way to push from the browser to the RTMP server. I know the browser doesn't support the RTMP protocol and we need to send it from the backend, but can I push the stream to my server so that the server then pushes it to the RTMP server?


Does anyone know the way?
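One pattern often used for this, sketched below under stated assumptions: record the camera with MediaRecorder, send the encoded chunks to the backend over a WebSocket, and have the backend pipe everything it receives into ffmpeg's standard input (for example ffmpeg -i - -c:v libx264 -c:a aac -f flv "rtmps://..."), which then pushes to the RTMP endpoint. The wss://example.com/stream endpoint, the video/webm mime type and the 1-second chunk interval are assumptions, not details from the question.

// Minimal browser-side sketch, assuming a backend WebSocket endpoint at
// wss://example.com/stream that pipes every binary message it receives into ffmpeg's stdin.
async function streamToRtmpViaBackend() {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  document.getElementById('vid').srcObject = stream;    // local preview

  const ws = new WebSocket('wss://example.com/stream'); // hypothetical endpoint
  ws.binaryType = 'arraybuffer';

  // MediaRecorder produces a WebM stream in most browsers; codec support varies.
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });

  // Forward each encoded chunk to the backend as soon as it is available.
  recorder.ondataavailable = (e) => {
    if (e.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(e.data);
    }
  };

  // Emit a chunk roughly every second once the socket is open.
  ws.onopen = () => recorder.start(1000);
}

WebRTC-based setups with a media server are another option; the sketch above is simply the most direct fit for the ffmpeg command already shown.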