
Media (1)
-
Revolution of Open-source and film making towards open film making
6 October 2011
Updated: July 2013
Language: English
Type: Text
Other articles (67)
-
Improving the base version
13 September 2013
A nicer multiple select
The Chosen plugin improves the usability of multiple-select fields. See the two images below to compare.
To use it, simply enable the Chosen plugin (General site configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-select lists (...)
Custom menus
14 November 2010
MediaSPIP uses the Menus plugin to manage several configurable menus for navigation.
This lets channel administrators fine-tune the configuration of these menus.
Menus created when the site is initialised
By default, three menus are created automatically when the site is initialised: the main menu; identifier: barrenav; this menu is generally inserted at the top of the page, after the header block, and its identifier makes it compatible with templates based on Zpip; (...)
The plugin: mutualisation management
2 March 2010
The mutualisation management plugin manages the various MediaSPIP channels from a master site. Its purpose is to provide a pure SPIP solution to replace the old one.
Basic installation
Install the SPIP files on the server.
Then add the "mutualisation" plugin at the root of the site, as described here.
Customise the central mes_options.php file as you wish. Here, as an example, is the one used by the mediaspip.net platform:
<?php (...)
On other sites (9953)
-
Send browser camera to RTMP server
22 June 2022, by anamul
I am using Python, and JavaScript on the frontend.

I am using ffmpeg to push video to an RTMP server:

ffmpeg -y -f mp4 -stream_loop -1 -re -i myvideo.mp4
-b:v 3M -vcodec libx264 -r 30.0 -s 1280x720
-c:a copy -b:a 96k
-f flv "rtmps://dfdf222.global-contribute.live-video.net:443/app/"



This works well, but I cannot figure out how to push my browser camera to the RTMP server instead. I mean, from JavaScript?


I am trying it like this:



 <video id="vid" autoplay="autoplay"></video>

 <a href="#" onclick="streamVideo()">open and stream</a>

<script>
  function streamVideo() {
    console.log('ddd')
    navigator.mediaDevices.getUserMedia({
      video: {},
      audio: {}
    }).then(stream => {
      console.log(stream)
      document.getElementById('vid').srcObject = stream;
    })
  }
</script>




This works, but I have not found any way to push from the browser to the RTMP server. I know the browser does not support the RTMP protocol and that the stream has to be sent from the backend, but can I push it to my server so that the server pushes it on to the RTMP server?


Does anyone know a way to do this?
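One common pattern (not shown in the question) is to capture the camera with MediaRecorder in the browser, send the encoded chunks to the backend over a WebSocket, and have the backend pipe them into an ffmpeg process that publishes to the RTMP URL. Below is a minimal sketch of the Python side, assuming the websockets package (version 11 or later) and that the browser sends video/webm chunks; the port is a placeholder, and the RTMP URL is the one from the question:

import asyncio
import subprocess
import websockets  # pip install websockets

RTMP_URL = "rtmps://dfdf222.global-contribute.live-video.net:443/app/"

async def relay(websocket):
    # One ffmpeg process per connection: read WebM from stdin, push FLV to RTMP.
    ffmpeg = subprocess.Popen(
        ["ffmpeg", "-i", "pipe:0",
         "-c:v", "libx264", "-preset", "veryfast", "-b:v", "3M",
         "-c:a", "aac", "-b:a", "96k",
         "-f", "flv", RTMP_URL],
        stdin=subprocess.PIPE,
    )
    try:
        async for chunk in websocket:  # each binary message is one recorded blob
            ffmpeg.stdin.write(chunk)
    finally:
        ffmpeg.stdin.close()
        ffmpeg.wait()

async def main():
    async with websockets.serve(relay, "0.0.0.0", 8765):
        await asyncio.Future()  # run forever

asyncio.run(main())

On the browser side, instead of only attaching the stream to the video element, a MediaRecorder created from the getUserMedia stream would send each ondataavailable blob over the WebSocket.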


-
How to convert n images into a video slideshow with a transition on each image using flutter_ffmpeg?
26 April 2023, by Shruti Ramnandan Sharma
I'm trying to convert images into a video slideshow with some transitions. I have used the flutter_ffmpeg plugin; with it I have created a slideshow from only 3 images, but I want to use n images and couldn't find a solution for that. When I tried to add more images, it threw an error.


I'll add some pieces of the code below:


- Instance of FlutterFFmpeg

final FlutterFFmpeg _flutterFFmpeg = FlutterFFmpeg();



- Picking multiple images from the gallery and running the command that converts the images into a video




void mergeImageVideo() async {
  final ImagePicker picker = ImagePicker();
  final images = await picker.pickMultiImage();

  var im1 = images[0].path;
  var im2 = images[1].path;
  var im3 = images[2].path;

  //-shortest
  openLoadingDialog(context, "Creating video");
  generateFilePath("ImageVideoMerge").then((outputFile) {
    var backgroundColor = 'FFF44336';

    String ffmpegCommand = VideoUtil.generateImagesToVideoScript(
        im1, im2, im3, outputFile, 'mpeg4', '', backgroundColor);

    print("ffmpegCommand==> $ffmpegCommand");
    _flutterFFmpeg.execute(ffmpegCommand).then((v) {
      if (v == 0) {
        var file = File(outputFile);
        Navigator.pop(context);
        showSnackBar(context, "Images Added Successfully", Colors.black);
        updateVideoPlayer(file);
      } else {
        Navigator.pop(context);
      }
    }).then((executionId) {});
  });
}




Method that builds the ffmpeg command (images into a video with a scale transition):


static String generateImagesToVideoScript(
    String image1Path,
    String image2Path,
    String image3Path,
    String videoFilePath,
    String videoCodec,
    String customOptions,
    String backgroundColor) {
  // ignore: prefer_interpolation_to_compose_strings
  return "-hide_banner -y -loop 1 -i '" +
      image1Path +
      "' " +
      "-loop 1 -i \"" +
      image2Path +
      "\" " +
      "-loop 1 -i \"" +
      image3Path +
      "\" " +
      "-filter_complex " +
      "\" " +
      "[0:v]setpts=PTS-STARTPTS,scale=w=640:h=424,setsar=sar=1/1,split=2[stream1out1][stream1out2];" +
      "[1:v]setpts=PTS-STARTPTS,scale=w=640:h=424,setsar=sar=1/1,split=2[stream2out1][stream2out2];" +
      "[2:v]setpts=PTS-STARTPTS,scale=w=640:h=424,setsar=sar=1/1,split=2[stream3out1][stream3out2];" +
      "[stream1out1]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=3,select=lte(n\\,90)[stream1overlaid];" +
      "[stream1out2]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=1,select=lte(n\\,30)[stream1ending];" +
      "[stream2out1]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=2,select=lte(n\\,60)[stream2overlaid];" +
      "[stream2out2]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=1,select=lte(n\\,30),split=2[stream2starting][stream2ending];" +
      "[stream3out1]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=2,select=lte(n\\,60)[stream3overlaid];" +
      "[stream3out2]pad=width=640:height=427:x=(640-iw)/2:y=(427-ih)/2:color=#$backgroundColor,trim=duration=1,select=lte(n\\,30)[stream3starting];" +
      "[stream2starting][stream1ending]blend=all_expr='if(gte(X,(W/2)*T/1)*lte(X,W-(W/2)*T/1),B,A)':shortest=1[stream2blended];" +
      "[stream3starting][stream2ending]blend=all_expr='if(gte(X,(W/2)*T/1)*lte(X,W-(W/2)*T/1),B,A)':shortest=1[stream3blended];" +
      "[stream1overlaid][stream2blended][stream2overlaid][stream3blended][stream3overlaid]concat=n=5:v=1:a=0,scale=w=640:h=424,format=yuv420p[video]\"" +
      " -map [video] -vsync 2 -async 1 " +
      customOptions +
      "-c:v " +
      videoCodec +
      " -r 30 " +
      videoFilePath;
}
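The 3-image graph above hard-codes every stream label, which is why adding more images breaks it. A more scalable approach is to generate the inputs and the filter chain in a loop, for example with ffmpeg's xfade filter (available in ffmpeg 4.3+, which recent flutter_ffmpeg builds bundle). Below is a minimal sketch of that command-builder logic, written in Python for brevity; the function name, image size and timings are illustrative, and the same loop translates directly to Dart:

# Hedged sketch: build an ffmpeg command for N images with an xfade transition.
# Assumes at least two images and an ffmpeg build with the xfade filter.
def build_slideshow_command(image_paths, out_path, show=3.0, fade=1.0):
    n = len(image_paths)

    # One looped, fixed-duration input per image.
    inputs = "".join(f"-loop 1 -t {show} -i '{p}' " for p in image_paths)

    # Normalise every image to the same size, SAR and pixel format so xfade accepts them.
    filters = [
        f"[{i}:v]scale=640:424,setsar=1,format=yuv420p[v{i}]" for i in range(n)
    ]

    # Chain xfade: the k-th transition starts k*(show - fade) seconds into the output.
    prev = "v0"
    for k in range(1, n):
        out = f"x{k}" if k < n - 1 else "video"
        filters.append(
            f"[{prev}][v{k}]xfade=transition=slideleft:"
            f"duration={fade}:offset={k * (show - fade)}[{out}]"
        )
        prev = out

    filter_complex = ";".join(filters)
    return (
        f"-hide_banner -y {inputs}"
        f'-filter_complex "{filter_complex}" '
        f"-map [video] -c:v mpeg4 -r 30 '{out_path}'"
    )

The design point is simply that the command is data-driven: the number of -loop inputs, the per-image filter labels, and the chained transitions are all produced from the picked image list, so the same builder works for 3 images or 30.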




-
Saving an audio blob into the backend or Azure as an mp3 file using ffmpeg
2 June 2021, by Anne
I have an ASP.NET WebForms application, and I am using JavaScript and navigator.mediaDevices.getUserMedia to record an audio message.
This message has to be uploaded to Azure once recorded.


So far, I have two buttons, start and stop, to record the audio blob.


At the end of the process, I am trying to use ffmpeg to save the blob into a folder in my application; I can then upload the file to Azure (I have the code ready for that part). Or, ideally, save it directly to Azure.


I have installed ffmpeg in my application using NuGet packages (I tried the Xabe FFmpeg downloader and Accord Video FFMPEG); however, ffmpeg is not recognised when I run the function sendData() and I get this error:
Uncaught Error: Module name "ffmpeg" has not been loaded yet for context: _. Use require([])


My questions are:

- How can I install ffmpeg in an ASP.NET WebForm and register it on the page?
- Is there another way to save an audio blob to Azure?
- Is it possible to save the audio chunks into a memory stream that I can later upload to Azure?








Thank you for your help




<script>
  navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => { handlerFunction(stream) })

  record.onclick = e => {
    record.disabled = true;
    stopRecord.disabled = false;
    audioChunks = [];
    rec.start();
  }

  stopRecord.onclick = e => {
    record.disabled = false;
    stop.disabled = true;
    rec.stop();
  }

  function handlerFunction(stream) {
    rec = new MediaRecorder(stream);
    rec.ondataavailable = e => {
      audioChunks.push(e.data);
      if (rec.state == "inactive") {
        let blob = new Blob(audioChunks, { type: 'audio/mpeg-3' });
        recordedAudio.src = URL.createObjectURL(blob);
        recordedAudio.controls = true;
        sendData(blob)
      }
    }
  }

  function sendData(data) {
    var ffmepg = require("ffmpeg");
    try {
      var Path = data;
      var process = new ffmepg("Path");
      process.then(function (audio) {
        audio.fnExtractSoundToMP3("~//AppData//Audio//test.mp3", function (error, file) {
          if (!error)
            console.log("Audio file: " + file);
        });
      }, function (err) {
        console.log("Error: " + err);
      });
    }
    catch (e) {
      console.log("Catch e.code" + e.code);
      console.log("Catch e.msg" + e.msg);
    }
  }
</script>


<script src="https://code.jquery.com/jquery-2.2.0.min.js"></script>

<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js"></script>





 
 
 
 
<script src='http://stackoverflow.com/Scripts/require.js'></script>




Record
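The error above comes from trying to require() ffmpeg in the browser: ffmpeg is a native executable (or a .NET wrapper around one) and has to run on the server. The usual pattern is therefore to POST the recorded blob to the backend, transcode it there, and push the MP3 to Azure Blob Storage. Below is a minimal sketch of that flow, written in Python rather than ASP.NET WebForms purely to illustrate the steps; the route, container name and connection string are placeholders:

import os
import subprocess
import tempfile

from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob
from flask import Flask, request                  # pip install flask

app = Flask(__name__)
CONNECTION_STRING = "<your Azure storage connection string>"
CONTAINER = "recordings"

@app.route("/upload-audio", methods=["POST"])
def upload_audio():
    # The browser posts the raw recorded blob (e.g. audio/webm from MediaRecorder).
    with tempfile.TemporaryDirectory() as tmp:
        src_path = os.path.join(tmp, "recording.webm")
        dst_path = os.path.join(tmp, "recording.mp3")
        with open(src_path, "wb") as f:
            f.write(request.get_data())

        # Transcode with the ffmpeg binary installed on the server, not a JS module.
        subprocess.run(["ffmpeg", "-y", "-i", src_path, dst_path], check=True)

        # Upload the resulting MP3 to Azure Blob Storage.
        service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
        blob = service.get_blob_client(container=CONTAINER, blob="test.mp3")
        with open(dst_path, "rb") as f:
            blob.upload_blob(f, overwrite=True)

    return "ok", 200

With this layout, the browser-side sendData() would simply send the blob to the backend, for example fetch('/upload-audio', { method: 'POST', body: data }), instead of calling require("ffmpeg").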