Advanced search

Media (2)

Keyword: - Tags -/plugins

Other articles (67)

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. Compare the two following images.
    To use it, enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Custom menus

    14 November 2010, by

    MediaSPIP uses the Menus plugin to manage several configurable navigation menus.
    This lets channel administrators configure these menus precisely.
    Menus created at site initialization
    By default, three menus are created automatically when the site is initialized: The main menu; Identifier: barrenav; This menu is generally inserted at the top of the page, after the header block; its identifier makes it compatible with templates based on Zpip; (...)

  • The plugin: Mutualisation management

    2 March 2010, by

    The Mutualisation management plugin makes it possible to manage the various mediaspip channels from a master site. Its goal is to provide a pure-SPIP replacement for the old solution.
    Basic installation
    Install the SPIP files on the server.
    Then add the "mutualisation" plugin at the root of the site, as described here.
    Customize the central mes_options.php file as you wish. As an example, here is the one from the mediaspip.net platform:
    <?php (...)

On other websites (8431)

  • Is there a way to work with the fluent-ffmpeg library for audio encoding (webm to wav/mp3) in Angular 2+?

    24 April 2019, by binarysynthesis

    So I'm using the browser's media APIs to record audio from the microphone for speech-to-text transcription. The recorded media gives me a blob/file in webm format. I want to convert the blob/file to wav or mp3 format and send it to AWS S3 storage, from where I intend to use the AWS Transcribe (speech-to-text) service to pick up the file and produce a transcript of the speech. AWS Transcribe doesn't support the webm format, so I need to encode the audio on the client side to either wav or mp3. I'm trying to use the fluent-ffmpeg (third-party) npm library [https://www.npmjs.com/package/fluent-ffmpeg] to accomplish that, but I keep getting the following errors in the TypeScript compiler when building with ng serve. I have already tried the RecorderJS and WebAudioRecorderJS npm libraries and I get the same 'Module not found' error. -

    ERROR in ./node_modules/fluent-ffmpeg/index.js
    Module not found: Error: Can't resolve './lib-cov/fluent-ffmpeg' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg'
    ERROR in ./node_modules/fluent-ffmpeg/lib/ffprobe.js
    Module not found: Error: Can't resolve 'child_process' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/fluent-ffmpeg/lib/processor.js
    Module not found: Error: Can't resolve 'child_process' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/fluent-ffmpeg/lib/utils.js
    Module not found: Error: Can't resolve 'child_process' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/fluent-ffmpeg/lib/recipes.js
    Module not found: Error: Can't resolve 'fs' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/fluent-ffmpeg/lib/capabilities.js
    Module not found: Error: Can't resolve 'fs' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/fluent-ffmpeg/lib/processor.js
    Module not found: Error: Can't resolve 'fs' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/isexe/index.js
    Module not found: Error: Can't resolve 'fs' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\isexe'
    ERROR in ./node_modules/isexe/windows.js
    Module not found: Error: Can't resolve 'fs' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\isexe'
    ERROR in ./node_modules/isexe/mode.js
    Module not found: Error: Can't resolve 'fs' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\isexe'
    ERROR in ./node_modules/fluent-ffmpeg/lib/utils.js
    Module not found: Error: Can't resolve 'os' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/fluent-ffmpeg/lib/recipes.js
    Module not found: Error: Can't resolve 'path' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/fluent-ffmpeg/lib/fluent-ffmpeg.js
    Module not found: Error: Can't resolve 'path' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/fluent-ffmpeg/lib/capabilities.js
    Module not found: Error: Can't resolve 'path' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/fluent-ffmpeg/lib/processor.js
    Module not found: Error: Can't resolve 'path' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    ERROR in ./node_modules/fluent-ffmpeg/lib/options/misc.js
    Module not found: Error: Can't resolve 'path' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib\options'
    ERROR in ./node_modules/which/which.js
    Module not found: Error: Can't resolve 'path' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\which'
    ERROR in ./node_modules/fluent-ffmpeg/lib/recipes.js
    Module not found: Error: Can't resolve 'stream' in 'C:\Users\banshuman\Desktop\AWS-Transcribe-Angular\aws-transcribe-angular\node_modules\fluent-ffmpeg\lib'
    i 「wdm」: Failed to compile.

    I am using Angular 7.2.0 and TypeScript 3.2.4.
    I have also installed the type definitions for fluent-ffmpeg [https://www.npmjs.com/package/@types/fluent-ffmpeg] within node_modules to provide the typings for TypeScript.
    Below is my Angular component file, where I have implemented the audio-recording functionality in the browser -

    /// <reference types="@types/dom-mediacapture-record" />
    import { Component } from '@angular/core';
    import * as aws from 'aws-sdk';
    import * as TranscribeService from 'aws-sdk/clients/transcribeservice';
    import * as Ffmpeg from 'fluent-ffmpeg';

    @Component({
     selector: 'app-root',
     templateUrl: './app.component.html',
     styleUrls: ['./app.component.css']
    })
    export class AppComponent {

     speechToText() {

       console.log(Ffmpeg);    

       // Begin streaming audio
       navigator.mediaDevices.getUserMedia({ audio: true })
         .then(stream => {
           const mediaRecorder = new MediaRecorder(stream);
           // Start recording audio
           mediaRecorder.start();

           const audioChunks = [];
           // When recording starts
           mediaRecorder.addEventListener("dataavailable", event => {
             audioChunks.push((<any>event).data);
           });


           // When recording stops
           mediaRecorder.addEventListener("stop", () => {
             const audioBlob = new Blob(audioChunks, { type: 'audio/webm;codecs=opus' });
             const audioFile = new File([audioBlob], 'outputAudioFile');
             const audioUrl = URL.createObjectURL(audioBlob);


    I'm not posting the entire component code, as the rest is part of the AWS SDK and is irrelevant to the problem statement. I need to convert the audioBlob or the audioFile, which are currently in webm format, to wav or mp3 for uploading to the AWS services. How can I achieve that in Angular using the ffmpeg library? I'm open to other solutions as well, not just ffmpeg, to get the job done on the client side.
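    The 'Module not found' errors above all point at Node core modules (child_process, fs, path, os, stream): fluent-ffmpeg spawns a native ffmpeg binary, which cannot work in a browser bundle, so no webpack setting will truly fix this. One ffmpeg-free alternative is to decode the webm blob with the Web Audio API and write a WAV container by hand. Below is a hedged sketch of the header-writing half, which is pure JavaScript; the decode step (browser-only, e.g. `audioCtx.decodeAudioData(await blob.arrayBuffer())` followed by `buffer.getChannelData(0)`) is assumed, and the function name is illustrative:

    ```javascript
    // Hedged sketch: build a mono 16-bit PCM WAV file from float samples.
    // 'samples' is a Float32Array of values in [-1, 1] (e.g. from
    // AudioBuffer.getChannelData(0)); 'sampleRate' comes from the AudioBuffer.
    function encodeWav(samples, sampleRate) {
      const buffer = new ArrayBuffer(44 + samples.length * 2);
      const view = new DataView(buffer);
      const writeStr = (off, s) => {
        for (let i = 0; i < s.length; i++) view.setUint8(off + i, s.charCodeAt(i));
      };

      writeStr(0, 'RIFF');
      view.setUint32(4, 36 + samples.length * 2, true); // RIFF chunk size
      writeStr(8, 'WAVE');
      writeStr(12, 'fmt ');
      view.setUint32(16, 16, true);             // fmt chunk size
      view.setUint16(20, 1, true);              // audio format: PCM
      view.setUint16(22, 1, true);              // channels: mono
      view.setUint32(24, sampleRate, true);     // sample rate
      view.setUint32(28, sampleRate * 2, true); // byte rate
      view.setUint16(32, 2, true);              // block align
      view.setUint16(34, 16, true);             // bits per sample
      writeStr(36, 'data');
      view.setUint32(40, samples.length * 2, true);

      // Clamp each float sample and convert to signed 16-bit little-endian.
      for (let i = 0; i < samples.length; i++) {
        const s = Math.max(-1, Math.min(1, samples[i]));
        view.setInt16(44 + i * 2, s < 0 ? s * 0x8000 : s * 0x7FFF, true);
      }
      return buffer; // wrap for upload: new Blob([buffer], { type: 'audio/wav' })
    }
    ```

    The resulting Blob can be uploaded to S3 directly; Transcribe accepts WAV. Note mp3 cannot be produced this way (encoding it needs a real encoder such as a wasm build of LAME).
    
    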

  • Socket.io client in JS and server in Go (Socket.io) doesn't send connected message and data

    24 March 2023, by OmriHalifa

    I am using ffmpeg and socket.io, and I have some issues. I'm trying to send a connection request through React to a server written in Go, but I'm unable to connect to it. I tried adding the events in useEffect and it's still not working. What should I do? I'm attaching my code in JS and in Go:
    main.go


    package main

    import (
        "log"

        "github.com/gin-gonic/gin"

        socketio "github.com/googollee/go-socket.io"
    )

    func main() {
        router := gin.New()

        server := socketio.NewServer(nil)

        server.OnConnect("/", func(s socketio.Conn) error {
            s.SetContext("")
            log.Println("connected:", s.ID())
            return nil
        })

        server.OnEvent("/", "notice", func(s socketio.Conn, msg string) {
            log.Println("notice:", msg)
            s.Emit("reply", "have "+msg)
        })

        server.OnEvent("/", "transcoded-video", func(s socketio.Conn, data string) {
            log.Println("transcoded-video:", data)
        })

        server.OnEvent("/", "bye", func(s socketio.Conn) string {
            last := s.Context().(string)
            s.Emit("bye", last)
            s.Close()
            return last
        })

        server.OnError("/", func(s socketio.Conn, e error) {
            log.Println("meet error:", e)
        })

        server.OnDisconnect("/", func(s socketio.Conn, reason string) {
            log.Println("closed", reason)
        })

        go func() {
            if err := server.Serve(); err != nil {
                log.Fatalf("socketio listen error: %s\n", err)
            }
        }()
        defer server.Close()

        if err := router.Run(":8000"); err != nil {
            log.Fatal("failed run app: ", err)
        }
    }


    App.js


    import './App.css';
    import { useEffect } from 'react';
    import { createFFmpeg, fetchFile } from '@ffmpeg/ffmpeg';
    import { io } from 'socket.io-client';

    function App() {
      const socket = io("http://localhost:8000", function() {
        // Send a message to the server when the client is connected
        socket.emit('clientConnected', 'Client has connected to the server!');
      })

      const ffmpegWorker = createFFmpeg({
        log: true
      })

      // Initialize FFmpeg when the component is mounted
      async function initFFmpeg() {
        await ffmpegWorker.load();
      }

      async function transcode(webcamData) {
        const name = 'record.webm';
        await ffmpegWorker.FS('writeFile', name, await fetchFile(webcamData));
        await ffmpegWorker.run('-i', name, '-preset', 'ultrafast', '-threads', '4', 'output.mp4');
        const data = ffmpegWorker.FS('readFile', 'output.mp4');

        // Set the source of the output video element to the transcoded video data
        const video = document.getElementById('output-video');
        video.src = URL.createObjectURL(new Blob([data.buffer], { type: 'video/mp4' }));

        // Remove the output.mp4 file from the FFmpeg virtual file system
        ffmpegWorker.FS('unlink', 'output.mp4');

        // Emit a "transcoded-video" event to the server with the transcoded video data
        socket.emit("transcoded-video", data.buffer)
      }

      let mediaRecorder;
      let chunks = [];

      // Request access to the user's camera and microphone and start recording
      function requestMedia() {
        const webcam = document.getElementById('webcam');
        navigator.mediaDevices.getUserMedia({ video: true, audio: true })
        .then(async (stream) => {
          webcam.srcObject = stream;
          await webcam.play();

          // Set up a MediaRecorder instance to record the video and audio
          mediaRecorder = new MediaRecorder(stream);

          // Add the recorded data to the chunks array
          mediaRecorder.ondataavailable = async (e) => {
            chunks.push(e.data);
          }

          // Transcode the recorded video data after the MediaRecorder stops
          mediaRecorder.onstop = async () => {
            await transcode(new Uint8Array(await (new Blob(chunks)).arrayBuffer()));

            // Clear the chunks array after transcoding
            chunks = [];

            // Start the MediaRecorder again after a 0 millisecond delay
            setTimeout(() => {
              mediaRecorder.start();

              // Stop the MediaRecorder after 500 milliseconds
              setTimeout(() => {
                mediaRecorder.stop();
              }, 500);
            }, 0);
          }

          // Start the MediaRecorder
          mediaRecorder.start();

          // Stop the MediaRecorder after 700 milliseconds
          setTimeout(() => {
            mediaRecorder.stop();
          }, 700);
        })
      }

      useEffect(() => {
        // Set up event listeners for the socket connection
        socket.on('/', function(){
          // Log a message when the client is connected to the server
          console.log("Connected to server!");
        });

        socket.on('transcoded-video', function(data){
          // Log the received data for debugging purposes
          console.log("Received transcoded video data:", data);
        });

        socket.on('notice', function(data){
          // Emit a "notice" event back to the server to acknowledge the received data
          socket.emit("notice", "ping server!");
        });

        socket.on('bye', function(data){
          // Log the received data and disconnect from the server
          console.log("Server sent:", data);
          socket.disconnect();
        });

        socket.on('disconnect', function(){
          // Log a message when the client is disconnected from the server
          console.log("Disconnected from server!");
        });
      }, [])

      return (
        <div className="App">
          <div>
              <video muted={true}></video>
              <video autoPlay></video>
          </div>
          <button>start streaming</button>
        </div>
      );
    }

    export default App;


    What can I do to fix it? Thank you!!
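    Two things stand out (hedged, since only snippets are shown). First, in main.go the socket.io server is created and served, but it is apparently never mounted on the gin router; go-socket.io's gin examples register it with something like `router.GET("/socket.io/*any", gin.WrapH(server))` plus the matching POST route, and without that the client's handshake has nothing to reach (CORS headers are also needed when the React dev server runs on a different port). Second, on the client, socket.io signals a successful handshake with the built-in 'connect' event; listening on an event named '/' never fires. A minimal sketch of the corrected client-side wiring, factored into a function so it can be attached once (e.g. inside useEffect) to any socket-like object with on/emit:

    ```javascript
    // Hedged sketch: socket.io fires the built-in 'connect' event on a
    // successful handshake -- listening on '/' never fires.
    // 'socket' is any object with on/emit, such as io("http://localhost:8000").
    function registerHandlers(socket) {
      socket.on('connect', () => {
        console.log('Connected to server!');
        // Matches the server's "notice" handler, which answers with "reply".
        socket.emit('notice', 'ping server!');
      });
      socket.on('reply', (msg) => console.log('server replied:', msg));
      socket.on('disconnect', () => console.log('Disconnected from server!'));
    }
    ```

    Note also that the second argument of `io(url, ...)` is expected to be an options object, not a callback; connection feedback belongs in the 'connect' handler above.
    
    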


  • How do I play an HLS stream when the playlist.m3u8 file is constantly being updated?

    3 January 2021, by Adnan Ahmed

    I am using MediaRecorder to record chunks of my live video in webm format from a MediaStream, converting these chunks to .ts files on the server using ffmpeg, and then updating my playlist.m3u8 file with this code:


    function generateM3u8Playlist(fileDataArr, playlistFp, isLive, cb) {
        var durations = fileDataArr.map(function(fd) {
            return fd.duration;
        });
        var maxT = maxOfArr(durations);

        var meta = [
            '#EXTM3U',
            '#EXT-X-VERSION:3',
            '#EXT-X-MEDIA-SEQUENCE:0',
            '#EXT-X-ALLOW-CACHE:YES',
            '#EXT-X-TARGETDURATION:' + Math.ceil(maxT),
        ];

        fileDataArr.forEach(function(fd) {
            meta.push('#EXTINF:' + fd.duration.toFixed(2) + ',');
            meta.push(fd.fileName2);
        });

        if (!isLive) {
            meta.push('#EXT-X-ENDLIST');
        }

        meta.push('');
        meta = meta.join('\n');

        fs.writeFile(playlistFp, meta, cb);
    }


    Here fileDataArr holds information for all the chunks that have been created.


    After that I use this code to create an HLS server:


    var runStreamServer = (function(streamFolder) {
        var executed = false;
        return function(streamFolder) {
            if (!executed) {
                executed = true;
                var HLSServer = require('hls-server')
                var http = require('http')

                var server = http.createServer()
                var hls = new HLSServer(server, {
                    path: '/stream', // Base URI to output HLS streams
                    dir: 'C:\\Users\\Work\\Desktop\\live-stream\\webcam2hls\\videos\\' + streamFolder // Directory that input files are stored
                })
                console.log("We are going to stream from folder:" + streamFolder);
                server.listen(8000);
                console.log('Server Listening on Port 8000');
            }
        };
    })();


    The problem is that if I stop creating new chunks and then open the HLS server link http://localhost:8000/stream/playlist.m3u8, the video plays in VLC; but if I try to play during the recording, it keeps loading the file but does not play. I want it to play while it is creating new chunks and updating playlist.m3u8. The quirk in the generateM3u8Playlist function is that it adds '#EXT-X-ENDLIST' to the playlist file after I have stopped recording. The software is still in production, so the code is a bit messy. Thank you for any answers.
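    For a live stream, the playlist must never contain #EXT-X-ENDLIST while chunks are still arriving, and once older segments are dropped from the sliding window, #EXT-X-MEDIA-SEQUENCE has to advance to the sequence number of the first segment still listed (generateM3u8Playlist above hard-codes it to 0, which can confuse players that poll the playlist). A hedged sketch of a live-compatible builder, returning the playlist text instead of writing it; field names follow the fileDataArr entries above, and mediaSequence is an added parameter the caller must track:

    ```javascript
    // Hedged sketch of a live-friendly playlist builder. 'mediaSequence'
    // must equal the sequence number of the first segment in the window;
    // it increases each time an old segment is dropped from fileDataArr.
    function buildLivePlaylist(fileDataArr, mediaSequence, isLive) {
      var maxT = Math.max.apply(null, fileDataArr.map(function(fd) {
        return fd.duration;
      }));
      var meta = [
        '#EXTM3U',
        '#EXT-X-VERSION:3',
        '#EXT-X-MEDIA-SEQUENCE:' + mediaSequence,
        '#EXT-X-TARGETDURATION:' + Math.ceil(maxT)
      ];
      fileDataArr.forEach(function(fd) {
        meta.push('#EXTINF:' + fd.duration.toFixed(2) + ',');
        meta.push(fd.fileName2);
      });
      if (!isLive) {
        meta.push('#EXT-X-ENDLIST'); // only once the recording has finished
      }
      meta.push('');
      return meta.join('\n');
    }
    ```

    Players such as VLC and hls.js keep polling the playlist as long as #EXT-X-ENDLIST is absent; it also helps to publish the first live playlist only once roughly three segments exist, since many players refuse to start with fewer.
    
    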


    The client side that generates blobs is as follows:


    var mediaConstraints = {
        video: true,
        audio: true
    };
    navigator.getUserMedia(mediaConstraints, onMediaSuccess, onMediaError);
    function onMediaSuccess(stream) {
        console.log('will start capturing and sending ' + (DT / 1000) + 's videos when you press start');
        var mediaRecorder = new MediaStreamRecorder(stream);

        mediaRecorder.mimeType = 'video/webm';

        mediaRecorder.ondataavailable = function(blob) {
            var count2 = zeroPad(count, 5);
            // here count2 just creates a blob number
            console.log('sending chunk ' + name + ' #' + count2 + '...');
            send('/chunk/' + name + '/' + count2 + (stopped ? '/finish' : ''), blob);
            ++count;
        };
    }
    // Here we have the send function which sends our blob to server:
    function send(url, blob) {
        var xhr = new XMLHttpRequest();
        xhr.open('POST', url, true);

        xhr.responseType = 'text/plain';
        xhr.setRequestHeader('Content-Type', 'video/webm');
        //xhr.setRequestHeader("Content-Length", blob.length);

        xhr.onload = function(e) {
            if (this.status === 200) {
                console.log(this.response);
            }
        };
        xhr.send(blob);
    }


    The code that receives the XHR request is as follows:


    var parts = u.split('/');
    var prefix = parts[2];
    var num = parts[3];
    var isFirst = false;
    var isLast = !!parts[4];

    if ((/^0+$/).test(num)) {
        var path = require('path');
        shell.mkdir(path.join(__dirname, 'videos', prefix));
        isFirst = true;
    }

    var fp = 'videos/' + prefix + '/' + num + '.webm';
    var msg = 'got ' + fp;
    console.log(msg);
    console.log('isFirst:%s, isLast:%s', isFirst, isLast);

    var stream = fs.createWriteStream(fp, { encoding: 'binary' });
    /*stream.on('end', function() {
        respond(res, ['text/plain', msg]);
    });*/

    //req.setEncoding('binary');

    req.pipe(stream);
    req.on('end', function() {
        respond(res, ['text/plain', msg]);

        if (!LIVE) { return; }

        var duration = 20;
        var fd = {
            fileName: num + '.webm',
            filePath: fp,
            duration: duration
        };
        var fileDataArr;
        if (isFirst) {
            fileDataArr = [];
            fileDataArrs[prefix] = fileDataArr;
        } else {
            var fileDataArr = fileDataArrs[prefix];
        }
        try {
            fileDataArr.push(fd);
        } catch (err) {
            fileDataArr = [];
            console.log(err.message);
        }
        videoUtils.computeStartTimes(fileDataArr);

        videoUtils.webm2Mpegts(fd, function(err, mpegtsFp) {
            if (err) { return console.error(err); }
            console.log('created %s', mpegtsFp);

            var playlistFp = 'videos/' + prefix + '/playlist.m3u8';

            var fileDataArr2 = (isLast ? fileDataArr : lastN(fileDataArr, PREV_ITEMS_IN_LIVE));

            var action = (isFirst ? 'created' : (isLast ? 'finished' : 'updated'));

            videoUtils.generateM3u8Playlist(fileDataArr2, playlistFp, !isLast, function(err) {
                console.log('playlist %s %s', playlistFp, (err ? err.toString() : action));
            });
        });

        runStreamServer(prefix);
    });
