Advanced search

Media (0)

Word: - Tags - / organisation

No media matching your criteria is available on this site.

Other articles (46)

  • Use, discuss, criticize

    13 April 2011

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

  • MediaSPIP Player: potential problems

    22 February 2011

    The player does not work in Internet Explorer
    On Internet Explorer (versions 7 and 8 at least), the plugin uses the Flash player Flowplayer to play video and audio. If the player does not seem to work, the cause may be the configuration of Apache's mod_deflate module.
    If the configuration of that Apache module contains a line resembling the following, try removing or commenting it out to see whether the player then works correctly: (...)

  • MediaSPIP Player: controls

    26 May 2010

    The player's mouse controls
    Besides clicking the visible buttons of the player interface, other actions can be performed with the mouse: Click: clicking the video, or the audio logo, toggles it between play and pause depending on its current state; Scroll wheel: when the mouse hovers over the area used by the media, the wheel no longer scrolls the page as usual but instead decreases or (...)

On other sites (5311)

  • Capture from multiple streams concurrently, best way to do it and how to reduce CPU usage

    19 June 2019, by DRONE_6969

    I am currently writing an application that captures a number of RTSP streams (12 in my case) and displays them in Qt widgets. The problem arises once I go beyond roughly 6-7 streams: CPU usage spikes and there is visible stutter.

    The reason I don't think the Qt draw function is at fault is that I measured how long it takes to draw an incoming camera image (and sample images I had on hand); it is always well under 33 milliseconds, even with all 12 widgets being updated.
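
    For reference, this is roughly how such a per-paint measurement can be done in Qt. This is a minimal sketch with illustrative names (TimedWidget and its lastImage member are not from the project), using QElapsedTimer:

    #include <QOpenGLWidget>
    #include <QPainter>
    #include <QPaintEvent>
    #include <QPixmap>
    #include <QElapsedTimer>
    #include <QDebug>

    class TimedWidget : public QOpenGLWidget {
    protected:
        void paintEvent(QPaintEvent*) override {
            QElapsedTimer t;
            t.start();                        // high-resolution timer

            QPainter p(this);
            p.drawPixmap(rect(), lastImage);  // the draw under test

            // Compare against the ~33 ms budget of a 30 fps update timer.
            qDebug() << "paintEvent took" << t.nsecsElapsed() / 1e6 << "ms";
        }
    private:
        QPixmap lastImage;                    // filled elsewhere by the capture code
    };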

    I also ran the OpenCV capture loop without drawing at all and got roughly the same CPU consumption as when drawing the frames (at most about 10% lower, with GPU usage dropping to zero).

    IMPORTANT: I am using RTSP streams, which are H.264.

    In case it matters, my specs:

    Intel Core i7-6700 @ 3.40 GHz (8 logical CPUs)
    Memory: 16 GB
    GPU: Intel HD Graphics 530

    (I also ran the code on a computer with a dedicated graphics card; that eliminated some stutter, but CPU usage remained high.)

    I am currently using OpenCV 4.1.0 built with GStreamer enabled; I also have the opencv_world build, and there is no difference in performance.
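
    Since the build has GStreamer enabled, the capture can also be opened through an explicit GStreamer pipeline instead of CAP_FFMPEG, which makes the decoder choice and frame-dropping behaviour explicit. This is a sketch only; the exact element names (avdec_h264, etc.) depend on the GStreamer plugins installed on the system:

    #include <opencv2/opencv.hpp>
    #include <string>

    // Open an RTSP source through GStreamer with a short queue that drops late
    // frames instead of letting them pile up.
    cv::VideoCapture openRtspViaGStreamer(const std::string& url) {
        std::string pipeline =
            "rtspsrc location=" + url + " latency=100 ! "
            "rtph264depay ! h264parse ! avdec_h264 ! "      // swap in a hardware decoder element here if one is available
            "videoconvert ! video/x-raw,format=BGR ! "
            "appsink drop=true max-buffers=1 sync=false";
        return cv::VideoCapture(pipeline, cv::CAP_GSTREAMER);
    }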

    I have created a class called Camera that holds the frame-size constraints and various control functions as well as the stream function. The stream function runs on a separate thread; whenever stream() is done with the current frame, it sends the ready Mat via the onPNewFrame event I created, which converts it to a QPixmap and updates the widget's lastImage variable. This way I can update the image in a more thread-safe way.

    I have tried tweaking various VideoCapture.set() values, but it didn't really help.
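
    For context, such set() calls look like the following; this is purely illustrative, and whether a given property has any effect depends on the capture backend:

    // Illustrative only: CAP_PROP_BUFFERSIZE exists in the VideoCapture API but is
    // honoured by only some backends, so a change like this may simply be ignored.
    ipCam.set(cv::CAP_PROP_BUFFERSIZE, 2);   // keep the internal frame queue short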

    This is my stream function (ignore the bool return value; it does nothing and is a remnant from a few minutes ago when I was trying to use std::async):

    bool Camera::stream() {
       /* This function is meant to run on a separate thread and fill up the buffer independently of
       the main stream thread */
       //cv::setNumThreads(100);
       /* Rules for these slightly changed! */
       Mat pre;  // Grab initial undoctored frame
       //pre = Mat::zeros(size, CV_8UC1);
       Mat frame; // Final modified frame
       frame = Mat::zeros(size, CV_8UC1);
       if (!pre.isContinuous()) pre = pre.clone();

       ipCam.open(streamUrl, CAP_FFMPEG);


       while (ipCam.isOpened() && capture) {
           // If the camera is opened we'll need to capture and process the frame
           try {
               auto start = std::chrono::system_clock::now();

               ipCam >> pre;

               if (pre.empty()) {
                   /* Check for blank frame, return error if there is a blank frame*/
                   cerr << id << ": ERROR! blank frame grabbed\n";
                   for (FrameListener* i : clients) {
                       i->onNotification(1); // Notify clients about this shit
                   }
                   break;
               }

               else {
                   // Only continue if frame not empty

                   if (pre.cols != size.width && pre.rows != size.height) {
                       resize(pre, frame, size);
                       pre.release();
                   }
                   else {
                       frame = pre;
                   }

                   dPacket* pack = new dPacket{id,&frame};
                   for (auto i : clients) {
                       i->onPNewFrame(pack);
                   }
                   frame.release();
                   delete pack;
               }
           }

           catch (int e) {
               cout << endl << "-----Exception during capture process! CODE " << e << endl;
           }
           // End camera manipulations
       }

       cout << "Camera timed out, or connection is closed..." << endl;
       if (tryResetConnection) {
           cout << "Reconnection flag is set, retrying after 3 seconds..." << endl;
           for (FrameListener* i : clients) {
               i->onNotification(-1); // Notify clients about this shit
           }
           this_thread::sleep_for(chrono::milliseconds(3000));
           stream();
       }

       return true;
    }

    This is my onPNewFrame function. The conversion is still done on the camera's thread, because it is called from within stream() and therefore runs in that context (I also checked). A queued-delivery alternative is sketched after the listing:

    void GLWidget::onPNewFrame(dPacket* inPack) {
       lastFlag = 0;

       if (bufferEnabled) {
           buffer.push(QPixmap::fromImage(toQImageFromPMat(inPack->frame)));
       }
       else {
           if (playing) {
               /* Only process if this widget is playing */
               frameProcessing = true;
               lastImage.convertFromImage(toQImageFromPMat(inPack->frame));
               frameProcessing = false;
           }
       }

       if (lastFlag != -1 && !lastImage.isNull()) {
           connecting = false;
       }
       else {
           connecting = true;
       }
    }
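
    One common alternative (sketched here, not the project's current code) is to finish the conversion on the capture thread and then hand the result to the GUI thread through a queued invocation, so all QPixmap and paint work stays on the GUI thread. setLastImage below is a hypothetical setter, not an existing GLWidget method:

    #include <QMetaObject>
    #include <QImage>
    #include <QPixmap>

    void deliverFrame(GLWidget* widget, const QImage& img) {
        // The lambda captures a deep copy of the image; with Qt::QueuedConnection
        // it runs later on the thread that owns 'widget' (the GUI thread).
        QMetaObject::invokeMethod(widget, [widget, image = img.copy()]() {
            widget->setLastImage(QPixmap::fromImage(image));  // hypothetical setter on GLWidget
        }, Qt::QueuedConnection);
    }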

    This is my Mat to QImage :

    QImage GLWidget::toQImageFromPMat(cv::Mat* mat) {
       return QImage(mat->data, mat->cols, mat->rows, QImage::Format_RGB888).rgbSwapped();
    }

    NOTE: skipping the conversion does not reduce CPU usage (at least not significantly).
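
    As an aside, the QImage constructor used above omits the bytesPerLine argument and wraps the Mat's buffer without copying. A sketch of a stride-aware variant (rgbSwapped() already returns a detached copy, so the Mat can be released afterwards):

    #include <opencv2/core.hpp>
    #include <QImage>

    // Pass the Mat's row stride explicitly so non-continuous Mats convert correctly;
    // rgbSwapped() converts BGR to RGB and returns pixels detached from mat's buffer.
    QImage matToQImage(const cv::Mat& mat) {
        QImage view(mat.data, mat.cols, mat.rows,
                    static_cast<int>(mat.step), QImage::Format_RGB888);
        return view.rgbSwapped();
    }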

    Minimal verifiable example

    This program is large, so I am going to paste GLWidget.cpp and GLWidget.h as well as Camera.h and Camera.cpp. You can put GLWidget into anything, as long as you spawn more than 6 of them. Camera relies on CamUtils, but it is possible to just paste a URL straight into the VideoCapture.

    I also supplied CamUtils, just in case

    Camera.h :

    #pragma once
    #include <iostream>
    #include <vector>
    #include <fstream>
    #include <map>
    #include <string>
    #include <sstream>
    #include <algorithm>
    #include "FrameListener.h"
    #include
    #include <thread>
    #include "CamUtils.h"
    #include <ctime>
    #include "dPacket.h"

    using namespace std;
    using namespace cv;

    class Camera
    {

       /*
           CLEANED UP!
           Camera now is only responsible for streaming and echoing captured frames.
           Frames are now wrapped into dPacket struct.
       */


    private:
       string id;
       vector<FrameListener*> clients;
       VideoCapture ipCam;
       string streamUrl;
       Size size;
       bool tryResetConnection = false;

       //TODO: Remove these as they are not going to be used going on:
       bool isPlaying = true;
       bool capture = true;

       //SECRET FEATURES:
       bool detect = false;


    public:
       Camera(string url, int width = 480, int height = 240, bool detect_=false);
       bool stream();
       void setReconnectable(bool newReconStatus);
       void addListener(FrameListener* client);
       vector<bool> getState();    // Returns current state: vector[0] playing state; vector[1] capture/stream-open state. TODO: remove this, as it should no longer control behaviour
       void killStream();
       bool getReconnectable();
    };


    Camera.cpp

    #include "Camera.h"


    Camera::Camera(string url, int width, int height, bool detect_) // Default 240p
    {
       streamUrl = url; // Prepare url
       size = Size(width, height);
       detect = detect_;

    }

    void Camera::addListener(FrameListener* client) {
       clients.push_back(client);
    }


    /*
                   TEST CAMERAS(Paste into cameras.dViewer):
                   {"id":"96a73796-c129-46fc-9c01-40acd8ed7122","ip":"176.57.73.231","password":"null","username":"null"},
                   {"id":"96a73796-c129-46fc-9c01-40acd8ed7122","ip":"176.57.73.231","password":"null","username":"null"},
                   {"id":"96a73796-c129-46fc-9c01-40acd8ed7144","ip":"172.20.101.13","password":"admin","username":"root"}
                   {"id":"96a73796-c129-46fc-9c01-40acd8ed7144","ip":"172.20.101.13","password":"admin","username":"root"}

    */



    bool Camera::stream() {
       /* This function is meant to run on a separate thread and fill up the buffer independently of
       the main stream thread */
       //cv::setNumThreads(100);
       /* Rules for these slightly changed! */
       Mat pre;  // Grab initial undoctored frame
       //pre = Mat::zeros(size, CV_8UC1);
       Mat frame; // Final modified frame
       frame = Mat::zeros(size, CV_8UC1);
       if (!pre.isContinuous()) pre = pre.clone();

       ipCam.open(streamUrl, CAP_FFMPEG);

       while (ipCam.isOpened() && capture) {
           // If the camera is opened we'll need to capture and process the frame
           try {
               auto start = std::chrono::system_clock::now();

               ipCam >> pre;

               if (pre.empty()) {
                   /* Check for blank frame, return error if there is a blank frame*/
                   cerr << id << ": ERROR! blank frame grabbed\n";
                   for (FrameListener* i : clients) {
                       i->onNotification(1); // Notify clients about this shit
                   }
                   break;
               }

               else {
                   // Only continue if frame not empty

                   if (pre.cols != size.width && pre.rows != size.height) {
                       resize(pre, frame, size);
                       pre.release();
                   }
                   else {
                       frame = pre;
                   }

                   auto end = std::chrono::system_clock::now();
                   std::time_t ts = std::chrono::system_clock::to_time_t(end);
                   dPacket* pack = new dPacket{ id, &frame };
                   for (auto i : clients) {
                       i->onPNewFrame(pack);
                   }
                   frame.release();
                   delete pack;
               }
           }

           catch (int e) {
               cout << endl << "-----Exception during capture process! CODE " << e << endl;
           }
           // End camera manipulations
       }

       cout &lt;&lt; "Camera timed out, or connection is closed..." &lt;&lt; endl;
       if (tryResetConnection) {
           cout &lt;&lt; "Reconnection flag is set, retrying after 3 seconds..." &lt;&lt; endl;
           for (FrameListener* i : clients) {
               i->onNotification(-1); // Notify clients about this shit
           }
           this_thread::sleep_for(chrono::milliseconds(3000));
           stream();
       }

       return true;
    }


    void Camera::killStream(){
       tryResetConnection = false;
       capture = false;
       ipCam.release();
    }

    void Camera::setReconnectable(bool reconFlag) {
       tryResetConnection = reconFlag;
    }

    bool Camera::getReconnectable() {
       return tryResetConnection;
    }

    vector<bool> Camera::getState() {
       vector<bool> states;
       states.push_back(isPlaying);
       states.push_back(ipCam.isOpened());
       return states;
    }




    GLWidget.h :

    #ifndef GLWIDGET_H
    #define GLWIDGET_H

     #include <QOpenGLWidget>
     #include <QMouseEvent>
    #include "FrameListener.h"
    #include "Camera.h"
    #include "FrameListener.h"
    #include
    #include "Camera.h"
    #include "CamUtils.h"
    #include
    #include "dPacket.h"
    #include <chrono>
    #include <ctime>
    #include
    #include "FullScreenVideo.h"
     #include <QMovie>
    #include "helper.h"
    #include <iostream>
     #include <QPainter>
     #include <QTimer>

    class Helper;

    class GLWidget : public QOpenGLWidget, public FrameListener
    {
       Q_OBJECT

    public:
       GLWidget(std::string camId, CamUtils *cUtils, int width, int height, bool denyFullScreen_ = false, bool detectFlag_=false, QWidget* parent = nullptr);
       void killStream();
       ~GLWidget();

    public slots:
       void animate();
       void setBufferEnabled(bool setState);
       void setCameraRetryConnection(bool setState);
       void GLUpdate();            // Call to update the widget
       void onRightClickMenu(const QPoint& point);

    protected:
       void paintEvent(QPaintEvent* event) override;
       void onPNewFrame(dPacket* frame);
       void onNotification(int alert_code);


    private:
       // Objects and resourses
       Helper* helper;
       Camera* cam;
       CamUtils* camUtils;
       QTimer* timer; // Keep track of update
       QPixmap lastImage;
       QMovie* connMov;
       QMovie* test;

       QPixmap logo;

       // Control fields
       int width;
       int height;
       int camUtilsAddr;
       int elapsed;
       std::thread* camThread;
       std::string camId;
       bool denyFullScreen = false;
       bool playing = true;
       bool streaming = true;
       bool debug = false;
       bool connecting = true;
       int lastFlag = 0;


       // Debug fields
       std::chrono::high_resolution_clock::time_point lastFrameAt;
       std::chrono::high_resolution_clock::time_point now;
       std::chrono::duration<double> painTime; // time took to draw last frame

       //Buffer stuff
       std::queue<QPixmap> buffer;
       bool bufferEnabled = false;
       bool initialBuffer = false;
       bool buffering = true;
       bool frameProcessing = false;



       //Functions
       QImage toQImageFromPMat(cv::Mat* inFrame);
       void mousePressEvent(QMouseEvent* event) override;
       void drawImageGLLatest(QPainter* painter, QPaintEvent* event, int elapsed);
       void drawOnPaused(QPainter* painter, QPaintEvent* event, int elapsed);
       void drawOnStatus(int statusFlag, QPainter* painter, QPaintEvent* event, int elapsed);
    };

    #endif


    GLWidget.cpp :

    #include "glwidget.h"
    #include <future>


    FullScreenVideo* fullScreen;

    GLWidget::GLWidget(std::string camId_, CamUtils* cUtils, int width_, int height_,  bool denyFullScreen_, bool detectFlag_, QWidget* parent)
       : QOpenGLWidget(parent), helper(helper)
    {
       cout &lt;&lt; "Player for CAMERA " &lt;&lt; camId_ &lt;&lt; endl;

       /* Underlying properties */
       camUtils = cUtils;
       cout &lt;&lt; "GLWidget Incoming CamUtils addr " &lt;&lt; camUtils &lt;&lt; endl;
       cout &lt;&lt; "GLWidget Set CamUtils addr " &lt;&lt; camUtils &lt;&lt; endl;
       camId = camId_;
       elapsed = 0;
       width = width_ + 5;
       height = height_ + 5;
       helper = new Helper();
       setFixedSize(width, height);
       denyFullScreen = denyFullScreen_;

       /* Camera capture thread */
       cam = new Camera(camUtils->getCameraStreamURL(camId), width_, height_, detectFlag_);
       cam->addListener(this);

       /* Sync states */
       vector<bool> initState = cam->getState();
       playing = initState[0];
       streaming = initState[1];
       cout &lt;&lt; "Initial states: " &lt;&lt; playing &lt;&lt; " " &lt;&lt; streaming &lt;&lt; endl;
       camThread = new std::thread(&amp;Camera::stream, cam);
       cout &lt;&lt; "================================================" &lt;&lt; endl;

       // Right click set up
       setContextMenuPolicy(Qt::CustomContextMenu);


       /* Loading gif */
       connMov = new QMovie("establishingConnection.gif");
       connMov->start();
       QString url = R"(RLC-logo.png)";
       logo = QPixmap(url);
       QTimer* timer = new QTimer(this);
       connect(timer, SIGNAL(timeout()), this, SLOT(GLUpdate()));
       timer->start(1000/30);
       playing = true;

    }

    /* SYSTEM */
    void GLWidget::animate()
    {
       elapsed = (elapsed + qobject_cast<QTimer*>(sender())->interval()) % 1000;
       std::cout << elapsed << "\n";
    }


    void GLWidget::GLUpdate() {
       /* Process descisions before update call */
       if (bufferEnabled) {
           /* Process buffer before update */
           now = chrono::high_resolution_clock::now();
           std::chrono::duration<double, std::milli> timeSinceLastUpdate = now - lastFrameAt;
           if (timeSinceLastUpdate.count() > 25) {
               if (buffer.size() > 1 && playing) {
                   lastImage.swap(buffer.front());
                   buffer.pop();
                   lastFrameAt = chrono::high_resolution_clock::now();
               }
           }
           //update(); // Update
       }
       else {
           /* No buffer */
       }
       repaint();
    }


    /* EVENTS */
     void GLWidget::onRightClickMenu(const QPoint& point) {
       cout << "Right click request got" << endl;

       QPoint globPos = this->mapToGlobal(point);
       QMenu myMenu;

       if (!denyFullScreen) {
           myMenu.addAction("Open Full Screen");
       }
       myMenu.addAction("Toggle Debug Info");


       QAction* selected = myMenu.exec(globPos);

       if (selected) {
           string optiontxt = selected->text().toStdString();

           if (optiontxt == "Open Full Screen") {
               cout &lt;&lt; "Chose to open full screen of " &lt;&lt; camId &lt;&lt; endl;
               fullScreen = new FullScreenVideo(bufferEnabled, this);
               fullScreen->setUpView(camUtils, camId);
               fullScreen->show();
               playing = false;
           }

           if (optiontxt == "Toggle Debug Info") {
               cout &lt;&lt; "Chose to toggle debug of " &lt;&lt; camId &lt;&lt; endl;
               debug = !debug;
           }
       }
       else {
           cout &lt;&lt; "Chose nothing!" &lt;&lt; endl;
       }


    }



    void GLWidget::onPNewFrame(dPacket* inPack) {
       lastFlag = 0;

       if (bufferEnabled) {
           buffer.push(QPixmap::fromImage(toQImageFromPMat(inPack->frame)));
       }
       else {
           if (playing) {
               /* Only process if this widget is playing */
               frameProcessing = true;
               lastImage.convertFromImage(toQImageFromPMat(inPack->frame));
               frameProcessing = false;
           }
       }

       if (lastFlag != -1 && !lastImage.isNull()) {
           connecting = false;
       }
       else {
           connecting = true;
       }
    }


    void GLWidget::onNotification(int alert) {
       lastFlag = alert;  
    }


    /* Paint events*/


    void GLWidget::paintEvent(QPaintEvent* event)
    {
       QPainter painter(this);

           if (lastFlag != 0 || connecting) {
               drawOnStatus(lastFlag, &painter, event, elapsed);
           }
           else {

               /* Actual frame drawing */
               if (playing) {
                   if (!frameProcessing) {
                       drawImageGLLatest(&painter, event, elapsed);
                   }
               }
               else {
                   drawOnPaused(&painter, event, elapsed);
               }
           }
       painter.end();

    }


    /* DRAWING STUFF */

    void GLWidget::drawOnStatus(int statusFlag, QPainter* bgPaint, QPaintEvent* event, int elapsed) {

       QString str;
       QFont font("times", 15);
       bgPaint->eraseRect(QRect(0, 0, width, height));
       if (!lastImage.isNull()) {
           bgPaint->drawPixmap(QRect(0, 0, width, height), lastImage);
       }
       /* Test background painting */
       if (connecting) {
           string k = "Connecting to " + camUtils->getIp(camId);
           str.append(k.c_str());
       }
       else {
           switch (statusFlag) {
           case 1:
               str = "Blank frame received...";
               break;

           case -1:
               if (cam->getReconnectable()) {
                   str = "Connection lost, will try to reconnect.";
                   bgPaint->setOpacity(0.3);
               }
               else {
                   str = "Connection lost...";
                   bgPaint->setOpacity(0.3);
               }

               break;
           }
       }

       bgPaint->drawPixmap(QRect(0, 0, width, height), QPixmap::fromImage(connMov->currentImage()));
       bgPaint->setPen(Qt::red);
       bgPaint->setFont(font);
       QFontMetrics fm(font);
       const QRect kek(0, 0, fm.width(str), fm.height());
       QRect bound;
       bgPaint->setOpacity(1);
       bgPaint->drawText(bgPaint->viewport().width()/2 - kek.width()/2, bgPaint->viewport().height()/2 - kek.height(), str);

       bgPaint->drawPixmap(bgPaint->viewport().width() / 2 - logo.width()/2, height - logo.width() - 15, logo);

    }



    void GLWidget::drawOnPaused(QPainter* painter, QPaintEvent* event, int elapsed) {
       painter->eraseRect(0, 0, width, height);
       QFont font = painter->font();
       font.setPointSize(18);
       painter->setPen(Qt::red);
       QFontMetrics fm(font);
       QString str("Paused");
       painter->drawPixmap(QRect(0, 0, width, height),lastImage);
       painter->drawText(QPoint(painter->viewport().width() - fm.width(str), 50), str);

       if (debug) {
           QFont font = painter->font();
           font.setPointSize(25);
           painter->setPen(Qt::red);
           string camMess = "CAMID: " + camId;
           QString mess(camMess.c_str());
           string camIp = "IP: " + camUtils->getIp(camId);
           QString ipMess(camIp.c_str());
           QString bufferSize("Buffer size: " + QString::number(buffer.size()));
           QString lastFrameText("Last frame draw time: " + QString::number(painTime.count()) + "s");
           painter->drawText(QPoint(10, 50), mess);
           painter->drawText(QPoint(10, 60), ipMess);
           QString bufferState;
           if (bufferEnabled) {
               bufferState = QString("Experimental BUFFER is enabled!");
               QString currentBufferSize("Current buffer load: " + QString::number(buffer.size()));
               painter->drawText(QPoint(10, 80), currentBufferSize);
           }
           else {
               bufferState = QString("Experimental BUFFER is disabled!");
           }
           painter->drawText(QPoint(10, 70), bufferState);
           painter->drawText(QPoint(10, height - 25), lastFrameText);
       }
    }


    void GLWidget::drawImageGLLatest(QPainter* painter, QPaintEvent* event, int elapsed) {
       auto start = chrono::high_resolution_clock::now();
       painter->drawPixmap(QRect(0, 0, width, height), lastImage);
       if (debug) {
           QFont font = painter->font();
           font.setPointSize(25);
           painter->setPen(Qt::red);
           string camMess = "CAMID: " + camId;
           QString mess(camMess.c_str());
           string camIp = "IP: " + camUtils->getIp(camId);
           QString ipMess(camIp.c_str());
           QString bufferSize("Buffer size: " + QString::number(buffer.size()));
           QString lastFrameText("Last frame draw time: " + QString::number(painTime.count()) + "s");
           painter->drawText(QPoint(10, 50), mess);
           painter->drawText(QPoint(10, 60), ipMess);
           QString bufferState;
           if(bufferEnabled){
               bufferState = QString("Experimental BUFFER is enabled!");
               QString currentBufferSize("Current buffer load: " + QString::number(buffer.size()));
               painter->drawText(QPoint(10,80), currentBufferSize);
           }
           else {
               bufferState = QString("Experimental BUFFER is disabled!");
               QString currentBufferSize("Current buffer load: " + QString::number(buffer.size()));
               painter->drawText(QPoint(10, 80), currentBufferSize);
           }
           painter->drawText(QPoint(10, 70), bufferState);
           painter->drawText(QPoint(10, height - 25), lastFrameText);

       }
       auto end = chrono::high_resolution_clock::now();
       painTime = end - start;
    }



    /* END DRAWING STUFF */



    /* UI EVENTS */

    void GLWidget::mousePressEvent(QMouseEvent* e) {

       if (e->button() == Qt::LeftButton) {
           if (fullScreen == nullptr || !fullScreen->isVisible()) { // Do not unpause if window is opened
               playing = !playing;
           }
       }

       if (e->button() == Qt::RightButton) {
           onRightClickMenu(e->pos());
       }
    }



    /* Utilities */
    QImage GLWidget::toQImageFromPMat(cv::Mat* mat) {



       return QImage(mat->data, mat->cols, mat->rows, QImage::Format_RGB888).rgbSwapped();



    }

    /* State control */

    void GLWidget::killStream() {
       cam->killStream();
       camThread->join();
    }

    void GLWidget::setBufferEnabled(bool newBufferState) {
       cout &lt;&lt; "Player: " &lt;&lt; camId &lt;&lt; ", buffer state updated: " &lt;&lt; newBufferState &lt;&lt; endl;
       bufferEnabled = newBufferState;
       buffer.empty();
    }

    void GLWidget::setCameraRetryConnection(bool newState) {
       cam->setReconnectable(newState);
    }

    /* Destruction */
    GLWidget::~GLWidget() {
       cam->killStream();
       camThread->join();
    }

    CamUtils.h :

    #pragma once
    #include <iostream>
    #include <vector>
    #include <fstream>
    #include <map>
    #include <string>
    #include <sstream>
    #include <algorithm>
     #include <nlohmann/json.hpp>

    using namespace std;
    using json = nlohmann::json;

    class CamUtils
    {
    private:

       string camDb = "cameras.dViewer";
       map<string, vector<string>> cameraList; // Legacy
       json cameras;
       ofstream dbFile;
       bool dbExists(); // Always hard coded

       /* Old IMPLEMENTATION */
       void writeLineToDb_(const string& content, bool append = false);
       void loadCameras_();

       /* JSON based */
       void loadCameras();

    public:
       CamUtils();
       string generateRandomString(size_t length);
       string getCameraStreamURL(string cameraId) const;
       string saveCamera(string ip, string username, string pass); // Return generated id
       vector<string> listAllCameraIds();
       string getIp(string cameraId);
    };



    CamUtils.cpp :

    #include "CamUtils.h"
    #pragma comment(lib, "rpcrt4.lib")  // UuidCreate - Minimum supported OS Win 2000
    #include
    #include <iostream>

    CamUtils::CamUtils()
    {
       if (!dbExists()) {
           ofstream dbFile;
           dbFile.open(camDb);
           cameras["cameras"] = json::array();
           dbFile << cameras << std::endl;
           dbFile.close();

       }
       else {
           loadCameras();
       }
    }




    vector<string> CamUtils::listAllCameraIds() {
       vector<string> ids;
       cout &lt;&lt; "IN LIST " &lt;&lt; endl;
       for (auto&amp; cam : cameras["cameras"]) {
           ids.push_back(cam["id"].get<string>());
           //cout &lt;&lt; cam["id"].get<string>() &lt;&lt; std::endl;
       }
       return ids;
    }

    string CamUtils::getIp(string id) {
       vector<string> camDetails = cameraList[id];
       string ip = "NO IP WILL DISPLAYED UNTIL I FIGURE OUT A BUG";
       for (auto& cam : cameras["cameras"]) {
           if (id == cam["id"]) {
               ip = cam["ip"].get<string>();
           }
       }

       return ip;
    }

    string CamUtils::getCameraStreamURL(string id) const {
       string url = "err"; // err is the default, it will be overwritten in case id is found, dont forget to check for it

       for (auto& cam : cameras["cameras"]) {
           if (id == cam["id"]) {
               if (cam["username"].get<string>() == "null") {
                   url = "rtsp://" + cam["ip"].get<string>() + ":554/axis-media/media.amp?tcp";
               }
               else {
                   url = "rtsp://" + cam["username"].get<string>() + ":" + cam["password"].get<string>() + "@" + cam["ip"].get<string>() + ":554/axis-media/media.amp?streamprofile=720_30";
               }
           }
       }

       return url;  // Dont forget to check for err when using this shit
    }


    string CamUtils::saveCamera(string ip, string username, string password) {
       UUID uid;
       UuidCreate(&uid);
       char* str;
       UuidToStringA(&uid, (RPC_CSTR*)&str);
       string id = str;
       cout << "GEN: " << id << endl;
       json cam = json({}); //Create emtpy object
       cam["id"] = id;
       cam["ip"] = ip;
       cam["username"] = username;
       cam["password"] = password;
       cameras["cameras"].push_back(cam);
       std::ofstream out(camDb);
       out << cameras << std::endl;
       cout << cameras["cameras"] << endl;

       cout << "Saved camera as " << id << endl;
       return id;
    }


    bool CamUtils::dbExists() {
       ifstream dbFile(camDb);
       return (bool)dbFile;
    }





    void CamUtils::loadCameras() {
       cout &lt;&lt; "Load call" &lt;&lt; endl;
       ifstream dbFile(camDb);
       string line;
       string wholeFile;

       while (std::getline(dbFile, line)) {
           cout << line << endl;
           wholeFile += line;
       }
       try {
           cameras = json::parse(wholeFile);
           //cout << cameras["cameras"] << endl;

       }
       catch (exception e) {
           cout << e.what() << endl;
       }
       dbFile.close();
    }










    /*
       LEGACY CODE, TO BE REMOVED!

    */



    void CamUtils::loadCameras_() {
       /*
           LEGACY CODE:
           This used to be the way to load cameras, but I moved on to JSON based configuration so this is no longer needed and will be removed soon
       */

       ifstream dbFile(camDb);
       string line;
       while (std::getline(dbFile, line)) {
           /*
               This function load camera data to the map:
               The order MUST be the following: 0:ID, 1:IP, 2:USERNAME, 3:PASSWORD.
               Always delimited with | no spaces between!
           */
           if (!line.empty()) {
               stringstream ss(line);
               string item;
               vector<string> splitString;

               while (std::getline(ss, item, '|')) {
                   splitString.push_back(item);
               }
               if (splitString.size() > 0) {
                   /* Dont even parse if the program didnt split right*/
                   //cout &lt;&lt; "Split string: " &lt;&lt; splitString.size() &lt;&lt; "\n";
                   for (int i = 0; i &lt; (splitString.size()); i++) cameraList[splitString[0]].push_back(splitString[i]);
               }
           }
       }
    }



     void CamUtils::writeLineToDb_(const string& content, bool append) {
       ofstream dbFile;
       cout &lt;&lt; "Creating?";
       if (append) {
           dbFile.open(camDb, ios_base::app);
       }
       else {
           dbFile.open(camDb);
       }

       dbFile << content.c_str() << "\r\n";
       dbFile.flush();
    }

    /* JSON Reworx */




    string CamUtils::generateRandomString(size_t length)
    {
       const char* charmap = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
       const size_t charmapLength = strlen(charmap);
       auto generator = [&]() { return charmap[rand() % charmapLength]; };
       string result;
       result.reserve(length);
       generate_n(back_inserter(result), length, generator);
       return result;
    }

    End of example

    How would I go about decreasing CPU usage when dealing with a large number of streams?

  • Why does my ffmpeg audio sound slower and deeper - sample rate mismatch

    4 September 2020, by yogesh zinzu

    OK, so this is a Discord bot to record voice chat: https://hatebin.com/hgjlazacri. The bot works perfectly fine, but the issue is that the audio sounds a bit deeper and slower than normal. Why does that happen, and how can I make the audio sound 1:1?


    const Discord = require(&#x27;discord.js&#x27;);&#xA;const client = new Discord.Client();&#xA;const ffmpegInstaller = require(&#x27;@ffmpeg-installer/ffmpeg&#x27;);&#xA;const ffmpeg = require(&#x27;fluent-ffmpeg&#x27;);&#xA;ffmpeg.setFfmpegPath(ffmpegInstaller.path);&#xA;const fs = require(&#x27;fs-extra&#x27;)&#xA;const mergeStream = require(&#x27;merge-stream&#x27;);&#xA;const config = require(&#x27;./config.json&#x27;);&#xA;const { getAudioDurationInSeconds } = require(&#x27;get-audio-duration&#x27;);&#xA;const cp = require(&#x27;child_process&#x27;);&#xA;const path1 = require(&#x27;path&#x27;);&#xA;const Enmap = require(&#x27;enmap&#x27;);&#xA;const UserRecords = require("./models/userrecords.js")&#xA;const ServerRecords = require("./models/serverrecords.js")&#xA;let prefix = `$`&#xA;class Readable extends require(&#x27;stream&#x27;).Readable { _read() {} }&#xA;let recording = false;&#xA;let currently_recording = {};&#xA;let mp3Paths = [];&#xA;const silence_buffer = new Uint8Array(3840);&#xA;const express = require(&#x27;express&#x27;)&#xA;const app = express()&#xA;const port = 3000&#xA;const publicIP = require(&#x27;public-ip&#x27;)&#xA;const { program } = require(&#x27;commander&#x27;);&#xA;const { path } = require(&#x27;@ffmpeg-installer/ffmpeg&#x27;);&#xA;const version = &#x27;0.0.1&#x27;&#xA;program.version(version);&#xA;let debug = false&#xA;let runProd = false&#xA;let fqdn = "";&#xA;const mongoose = require("mongoose");&#xA;const MongoClient = require(&#x27;mongodb&#x27;).MongoClient;&#xA;mongoose.connect(&#x27;SECRRET&#x27;,{&#xA;  useNewUrlParser: true&#xA;}, function(err){&#xA;  if(err){&#xA;    console.log(err);&#xA;  }else{&#xA;    console.log("Database connection initiated");&#xA;  }&#xA;});&#xA;require("dotenv").config()&#xA;function bufferToStream(buffer) {&#xA;    let stream = new Readable();&#xA;    stream.push(buffer);&#xA;    return stream;&#xA;}&#xA;&#xA;&#xA;&#xA;&#xA;&#xA;client.commands = new Enmap();&#xA;&#xA;client.on(&#x27;ready&#x27;, async () => {&#xA;    console.log(`Logged in as ${client.user.tag}`);&#xA;&#xA;    let host = "localhost"&#xA;&#xA;    &#xA;&#xA;    let ip = await publicIP.v4();&#xA;&#xA;    let protocol = "http";&#xA;    if (!runProd) {&#xA;        host = "localhost"&#xA;    } else {&#xA;        host = `35.226.244.186`;&#xA;    }&#xA;    fqdn = `${protocol}://${host}:${port}`&#xA;    app.listen(port, `0.0.0.0`, () => {&#xA;        console.log(`Listening on port ${port} for ${host} at fqdn ${fqdn}`)&#xA;    })&#xA;});&#xA;let randomArr = []&#xA;let finalArrWithIds = []&#xA;let variable = 0&#xA;client.on(&#x27;message&#x27;, async message => {&#xA;    console.log(`fuck`);&#xA;    if(message.content === `$record`){&#xA;        mp3Paths = []&#xA;        finalArrWithIds = []&#xA;        let membersToScrape = Array.from(message.member.voice.channel.members.values());&#xA;        membersToScrape.forEach((member) => {&#xA;            if(member.id === `749250882830598235`) {&#xA;                console.log(`botid`);&#xA;            }&#xA;            else {&#xA;                finalArrWithIds.push(member.id)&#xA;            }&#xA;            &#xA;        })&#xA;        const randomNumber = Math.floor(Math.random() * 100)&#xA;        randomArr = []&#xA;        randomArr.push(randomNumber)&#xA;    }&#xA;   &#xA;    &#xA;    const generateSilentData = async (silentStream, memberID) => {&#xA;        console.log(`recordingnow`)&#xA;        while(recording) {&#xA;            if (!currently_recording[memberID]) {&#xA;                
silentStream.push(silence_buffer);&#xA;            }&#xA;            await new Promise(r => setTimeout(r, 20));&#xA;        }&#xA;        return "done";&#xA;    }&#xA;    console.log(generateSilentData, `status`)&#xA;    function generateOutputFile(channelID, memberID) {&#xA;        const dir = `./recordings/${channelID}/${memberID}`;&#xA;        fs.ensureDirSync(dir);&#xA;        const fileName = `${dir}/${randomArr[0]}.aac`;&#xA;        console.log(`${fileName} ---------------------------`);&#xA;        return fs.createWriteStream(fileName);&#xA;    }&#xA;    &#xA;    if (!fs.existsSync("public")) {&#xA;        fs.mkdirSync("public");&#xA;    }&#xA;    app.use("/public", express.static("./public"));&#xA;  if (!message.guild) return;&#xA;&#xA;  if (message.content === config.prefix &#x2B; config.record_command) {&#xA;    if (recording) {&#xA;        message.reply("bot is already recording");&#xA;        return&#xA;    }&#xA;    if (message.member.voice.channel) {&#xA;        recording = true;&#xA;        const connection = await message.member.voice.channel.join();&#xA;        const dispatcher = connection.play(&#x27;./audio.mp3&#x27;);&#xA;&#xA;        connection.on(&#x27;speaking&#x27;, (user, speaking) => {&#xA;            if (speaking.has(&#x27;SPEAKING&#x27;)) {&#xA;                currently_recording[user.id] = true;&#xA;            } else {&#xA;                currently_recording[user.id] = false;&#xA;            }&#xA;        })&#xA;&#xA;&#xA;        let members = Array.from(message.member.voice.channel.members.values());&#xA;        members.forEach((member) => {&#xA;&#xA;            if (member.id != client.user.id) {&#xA;                let memberStream = connection.receiver.createStream(member, {mode : &#x27;pcm&#x27;, end : &#x27;manual&#x27;})&#xA;&#xA;                let outputFile = generateOutputFile(message.member.voice.channel.id, member.id);&#xA;                console.log(outputFile, `outputfile here`);&#xA;                mp3Paths.push(outputFile.path);&#xA;                    &#xA;&#xA;                silence_stream = bufferToStream(new Uint8Array(0));&#xA;                generateSilentData(silence_stream, member.id).then(data => console.log(data));&#xA;                let combinedStream = mergeStream(silence_stream, memberStream);&#xA;&#xA;                ffmpeg(combinedStream)&#xA;                    .inputFormat(&#x27;s32le&#x27;)&#xA;                    .audioFrequency(44100)&#xA;                    .audioChannels(2)&#xA;                    .on(&#x27;error&#x27;, (error) => {console.log(error)})&#xA;                    .audioCodec(&#x27;aac&#x27;)&#xA;                    .format(&#x27;adts&#x27;) &#xA;                    .pipe(outputFile)&#xA;                    &#xA;            }&#xA;        })&#xA;    } else {&#xA;      message.reply(&#x27;You need to join a voice channel first!&#x27;);&#xA;    }&#xA;  }&#xA;&#xA;  if (message.content === config.prefix &#x2B; config.stop_command) {&#xA;&#xA;    let date = new Date();&#xA;    let dd = String(date.getDate()).padStart(2, &#x27;0&#x27;);&#xA;    let mm = String(date.getMonth() &#x2B; 1).padStart(2, &#x27;0&#x27;); &#xA;    let yyyy = date.getFullYear();&#xA;    date = mm &#x2B; &#x27;/&#x27; &#x2B; dd &#x2B; &#x27;/&#x27; &#x2B; yyyy;&#xA;&#xA;&#xA;&#xA;&#xA;&#xA;    let currentVoiceChannel = message.member.voice.channel;&#xA;    if (currentVoiceChannel) {&#xA;        recording = false;&#xA;        await currentVoiceChannel.leave();&#xA;&#xA;        let mergedOutputFolder = &#x27;./recordings/&#x27; &#x2B; 
message.member.voice.channel.id &#x2B; `/${randomArr[0]}/`;&#xA;        fs.ensureDirSync(mergedOutputFolder);&#xA;        let file_name = `${randomArr[0]}` &#x2B; &#x27;.aac&#x27;;&#xA;        let mergedOutputFile = mergedOutputFolder &#x2B; file_name;&#xA;    &#xA;        &#xA;    let download_path = message.member.voice.channel.id &#x2B; `/${randomArr[0]}/` &#x2B; file_name;&#xA;&#xA;        let mixedOutput = new ffmpeg();&#xA;        console.log(mp3Paths, `mp3pathshere`);&#xA;        mp3Paths.forEach((mp3Path) => {&#xA;             mixedOutput.addInput(mp3Path);&#xA;            &#xA;        })&#xA;        console.log(mp3Paths);&#xA;        //mixedOutput.complexFilter(&#x27;amix=inputs=2:duration=longest&#x27;);&#xA;        mixedOutput.complexFilter(&#x27;amix=inputs=&#x27; &#x2B; mp3Paths.length &#x2B; &#x27;:duration=longest&#x27;);&#xA;        &#xA;        let processEmbed = new Discord.MessageEmbed().setTitle(`Audio Processing.`)&#xA;        processEmbed.addField(`Audio processing starting now..`, `Processing Audio`)&#xA;        processEmbed.setThumbnail(`https://media.discordapp.net/attachments/730811581046325348/748610998985818202/speaker.png`)&#xA;        processEmbed.setColor(` #00FFFF`)&#xA;        const processEmbedMsg = await message.channel.send(processEmbed)&#xA;        async function saveMp3(mixedData, outputMixed) {&#xA;            console.log(`${mixedData} MIXED `)&#xA;            &#xA;            &#xA;            &#xA;            return new Promise((resolve, reject) => {&#xA;                mixedData.on(&#x27;error&#x27;, reject).on(&#x27;progress&#x27;,&#xA;                async (progress) => {&#xA;                    &#xA;                    let processEmbedEdit = new Discord.MessageEmbed().setTitle(`Audio Processing.`)&#xA;                    processEmbedEdit.addField(`Processing: ${progress.targetSize} KB converted`, `Processing Audio`)&#xA;                    processEmbedEdit.setThumbnail(`https://media.discordapp.net/attachments/730811581046325348/748610998985818202/speaker.png`)&#xA;                    processEmbedEdit.setColor(` #00FFFF`)&#xA;                    processEmbedMsg.edit(processEmbedEdit)&#xA;                    console.log(&#x27;Processing: &#x27; &#x2B; progress.targetSize &#x2B; &#x27; KB converted&#x27;);&#xA;                }).on(&#x27;end&#x27;, () => {&#xA;                    console.log(&#x27;Processing finished !&#x27;);&#xA;                    resolve()&#xA;                }).saveToFile(outputMixed);&#xA;                console.log(`${outputMixed} IT IS HERE`);&#xA;            })&#xA;        }&#xA;        // mixedOutput.saveToFile(mergedOutputFile);&#xA;        await saveMp3(mixedOutput, mergedOutputFile);&#xA;        console.log(`${mixedOutput} IN HEREEEEEEE`);&#xA;        // We saved the recording, now copy the recording&#xA;        if (!fs.existsSync(`./public`)) {&#xA;            fs.mkdirSync(`./public`);&#xA;        }&#xA;        let sourceFile = `${__dirname}/recordings/${download_path}`&#xA;        console.log(`DOWNLOAD PATH HERE ${download_path}`)&#xA;        const guildName = message.guild.id;&#xA;        const serveExist = `/public/${guildName}`&#xA;        if (!fs.existsSync(`.${serveExist}`)) {&#xA;            fs.mkdirSync(`.${serveExist}`)&#xA;        }&#xA;        let destionationFile = `${__dirname}${serveExist}/${file_name}`&#xA;&#xA;        let errorThrown = false&#xA;        try {&#xA;            fs.copySync(sourceFile, destionationFile);&#xA;        } catch (err) {&#xA;            errorThrown = true&#xA;            await 
message.channel.send(`Error: ${err.message}`)&#xA;        }&#xA;        const usersWithTag = finalArrWithIds.map(user => `\n &lt;@${user}>`);&#xA;        let timeSpent = await getAudioDurationInSeconds(`public/${guildName}/${file_name}`)&#xA;        let timesSpentRound = Math.floor(timeSpent)&#xA;        let finalTimeSpent = timesSpentRound / 60&#xA;        let finalTimeForReal = Math.floor(finalTimeSpent)&#xA;        if(!errorThrown){&#xA;            //--------------------- server recording save START&#xA;            class GeneralRecords {&#xA;                constructor(generalLink, date, voice, time) {&#xA;                  this.generalLink = generalLink;&#xA;                  this.date = date;&#xA;                  this.note = `no note`;&#xA;                  this.voice = voice;&#xA;                  this.time = time&#xA;                }&#xA;              }&#xA;              let newGeneralRecordClassObject = new GeneralRecords(`${fqdn}/public/${guildName}/${file_name}`, date, usersWithTag, finalTimeForReal)&#xA;              let checkingServerRecord = await ServerRecords.exists({userid: `server`})&#xA;              if(checkingServerRecord === true){&#xA;                  existingServerRecord = await ServerRecords.findOne({userid: `server`})&#xA;                  existingServerRecord.content.push(newGeneralRecordClassObject)&#xA;                  await existingServerRecord.save()&#xA;              }&#xA;              if(checkingServerRecord === false){&#xA;                let serverRecord = new ServerRecords()&#xA;                serverRecord.userid = `server`&#xA;                serverRecord.content.push(newGeneralRecordClassObject)&#xA;                await serverRecord.save()&#xA;              }&#xA;              //--------------------- server recording save STOP&#xA;        }&#xA;        &#xA;        //--------------------- personal recording section START&#xA;        for( member of finalArrWithIds) {&#xA;&#xA;        let personal_download_path = message.member.voice.channel.id &#x2B; `/${member}/` &#x2B; file_name;&#xA;        let sourceFilePersonal = `${__dirname}/recordings/${personal_download_path}`&#xA;        let destionationFilePersonal = `${__dirname}${serveExist}/${member}/${file_name}`&#xA;        await fs.copySync(sourceFilePersonal, destionationFilePersonal);&#xA;        const user = client.users.cache.get(member);&#xA;        console.log(user, `user here`);&#xA;        try {&#xA;            ffmpeg.setFfmpegPath(ffmpegInstaller.path);&#xA;          &#xA;            ffmpeg(`public/${guildName}/${member}/${file_name}`)&#xA;             .audioFilters(&#x27;silenceremove=stop_periods=-1:stop_duration=1:stop_threshold=-90dB&#x27;)&#xA;             .output(`public/${guildName}/${member}/personal-${file_name}`)&#xA;             .on(`end`, function () {&#xA;               console.log(`DONE`);&#xA;             })&#xA;             .on(`error`, function (error) {&#xA;               console.log(`An error occured` &#x2B; error.message)&#xA;             })&#xA;             .run();&#xA;             &#xA;          }&#xA;          catch (error) {&#xA;          console.log(error)&#xA;          }&#xA;        &#xA;&#xA;        // ----------------- SAVING PERSONAL RECORDING TO DATABASE START&#xA;        class PersonalRecords {&#xA;            constructor(generalLink, personalLink, date, time) {&#xA;              this.generalLink = generalLink;&#xA;              this.personalLink = personalLink;&#xA;              this.date = date;&#xA;              this.note = `no note`;&#xA;              
this.time = time;&#xA;            }&#xA;          }&#xA;          let timeSpentPersonal = await getAudioDurationInSeconds(`public/${guildName}/${file_name}`)&#xA;          let timesSpentRoundPersonal = Math.floor(timeSpentPersonal)&#xA;          let finalTimeSpentPersonal = timesSpentRoundPersonal / 60&#xA;          let finalTimeForRealPersonal = Math.floor(finalTimeSpentPersonal)&#xA;          let newPersonalRecordClassObject = new PersonalRecords(`${fqdn}/public/${guildName}/${file_name}`, `${fqdn}/public/${guildName}/${member}/personal-${file_name}`, date, finalTimeForRealPersonal)&#xA;&#xA;           let checkingUserRecord = await UserRecords.exists({userid: member})&#xA;              if(checkingUserRecord === true){&#xA;                  existingUserRecord = await UserRecords.findOne({userid: member})&#xA;                  existingUserRecord.content.push(newPersonalRecordClassObject)&#xA;                  await existingUserRecord.save()&#xA;              }&#xA;              if(checkingUserRecord === false){&#xA;                let newRecord = new UserRecords()&#xA;                newRecord.userid = member&#xA;                newRecord.content.push(newPersonalRecordClassObject)&#xA;                await newRecord.save()&#xA;              }&#xA;&#xA;&#xA;       &#xA;        // ----------------- SAVING PERSONAL RECORDING TO DATABASE END&#xA;       &#xA;&#xA;        const endPersonalEmbed = new Discord.MessageEmbed().setTitle(`Your performance was amazing ! Review it here :D`)&#xA;        endPersonalEmbed.setColor(&#x27;#9400D3&#x27;)&#xA;        endPersonalEmbed.setThumbnail(`https://media.discordapp.net/attachments/730811581046325348/745381641324724294/vinyl.png`)&#xA;        endPersonalEmbed.addField(`1
  • Discord 24/7 video stream self-bot crashes after a couple hours

    21 July 2023, by angelo

    I've implemented this library to make a self-bot that streams videos from a local folder in a loop 24/7 (don't ask me why). I set up an Ubuntu VPS to run the bot and it works perfectly fine for the first 2-3 hours; after that it gets more and more laggy until the server crashes.
    PS: It's basically my first time using JavaScript and I stole most of the code from this repo, so don't bully me.


    Here's the code :


    import { Client, TextChannel, CustomStatus, ActivityOptions } from "discord.js-selfbot-v13";&#xA;import { command, streamLivestreamVideo, VoiceUdp, setStreamOpts, streamOpts } from "@dank074/discord-video-stream";&#xA;import config from "./config.json";&#xA;import fs from &#x27;fs&#x27;;&#xA;import path from &#x27;path&#x27;;&#xA;&#xA;const client = new Client();&#xA;&#xA;client.patchVoiceEvents(); //this is necessary to register event handlers&#xA;&#xA;setStreamOpts(&#xA;    config.streamOpts.width,&#xA;    config.streamOpts.height,&#xA;    config.streamOpts.fps,&#xA;    config.streamOpts.bitrateKbps,&#xA;    config.streamOpts.hardware_acc&#xA;)&#xA;&#xA;const prefix = &#x27;$&#x27;;&#xA;&#xA;const moviesFolder = config.movieFolder || &#x27;./movies&#x27;;&#xA;&#xA;const movieFiles = fs.readdirSync(moviesFolder);&#xA;let movies = movieFiles.map(file => {&#xA;    const fileName = path.parse(file).name;&#xA;    // replace space with _&#xA;    return { name: fileName.replace(/ /g, &#x27;&#x27;), path: path.join(moviesFolder, file) };&#xA;});&#xA;let originalMovList = [...movies];&#xA;let movList = movies;&#xA;let shouldStop = false;&#xA;&#xA;// print out all movies&#xA;console.log(`Available movies:\n${movies.map(m => m.name).join(&#x27;\n&#x27;)}`);&#xA;&#xA;const status_idle = () =>  {&#xA;    return new CustomStatus()&#xA;        .setState(&#x27;摸鱼进行中&#x27;)&#xA;        .setEmoji(&#x27;&#128031;&#x27;)&#xA;}&#xA;&#xA;const status_watch = (name) => {&#xA;    return new CustomStatus()&#xA;        .setState(`Playing ${name}...`)&#xA;        .setEmoji(&#x27;&#128253;&#x27;)&#xA;}&#xA;&#xA;// ready event&#xA;client.on("ready", () => {&#xA;    if (client.user) {&#xA;        console.log(`--- ${client.user.tag} is ready ---`);&#xA;        client.user.setActivity(status_idle() as ActivityOptions)&#xA;    }&#xA;});&#xA;&#xA;let streamStatus = {&#xA;    joined: false,&#xA;    joinsucc: false,&#xA;    playing: false,&#xA;    channelInfo: {&#xA;        guildId: &#x27;&#x27;,&#xA;        channelId: &#x27;&#x27;,&#xA;        cmdChannelId: &#x27;&#x27;&#xA;    },&#xA;    starttime: "00:00:00",&#xA;    timemark: &#x27;&#x27;,&#xA;}&#xA;&#xA;client.on(&#x27;voiceStateUpdate&#x27;, (oldState, newState) => {&#xA;    // when exit channel&#xA;    if (oldState.member?.user.id == client.user?.id) {&#xA;        if (oldState.channelId &amp;&amp; !newState.channelId) {&#xA;            streamStatus.joined = false;&#xA;            streamStatus.joinsucc = false;&#xA;            streamStatus.playing = false;&#xA;            streamStatus.channelInfo = {&#xA;                guildId: &#x27;&#x27;,&#xA;                channelId: &#x27;&#x27;,&#xA;                cmdChannelId: streamStatus.channelInfo.cmdChannelId&#xA;            }&#xA;            client.user?.setActivity(status_idle() as ActivityOptions)&#xA;        }&#xA;    }&#xA;    // when join channel success&#xA;    if (newState.member?.user.id == client.user?.id) {&#xA;        if (newState.channelId &amp;&amp; !oldState.channelId) {&#xA;            streamStatus.joined = true;&#xA;            if (newState.guild.id == streamStatus.channelInfo.guildId &amp;&amp; newState.channelId == streamStatus.channelInfo.channelId) {&#xA;                streamStatus.joinsucc = true;&#xA;            }&#xA;        }&#xA;    }&#xA;})&#xA;&#xA;client.on(&#x27;messageCreate&#x27;, async (message) => {&#xA;    if (message.author.bot) return; // ignore bots&#xA;    if (message.author.id == client.user?.id) return; // ignore self&#xA;    if 
(!config.commandChannels.includes(message.channel.id)) return; // ignore non-command channels
    if (!message.content.startsWith(prefix)) return; // ignore non-commands

    const args = message.content.slice(prefix.length).trim().split(/ +/); // split command and arguments
    if (args.length == 0) return;

    const user_cmd = args.shift()!.toLowerCase();

    if (config.commandChannels.includes(message.channel.id)) {
        switch (user_cmd) {
            case 'play':
                playCommand(args, message);
                break;
            case 'stop':
                stopCommand(message);
                break;
            case 'playtime':
                playtimeCommand(message);
                break;
            case 'pause':
                pauseCommand(message);
                break;
            case 'resume':
                resumeCommand(message);
                break;
            case 'list':
                listCommand(message);
                break;
            case 'status':
                statusCommand(message);
                break;
            case 'refresh':
                refreshCommand(message);
                break;
            case 'help':
                helpCommand(message);
                break;
            case 'playall':
                playAllCommand(args, message);
                break;
            case 'stream':
                streamCommand(args, message);
                break;
            case 'shuffle':
                shuffleCommand();
                break;
            case 'skip':
                // skip command not implemented yet
                break;
            default:
                message.reply('Invalid command');
        }
    }
});

client.login("TOKEN_HERE");

let lastPrint = "";

async function playAllCommand(args, message) {
    if (streamStatus.joined) {
        message.reply("Already joined");
        return;
    }

    // args = [guildId]/[channelId]
    if (args.length === 0) {
        message.reply("Missing voice channel");
        return;
    }

    // process args
    const [guildId, channelId] = args.shift()!.split("/");
    if (!guildId || !channelId) {
        message.reply("Invalid voice channel");
        return;
    }

    await client.joinVoice(guildId, channelId);
    streamStatus.joined = true;
    streamStatus.playing = false;
    streamStatus.starttime = "00:00:00";
    streamStatus.channelInfo = {
        guildId: guildId,
        channelId: channelId,
        cmdChannelId: message.channel.id,
    };

    const streamUdpConn = await client.createStream();

    streamUdpConn.voiceConnection.setSpeaking(true);
    streamUdpConn.voiceConnection.setVideoStatus(true);

    await playAllVideos(streamUdpConn); // play the whole list; resolves when the loop ends or is stopped

    // reset the status once playback is over (playAllVideos already does this; kept as a safety net)
    streamStatus.joined = false;
    streamStatus.joinsucc = false;
    streamStatus.playing = false;
    lastPrint = "";
    streamStatus.channelInfo = {
        guildId: "",
        channelId: "",
        cmdChannelId: "",
    };
}

async function playAllVideos(udpConn: VoiceUdp) {

    console.log("Started playing video");

    udpConn.voiceConnection.setSpeaking(true);
    udpConn.voiceConnection.setVideoStatus(true);

    shouldStop = false; // clear the flag left over from a previous stop command

    try {
        let index = 0;

        while (true) {
            if (shouldStop) {
                break; // for the stop command
            }

            if (index >= movList.length) {
                // reset the loop
                index = 0;
            }

            const movie = movList[index];

            if (!movie) {
                console.log("Movie not found");
                index++;
                continue;
            }

            let options = {};
            options["-ss"] = "00:00:00";

            console.log(`Playing ${movie.name}...`);

            try {
                let videoStream = streamLivestreamVideo(movie.path, udpConn);
                command?.on('progress', (msg) => {
                    // print the timemark if at least 10 seconds have passed since the last print
                    // (careful when the counter wraps past 0)
                    if (streamStatus.timemark) {
                        if (lastPrint != "") {
                            let last = lastPrint.split(':');
                            let now = msg.timemark.split(':');
                            // convert to seconds
                            let s = parseInt(now[2]) + parseInt(now[1]) * 60 + parseInt(now[0]) * 3600;
                            let l = parseInt(last[2]) + parseInt(last[1]) * 60 + parseInt(last[0]) * 3600;
                            if (s - l >= 10) {
                                console.log(`Timemark: ${msg.timemark}`);
                                lastPrint = msg.timemark;
                            }
                        } else {
                            console.log(`Timemark: ${msg.timemark}`);
                            lastPrint = msg.timemark;
                        }
                    }
                    streamStatus.timemark = msg.timemark;
                });
                const res = await videoStream;
                console.log("Finished playing video " + res);
            } catch (e) {
                console.log(e);
            }

            index++; // move on to the next movie
        }
    } finally {
        udpConn.voiceConnection.setSpeaking(false);
        udpConn.voiceConnection.setVideoStatus(false);
    }

    command?.kill("SIGINT");
    // send a message to the channel, not a reply
    (client.channels.cache.get(streamStatus.channelInfo.cmdChannelId) as TextChannel).send('Finished playing video, timemark is ' + streamStatus.timemark);
    client.leaveVoice();
    client.user?.setActivity(status_idle() as ActivityOptions);
    streamStatus.joined = false;
    streamStatus.joinsucc = false;
    streamStatus.playing = false;
    lastPrint = "";
    streamStatus.channelInfo = {
        guildId: '',
        channelId: '',
        cmdChannelId: ''
    };
}

function shuffleArray(array) {
    for (let i = array.length - 1; i > 0; i--) {
        const j = Math.floor(Math.random() * (i + 1));
        [array[i], array[j]] = [array[j], array[i]];
    }
}

function shuffleCommand() {
    shuffleArray(movList);
}

async function playCommand(args, message) {
    if (streamStatus.joined) {
        message.reply('Already joined');
        return;
    }

    // args = [guildId]/[channelId]
    if (args.length == 0) {
        message.reply('Missing voice channel');
        return;
    }

    // process args
    const [guildId, channelId] = args.shift()!.split('/');
    if (!guildId || !channelId) {
        message.reply('Invalid voice channel');
        return;
    }

    // get the movie name and find the movie file
    let moviename = args.shift();
    let movie = movies.find(m => m.name == moviename);

    if (!movie) {
        message.reply('Movie not found');
        return;
    }

    // get the start time from args as "hh:mm:ss"
    let startTime = args.shift();
    let options = {};
    // check if the start time is valid
    if (startTime) {
        let time = startTime.split(':');
        if (time.length != 3) {
            message.reply('Invalid start time');
            return;
        }
        let h = parseInt(time[0]);
        let m = parseInt(time[1]);
        let s = parseInt(time[2]);
        if (isNaN(h) || isNaN(m) || isNaN(s)) {
            message.reply('Invalid start time');
            return;
        }
        startTime = `${h}:${m}:${s}`;
        options['-ss'] = startTime;
        console.log("Start time: " + startTime);
    }

    await client.joinVoice(guildId, channelId);
    streamStatus.joined = true;
    streamStatus.playing = false;
    streamStatus.starttime = startTime ? startTime : "00:00:00";
    streamStatus.channelInfo = {
        guildId: guildId,
        channelId: channelId,
        cmdChannelId: message.channel.id
    };
    const streamUdpConn = await client.createStream();
    playVideo(movie.path, streamUdpConn, options);
    message.reply('Playing ' + (startTime ? ` from ${startTime} ` : '') + moviename + '...');
    client.user?.setActivity(status_watch(moviename) as ActivityOptions);
}

function stopCommand(message) {
    client.leaveVoice();
    streamStatus.joined = false;
    streamStatus.joinsucc = false;
    streamStatus.playing = false;
    streamStatus.channelInfo = {
        guildId: '',
        channelId: '',
        cmdChannelId: streamStatus.channelInfo.cmdChannelId
    };
    // use SIGQUIT instead?
    command?.kill("SIGINT");
    // msg
    message.reply('Stopped playing');
    shouldStop = true;
    movList = [...originalMovList];
}

function playtimeCommand(message) {
    // streamStatus.starttime + streamStatus.timemark
    // starttime is hh:mm:ss, timemark is hh:mm:ss.000
    let start = streamStatus.starttime.split(':');
    let mark = streamStatus.timemark.split(':');
    let h = parseInt(start[0]) + parseInt(mark[0]);
    let m = parseInt(start[1]) + parseInt(mark[1]);
    let s = parseInt(start[2]) + parseInt(mark[2]);
    if (s >= 60) {
        m += 1;
        s -= 60;
    }
    if (m >= 60) {
        h += 1;
        m -= 60;
    }
    message.reply(`Play time: ${h}:${m}:${s}`);
}

function pauseCommand(message) {
    if (streamStatus.playing) {
        command?.kill("SIGSTOP");
        message.reply('Paused');
        streamStatus.playing = false;
    } else {
        message.reply('Not playing');
    }
}

function resumeCommand(message) {
    if (!streamStatus.playing) {
        command?.kill("SIGCONT");
        message.reply('Resumed');
        streamStatus.playing = true;
    } else {
        message.reply('Already playing');
    }
}

function listCommand(message) {
    message.reply(`Available movies:\n${movies.map(m => m.name).join('\n')}`);
}

function statusCommand(message) {
    message.reply(`Joined: ${streamStatus.joined}\nJoin success: ${streamStatus.joinsucc}\nPlaying: ${streamStatus.playing}\nChannel: ${streamStatus.channelInfo.guildId}/${streamStatus.channelInfo.channelId}\nTimemark: ${streamStatus.timemark}\nStart time: ${streamStatus.starttime}`);
}

function refreshCommand(message) {
    // refresh the movie list
    const movieFiles = fs.readdirSync(moviesFolder);
    movies = movieFiles.map(file => {
        const fileName = path.parse(file).name;
        // strip spaces so the name can be passed as a single command argument
        return { name: fileName.replace(/ /g, ''), path: path.join(moviesFolder, file) };
    });
    message.reply('Movie list refreshed, ' + movies.length + ' movies found.\n' + movies.map(m => m.name).join('\n'));
}

function helpCommand(message) {
    // reply with all available commands
    message.reply('Available commands:\nplay [guildId]/[channelId] [movie] [start time]\nplayall [guildId]/[channelId]\nstream [guildId]/[channelId] [url]\nshuffle\nstop\nlist\nstatus\nrefresh\nplaytime\npause\nresume\nhelp');
}

async function playVideo(video: string, udpConn: VoiceUdp, options: any) {
    console.log("Started playing video");

    udpConn.voiceConnection.setSpeaking(true);
    udpConn.voiceConnection.setVideoStatus(true);
    try {
        let videoStream = streamLivestreamVideo(video, udpConn);
        command?.on('progress', (msg) => {
            // print the timemark if at least 10 seconds have passed since the last print
            // (careful when the counter wraps past 0)
            if (streamStatus.timemark) {
                if (lastPrint != "") {
                    let last = lastPrint.split(':');
                    let now = msg.timemark.split(':');
                    // convert to seconds
                    let s = parseInt(now[2]) + parseInt(now[1]) * 60 + parseInt(now[0]) * 3600;
                    let l = parseInt(last[2]) + parseInt(last[1]) * 60 + parseInt(last[0]) * 3600;
                    if (s - l >= 10) {
                        console.log(`Timemark: ${msg.timemark}`);
                        lastPrint = msg.timemark;
                    }
                } else {
                    console.log(`Timemark: ${msg.timemark}`);
                    lastPrint = msg.timemark;
                }
            }
            streamStatus.timemark = msg.timemark;
        });
        const res = await videoStream;
        console.log("Finished playing video " + res);
    } catch (e) {
        console.log(e);
    } finally {
        udpConn.voiceConnection.setSpeaking(false);
        udpConn.voiceConnection.setVideoStatus(false);
    }
    command?.kill("SIGINT");
    // send a message to the channel, not a reply
    (client.channels.cache.get(streamStatus.channelInfo.cmdChannelId) as TextChannel).send('Finished playing video, timemark is ' + streamStatus.timemark);
    client.leaveVoice();
    client.user?.setActivity(status_idle() as ActivityOptions);
    streamStatus.joined = false;
    streamStatus.joinsucc = false;
    streamStatus.playing = false;
    lastPrint = "";
    streamStatus.channelInfo = {
        guildId: '',
        channelId: '',
        cmdChannelId: ''
    };
}

async function streamCommand(args, message) {

    if (streamStatus.joined) {
        message.reply('Already joined');
        return;
    }

    // args = [guildId]/[channelId]
    if (args.length == 0) {
        message.reply('Missing voice channel');
        return;
    }

    // process args
    const [guildId, channelId] = args.shift()!.split('/');
    if (!guildId || !channelId) {
        message.reply('Invalid voice channel');
        return;
    }

    let url = args.shift();
    let options = {};

    await client.joinVoice(guildId, channelId);
    streamStatus.joined = true;
    streamStatus.playing = false;
    //streamStatus.starttime = startTime ? startTime : "00:00:00";
    streamStatus.channelInfo = {
        guildId: guildId,
        channelId: channelId,
        cmdChannelId: message.channel.id
    };
    const streamUdpConn = await client.createStream();
    playStream(url, streamUdpConn, options);
    message.reply('Playing url');
    client.user?.setActivity(status_watch('livestream') as ActivityOptions);
}

async function playStream(video: string, udpConn: VoiceUdp, options: any) {
    console.log("Started playing video");

    udpConn.voiceConnection.setSpeaking(true);
    udpConn.voiceConnection.setVideoStatus(true);

    try {
        console.log("Trying to stream url");
        const res = await streamLivestreamVideo(video, udpConn);
        console.log("Finished streaming url");
    } catch (e) {
        console.log(e);
    } finally {
        udpConn.voiceConnection.setSpeaking(false);
        udpConn.voiceConnection.setVideoStatus(false);
    }

    command?.kill("SIGINT");
    client.leaveVoice();
    client.user?.setActivity(status_idle() as ActivityOptions);
    streamStatus.joined = false;
    streamStatus.joinsucc = false;
    streamStatus.playing = false;
    streamStatus.channelInfo = {
        guildId: '',
        channelId: '',
        cmdChannelId: ''
    };

}

// run the HTTP server if enabled in config
if (config.server.enabled) {
    // run server.js
    require('./server');
}



    I've tried running the code with the nocache package, setting up a cron job to clear the cache every 5 minutes, and unifying functions in the code, but nothing works.
I think the problem has to do with some process that never really ends after one video finishes playing, probably ffmpeg. I don't know whether the problem is my code, my VPS, or the library.
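    If the slowdown really does come from ffmpeg processes that outlive each video, one way to check is to look at the process table between videos and escalate the kill signal when SIGINT is not enough. The snippet below is only a sketch under assumptions not in the original code: it assumes a Linux host with pgrep/pkill available, and the helper name ensureFfmpegGone and its grace period are made up for illustration. It would be called right after the existing command?.kill("SIGINT") calls.

    import { exec } from "child_process";

    // Hypothetical helper: if any ffmpeg process is still alive once the grace period
    // has passed, it gets SIGKILL, so stuck encoders cannot pile up between videos.
    function ensureFfmpegGone(gracePeriodMs = 5000) {
        setTimeout(() => {
            // list surviving ffmpeg processes (Linux); prints nothing if there are none
            exec("pgrep -x ffmpeg", (_err, stdout) => {
                const pids = stdout.trim().split("\n").filter(Boolean);
                if (pids.length > 0) {
                    console.warn(`ffmpeg still running (pids: ${pids.join(", ")}), sending SIGKILL`);
                    exec("pkill -9 -x ffmpeg");
                }
            });
        }, gracePeriodMs);
    }

    Running pgrep -x ffmpeg by hand between videos would also tell you quickly whether leftover encoders are the cause.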


    I want the bot to stay in the voice channel streaming my videos 24/7 with no interruptions, but I don't know how to prevent it from getting laggy after a while.
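    For a stream that is supposed to run 24/7, it can also help to log a couple of health metrics over time instead of guessing. The sketch below is again an assumption (Linux, pgrep available, an arbitrary 5-minute interval); it only observes and fixes nothing, but it makes a slow build-up of memory or leftover ffmpeg processes visible in the logs.

    import { exec } from "child_process";

    // Hypothetical watchdog: log the bot's resident memory and the number of ffmpeg
    // processes every 5 minutes, to see whether anything accumulates over a long session.
    setInterval(() => {
        const rssMb = Math.round(process.memoryUsage().rss / 1024 / 1024);
        exec("pgrep -cx ffmpeg", (_err, stdout) => {
            const ffmpegCount = parseInt(stdout.trim()) || 0;
            console.log(`[watchdog] rss=${rssMb}MB ffmpeg processes=${ffmpegCount}`);
        });
    }, 5 * 60 * 1000);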


    This is the config.json file, in case you want to test the code and don't have one.


    {
        "token": "DCTOKEN",
        "videoChannels": ["ID", "OTHERID"],
        "commandChannels": ["ID", "OTHERID"],
        "adminIds": ["ID"],
        "movieFolder": "./movies/",
        "previewCache": "/tmp/preview-cache",
        "streamOpts": {
            "width": 1280,
            "height": 720,
            "fps": 30,
            "bitrateKbps": 3000,
            "hardware_acc": true
        },
        "server": {
            "enabled": false,
            "port": 8080
        }
    }
