
Other articles (46)
-
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP's development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
-
MediaSPIP Player: potential problems
22 February 2011
The player does not work in Internet Explorer
On Internet Explorer (8 and 7 at least), the plugin uses the Flash player Flowplayer to play video and audio. If the player does not seem to work, it may be due to the configuration of Apache's mod_deflate module.
If the configuration of that Apache module contains a line that looks like the following, try removing it or commenting it out to see whether the player then works correctly: (...)
-
MediaSPIP Player: controls
26 May 2010
The player's mouse controls
In addition to clicking the visible buttons of the player interface, other actions can be performed with the mouse: Click: clicking on the video, or on the audio logo, starts or pauses playback depending on its current state; Wheel (scrolling): when the mouse is over the area used by the media (hover), the mouse wheel no longer scrolls the page but instead decreases or (...)
On other sites (5311)
-
Capture from multiple streams concurrently, best way to do it and how to reduce CPU usage
19 June 2019, by DRONE_6969
I am currently writing an application that captures a number of RTSP streams (12 in my case) and displays them in Qt widgets. The problem arises when I go beyond around 6-7 streams: the CPU usage spikes and there is visible stutter.
The reason I think it is not the Qt draw function is that I measured how long it takes to draw an incoming camera image (and some sample images I had): it is always well under 33 milliseconds, even with 12 widgets being updated.
I also ran the OpenCV capture method without drawing and got pretty much the same CPU consumption as when I was drawing the frames (about 10% less CPU at most, and GPU usage went to zero).
IMPORTANT: I am using RTSP streams, which are H.264.
MY SPECS, IF IT MATTERS:
Intel Core i7-6700 @ 3.40 GHz (8 CPUs)
Memory: 16 GB
GPU: Intel HD Graphics 530 (I also ran my code on a computer with a dedicated graphics card; it did eliminate some stutter, but CPU usage is still pretty high)
I am currently using OpenCV 4.1.0 with GStreamer enabled and built; I also have the OpenCV-world build, and there is no difference in performance.
I have created a class called Camera that holds the frame size constraints and various control functions, as well as the stream function. The stream function runs on a separate thread; whenever stream() is done with the current frame, it sends the ready Mat via an onNewFrame event I created, which converts it to a QPixmap and updates the widget's lastImage variable. This way I can update the image in a more thread-safe way.
I have tried to manipulate those VideoCapture.set() values, but it didn’t really help.
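For illustration, the VideoCapture.set() calls referred to above would look something like the sketch below. These are standard OpenCV properties, but several of them are backend-dependent and may simply be ignored by CAP_FFMPEG for RTSP sources, which would be consistent with them not really helping:
#include <opencv2/opencv.hpp>

// Illustration only: typical VideoCapture::set() tuning attempts for an RTSP source.
// Many of these properties are backend-dependent and may be ignored by CAP_FFMPEG.
void tuneCapture(cv::VideoCapture& cap) {
    cap.set(cv::CAP_PROP_BUFFERSIZE, 2);     // shrink the internal frame queue (backend-dependent)
    cap.set(cv::CAP_PROP_FRAME_WIDTH, 480);  // request a smaller frame (often ignored for RTSP)
    cap.set(cv::CAP_PROP_FRAME_HEIGHT, 240);
}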
This is my stream function (ignore the bool return, it doesn't do anything; it is a remnant from a couple of minutes ago when I was trying to use std::async):
bool Camera::stream() {
/* This function is meant to run on a separate thread and fill up the buffer independantly of
main stream thread */
//cv::setNumThreads(100);
/* Rules for these slightly changed! */
Mat pre; // Grab initial undoctored frame
//pre = Mat::zeros(size, CV_8UC1);
Mat frame; // Final modified frame
frame = Mat::zeros(size, CV_8UC1);
if (!pre.isContinuous()) pre = pre.clone();
ipCam.open(streamUrl, CAP_FFMPEG);
while (ipCam.isOpened() && capture) {
// If camera is opened wel need to capture and process the frame
try {
auto start = std::chrono::system_clock::now();
ipCam >> pre;
if (pre.empty()) {
/* Check for blank frame, return error if there is a blank frame*/
cerr << id << ": ERROR! blank frame grabbed\n";
for (FrameListener* i : clients) {
i->onNotification(1); // Notify clients about this shit
}
break;
}
else {
// Only continue if frame not empty
if (pre.cols != size.width && pre.rows != size.height) {
resize(pre, frame, size);
pre.release();
}
else {
frame = pre;
}
dPacket* pack = new dPacket{id,&frame};
for (auto i : clients) {
i->onPNewFrame(pack);
}
frame.release();
delete pack;
}
}
catch (int e) {
cout << endl << "-----Exception during capture process! CODE " << e << endl;
}
// End camera manipulations
}
cout << "Camera timed out, or connection is closed..." << endl;
if (tryResetConnection) {
cout << "Reconnection flag is set, retrying after 3 seconds..." << endl;
for (FrameListener* i : clients) {
i->onNotification(-1); // Notify clients about this shit
}
this_thread::sleep_for(chrono::milliseconds(3000));
stream();
}
return true;
}
This is my onPNewFrame function. The conversion is still being done on the camera's thread, because it is called from within stream() and is therefore within that scope (I also checked):
void GLWidget::onPNewFrame(dPacket* inPack) {
lastFlag = 0;
if (bufferEnabled) {
buffer.push(QPixmap::fromImage(toQImageFromPMat(inPack->frame)));
}
else {
if (playing) {
/* Only process if this widget is playing */
frameProcessing = true;
lastImage.convertFromImage(toQImageFromPMat(inPack->frame));
frameProcessing = false;
}
}
if (lastFlag != -1 && !lastImage.isNull()) {
connecting = false;
}
else {
connecting = true;
}
}
This is my Mat to QImage:
QImage GLWidget::toQImageFromPMat(cv::Mat* mat) {
return QImage(mat->data, mat->cols, mat->rows, QImage::Format_RGB888).rgbSwapped();
}
NOTE: not converting does not result in a CPU boost (at least not a significant one).
Minimal verifiable example
This program is large. I am going to paste GLWidget.cpp and GLWidget.h, as well as Camera.h and Camera.cpp. You can put GLWidget into anything, as long as you spawn more than 6 of them. Camera relies on CamUtils, but it is possible to just paste the URL into VideoCapture.
I also supplied CamUtils, just in case
Camera.h:
#pragma once
#include <iostream>
#include <vector>
#include <fstream>
#include <map>
#include <string>
#include <sstream>
#include <algorithm>
#include "FrameListener.h"
#include
#include <thread>
#include "CamUtils.h"
#include <ctime>
#include "dPacket.h"
using namespace std;
using namespace cv;
class Camera
{
/*
CLEANED UP!
Camera now is only responsible for streaming and echoing captured frames.
Frames are now wrapped into dPacket struct.
*/
private:
string id;
vector<FrameListener*> clients;
VideoCapture ipCam;
string streamUrl;
Size size;
bool tryResetConnection = false;
//TODO: Remove these as they are not going to be used going on:
bool isPlaying = true;
bool capture = true;
//SECRET FEATURES:
bool detect = false;
public:
Camera(string url, int width = 480, int height = 240, bool detect_=false);
bool stream();
void setReconnectable(bool newReconStatus);
void addListener(FrameListener* client);
vector<bool> getState(); // Returns current state: [0] playing state; [1] stream open state. TODO: remove this, as it should no longer control behaviour
void killStream();
bool getReconnectable();
};
Camera.cpp:
#include "Camera.h"
Camera::Camera(string url, int width, int height, bool detect_) // Default 240p
{
streamUrl = url; // Prepare url
size = Size(width, height);
detect = detect_;
}
void Camera::addListener(FrameListener* client) {
clients.push_back(client);
}
/*
TEST CAMERAS(Paste into cameras.dViewer):
{"id":"96a73796-c129-46fc-9c01-40acd8ed7122","ip":"176.57.73.231","password":"null","username":"null"},
{"id":"96a73796-c129-46fc-9c01-40acd8ed7122","ip":"176.57.73.231","password":"null","username":"null"},
{"id":"96a73796-c129-46fc-9c01-40acd8ed7144","ip":"172.20.101.13","password":"admin","username":"root"}
{"id":"96a73796-c129-46fc-9c01-40acd8ed7144","ip":"172.20.101.13","password":"admin","username":"root"}
*/
bool Camera::stream() {
/* This function is meant to run on a separate thread and fill up the buffer independantly of
main stream thread */
//cv::setNumThreads(100);
/* Rules for these slightly changed! */
Mat pre; // Grab initial undoctored frame
//pre = Mat::zeros(size, CV_8UC1);
Mat frame; // Final modified frame
frame = Mat::zeros(size, CV_8UC1);
if (!pre.isContinuous()) pre = pre.clone();
ipCam.open(streamUrl, CAP_FFMPEG);
while (ipCam.isOpened() && capture) {
// If camera is opened wel need to capture and process the frame
try {
auto start = std::chrono::system_clock::now();
ipCam >> pre;
if (pre.empty()) {
/* Check for blank frame, return error if there is a blank frame*/
cerr << id << ": ERROR! blank frame grabbed\n";
for (FrameListener* i : clients) {
i->onNotification(1); // Notify clients about this shit
}
break;
}
else {
// Only continue if frame not empty
if (pre.cols != size.width && pre.rows != size.height) {
resize(pre, frame, size);
pre.release();
}
else {
frame = pre;
}
auto end = std::chrono::system_clock::now();
std::time_t ts = std::chrono::system_clock::to_time_t(end);
dPacket* pack = new dPacket{ id,&frame};
for (auto i : clients) {
i->onPNewFrame(pack);
}
frame.release();
delete pack;
}
}
catch (int e) {
cout << endl << "-----Exception during capture process! CODE " << e << endl;
}
// End camera manipulations
}
cout << "Camera timed out, or connection is closed..." << endl;
if (tryResetConnection) {
cout << "Reconnection flag is set, retrying after 3 seconds..." << endl;
for (FrameListener* i : clients) {
i->onNotification(-1); // Notify clients about this shit
}
this_thread::sleep_for(chrono::milliseconds(3000));
stream();
}
return true;
}
void Camera::killStream(){
tryResetConnection = false;
capture = false;
ipCam.release();
}
void Camera::setReconnectable(bool reconFlag) {
tryResetConnection = reconFlag;
}
bool Camera::getReconnectable() {
return tryResetConnection;
}
vector<bool> Camera::getState() {
vector<bool> states;
states.push_back(isPlaying);
states.push_back(ipCam.isOpened());
return states;
}
GLWidget.h:
#ifndef GLWIDGET_H
#define GLWIDGET_H
#include <QOpenGLWidget>
#include <QMouseEvent>
#include "FrameListener.h"
#include "Camera.h"
#include "FrameListener.h"
#include
#include "Camera.h"
#include "CamUtils.h"
#include
#include "dPacket.h"
#include <chrono>
#include <ctime>
#include
#include "FullScreenVideo.h"
#include <QMovie>
#include "helper.h"
#include <iostream>
#include <QPainter>
#include <QTimer>
class Helper;
class GLWidget : public QOpenGLWidget, public FrameListener
{
Q_OBJECT
public:
GLWidget(std::string camId, CamUtils *cUtils, int width, int height, bool denyFullScreen_ = false, bool detectFlag_=false, QWidget* parent = nullptr);
void killStream();
~GLWidget();
public slots:
void animate();
void setBufferEnabled(bool setState);
void setCameraRetryConnection(bool setState);
void GLUpdate(); // Call to update the widget
void onRightClickMenu(const QPoint& point);
protected:
void paintEvent(QPaintEvent* event) override;
void onPNewFrame(dPacket* frame);
void onNotification(int alert_code);
private:
// Objects and resourses
Helper* helper;
Camera* cam;
CamUtils* camUtils;
QTimer* timer; // Keep track of update
QPixmap lastImage;
QMovie* connMov;
QMovie* test;
QPixmap logo;
// Control fields
int width;
int height;
int camUtilsAddr;
int elapsed;
std::thread* camThread;
std::string camId;
bool denyFullScreen = false;
bool playing = true;
bool streaming = true;
bool debug = false;
bool connecting = true;
int lastFlag = 0;
// Debug fields
std::chrono::high_resolution_clock::time_point lastFrameAt;
std::chrono::high_resolution_clock::time_point now;
std::chrono::duration<double> painTime; // time took to draw last frame
//Buffer stuff
std::queue<QPixmap> buffer;
bool bufferEnabled = false;
bool initialBuffer = false;
bool buffering = true;
bool frameProcessing = false;
//Functions
QImage toQImageFromPMat(cv::Mat* inFrame);
void mousePressEvent(QMouseEvent* event) override;
void drawImageGLLatest(QPainter* painter, QPaintEvent* event, int elapsed);
void drawOnPaused(QPainter* painter, QPaintEvent* event, int elapsed);
void drawOnStatus(int statusFlag, QPainter* painter, QPaintEvent* event, int elapsed);
};
#endif
GLWidget.cpp:
#include "glwidget.h"
#include <future>
FullScreenVideo* fullScreen;
GLWidget::GLWidget(std::string camId_, CamUtils* cUtils, int width_, int height_, bool denyFullScreen_, bool detectFlag_, QWidget* parent)
: QOpenGLWidget(parent), helper(helper)
{
cout << "Player for CAMERA " << camId_ << endl;
/* Underlying properties */
camUtils = cUtils;
cout << "GLWidget Incoming CamUtils addr " << camUtils << endl;
cout << "GLWidget Set CamUtils addr " << camUtils << endl;
camId = camId_;
elapsed = 0;
width = width_ + 5;
height = height_ + 5;
helper = new Helper();
setFixedSize(width, height);
denyFullScreen = denyFullScreen_;
/* Camera capture thread */
cam = new Camera(camUtils->getCameraStreamURL(camId), width_, height_, detectFlag_);
cam->addListener(this);
/* Sync states */
vector<bool> initState = cam->getState();
playing = initState[0];
streaming = initState[1];
cout << "Initial states: " << playing << " " << streaming << endl;
camThread = new std::thread(&Camera::stream, cam);
cout << "================================================" << endl;
// Right click set up
setContextMenuPolicy(Qt::CustomContextMenu);
/* Loading gif */
connMov = new QMovie("establishingConnection.gif");
connMov->start();
QString url = R"(RLC-logo.png)";
logo = QPixmap(url);
QTimer* timer = new QTimer(this);
connect(timer, SIGNAL(timeout()), this, SLOT(GLUpdate()));
timer->start(1000/30);
playing = true;
}
/* SYSTEM */
void GLWidget::animate()
{
elapsed = (elapsed + qobject_cast<QTimer*>(sender())->interval()) % 1000;
std::cout << elapsed << "\n";
}
void GLWidget::GLUpdate() {
/* Process descisions before update call */
if (bufferEnabled) {
/* Process buffer before update */
now = chrono::high_resolution_clock::now();
std::chrono::duration<double, std::milli> timeSinceLastUpdate = now - lastFrameAt;
if (timeSinceLastUpdate.count() > 25) {
if (buffer.size() > 1 && playing) {
lastImage.swap(buffer.front());
buffer.pop();
lastFrameAt = chrono::high_resolution_clock::now();
}
}
//update(); // Update
}
else {
/* No buffer */
}
repaint();
}
/* EVENTS */
void GLWidget::onRightClickMenu(const QPoint& point) {
cout << "Right click request got" << endl;
QPoint globPos = this->mapToGlobal(point);
QMenu myMenu;
if (!denyFullScreen) {
myMenu.addAction("Open Full Screen");
}
myMenu.addAction("Toggle Debug Info");
QAction* selected = myMenu.exec(globPos);
if (selected) {
string optiontxt = selected->text().toStdString();
if (optiontxt == "Open Full Screen") {
cout << "Chose to open full screen of " << camId << endl;
fullScreen = new FullScreenVideo(bufferEnabled, this);
fullScreen->setUpView(camUtils, camId);
fullScreen->show();
playing = false;
}
if (optiontxt == "Toggle Debug Info") {
cout << "Chose to toggle debug of " << camId << endl;
debug = !debug;
}
}
else {
cout << "Chose nothing!" << endl;
}
}
void GLWidget::onPNewFrame(dPacket* inPack) {
lastFlag = 0;
if (bufferEnabled) {
buffer.push(QPixmap::fromImage(toQImageFromPMat(inPack->frame)));
}
else {
if (playing) {
/* Only process if this widget is playing */
frameProcessing = true;
lastImage.convertFromImage(toQImageFromPMat(inPack->frame));
frameProcessing = false;
}
}
if (lastFlag != -1 && !lastImage.isNull()) {
connecting = false;
}
else {
connecting = true;
}
}
void GLWidget::onNotification(int alert) {
lastFlag = alert;
}
/* Paint events*/
void GLWidget::paintEvent(QPaintEvent* event)
{
QPainter painter(this);
if (lastFlag != 0 || connecting) {
drawOnStatus(lastFlag, &painter, event, elapsed);
}
else {
/* Actual frame drawing */
if (playing) {
if (!frameProcessing) {
drawImageGLLatest(&painter, event, elapsed);
}
}
else {
drawOnPaused(&painter, event, elapsed);
}
}
painter.end();
}
/* DRAWING STUFF */
void GLWidget::drawOnStatus(int statusFlag, QPainter* bgPaint, QPaintEvent* event, int elapsed) {
QString str;
QFont font("times", 15);
bgPaint->eraseRect(QRect(0, 0, width, height));
if (!lastImage.isNull()) {
bgPaint->drawPixmap(QRect(0, 0, width, height), lastImage);
}
/* Test background painting */
if (connecting) {
string k = "Connecting to " + camUtils->getIp(camId);
str.append(k.c_str());
}
else {
switch (statusFlag) {
case 1:
str = "Blank frame received...";
break;
case -1:
if (cam->getReconnectable()) {
str = "Connection lost, will try to reconnect.";
bgPaint->setOpacity(0.3);
}
else {
str = "Connection lost...";
bgPaint->setOpacity(0.3);
}
break;
}
}
bgPaint->drawPixmap(QRect(0, 0, width, height), QPixmap::fromImage(connMov->currentImage()));
bgPaint->setPen(Qt::red);
bgPaint->setFont(font);
QFontMetrics fm(font);
const QRect kek(0, 0, fm.width(str), fm.height());
QRect bound;
bgPaint->setOpacity(1);
bgPaint->drawText(bgPaint->viewport().width()/2 - kek.width()/2, bgPaint->viewport().height()/2 - kek.height(), str);
bgPaint->drawPixmap(bgPaint->viewport().width() / 2 - logo.width()/2, height - logo.width() - 15, logo);
}
void GLWidget::drawOnPaused(QPainter* painter, QPaintEvent* event, int elapsed) {
painter->eraseRect(0, 0, width, height);
QFont font = painter->font();
font.setPointSize(18);
painter->setPen(Qt::red);
QFontMetrics fm(font);
QString str("Paused");
painter->drawPixmap(QRect(0, 0, width, height),lastImage);
painter->drawText(QPoint(painter->viewport().width() - fm.width(str), 50), str);
if (debug) {
QFont font = painter->font();
font.setPointSize(25);
painter->setPen(Qt::red);
string camMess = "CAMID: " + camId;
QString mess(camMess.c_str());
string camIp = "IP: " + camUtils->getIp(camId);
QString ipMess(camIp.c_str());
QString bufferSize("Buffer size: " + QString::number(buffer.size()));
QString lastFrameText("Last frame draw time: " + QString::number(painTime.count()) + "s");
painter->drawText(QPoint(10, 50), mess);
painter->drawText(QPoint(10, 60), ipMess);
QString bufferState;
if (bufferEnabled) {
bufferState = QString("Experimental BUFFER is enabled!");
QString currentBufferSize("Current buffer load: " + QString::number(buffer.size()));
painter->drawText(QPoint(10, 80), currentBufferSize);
}
else {
bufferState = QString("Experimental BUFFER is disabled!");
}
painter->drawText(QPoint(10, 70), bufferState);
painter->drawText(QPoint(10, height - 25), lastFrameText);
}
}
void GLWidget::drawImageGLLatest(QPainter* painter, QPaintEvent* event, int elapsed) {
auto start = chrono::high_resolution_clock::now();
painter->drawPixmap(QRect(0, 0, width, height), lastImage);
if (debug) {
QFont font = painter->font();
font.setPointSize(25);
painter->setPen(Qt::red);
string camMess = "CAMID: " + camId;
QString mess(camMess.c_str());
string camIp = "IP: " + camUtils->getIp(camId);
QString ipMess(camIp.c_str());
QString bufferSize("Buffer size: " + QString::number(buffer.size()));
QString lastFrameText("Last frame draw time: " + QString::number(painTime.count()) + "s");
painter->drawText(QPoint(10, 50), mess);
painter->drawText(QPoint(10, 60), ipMess);
QString bufferState;
if(bufferEnabled){
bufferState = QString("Experimental BUFFER is enabled!");
QString currentBufferSize("Current buffer load: " + QString::number(buffer.size()));
painter->drawText(QPoint(10,80), currentBufferSize);
}
else {
bufferState = QString("Experimental BUFFER is disabled!");
QString currentBufferSize("Current buffer load: " + QString::number(buffer.size()));
painter->drawText(QPoint(10, 80), currentBufferSize);
}
painter->drawText(QPoint(10, 70), bufferState);
painter->drawText(QPoint(10, height - 25), lastFrameText);
}
auto end = chrono::high_resolution_clock::now();
painTime = end - start;
}
/* END DRAWING STUFF */
/* UI EVENTS */
void GLWidget::mousePressEvent(QMouseEvent* e) {
if (e->button() == Qt::LeftButton) {
if (fullScreen == nullptr || !fullScreen->isVisible()) { // Do not unpause if window is opened
playing = !playing;
}
}
if (e->button() == Qt::RightButton) {
onRightClickMenu(e->pos());
}
}
/* Utilities */
QImage GLWidget::toQImageFromPMat(cv::Mat* mat) {
return QImage(mat->data, mat->cols, mat->rows, QImage::Format_RGB888).rgbSwapped();
}
/* State control */
void GLWidget::killStream() {
cam->killStream();
camThread->join();
}
void GLWidget::setBufferEnabled(bool newBufferState) {
cout << "Player: " << camId << ", buffer state updated: " << newBufferState << endl;
bufferEnabled = newBufferState;
buffer.empty();
}
void GLWidget::setCameraRetryConnection(bool newState) {
cam->setReconnectable(newState);
}
/* Destruction */
GLWidget::~GLWidget() {
cam->killStream();
camThread->join();
}
CamUtils.h:
#pragma once
#include <iostream>
#include <vector>
#include <fstream>
#include <map>
#include <string>
#include <sstream>
#include <algorithm>
#include <nlohmann/json.hpp>
using namespace std;
using json = nlohmann::json;
class CamUtils
{
private:
string camDb = "cameras.dViewer";
map<string, vector<string>> cameraList; // Legacy
json cameras;
ofstream dbFile;
bool dbExists(); // Always hard coded
/* Old IMPLEMENTATION */
void writeLineToDb_(const string& content, bool append = false);
void loadCameras_();
/* JSON based */
void loadCameras();
public:
CamUtils();
string generateRandomString(size_t length);
string getCameraStreamURL(string cameraId) const;
string saveCamera(string ip, string username, string pass); // Return generated id
vector<string> listAllCameraIds();
string getIp(string cameraId);
};
CamUtils.cpp:
#include "CamUtils.h"
#pragma comment(lib, "rpcrt4.lib") // UuidCreate - Minimum supported OS Win 2000
#include
#include <iostream>
CamUtils::CamUtils()
{
if (!dbExists()) {
ofstream dbFile;
dbFile.open(camDb);
cameras["cameras"] = json::array();
dbFile << cameras << std::endl;
dbFile.close();
}
else {
loadCameras();
}
}
vector<string> CamUtils::listAllCameraIds() {
vector<string> ids;
cout << "IN LIST " << endl;
for (auto& cam : cameras["cameras"]) {
ids.push_back(cam["id"].get<string>());
//cout << cam["id"].get<string>() << std::endl;
}
return ids;
}
string CamUtils::getIp(string id) {
vector<string> camDetails = cameraList[id];
string ip = "NO IP WILL DISPLAYED UNTIL I FIGURE OUT A BUG";
for (auto& cam : cameras["cameras"]) {
if (id == cam["id"]) {
ip = cam["ip"].get<string>();
}
}
return ip;
}
string CamUtils::getCameraStreamURL(string id) const {
string url = "err"; // err is the default, it will be overwritten in case id is found, dont forget to check for it
for (auto& cam : cameras["cameras"]) {
if (id == cam["id"]) {
if (cam["username"].get<string>() == "null") {
url = "rtsp://" + cam["ip"].get<string>() + ":554/axis-media/media.amp?tcp";
}
else {
url = "rtsp://" + cam["username"].get<string>() + ":" + cam["password"].get<string>() + "@" + cam["ip"].get<string>() + ":554/axis-media/media.amp?streamprofile=720_30";
}
}
}
return url; // Dont forget to check for err when using this shit
}
string CamUtils::saveCamera(string ip, string username, string password) {
UUID uid;
UuidCreate(&uid);
char* str;
UuidToStringA(&uid, (RPC_CSTR*)&str);
string id = str;
cout << "GEN: " << id << endl;
json cam = json({}); //Create emtpy object
cam["id"] = id;
cam["ip"] = ip;
cam["username"] = username;
cam["password"] = password;
cameras["cameras"].push_back(cam);
std::ofstream out(camDb);
out << cameras << std::endl;
cout << cameras["cameras"] << endl;
cout << "Saved camera as " << id << endl;
return id;
}
bool CamUtils::dbExists() {
ifstream dbFile(camDb);
return (bool)dbFile;
}
void CamUtils::loadCameras() {
cout << "Load call" << endl;
ifstream dbFile(camDb);
string line;
string wholeFile;
while (std::getline(dbFile, line)) {
cout << line << endl;
wholeFile += line;
}
try {
cameras = json::parse(wholeFile);
//cout << cameras["cameras"] << endl;
}
catch (exception e) {
cout << e.what() << endl;
}
dbFile.close();
}
/*
LEGACY CODE, TO BE REMOVED!
*/
void CamUtils::loadCameras_() {
/*
LEGACY CODE:
This used to be the way to load cameras, but I moved on to JSON based configuration so this is no longer needed and will be removed soon
*/
ifstream dbFile(camDb);
string line;
while (std::getline(dbFile, line)) {
/*
This function load camera data to the map:
The order MUST be the following: 0:ID, 1:IP, 2:USERNAME, 3:PASSWORD.
Always delimited with | no spaces between!
*/
if (!line.empty()) {
stringstream ss(line);
string item;
vector<string> splitString;
while (std::getline(ss, item, '|')) {
splitString.push_back(item);
}
if (splitString.size() > 0) {
/* Dont even parse if the program didnt split right*/
//cout << "Split string: " << splitString.size() << "\n";
for (int i = 0; i < (splitString.size()); i++) cameraList[splitString[0]].push_back(splitString[i]);
}
}
}
}
void CamUtils::writeLineToDb_(const string & content, bool append) {
ofstream dbFile;
cout << "Creating?";
if (append) {
dbFile.open(camDb, ios_base::app);
}
else {
dbFile.open(camDb);
}
dbFile << content.c_str() << "\r\n";
dbFile.flush();
}
/* JSON Reworx */
string CamUtils::generateRandomString(size_t length)
{
const char* charmap = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
const size_t charmapLength = strlen(charmap);
auto generator = [&]() { return charmap[rand() % charmapLength]; };
string result;
result.reserve(length);
generate_n(back_inserter(result), length, generator);
return result;
}
End of example
How would I go about decreasing CPU usage when dealing with a large number of streams?
-
Why does my ffmpeg audio sound slower and deeper - sample rate mismatch
4 September 2020, by yogesh zinzu
ok so this is a discord bot to record voice chat
https://hatebin.com/hgjlazacri
Now the bot works perfectly fine, but the issue is that the audio sounds a bit deeper and slower than normal. Why does it happen? How can I make the audio sound 1:1?
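For what it's worth, the sample rate mismatch named in the title is the most plausible cause: if the raw PCM coming out of the Discord voice receiver (typically 48 kHz, 16-bit signed little-endian, stereo in discord.js, though that should be verified) is described to ffmpeg with a different format or rate, the audio gets resampled at the wrong ratio and plays back shifted in speed and pitch. A minimal sketch of declaring the input explicitly, assuming the receiver really does emit 48 kHz s16le stereo and reusing the combinedStream/outputFile names from the code below, could look like this:

// Sketch only: assumes the receiver emits 48 kHz, 16-bit signed LE, stereo PCM.
// Declaring the real input format and rate keeps ffmpeg from assuming 44.1 kHz raw PCM.
ffmpeg(combinedStream)
 .inputFormat('s16le')                   // match the actual sample format of the stream
 .inputOptions(['-ar 48000', '-ac 2'])   // tell ffmpeg the real input rate and channel count
 .audioFrequency(48000)                  // keep the output at the same rate
 .audioChannels(2)
 .audioCodec('aac')
 .format('adts')
 .on('error', (error) => { console.log(error) })
 .pipe(outputFile)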




const Discord = require('discord.js');
const client = new Discord.Client();
const ffmpegInstaller = require('@ffmpeg-installer/ffmpeg');
const ffmpeg = require('fluent-ffmpeg');
ffmpeg.setFfmpegPath(ffmpegInstaller.path);
const fs = require('fs-extra')
const mergeStream = require('merge-stream');
const config = require('./config.json');
const { getAudioDurationInSeconds } = require('get-audio-duration');
const cp = require('child_process');
const path1 = require('path');
const Enmap = require('enmap');
const UserRecords = require("./models/userrecords.js")
const ServerRecords = require("./models/serverrecords.js")
let prefix = `$`
class Readable extends require('stream').Readable { _read() {} }
let recording = false;
let currently_recording = {};
let mp3Paths = [];
const silence_buffer = new Uint8Array(3840);
const express = require('express')
const app = express()
const port = 3000
const publicIP = require('public-ip')
const { program } = require('commander');
const { path } = require('@ffmpeg-installer/ffmpeg');
const version = '0.0.1'
program.version(version);
let debug = false
let runProd = false
let fqdn = "";
const mongoose = require("mongoose");
const MongoClient = require('mongodb').MongoClient;
mongoose.connect('SECRRET',{
 useNewUrlParser: true
}, function(err){
 if(err){
 console.log(err);
 }else{
 console.log("Database connection initiated");
 }
});
require("dotenv").config()
function bufferToStream(buffer) {
 let stream = new Readable();
 stream.push(buffer);
 return stream;
}





client.commands = new Enmap();

client.on('ready', async () => {
 console.log(`Logged in as ${client.user.tag}`);

 let host = "localhost"

 

 let ip = await publicIP.v4();

 let protocol = "http";
 if (!runProd) {
 host = "localhost"
 } else {
 host = `35.226.244.186`;
 }
 fqdn = `${protocol}://${host}:${port}`
 app.listen(port, `0.0.0.0`, () => {
 console.log(`Listening on port ${port} for ${host} at fqdn ${fqdn}`)
 })
});
let randomArr = []
let finalArrWithIds = []
let variable = 0
client.on('message', async message => {
 console.log(`fuck`);
 if(message.content === `$record`){
 mp3Paths = []
 finalArrWithIds = []
 let membersToScrape = Array.from(message.member.voice.channel.members.values());
 membersToScrape.forEach((member) => {
 if(member.id === `749250882830598235`) {
 console.log(`botid`);
 }
 else {
 finalArrWithIds.push(member.id)
 }
 
 })
 const randomNumber = Math.floor(Math.random() * 100)
 randomArr = []
 randomArr.push(randomNumber)
 }
 
 
 const generateSilentData = async (silentStream, memberID) => {
 console.log(`recordingnow`)
 while(recording) {
 if (!currently_recording[memberID]) {
 silentStream.push(silence_buffer);
 }
 await new Promise(r => setTimeout(r, 20));
 }
 return "done";
 }
 console.log(generateSilentData, `status`)
 function generateOutputFile(channelID, memberID) {
 const dir = `./recordings/${channelID}/${memberID}`;
 fs.ensureDirSync(dir);
 const fileName = `${dir}/${randomArr[0]}.aac`;
 console.log(`${fileName} ---------------------------`);
 return fs.createWriteStream(fileName);
 }
 
 if (!fs.existsSync("public")) {
 fs.mkdirSync("public");
 }
 app.use("/public", express.static("./public"));
 if (!message.guild) return;

 if (message.content === config.prefix + config.record_command) {
 if (recording) {
 message.reply("bot is already recording");
 return
 }
 if (message.member.voice.channel) {
 recording = true;
 const connection = await message.member.voice.channel.join();
 const dispatcher = connection.play('./audio.mp3');

 connection.on('speaking', (user, speaking) => {
 if (speaking.has('SPEAKING')) {
 currently_recording[user.id] = true;
 } else {
 currently_recording[user.id] = false;
 }
 })


 let members = Array.from(message.member.voice.channel.members.values());
 members.forEach((member) => {

 if (member.id != client.user.id) {
 let memberStream = connection.receiver.createStream(member, {mode : 'pcm', end : 'manual'})

 let outputFile = generateOutputFile(message.member.voice.channel.id, member.id);
 console.log(outputFile, `outputfile here`);
 mp3Paths.push(outputFile.path);
 

 silence_stream = bufferToStream(new Uint8Array(0));
 generateSilentData(silence_stream, member.id).then(data => console.log(data));
 let combinedStream = mergeStream(silence_stream, memberStream);

 ffmpeg(combinedStream)
 .inputFormat('s32le')
 .audioFrequency(44100)
 .audioChannels(2)
 .on('error', (error) => {console.log(error)})
 .audioCodec('aac')
 .format('adts') 
 .pipe(outputFile)
 
 }
 })
 } else {
 message.reply('You need to join a voice channel first!');
 }
 }

 if (message.content === config.prefix + config.stop_command) {

 let date = new Date();
 let dd = String(date.getDate()).padStart(2, '0');
 let mm = String(date.getMonth() + 1).padStart(2, '0'); 
 let yyyy = date.getFullYear();
 date = mm + '/' + dd + '/' + yyyy;





 let currentVoiceChannel = message.member.voice.channel;
 if (currentVoiceChannel) {
 recording = false;
 await currentVoiceChannel.leave();

 let mergedOutputFolder = './recordings/' + message.member.voice.channel.id + `/${randomArr[0]}/`;
 fs.ensureDirSync(mergedOutputFolder);
 let file_name = `${randomArr[0]}` + '.aac';
 let mergedOutputFile = mergedOutputFolder + file_name;
 
 
 let download_path = message.member.voice.channel.id + `/${randomArr[0]}/` + file_name;

 let mixedOutput = new ffmpeg();
 console.log(mp3Paths, `mp3pathshere`);
 mp3Paths.forEach((mp3Path) => {
 mixedOutput.addInput(mp3Path);
 
 })
 console.log(mp3Paths);
 //mixedOutput.complexFilter('amix=inputs=2:duration=longest');
 mixedOutput.complexFilter('amix=inputs=' + mp3Paths.length + ':duration=longest');
 
 let processEmbed = new Discord.MessageEmbed().setTitle(`Audio Processing.`)
 processEmbed.addField(`Audio processing starting now..`, `Processing Audio`)
 processEmbed.setThumbnail(`https://media.discordapp.net/attachments/730811581046325348/748610998985818202/speaker.png`)
 processEmbed.setColor(` #00FFFF`)
 const processEmbedMsg = await message.channel.send(processEmbed)
 async function saveMp3(mixedData, outputMixed) {
 console.log(`${mixedData} MIXED `)
 
 
 
 return new Promise((resolve, reject) => {
 mixedData.on('error', reject).on('progress',
 async (progress) => {
 
 let processEmbedEdit = new Discord.MessageEmbed().setTitle(`Audio Processing.`)
 processEmbedEdit.addField(`Processing: ${progress.targetSize} KB converted`, `Processing Audio`)
 processEmbedEdit.setThumbnail(`https://media.discordapp.net/attachments/730811581046325348/748610998985818202/speaker.png`)
 processEmbedEdit.setColor(` #00FFFF`)
 processEmbedMsg.edit(processEmbedEdit)
 console.log('Processing: ' + progress.targetSize + ' KB converted');
 }).on('end', () => {
 console.log('Processing finished !');
 resolve()
 }).saveToFile(outputMixed);
 console.log(`${outputMixed} IT IS HERE`);
 })
 }
 // mixedOutput.saveToFile(mergedOutputFile);
 await saveMp3(mixedOutput, mergedOutputFile);
 console.log(`${mixedOutput} IN HEREEEEEEE`);
 // We saved the recording, now copy the recording
 if (!fs.existsSync(`./public`)) {
 fs.mkdirSync(`./public`);
 }
 let sourceFile = `${__dirname}/recordings/${download_path}`
 console.log(`DOWNLOAD PATH HERE ${download_path}`)
 const guildName = message.guild.id;
 const serveExist = `/public/${guildName}`
 if (!fs.existsSync(`.${serveExist}`)) {
 fs.mkdirSync(`.${serveExist}`)
 }
 let destionationFile = `${__dirname}${serveExist}/${file_name}`

 let errorThrown = false
 try {
 fs.copySync(sourceFile, destionationFile);
 } catch (err) {
 errorThrown = true
 await message.channel.send(`Error: ${err.message}`)
 }
 const usersWithTag = finalArrWithIds.map(user => `\n <@${user}>`);
 let timeSpent = await getAudioDurationInSeconds(`public/${guildName}/${file_name}`)
 let timesSpentRound = Math.floor(timeSpent)
 let finalTimeSpent = timesSpentRound / 60
 let finalTimeForReal = Math.floor(finalTimeSpent)
 if(!errorThrown){
 //--------------------- server recording save START
 class GeneralRecords {
 constructor(generalLink, date, voice, time) {
 this.generalLink = generalLink;
 this.date = date;
 this.note = `no note`;
 this.voice = voice;
 this.time = time
 }
 }
 let newGeneralRecordClassObject = new GeneralRecords(`${fqdn}/public/${guildName}/${file_name}`, date, usersWithTag, finalTimeForReal)
 let checkingServerRecord = await ServerRecords.exists({userid: `server`})
 if(checkingServerRecord === true){
 existingServerRecord = await ServerRecords.findOne({userid: `server`})
 existingServerRecord.content.push(newGeneralRecordClassObject)
 await existingServerRecord.save()
 }
 if(checkingServerRecord === false){
 let serverRecord = new ServerRecords()
 serverRecord.userid = `server`
 serverRecord.content.push(newGeneralRecordClassObject)
 await serverRecord.save()
 }
 //--------------------- server recording save STOP
 }
 
 //--------------------- personal recording section START
 for( member of finalArrWithIds) {

 let personal_download_path = message.member.voice.channel.id + `/${member}/` + file_name;
 let sourceFilePersonal = `${__dirname}/recordings/${personal_download_path}`
 let destionationFilePersonal = `${__dirname}${serveExist}/${member}/${file_name}`
 await fs.copySync(sourceFilePersonal, destionationFilePersonal);
 const user = client.users.cache.get(member);
 console.log(user, `user here`);
 try {
 ffmpeg.setFfmpegPath(ffmpegInstaller.path);
 
 ffmpeg(`public/${guildName}/${member}/${file_name}`)
 .audioFilters('silenceremove=stop_periods=-1:stop_duration=1:stop_threshold=-90dB')
 .output(`public/${guildName}/${member}/personal-${file_name}`)
 .on(`end`, function () {
 console.log(`DONE`);
 })
 .on(`error`, function (error) {
 console.log(`An error occured` + error.message)
 })
 .run();
 
 }
 catch (error) {
 console.log(error)
 }
 

 // ----------------- SAVING PERSONAL RECORDING TO DATABASE START
 class PersonalRecords {
 constructor(generalLink, personalLink, date, time) {
 this.generalLink = generalLink;
 this.personalLink = personalLink;
 this.date = date;
 this.note = `no note`;
 this.time = time;
 }
 }
 let timeSpentPersonal = await getAudioDurationInSeconds(`public/${guildName}/${file_name}`)
 let timesSpentRoundPersonal = Math.floor(timeSpentPersonal)
 let finalTimeSpentPersonal = timesSpentRoundPersonal / 60
 let finalTimeForRealPersonal = Math.floor(finalTimeSpentPersonal)
 let newPersonalRecordClassObject = new PersonalRecords(`${fqdn}/public/${guildName}/${file_name}`, `${fqdn}/public/${guildName}/${member}/personal-${file_name}`, date, finalTimeForRealPersonal)

 let checkingUserRecord = await UserRecords.exists({userid: member})
 if(checkingUserRecord === true){
 existingUserRecord = await UserRecords.findOne({userid: member})
 existingUserRecord.content.push(newPersonalRecordClassObject)
 await existingUserRecord.save()
 }
 if(checkingUserRecord === false){
 let newRecord = new UserRecords()
 newRecord.userid = member
 newRecord.content.push(newPersonalRecordClassObject)
 await newRecord.save()
 }


 
 // ----------------- SAVING PERSONAL RECORDING TO DATABASE END
 

 const endPersonalEmbed = new Discord.MessageEmbed().setTitle(`Your performance was amazing ! Review it here :D`)
 endPersonalEmbed.setColor('#9400D3')
 endPersonalEmbed.setThumbnail(`https://media.discordapp.net/attachments/730811581046325348/745381641324724294/vinyl.png`)
 endPersonalEmbed.addField(`1
-
Discord 24/7 video stream self-bot crashes after a couple hours
21 July 2023, by angelo
I've implemented this library to make a self-bot that streams videos from a local folder in a loop 24/7 (don't ask me why). I set up an Ubuntu VPS to run the bot and it works perfectly fine for the first 2-3 hours; after that it gets more and more laggy until the server crashes.
PS: it's basically my first time using JavaScript and I stole most of the code from this repo, so don't bully me.


Here's the code :


import { Client, TextChannel, CustomStatus, ActivityOptions } from "discord.js-selfbot-v13";
import { command, streamLivestreamVideo, VoiceUdp, setStreamOpts, streamOpts } from "@dank074/discord-video-stream";
import config from "./config.json";
import fs from 'fs';
import path from 'path';

const client = new Client();

client.patchVoiceEvents(); //this is necessary to register event handlers

setStreamOpts(
 config.streamOpts.width,
 config.streamOpts.height,
 config.streamOpts.fps,
 config.streamOpts.bitrateKbps,
 config.streamOpts.hardware_acc
)

const prefix = '$';

const moviesFolder = config.movieFolder || './movies';

const movieFiles = fs.readdirSync(moviesFolder);
let movies = movieFiles.map(file => {
 const fileName = path.parse(file).name;
 // replace space with _
 return { name: fileName.replace(/ /g, ''), path: path.join(moviesFolder, file) };
});
let originalMovList = [...movies];
let movList = movies;
let shouldStop = false;

// print out all movies
console.log(`Available movies:\n${movies.map(m => m.name).join('\n')}`);

const status_idle = () => {
 return new CustomStatus()
 .setState('摸鱼进行中')
 .setEmoji('🐟')
}

const status_watch = (name) => {
 return new CustomStatus()
 .setState(`Playing ${name}...`)
 .setEmoji('📽')
}

// ready event
client.on("ready", () => {
 if (client.user) {
 console.log(`--- ${client.user.tag} is ready ---`);
 client.user.setActivity(status_idle() as ActivityOptions)
 }
});

let streamStatus = {
 joined: false,
 joinsucc: false,
 playing: false,
 channelInfo: {
 guildId: '',
 channelId: '',
 cmdChannelId: ''
 },
 starttime: "00:00:00",
 timemark: '',
}

client.on('voiceStateUpdate', (oldState, newState) => {
 // when exit channel
 if (oldState.member?.user.id == client.user?.id) {
 if (oldState.channelId && !newState.channelId) {
 streamStatus.joined = false;
 streamStatus.joinsucc = false;
 streamStatus.playing = false;
 streamStatus.channelInfo = {
 guildId: '',
 channelId: '',
 cmdChannelId: streamStatus.channelInfo.cmdChannelId
 }
 client.user?.setActivity(status_idle() as ActivityOptions)
 }
 }
 // when join channel success
 if (newState.member?.user.id == client.user?.id) {
 if (newState.channelId && !oldState.channelId) {
 streamStatus.joined = true;
 if (newState.guild.id == streamStatus.channelInfo.guildId && newState.channelId == streamStatus.channelInfo.channelId) {
 streamStatus.joinsucc = true;
 }
 }
 }
})

client.on('messageCreate', async (message) => {
 if (message.author.bot) return; // ignore bots
 if (message.author.id == client.user?.id) return; // ignore self
 if (!config.commandChannels.includes(message.channel.id)) return; // ignore non-command channels
 if (!message.content.startsWith(prefix)) return; // ignore non-commands

 const args = message.content.slice(prefix.length).trim().split(/ +/); // split command and arguments
 if (args.length == 0) return;

 const user_cmd = args.shift()!.toLowerCase();

 if (config.commandChannels.includes(message.channel.id)) {
 switch (user_cmd) {
 case 'play':
 playCommand(args, message);
 break;
 case 'stop':
 stopCommand(message);
 break;
 case 'playtime':
 playtimeCommand(message);
 break;
 case 'pause':
 pauseCommand(message);
 break;
 case 'resume':
 resumeCommand(message);
 break;
 case 'list':
 listCommand(message);
 break;
 case 'status':
 statusCommand(message);
 break;
 case 'refresh':
 refreshCommand(message);
 break;
 case 'help':
 helpCommand(message);
 break;
 case 'playall':
 playAllCommand(args, message);
 break;
 case 'stream':
 streamCommand(args, message);
 break;
 case 'shuffle':
 shuffleCommand();
 break;
 case 'skip':
 //skip cmd
 break;
 default:
 message.reply('Invalid command');
 }
 }
});

client.login("TOKEN_HERE");

let lastPrint = "";

async function playAllCommand(args, message) {
 if (streamStatus.joined) {
 message.reply("Already joined");
 return;
 }

 // args = [guildId]/[channelId]
 if (args.length === 0) {
 message.reply("Missing voice channel");
 return;
 }

 // process args
 const [guildId, channelId] = args.shift()!.split("/");
 if (!guildId || !channelId) {
 message.reply("Invalid voice channel");
 return;
 }

 await client.joinVoice(guildId, channelId);
 streamStatus.joined = true;
 streamStatus.playing = false;
 streamStatus.starttime = "00:00:00";
 streamStatus.channelInfo = {
 guildId: guildId,
 channelId: channelId,
 cmdChannelId: message.channel.id,
 };

 const streamUdpConn = await client.createStream();

 streamUdpConn.voiceConnection.setSpeaking(true);
 streamUdpConn.voiceConnection.setVideoStatus(true);

 playAllVideos(streamUdpConn); // Start playing videos

 // Keep the stream open

 streamStatus.joined = false;
 streamStatus.joinsucc = false;
 streamStatus.playing = false;
 lastPrint = "";
 streamStatus.channelInfo = {
 guildId: "",
 channelId: "",
 cmdChannelId: "",
 };
}

async function playAllVideos(udpConn: VoiceUdp) {

 console.log("Started playing video");

 udpConn.voiceConnection.setSpeaking(true);
 udpConn.voiceConnection.setVideoStatus(true);

 try {
 let index = 0;

 while (true) {
 if (shouldStop) {
 break; // For the stop command
 }

 if (index >= movies.length) {
 // Reset the loop
 index = 0;
 }

 const movie = movList[index];

 if (!movie) {
 console.log("Movie not found");
 index++;
 continue;
 }

 let options = {};
 options["-ss"] = "00:00:00";

 console.log(`Playing ${movie.name}...`);

 try {
 let videoStream = streamLivestreamVideo(movie.path, udpConn);
 command?.on('progress', (msg) => {
 // print the timemark if more than 10 seconds have passed since the last print; be careful when it passes 0
 if (streamStatus.timemark) {
 if (lastPrint != "") {
 let last = lastPrint.split(':');
 let now = msg.timemark.split(':');
 // turn to seconds
 let s = parseInt(now[2]) + parseInt(now[1]) * 60 + parseInt(now[0]) * 3600;
 let l = parseInt(last[2]) + parseInt(last[1]) * 60 + parseInt(last[0]) * 3600;
 if (s - l >= 10) {
 console.log(`Timemark: ${msg.timemark}`);
 lastPrint = msg.timemark;
 }
 } else {
 console.log(`Timemark: ${msg.timemark}`);
 lastPrint = msg.timemark;
 }
 }
 streamStatus.timemark = msg.timemark;
 });
 const res = await videoStream;
 console.log("Finished playing video " + res);
 } catch (e) {
 console.log(e);
 }

 index++; // Move on to the next movie
 }
 } finally {
 udpConn.voiceConnection.setSpeaking(false);
 udpConn.voiceConnection.setVideoStatus(false);
 }

 command?.kill("SIGINT");
 // send message to channel, not reply
 (client.channels.cache.get(streamStatus.channelInfo.cmdChannelId) as TextChannel).send('Finished playing video, timemark is ' + streamStatus.timemark);
 client.leaveVoice();
 client.user?.setActivity(status_idle() as ActivityOptions)
 streamStatus.joined = false;
 streamStatus.joinsucc = false;
 streamStatus.playing = false;
 lastPrint = ""
 streamStatus.channelInfo = {
 guildId: '',
 channelId: '',
 cmdChannelId: ''
 };
}

function shuffleArray(array) {
 for (let i = array.length - 1; i > 0; i--) {
 const j = Math.floor(Math.random() * (i + 1));
 [array[i], array[j]] = [array[j], array[i]];
 }
}

function shuffleCommand() {
 shuffleArray(movList);
}

async function playCommand(args, message) {
 if (streamStatus.joined) {
 message.reply('Already joined');
 return;
 }

 // args = [guildId]/[channelId]
 if (args.length == 0) {
 message.reply('Missing voice channel');
 return;
 }

 // process args
 const [guildId, channelId] = args.shift()!.split('/');
 if (!guildId || !channelId) {
 message.reply('Invalid voice channel');
 return;
 }

 // get movie name and find movie file
 let moviename = args.shift()
 let movie = movies.find(m => m.name == moviename);

 if (!movie) {
 message.reply('Movie not found');
 return;
 }

 // get start time from args "hh:mm:ss"
 let startTime = args.shift();
 let options = {}
 // check if start time is valid
 if (startTime) {
 let time = startTime.split(':');
 if (time.length != 3) {
 message.reply('Invalid start time');
 return;
 }
 let h = parseInt(time[0]);
 let m = parseInt(time[1]);
 let s = parseInt(time[2]);
 if (isNaN(h) || isNaN(m) || isNaN(s)) {
 message.reply('Invalid start time');
 return;
 }
 startTime = `${h}:${m}:${s}`;
 options['-ss'] = startTime;
 console.log("Start time: " + startTime);
 }

 await client.joinVoice(guildId, channelId);
 streamStatus.joined = true;
 streamStatus.playing = false;
 streamStatus.starttime = startTime ? startTime : "00:00:00";
 streamStatus.channelInfo = {
 guildId: guildId,
 channelId: channelId,
 cmdChannelId: message.channel.id
 }
 const streamUdpConn = await client.createStream();
 playVideo(movie.path, streamUdpConn, options);
 message.reply('Playing ' + (startTime ? ` from ${startTime} ` : '') + moviename + '...');
 client.user?.setActivity(status_watch(moviename) as ActivityOptions);
}

function stopCommand(message) {
 client.leaveVoice()
 streamStatus.joined = false;
 streamStatus.joinsucc = false;
 streamStatus.playing = false;
 streamStatus.channelInfo = {
 guildId: '',
 channelId: '',
 cmdChannelId: streamStatus.channelInfo.cmdChannelId
 }
 // use sigquit??
 command?.kill("SIGINT");
 // msg
 message.reply('Stopped playing');
 shouldStop = true;
 movList = [...originalMovList];
}

function playtimeCommand(message) {
 // streamStatus.starttime + streamStatus.timemark
 // starttime is hh:mm:ss, timemark is hh:mm:ss.000
 let start = streamStatus.starttime.split(':');
 let mark = streamStatus.timemark.split(':');
 let h = parseInt(start[0]) + parseInt(mark[0]);
 let m = parseInt(start[1]) + parseInt(mark[1]);
 let s = parseInt(start[2]) + parseInt(mark[2]);
 if (s >= 60) {
 m += 1;
 s -= 60;
 }
 if (m >= 60) {
 h += 1;
 m -= 60;
 }
 message.reply(`Play time: ${h}:${m}:${s}`);
}

function pauseCommand(message) {
 if (!streamStatus.playing) {
 command?.kill("SIGSTOP");
 message.reply('Paused');
 streamStatus.playing = false;
 } else {
 message.reply('Not playing');
 }
}

function resumeCommand(message) {
 if (!streamStatus.playing) {
 command?.kill("SIGCONT");
 message.reply('Resumed');
 streamStatus.playing = true;
 } else {
 message.reply('Not playing');
 }
}

function listCommand(message) {
 message.reply(`Available movies:\n${movies.map(m => m.name).join('\n')}`);
}

function statusCommand(message) {
 message.reply(`Joined: ${streamStatus.joined}\nJoin success: ${streamStatus.joinsucc}\nPlaying: ${streamStatus.playing}\nChannel: ${streamStatus.channelInfo.guildId}/${streamStatus.channelInfo.channelId}\nTimemark: ${streamStatus.timemark}\nStart time: ${streamStatus.starttime}`);
}

function refreshCommand(message) {
 // refresh movie list
 const movieFiles = fs.readdirSync(moviesFolder);
 movies = movieFiles.map(file => {
 const fileName = path.parse(file).name;
 // replace space with _
 return { name: fileName.replace(/ /g, ''), path: path.join(moviesFolder, file) };
 });
 message.reply('Movie list refreshed ' + movies.length + ' movies found.\n' + movies.map(m => m.name).join('\n'));
}

function helpCommand(message) {
 // reply all commands here
 message.reply('Available commands:\nplay [guildId]/[channelId] [movie] [start time]\nstop\nlist\nstatus\nrefresh\nplaytime\npause\nresume\nhelp');
}

async function playVideo(video: string, udpConn: VoiceUdp, options: any) {
 console.log("Started playing video");

 udpConn.voiceConnection.setSpeaking(true);
 udpConn.voiceConnection.setVideoStatus(true);
 try {
 let videoStream = streamLivestreamVideo(video, udpConn);
 command?.on('progress', (msg) => {
 // print the timemark if more than 10 seconds have passed since the last print; be careful when it passes 0
 if (streamStatus.timemark) {
 if (lastPrint != "") {
 let last = lastPrint.split(':');
 let now = msg.timemark.split(':');
 // turn to seconds
 let s = parseInt(now[2]) + parseInt(now[1]) * 60 + parseInt(now[0]) * 3600;
 let l = parseInt(last[2]) + parseInt(last[1]) * 60 + parseInt(last[0]) * 3600;
 if (s - l >= 10) {
 console.log(`Timemark: ${msg.timemark}`);
 lastPrint = msg.timemark;
 }
 } else {
 console.log(`Timemark: ${msg.timemark}`);
 lastPrint = msg.timemark;
 }
 }
 streamStatus.timemark = msg.timemark;
 });
 const res = await videoStream;
 console.log("Finished playing video " + res);
 } catch (e) {
 console.log(e);
 } finally {
 udpConn.voiceConnection.setSpeaking(false);
 udpConn.voiceConnection.setVideoStatus(false);
 }
 command?.kill("SIGINT");
 // send message to channel, not reply
 (client.channels.cache.get(streamStatus.channelInfo.cmdChannelId) as TextChannel).send('Finished playing video, timemark is ' + streamStatus.timemark);
 client.leaveVoice();
 client.user?.setActivity(status_idle() as ActivityOptions)
 streamStatus.joined = false;
 streamStatus.joinsucc = false;
 streamStatus.playing = false;
 lastPrint = ""
 streamStatus.channelInfo = {
 guildId: '',
 channelId: '',
 cmdChannelId: ''
 }
}

async function streamCommand(args, message) {

 if (streamStatus.joined) {
 message.reply('Already joined');
 return;
 }

 // args = [guildId]/[channelId]
 if (args.length == 0) {
 message.reply('Missing voice channel');
 return;
 }

 // process args
 const [guildId, channelId] = args.shift()!.split('/');
 if (!guildId || !channelId) {
 message.reply('Invalid voice channel');
 return;
 }

 let url = args.shift()
 let options = {}

 await client.joinVoice(guildId, channelId);
 streamStatus.joined = true;
 streamStatus.playing = false;
 //streamStatus.starttime = startTime ? startTime : "00:00:00";
 streamStatus.channelInfo = {
 guildId: guildId,
 channelId: channelId,
 cmdChannelId: message.channel.id
 }
 const streamUdpConn = await client.createStream();
 playStream(url, streamUdpConn, options);
 message.reply('Playing url');
 client.user?.setActivity(status_watch('livestream') as ActivityOptions);
}

async function playStream(video: string, udpConn: VoiceUdp, options: any) {
 console.log("Started playing video");

 udpConn.voiceConnection.setSpeaking(true);
 udpConn.voiceConnection.setVideoStatus(true);

 try {
 console.log("Trying to stream url");
 const res = await streamLivestreamVideo(video, udpConn);
 console.log("Finished streaming url");
 } catch (e) {
 console.log(e);
 } finally {
 udpConn.voiceConnection.setSpeaking(false);
 udpConn.voiceConnection.setVideoStatus(false);
 }

 command?.kill("SIGINT");
 client.leaveVoice();
 client.user?.setActivity(status_idle() as ActivityOptions)
 streamStatus.joined = false;
 streamStatus.joinsucc = false;
 streamStatus.playing = false;
 streamStatus.channelInfo = {
 guildId: '',
 channelId: '',
 cmdChannelId: ''
 }

}

// run server if enabled in config
if (config.server.enabled) {
 // run server.js
 require('./server');
}




I've tried running the code with the nocache package, setting up a cron job to clean the cache every 5 minutes, and unifying functions in the code, but nothing works.
I think the problem has to do with some process that never really ends after one video finishes playing, probably ffmpeg. I don't know whether the problem is my code, my VPS, or the library.
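One way to test that hypothesis would be to make sure the ffmpeg command spawned for each video is explicitly killed, and given a moment to exit, every time a video finishes, rather than only once when the whole loop ends. This is only a sketch under that assumption, reusing the command handle, streamLivestreamVideo and VoiceUdp already imported at the top of the file:

// Sketch only: assumes the lag comes from ffmpeg child processes piling up between videos.
async function playOneAndCleanUp(moviePath: string, udpConn: VoiceUdp): Promise<void> {
    try {
        const res = await streamLivestreamVideo(moviePath, udpConn);
        console.log("Finished playing video " + res);
    } catch (e) {
        console.log(e);
    } finally {
        command?.kill("SIGINT");                     // stop this video's ffmpeg child
        await new Promise(r => setTimeout(r, 1000)); // give it a moment to exit and free resources
    }
}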


I want the bot to stay in the voice channel streaming my videos 24/7 (no interruptions), but I don't know how to prevent it from getting laggy after a while.


This is the config.json file, just in case you want to test the code and can't find it:


{
 "token": "DCTOKEN",
 "videoChannels": ["ID", "OTHERID"],
 "commandChannels": ["ID", "OTHERID"],
 "adminIds": ["ID"],
 "movieFolder": "./movies/",
 "previewCache": "/tmp/preview-cache",
 "streamOpts": {
 "width": 1280,
 "height": 720,
 "fps": 30,
 "bitrateKbps": 3000,
 "hardware_acc": true
 },
 "server": {
 "enabled": false,
 "port": 8080
 }
}