Other articles (53)

  • Use, discuss, criticize

    13 April 2011

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

  • MediaSPIP Player: potential problems

    22 February 2011

    The player does not work in Internet Explorer
    In Internet Explorer (at least versions 7 and 8), the plugin uses the Flash player flowplayer to play video and audio. If the player does not seem to work, the cause may be the configuration of Apache's mod_deflate module.
    If the configuration of that Apache module contains a line resembling the following, try removing it or commenting it out to see whether the player then works correctly: /** * GeSHi (C) 2004 - 2007 Nigel McNie, (...)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, provided your MediaSPIP installation is at version 0.2 or higher. If in doubt, contact the administrator of your MediaSPIP to find out.

On other sites (5720)

  • How to save rtsp stream without packet loss by using FFMPEG

    11 April 2021, by sumit singh

    I am saving the stream of a live camera using FFmpeg. When I try to save the video, some data packets are lost, so the video does not play properly. I am using the following FFmpeg library:

    



    The command I am sending is:

    



     String[] cmd = {"-y", "-i", "rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov", "-c:v", "libx264", "-acodec", "aac","-t", time, file_path};
 execFFmpegBinary(cmd);


    



    I also tried this command, but the result is the same:

    



    String[] cmd = { "-y", "-rtsp_transport", "tcp", "-i",  "rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov", "-c:v", "libx264", "-preset", "slow", "-b:v", "500k", "-maxrate", "500k", "-bufsize", "3000k", "-vf", "scale=-1:480", "-threads", "0", "-codec:a", "libfdk_aac", "-b:a", "128k", "-t", time, file_path};


    



    Here is the log of the command output:

    



    07-15 15:16:55.180 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:55.180 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 455 packets
        07-15 15:16:55.190 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:55.190 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 16 packets
        07-15 15:16:55.300 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:55.300 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 13 packets
        07-15 15:16:55.310 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45c42680] mb_type 58 in P slice too large at 31 16
        07-15 15:16:55.320 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45c42680] error while decoding MB 31 16
        07-15 15:16:55.330 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45c42680] concealing 918 DC, 918 AC, 918 MV errors in P frame
        07-15 15:16:55.330 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:55.330 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 55 packets
        07-15 15:16:55.340 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:55.340 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 19 packets
        07-15 15:16:55.340 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:55.350 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 1 packets
        07-15 15:16:55.350 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x458737c0] out of range intra chroma pred mode at 7 28
        07-15 15:16:55.350 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x458737c0] error while decoding MB 7 28
        07-15 15:16:55.360 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x458737c0] concealing 402 DC, 402 AC, 402 MV errors in P frame
        07-15 15:16:55.360 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45ab8020] P sub_mb_type 8 out of range at 28 14
        07-15 15:16:55.370 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45ab8020] error while decoding MB 28 14
        07-15 15:16:55.370 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45ab8020] concealing 1011 DC, 1011 AC, 1011 MV errors in P frame
        07-15 15:16:55.380 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45afe200] cbp too large (132) at 12 20
        07-15 15:16:55.380 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45afe200] error while decoding MB 12 20
        07-15 15:16:55.390 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45afe200] concealing 757 DC, 757 AC, 757 MV errors in P frame
        07-15 15:16:55.640 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  173 fps=3.6 q=28.0 size=     657kB time=00:00:23.84 bitrate= 225.8kbits/s speed=0.502x
        07-15 15:16:55.840 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:55.910 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 520 packets
        07-15 15:16:55.920 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:55.920 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 2 packets
        07-15 15:16:55.920 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:55.920 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 56 packets
        07-15 15:16:55.930 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x4581bb60] concealing 800 DC, 800 AC, 800 MV errors in P frame
        07-15 15:16:56.010 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45c42680] concealing 311 DC, 311 AC, 311 MV errors in P frame
        07-15 15:16:56.720 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  174 fps=3.6 q=28.0 size=     674kB time=00:00:23.88 bitrate= 231.0kbits/s speed=0.497x
        07-15 15:16:57.050 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  175 fps=3.6 q=28.0 size=     675kB time=00:00:29.00 bitrate= 190.6kbits/s speed=0.596x
        07-15 15:16:57.350 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  177 fps=3.6 q=28.0 size=     686kB time=00:00:30.36 bitrate= 185.1kbits/s speed=0.617x
        07-15 15:16:58.610 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  182 fps=3.6 q=28.0 size=     703kB time=00:00:30.56 bitrate= 188.3kbits/s speed=0.609x
        07-15 15:16:59.120 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:59.120 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 23 packets
        07-15 15:16:59.190 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  184 fps=3.6 q=28.0 size=     709kB time=00:00:30.64 bitrate= 189.6kbits/s speed=0.602x
        07-15 15:16:59.200 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45c42680] concealing 69 DC, 69 AC, 69 MV errors in P frame
        07-15 15:16:59.370 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:59.440 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 35 packets
        07-15 15:16:59.440 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x458737c0] concealing 338 DC, 338 AC, 338 MV errors in I frame
        07-15 15:16:59.920 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  187 fps=3.6 q=28.0 size=     716kB time=00:00:30.76 bitrate= 190.7kbits/s speed=0.595x
        07-15 15:16:59.920 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:16:59.990 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 20 packets
        07-15 15:16:59.990 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45afe200] concealing 489 DC, 489 AC, 489 MV errors in P frame
        07-15 15:17:01.980 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  190 fps=3.5 q=28.0 size=     737kB time=00:00:30.88 bitrate= 195.4kbits/s speed=0.575x
        07-15 15:17:01.980 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:01.980 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 35 packets
        07-15 15:17:02.060 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:02.060 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 10 packets
        07-15 15:17:02.230 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:02.230 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 10 packets
        07-15 15:17:02.270 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45ab8020] dquant out of range (124) at 15 35
        07-15 15:17:02.270 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45ab8020] error while decoding MB 15 35
        07-15 15:17:02.280 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45ab8020] concealing 79 DC, 79 AC, 79 MV errors in P frame
        07-15 15:17:02.280 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:02.280 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 76 packets
        07-15 15:17:02.290 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45c42680] concealing 805 DC, 805 AC, 805 MV errors in P frame
        07-15 15:17:02.510 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:02.510 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 40 packets
        07-15 15:17:02.600 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  194 fps=3.6 q=28.0 size=     747kB time=00:00:31.04 bitrate= 197.1kbits/s speed=0.57x
        07-15 15:17:02.610 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:02.610 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 60 packets
        07-15 15:17:02.610 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x458737c0] concealing 526 DC, 526 AC, 526 MV errors in P frame
        07-15 15:17:02.620 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x4581bb60] concealing 197 DC, 197 AC, 197 MV errors in P frame
        07-15 15:17:03.380 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  195 fps=3.5 q=28.0 size=     751kB time=00:00:31.08 bitrate= 198.1kbits/s speed=0.562x
        07-15 15:17:03.570 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:03.640 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 474 packets
        07-15 15:17:03.640 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:03.650 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 109 packets
        07-15 15:17:03.650 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x4581bb60] negative number of zero coeffs at 28 22
        07-15 15:17:03.650 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x4581bb60] error while decoding MB 28 22
        07-15 15:17:03.660 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x4581bb60] concealing 651 DC, 651 AC, 651 MV errors in P frame
        07-15 15:17:03.660 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x458737c0] concealing 1074 DC, 1074 AC, 1074 MV errors in P frame
        07-15 15:17:03.920 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  195 fps=3.5 q=28.0 size=     751kB time=00:00:31.08 bitrate= 198.1kbits/s speed=0.557x
        07-15 15:17:05.530 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  199 fps=3.5 q=25.0 size=     766kB time=00:00:32.84 bitrate= 191.2kbits/s speed=0.573x
        07-15 15:17:06.250 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  202 fps=3.5 q=22.0 size=     784kB time=00:00:32.96 bitrate= 194.8kbits/s speed=0.568x
        07-15 15:17:07.130 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  205 fps=3.5 q=28.0 size=     800kB time=00:00:33.08 bitrate= 198.2kbits/s speed=0.562x
        07-15 15:17:08.960 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  207 fps=3.4 q=28.0 size=     811kB time=00:00:35.32 bitrate= 188.0kbits/s speed=0.581x
        07-15 15:17:09.560 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:09.560 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 133 packets
        07-15 15:17:09.660 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  210 fps=3.4 q=28.0 size=     817kB time=00:00:35.84 bitrate= 186.8kbits/s speed=0.584x
        07-15 15:17:09.670 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:09.670 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 14 packets
        07-15 15:17:09.680 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45c42680] concealing 410 DC, 410 AC, 410 MV errors in I frame
        07-15 15:17:09.770 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x4581bb60] concealing 72 DC, 72 AC, 72 MV errors in P frame
        07-15 15:17:10.730 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:10.730 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 24 packets
        07-15 15:17:10.740 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:10.740 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 56 packets
        07-15 15:17:11.410 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  212 fps=3.4 q=28.0 size=     819kB time=00:00:35.92 bitrate= 186.8kbits/s speed=0.574x
        07-15 15:17:11.510 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:11.510 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 14 packets
        07-15 15:17:11.510 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45ab8020] concealing 77 DC, 77 AC, 77 MV errors in P frame
        07-15 15:17:11.520 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  213 fps=3.4 q=28.0 size=     820kB time=00:00:35.96 bitrate= 186.8kbits/s speed=0.568x
        07-15 15:17:11.520 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:11.520 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 10 packets
        07-15 15:17:11.670 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:11.720 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 82 packets
        07-15 15:17:11.730 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:11.730 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 41 packets
        07-15 15:17:11.740 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45c42680] concealing 1343 DC, 1343 AC, 1343 MV errors in I frame
        07-15 15:17:11.740 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:11.750 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 240 packets
        07-15 15:17:11.890 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:11.900 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 7 packets
        07-15 15:17:11.900 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:11.900 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 63 packets
        07-15 15:17:11.940 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45ab8020] out of range intra chroma pred mode at 42 32
        07-15 15:17:11.950 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45ab8020] error while decoding MB 42 32
        07-15 15:17:11.960 25713-25713/com.github.sampleffmpeg V/output: progress : [h264 @ 0x45ab8020] concealing 187 DC, 187 AC, 187 MV errors in P frame
        07-15 15:17:12.420 25713-25713/com.github.sampleffmpeg V/output: progress : [rtsp @ 0x420391c0] max delay reached. need to consume packet
        07-15 15:17:12.420 25713-25713/com.github.sampleffmpeg V/output: progress : [NULL @ 0x4203ba00] RTP: missed 9 packets
        07-15 15:17:12.830 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  217 fps=3.4 q=24.0 size=     881kB time=00:00:36.12 bitrate= 199.7kbits/s speed=0.562x
        07-15 15:17:32.710 25713-25713/com.github.sampleffmpeg V/output: progress : frame=  217 fps=2.6 q=-1.0 Lsize=    1192kB time=00:00:59.48 bitrate= 164.2kbits/s speed=0.703x
        07-15 15:17:32.720 25713-25713/com.github.sampleffmpeg V/output: progress : video:1190kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.207405%
        07-15 15:17:32.820 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] frame I:4     Avg QP:17.76  size: 23826
        07-15 15:17:32.820 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] frame P:141   Avg QP:20.25  size:  7361
        07-15 15:17:32.830 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] frame B:72    Avg QP:23.59  size:  1173
        07-15 15:17:32.830 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] consecutive B-frames: 54.4%  1.8%  6.9% 36.9%
        07-15 15:17:32.840 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] mb I  I16..4: 23.2% 42.9% 34.0%
        07-15 15:17:32.840 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] mb P  I16..4:  3.6%  3.2%  1.9%  P16..4: 27.9%  6.4%  4.8%  0.0%  0.0%    skip:52.3%
        07-15 15:17:32.850 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] mb B  I16..4:  0.1%  0.1%  0.0%  B16..8: 20.2%  1.0%  0.3%  direct: 1.1%  skip:77.2%  L0:42.5% L1:53.9% BI: 3.7%
        07-15 15:17:32.860 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] 8x8 transform intra:38.4% inter:29.9%
        07-15 15:17:32.860 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] coded y,uvDC,uvAC intra: 49.0% 31.1% 15.6% inter: 12.9% 10.5% 1.3%
        07-15 15:17:32.870 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] i16 v,h,dc,p: 72%  6%  5% 17%
        07-15 15:17:32.870 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 45% 16% 17%  2%  4%  3%  4%  5%  3%
        07-15 15:17:32.880 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 26% 19% 15%  5%  6%  4% 10%  9%  6%
        07-15 15:17:32.880 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] i8c dc,h,v,p: 46% 12% 40%  3%
        07-15 15:17:32.880 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] Weighted P-Frames: Y:0.0% UV:0.0%
        07-15 15:17:32.890 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] ref P L0: 78.9%  6.1%  9.5%  5.6%
        07-15 15:17:32.890 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] ref B L0: 87.4%  9.5%  3.1%
        07-15 15:17:32.890 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] ref B L1: 95.5%  4.5%
        07-15 15:17:32.900 25713-25713/com.github.sampleffmpeg V/output: progress : [libx264 @ 0x420a54c0] kb/s:163.45
        07-15 15:17:32.900 25713-25713/com.github.sampleffmpeg V/output: SUCCESS with output : ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
        built with gcc 4.8 (GCC)
        configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/i686-linux-android- --arch=x86 --cpu=i686 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/x86 --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all -march=i686' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
        libavutil      55. 17.103 / 55. 17.103
        libavcodec     57. 24.102 / 57. 24.102
        libavformat    57. 25.100 / 57. 25.100
        libavdevice    57.  0.101 / 57.  0.101
        libavfilter     6. 31.100 /  6. 31.100
        libswscale      4.  0.100 /  4.  0.100
        libswresample   2.  0.101 /  2.  0.101
        libpostproc    54.  0.100 / 54.  0.100
        [udp @ 0x4203b040] 'circular_buffer_size' option was set but it is not supported on this build (pthread support is required)
        [udp @ 0x4203c040] 'circular_buffer_size' option was set but it is not supported on this build (pthread support is required)
        Input #0, rtsp, from 'rtsp://81.109.95.91:3000/stream':
        Metadata:
        title           : Session streamed with GStreamer
        comment         : rtsp-server
        Duration: N/A, start: 0.080000, bitrate: N/A
        Stream #0:0: Video: h264 (Constrained Baseline), yuv420p, 720x576, 25 fps, 25 tbr, 90k tbn, 180k tbc
        [libx264 @ 0x420a54c0] using cpu capabilities: none!
        [libx264 @ 0x420a54c0] profile High, level 3.0
        [libx264 @ 0x420a54c0] 264 - core 148 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
        Output #0, matroska, to '/storage/emulated/0/recording15072016-031605.mkv':
        Metadata:
        title           : Session streamed with GStreamer
        comment         : rtsp-server
        encoder         : Lavf57.25.100
        Stream #0:0: Video: h264 (libx264) (H264 / 0x34363248), yuv420p, 720x576, q=-1--1, 25 fps, 1k tbn, 25 tbc
        Metadata:
        encoder         : Lavc57.24.102 libx264
        Side data:
        unknown side data type 10 (24 bytes)
        Stream mapping:
        Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
        Press [q] to stop, [?] for help
        frame=   23 fps=0.0 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x
        [rtsp @ 0x420391c0] max delay reached. need to consume packet
        [NULL @ 0x4203ba00] RTP: missed 29 packets
        [h264 @ 0x45afe200] concealing 104 DC, 104 AC, 104 MV errors in P frame
        frame=   42 fps= 41 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x
        frame=   47 fps= 17 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x
        frame=   52 fps= 14 q=28.0 size=      36kB time=00:00:00.24 bitrate=1231.4kbits/s speed=0.0655x
        frame=   55 fps= 12 q=28.0 size=      46kB time=00:00:00.36 bitrate=1036.1kbits/s speed=0.0783x
        [rtsp @ 0x420391c0] max delay reached. need to con


    



    How can I save an RTSP stream without packet loss and with good quality? Any help will be appreciated.
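
    The log above shows the encoder running at roughly 0.5x realtime while RTP packets are being missed, which suggests the capture simply cannot keep up with the live feed. A minimal sketch of the usual first mitigation, assuming the same execFFmpegBinary() helper and the same time and file_path variables as in the commands above: request TCP transport for the RTSP session and copy the streams instead of re-encoding them on the device (the .mkv output seen in the log can hold the copied H.264 stream as-is).

     // Sketch only: TCP interleaving avoids RTP loss on the UDP path, and
     // stream copy removes the libx264 encode the device cannot sustain.
     String[] cmd = {
             "-y",
             "-rtsp_transport", "tcp",   // RTP over the RTSP TCP connection
             "-i", "rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov",
             "-c", "copy",               // remux only, no re-encode
             "-t", time,
             file_path
     };
     execFFmpegBinary(cmd);

    With -c copy the recording keeps exactly what the source sends; if re-encoding is unavoidable, a faster preset such as -preset ultrafast (or a hardware encoder) is the next thing to try so encoding stays ahead of realtime.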

    


  • Apostrophe issue with FFmpeg

    1 March 2024, by Rohan Molinillo

    I'm working on a company project built with Vue.js; parts of the code are in PHP.
It uses FFmpeg to create a video from multiple videos.
Each video carries a text subtitle, and each text is retrieved from a .txt file.
But I have a problem with apostrophes.

    


    If the subtitle is stored in the txt file as ( I'm here ), the video will only show ( Im ).
The apostrophe is removed, and the rest of the text ( here ) is not displayed either.

    


    I'm new to PHP and FFmpeg, and I've been stuck on this problem for almost 3 weeks.

    


    Here is the PHP code:

    


    <?php

declare(strict_types=1);

array_shift($argv); // remove script name in $argv[0]

$parameters = array_reduce($argv, function ($carry, $arg) {
    $tokens = explode('=', $arg);
    $carry[$tokens[0]] = $tokens[1];
    return $carry;
}, []);

$projectPath = $parameters['projectPath'];
$musicPath = $parameters['musicPath'];

$fontPath = getcwd() . "/public/fonts/cobol/Cobol-Bold.ttf";
$logoPath = getcwd() . "/public/images/saintex.jpg";
$carnetLogoPath = getcwd() . "/public/images/CarnetTitre.jpg";

// Adding descriptions for each clip and fade in and fade out filters
$clipsToDescribe =  glob("$projectPath/*.webm");
$clipFrameRate = (int) shell_exec("cd $projectPath && ffprobe -v error -select_streams v -of default=noprint_wrappers=1:nokey=1 -show_entries stream=r_frame_rate $clipsToDescribe[0]");
$clipFrameRate = $clipFrameRate > 60 ? 30 : $clipFrameRate;

foreach ($clipsToDescribe as $key => $clipToDescribe) {
    $clipIndex = $key + 1;
    $clipFrames = (int) shell_exec("cd $projectPath && ffprobe -v error -select_streams v:0 -count_packets -show_entries stream=nb_read_packets -of csv=p=0 $clipToDescribe");
    $clipDuration = (float) ($clipFrames / $clipFrameRate) - 0.5;
    file_put_contents("$projectPath/clip{$clipIndex}_desc.txt", addslashes($parameters["clip{$clipIndex}_desc"]));
    shell_exec("cd $projectPath && ffmpeg -i $clipToDescribe -vf 'drawtext=fontfile=$fontPath: textfile=clip{$clipIndex}_desc.txt: fontcolor=white: fontsize=46: box=1: boxcolor=black@0.5: boxborderw=5: x=(w-text_w)/2: y=(h-text_h-50): fix_bounds=true, fade=t=in:st=0:d=0.3,fade=t=out:st=$clipDuration:d=0.3' -b:v 3000K -b:a 192K clip{$clipIndex}.webm");
}
array_map('unlink', glob("$projectPath/*desc.txt"));

shell_exec("cd $projectPath && ffmpeg -t 2 -f lavfi -i color=c=black:s=1280x720 -r 30 blank.webm");
shell_exec("cd $projectPath && ffmpeg -i blank.webm -i $carnetLogoPath -filter_complex '[0:v][1:v] overlay=(main_w/2)-(overlay_w/2):(main_h/2)-(overlay_h/2)' -pix_fmt yuv420p -c:a copy logo.webm");



$workshop = $parameters["workshop_type"];
$title = $parameters["title"];
shell_exec("cd $projectPath && ffmpeg -f lavfi -i color=size=1280x720:duration=3:rate=30:color=black -vf 'drawtext=text=$workshop:fontfile=$fontPath:fontcolor=white:fontsize=46:x=(w-text_w)/2:y=(h-text_h)/2, drawtext=text=$title:fontfile=$fontPath:fontcolor=white:fontsize=46:x=(w-text_w)/2:y=((h-text_h)/2)+lh+5' opening.webm");
unlink("$projectPath/blank.webm");

$videosFile = "file 'logo.webm'\n";
$videosFile .= "file 'opening.webm'\n";{
    file_put_contents("$projectPath/project_desc.txt", $parameters["project_desc"]);
    shell_exec("cd $projectPath && ffmpeg -f lavfi -i color=size=1280x720:duration=3:rate=30:color=black -vf 'drawtext=fontfile=$fontPath:fontsize=46:fontcolor=white:x=(w-text_w)/2:y=(h-text_h)/2:textfile=project_desc.txt' project_desc.webm");
    unlink("$projectPath/project_desc.txt");
    $videosFile .= "file 'project_desc.webm'\n";
} 
if(array_key_exists("participants", $parameters)) {
    file_put_contents("$projectPath/participants.txt", "Participants :\n" . $parameters["participants"]);
    shell_exec("cd $projectPath && ffmpeg -f lavfi -i color=size=1280x720:duration=2:rate=30:color=black -vf 'drawtext=fontfile=$fontPath:fontsize=46:fontcolor=white:x=(w-text_w)/2:y=(h-text_h)/2:textfile=participants.txt' participants.webm");
    unlink("$projectPath/participants.txt");
}

$videos =  glob("$projectPath/clip*.webm");
foreach($videos as $video) {
    if($video != "originalVideos") {
        $videosFile .= "file ". "'{$video}'\n";
    }
} 
if(array_key_exists("participants", $parameters)) {
    $videosFile .= "file '$projectPath/participants.webm'";
}
array_map('unlink', glob("$projectPath/*webm.txt"));
file_put_contents("$projectPath/videosFile.txt", $videosFile);
if($musicPath == "/_musics/") {
    echo(shell_exec("cd $projectPath && ffmpeg -f concat -safe 0 -i videosFile.txt -b:v 10000K -crf 20 -b:a 192K output.webm"));
} else {
    echo(shell_exec("cd $projectPath && ffmpeg -f concat -safe 0 -i videosFile.txt -b:v 10000K -crf 20 -b:a 192K assembled.webm && ffmpeg -i assembled.webm -i ../..$musicPath -filter_complex ' [1:0] apad ' -shortest -y -b:v 3000K -b:a 192K output.webm"));
}


    


    I tried many things but each time there were errors.
I think I didn't implement the code properly.

    


    Here is the error:

    


            [09:21:02] RECEIVED EVENT: videoRequest
{ Error: Command failed: php ./public/src/generate.php projectPath='/home/rohan/Documents/dodoc2/_projects/its-a-test' musicPath='/_musics/classic.mp3' clip1_name='' clip2_name='' clip3_name='' clip4_name='' clip5_name='' clip1_desc='It's a first test' clip2_desc='It's a second test' clip3_desc='It's a third test' clip4_desc='It's a fourth' clip5_desc='It's a last test' project_desc='' workshop_type='Atelier Robotique' title='It's a test' participants='Molinillo Rohan
'
PHP Warning:  Undefined array key 1 in /home/rohan/carnet-numerique/public/src/generate.php on line 9
PHP Warning:  Undefined array key 1 in /home/rohan/carnet-numerique/public/src/generate.php on line 9
PHP Warning:  Undefined array key 1 in /home/rohan/carnet-numerique/public/src/generate.php on line 9
PHP Warning:  Undefined array key 1 in /home/rohan/carnet-numerique/public/src/generate.php on line 9
PHP Warning:  Undefined array key 1 in /home/rohan/carnet-numerique/public/src/generate.php on line 9
PHP Warning:  Undefined array key 1 in /home/rohan/carnet-numerique/public/src/generate.php on line 9
ffmpeg version 5.1.1-1ubuntu2.1 Copyright (c) 2000-2022 the FFmpeg developers
  built with gcc 12 (Ubuntu 12.2.0-3ubuntu1)
  configuration: --prefix=/usr --extra-version=1ubuntu2.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libglslang --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librist --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --disable-sndio --enable-pocketsphinx --enable-librsvg --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-libplacebo --enable-shared
  libavutil      57. 28.100 / 57. 28.100
  libavcodec     59. 37.100 / 59. 37.100
  libavformat    59. 27.100 / 59. 27.100
  libavdevice    59.  7.100 / 59.  7.100
  libavfilter     8. 44.100 /  8. 44.100
  libswscale      6.  7.100 /  6.  7.100
  libswresample   4.  7.100 /  4.  7.100
  libpostproc    56.  6.100 / 56.  6.100
Input #0, matroska,webm, from '/home/rohan/Documents/dodoc2/_projects/its-a-test/video-20230404-091933-682.webm':
  Metadata:
    encoder         : QTmuxingAppLibWebM-0.0.1
  Duration: N/A, start: -0.001000, bitrate: N/A
  Stream #0:0(eng): Video: vp8, yuv420p(progressive), 1280x720, SAR 1:1 DAR 16:9, 1k tbr, 1k tbn (default)
  Stream #0:1(eng): Audio: opus, 48000 Hz, stereo, fltp (default)
Stream mapping:
  Stream #0:0 -> #0:0 (vp8 (native) -> vp9 (libvpx-vp9))
  Stream #0:1 -> #0:1 (opus (native) -> opus (libopus))
Press [q] to stop, [?] for help
[libvpx-vp9 @ 0x55952c183000] v1.12.0
Output #0, webm, to 'clip1.webm':
  Metadata:
    encoder         : Lavf59.27.100
  Stream #0:0(eng): Video: vp9, yuv420p(tv, bt470bg/unknown/unknown, progressive), 1280x720 [SAR 1:1 DAR 16:9], q=2-31, 3000 kb/s, 1k fps, 1k tbn (default)
    Metadata:
      encoder         : Lavc59.37.100 libvpx-vp9
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
  Stream #0:1(eng): Audio: opus, 48000 Hz, stereo, flt, 192 kb/s (default)
    Metadata:
      encoder         : Lavc59.37.100 libopus
frame=  229 fps= 11 q=12.0 Lsize=    2786kB time=00:00:07.52 bitrate=3034.7kbits/s speed=0.377x    
video:2601kB audio:181kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.175036%
PHP Warning:  Undefined array key "clip2_desc" in /home/rohan/carnet-numerique/public/src/generate.php on line 29
PHP Fatal error:  Uncaught TypeError: addslashes(): Argument #1 ($string) must be of type string, null given in /home/rohan/carnet-numerique/public/src/generate.php:29
Stack trace:
#0 /home/rohan/carnet-numerique/public/src/generate.php(29): addslashes()
#1 {main}
  thrown in /home/rohan/carnet-numerique/public/src/generate.php on line 29

    at ChildProcess.exithandler (child_process.js:275:12)
    at emitTwo (events.js:126:13)
    at ChildProcess.emit (events.js:214:7)
    at maybeClose (internal/child_process.js:925:16)
    at Socket.stream.socket.on (internal/child_process.js:346:11)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at Pipe._handle.close [as _onclose] (net.js:554:12)
  killed: false,
  code: 255,
  signal: null,
  cmd: 'php ./public/src/generate.php projectPath=\'/home/rohan/Documents/dodoc2/_projects/its-a-test\' musicPath=\'/_musics/classic.mp3\' clip1_name=\'\' clip2_name=\'\' clip3_name=\'\' clip4_name=\'\' clip5_name=\'\' clip1_desc=\'It\'s a first test\' clip2_desc=\'It\'s a second test\' clip3_desc=\'It\'s a third test\' clip4_desc=\'It\'s a fourth\' clip5_desc=\'It\'s a last test\' project_desc=\'\' workshop_type=\'Atelier Robotique\' title=\'It\'s a test\' participants=\'Molinillo Rohan\n\'' }
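
    The command line in the error shows where things break: in clip1_desc='It's a first test', the embedded apostrophe closes the single-quoted shell argument, so the PHP script receives stray words without an = sign (hence the repeated "Undefined array key 1" warnings at the $tokens[1] lookup) and the text is cut at the apostrophe. A hedged sketch of the usual fix, assuming values may contain apostrophes: pass every value through escapeshellarg() before it lands on a shell command line (the Node side that assembles the php ./public/src/generate.php call needs the equivalent treatment), and keep using drawtext's textfile= option rather than text= so the filter's own quoting rules never see the raw string. The $title value below is only a placeholder:

    <?php
    // Hedged sketch: quote anything that ends up on a shell command line.
    // escapeshellarg() wraps the value in single quotes and escapes embedded
    // apostrophes, so "It's a test" survives the shell intact.
    $title    = "It's a test";   // placeholder; normally taken from $parameters
    $fontPath = getcwd() . "/public/fonts/cobol/Cobol-Bold.ttf";

    // Writing the text to a file and pointing drawtext at it with textfile=
    // also keeps apostrophes away from the filter's own escaping rules.
    file_put_contents("title.txt", $title);

    $vf  = "drawtext=fontfile=$fontPath:fontsize=46:fontcolor=white:"
         . "x=(w-text_w)/2:y=(h-text_h)/2:textfile=title.txt";
    $cmd = "ffmpeg -f lavfi -i color=size=1280x720:duration=3:rate=30:color=black"
         . " -vf " . escapeshellarg($vf) . " opening.webm";
    shell_exec($cmd);

    With textfile= the content itself should not need addslashes(); escaping is only a concern for what passes through the shell and the filter argument syntax.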


    


  • Capture from multiple streams concurrently, best way to do it and how to reduce CPU usage

    19 June 2019, by DRONE_6969

    I am currently writing an application that captures a lot of RTSP streams (in my case 12) and displays them in Qt widgets. The problem arises when I go beyond around 6-7 streams: the CPU usage spikes and there is visible stutter.

    The reason I think it is not the Qt draw function is that I measured how long it takes to draw an incoming camera image as well as sample images I had; it is always well under 33 milliseconds (even with 12 widgets being updated).

    I also ran the OpenCV capture method without drawing and got pretty much the same CPU consumption as when drawing the frames (at most about 10% less CPU, and GPU usage dropped to zero).

    IMPORTANT: I am using RTSP streams, which are H.264 streams.

    If it matters, my specs:

    Intel Core i7-6700 @ 3.40 GHz (8 CPUs)
    Memory: 16 GB
    GPU: Intel HD Graphics 530

    (I also ran my code on a computer with a dedicated graphics card; it eliminated some of the stutter, but CPU usage was still quite high.)

    I am currently using OpenCV 4.1.0 built with GStreamer enabled; I also have the opencv-world build, and there is no difference in performance.
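
    Since the build already has GStreamer enabled, one common way to cut the decode cost is to hand the RTSP demuxing and H.264 decoding to an explicit GStreamer pipeline, ideally with a hardware decoder (the Intel HD Graphics 530 can do this through VAAPI). This is only a hedged sketch; the element names, in particular vaapih264dec, depend on which GStreamer plugins are installed:

    #include <opencv2/opencv.hpp>
    #include <string>

    // Sketch: let GStreamer (and, where available, the GPU) decode the H.264
    // streams instead of libavcodec decoding all of them in software.
    cv::VideoCapture openRtsp(const std::string& url)
    {
        const std::string pipeline =
            "rtspsrc location=" + url + " latency=100 ! "
            "rtph264depay ! h264parse ! "
            "vaapih264dec ! "                      // or avdec_h264 for software decoding
            "videoconvert ! video/x-raw,format=BGR ! "
            "appsink max-buffers=1 drop=true sync=false";
        return cv::VideoCapture(pipeline, cv::CAP_GSTREAMER);
    }

    max-buffers=1 drop=true lets the sink discard stale frames when the consumer falls behind, trading completeness for smoothness.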

    I have created a class called Camera that holds the frame size constraints, various control functions and the stream function. The stream function runs on a separate thread; whenever stream() is done with the current frame, it sends the ready Mat via an onNewFrame event I created, which converts it to a QPixmap and updates the widget's lastImage variable. This way I can update the image in a more thread-safe way.

    I have tried manipulating various VideoCapture.set() values, but it didn't really help.

    This is my stream function (ignore the bool return; it doesn't do anything and is a remnant from a few minutes ago when I was trying to use std::async):

    bool Camera::stream() {
       /* This function is meant to run on a separate thread and fill up the buffer independently of
       the main stream thread */
       //cv::setNumThreads(100);
       /* Rules for these slightly changed! */
       Mat pre;  // Grab initial undoctored frame
       //pre = Mat::zeros(size, CV_8UC1);
       Mat frame; // Final modified frame
       frame = Mat::zeros(size, CV_8UC1);
       if (!pre.isContinuous()) pre = pre.clone();

       ipCam.open(streamUrl, CAP_FFMPEG);


       while (ipCam.isOpened() && capture) {
           // If the camera is opened we need to capture and process the frame
           try {
               auto start = std::chrono::system_clock::now();

               ipCam >> pre;

               if (pre.empty()) {
                   /* Check for blank frame, return error if there is a blank frame*/
                   cerr << id << ": ERROR! blank frame grabbed\n";
                   for (FrameListener* i : clients) {
                       i->onNotification(1); // Notify clients about this shit
                   }
                   break;
               }

               else {
                   // Only continue if frame not empty

                   if (pre.cols != size.width && pre.rows != size.height) {
                       resize(pre, frame, size);
                       pre.release();
                   }
                   else {
                       frame = pre;
                   }

                   dPacket* pack = new dPacket{id,&frame};
                   for (auto i : clients) {
                       i->onPNewFrame(pack);
                   }
                   frame.release();
                   delete pack;
               }
           }

           catch (int e) {
               cout << endl << "-----Exception during capture process! CODE " << e << endl;
           }
           // End camera manipulations
       }

       cout << "Camera timed out, or connection is closed..." << endl;
       if (tryResetConnection) {
           cout << "Reconnection flag is set, retrying after 3 seconds..." << endl;
           for (FrameListener* i : clients) {
               i->onNotification(-1); // Notify clients about this shit
           }
           this_thread::sleep_for(chrono::milliseconds(3000));
           stream();
       }

       return true;
    }

    This is my onPNewFrame function. The conversion is still done on the camera's thread, because it is called from within stream() and therefore runs in that scope (I also checked); a queued hand-off alternative is sketched after the function:

    void GLWidget::onPNewFrame(dPacket* inPack) {
       lastFlag = 0;

       if (bufferEnabled) {
           buffer.push(QPixmap::fromImage(toQImageFromPMat(inPack->frame)));
       }
       else {
           if (playing) {
               /* Only process if this widget is playing */
               frameProcessing = true;
               lastImage.convertFromImage(toQImageFromPMat(inPack->frame));
               frameProcessing = false;
           }
       }

       if (lastFlag != -1 && !lastImage.isNull()) {
           connecting = false;
       }
       else {
           connecting = true;
       }
    }
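
    Because onPNewFrame() runs on the camera thread, the QPixmap/lastImage update above happens off the GUI thread. A hedged, simplified sketch of a queued hand-off (it drops the buffer and playing logic to show only the threading; the functor overload of QMetaObject::invokeMethod needs Qt 5.10 or later):

    void GLWidget::onPNewFrame(dPacket* inPack) {
        // rgbSwapped() inside toQImageFromPMat already returns an image that
        // owns its pixels, so it can outlive the cv::Mat released by stream().
        QImage img = toQImageFromPMat(inPack->frame);
        QMetaObject::invokeMethod(this, [this, img]() {
            lastImage = QPixmap::fromImage(img);   // QPixmap work on the GUI thread
            update();
        }, Qt::QueuedConnection);
    }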

    This is my Mat to QImage conversion:

    QImage GLWidget::toQImageFromPMat(cv::Mat* mat) {
       return QImage(mat->data, mat->cols, mat->rows, QImage::Format_RGB888).rgbSwapped();
    }
    NOTE: skipping the conversion does not reduce CPU usage (at least not significantly).
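
    One detail in that conversion, separate from the CPU question: the QImage constructor without a bytesPerLine argument assumes 32-bit-aligned scanlines, whereas the resized cv::Mat is tightly packed, so frame widths whose row size is not a multiple of 4 can come out skewed. A hedged variant that passes the Mat's real stride:

    QImage GLWidget::toQImageFromPMat(cv::Mat* mat) {
        // Pass the Mat's row stride explicitly so rows line up for any width.
        return QImage(mat->data, mat->cols, mat->rows,
                      static_cast<int>(mat->step),
                      QImage::Format_RGB888).rgbSwapped();
    }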

    Minimal verifiable example

    This program is large. I am going to paste GLWidget.cpp and GLWidget.h as well as Camera.h and Camera.cpp. You can put GLWidget into anything, as long as you spawn more than 6 of them. Camera relies on CamUtils, but it is possible to just paste the URL into VideoCapture.

    I also supplied CamUtils, just in case.

    Camera.h :

    #pragma once
    #include <iostream>
    #include <vector>
    #include <fstream>
    #include <map>
    #include <string>
    #include <sstream>
    #include <algorithm>
    #include "FrameListener.h"
     #include <opencv2/opencv.hpp>
    #include <thread>
    #include "CamUtils.h"
    #include <ctime>
    #include "dPacket.h"

    using namespace std;
    using namespace cv;

    class Camera
    {

       /*
           CLEANED UP!
           Camera now is only responsible for streaming and echoing captured frames.
           Frames are now wrapped into dPacket struct.
       */


    private:
       string id;
       vector<FrameListener*> clients;
       VideoCapture ipCam;
       string streamUrl;
       Size size;
       bool tryResetConnection = false;

       //TODO: Remove these as they are not going to be used going on:
       bool isPlaying = true;
       bool capture = true;

       //SECRET FEATURES:
       bool detect = false;


    public:
       Camera(string url, int width = 480, int height = 240, bool detect_=false);
       bool stream();
       void setReconnectable(bool newReconStatus);
       void addListener(FrameListener* client);
       vector<bool> getState();    // Returns current state: vector[0] stream state; vector[1] stream state; TODO: Remove this as this is no longer should control behaviour
       void killStream();
       bool getReconnectable();
    };


    Camera.cpp

    #include "Camera.h"


    Camera::Camera(string url, int width, int height, bool detect_) // Default 240p
    {
       streamUrl = url; // Prepare url
       size = Size(width, height);
       detect = detect_;

    }

    void Camera::addListener(FrameListener* client) {
       clients.push_back(client);
    }


    /*
                   TEST CAMERAS(Paste into cameras.dViewer):
                   {"id":"96a73796-c129-46fc-9c01-40acd8ed7122","ip":"176.57.73.231","password":"null","username":"null"},
                   {"id":"96a73796-c129-46fc-9c01-40acd8ed7122","ip":"176.57.73.231","password":"null","username":"null"},
                   {"id":"96a73796-c129-46fc-9c01-40acd8ed7144","ip":"172.20.101.13","password":"admin","username":"root"}
                   {"id":"96a73796-c129-46fc-9c01-40acd8ed7144","ip":"172.20.101.13","password":"admin","username":"root"}

    */



    bool Camera::stream() {
       /* This function is meant to run on a separate thread and fill up the buffer independently of
       the main stream thread */
       //cv::setNumThreads(100);
       /* Rules for these slightly changed! */
       Mat pre;  // Grab initial undoctored frame
       //pre = Mat::zeros(size, CV_8UC1);
       Mat frame; // Final modified frame
       frame = Mat::zeros(size, CV_8UC1);
       if (!pre.isContinuous()) pre = pre.clone();

       ipCam.open(streamUrl, CAP_FFMPEG);

       while (ipCam.isOpened() &amp;&amp; capture) {
           // If camera is opened wel need to capture and process the frame
           try {
               auto start = std::chrono::system_clock::now();

               ipCam >> pre;

               if (pre.empty()) {
                   /* Check for blank frame, return error if there is a blank frame*/
                   cerr &lt;&lt; id &lt;&lt; ": ERROR! blank frame grabbed\n";
                   for (FrameListener* i : clients) {
                       i->onNotification(1); // Notify clients about this shit
                   }
                   break;
               }

               else {
                   // Only continue if frame not empty

                   if (pre.cols != size.width &amp;&amp; pre.rows != size.height) {
                       resize(pre, frame, size);
                       pre.release();
                   }
                   else {
                       frame = pre;
                   }

                   auto end = std::chrono::system_clock::now();
                   std::time_t ts = std::chrono::system_clock::to_time_t(end);
                   dPacket* pack = new dPacket{ id,&amp;frame};
                   for (auto i : clients) {
                       i->onPNewFrame(pack);
                   }
                   frame.release();
                   delete pack;
               }
           }

           catch (int e) {
               cout &lt;&lt; endl &lt;&lt; "-----Exception during capture process! CODE " &lt;&lt; e &lt;&lt; endl;
           }
           // End camera manipulations
       }

       cout &lt;&lt; "Camera timed out, or connection is closed..." &lt;&lt; endl;
       if (tryResetConnection) {
           cout &lt;&lt; "Reconnection flag is set, retrying after 3 seconds..." &lt;&lt; endl;
           for (FrameListener* i : clients) {
               i->onNotification(-1); // Notify clients about this shit
           }
           this_thread::sleep_for(chrono::milliseconds(3000));
           stream();
       }

       return true;
    }


    void Camera::killStream(){
       tryResetConnection = false;
       capture = false;
       ipCam.release();
    }

    void Camera::setReconnectable(bool reconFlag) {
       tryResetConnection = reconFlag;
    }

    bool Camera::getReconnectable() {
       return tryResetConnection;
    }

    vector<bool> Camera::getState() {
       vector<bool> states;
       states.push_back(isPlaying);
       states.push_back(ipCam.isOpened());
       return states;
    }




    GLWidget.h :

    #ifndef GLWIDGET_H
    #define GLWIDGET_H

     #include <QOpenGLWidget>
     #include <QMouseEvent>
    #include "FrameListener.h"
    #include "Camera.h"
    #include "FrameListener.h"
    #include
    #include "Camera.h"
    #include "CamUtils.h"
    #include
    #include "dPacket.h"
    #include <chrono>
    #include <ctime>
    #include
    #include "FullScreenVideo.h"
     #include <QMovie>
    #include "helper.h"
    #include <iostream>
     #include <QPainter>
     #include <QTimer>

    class Helper;

    class GLWidget : public QOpenGLWidget, public FrameListener
    {
       Q_OBJECT

    public:
       GLWidget(std::string camId, CamUtils *cUtils, int width, int height, bool denyFullScreen_ = false, bool detectFlag_=false, QWidget* parent = nullptr);
       void killStream();
       ~GLWidget();

    public slots:
       void animate();
       void setBufferEnabled(bool setState);
       void setCameraRetryConnection(bool setState);
       void GLUpdate();            // Call to update the widget
       void onRightClickMenu(const QPoint&amp; point);

    protected:
       void paintEvent(QPaintEvent* event) override;
       void onPNewFrame(dPacket* frame);
       void onNotification(int alert_code);


    private:
       // Objects and resourses
       Helper* helper;
       Camera* cam;
       CamUtils* camUtils;
       QTimer* timer; // Keep track of update
       QPixmap lastImage;
       QMovie* connMov;
       QMovie* test;

       QPixmap logo;

       // Control fields
       int width;
       int height;
       int camUtilsAddr;
       int elapsed;
       std::thread* camThread;
       std::string camId;
       bool denyFullScreen = false;
       bool playing = true;
       bool streaming = true;
       bool debug = false;
       bool connecting = true;
       int lastFlag = 0;


       // Debug fields
       std::chrono::high_resolution_clock::time_point lastFrameAt;
       std::chrono::high_resolution_clock::time_point now;
       std::chrono::duration<double> painTime; // time took to draw last frame

       //Buffer stuff
       std::queue<QPixmap> buffer;
       bool bufferEnabled = false;
       bool initialBuffer = false;
       bool buffering = true;
       bool frameProcessing = false;



       //Functions
       QImage toQImageFromPMat(cv::Mat* inFrame);
       void mousePressEvent(QMouseEvent* event) override;
       void drawImageGLLatest(QPainter* painter, QPaintEvent* event, int elapsed);
       void drawOnPaused(QPainter* painter, QPaintEvent* event, int elapsed);
       void drawOnStatus(int statusFlag, QPainter* painter, QPaintEvent* event, int elapsed);
    };

    #endif


    GLWidget.cpp :

    #include "glwidget.h"
    #include <future>


    FullScreenVideo* fullScreen;

    GLWidget::GLWidget(std::string camId_, CamUtils* cUtils, int width_, int height_,  bool denyFullScreen_, bool detectFlag_, QWidget* parent)
       : QOpenGLWidget(parent), helper(helper)
    {
       cout &lt;&lt; "Player for CAMERA " &lt;&lt; camId_ &lt;&lt; endl;

       /* Underlying properties */
       camUtils = cUtils;
       cout &lt;&lt; "GLWidget Incoming CamUtils addr " &lt;&lt; camUtils &lt;&lt; endl;
       cout &lt;&lt; "GLWidget Set CamUtils addr " &lt;&lt; camUtils &lt;&lt; endl;
       camId = camId_;
       elapsed = 0;
       width = width_ + 5;
       height = height_ + 5;
       helper = new Helper();
       setFixedSize(width, height);
       denyFullScreen = denyFullScreen_;

       /* Camera capture thread */
       cam = new Camera(camUtils->getCameraStreamURL(camId), width_, height_, detectFlag_);
       cam->addListener(this);

       /* Sync states */
       vector<bool> initState = cam->getState();
       playing = initState[0];
       streaming = initState[1];
       cout &lt;&lt; "Initial states: " &lt;&lt; playing &lt;&lt; " " &lt;&lt; streaming &lt;&lt; endl;
       camThread = new std::thread(&amp;Camera::stream, cam);
       cout &lt;&lt; "================================================" &lt;&lt; endl;

       // Right click set up
       setContextMenuPolicy(Qt::CustomContextMenu);


       /* Loading gif */
       connMov = new QMovie("establishingConnection.gif");
       connMov->start();
       QString url = R"(RLC-logo.png)";
       logo = QPixmap(url);
       QTimer* timer = new QTimer(this);
       connect(timer, SIGNAL(timeout()), this, SLOT(GLUpdate()));
       timer->start(1000/30);
       playing = true;

    }

    /* SYSTEM */
    void GLWidget::animate()
    {
       elapsed = (elapsed + qobject_cast<QTimer*>(sender())->interval()) % 1000;
       std::cout << elapsed << "\n";
    }


    void GLWidget::GLUpdate() {
       /* Process descisions before update call */
       if (bufferEnabled) {
           /* Process buffer before update */
           now = chrono::high_resolution_clock::now();
           std::chrono::duration timeSinceLastUpdate = now - lastFrameAt;
           if (timeSinceLastUpdate.count() > 25) {
               if (buffer.size() > 1 &amp;&amp; playing) {
                   lastImage.swap(buffer.front());
                   buffer.pop();
                   lastFrameAt = chrono::high_resolution_clock::now();
               }
           }
           //update(); // Update
       }
       else {
           /* No buffer */
       }
       repaint();
    }


    /* EVENTS */
    void GLWidget::onRightClickMenu(const QPoint&amp; point) {
       cout &lt;&lt; "Right click request got" &lt;&lt; endl;

       QPoint globPos = this->mapToGlobal(point);
       QMenu myMenu;

       if (!denyFullScreen) {
           myMenu.addAction("Open Full Screen");
       }
       myMenu.addAction("Toggle Debug Info");


       QAction* selected = myMenu.exec(globPos);

       if (selected) {
           string optiontxt = selected->text().toStdString();

           if (optiontxt == "Open Full Screen") {
               cout &lt;&lt; "Chose to open full screen of " &lt;&lt; camId &lt;&lt; endl;
               fullScreen = new FullScreenVideo(bufferEnabled, this);
               fullScreen->setUpView(camUtils, camId);
               fullScreen->show();
               playing = false;
           }

           if (optiontxt == "Toggle Debug Info") {
               cout &lt;&lt; "Chose to toggle debug of " &lt;&lt; camId &lt;&lt; endl;
               debug = !debug;
           }
       }
       else {
           cout &lt;&lt; "Chose nothing!" &lt;&lt; endl;
       }


    }



    void GLWidget::onPNewFrame(dPacket* inPack) {
       lastFlag = 0;

       if (bufferEnabled) {
           buffer.push(QPixmap::fromImage(toQImageFromPMat(inPack->frame)));
       }
       else {
           if (playing) {
               /* Only process if this widget is playing */
               frameProcessing = true;
               lastImage.convertFromImage(toQImageFromPMat(inPack->frame));
               frameProcessing = false;
           }
       }

       if (lastFlag != -1 &amp;&amp; !lastImage.isNull()) {
           connecting = false;
       }
       else {
           connecting = true;
       }
    }


    void GLWidget::onNotification(int alert) {
       lastFlag = alert;  
    }


    /* Paint events*/


    void GLWidget::paintEvent(QPaintEvent* event)
    {
       QPainter painter(this);

           if (lastFlag != 0 || connecting) {
               drawOnStatus(lastFlag, &amp;painter, event, elapsed);
           }
           else {

               /* Actual frame drawing */
               if (playing) {
                   if (!frameProcessing) {
                       drawImageGLLatest(&amp;painter, event, elapsed);
                   }
               }
               else {
                   drawOnPaused(&amp;painter, event, elapsed);
               }
           }
       painter.end();

    }


    /* DRAWING STUFF */

    void GLWidget::drawOnStatus(int statusFlag, QPainter* bgPaint, QPaintEvent* event, int elapsed) {

       QString str;
       QFont font("times", 15);
       bgPaint->eraseRect(QRect(0, 0, width, height));
       if (!lastImage.isNull()) {
           bgPaint->drawPixmap(QRect(0, 0, width, height), lastImage);
       }
       /* Test background painting */
       if (connecting) {
           string k = "Connecting to " + camUtils->getIp(camId);
           str.append(k.c_str());
       }
       else {
           switch (statusFlag) {
           case 1:
               str = "Blank frame received...";
               break;

           case -1:
               if (cam->getReconnectable()) {
                   str = "Connection lost, will try to reconnect.";
                   bgPaint->setOpacity(0.3);
               }
               else {
                   str = "Connection lost...";
                   bgPaint->setOpacity(0.3);
               }

               break;
           }
       }

       bgPaint->drawPixmap(QRect(0, 0, width, height), QPixmap::fromImage(connMov->currentImage()));
       bgPaint->setPen(Qt::red);
       bgPaint->setFont(font);
       QFontMetrics fm(font);
       const QRect kek(0, 0, fm.width(str), fm.height());
       QRect bound;
       bgPaint->setOpacity(1);
       bgPaint->drawText(bgPaint->viewport().width()/2 - kek.width()/2, bgPaint->viewport().height()/2 - kek.height(), str);

       bgPaint->drawPixmap(bgPaint->viewport().width() / 2 - logo.width()/2, height - logo.width() - 15, logo);

    }



    void GLWidget::drawOnPaused(QPainter* painter, QPaintEvent* event, int elapsed) {
       painter->eraseRect(0, 0, width, height);
       QFont font = painter->font();
       font.setPointSize(18);
       painter->setPen(Qt::red);
       QFontMetrics fm(font);
       QString str("Paused");
       painter->drawPixmap(QRect(0, 0, width, height),lastImage);
       painter->drawText(QPoint(painter->viewport().width() - fm.width(str), 50), str);

       if (debug) {
           QFont font = painter->font();
           font.setPointSize(25);
           painter->setPen(Qt::red);
           string camMess = "CAMID: " + camId;
           QString mess(camMess.c_str());
           string camIp = "IP: " + camUtils->getIp(camId);
           QString ipMess(camIp.c_str());
           QString bufferSize("Buffer size: " + QString::number(buffer.size()));
           QString lastFrameText("Last frame draw time: " + QString::number(painTime.count()) + "s");
           painter->drawText(QPoint(10, 50), mess);
           painter->drawText(QPoint(10, 60), ipMess);
           QString bufferState;
           if (bufferEnabled) {
               bufferState = QString("Experimental BUFFER is enabled!");
               QString currentBufferSize("Current buffer load: " + QString::number(buffer.size()));
               painter->drawText(QPoint(10, 80), currentBufferSize);
           }
           else {
               bufferState = QString("Experimental BUFFER is disabled!");
           }
           painter->drawText(QPoint(10, 70), bufferState);
           painter->drawText(QPoint(10, height - 25), lastFrameText);
       }
    }


    void GLWidget::drawImageGLLatest(QPainter* painter, QPaintEvent* event, int elapsed) {
       auto start = chrono::high_resolution_clock::now();
       painter->drawPixmap(QRect(0, 0, width, height), lastImage);
       if (debug) {
           QFont font = painter->font();
           font.setPointSize(25);
           painter->setPen(Qt::red);
           string camMess = "CAMID: " + camId;
           QString mess(camMess.c_str());
           string camIp = "IP: " + camUtils->getIp(camId);
           QString ipMess(camIp.c_str());
           QString bufferSize("Buffer size: " + QString::number(buffer.size()));
           QString lastFrameText("Last frame draw time: " + QString::number(painTime.count()) + "s");
           painter->drawText(QPoint(10, 50), mess);
           painter->drawText(QPoint(10, 60), ipMess);
           QString bufferState;
           if(bufferEnabled){
               bufferState = QString("Experimental BUFFER is enabled!");
               QString currentBufferSize("Current buffer load: " + QString::number(buffer.size()));
               painter->drawText(QPoint(10,80), currentBufferSize);
           }
           else {
               bufferState = QString("Experimental BUFFER is disabled!");
               QString currentBufferSize("Current buffer load: " + QString::number(buffer.size()));
               painter->drawText(QPoint(10, 80), currentBufferSize);
           }
           painter->drawText(QPoint(10, 70), bufferState);
           painter->drawText(QPoint(10, height - 25), lastFrameText);

       }
       auto end = chrono::high_resolution_clock::now();
       painTime = end - start;
    }



    /* END DRAWING STUFF */



    /* UI EVENTS */

    void GLWidget::mousePressEvent(QMouseEvent* e) {

       if (e->button() == Qt::LeftButton) {
           if (fullScreen == nullptr || !fullScreen->isVisible()) { // Do not unpause if window is opened
               playing = !playing;
           }
       }

       if (e->button() == Qt::RightButton) {
           onRightClickMenu(e->pos());
       }
    }



    /* Utilities */
    QImage GLWidget::toQImageFromPMat(cv::Mat* mat) {
       // Pass the row stride explicitly so padded Mats convert correctly;
       // rgbSwapped() deep-copies the pixels, so the returned QImage does not alias mat->data.
       return QImage(mat->data, mat->cols, mat->rows, static_cast<int>(mat->step), QImage::Format_RGB888).rgbSwapped();
    }

    /* State control */

    void GLWidget::killStream() {
       cam->killStream();
       camThread->join();
    }

    void GLWidget::setBufferEnabled(bool newBufferState) {
       cout << "Player: " << camId << ", buffer state updated: " << newBufferState << endl;
       bufferEnabled = newBufferState;
       buffer = {}; // std::queue::empty() only reports emptiness; assigning a fresh queue actually discards buffered frames
    }

    void GLWidget::setCameraRetryConnection(bool newState) {
       cam->setReconnectable(newState);
    }

    /* Destruction */
    GLWidget::~GLWidget() {
       cam->killStream();
       camThread->join();
    }
    </bool></future>

    CamUtils.h :

    #pragma once
    #include <iostream>
    #include <vector>
    #include <fstream>
    #include <map>
    #include <string>
    #include <sstream>
    #include <algorithm>
    #include <nlohmann></nlohmann>json.hpp>

    using namespace std;
    using json = nlohmann::json;

    class CamUtils
    {
    private:

       string camDb = "cameras.dViewer";
       map> cameraList; // Legacy
       json cameras;
       ofstream dbFile;
       bool dbExists(); // Always hard coded

       /* Old IMPLEMENTATION */
       void writeLineToDb_(const string&amp; content, bool append = false);
       void loadCameras_();

       /* JSON based */
       void loadCameras();

    public:
       CamUtils();
       string generateRandomString(size_t length);
       string getCameraStreamURL(string cameraId) const;
       string saveCamera(string ip, string username, string pass); // Return generated id
       vector<string> listAllCameraIds();
       string getIp(string cameraId);
    };


    </string></algorithm></sstream></string></map></fstream></vector></iostream>

    CamUtils.cpp :

    #include "CamUtils.h"
    #pragma comment(lib, "rpcrt4.lib")  // UuidCreate - Minimum supported OS Win 2000
     #include <windows.h>   // UuidCreate / UuidToStringA (rpcrt4.lib is linked above)
     #include <iostream>
     #include <cstring>     // strlen in generateRandomString

    CamUtils::CamUtils()
    {
       if (!dbExists()) {
           ofstream dbFile;
           dbFile.open(camDb);
           cameras["cameras"] = json::array();
           dbFile &lt;&lt; cameras &lt;&lt; std::endl;
           dbFile.close();

       }
       else {
           loadCameras();
       }
    }




    vector<string> CamUtils::listAllCameraIds() {
       vector<string> ids;
       cout &lt;&lt; "IN LIST " &lt;&lt; endl;
       for (auto&amp; cam : cameras["cameras"]) {
           ids.push_back(cam["id"].get<string>());
           //cout &lt;&lt; cam["id"].get<string>() &lt;&lt; std::endl;
       }
       return ids;
    }

    string CamUtils::getIp(string id) {
       vector<string> camDetails = cameraList[id];
       string ip = "NO IP WILL DISPLAYED UNTIL I FIGURE OUT A BUG";
       for (auto&amp; cam : cameras["cameras"]) {
           if (id == cam["id"]) {
               ip = cam["ip"].get<string>();
           }
       }

       return ip;
    }

    string CamUtils::getCameraStreamURL(string id) const {
       string url = "err"; // err is the default, it will be overwritten in case id is found, dont forget to check for it

       for (auto&amp; cam : cameras["cameras"]) {
           if (id == cam["id"]) {
               if (cam["username"].get<string>() == "null") {
                   url = "rtsp://" + cam["ip"].get<string>() + ":554/axis-media/media.amp?tcp";
               }
               else {
                   url = "rtsp://" + cam["username"].get<string>() + ":" + cam["password"].get<string>() + "@" + cam["ip"].get<string>() + ":554/axis-media/media.amp?streamprofile=720_30";
               }
           }
       }

       return url;  // Don't forget to check for "err" when using this
    }


    string CamUtils::saveCamera(string ip, string username, string password) {
       UUID uid;
       UuidCreate(&amp;uid);
       char* str;
       UuidToStringA(&uid, (RPC_CSTR*)&str);
       string id = str;
       RpcStringFreeA((RPC_CSTR*)&str); // release the buffer allocated by UuidToStringA
       cout << "GEN: " << id << endl;
       json cam = json({}); // Create empty object
       cam["id"] = id;
       cam["ip"] = ip;
       cam["username"] = username;
       cam["password"] = password;
       cameras["cameras"].push_back(cam);
       std::ofstream out(camDb);
       out &lt;&lt; cameras &lt;&lt; std::endl;
       cout &lt;&lt; cameras["cameras"] &lt;&lt; endl;

       cout &lt;&lt; "Saved camera as " &lt;&lt; id &lt;&lt; endl;
       return id;
    }


    bool CamUtils::dbExists() {
       ifstream dbFile(camDb);
       return (bool)dbFile;
    }





    void CamUtils::loadCameras() {
       cout &lt;&lt; "Load call" &lt;&lt; endl;
       ifstream dbFile(camDb);
       string line;
       string wholeFile;

       while (std::getline(dbFile, line)) {
           cout &lt;&lt; line &lt;&lt; endl;
           wholeFile += line;
       }
       try {
           cameras = json::parse(wholeFile);
           //cout &lt;&lt; cameras["cameras"] &lt;&lt; endl;

       }
       catch (const exception& e) {
           cout &lt;&lt; e.what() &lt;&lt; endl;
       }
       dbFile.close();
    }










    /*
       LEGACY CODE, TO BE REMOVED!

    */



    void CamUtils::loadCameras_() {
       /*
           LEGACY CODE:
           This used to be the way to load cameras, but I moved on to JSON based configuration so this is no longer needed and will be removed soon
       */

       ifstream dbFile(camDb);
       string line;
       while (std::getline(dbFile, line)) {
           /*
               This function loads camera data into the map.
               The order MUST be the following: 0:ID, 1:IP, 2:USERNAME, 3:PASSWORD,
               always delimited with '|' and with no spaces in between!
           */
           if (!line.empty()) {
               stringstream ss(line);
               string item;
               vector<string> splitString;

               while (std::getline(ss, item, '|')) {
                   splitString.push_back(item);
               }
               if (splitString.size() > 0) {
                   /* Don't even parse if the split didn't produce anything */
                   //cout << "Split string: " << splitString.size() << "\n";
                   for (size_t i = 0; i < splitString.size(); i++) cameraList[splitString[0]].push_back(splitString[i]);
               }
           }
       }
    }



    void CamUtils::writeLineToDb_(const string &amp; content, bool append) {
       ofstream dbFile;
       cout &lt;&lt; "Creating?";
       if (append) {
           dbFile.open(camDb, ios_base::app);
       }
       else {
           dbFile.open(camDb);
       }

       dbFile &lt;&lt; content.c_str() &lt;&lt; "\r\n";
       dbFile.flush();
    }

    /* JSON Reworx */




    string CamUtils::generateRandomString(size_t length)
    {
       const char* charmap = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
       const size_t charmapLength = strlen(charmap);
       auto generator = [&amp;]() { return charmap[rand() % charmapLength]; };
       string result;
       result.reserve(length);
       generate_n(back_inserter(result), length, generator);
       return result;
    }
    </string></string></string></string></string></string></string></string></string></string></string></string></iostream>

    End of example

    How would I go about decreasing CPU usage when dealing with a large number of streams?
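
    For context, one direction I have been considering (but have not implemented or measured yet) is to stop repainting widgets whose frame has not changed: every GLWidget currently runs its QPainter work 30 times per second per camera, even when no new frame has arrived. Below is a minimal, hypothetical sketch of that idea (the buffer-pacing branch is omitted for brevity); it assumes a new std::atomic<bool> frameDirty member (plus #include <atomic>) is added to GLWidget, and the member name is purely illustrative:

    // Hypothetical sketch only, not part of my current code.
    // Assumes "std::atomic<bool> frameDirty{ false };" is declared in GLWidget.

    void GLWidget::onPNewFrame(dPacket* inPack) {
        // ... existing conversion / buffering code stays exactly as above ...
        frameDirty = true;                  // mark that there is something new to draw
    }

    void GLWidget::GLUpdate() {
        if (!frameDirty.exchange(false)) {  // nothing new since the last timer tick
            return;                         // skip the QPainter work entirely
        }
        update();                           // let Qt coalesce paint events instead of forcing repaint()
    }

    If that alone is not enough, I assume the per-frame cv::Mat to QImage to QPixmap conversions are the next candidates, but I have not profiled them yet.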