
Other articles (103)
-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page
-
Support for all media types
10 April 2011
Unlike many modern software packages and document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (OpenOffice, Microsoft Office (spreadsheet, presentation), web (HTML, CSS), LaTeX, Google Earth) (...)
-
APPENDIX: The plugins used specifically for the farm
5 March 2010
The central/master site of the farm needs several extra plugins, beyond those of the channels, to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a pooled instance at user sign-up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (11989)
-
inotifywait -m does not process more than 1 file after long-running process
2 May 2022, by Yllier123
I have a script that detects files on close_write and runs a five-minute process on them. These files are written to the directory in batches of up to 100. The issue is that inotifywait only detects the first file in the batch and does not process the subsequent files unless they are removed from the directory by hand and put back. Here is my script:


#!/bin/bash

inotifywait -m -r -e close_write -e moved_to --format "%f" "$TARGET" | while read -r file
do
    if [[ "$file" =~ .*mp4$ ]]; then
        echo "Detected $file"
        /usr/bin/python3 LongRunningProgram.py -i "$TARGET/$file" -o "$PROCESSED" -u "$UPLOADPATH" -c "$C"
    fi
done



It is kept running by a systemd service, defined like so:


[Unit]
Description=Description
After=network.target

[Service]
Type=idle
User=pi
WorkingDirectory=/home/pi
ExecStart=/bin/bash /home/pi/notify.sh OutPath C
Restart=on-failure

[Install]
WantedBy=multi-user.target



I am confused as to why it only recognizes the first file but not the subsequent ones when run like this; however, if I replace the long-running program with sleep 300, it works fine.
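A plausible explanation (an assumption; the question does not confirm it): LongRunningProgram.py, or an ffmpeg process it spawns, reads from stdin. Every command in the loop body inherits the loop's stdin, which here is the pipe from inotifywait, so a body command that reads stdin silently drains the queued event lines; sleep 300 never touches stdin, which would explain why that variant works. A minimal sketch of the effect, with cat standing in for the long-running program:

```shell
# A body command that reads stdin drains the lines meant for `read`.
printf 'a\nb\nc\n' | while read -r f; do
    echo "got $f"
    cat > /dev/null              # stand-in for the long-running program
done
# Prints only "got a": cat swallowed "b" and "c".

# Redirecting the body command's stdin leaves the pipe to `read` intact.
printf 'a\nb\nc\n' | while read -r f; do
    echo "got $f"
    cat > /dev/null < /dev/null  # reads nothing from the event pipe
done
# Prints "got a", "got b", "got c".
```

If this is the cause, redirecting the program's stdin in the script (`/usr/bin/python3 LongRunningProgram.py ... < /dev/null`, or `-nostdin` for any ffmpeg it runs) should let the loop see every event.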


-
Laravel MySQL DB not updating after long process
11 May 2022, by slanginbits
I'm doing some video encoding in Laravel with FFMpeg, using https://github.com/protonemedia/laravel-ffmpeg


During the video conversion, I perform DB updates to track the percentage complete in the hls_percent field:


$ffmpeg_cmd = FFMpeg::fromDisk('local_videos')
    ->open($video->filename)
    ->exportForHLS()
    ->toDisk('local_videos')
    ->setSegmentLength(3) // optional
    ->setKeyFrameInterval($key_interval) // optional
    ->onProgress(function ($percentage, $remaining = 0, $rate = 0) use ($id) {
        ItemVideo::where('id', $id)->update(['hls_percent' => $percentage]);
    });



This part works fine: the hls_percent value is updated all the way to 100 and the encoded video files are generated.


After some files are moved (which takes several seconds), a final DB update is never performed:


ItemVideo::where('id', $id)->update(['hls_complete' => 1]);



The timeout only happens while encoding a large 150 MB (10-minute) mp4 file. Smaller/shorter videos complete the process without any issues.


I have increased the following in php.ini


memory_limit = 512M 
post_max_size = 1024M
upload_max_filesize = 1024M
max_execution_time = 2400
max_input_time = 2400
default_socket_timeout = 2400



I raised the global timeout variables on the MySQL server, as instructed here: https://sebhastian.com/lost-connection-mysql-server-during-query/


connect_timeout 2400
delayed_insert_timeout 300
have_statement_timeout YES
innodb_flush_log_at_timeout 1
innodb_lock_wait_timeout 50
innodb_rollback_on_timeout OFF
interactive_timeout 28800
lock_wait_timeout 31536000
mysqlx_connect_timeout 30
mysqlx_idle_worker_thread_timeout 60
mysqlx_interactive_timeout 28800
mysqlx_port_open_timeout 0
mysqlx_read_timeout 30
mysqlx_wait_timeout 28800
mysqlx_write_timeout 60
net_read_timeout 2400
net_write_timeout 2400
replica_net_timeout 2400
rpl_stop_replica_timeout 31536000
rpl_stop_slave_timeout 31536000
slave_net_timeout 2400
ssl_session_cache_timeout 2400
wait_timeout 28800



apache2handler:


Max Requests Per Child : 0 - Keep Alive : on - Max Per Connection : 100
Timeouts Connection : 300 - Keep-Alive : 5


I'm not getting any error messages in Laravel or in /var/log/apache2/error.log.


What else am I missing? How can I keep the MySQL connection alive to make the final update?


-
avutil/wchar_filename,file_open : Support long file names on Windows
26 May 2022, by softworkz