
Other articles (52)
-
The farm's regular Cron tasks
1 December 2010. Managing the farm relies on running, at regular intervals, several repetitive tasks known as Cron tasks.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance in the shared farm on a regular basis. Coupled with a system Cron on the central site of the farm, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
-
Publishing on MédiaSpip
13 June 2013. Can I post content from an iPad tablet?
Yes, if your installed Médiaspip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.
-
Adding notes and captions to images
7 February 2011. To add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, editing and deleting notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...)
On other sites (5950)
-
Adding FFMPEG support to Chromium Portable on Windows without recompiling
13 December 2023, by Ashby. I have been using Chromium Portable on Windows recently.


I downloaded Chromium Portable from chromium.org and found that it comes without FFmpeg support, which means playing media is simply not possible on some websites.


Is there a hack, plugin, or patch, whether simple or a little complex, that makes FFmpeg work in Chromium Portable from chromium.org? Thanks.


For personal reasons I do not intend to use Google Chrome for a few months; its automatic updates, much like Windows Update, annoy me.


On Linux, I know there is a package called chromium-codecs-ffmpeg-extra that makes FFmpeg support possible, but I cannot find anything similar on Windows.

Recompiling Chromium takes hours, and I do not want to use third-party Chromium builds for security reasons.


Years have gone by, and the old questions on Stack Overflow no longer address my need today.


Thanks for your patience & understanding.


-
"The system cannot find the file specified" error when trying to execute an FFmpeg command with C# (same code works fine in a different app)
5 March 2023, by m_kr. I know there are similar questions to this one. I have gone through every single one I could find, and nothing worked for me. Here is my issue:


I am trying to execute an FFmpeg command on the command line through .NET.


Before anything else, I tried doing it with the following code:


public static string executeCommand(string commandToBeExecuted)
{
    Process cmd = new Process();
    cmd.StartInfo.FileName = "cmd.exe";
    cmd.StartInfo.RedirectStandardInput = true;
    cmd.StartInfo.RedirectStandardOutput = true;
    cmd.StartInfo.CreateNoWindow = true;
    cmd.StartInfo.UseShellExecute = false;
    cmd.Start();

    cmd.StandardInput.WriteLine(commandToBeExecuted);
    cmd.StandardInput.Flush();
    cmd.StandardInput.Close();
    cmd.WaitForExit();
    return cmd.StandardOutput.ReadToEnd();
}



I passed the "ffmpeg -h" command as commandToBeExecuted. This did not work.


I next tried the following solution :


public static string ffmpegCommand(string commandToBeExecuted)
{
    ProcessStartInfo startInfo = new ProcessStartInfo();
    startInfo.CreateNoWindow = false;
    startInfo.UseShellExecute = false;
    startInfo.FileName = "c:\\ffmpeg\\bin\\ffmpeg.exe";
    startInfo.WindowStyle = ProcessWindowStyle.Hidden;
    startInfo.Arguments = "-h";

    startInfo.RedirectStandardOutput = true;
    startInfo.RedirectStandardError = true;

    Process exeProcess = Process.Start(startInfo);

    // string error = exeProcess.StandardError.ReadToEnd();
    string output = exeProcess.StandardOutput.ReadToEnd();
    exeProcess.WaitForExit();
    return output;
}



This returns the following error:




The system cannot find the file specified




I am assuming this is referring to this part of the code:


startInfo.FileName = "c:\\ffmpeg\\bin\\ffmpeg.exe";



However, I checked, and this is the correct path to my ffmpeg.exe file. Even more strangely, this code works correctly when tested in a new .NET console application. However, I am creating an extension for OutSystems integration, and when testing the code there it no longer works. The full exception from the logs is the following:




CssbobffmpegCommandTestFolder
System.ComponentModel.Win32Exception : The system cannot find the file specified
at Object.s [as getException] (https://personal-jwy0bfog.outsystemscloud.com/FFMpegCommandGeneratorFFProbeVisual/scripts/OutSystems.js?RnlDcii3Xz75iIHHERIZtA:2:10241)
at c.onSuccess (https://personal-jwy0bfog.outsystemscloud.com/FFMpegCommandGeneratorFFProbeVisual/scripts/OutSystems.js?RnlDcii3Xz75iIHHERIZtA:3:7232)
at XMLHttpRequest. (https://personal-jwy0bfog.outsystemscloud.com/FFMpegCommandGeneratorFFProbeVisual/scripts/OutSystems.js?RnlDcii3Xz75iIHHERIZtA:3:2648)




I researched similar problems and tried the following solutions:


In place of:


startInfo.FileName = "c:\\ffmpeg\\bin\\ffmpeg.exe";



I tried:


startInfo.WorkingDirectory = "c:\\ffmpeg\\bin";
 startInfo.FileName = @"ffmpeg.exe";



I also tried changing:


startInfo.Arguments = "-h";



to:


startInfo.Arguments = "/C -h";



I tried using "add new item" to add the ffmpeg.exe file to my solution, and I tried the following logic:


public static string testingNewApproachTwoThree(string commandToBeExecuted)
{
    string res;
    ProcessStartInfo startInfo = new ProcessStartInfo();

    startInfo.CreateNoWindow = false;
    startInfo.UseShellExecute = false;
    startInfo.FileName = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "ffmpeg\\ffmpeg.exe");
    startInfo.Arguments = "-h";
    startInfo.RedirectStandardOutput = true;
    //startInfo.RedirectStandardError = true;

    res = string.Format(
        "Executing \"{0}\" with arguments \"{1}\".\r\n",
        startInfo.FileName,
        startInfo.Arguments) + " NEXT: ";

    try
    {
        using (Process process = Process.Start(startInfo))
        {
            while (!process.StandardOutput.EndOfStream)
            {
                res = res + process.StandardOutput.ReadLine();
            }

            process.WaitForExit();
        }
    }
    catch (Exception ex)
    {
        res = res + "exception:" + ex.Message;
    }

    return res;
}



as suggested in a different question.


I tried changing the capitalization of letters in the specified filepath to make sure it matches the naming of my folders. Nothing worked.
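One extra check I am thinking of adding (just a hypothetical helper of my own, assuming the usual System and System.IO namespaces; I have not confirmed how it behaves inside the OutSystems environment) is to report whether the path is even visible to the process that runs the extension:

public static string checkFfmpegPath(string ffmpegPath)
{
    // Hypothetical diagnostic: report which identity and working directory the
    // hosting process uses, and whether the ffmpeg executable is visible to it.
    return "User: " + Environment.UserName
         + ", current directory: " + Environment.CurrentDirectory
         + ", file exists: " + File.Exists(ffmpegPath);
}

If File.Exists comes back false in the extension but true in the console app, that would at least confirm the two are not looking at the same file system.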


Any ideas?


-
How to interpret ffmpeg recording options available for a webcam (directshow)?
5 January 2023, by Jones659. I am trying to create a GUI for personal use that lets someone customise ffmpeg's recording and conversion options without using the command line directly. At the moment, I am learning about the different parameters and flags in ffmpeg.


Apologies in advance if I end up asking some stupid questions; I am on a learning journey at the moment, and unfortunately not all of this information is available online in an easily understandable form.


I have a USB webcam which reports the following available options:


[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=640x480 fps=5 max s=640x480 fps=30
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=640x480 fps=5 max s=640x480 fps=30 (tv, bt470bg/bt709/unknown, topleft) chroma_location=topleft
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=352x288 fps=5 max s=352x288 fps=30
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=352x288 fps=5 max s=352x288 fps=30 (tv, bt470bg/bt709/unknown, topleft)
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=320x240 fps=5 max s=320x240 fps=30
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=320x240 fps=5 max s=320x240 fps=30 (tv, bt470bg/bt709/unknown, topleft)
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=176x144 fps=5 max s=176x144 fps=30
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=176x144 fps=5 max s=176x144 fps=30 (tv, bt470bg/bt709/unknown, topleft)
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=160x120 fps=5 max s=160x120 fps=30
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=160x120 fps=5 max s=160x120 fps=30 (tv, bt470bg/bt709/unknown, topleft)
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=1280x1024 fps=5 max s=1280x1024 fps=9
[dshow @ 00000000003f9340] pixel_format=yuyv422 min s=1280x1024 fps=5 max s=1280x1024 fps=9 (tv, bt470bg/bt709/unknown, topleft)



I just want to get to the bottom of how to interpret this; apologies for asking multiple questions:



-
The fact that both resolution and fps have a min and a max value (for every option) seems to imply that these two parameters are supposedly variable and out of my control, right? In practice, the fps has varied depending on brightness, but the resolution has not. Is it safe to assume that video capture devices (especially webcams) do not have variable resolution?


-
Secondly, why is every option listed twice, with half of them specifying extra info such as color_range, color_space, and chroma_location? Is this just a quirk? Surely those extra parameters should not be discarded?


-
It's hard to know how to make sense of this, but for example: the fact that only "tv" is ever shown, does that imply that the webcam can only ever do limited color range, and there is no point trying to get full 0-255 out of it? I read somewhere that "pc" implies a full range of 0-255, whereas "tv" implies a range of 16-235 (see the small sketch after this list).


-
With regard to color space, is it acceptable to record the webcam as raw (unencoded) and then convert to a different color space later down the line? Which approach to handling the color space loses the least color? My only previous experience with color spaces is with images, where, for example, it makes no sense to convert sRGB to ROMM16 RGB: you are going to a color space with wider coverage, extra colors will not be created out of thin air, so you would want to go once from raw to a color space and avoid converting between color spaces afterwards. Also, what does "unknown" mean in the color space options?
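To make the "tv" versus "pc" distinction concrete, here is the mapping as I currently understand it (my own sketch in C#, based on the standard 8-bit limited-range definition, not anything reported by ffmpeg itself): limited range puts luma in 16-235, and expanding it to full range is just a linear rescale with clipping.

// Sketch: expand an 8-bit limited-range ("tv") luma value to full range ("pc").
// Limited range uses 16..235 for luma; full range uses 0..255.
public static byte LimitedToFullLuma(byte y)
{
    double full = (y - 16) * 255.0 / 219.0;               // 219 = 235 - 16
    return (byte)Math.Round(Math.Min(255.0, Math.Max(0.0, full)));
}
// For example, 16 maps to 0 and 235 maps to 255, while "overshoot" values
// below 16 or above 235 simply get clipped.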












Here is the culmination of some research and testing I've done; is there anything correct, or seriously wrong, in the conclusions and assumptions I've made below?


My understanding of pixel_format is as follows: when you are recording (even to raw), you specify the pixel format with something like "-pixel_format yuyv422"; this is a "packed", not "planar", format, and it is what the webcam produces. When you convert from raw to something like MKV using libx264, you cannot specify a "packed" pixel format such as "yuyv422" and must instead use the appropriate planar counterpart, such as "yuv422p", specified with "-pix_fmt yuv422p".
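To illustrate what packed versus planar means here (a sketch of the byte layout as I understand it, not code taken from ffmpeg): packed yuyv422 stores two pixels in four consecutive bytes as Y0 U Y1 V, while planar yuv422p stores all the Y samples first, then all the U, then all the V. A converter essentially has to do a repacking like this:

// Sketch: repack a packed YUYV422 frame into planar YUV422p buffers.
// Assumes packed.Length == width * height * 2, yPlane holds width * height
// bytes, uPlane and vPlane each hold width * height / 2 bytes, width is even.
public static void YuyvToYuv422p(byte[] packed, int width, int height,
                                 byte[] yPlane, byte[] uPlane, byte[] vPlane)
{
    int src = 0, y = 0, c = 0;
    for (int pair = 0; pair < width * height / 2; pair++)
    {
        yPlane[y++] = packed[src++];   // Y0
        uPlane[c]   = packed[src++];   // U, shared by the two pixels
        yPlane[y++] = packed[src++];   // Y1
        vPlane[c++] = packed[src++];   // V, shared by the two pixels
    }
}

As far as I can tell, ffmpeg performs an equivalent conversion automatically when the raw yuyv422 capture is fed to libx264, which is why the output ends up as yuv422p.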


I made a raw recording of the webcam (recording a bright light in the dark) without setting any of the options in the brackets above. I then converted this video using libx264 with the flags "-dst_range 1 -color_range 2", which I saw elsewhere on the internet.


Taking a screenshot of this video with VLC and putting it through ImageMagick's identify -verbose shows that the color range of the screenshot is 0-255. As for the video itself, MediaInfo reports "color range: Full", and VLC's codec info says "Decoded format: Planar 4:2:2 YUV full scale". Is this information worth anything, or is it just metadata the video got tagged with?


At first I was happy about ImageMagick's color range report, but I now think the 0-255 range could be the result of "overshoot" values produced by the camera, which are not actually supposed to be mapped linearly.


I appreciate that this probably feels like some school kid offloading their homework to avoid doing work, but I hope it is clear that I looked into these things before putting this post together.


Thanks in advance, if anyone takes the time to answer anything.

