
Other articles (77)
-
User profiles
12 April 2011. Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
The user can edit their profile from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)
-
Configuring language support
15 November 2010. Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administrer" (administer) section of the site.
From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section that lets you enable support for new languages.
Each newly added language can still be disabled as long as no object has been created in that language; once one has, it is greyed out in the configuration and (...)
-
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
On other sites (11063)
-
Interact with ffmpeg from a .NET program - Write Input
7 May 2015, by Shimmy. In reference to this question, as you can see I managed to run the program and receive data from it.
However, I didn't manage to submit data to it; for instance, while converting a file, pressing q immediately stops the conversion and exits the program.
I need my application to support stopping the process as well, and I think this should be done by sending that keystroke to the ffmpeg process, since I want it to clean up any resources or leftovers it would otherwise leave behind if I simply called process.Kill().
Here is what I've tried:
static int lineCount = 0;
static bool flag;

static void process_ErrorDataReceived(object sender, DataReceivedEventArgs e)
{
    Console.WriteLine("Error ({1:m:s:fff}: {0})", lineCount++, DateTime.Now);

    if (e.Data != null && string.Equals(e.Data, "Press [q] to stop, [?] for help"))
        flag = true;

    if (flag)
    {
        flag = false;
        Console.WriteLine("Stopping ({0:m:s:fff})...", DateTime.Now);
        process.CancelErrorRead();
        process.CancelOutputRead();
        process.StandardInput.WriteLine("q");
    }

    Console.WriteLine(e.Data);
    Console.WriteLine();
}
But it doesn't do anything; it seems that once the conversion has started I have no control over it any more, I can only receive output from it. Running ffmpeg stand-alone does of course allow interaction.
What am I missing here? Is there a different trick to submitting the input, is the code in the previous answer wrong, or should I have chosen a different approach?
For your attention, RedirectStandardInput is on.
NOTE: as you can see in the answer to my previous question, ffmpeg interacts differently; I think whoever knows the answer will be (maybe I'm wrong) someone with experience with ffmpeg.
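Not part of the original question, but here is a minimal sketch of the pattern usually suggested for this, assuming ffmpeg is launched directly by the application: redirect standard input when starting the process, keep the Process reference, and write "q" (followed by an explicit flush) when a graceful stop is wanted. The executable path and arguments below are placeholders.
using System;
using System.Diagnostics;

class FfmpegStopSketch
{
    static void Main()
    {
        var psi = new ProcessStartInfo
        {
            FileName = "ffmpeg.exe",                  // placeholder path
            Arguments = "-i input.mp4 output.avi",    // placeholder arguments
            UseShellExecute = false,                  // required for any stream redirection
            RedirectStandardInput = true,             // lets us send the "q" keystroke later
            RedirectStandardError = true,             // ffmpeg writes its progress to stderr
            CreateNoWindow = true
        };

        var process = Process.Start(psi);
        process.ErrorDataReceived += (s, e) => Console.WriteLine(e.Data);
        process.BeginErrorReadLine();

        Console.ReadKey(true);                        // wait for the user, then request a stop

        // Ask ffmpeg to finish and clean up instead of calling process.Kill().
        process.StandardInput.Write("q");
        process.StandardInput.Flush();                // make sure the character actually reaches ffmpeg

        process.WaitForExit();
    }
}
One thing worth checking in a real run: ffmpeg also accepts a -nostdin option that disables this stdin interaction entirely, so it must not appear in the argument list, and depending on the build the "q" may only take effect once the writer is flushed (or a newline is sent).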
-
FFmpeg and Blu-ray subtitles [closed]
14 October 2020, by Paul DeRocco. I have some Blu-ray rips with PGS (image-based) subtitles. My goal is to transcode the video to lower-rate H265, to save space and lower the bitrate, but I need the subtitles. I don't really care what the container is: it can remain .m2ts, or be anything else, but I've been unable to get FFmpeg to do anything with this kind of video that doesn't discard the subtitles, or convert them into "PES packets containing private data", as VLC Player describes it.

For instance, how can I do something as simple as making another .m2ts file with the first minute of an existing .m2ts file, copying all the streams verbatim and just truncating them? I can understand that perhaps you can't put PGS subtitle streams into some containers, but if the output is the same kind of container as the input, why would that be a problem?

If I use FFmpeg to list the streams in my original file, it says:

Stream #0:0[0x1011]: Video: h264 (High) (HDMV / 0x564D4448), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 23.98 fps, 23.98 tbr, 90k tbn, 47.95 tbc
Stream #0:1[0x1100]: Audio: ac3 (AC-3 / 0x332D4341), 48000 Hz, mono, fltp, 192 kb/s
Stream #0:2[0x1200]: Subtitle: hdmv_pgs_subtitle ([144][0][0][0] / 0x0090), 1920x1080

If I do

ffmpeg -i foobar.m2ts -c copy -t 1:00 test.m2ts

I end up with the following streams in test.m2ts:

Metadata:
 service_name : Service01
 service_provider: FFmpeg
Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 23.98 fps, 23.98 tbr, 90k tbn, 47.95 tbc
Stream #0:1[0x101]: Audio: ac3 ([129][0][0][0] / 0x0081), 48000 Hz, mono, fltp, 192 kb/s
Stream #0:2[0x102]: Data: bin_data ([6][0][0][0] / 0x0006)

I was trying to create a really short file so that I could experiment with different options for what I really want to do, which is to compress the video more without losing the subtitles, and I didn't want to wait for a whole movie to convert before discovering that it didn't work right.
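Not from the original question, but two things commonly matter here: ffmpeg's default stream selection keeps only one stream of each type, so explicit -map options are needed when a rip carries several audio or subtitle tracks, and Matroska is generally a friendlier target than MPEG-TS for copied hdmv_pgs_subtitle streams. A hedged sketch of the kind of command often suggested for the real goal (file names and the CRF value are placeholders):

ffmpeg -i foobar.m2ts -map 0:v -map 0:a -map 0:s -c:v libx265 -crf 22 -c:a copy -c:s copy output.mkv

Adding -t 1:00 to the same command should reproduce the one-minute truncation test while keeping the subtitle stream tagged as PGS in the .mkv output.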


-
ffmpeg/libav easy way to set options for muxer, codec, format, etc [closed]
14 March 2023, by Patrick. Edit (shorter question):

In the ff tools (e.g. ffplay) I can set all kinds of options simply by providing a list of arguments on the command line. I'd like to do that programmatically as well, without having to look up where each option belongs and insert it into the right AVDictionary by hand every time. Is there a way to do this easily?

Long question:

I recently messed around with the ffplay code to see how it works, and I noticed that it parses and sets all the command-line options in a very straightforward way using the library-internal cmdutils.h. I personally find the av_opt_set calls used in other examples of av wrappers quite confusing (some arguments are stored explicitly in the struct, some in priv_data? Am I allowed/supposed to modify void *priv_data? Which objects can I use av_opt_set on? Which arguments go in which object, and are they declared fields or in priv_data? Where is this documented?).
In ffplay, all arguments are simply stored in an array and distributed to the right codec/muxer/format instance by cmdutils.
I'd like to have exactly this functionality in my program (so that I can simply read a JSON config and not have to care about it any further). Apparently the necessary OptionDef arrays are already defined in the various implementation files.

However, my actual question: I noticed that the OptionDef array definition in ffplay does not contain all options (only some from cmdutils, included via a macro). Other options, e.g. fflags, are not included anywhere (only defined somewhere else) and yet they work. So how does cmdutils set/parse them?

I hope someone can answer this, since simply adapting cmdutils would be quite a simple solution for me. I'd also really appreciate some general guidance regarding my previous questions.

Many thanks in advance!

I tried looking into the ffmpeg source and expected all the OptionDef array definitions to be connected/collected inside a single array so that cmdutils could parse them easily. However, this isn't the case, and yet some command-line options still work, so I'm confused about how cmdutils is able to parse them.
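Not part of the original post, but a minimal C sketch of the mechanism that usually answers the short version of the question: avformat_open_input() (and similarly avcodec_open2()) accepts an AVDictionary of option names given as strings, matches each entry against both the context's generic options and the private (priv_data) options of the demuxer or codec being opened, and leaves anything it did not recognise in the dictionary. The URL and option values below are placeholders.
#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavutil/dict.h>
#include <libavutil/opt.h>

int open_with_options(const char *url)
{
    AVFormatContext *fmt_ctx = NULL;
    AVDictionary *opts = NULL;

    /* Options are given as plain strings, e.g. straight from a JSON config.
     * "fflags" and "probesize" are generic AVFormatContext options; a
     * demuxer-private option would be matched against priv_data automatically. */
    av_dict_set(&opts, "fflags", "nobuffer", 0);
    av_dict_set(&opts, "probesize", "5000000", 0);

    int ret = avformat_open_input(&fmt_ctx, url, NULL, &opts);

    /* Anything still left in opts was recognised by neither the format context
     * nor the demuxer, so it can simply be logged instead of tracking by hand
     * where each option is supposed to live. */
    const AVDictionaryEntry *e = NULL;
    while ((e = av_dict_get(opts, "", e, AV_DICT_IGNORE_SUFFIX)))
        fprintf(stderr, "option '%s' was not consumed\n", e->key);
    av_dict_free(&opts);

    if (ret < 0)
        return ret;

    /* The same by-name lookup exists without a dictionary:
     * AV_OPT_SEARCH_CHILDREN tells av_opt_set to search priv_data as well.
     * Shown only to illustrate the call; in practice set options before opening. */
    av_opt_set(fmt_ctx, "fflags", "nobuffer", AV_OPT_SEARCH_CHILDREN);

    avformat_close_input(&fmt_ctx);
    return 0;
}
As far as I can tell, this is also how cmdutils handles options like fflags that never appear in ffplay's own OptionDef table: arguments it does not recognise fall through to a default handler that looks them up in the AVOption system by name and stashes them in the appropriate per-category dictionary.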