
Media (1)
-
Collections - Quick creation form
19 February 2013, by
Updated: February 2013
Language: French
Type: Image
Other articles (90)
-
What is an editorial?
21 June 2013, by
Write your point of view in an article. It will be filed in a section set aside for this purpose.
An editorial is a text-only article. Its purpose is to gather points of view in a dedicated section. A single editorial is featured on the home page. To read previous ones, see the dedicated section.
You can customise the editorial creation form.
Editorial creation form
In the case of a document of the editorial type, the (...)
-
User profiles
12 April 2011, by
Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
The user can edit their profile from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)
-
Configuring language support
15 November 2010, by
Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the site's "Administrer" (administration) area.
From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section where support for new languages can be enabled.
Each newly added language can still be disabled as long as no object has been created in that language. Once that happens, it is greyed out in the configuration and (...)
On other sites (14275)
-
Feeding content of an X-Window to a virtual camera
11 May 2022, by Ingo
I want to feed a virtual webcam device from an application window (under Linux/Xorg). So far I have just maximised the window and then used ffmpeg to grab the whole screen like this:

ffmpeg \
 -f x11grab -framerate 15 -video_size 1280x1024 -i :0+0,0 \
 -f v4l2 -vcodec rawvideo -pix_fmt yuv420p /dev/video6



where /dev/video6 is my v4l2loopback device. This works and I can use the virtual camera in video calls in chrome. This also indicates that the v4l2loopback module is correctly loaded into the kernel.

Unfortunately, it seems that ffmpeg can only read the whole screen, but not an application window. gstreamer, on the other hand, can. Playing around with gst-launch-1.0, I was hoping that I could get away with something like this:

gst-launch-1.0 ximagesrc xid=XID_OF_MY_WINDOW \
 ! "video/x-raw" \
 ! v4l2sink device=/dev/video6



However, that complains that Device '/dev/video6' is not an output device.
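Not part of the original question, but a variant I would try first: reload v4l2loopback with its standard module options and add an explicit conversion step so v4l2sink can negotiate a raw format the loopback device accepts (the videoconvert element and caps filter are my additions; this is an untested sketch).

# Recreate the loopback device; video_nr and exclusive_caps are standard
# v4l2loopback module options (exclusive_caps=1 helps browsers see the device).
sudo modprobe -r v4l2loopback
sudo modprobe v4l2loopback video_nr=6 exclusive_caps=1

# Convert the captured frames to a plain raw format before handing them to v4l2sink.
gst-launch-1.0 ximagesrc xid=XID_OF_MY_WINDOW \
 ! videoconvert \
 ! "video/x-raw,format=YUY2" \
 ! v4l2sink device=/dev/video6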


Given that ffmpeg seems happy to write to /dev/video6, I also tried piping the gst output to ffmpeg like this:

gst-launch-1.0 ximagesrc xid=XID_OF_MY_WINDOW \
 ! "video/x-raw" \
 ! filesink location=/dev/stdout \
 | ffmpeg -i - -codec copy -f v4l2 -vcodec rawvideo -pix_fmt yuv420p /dev/video6



But then ffmpeg complains about Invalid data found when processing input.
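A guess at the cause, not something the question confirms: filesink pushes raw, headerless frames, so ffmpeg has no container or stream parameters to parse. One sketch of a workaround would be to mux into a container before the pipe and silence gst-launch's status output (matroskamux accepts raw video in recent GStreamer; avimux could be tried otherwise):

# -q keeps gst-launch's status messages out of stdout, so only the
# muxed stream reaches the pipe.
gst-launch-1.0 -q ximagesrc xid=XID_OF_MY_WINDOW \
 ! videoconvert \
 ! "video/x-raw,format=I420" \
 ! matroskamux \
 ! fdsink fd=1 \
 | ffmpeg -i - -f v4l2 -vcodec rawvideo -pix_fmt yuv420p /dev/video6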

This is running inside an Xvfb headless environment, so mouse interactions will not work. This rules out OBS as far as I can see.


I'm adding the chrome tag because I see that Chrome in principle would also provide a virtual camera via the --use-fake-device-for-media-stream switch. However, it seems that this switch only supports a static file rather than a stream.
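(For completeness, and as an assumption on my part rather than something tested here: Chrome pairs that switch with --use-file-for-fake-video-capture, which plays back a pre-recorded .y4m or .mjpeg file, which is why it does not help with a live stream.)

# Chrome's fake capture device plays a file, not a live source.
chrome --use-fake-device-for-media-stream \
 --use-file-for-fake-video-capture=/path/to/clip.y4m   # binary name and path are placeholders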

Although I don't see why, it might be relevant that the "application window" is simply a second browser window. So the setup is Google Meet (or similar) in one browser window, and the virtual camera gets fed from a second browser window.
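One more workaround I would consider (my own suggestion, not from the question): since x11grab already accepts an offset and size, the original ffmpeg command can be pointed at just the target window's region once its geometry is known, for example via xdotool:

# Query the window's position and size (requires xdotool);
# this sets X, Y, WIDTH and HEIGHT in the shell.
eval $(xdotool getwindowgeometry --shell XID_OF_MY_WINDOW)

# Grab only that region instead of the whole screen.
ffmpeg \
 -f x11grab -framerate 15 -video_size "${WIDTH}x${HEIGHT}" -i :0+${X},${Y} \
 -f v4l2 -vcodec rawvideo -pix_fmt yuv420p /dev/video6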


-
ANSI FATE
24 August 2010, by Multimedia Mike — FATE Server
The new FATE server is shaping up well. I think most of the old configurations have been migrated to the new server. I see one new compiler for x86_64: PathScale. It's not faring particularly well at this point.
New Tests
As I write this, I noticed that there are now an even 700 tests, twice as many as the last time I trumpeted such a milestone. (It should be noted that the new FATE system finally breaks down the master regression suite into individual tests.) Thankfully, it's no longer necessary to wait for me to create or edit tests (anyone with FFmpeg privileges can do this), nor is it necessary to keep up with this blog to know exactly what tests are new. Now, you can simply inspect the file history on tests/fate.mak and tests/fate2.mak (I think these 2 files are going to merge in the near future).
Vitor, as of r24865: "Add FATE test for ANSI/ASCII animation and TTY demuxer." Eh? What's this about? I admit I was completely removed from FFmpeg development for much of June and July, so I could have missed a lot. Fortunately, I can check the file history to see which lines were added to make this test happen. And if FATE is exercising the test, you know exactly where the samples will live. Here's this new decoder in action on the relevant sample:
The file history fingers Suxen drol/Peter Ross for this handiwork. I might have guessed: the only person who is arguably more enamored with old, weird formats than even I. Now we wait for the day that YouTube has support for this format. I'm sure there are huge archives of these animations out there (and I wager that Trixter and Jason Scott know where).
It’s an animation — it just keeps going
Meanwhile, the FATE suite now encompasses a bunch of perceptual audio formats, thanks to the 1-off testing method and a few other techniques. These formats include Bink audio, WMA Pro, WMA voice, Vorbis, ATRAC1, ATRAC3, MS-GSM, AC3, E-AC3, NellyMoser, TrueSpeech, Intel Music Coder, QDM2, RealAudio Cooker, QCELP (just going down the source control log here), and others, no doubt.
Then there's this curious tidbit: "Add FATE test for WMV8 DRM". The test spec is
"fate-wmv8-drm: CMD = framecrc -cryptokey 137381538c84c068111902a59c5cf6c340247c39 -i $(SAMPLES)/wmv8/wmv_drm.wmv -an".
I would still like to investigate FFmpeg's cryptographic capabilities, which I suspect are moving in a direction to function as a complete SSL stack one day.
New Platforms
As for new platforms, the new FATE system finally allows testing on OS/2 (remember that classic? It was "the totally cool way to run your computer"). Thanks to Dave Yeo for taking this on.
Further, a new MIPS-based platform recently appeared on the FATE list. This one reports itself as running on a 74Kf CPU. Googling for this processor quickly brings up Mans' post about the Popcorn Hour device. So, congratulations to him for getting the mundane box to serve a higher purpose. Perhaps one day, I'll be able to do the same for that Belco Alpha-400 netbook.
-
How to use guardianproject's android ffmpeg library?
21 October 2013, by Blaze Tama
First, this is my first time "playing" with ffmpeg, so please bear with me.
Generally, I don't understand ffmpeg even a little bit. So I did a lot of research (and also trial and error) and finally found this project and its library.
So I successfully created the ffmpeg and sox binary files and put them in the raw folder of the library project (from the link I shared).
Now, I want to use the library for my project, but I still can't get it to work. I tried some methods in the FfmpegController like combineAudioAndVideo and more, but it's not working (yet). I'm not posting the error here since I'm still doing trial and error (and the error changes regularly), but I'm getting tired now.
EDIT
This is what I did:
private FfmpegController ffController;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    // Try to point the controller at a raw resource via a resource URI.
    File file = new File(Uri.parse("android.resource://com.my.package/" + R.raw.test).getPath());

    try {
        ffController = new FfmpegController(this, file);
    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }

    MediaDesc desc = ffController.combineAudioAndVideo(R.raw.test, R.raw.musictest, "test.mp4", null);
}

The combineAudioAndVideo call always errors because of wrong parameters. It needs MediaDesc arguments, but I don't know how to build them. I would be very happy if you could share your working code, if you have done ffmpeg processing with this library.
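Not an answer from the thread, but one thing I would check first: android.resource:// URIs do not correspond to real files on disk, so the File handed to FfmpegController above almost certainly does not exist. A minimal sketch, assuming the library's MediaDesc exposes a public path field and that combineAudioAndVideo takes two MediaDesc inputs, an output path and a callback (both are assumptions; check the FfmpegController source for the exact signatures), would copy the raw resources out to real files first:

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import android.content.Context;

// Copy a raw resource to a real file under the app's private files dir
// so the native ffmpeg binary can read it (standard Android APIs only).
private File copyRawToFile(Context ctx, int resId, String name) throws IOException {
    File out = new File(ctx.getFilesDir(), name);
    InputStream in = ctx.getResources().openRawResource(resId);
    OutputStream os = new FileOutputStream(out);
    try {
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) > 0) {
            os.write(buf, 0, n);
        }
    } finally {
        in.close();
        os.close();
    }
    return out;
}

// Rough usage inside onCreate(); MediaDesc.path and the combineAudioAndVideo
// parameters are assumptions about the library's API, not verified here.
try {
    File videoFile = copyRawToFile(this, R.raw.test, "test.mp4");
    File audioFile = copyRawToFile(this, R.raw.musictest, "music.mp3");

    MediaDesc videoIn = new MediaDesc();
    videoIn.path = videoFile.getAbsolutePath();
    MediaDesc audioIn = new MediaDesc();
    audioIn.path = audioFile.getAbsolutePath();

    FfmpegController controller = new FfmpegController(this, getFilesDir());
    controller.combineAudioAndVideo(videoIn, audioIn,
            new File(getFilesDir(), "combined.mp4").getAbsolutePath(), null);
} catch (IOException e) {
    e.printStackTrace();
}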