
Media (1)
-
The Slip - Artworks
26 September 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (103)
-
Sites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page. -
User profiles
12 April 2011, by
Each user has a profile page allowing them to edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
The user can edit their profile from their author page; an "Edit your profile" link in the navigation is (...) -
Configuring language support
15 November 2010, by
Accessing the configuration and adding supported languages
To configure support for new languages, go to the "Administer" section of the site.
From there, in the navigation menu, you can access a "Language management" section that lets you enable support for new languages.
Each newly added language can still be disabled as long as no object has been created in that language. Once one has, it becomes greyed out in the configuration and (...)
On other sites (11737)
-
vf_libplacebo : switch to newer libplacebo helpers
14 December 2021, by Niklas Haas
vf_libplacebo: switch to newer libplacebo helpers
Support for mapping/unmapping hardware frames has been added into
libplacebo itself, so we can scrap this code in favor of using the new
functions. This has the additional benefit of being forwards-compatible
as support for more complicated frame-related state management is added
to libplacebo (e.g. mapping dolby vision metadata).
It's worth pointing out that, technically, this would also allow
`vf_libplacebo` to accept, practically unmodified, other frame types
(e.g. vaapi or drm), or even software input formats. (Although we still
need a vulkan *device* to be available.)
To keep things simple, though, retain the current restriction to vulkan
frames. It's possible we could rethink this in a future commit, but for
now I don't want to introduce any more potentially breaking changes. -
Scheduling an RTMP stream remotely - using an intermediary server for storing + sending video stream packets before deploying to streaming service
25 February 2020, by hedgehog90
This is more of a curiosity than something I really need, but I thought I'd ask anyway.
If I just wanted to set up a normal livestream, I would use something like OBS: capture my input, encode it into something manageable for my network, and send it to a remote RTMP server.
But I’d like to know if it’s possible to put an intermediary remote server between myself and the streaming service’s server. Basically so I can manage the stream (if it’s a playlist) and schedule when to send it to broadcast on the streaming service.
It's also worth noting that there may be limited bandwidth on the client side (my computer); assuming the intermediary has greater bandwidth, this method should eliminate the common issue of dropped frames while streaming.
Now for an example:
To make it simpler, instead of using OBS + capture hardware, I’m using a video file.
I want to encode that video the same way OBS does when streaming to a remote server over RTMP, using ffmpeg.
Now I upload that data, at my own rate, to a remote server that I control (running Ubuntu) for storage and eventual deployment. Importantly, I do not want or require any video processing on the intermediary server, as the data has already been encoded for deployment on the client side. The server simply manages and stores the data.
A day later, I want to run a script on my intermediary server that will then send the processed stream data, packet by packet, to the targeted streaming server.
I’m an experienced coder, with lots of experience with data handling and video encoding. It should be simple, but I’m not all that clued up on the way video streaming via RTMP works.
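The workflow described above can be sketched with ffmpeg alone. This is a minimal, hedged outline, not a tested pipeline: the hostnames, paths, stream key, and bitrates are hypothetical, and it assumes the target service accepts H.264/AAC in FLV over RTMP, as most RTMP ingest endpoints do.

```shell
# Step 1 (client): encode once, OBS-style (H.264 + AAC), into an FLV file
# containing the same packets that would have been sent live over RTMP.
ffmpeg -i source.mp4 \
    -c:v libx264 -preset veryfast -b:v 2500k -g 60 \
    -c:a aac -b:a 160k \
    -f flv encoded_stream.flv

# Step 2 (client): upload at whatever rate the uplink allows;
# --partial lets an interrupted transfer resume.
rsync --partial encoded_stream.flv user@intermediary.example.com:/srv/streams/

# Step 3 (intermediary, a day later): replay the file in real time.
# -re throttles reading to the native frame rate; -c copy remuxes the
# already-encoded packets without touching the video or audio.
ffmpeg -re -i /srv/streams/encoded_stream.flv \
    -c copy -f flv rtmp://live.example-service.com/app/STREAM_KEY
```

Because step 3 only remuxes, the intermediary needs almost no CPU, and scheduling reduces to wrapping the final command in a cron job or `at` invocation.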
-
How to detach thread performing video decode using libavcodec ?
31 December 2018, by codemonkey
I need to create an application that launches a video decoding function (based on the demo from libavcodec here) on a separate thread while the main thread of execution continues to perform other tasks concurrently with the decode process.
I have tried having a std::thread call the function and then detach it. I've also tried having it join the main thread after the rest of the main thread executes.
It might be worth noting that I am telling libavcodec to use multiple threads for decoding (by setting ctxt->num_threads), but even with one thread I always get the same error message:
Assertion fctx->async_lock failed at src/libavcodec/pthread_frame.c:155
(The decode function works fine when it's called from the main thread, or if I call it from a new thread and join the thread immediately after calling.)
I have also tried init'ing all of libavcodec's resources from the same thread, in case that would fix the problem, but that did not resolve the issue.
In code, both of these give me the error above:
decoderThread = new std::thread(&decoder::decodeVideo, this);
decoderThread->detach();
*execution of main thread*
decoderThread = new std::thread(&decoder::decodeVideo, this);
*execution of main thread*
decoderThread->join();
I need a separate thread to decode video (possibly spawning more threads in the process) while allowing the main thread to proceed, but I do not seem able to do so right now. Any guidance would be appreciated.