
Advanced search
Other articles (56)
-
MediaSPIP version 0.1 Beta
16 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the sources of MediaSPIP in its standalone version.
To get a working installation, you need to install all the software dependencies manually on the server.
If you want to use this archive for an installation in "farm" mode, you will also need to make other modifications (...) -
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to make other manual (...) -
Improving the base version
13 September 2013
A nicer multiple selection
The Chosen plugin improves the usability of multiple-selection fields. See the two images below to compare.
To use it, activate the Chosen plugin (General site configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)
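
A minimal sketch of what that configuration amounts to on the public site, assuming SPIP already loads jQuery and the Chosen plugin; the options shown are hypothetical and for illustration only:

// Minimal sketch: enhance the multiple-selection lists mentioned above with Chosen.
// Assumes jQuery and the Chosen plugin are loaded globally by SPIP.
declare const jQuery: any;

jQuery(function ($: any) {
  $("select[multiple]").chosen({
    // Hypothetical options, shown for illustration only.
    width: "100%",
    placeholder_text_multiple: "Select options",
  });
});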
Sur d’autres sites (4359)
-
Real time livestreaming - RPI FFmpeg and H5 Player
29 April 2022, by Victor
I work at a telehealth company, and we use connected medical devices to provide the doctor with real-time information from this equipment; the equipment is operated by a trained health professional.


These devices produce video and audio. Right now we use them with PeerJS (a peer-to-peer connection), but we are trying to move away from that and have a Raspberry Pi whose only job is to stream the data (audio and video).


Because the equipment is meant to be used under a doctor's instructions, the doctor needs to receive the data in real time.


But we also need the trained health professional to see what they are doing, so we need a local feed from the equipment.


How we capture audio and video


We are using ffmpeg with a Go client that manages the ffmpeg processes and streams them to an SRS server.
This works, but we get a 2-3 second delay when streaming the data (RTMP out of ffmpeg, FLV on the front end).


ffmpeg settings:


("ffmpeg", "-f", "v4l2", `-i`, "*/video0", "-f", "flv", "-vcodec", "libx264", "-x264opts", "keyint=15", "-preset", "ultrafast", "-tune", "zerolatency", "-fflags", "nobuffer", "-b:a", "160k", "-threads", "0", "-g", "0", "rtmp://srs-url")



My questions


- Is there a way for this setup to achieve low latency (under 1 second) for both the nurse and the doctor?
- Is my approach a good one? Is there a better way?


Flow schema


Data exchange and use case flow: (flow diagram not included in this excerpt)
Note: the nurse and doctor use HTTP-FLV to play the live stream, for low latency.
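
A minimal sketch of that playback side, assuming the browser player is flv.js (the question does not name the player library) and a hypothetical HTTP-FLV endpoint exposed by the SRS server:

import flvjs from "flv.js";

// Attach an HTTP-FLV live stream to a <video> element.
// The element id and the stream URL are hypothetical placeholders.
const video = document.getElementById("live") as HTMLVideoElement;

if (flvjs.isSupported()) {
  const player = flvjs.createPlayer({
    type: "flv",
    isLive: true,                            // live stream, not VOD
    url: "https://srs-url/live/stream.flv",  // hypothetical SRS HTTP-FLV endpoint
  });
  player.attachMediaElement(video);
  player.load();
  player.play();
}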



-
ffplay with gdigrab doesn't render properly
14 April 2022, by laggingreflex
I'm trying to do this:


ffplay -f gdigrab -framerate 30 -i title="Untitled - Notepad"



It's supposed to show the notepad window in the ffplay window.


But it doesn't seem to render properly: it shows the scrollbars but nothing that is written in Notepad. Other windows ("Calculator") don't render properly either.


Does ffplay not work (well) with gdigrab? Or is it just me?



-
Rendering a libav-decoded video stream in Android with SDL? Or something else?
13 April 2013, by user1568549
I've successfully cross-compiled a C++ video streaming library for the Android ICS platform.
The library contains a sample player that uses the SDL library to render the decoded video streams, and libav for decoding, which I've also managed to cross-compile (libav ... classes). I then wrote the necessary JNI classes and tested them using log tags; everything seems fine, but now I want to show the result on the screen (the real stream, not just log messages).
I am searching for the easiest way to render my stream (just to be sure that everything works fine)
Am I obliged to cross-compile the SDL library too? Is that possible? If yes, is there a good tutorial?
Is there any other way to render the ffmpeg-decoded frames directly?
Thanks in advance.