
Media (3)
-
MediaSPIP Simple: future default graphic theme?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
-
GetID3 - Additional buttons
9 April 2013, by
Updated: April 2013
Language: French
Type: Image
Other articles (82)
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Improvements to the base version
13 September 2013
Nicer multiple selection
The Chosen plugin improves the usability of multiple-selection fields. See the two images below to compare.
To do so, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)
-
The plugin: Podcasts.
14 July 2010, by
The problem of podcasting is once again a problem that highlights the standardisation of data transport on the Internet.
Two interesting formats exist: the one developed by Apple, heavily geared towards iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "open" and is notably supported by Yahoo and the Miro software;
File types supported in the feeds
Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)
On other sites (10763)
-
Connect external cameras to iOS and decompress to a usable form
27 September 2017, by Ping Chen
I want to create a two-camera setup that can send one of the camera views out as an RTMP stream, depending on the motion intensity detected. The chosen camera view can change if the motion intensity on the views changes.
I imagine I could use an iPhone/iPad as the encoding/streaming hub as well as one of the cameras, and connect a WiFi camera to the iPad/iPhone to feed the second camera view.
My goals for the iOS side are:
- Connect to a WiFi camera on the local network
- Decode the data and run motion intensity detection on the WiFi camera feed AND the iPhone/iPad’s own camera feed with Brad Larson’s GPUImage framework https://github.com/BradLarson/GPUImage
- Stream out the chosen camera view, depending on the motion detected
Larson's GPUImage framework works with an AVCaptureSession subclass. I'm only familiar with AVFoundation objects, but am a complete noob when it comes to VideoToolbox and some of the lower-level iOS video stuff. Through googling, I kind of know that VTDecompressionSession is what I'd get from the WiFi camera. I have no clue how to manipulate that into a usable form for my purposes.
I've dug through Stack Overflow answers such as: https://stackoverflow.com/a/29525001/7097455
Very informative, but maybe I don't even know how to ask the correct questions.
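A minimal sketch, purely as an orientation aid, of what the VideoToolbox decode step could look like in Swift. It assumes the WiFi camera delivers H.264, that a CMVideoFormatDescription has already been built from the stream's SPS/PPS (e.g. with CMVideoFormatDescriptionCreateFromH264ParameterSets), and that each compressed frame arrives wrapped in a CMSampleBuffer; the GPUImage hand-off at the end is a hypothetical placeholder, not real GPUImage API.

```swift
import CoreMedia
import VideoToolbox

/// Sketch only: decode compressed H.264 frames from a network camera into
/// CVPixelBuffers that a GPUImage pipeline or motion detector can consume.
/// Parsing the stream's SPS/PPS and packaging frames into CMSampleBuffers
/// is assumed to happen elsewhere and is not shown here.
final class WiFiCameraDecoder {

    private var session: VTDecompressionSession?

    /// Create the decompression session once the stream's format is known.
    func prepare(formatDescription: CMVideoFormatDescription) {
        // Request BGRA output so decoded frames are easy to hand to GPUImage.
        let attrs = [kCVPixelBufferPixelFormatTypeKey as String:
                        kCVPixelFormatType_32BGRA] as CFDictionary

        var newSession: VTDecompressionSession?
        let status = VTDecompressionSessionCreate(
            allocator: kCFAllocatorDefault,
            formatDescription: formatDescription,
            decoderSpecification: nil,
            imageBufferAttributes: attrs,
            outputCallback: nil,   // nil is allowed when decoding with an output handler
            decompressionSessionOut: &newSession)

        guard status == noErr else {
            print("VTDecompressionSessionCreate failed: \(status)")
            return
        }
        session = newSession
    }

    /// Decode one compressed frame. `sampleBuffer` must carry the format
    /// description created above and AVCC (length-prefixed) NAL units.
    func decode(_ sampleBuffer: CMSampleBuffer) {
        guard let session = session else { return }

        let status = VTDecompressionSessionDecodeFrame(
            session,
            sampleBuffer: sampleBuffer,
            flags: [],             // synchronous decode keeps the sketch simple
            infoFlagsOut: nil
        ) { decodeStatus, _, imageBuffer, presentationTime, _ in
            guard decodeStatus == noErr, let pixelBuffer = imageBuffer else { return }
            // `pixelBuffer` is now an ordinary CVPixelBuffer: lock its base address,
            // read the BGRA bytes, and feed them to the motion-intensity filter.
            self.handleDecodedFrame(pixelBuffer, at: presentationTime)
        }

        if status != noErr {
            print("VTDecompressionSessionDecodeFrame failed: \(status)")
        }
    }

    /// Hypothetical placeholder for the GPUImage / motion-detection hand-off,
    /// e.g. pushing the raw bytes into a GPUImageRawDataInput.
    private func handleDecodedFrame(_ pixelBuffer: CVPixelBuffer, at time: CMTime) {
    }
}
```

Getting from the camera's wire format (RTSP/ONVIF, Annex-B NAL units) to those CMSampleBuffers is the other half of the work and depends on the particular camera.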
-
Tools to determine video orientation
6 April 2014, by JayLev
I receive videos from different devices and want to encode them using the correct orientation.
I've seen some examples of how to determine the orientation of a video from an iPhone.
With exiftool and mediainfo I can indeed tell whether an iPhone video has to be rotated.
However, for Android videos, both portrait and landscape videos have the same rotation and matrix structure.
Maybe this is just my phone; I'm trying to find videos taken with newer Android phones.
My question, however, is whether there are other tools or a different way to determine the orientation that will work with all devices.
EDIT:
I just checked a video from a Samsung Galaxy S II, and I can get the orientation from exiftool. So it's not a problem with all Android phones.
My Android phone is an HTC Desire running Android 2.2. And actually (I didn't even notice before) a portrait video will not be correctly oriented even when played on the phone. So I guess it's not about the tools; the orientation data just doesn't seem to be correct at all.
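For what it's worth, on Apple platforms AVFoundation exposes the same rotation matrix directly through the video track's preferredTransform, so a small sketch like the one below reads the orientation without shelling out to exiftool. It is only a partial answer: it reads the matrix the device wrote, so it cannot recover an orientation that was never recorded, which seems to be exactly the HTC Desire's problem.

```swift
import Foundation
import AVFoundation

/// Rough sketch: read the rotation angle from a video track's preferred transform.
/// This is the same matrix exiftool/mediainfo report, so it cannot fix files where
/// the recording device never wrote a rotation in the first place.
func rotationDegrees(forVideoAt url: URL) -> Int? {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    let t = track.preferredTransform
    // The rotation is encoded in the affine transform; atan2 of its first column
    // recovers the angle: 0 = as recorded, 90/-90 = portrait, 180 = upside down.
    let radians = atan2(Double(t.b), Double(t.a))
    return Int((radians * 180 / .pi).rounded())
}

// Hypothetical usage:
// let degrees = rotationDegrees(forVideoAt: URL(fileURLWithPath: "clip.mp4"))
```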
-
How to estimate bandwidth / speed requirements for real-time streaming video?
19 June 2016, by Vivek Seth
For a project I'm working on, I'm trying to stream video to an iPhone through its headphone jack. My estimated bitrate is about 200 kbps (if I'm wrong about this, please ignore it).
I'd like to squeeze as much performance out of this bitrate as possible, and sound is not important to me, only video. My understanding is that to stream real-time video I will need to encode it with some codec on the fly and send compressed frames to the iPhone for it to decode and render. Based on my research, it seems that H.265 is one of the most space-efficient codecs available, so I'm considering using that.
Assuming my basic understanding of live streaming is correct, how would I estimate the FPS I could achieve for a given resolution using the H.265 codec?
The best solution I can think of is to take a video file, encode it with H.265, and trim it to 1 minute in length to see how large the file is. The issue I see with this approach is that my calculations would include some overhead from the video container format (AVI, MKV, etc.) and from the audio channels that I don't care about.
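One way to sanity-check the numbers without worrying about container or audio overhead is plain arithmetic: assume a bits-per-pixel budget for HEVC and divide the link bitrate by the cost of one frame. The sketch below does that; the 0.03 bpp default is an assumption, not a measured figure. If you do measure a real encode, writing a raw elementary stream (for example with ffmpeg, dropping audio with -an and outputting a .hevc bitstream) sidesteps the container and audio channels entirely.

```swift
import Foundation

/// Back-of-the-envelope sketch: estimate the frame rate a given bitrate could
/// sustain at a given resolution, using a bits-per-pixel heuristic. The default
/// 0.03 bpp for HEVC is an assumption (plausible for low-motion content); real
/// results depend heavily on the content and encoder settings.
func estimatedFPS(bitrateBitsPerSecond: Double,
                  width: Double,
                  height: Double,
                  bitsPerPixel: Double = 0.03) -> Double {
    let bitsPerFrame = width * height * bitsPerPixel
    return bitrateBitsPerSecond / bitsPerFrame
}

// Example: a 200 kbps link at 320x240 and ~0.03 bpp gives roughly an 87 fps
// budget, while 640x480 drops the same budget to about 22 fps.
print(estimatedFPS(bitrateBitsPerSecond: 200_000, width: 320, height: 240))
```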