
Media (1)
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
Other articles (97)
-
The sounds
15 May 2013, by
-
Automated installation script of MediaSPIP
25 April 2011, by
To overcome the difficulties mainly due to the installation of server-side software dependencies, an "all-in-one" installation script written in bash was created to facilitate this step on a server with a compatible Linux distribution.
You must have SSH access to your server and a root account to use it, which will install the dependencies. Contact your provider if you do not have that.
The documentation on how to use this installation script is available here.
The code of this (...)
-
MediaSPIP in private mode (Intranet)
17 September 2013, by
Starting with version 0.3, a MediaSPIP channel can be made private, blocked to anyone who is not logged in, thanks to the "Intranet/extranet" plugin.
When enabled, the Intranet/extranet plugin blocks access to the channel for any unidentified visitor, preventing them from accessing the content by systematically redirecting them to the login form.
This system can be particularly useful for certain uses, such as: a workshop with children whose content must not (...)
On other sites (9933)
-
Is there any open source solution to display a remote stream inside a HoloLens 2 UWP Vuforia application?
19 April 2023, by T777
What do we need?


We are trying to develop an application for quality management in which we show a hologram on a metal part as an assistance marking (using HoloLens 2 + Vuforia + Model Targets). The employee uses a sensor to follow this assistance marking and the data is analyzed live by a test device. The results are output on a screen / are visible in a closed-source application from the manufacturer of the test device.


Capturing the video output:
The current plan is to capture the video stream of the test device via a capture card, add a video panel (via MRTK2) inside the Vuforia app, and stream the captured video to the HoloLens 2 using OBS or an OpenCV Python script for screen recording.
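
As a rough sketch of what the capture-and-stream side could look like if we drive GStreamer directly instead of OBS (assumptions: a Linux sender PC, the capture card showing up as /dev/video0, and placeholder address, port and element choices), using gstreamer-rs with the pre-0.21 API:

// Sender sketch: capture card -> H.264 -> RTP -> UDP.
use gstreamer as gst;
use gst::prelude::*;

fn main() {
    gst::init().unwrap();
    let pipeline = gst::Pipeline::new(None);

    // Capture card exposed as a V4L2 device (device path is a placeholder).
    let src = gst::ElementFactory::make("v4l2src")
        .property("device", "/dev/video0")
        .build()
        .expect("v4l2src");
    let convert = gst::ElementFactory::make("videoconvert")
        .build()
        .expect("videoconvert");
    // Low-latency H.264 encoding, then RTP payloading so any standard
    // RTP/RTSP/WebRTC stack can consume the stream.
    let enc = gst::ElementFactory::make("x264enc")
        .property_from_str("tune", "zerolatency")
        .build()
        .expect("x264enc");
    let pay = gst::ElementFactory::make("rtph264pay")
        .build()
        .expect("rtph264pay");
    let sink = gst::ElementFactory::make("udpsink")
        .property("host", "192.168.0.42") // receiver / HoloLens IP (placeholder)
        .property("port", 5000i32)
        .build()
        .expect("udpsink");

    pipeline
        .add_many(&[&src, &convert, &enc, &pay, &sink])
        .unwrap();
    gst::Element::link_many(&[&src, &convert, &enc, &pay, &sink]).unwrap();

    pipeline.set_state(gst::State::Playing).unwrap();

    // Run until an error or end-of-stream appears on the bus.
    let bus = pipeline.bus().unwrap();
    use gst::MessageView;
    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        match msg.view() {
            MessageView::Error(err) => {
                eprintln!("Error: {}", err.error());
                break;
            }
            MessageView::Eos(_) => break,
            _ => {}
        }
    }
    pipeline.set_state(gst::State::Null).unwrap();
}

This only covers the sender PC; the open question remains the receiving and decoding side on the HoloLens 2.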


What we have tried so far


1) Sending a raw UDP stream
via RTMP, decoding and converting with a GStreamer server, and writing our own library in Unity for receiving.
Result: temporarily stopped, because receiving the UDP streams needs connection/session management (signalling), frame syncing and agreement on video size, color format, frame rate etc., and we have no solution for that.
Implementing any of this ourselves would be highly complex and very time-consuming.


2) Using available protocols that I could find on the web
There are already some protocols developed for session creation and streaming:


- HTTP streaming (HLS) (transport + session)
- RTMP (transport + session)
- RTP (transport) + RTSP (session)
- WebRTC: possible with different protocol stacks - RTMP/TCP/UDP (transport) + SDP (a standardized format for video parameters) + ICE (p2p) / WHIP (HTTP, client-server) / WebSocket (client-server) as signalling protocols - and there are some good open source streaming servers (GStreamer, MediaMTX and SRS)


When using these, the video will typically be encoded with H.264 (x264) and needs to be decoded on the HoloLens 2. There are APIs for C/C++ native (hardware) decoding libraries, like unity-vlc and ffmpeg.NET, which need the ffmpeg media libraries. I could figure out (not tested) that there is a hardware H.264 decoder on the HoloLens 2, but I have no clue how to access it, since I couldn't discover any information about HoloLens 2 media libraries.
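
For reference, the receiving side has to reverse that chain (RTP depayload, parse, H.264 decode, render). The sketch below is a plain desktop gstreamer-rs pipeline - it will not run on the HoloLens 2 / UWP itself, it only illustrates the depay/parse/decode chain that a Unity or UWP receiver would have to replicate; the port, payload type and software decoder (avdec_h264 from gst-libav) are assumptions:

// Receiver sketch: UDP -> RTP depayload -> H.264 decode -> display.
use gstreamer as gst;
use gst::prelude::*;

fn main() {
    gst::init().unwrap();
    let pipeline = gst::Pipeline::new(None);

    // udpsrc needs the RTP caps up front, because a raw UDP stream carries
    // no stream description (this is part of the signalling problem).
    let caps = gst::Caps::builder("application/x-rtp")
        .field("media", "video")
        .field("encoding-name", "H264")
        .field("clock-rate", 90000i32)
        .field("payload", 96i32)
        .build();
    let src = gst::ElementFactory::make("udpsrc")
        .property("port", 5000i32)
        .property("caps", &caps)
        .build()
        .expect("udpsrc");
    let depay = gst::ElementFactory::make("rtph264depay").build().expect("rtph264depay");
    let parse = gst::ElementFactory::make("h264parse").build().expect("h264parse");
    let decode = gst::ElementFactory::make("avdec_h264").build().expect("avdec_h264");
    let convert = gst::ElementFactory::make("videoconvert").build().expect("videoconvert");
    let sink = gst::ElementFactory::make("autovideosink").build().expect("autovideosink");

    pipeline
        .add_many(&[&src, &depay, &parse, &decode, &convert, &sink])
        .unwrap();
    gst::Element::link_many(&[&src, &depay, &parse, &decode, &convert, &sink]).unwrap();

    pipeline.set_state(gst::State::Playing).unwrap();

    // Same bus loop as in the sender sketch: block until error or EOS.
    let bus = pipeline.bus().unwrap();
    use gst::MessageView;
    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        match msg.view() {
            MessageView::Error(err) => {
                eprintln!("Error: {}", err.error());
                break;
            }
            MessageView::Eos(_) => break,
            _ => {}
        }
    }
    pipeline.set_state(gst::State::Null).unwrap();
}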


3) Using Unity packages


-
The Unity package WebRTC (https://docs.unity3d.com/Packages/com.unity.webrtc@2.4/manual/index.html) supports multiple transport protocols but seems to have no signalling mechanism.
-
The Unity package Render Streaming (https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/index.html) is a fully integrated Unity-to-Unity and Unity-to-browser streaming package with an integrated streaming server and web GUI. It offers various streaming protocols (TCP, UDP, RTMP) and signalling over WebSocket, HTTP (seems custom, not WHIP) or Furioos.
BUT it does not support UWP, as noted in the documentation. Implementing an example application, we could demonstrate a working example with Vuforia, but the build fails for target UWP due to missing libraries.
Similar to: https://www.youtube.com/watch?v=nHRC0uGBnn8


Will be testing other compile options tomorrow..


- Mixed Reality WebRTC (https://github.com/microsoft/MixedReality-WebRTC):
Various protocol support; Microsoft brought WebRTC specifically to the HoloLens.
Deprecated, and as far as I can see it only supports HoloLens 1 and ARM32, so I cannot evaluate whether trying it is worth it.




What are the next options?


- Developing a raw UDP streaming library in Unity directly.
- Rebuilding the application to be compatible with VisionLib (ARM32) and MixedReality-WebRTC (ARM32).
- Porting ffmpeg + an API to UWP?
- There also seem to be some efforts to make WebRTC available for UWP platforms in general: https://github.com/microsoft/winrtc



The questions


- Does Vuforia support ARM32?
- How can we access the hardware decoder of the HoloLens 2 via Unity code?


-
Cycle video display ffmpeg
10 April 2023, by Lucas McKamey
I'm new to ffmpeg and I've been struggling to get this to work, so I thought I'd reach out to the community.
Essentially, I was wondering how I could write an ffmpeg command that switches every 10 seconds between displaying one of the input videos I provide. Ideally, there would be a crossfade transition when switching between videos.
As an example, let's say I have 3 inputs (v1, v2, v3), each one minute long. The output video would look something like this:


v1(0-10s) -> v2(10-20s) -> v3(20-30s) -> v1(30-40s) -> v2(40-50s) -> v3(50-60s) where the arrows would be crossfade transitions.
Any advice or input you could provide on how I could start on something like this would be a huge help.
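
One possible starting point seems to be ffmpeg's trim and xfade filters. Below is a minimal sketch that drives them from a small Rust wrapper (the file names, the 1-second fade and the libx264 output settings are assumptions, and audio is ignored); the same filter_complex string can be passed to ffmpeg directly on the command line:

// Sketch: show v1 for 0-10 s, v2 for 10-20 s, v3 for 20-30 s, with 1 s
// crossfades at each switch. Each trim keeps one extra second of material
// so the fade has something to blend with; setpts resets the timestamps.
// xfade expects all inputs to share resolution and frame rate.
use std::process::Command;

fn main() {
    let filter = "\
        [0:v]trim=start=0:end=11,setpts=PTS-STARTPTS[a];\
        [1:v]trim=start=10:end=21,setpts=PTS-STARTPTS[b];\
        [2:v]trim=start=20:end=31,setpts=PTS-STARTPTS[c];\
        [a][b]xfade=transition=fade:duration=1:offset=10[ab];\
        [ab][c]xfade=transition=fade:duration=1:offset=20[out]";

    let status = Command::new("ffmpeg")
        .args([
            "-i", "v1.mp4", "-i", "v2.mp4", "-i", "v3.mp4",
            "-filter_complex", filter,
            "-map", "[out]",
            "-c:v", "libx264",
            "cycled.mp4",
        ])
        .status()
        .expect("failed to run ffmpeg");
    println!("ffmpeg exited with: {status}");
}

Extending the cycle (v1 again at 30-40 s, v2 at 40-50 s, and so on) follows the same pattern: one more trim per slice and one more xfade stage, where each offset is the length of the chain built so far minus the fade duration.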


Best,
Lucas


-
Gstreamer convert and display video v4l2 - tee problems in rust
27 March 2023, by d3im
I have a USB grabber v4l2 source and I want to tee the stream to an autovideosink and to x264enc into a file (for now a fakesink black hole).


When I disable one branch or the other it works, but with both together the pipeline goes:


Pipeline state changed from Null to Ready
Pipeline state changed from Ready to Paused



and stays there, never switching to Playing.


gst-launch-1.0 with similar functionality works well.


// Source -> caps filter -> queue -> tee
gst::Element::link_many(&[&pw_video, &v_caps, &vid_queuey, &vid_tee]).unwrap();
// Display branch: queue -> convert -> queue -> display sink
gst::Element::link_many(&[&vid_queue1, &autovideoconvert, &vid_queuex, &autovideosink]).unwrap();
// Encoder branch: queue -> convert -> queue -> x264enc -> queue -> fakesink
gst::Element::link_many(&[&vid_queue2, &autovideoconvert_x264, &vid_queue3, &x264, &vid_queue4, &fake]).unwrap();

// Request one tee src pad per branch and link it to that branch's queue.
let tee_display_pad = vid_tee.request_pad_simple("src_10").unwrap();
let vid_queue1_pad = vid_queue1.static_pad("sink").unwrap();

tee_display_pad.link(&vid_queue1_pad).unwrap();

let tee_convert_pad = vid_tee.request_pad_simple("src_20").unwrap();
let vid_queue2_pad = vid_queue2.static_pad("sink").unwrap();

tee_convert_pad.link(&vid_queue2_pad).unwrap();



How can I use tee properly in Rust to get a playable pipeline with two branches?


Update: I read some posts about increasing queue sizes, so I tried this for one queue and then for all queues:


let vid_queue1 = gst::ElementFactory::make("queue")
    .name("queue1")
    .property("max-size-buffers", 5000 as u32)
    .property("max-size-bytes", 1048576000 as u32)
    .property("max-size-time", 60000000000 as u64)
    .build()
    .expect("queue1");



but it didn't help, so I tried setting zero latency:


let x264 = gst::ElementFactory::make("x264enc")
    .name("x264")
    .property_from_str("speed-preset", "ultrafast")
    .property_from_str("pass", "qual")
    .property_from_str("tune", "zerolatency")
    .property("quantizer", 0 as u32)
    .property("threads", 8 as u32)
    .build()
    .expect("!x264");



and it works now. But the comparable gst-launch-1.0 settings didn't have such an option - only the queue sizes were increased.


Is there any other option than setting zerolatency?
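
Not a verified fix for this exact pipeline, but one alternative that is sometimes suggested for tee setups where an x264enc branch blocks preroll is to take that branch out of the preroll logic instead of forcing zerolatency: set async=false (and usually sync=false) on the sink that terminates the encoder branch. A self-contained sketch, with videotestsrc standing in for the v4l2/pipewire grabber:

// Encoder keeps its default settings (no zerolatency); instead the fakesink
// that terminates the encoder branch opts out of preroll with async=false.
use gstreamer as gst;
use gst::prelude::*;

fn main() {
    gst::init().unwrap();
    let pipeline = gst::Pipeline::new(None);

    let src = gst::ElementFactory::make("videotestsrc")
        .property("is-live", true)
        .build()
        .expect("videotestsrc");
    let tee = gst::ElementFactory::make("tee").build().expect("tee");

    // Display branch.
    let q_disp = gst::ElementFactory::make("queue").build().expect("queue");
    let conv_disp = gst::ElementFactory::make("videoconvert").build().expect("videoconvert");
    let disp = gst::ElementFactory::make("autovideosink").build().expect("autovideosink");

    // Encoder branch.
    let q_enc = gst::ElementFactory::make("queue").build().expect("queue");
    let conv_enc = gst::ElementFactory::make("videoconvert").build().expect("videoconvert");
    let x264 = gst::ElementFactory::make("x264enc").build().expect("x264enc");
    let fake = gst::ElementFactory::make("fakesink")
        // Don't hold the Paused -> Playing transition waiting for this
        // branch to preroll (the idea: x264enc's startup latency then
        // cannot keep the whole pipeline stuck in Paused).
        .property("async", false)
        // Don't throttle the encoder branch against the clock.
        .property("sync", false)
        .build()
        .expect("fakesink");

    pipeline
        .add_many(&[&src, &tee, &q_disp, &conv_disp, &disp, &q_enc, &conv_enc, &x264, &fake])
        .unwrap();
    gst::Element::link_many(&[&src, &tee]).unwrap();
    gst::Element::link_many(&[&q_disp, &conv_disp, &disp]).unwrap();
    gst::Element::link_many(&[&q_enc, &conv_enc, &x264, &fake]).unwrap();

    // One requested tee src pad per branch, as in the code above.
    tee.request_pad_simple("src_%u").unwrap()
        .link(&q_disp.static_pad("sink").unwrap())
        .unwrap();
    tee.request_pad_simple("src_%u").unwrap()
        .link(&q_enc.static_pad("sink").unwrap())
        .unwrap();

    pipeline.set_state(gst::State::Playing).unwrap();
    let bus = pipeline.bus().unwrap();
    use gst::MessageView;
    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        match msg.view() {
            MessageView::Error(err) => {
                eprintln!("Error: {}", err.error());
                break;
            }
            MessageView::Eos(_) => break,
            _ => {}
        }
    }
    pipeline.set_state(gst::State::Null).unwrap();
}

Whether this avoids the stall in your setup is something you would have to verify; the approaches (deeper queues, zerolatency, async=false on the encoder-branch sink) can also be combined.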