
Media (1)
- Revolution of Open-source and film making towards open film making
6 October 2011
Updated: July 2013
Language: English
Type: Text
Other articles (34)
- Depositing media and themes via FTP
31 May 2013
The MediaSPIP tool also processes media transferred via FTP. If you prefer to deposit files this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
From the start you will find the following folders in your FTP space:
config/ : the site's configuration folder
IMG/ : media already processed and online on the site
local/ : the website's cache directory
themes/ : custom themes and stylesheets
tmp/ : working folder (...)
- Selection of projects using MediaSPIP
2 May 2011
The examples below are representative of specific uses of MediaSPIP for particular projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and, at the national level, among the half-dozen associations of its kind. Its members (...)
- Encoding and conversion into formats readable on the Internet
10 April 2011
MediaSPIP converts and re-encodes uploaded documents in order to make them readable on the Internet and automatically usable without any intervention from the content creator.
Videos are automatically encoded into the formats supported by HTML5: MP4, OGV and WebM. The "MP4" version is also used for the fallback Flash player needed by older browsers.
Audio documents are likewise re-encoded into the two formats usable with HTML5: MP3 and Ogg. The "MP3" version (...)
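As a rough illustration of the conversions this excerpt describes, the same targets can be produced with ffmpeg. This is only a sketch, not MediaSPIP's actual pipeline; file names, qualities and bitrates are illustrative.

# Video: the three HTML5 formats mentioned above
ffmpeg -i source.avi -c:v libx264 -crf 22 -c:a aac -b:a 128k video.mp4
ffmpeg -i source.avi -c:v libtheora -q:v 6 -c:a libvorbis -q:a 4 video.ogv
ffmpeg -i source.avi -c:v libvpx -b:v 1M -c:a libvorbis video.webm

# Audio: the two HTML5 formats mentioned above
ffmpeg -i source.wav -c:a libmp3lame -b:a 192k audio.mp3
ffmpeg -i source.wav -c:a libvorbis -q:a 5 audio.ogg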
On other sites (3916)
- APPLY Strong to Buffer rule. Quit Switching Bit rates MPEG DASH
14 July 2015, by Vinay
I am using MPEG-DASH for adaptive bitrate streaming of video from my server.
I have used ffmpeg and MP4Box to generate 4 different quality video files from my source .mp4
The generated .mpd file is shown below:
<?xml version="1.0"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" minBufferTime="PT1.500000S" type="static" mediaPresentationDuration="PT0H3M1.42S" profiles="urn:mpeg:dash:profile:isoff-on-demand:2011">
 <ProgramInformation moreInformationURL="http://gpac.sourceforge.net">
 </ProgramInformation>
 <Period duration="PT0H3M1.42S">
  <AdaptationSet segmentAlignment="true" maxWidth="1920" maxHeight="1080" maxFrameRate="24" par="16:9" lang="und" subsegmentStartsWithSAP="1">
   <Representation mimeType="video/mp4" codecs="avc1.64000d" width="320" height="240" frameRate="24" sar="1:1" startWithSAP="1" bandwidth="375715">
    <BaseURL>400_dashinit.mp4</BaseURL>
    <SegmentBase indexRangeExact="true" indexRange="904-1403">
     <Initialization range="0-903"/>
    </SegmentBase>
   </Representation>
   <Representation mimeType="video/mp4" codecs="avc1.640015" width="420" height="270" frameRate="24" sar="1:1" startWithSAP="1" bandwidth="644824">
    <BaseURL>700_dashinit.mp4</BaseURL>
    <SegmentBase indexRangeExact="true" indexRange="905-1404">
     <Initialization range="0-904"/>
    </SegmentBase>
   </Representation>
   <Representation mimeType="video/mp4" codecs="avc1.64001f" width="1024" height="576" frameRate="24" sar="1:1" startWithSAP="1" bandwidth="1349484">
    <BaseURL>1500_dashinit.mp4</BaseURL>
    <SegmentBase indexRangeExact="true" indexRange="905-1404">
     <Initialization range="0-904"/>
    </SegmentBase>
   </Representation>
   <Representation mimeType="video/mp4" codecs="avc1.64001f" width="1280" height="720" frameRate="24" sar="1:1" startWithSAP="1" bandwidth="2264379">
    <BaseURL>2500_dashinit.mp4</BaseURL>
    <SegmentBase indexRangeExact="true" indexRange="905-1404">
     <Initialization range="0-904"/>
    </SegmentBase>
   </Representation>
   <Representation mimeType="video/mp4" codecs="avc1.640028" width="1920" height="1080" frameRate="24" sar="1:1" startWithSAP="1" bandwidth="3633049">
    <BaseURL>4000_dashinit.mp4</BaseURL>
    <SegmentBase indexRangeExact="true" indexRange="906-1405">
     <Initialization range="0-905"/>
    </SegmentBase>
   </Representation>
  </AdaptationSet>
 </Period>
</MPD>
I am using video.js along with dash.js to play back the MPEG-DASH content on the client side. The issue is that the video doesn't play back reliably when I simulate network conditions from Chrome dev tools.
It works at times and not at others. For example, the stream starts at a bitrate of 400 kbps, then detects enough available bandwidth and switches to 2500 kbps. When I then bring my bandwidth back down to 400 kbps, the video freezes at some point.
At times the video freezes after the first few seconds of playback, when it tries to switch streams. I think there might be some command-line parameter that I am missing when generating my video files via ffmpeg or the .mpd file via MP4Box.
Below are the commands I use for ffmpeg and MP4Box:
ffmpeg -y -i inputfile -c:a libfdk_aac -ac 2 -ab 128k -c:v libx264 -r 24 -g 24 -b:v 1500k -maxrate 1500k -bufsize 1000k -vf "scale=-1:720" outputfile.mp4
MP4Box -dash [DURATION] -rap -frag-rap -profile [PROFILE] -out [path/to/output.file] [path/to/input1.file] [path/to/input2.file] [path/to/input3.file]
Also, while generating the .mpd file via MP4Box, I get the warnings below in the console:
[DASH]: Files have non-proportional track layouts (320x240 vs 420x270) but sample size and aspect ratio match, assuming precision issue
[DASH]: Files have non-proportional track layouts (320x240 vs 1024x576) but sample size and aspect ratio match, assuming precision issue
[DASH]: Files have non-proportional track layouts (320x240 vs 1280x720) but sample size and aspect ratio match, assuming precision issue
[DASH]: Files have non-proportional track layouts (320x240 vs 1920x1080) but sample size and aspect ratio match, assuming precision issue
Whenever the video stops playing, the Chrome console has these logs:
Number of times the buffer has run dry: 25
Apply STRONG to buffer rule.
Quit switching bit rates.
I don't have any clue as to why the buffer runs dry and playback stops switching bitrates.
Is there anything obviously wrong in this process?
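For what it's worth, freezes at bitrate switches are often caused by renditions whose keyframes (and therefore segments) are not aligned across qualities, so the player cannot switch cleanly. Below is a hedged sketch of how the encode and packaging steps could be adjusted so every rendition uses a fixed GOP and MP4Box cuts aligned segments; the 720p values, the 4-second segment length and the file names are illustrative, not taken from the question.

# One encode per rendition; repeat with different -b:v/-maxrate/scale values.
# -g/-keyint_min force a keyframe every 24 frames (1 second at 24 fps), and
# -sc_threshold 0 disables extra scene-cut keyframes so renditions stay aligned.
ffmpeg -y -i input.mp4 -c:a libfdk_aac -ac 2 -b:a 128k \
  -c:v libx264 -r 24 -g 24 -keyint_min 24 -sc_threshold 0 \
  -b:v 1500k -maxrate 1500k -bufsize 3000k -vf "scale=-2:720" 1500_720p.mp4

# Package all renditions into one MPD with 4-second, RAP-aligned segments.
MP4Box -dash 4000 -rap -frag-rap -profile onDemand -out manifest.mpd \
  400_240p.mp4 700_270p.mp4 1500_576p.mp4 2500_720p.mp4 4000_1080p.mp4

The quoted "Apply STRONG to buffer rule" / "Quit switching bit rates" messages also appear to come from an older dash.js build, so upgrading the player library is worth trying independently of the packaging.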
- Is there any open source solution to display a remote stream inside a HoloLens 2 UWP Vuforia application?
19 April 2023, by T777
What do we need?


We are trying to develop an application for quality management in which we show a hologram on a metal part as an assistance marking (using HoloLens 2 + Vuforia + Model Targets). The employee uses a sensor to follow this assistance marking, and the data is analyzed live by a test device. The results are output on a screen / are visible in a closed-source application from the manufacturer of the test device.


Capturing the video output:
The current plan is to capture the video stream of the test device via a capture card, add a video panel (via MRTK2) inside the Vuforia app, and stream the captured video to the HoloLens 2 using OBS or an OpenCV Python script for screen recording.


What we have tried so far


1) Sending a raw UDP stream
via RMTP, decoding and converting with a GStreamer server, and writing our own receiving library in Unity.
Result: temporarily stopped, because receiving the UDP streams requires connection/session management (signalling), frame syncing, and agreement on video size, color format, frame rate, etc., and we have no solution for that.
Implementing any of this ourselves would be highly complex and consume a lot of time.
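To make the "agreement on video size, color format, frame rate" problem concrete: with plain RTP over UDP the receiver cannot discover the stream parameters on its own; they have to be passed out of band, which is what a signalling protocol would normally handle. A minimal GStreamer sketch, with host, port and caps purely illustrative:

# Sender: grab the screen (ximagesrc is X11-only; Windows needs a different source),
# encode H.264 and send RTP over UDP.
gst-launch-1.0 ximagesrc ! videoconvert ! x264enc tune=zerolatency ! \
  rtph264pay ! udpsink host=192.168.0.10 port=5000

# Receiver: the caps (codec, payload type, clock rate) must be supplied manually,
# because nothing in the raw UDP stream announces them.
gst-launch-1.0 udpsrc port=5000 \
  caps="application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96, clock-rate=(int)90000" ! \
  rtph264depay ! avdec_h264 ! videoconvert ! autovideosink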


2) Using available protocols that I could find on the web
There are already several protocols developed for session creation and streaming:


- HTTP streaming (HLS) (Transport + Session)
- RTMP (Transport + Session)
- RTP (Transport) + RTSP (Session)
- WebRTC: possible with different protocol stacks: RTMP/TCP/UDP (transport) + SDP (a standardized format for video parameters) + ICE (p2p) / WHIP (HTTP, client-server) / WebSocket (client-server) as signalling protocols, and there are some good open-source streaming servers (GStreamer, MediaMTX and SRS)










When using these, the video will typically be encoded with H.264 and needs to be decoded on the HoloLens 2. There are APIs for C/C++ native (hardware) decoding libraries, such as unity-vlc and ffmpeg.NET, which require the ffmpeg media library. I could figure out (not tested) that there is a hardware H.264 decoder on the HoloLens 2, but I have no clue how to access it, since I couldn't discover any information about HoloLens 2 media libraries.
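As a concrete example of the server route: one of the open-source servers named above (MediaMTX, which by default accepts RTSP publishers on port 8554) could be fed from the capture card with ffmpeg, so the HoloLens side only has to play one well-defined stream. A rough sketch; the DirectShow device name and the stream path are hypothetical:

# Publish the capture-card feed as H.264 over RTSP to a local MediaMTX instance.
ffmpeg -f dshow -i video="USB Capture Card" \
  -c:v libx264 -preset ultrafast -tune zerolatency -g 30 \
  -f rtsp rtsp://localhost:8554/testdevice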


3) Using Unity packages


- The Unity package WebRTC (https://docs.unity3d.com/Packages/com.unity.webrtc@2.4/manual/index.html) supports multiple transport protocols, but it seems to have no signalling mechanism.


- The Unity package Render Streaming (https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/index.html) is a fully integrated Unity-to-Unity and Unity-to-browser streaming package with an integrated streaming server and web GUI. It offers various streaming protocols (TCP, UDP, RTMP) and signalling mechanisms over WebSocket, HTTP (seemingly custom, not WHIP) or Furioos.
BUT it doesn't support UWP, as noted in the documentation. We implemented an example application and could demonstrate it working with Vuforia, but it fails to build for the UWP target due to missing libraries.
Similar to: https://www.youtube.com/watch?v=nHRC0uGBnn8








I will be testing other compile options tomorrow.


- Mixed Reality WebRTC (https://github.com/microsoft/MixedReality-WebRTC):
Various protocol support; Microsoft brought WebRTC specifically to HoloLens.
Deprecated, and as far as I can see it only supports HoloLens 1 and ARM32, so I cannot evaluate whether trying it is worth the effort.




What are the next options?


- Developing a raw UDP streaming library with Unity directly.
- Rebuilding the application to be VisionLib (ARM32) compatible, together with MixedReality-WebRTC (ARM32).
- Porting ffmpeg + an API to UWP?
- There also seem to be some efforts to make WebRTC in general available on UWP platforms: https://github.com/microsoft/winrtc










The questions


- Does Vuforia support ARM32?
- How can the hardware decoder of the HoloLens 2 be accessed from Unity code?






- Can ffmpeg concatenate mp3 files using the process at audio-joiner.com?
7 June 2020, by Ed999
I have a dozen or more mp3 audio files, which I need to concatenate into a single mp3 file. The files all have the same bitrate (320 kbps) and sample rate (44.1 kHz), but they all have differing durations.



I have studied the three methods of concatenation recommended on stackoverflow (How to concatenate two MP4 files using FFmpeg). One method actually works, but when I play back the output file I find that there are noticeable audio artifacts (audible glitches) at each join point.



I've been told that this problem is caused by the input files not having identical duration. This seems likely, because I've had some successes in concatenating audio files with identical bit rate, sample rate, and duration.



I have seen, online, some much more complex scripts which are, at present, miles beyond my understanding. One solution I was directed to required a fairly deep knowledge of Python!



However, my research also included a site at audio-joiner.com - and this had the only completely successful method I've yet found, for files of non-identical duration. That site processed some of my input files, joined the multiple files into one, and the concatenated output file it produced did not have any audible glitches at the joins.



I looked into the process the site was using, hoping to get a clue as to where I've been going wrong, but the script on the site (which looks like ajax-based javascript) is too complex for me to follow.



Because the process seemed to take quite a long time, I wouldn't be too surprised to learn that the mp3 input files are being converted to some other audio format, joined, then converted back to mp3 for the output. But if so, that wouldn't put me off using the process.



Is anyone familiar with the approach being used, and can anyone say whether it might be reproducible using ffmpeg?
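For what it's worth, ffmpeg can reproduce the "decode, join, re-encode" round trip in a single command with the concat filter: every input is decoded to PCM, the PCM is concatenated, and the result is encoded once at the end. A sketch for three files (names and bitrate illustrative; add one [N:a] pad per extra input and raise n accordingly):

ffmpeg -i one.mp3 -i two.mp3 -i three.mp3 \
  -filter_complex "[0:a][1:a][2:a]concat=n=3:v=0:a=1[out]" \
  -map "[out]" -c:a libmp3lame -b:a 320k joined.mp3

Because the joins are made on decoded PCM rather than on MP3 frame boundaries, this usually removes the audible clicks, at the price of one extra generation of lossy encoding.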



.



ADDED -



There are 7 scripts, in all, listed in the source of the relevant page:



https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js
https://cdnjs.cloudflare.com/ajax/libs/jquery/1.12.0/jquery.min.js
https://static.123apps.com/js/socket.io.js
https://static.123apps.com/js/shared_0.js
https://static.123apps.com/js/shared_1.js
https://static.123apps.com/js/ajoiner.js
https://ajax.googleapis.com/ajax/libs/swfobject/2.2/swfobject.js




.



ADDED -



The successful (JavaScript) function seems to be this, but it isn't obvious to me why it succeeds (too complex for me!). Can anyone suggest what approach it is taking? For example, is it transcoding the mp3 files to an intermediate format and concatenating the intermediate files?
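// What the script below does: start_join()/cancel_join() are entry points called from
// the page's Flash widget ("theSWF", loaded via swfobject). set_params() attaches uid,
// language and host information to the parameters and emits a "join" message over
// socket.io; the socket.on("join") handler merely relays "progress", "final_result" and
// "error" messages back to the SWF. The concatenation itself therefore appears to run
// server-side at 123apps, not in this browser script.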



function start_join(e){
 l("start_join():"),l(e);
 var t;
 return(t=$.parseJSON(e)) && $("#ajoiner").ajoiner("set_params",t),!0
}

function cancel_join(e){
 return l("cancel_join():"),l(e),!0
}

!function(o){
 var t={
 init:function(e){
 var t=o(this),n=o.extend({lang:{cancel:"Cancel",download:"Download"}},e);
 o(this).data("ajoiner",{o:n,tmp_i:1,pid:-1,params:{}});
 t.data("ajoiner");
 t.ajoiner("_connect"),o("body").bind("socket_connected",function(){t.ajoiner("_connect")})
 },set_params:function(e){
 var t=o(this).data("ajoiner");
 isset(e)?(e.uid=Cookies.get("uid"),t.params=e,t.params.lang_id=lang_id,t.params.host=location.hostname,t.params.hostprotocol=location.protocol,l("socket emit join:"),l(t.params),socket.emit("join",t.params)):error("set_params: params not set")
 },_connect:function(){

 var t=o(this).data("ajoiner");

 l("_connect"),socket.on("join",function(e){
 "progress"==e.message_type?(t.tmp_i,t.tmp_i++,void 0!==getObj("theSWF")&&(getObj("theSWF").set_join_progress(parseInt(e.progress_value)),l("SWF.set_join_progress("+parseInt(e.progress_value)+")")),isset(e.pid)&&(t.pid=e.pid)):"final_result"==e.message_type?(void(e.tmp_i=0)!==getObj("theSWF")&&(getObj("theSWF").join_finished(o.stringifyJSON(e)),l("SWF.join_finished('"+o.stringifyJSON(e)+"')")),last_conv_result=e):"error"==e.message_type&&l(e.error_desc)
 }
 )},_cancel_convert:function(){
 var e=o(this).data("ajoiner");
 0