
Other articles (98)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
Multilang: improving the interface for multilingual blocks
18 February 2011
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
Once it is activated, MediaSPIP init automatically puts a preconfiguration in place so that the new feature works out of the box. No separate configuration step is therefore required. -
Organising by category
17 May 2013
In MediaSPIP, a section goes by two names: category and section (rubrique).
The various documents stored in MediaSPIP can be filed under different categories. You can create a category by clicking "publish a category" in the publish menu at the top right (after logging in). A category can itself be filed under another category, which means you can build a whole tree of categories.
The next time a document is published, the newly created category will be offered (...)
On other sites (11219)
-
Merge individual frames into a video file using OpenCV
18 August 2022, by Rohit
I am trying to stack individual frames into a video file using OpenCV. I want to combine two different pieces of code to produce the individual frames.
The following code lets me extract the individual frames:


import cv2

cap = cv2.VideoCapture('input.mp4')                      # source video (name assumed)
object_detector = cv2.createBackgroundSubtractorMOG2()   # detector assumed from context
count = 0

fourcc = cv2.VideoWriter_fourcc(*'mp4v')
# Bug fixes: a comma was missing between fps and the frame size,
# and isColor must be True because colour (BGR) frames are written.
out = cv2.VideoWriter('file_data.mp4', fourcc, 20, (1920, 1080), True)
while True:
    ret, frame = cap.read()
    if not ret:                                          # stop at end of stream
        break
    mask = object_detector.apply(frame)
    _, mask = cv2.threshold(mask, 254, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
    res = cv2.bitwise_and(frame, frame, mask=mask)
    for cnt in contours:
        area = cv2.contourArea(cnt)
        if area > 1000:
            # print("Area of contour:", area)
            cv2.drawContours(frame, [cnt], -1, (0, 255, 0), 2)
    cv2.imwrite("file%d.jpg" % count, frame)             # save the annotated frame
    count += 1
    out.write(frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):                # '&', not 'and'
        break
cap.release()
out.release()



I tried storing the individual frames in an array, but it didn't work. It doesn't show any error; the PC just crashes. The parameters I used:


fps = 20
width = 1920
height = 1080
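
If the crash comes from holding every frame in memory, a minimal sketch of a memory-friendly alternative (the 'stacked.mp4' name and the numeric sort key are assumptions based on the file%d.jpg pattern above) is to write each saved frame straight to the VideoWriter instead of collecting them in an array:

import glob
import cv2

fps = 20
width, height = 1920, 1080

fourcc = cv2.VideoWriter_fourcc(*'mp4v')
# Note the comma between fps and the frame-size tuple.
out = cv2.VideoWriter('stacked.mp4', fourcc, fps, (width, height), True)

# Sort "file0.jpg", "file1.jpg", ... numerically, not lexicographically
# (assumes the files sit in the current directory with that exact pattern).
for path in sorted(glob.glob('file*.jpg'), key=lambda p: int(p[4:-4])):
    frame = cv2.imread(path)
    if frame is None:
        continue
    # Guard against frames that do not match the writer's size.
    frame = cv2.resize(frame, (width, height))
    out.write(frame)  # each frame is freed after its iteration; memory stays flat

out.release()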




-
MPEG DASH - do I need to have audio and video tracks as separate source files for creating a DASH package using mp4box
10 July 2016, by Tarun
I have a single source mp4 and tried to create an MPEG-DASH package from it with MP4Box from GPAC.
I can play the resulting MPD in GPAC's Osmo4 player; however, I cannot play it in the dash.js reference player at http://dashif.org/reference/players/javascript/0.2.3/index.html
When I try to play the MPD there, I get the error "Error creating source buffer".
I read their MPD files and found that they use separate source tracks for audio and video.
Question 1) Does the DASH spec require audio and video to be separate source tracks? (see the MP4Box sketch after the MPD below)
Question 2) Below is the MPD file I created; let me know if you think there is a problem in it:
<MPD type="static" xmlns="urn:mpeg:DASH:schema:MPD:2011" profiles="urn:mpeg:dash:profile:full:2011" minBufferTime="PT1.5S" mediaPresentationDuration="PT0H2M31.63S">
 <ProgramInformation moreInformationURL="http://gpac.sourceforge.net">
 </ProgramInformation>
 <Period start="PT0S" duration="PT0H2M31.63S">
  <AdaptationSet>
   <ContentComponent contentType="video"/>
   <ContentComponent contentType="audio" lang="und"/>
   <SegmentTemplate initialization="flight_init.mp4"/>
   <Representation mimeType="video/mp4" codecs="avc1.64001f,mp4a.40.02" width="1280" height="720" sampleRate="44100" numChannels="2" lang="und" startWithSAP="1" bandwidth="3096320">
    <SegmentTemplate timescale="1000" duration="20164" media="flight_test_flight_3000$Number$.mp4" startNumber="1"/>
   </Representation>
   <Representation mimeType="video/mp4" codecs="avc1.64001e,mp4a.40.02" width="640" height="360" sampleRate="44100" numChannels="2" lang="und" startWithSAP="1" bandwidth="1119428">
    <SegmentTemplate timescale="1000" duration="20099" media="flight_test_flight_1000$Number$.mp4" startNumber="1"/>
   </Representation>
   <Representation mimeType="video/mp4" codecs="avc1.640014,mp4a.40.02" width="320" height="180" sampleRate="44100" numChannels="2" lang="und" startWithSAP="1" bandwidth="722208">
    <SegmentTemplate timescale="1000" duration="20164" media="flight_test_flight_600$Number$.mp4" startNumber="1"/>
   </Representation>
  </AdaptationSet>
 </Period>
</MPD>
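
For reference, a hedged sketch of an MP4Box invocation that feeds the video and audio tracks as separate inputs, so each ends up as its own track in the package (the "source.mp4" filename and the 20 s segment length are assumptions, not taken from the post; the segment length only mirrors the ~20000 ms durations in the MPD above):

# A sketch only: the #video/#audio selectors tell MP4Box to treat each
# track of the same file as a separate input.
MP4Box -dash 20000 -rap -profile dashavc264:live \
 source.mp4#video source.mp4#audio \
 -out flight.mpd

-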
Using ffmpeg to generate a DASH manifest that cannot be played by dash.js
18 March 2019, by Punkhead
I'm using ffmpeg to encode an incoming stream over the RTMP protocol, with the following command:

ffmpeg -re -i rtmp://localhost:1935${StreamPath} -use_timeline 1 \
 -use_template 1 -window_size 10 -min_seg_duration 5000 -f dash out.mpd

The manifest looks like this:
<?xml version="1.0" encoding="utf-8"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="static" mediaPresentationDuration="PT1M36.4S" minBufferTime="PT8.3S">
 <ProgramInformation>
 </ProgramInformation>
 <Period start="PT0.0S">
  <AdaptationSet contentType="video" segmentAlignment="true" bitstreamSwitching="true" frameRate="30/1">
   <Representation mimeType="video/mp4" codecs="avc1.640028" width="1920" height="1080" frameRate="30/1">
    <SegmentTemplate timescale="15360" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="4">
     <SegmentTimeline>
      <S t="384000" d="128000"/>
      <S d="71680"/>
      <S d="128000" r="4"/>
      <S d="56832"/>
      <S d="128000"/>
      <S d="72704"/>
     </SegmentTimeline>
    </SegmentTemplate>
   </Representation>
  </AdaptationSet>
  <AdaptationSet contentType="audio" segmentAlignment="true" bitstreamSwitching="true">
   <Representation mimeType="audio/mp4" codecs="mp4a.40.2" bandwidth="128000" audioSamplingRate="44100">
    <AudioChannelConfiguration schemeIdUri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="2"/>
    <SegmentTemplate timescale="44100" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="4">
     <SegmentTimeline>
      <S t="1099755" d="367616"/>
      <S d="205824"/>
      <S d="367616" r="4"/>
      <S d="162816"/>
      <S d="367616"/>
      <S d="207872"/>
     </SegmentTimeline>
    </SegmentTemplate>
   </Representation>
  </AdaptationSet>
 </Period>
</MPD>

When I try to play it in the dash.js player, an error occurs:
[112] Parsing complete: ( xml2json: 3.50ms, objectiron: 1.76ms, total: 0.00526s) Debug.js:127
[116] SegmentTimeline detected using calculated Live Edge Time Debug.js:127
[118] MediaSource attached to element. Waiting on open... Debug.js:127
[119] Manifest has been refreshed at Tue Jan 02 2018 01:57:35 GMT+0800 [1514829455.1] Debug.js:127
[155] MediaSource is open! Debug.js:127
[156] Duration successfully set to: 96.4 Debug.js:127
[157] Added 0 inline events Debug.js:127
[158] video codec: video/mp4;codecs="avc1.640028" Stream.js:225
Uncaught TypeError: Cannot read property 'type' of null
at z (Stream.js:225)
at C (Stream.js:285)
at D (Stream.js:373)
at E (Stream.js:398)
at Object.d [as activate] (Stream.js:107)
at y (StreamController.js:363)
at MediaSource.c (StreamController.js:342)
Then playback fails...
Is it because I didn't set the ffmpeg parameters correctly, or is this a bug in dash.js?
I'm really stuck here!
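
A hedged variant worth trying (the RTMP input and out.mpd come from the question; the explicit -map pairs, the codec choices, and the -adaptation_sets grouping are additions to test, not a confirmed fix):

# A sketch, not a verified fix: -adaptation_sets is an option of ffmpeg's
# dash muxer that groups the output streams into separate audio and video
# AdaptationSets, which some players handle more reliably.
ffmpeg -re -i rtmp://localhost:1935${StreamPath} \
 -map 0:v -map 0:a -c:v libx264 -c:a aac \
 -use_timeline 1 -use_template 1 -window_size 10 \
 -min_seg_duration 5000 \
 -adaptation_sets "id=0,streams=v id=1,streams=a" \
 -f dash out.mpd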