
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (48)
-
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page
-
Accepted formats
28 January 2010, by
The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Video formats accepted as input
This list is not exhaustive; it highlights the main formats used: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
To begin with, we (...)
-
Automated installation script of MediaSPIP
25 April 2011, by
To overcome the difficulties, mainly due to the installation of server-side software dependencies, an "all-in-one" installation script written in bash was created to facilitate this step on a server running a compatible Linux distribution.
To use it, you must have SSH access to your server and a root account, which the script uses to install the dependencies. Contact your provider if you do not have these.
Documentation on using this installation script is available here.
The code of this (...)
On other sites (9135)
-
arm: Create proper .rdata sections for COFF
11 January 2019, by Martin Storsjö
As .rodata isn't one of the default created sections for COFF, it was created as a read-write data section. By using the default .rdata section name for COFF, it automatically becomes a read-only data section.
The existing ".section .rodata" works as intended for ELF though.
This is based on an original patch and diagnosis by Tom Tan <Tom.Tan@microsoft.com>.
Signed-off-by: Martin Storsjö <martin@martin.st>
-
How to correctly calculate which segments are ready to be downloaded using MPEG-DASH
24 April 2019, by igal k
What I'm trying to do?
Write a simple MPEG-DASH client that uses the SegmentTemplate pattern to calculate which segments are ready to be downloaded for a live source.
A picture taken with Chrome's debugging tools at a moment X shows an mpd request (8af651fd747.....mpd) and the actual segments fetched in response to that request.
Given the following MPD:
<MPD availabilityStartTime="2019-04-24T06:43:32Z" maxSegmentDuration="PT4.096S" minBufferTime="PT4.096S" minimumUpdatePeriod="PT15.835S" profiles="urn:mpeg:dash:profile:isoff-live:2011" publishTime="2019-04-24T11:14:01Z" suggestedPresentationDelay="PT11.878S" timeShiftBufferDepth="PT65.536S" type="dynamic" xmlns="urn:mpeg:dash:schema:mpd:2011">
  <Location>https://content-aaps1.uplynk.com/channel/8af651fd7473474f86a05ffb0a1c8972.mpd?rmt=wv&amp;cid=8af651fd7473474f86a05ffb0a1c8972&amp;oid=600e5c27541344a1bf3818617ad712ce&amp;prettydash=1&amp;exp=1556091088&amp;rn=4138683939&amp;tc=1&amp;ct=c&amp;sig=5fb7f0c18f3f1d2ad4fdee53c02c1e1ed904bc5e8474f4ebf886d209ff7f21c9&amp;pbs=05b6594bcf4b4728ac1094976a80194d</Location>
  <Period start="PT2826.240S">
    <AdaptationSet maxFrameRate="30" maxHeight="720" maxWidth="1280" mimeType="video/mp4" segmentAlignment="true" startWithSAP="1">
      <Representation bandwidth="2604473" codecs="avc1.64001e" frameRate="30" height="360" scanType="progressive" width="640">
        <BaseURL>https://x-default-stgec.uplynk.com/aapm/slices/8c1/600e5c27541344a1bf3818617ad712ce/8c1027496a964b049f1bd5895f8f0412/</BaseURL>
        <SegmentTemplate duration="368640" initialization="https://x-default-stgec.uplynk.com/aapm/slices/8c1/600e5c27541344a1bf3818617ad712ce/8c1027496a964b049f1bd5895f8f0412/$RepresentationID$_init.mp4?pbs=05b6594bcf4b4728ac1094976a80194d&amp;_jt=l&amp;chid=8af651fd7473474f86a05ffb0a1c8972" media="$RepresentationID$$Number%08d$.m4f?pbs=05b6594bcf4b4728ac1094976a80194d&amp;_jt=l&amp;chid=8af651fd7473474f86a05ffb0a1c8972" presentationTimeOffset="254361599" startNumber="690" timescale="90000"></SegmentTemplate>
      </Representation>
    </AdaptationSet>
  </Period>
  <UTCTiming schemeIdUri="urn:mpeg:dash:utc:http-iso:2014" value="https://content-aaps1.uplynk.com/misc/utcservertime"></UTCTiming>
</MPD>
I see that the next segment request should be #3955.
What I have tried so far
period.end = 1556104456;
period.start = 2826;
availability_start_time = 1556088212;
max_segment_duration = 4;
time_shift_buffer_depth = 65;
So, first of all, I read DASH-IF-IOP 4.3 section 4.3.4.2, page 82, and implemented the following code:
int k1 = 1;
int period_duration = period.end - (period.start + data_.availability_start_time);
int k2 = ceil((float)period_duration / (float)data_.max_segment_duration);
double duration = ((float)representation.duration / (float)representation.timeScale);
size_t live_edge = std::min(
(int)floor((float)((data_.publish_time - data_.availability_start_time - period.start) / duration)), k2);
size_t oldest = std::max(k1, (int)floor((float)((data_.publish_time - data_.availability_start_time - period.start -
data_.time_shift_buffer_depth) /
duration)));
After calculating everything: k1=1, k2=3355, live_edge=3272 and oldest=3256.
I also tried using ffmpeg's dashdec.c.
For min_segment:
if (c->is_live && pls->fragment_duration)
{
num = pls->first_seq_no + (((get_current_time_in_sec() - c->availability_start_time) - c->time_shift_buffer_depth) * pls->fragment_timescale) / pls->fragment_duration;
}
For max_segment:
num = pls->first_seq_no + (((get_current_time_in_sec() - c->availability_start_time)) * pls->fragment_timescale) / pls->fragment_duration;
After a small modification:
size_t pmax = (((data_.publish_time - data_.availability_start_time))) / duration;
size_t pmin = ((data_.publish_time - data_.availability_start_time) - data_.time_shift_buffer_depth) / duration;
pmin=3946, pmax=3961
In the ffmpeg example, I had to manually remove the first_seq_no variable because it looked like I was adding the SegmentTemplate@startNumber twice.
Even after succeeding in this task, how exactly do I build the request list from Segment(NOW) ----> Segment(LIVE_EDGE)?
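For illustration only, here is a minimal sketch of how these quantities might combine into a concrete $Number$ value for a @duration-based SegmentTemplate. The MpdTimes struct, its field names, and now_epoch are assumptions made for this sketch (they are not from the code above), and suggestedPresentationDelay and exact availability offsets are deliberately ignored:
// Sketch: map wall-clock time to a segment number for a live
// SegmentTemplate@duration MPD. Values in comments refer to the MPD above.
#include <algorithm>
#include <cmath>
#include <cstdint>

struct MpdTimes {
    int64_t availability_start_time;  // MPD@availabilityStartTime as epoch seconds
    double  period_start;             // Period@start in seconds
    double  segment_duration;         // SegmentTemplate@duration / @timescale (4.096 s here)
    int64_t start_number;             // SegmentTemplate@startNumber (690 here)
    double  time_shift_buffer_depth;  // MPD@timeShiftBufferDepth in seconds
};

// Newest segment whose full duration has already elapsed (the live edge).
int64_t live_edge_number(const MpdTimes& t, int64_t now_epoch) {
    double media_time = (now_epoch - t.availability_start_time) - t.period_start;
    int64_t index = static_cast<int64_t>(std::floor(media_time / t.segment_duration));
    return t.start_number + index;  // $Number$ is offset by startNumber, added exactly once
}

// Oldest segment still inside the time-shift buffer.
int64_t oldest_number(const MpdTimes& t, int64_t now_epoch) {
    double media_time = (now_epoch - t.availability_start_time) - t.period_start
                        - t.time_shift_buffer_depth;
    int64_t index = static_cast<int64_t>(std::floor(media_time / t.segment_duration));
    return std::max(t.start_number, t.start_number + index);
}
Under these assumptions, the request list would simply be every $Number$ from oldest_number() up to live_edge_number().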
-
Crash in ffmpeg avcodec_free_context in one application but not in another
5 November 2017, by geekowl
I am building an application in C++ on Windows 10 using Microsoft Visual Studio 2012 Professional.
I have created a wrapper library around ffmpeg (libavcodec) to encode video in H.264 format using libx264. This wrapper contains the following functions:
Initialize()
Open()
EncodeFrame()
Close()
Uninitialize()
I created a test application to test the wrapper library. The test application works perfectly fine.
When I use the wrapper library in my actual main application, the main application crashes in the Close() API. Inside Close(), it crashes in avcodec_free_context(). The difference between the main application and the test application is that the main application links with some additional dependent libraries that the test application does not link with.
To debug the problem in the main application, I put avcodec_free_context() in Open() after the context is allocated. The crash occurs if avcodec_free_context() is placed at a certain point, as shown below.
pCodecContext = avcodec_alloc_context3(pCodec);
// <---- No crash here.
pCodecContext->bit_rate = 200000;
pCodecContext->width = 320;
pCodecContext->height = 240;
.
.
.
if (pCodec->id == AV_CODEC_ID_H264)
{
av_opt_set(pCodecContext->priv_data, "preset", "slow", 0);
av_opt_set(pCodecContext->priv_data, "tune", "zerolatency", 0);
}
pCodecContext->flags |= CODEC_FLAG_GLOBAL_HEADER;
// <-- No crash here
if (avcodec_open2(pCodecContext, pCodec, NULL) < 0)
{
return -1;
}
// <-- Crash here [avcodec_free_context(&pCodecContext);]
What is the correct approach to identify and resolve this problem?
Thanks in advance.
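For reference, here is a minimal, self-contained sketch of the allocate/configure/open/free sequence using the same libavcodec API as the snippet above; the encoder parameters are illustrative values, not the original application's settings:
// Sketch of the context lifecycle: allocate, configure, open, free.
// Parameter values are illustrative; error handling is kept minimal.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>
}

int open_and_close_h264()
{
    const AVCodec* pCodec = avcodec_find_encoder(AV_CODEC_ID_H264);
    if (!pCodec)
        return -1;

    AVCodecContext* pCodecContext = avcodec_alloc_context3(pCodec);
    if (!pCodecContext)
        return -1;

    pCodecContext->bit_rate = 200000;
    pCodecContext->width = 320;
    pCodecContext->height = 240;
    pCodecContext->time_base.num = 1;   // encoders require a valid time_base
    pCodecContext->time_base.den = 30;
    pCodecContext->pix_fmt = AV_PIX_FMT_YUV420P;
    av_opt_set(pCodecContext->priv_data, "preset", "slow", 0);
    av_opt_set(pCodecContext->priv_data, "tune", "zerolatency", 0);
    pCodecContext->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;  // current name of CODEC_FLAG_GLOBAL_HEADER

    int ret = avcodec_open2(pCodecContext, pCodec, NULL);

    // avcodec_free_context() closes the codec and frees the context; the
    // library and the application must share the same runtime/heap for this
    // allocation and free to pair up correctly.
    avcodec_free_context(&pCodecContext);
    return ret;
}
As the update below notes, the crash in the final call turned out to come from how ffmpeg was built (gcc vs. Visual Studio), not from this call sequence.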
Update
I found out that ffmpeg had been built with gcc, and that was causing the crash. I rebuilt ffmpeg with Visual Studio, which solved the problem.