
Media (2)
-
Example of action buttons for a collaborative collection
27 February 2013
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013
Updated: February 2013
Language: English
Type: Image
Other articles (31)
-
What is an editorial?
21 June 2013 — Write your point of view in an article. It will be filed in a section set aside for this purpose.
An editorial is a text-only article. Its purpose is to gather points of view in a dedicated section. Only one editorial is featured on the home page; to read the previous ones, browse the dedicated section.
You can customize the editorial creation form.
Editorial creation form: in the case of an editorial-type document, the (...)
-
Contribute to translation
13 April 2011 — You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, allowing it to spread to new linguistic communities.
To do this, we use the SPIP translation interface, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...)
-
Accepted formats
28 January 2010 — The following commands give information about the formats and codecs handled by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
To begin with, we (...)
On other sites (7368)
-
Basic Video Palette Conversion
How do you take a 24-bit RGB image and convert it to an 8-bit paletted image for the purpose of compression using a codec that requires 8-bit input images? Seems simple enough, and that's what I'm tackling in this post.
Ask FFmpeg/Libav To Do It
Ideally, FFmpeg / Libav should be able to handle this automatically. Indeed, FFmpeg used to be able to, at least at the time I wrote this post about ZMBV and was unhappy with FFmpeg's default results. Somewhere along the line, FFmpeg and Libav lost the ability to do this. I suspect it got removed during some swscale refactoring. Still, there's no telling if the old system would have computed palettes correctly for QuickTime files.
Distance Approach
When I started writing my SMC video encoder, I needed to convert RGB (from PNG files) to PAL8 colorspace. The path of least resistance was to match the pixels in the input image to the default 256-color palette that QuickTime assumes (and which is hardcoded into FFmpeg/Libav). How to perform the matching? Find the palette entry that is closest to a given input pixel, where "closest" is the minimum distance as computed by the usual distance formula (square root of the sum of the squares of the diffs of all the components).
That means for each pixel in an image, check the pixel against 256 palette entries (early termination is possible if an acceptable threshold is met). As you might imagine, this can be a bit time-consuming. I wondered about a faster approach...
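For illustration, here is a minimal sketch of that brute-force matching in Python. It is my own sketch rather than the actual SMC encoder code; it assumes `palette` is a list of (R, G, B) triples and that a squared-distance threshold of 0 means "stop on an exact match":

def closest_palette_index(r, g, b, palette, threshold=0):
    # Compare the input pixel against every palette entry; squared
    # distance is enough since only the minimum matters, not the root.
    best_index = 0
    min_dsqrd = 0xFFFFFFFF
    for i, (pr, pg, pb) in enumerate(palette):
        dr, dg, db = pr - r, pg - g, pb - b
        dsqrd = dr * dr + dg * dg + db * db
        if dsqrd < min_dsqrd:
            min_dsqrd = dsqrd
            best_index = i
            if dsqrd <= threshold:
                # early termination once an acceptable match is found
                break
    return best_index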
Lookup Table
I think this is the approach that FFmpeg used to use, but I went and derived it for myself after studying the default QuickTime palette table. There's a pattern there: all of the RGB entries are composed of combinations of six values — 0x00, 0x33, 0x66, 0x99, 0xCC, and 0xFF. If you mix and match these for red, green, and blue values, you come up with 6 * 6 * 6 = 216 different colors. This happens to be identical to the web-safe color palette. The first (0th) entry in the table is (FF, FF, FF), followed by (FF, FF, CC), (FF, FF, 99), and on down to (FF, FF, 00), at which point the green component gets knocked down a step and the next color is (FF, CC, FF). The first 36 palette entries in the table all have a red component of 0xFF. Thus, if an input RGB pixel has a red color closest to 0xFF, it must map to one of those first 36 entries.
I created a table which maps indices 0..255 to values from 5..0. Each of the R, G, and B components of an input pixel is used to index into this table, deriving three indices ri, gi, and bi. Finally, the index into the palette table is given by:
index = ri * 36 + gi * 6 + bi
For example, the pixel (0xFE, 0xFE, 0x01) would yield ri, gi, and bi values of 0, 0, and 5. Therefore:
index = 0 * 36 + 0 * 6 + 5
The palette index is 5, which maps to color (0xFF, 0xFF, 0x00).
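Here is a small sketch of one way to derive and use such a per-component table. This is my own reconstruction for illustration, not necessarily how FFmpeg built its table, although the thresholds it produces match the hard-coded ranges in the script further below:

STEPS = [0xFF, 0xCC, 0x99, 0x66, 0x33, 0x00]

# For every possible 8-bit component value, record the index of the
# nearest step (0 for values close to 0xFF, down to 5 for values close to 0x00).
component_to_index = []
for v in range(256):
    diffs = [abs(v - s) for s in STEPS]
    component_to_index.append(diffs.index(min(diffs)))

def rgb_to_pal8(r, g, b):
    # Combine the three per-component indices into a 216-entry cube index.
    return (component_to_index[r] * 36
            + component_to_index[g] * 6
            + component_to_index[b])

print(rgb_to_pal8(0xFE, 0xFE, 0x01))   # 5, i.e. palette color (0xFF, 0xFF, 0x00)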
Validation
So I was pretty pleased with myself for coming up with that. Now, ideally, swapping out one algorithm for another in my SMC encoder should yield identical results. That wasn't the case, initially. One problem is that the regulation QuickTime palette actually has 40 more entries above and beyond the typical 216-entry color cube (rounding out the grand total of 256 colors). Thus, using the distance approach with the full default table provides for a little more accuracy.
However, there still seems to be a problem. Let's check our old standby, the Big Buck Bunny logo image:
Image 1: Distance approach using the full 256-color QuickTime default palette
Image 2: Distance approach using the 216-color palette
Image 3: Table lookup approach using the 216-color palette
I can’t quite account for that big red splotch there. That’s the most notable difference between images 1 and 2 and the only visible difference between images 2 and 3.
To prove to myself that the distance approach is equivalent to the table approach, I wrote a Python script to iterate through all possible RGB combinations and verify the equivalence. If you're not up on your base-2 math, that's 2^24, or 16,777,216, colors to run through. I used Python's multiprocessing module to great effect and really maximized a Core i7 CPU with 8 hardware threads.
So I’m confident that the palette conversion techniques are sound. The red spot is probably attributable to a bug in my WIP SMC encoder.
Source Code
Update, August 23, 2011: Here's the Python code I used for proving equivalence between the 2 approaches. In terms of leveraging multiple CPUs, it's possibly the best program I have written to date.
#!/usr/bin/python

from multiprocessing import Pool

palette = []
pal8_table = []

def process_r(r):
    counts = []
    for i in xrange(216):
        counts.append(0)

    print "r = %d" % (r)
    for g in xrange(256):
        for b in xrange(256):
            min_dsqrd = 0xFFFFFFFF
            best_index = 0
            for i in xrange(len(palette)):
                dr = palette[i][0] - r
                dg = palette[i][1] - g
                db = palette[i][2] - b
                dsqrd = dr * dr + dg * dg + db * db
                if dsqrd < min_dsqrd:
                    min_dsqrd = dsqrd
                    best_index = i
            counts[best_index] += 1
            # check if the distance approach deviates from the table-based
            # approach for this input pixel
            ri = pal8_table[r]
            gi = pal8_table[g]
            bi = pal8_table[b]
            table_index = ri * 36 + gi * 6 + bi
            if table_index != best_index:
                print "(0x%02X 0x%02X 0x%02X): distance index = %d, table index = %d" % (r, g, b, best_index, table_index)

    return counts

if __name__ == '__main__':
    counts = []
    for i in xrange(216):
        counts.append(0)

    # initialize reference palette
    color_steps = [0xFF, 0xCC, 0x99, 0x66, 0x33, 0x00]
    for r in color_steps:
        for g in color_steps:
            for b in color_steps:
                palette.append([r, g, b])

    # initialize palette conversion table
    for i in range(0, 26):
        pal8_table.append(5)
    for i in range(26, 77):
        pal8_table.append(4)
    for i in range(77, 128):
        pal8_table.append(3)
    for i in range(128, 179):
        pal8_table.append(2)
    for i in range(179, 230):
        pal8_table.append(1)
    for i in range(230, 256):
        pal8_table.append(0)

    # create a pool of worker threads and break up the overall job
    pool = Pool()
    it = pool.imap_unordered(process_r, range(256))
    try:
        while 1:
            partial_counts = it.next()
            for i in xrange(216):
                counts[i] += partial_counts[i]
    except StopIteration:
        pass

    print "index, count, red, green, blue"
    for i in xrange(len(counts)):
        print "%d, %d, %d, %d, %d" % (i, counts[i], palette[i][0], palette[i][1], palette[i][2])
-
How to configure and validate a Funnel in Piwik Analytics
16 January 2017, by InnoCraft — Community
In the last blog post we covered how the conversion Funnel plugin enriches your Piwik experience. This post will focus on how to configure and validate your funnel in Piwik so you get the correct data when you view the funnel reports. When you set up a funnel, it is crucial to have it configured correctly, as the funnel report will only be as good as its configuration. When we built this Funnel feature, we focused on making the configuration and validation really simple because it is so important to get it right.
To recap quickly: a funnel defines a series of steps that you expect your visitors to take on their way to converting a goal or a sale. Funnels, a premium feature for Piwik developed by InnoCraft, lets you define funnels so you can improve your websites and mobile apps based on this data. Learn more about Funnels.
Configuring a funnel
As you will notice, Funnels integrates nicely into the Piwik Goals management. You can configure a funnel whenever you create or update a goal. You can access the Goals management either via "Administration => Goals" or via the reporting menu "Goals => Manage". Then click on "Add a new goal" or select an existing goal to edit it. At the bottom of the goal form, you will see a new row letting you configure a funnel. As with all our premium features, we focused on displaying lots of inline help and explaining directly in the UI what a funnel is about, which steps to take in order to configure one, how a funnel helps you, and more. This lets you use the Funnel feature even if you have never created or analyzed a funnel before.
Preparing your Funnel configuration
Before starting to configure a funnel, we usually have a brainstorming session to identify the funnels on a website or app and the paths we expect users to take there. Once we have identified each step, we click through those pages on our website and note the URL of each page, as the URLs will be needed when you configure the funnel.
Setting up a Goal
Once we have finished the planning phase, it is time to log into Piwik. We start by either adding a new goal or selecting an existing goal. If you are unfamiliar with setting up goals, have a look at the Piwik Goals user guide. At the bottom of the goal form, when you create or update a goal, you can configure your funnel. The UI will first explain everything about funnels: what they are, how they help you, and which steps you need to take to configure one.
Configuring Funnel steps
We start by configuring the steps we identified in the planning phase. Those are the steps we expect our users to take when they convert a goal or purchase something. We need to add a step for each page we expect users to visit; each step consists of a name and a pattern.
The name will be shown to you in the funnel reporting, so think of a good name that describes each step best, for example "Product", "Cart", "Checkout" and "Order".
The pattern is needed to define when a visitor enters this step. Here it comes in handy to already have the URLs noted from the planning phase. You can select lots of different patterns based on "URL Path", "URL" and "URL parameter", for example "URL starts with", "Path ends with", "URL contains", "URL matches the regular expression", and more. Most tools make this configuration unnecessarily hard because they only allow you to choose from one or two patterns (often only complicated ones like regular expressions) and they don't let you validate whether the URL you have in mind actually matches the pattern. There are three ways to validate your step configurations.
Validating funnel steps
When we configure a funnel, we validate our steps in the following three ways.
1. Via the help icon next to the step configuration
When you click on the help icon, you will receive valuable tips about configuring steps, what "required" means, and how to match popular pages. It will also show you a list of all URLs that were tracked in your Piwik in the past and match your specified pattern. For example, say you specify the pattern "Path starts with /products"; Piwik will then list all URLs tracked in the past that match this pattern. This lets you validate whether your pattern actually matches the URLs you had in mind. It will also show you if the pattern doesn't match any known URL, which can indicate that your configuration may be wrong.
2. Via the URL validator
Below the step configuration you will find a form field that lets you enter any URL.
We recommend entering each URL that you noted down in the planning phase. Once you enter a URL, the configurations will be validated immediately and the result will be shown in the step configuration. When a step matches the specified URL, its background turns green; when a step does not match the URL, its background turns red.
If the URL does not match the expected step, simply change your step configuration; the steps will be re-validated as you change the configuration. This way you will instantly see when you have got the configuration right.
What you don't want is for none of your steps to match (all red backgrounds) or for several steps to match the same URL (several green backgrounds). When several steps match one URL, a visitor might enter several funnel steps on just one page. This usually indicates a problem with the step configuration; a short sketch below illustrates such an overlap.
3. Manual funnel validation
After we have created or updated the goal (more about this soon), we always test a funnel configuration manually. This means we open our website, click through the pages that we had in mind, and check afterwards whether the steps we took actually appear in the funnel report as expected. This is just another safety net to make sure your funnel configuration is right.
It is really crucial to have a correct funnel configuration, as otherwise the data shown in the funnel reports might not be as helpful. That's why we focused so much on making the validation part really easy.
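As a purely illustrative sketch (the step patterns below are hypothetical and this is not Piwik's actual matching code), this is the kind of overlap the URL validator helps you catch: one page URL satisfying the patterns of two different steps.

import re

# Hypothetical step patterns, loosely modeled on the examples above.
steps = [
    ("Product",  lambda path: path.startswith("/products")),
    ("Cart",     lambda path: path.endswith("/cart")),
    ("Checkout", lambda path: "checkout" in path),
    ("Order",    lambda path: re.search(r"/order/\d+$", path) is not None),
]

path = "/products/cart"
matching = [name for name, matches in steps if matches(path)]
print(matching)   # ['Product', 'Cart'] -> two steps match one page, so tighten the patterns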
Activating and saving the funnel
Once you are happy with your configuration, it is time to activate your funnel. As soon as you activate it, a report for the funnel will be generated and the links and reports for it will be visible in the UI. If you are no longer interested in the funnel later on, simply deactivate it so it won't appear in the reporting UI anymore.
To save your funnel configuration simply click on either “Add goal” or “Update goal”. The funnel will be automatically saved whenever you update your goal.
Goals Management
The funnel plugin also enriches the list of goals in the Piwik goal management. At a glance you can see whether a funnel for a goal is configured and activated (green tick in the funnel column), whether a funnel is configured but not activated (grey tick in the funnel column) or whether no funnel is configured for a goal (no tick at all).
How to get Funnels and related features
You can get Funnels on the Piwik Marketplace. If you want to learn more about Funnels you might be also interested in the Funnel User Guide and the Funnel FAQ.
Similar to Funnels we also offer Users Flow which lets you visualize the flow of your users and visitors across several interactions.
-
x264 Decoding time increases with zerolatency
13 April 2016, by Ajay Ponna Venkatesh
Reference: "Why sliced thread affect so much on realtime encoding using ffmpeg x264?"
With zerolatency / sliced threads enabled, I am observing that the decoding time shoots up! I am encoding on my Windows 10 laptop and streaming to a Samsung S4 phone, where the stream is decoded and rendered. Decoding usually takes 2-3 ms, but it shoots up to around 25 ms if I use sliced threads. It is a real-time streaming application, so I need low latency, and that's why I enabled zerolatency. Can someone help, please?
I am using the hardware decoder on the phone.