
Other articles (38)

  • Use, discuss, criticize

    13 April 2011, by

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010, by

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, you need to create a SPIP article and attach the "source" video document to it.
    When this document is attached to the article, two actions beyond the normal behaviour are carried out: retrieval of the technical information of the file’s audio and video streams; generation of a thumbnail: extraction of a (...)

On other sites (6243)

  • IJG swings again, and misses

    1 February 2010, by Mans — Multimedia

    Earlier this month the IJG unleashed version 8 of its ubiquitous libjpeg library on the world. Eager to try out the “major breakthrough in image coding technology” promised in the README file accompanying v7, I downloaded the release. A glance at the README file suggests something major indeed is afoot:

    Version 8.0 is the first release of a new generation JPEG standard to overcome the limitations of the original JPEG specification.

    The text also hints at the existence of a document detailing these marvellous new features, and a Google search later a copy has found its way onto my monitor. As I read, however, my state of mind shifts from an initial excited curiosity, through bewilderment and disbelief, finally arriving at pure merriment.

    Already on the first page it becomes clear no new JPEG standard in fact exists. All we have is an unsolicited proposal sent to the ITU-T by members of the IJG. Realising that even the most brilliant of inventions must start off as mere proposals, I carry on reading. The summary informs me that I am about to witness the introduction of three extensions to the T.81 JPEG format:

    1. An alternative coefficient scan sequence for DCT coefficient serialization
    2. A SmartScale extension in the Start-Of-Scan (SOS) marker segment
    3. A Frame Offset definition in or in addition to the Start-Of-Frame (SOF) marker segment

    Together these three extensions will, it is promised, “bring DCT based JPEG back to the forefront of state-of-the-art image coding technologies.”

    Alternative scan

    The first of the proposed extensions introduces an alternative DCT coefficient scan sequence to be used in place of the zigzag scan employed in most block transform based codecs.

    Alternative scan sequence

    The advantage of this scan would be that combined with the existing progressive mode, it simplifies decoding of an initial low-resolution image which is enhanced through subsequent passes. The author of the document calls this scheme “image-pyramid/hierarchical multi-resolution coding.” It is not immediately obvious to me how this constitutes even a small advance in image coding technology.
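    For reference, the conventional zigzag scan this would replace is simple enough to generate in a few lines. The sketch below is mine, not the proposal’s; it walks the anti-diagonals of an 8×8 block, alternating direction on each one:

    def zigzag_order(n=8):
        # Walk the anti-diagonals; alternate direction to produce the zigzag.
        order = []
        for s in range(2 * n - 1):
            diag = [(i, s - i) for i in range(n) if 0 <= s - i < n]
            order.extend(diag if s % 2 else diag[::-1])
        return order

    print(zigzag_order()[:6])
    # [(0, 0), (0, 1), (1, 0), (2, 0), (1, 1), (0, 2)]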

    At this point I am beginning to suspect that our friend from the IJG has been trapped in a half-world between interlaced GIF images transmitted down noisy phone lines and today’s inferno of SVC, MVC, and other buzzwords.

    (Not so) SmartScale

    Disguised behind this camel-cased moniker we encounter a method which, we are told, will provide better image quality at high compression ratios. The author has combined two well-known (to us) properties in a (to him) clever way.

    The first property concerns the perceived impact of different types of distortion in an image. When encoding with JPEG, as the quantiser is increased, the decoded image becomes ever more blocky. At a certain point, a better subjective visual quality can be achieved by down-sampling the image before encoding it, thus allowing a lower quantiser to be used. If the decoded image is scaled back up to the original size, the unpleasant, blocky appearance is replaced with a smooth blur.

    The second property belongs to the DCT where, as we all know, the top-left (DC) coefficient is the average of the entire block, its neighbours represent the lowest frequency components etc. A top-left-aligned subset of the coefficient block thus represents a low-resolution version of the full block in the spatial domain.
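    This property is easy to demonstrate numerically. A quick sketch of mine, using SciPy, and no part of the proposal:

    import numpy as np
    from scipy.fftpack import dct, idct

    block = np.random.rand(8, 8)
    # 2-D type-II DCT with orthonormal scaling.
    coeffs = dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

    # The DC coefficient is the block sum divided by 8, i.e. the block
    # average up to a constant factor.
    print(coeffs[0, 0], block.sum() / 8)  # prints the same value twice

    # Inverse-transforming only the top-left 4x4 coefficients yields a
    # half-resolution version of the block (up to a scale factor from
    # the size change).
    small = idct(idct(coeffs[:4, :4], axis=0, norm='ortho'), axis=1, norm='ortho')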

    In his flash of genius, our hero came up with the idea of using the DCT for down-scaling the image. Unfortunately, he appears to possess precious little knowledge of sampling theory and human visual perception. Any block-based resampling will inevitably produce sharp artefacts along the block edges. The human visual system is particularly sensitive to sharp edges, so this is one of the most unwanted types of distortion in an encoded image.

    Despite the obvious flaws in this approach, I decided to give it a try. After all, the software is already written, allowing downscaling by factors of 8/8..16.

    Using a 1280×720 test image, I encoded it with each of the nine scaling options, from unity to half size, each time adjusting the quality parameter for a final encoded file size of no more than 200000 bytes. The following table presents the encoded file size, the libjpeg quality parameter used, and the SSIM metric for each of the images.

    Scale  Size    Quality  SSIM
    8/8    198462  59       0.940
    8/9    196337  70       0.936
    8/10   196133  79       0.934
    8/11   197179  84       0.927
    8/12   193872  89       0.915
    8/13   197153  92       0.914
    8/14   188334  94       0.899
    8/15   198911  96       0.886
    8/16   197190  97       0.869
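    For anyone wanting to check the numbers, the SSIM column can be computed along these lines. This is a sketch rather than the exact tool I used: the file names are hypothetical, and the nine scaled encodes are assumed to have been produced beforehand with the modified libjpeg.

    import numpy as np
    from PIL import Image
    from skimage.metrics import structural_similarity

    # Reference image, converted to grayscale for the comparison.
    ref = np.asarray(Image.open('original.png').convert('L'), dtype=np.float64)

    for n in range(8, 17):
        # Hypothetical name for the 8/n-scaled encode.
        dec = Image.open('encoded_8_%d.jpg' % n).convert('L')
        # Scale the decoded image back up to the reference size first.
        dec = dec.resize((ref.shape[1], ref.shape[0]))
        score = structural_similarity(ref, np.asarray(dec, dtype=np.float64),
                                      data_range=255.0)
        print('8/%d: %.3f' % (n, score))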

    Although the smaller images allowed a higher quality setting to be used, the SSIM value drops significantly. Numbers may of course be misleading, but the images below speak for themselves. These are cut-outs from the full image, the original on the left, unscaled JPEG-compressed in the middle, and JPEG with 8/16 scaling to the right.

    Looking at these images, I do not need to hesitate before picking the JPEG variant I prefer.

    Frame offset

    The third and final extension proposed is quite simple and also quite pointless: a top-left cropping to be applied to the decoded image. The alleged utility of this feature would be to enable lossless cropping of a JPEG image. In a typical image workflow, however, JPEG is only used for the final published version, so the need for this feature appears quite far-fetched.

    The grand finale

    Throughout the text, the author makes references to “the fundamental DCT property for image representation.” In his own words:

    This property was found by the author during implementation of the new DCT scaling features and is after his belief one of the most important discoveries in digital image coding after releasing the JPEG standard in 1992.

    The secret is to be revealed in an annex to the main text. This annex quotes in full a post by the author to the comp.dsp Usenet group in a thread with the subject “why DCT”. Reading the entire thread proves quite amusing. A few excerpts follow.

    The actual reason is much simpler, and therefore apparently very difficult to recognize by complicated-thinking people.

    Here is the explanation:

    What are people doing when they have a bunch of images and want a quick preview? They use thumbnails! What are thumbnails? Thumbnails are small downscaled versions of the original image! If you want more details of the image, you can zoom in stepwise by enlarging (upscaling) the image.

    So with proper understanding of the fundamental DCT property, the MPEG folks could make their videos more scalable, but, as in the case of JPEG, they are unable to recognize this simple but basic property, unfortunately, and pursue rather inferior approaches in actual developments.

    These are just phrases, and they don’t explain anything. But this is typical for the current state in this field: The relevant people ignore and deny the true reasons, and thus they turn in a circle and no progress is being made.

    However, there are dark forces in action today which ignore and deny any fruitful advances in this field. That is the reason that we didn’t see any progress in JPEG for more than a decade, and as long as those forces dominate, we will see more confusion and less enlightenment. The truth is always simple, and the DCT *is* simple, but this fact is suppressed by established people who don’t want to lose their dubious position.

    I believe a trip to the Total Perspective Vortex may be in order. Perhaps his tin-foil hat will save him.

  • Adventures in Unicode

    29 November 2012, by Multimedia Mike — Programming, php, Python, sqlite3, unicode

    Tangential to multimedia hacking is proper metadata handling. Recently, I have gathered an interest in processing a large corpus of multimedia files which are likely to contain metadata strings which do not fall into the lower ASCII set. This is significant because the lower ASCII set intersects perfectly with my own programming comfort zone. Indeed, all of my programming life, I have insisted on covering my ears and loudly asserting “LA LA LA LA LA! ALL TEXT EVERYWHERE IS ASCII!” I suspect I’m not alone in this.

    Thus, I took this as an opportunity to conquer my longstanding fear of Unicode. I developed a self-learning course comprised of a series of exercises which add up to this diagram:



    Part 1: Understanding Text Encoding
    Python has regular strings by default and then it has Unicode strings. The latter are prefixed by the letter ‘u’. This is what ‘ö’ looks like encoded in each type.

    >>> 'ö', u'ö'
    ('\xc3\xb6', u'\xf6')

    A large part of my frustration with Unicode comes from Python yelling at me about UnicodeDecodeErrors and an inability to handle the number 0xc3 for some reason. This usually comes when I’m trying to wrap my head around an unrelated problem and don’t care to get sidetracked by text encoding issues. However, when I studied the above output, I finally understood where the 0xc3 comes from. I just didn’t understand what the encoding represents exactly.

    I can see from assorted tables that ‘ö’ is character 0xF6 in various encodings (in Unicode and Latin-1), so u'\xf6' makes sense. But what does '\xc3\xb6' mean? It’s my style to excavate straight down to the lowest levels, and I wanted to understand exactly how characters are represented in memory. The UTF-8 encoding tables inform us that any Unicode code point above 0x7F but less than 0x800 will be encoded with 2 bytes:

     110xxxxx 10xxxxxx
    

    Applying this pattern to the \xc3\xb6 encoding:

                hex : 0xc3      0xb6
               bits : 11000011  10110110
     important bits : ---00011  --110110
          assembled : 00011110110
         code point : 0xf6
    

    I was elated when I drew that out and made the connection. Maybe I’m the last programmer to figure this stuff out. But I’m still happy that I actually understand those Python errors pertaining to the number 0xc3 and that I won’t have to apply canned solutions without understanding the core problem.
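    The arithmetic is quick to double-check in the REPL (this check is my addition to the exercise): mask off the prefix bits of each byte and recombine the payloads.

    >>> hex(((0xc3 & 0x1f) << 6) | (0xb6 & 0x3f))
    '0xf6'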

    I’m cheating on this part of the exercise just a little bit since the diagram implied that the Unicode text needs to come from a binary file. I’ll return to that in a bit. For now, I’ll just contrive the following Unicode string from the Python REPL:

    >>> u = u'Üñìçôđé'
    >>> u
    u'\xdc\xf1\xec\xe7\xf4\u0111\xe9'

    Part 2: From Python To SQLite3
    The next step is to see what happens when I use Python’s SQLite3 module to dump the string into a new database. Will the Unicode encoding be preserved on disk? What will UTF-8 look like on disk anyway?

    >>> import sqlite3
    >>> conn = sqlite3.connect('unicode.db')
    >>> conn.execute("CREATE TABLE t (t text)")
    >>> conn.execute("INSERT INTO t VALUES (?)", (u, ))
    >>> conn.commit()
    >>> conn.close()
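    Before reaching for the hex editor, a quick sanity check (my addition): reading the value back through the same module returns the original Unicode object, so the module itself round-trips cleanly.

    >>> conn = sqlite3.connect('unicode.db')
    >>> conn.execute("SELECT t FROM t").fetchone()[0]
    u'\xdc\xf1\xec\xe7\xf4\u0111\xe9'
    >>> conn.close()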

    Next, I manually view the resulting database file (unicode.db) using a hex editor and look for strings. Here we go:

    000007F0   02 29 C3 9C  C3 B1 C3 AC  C3 A7 C3 B4  C4 91 C3 A9
    

    Look at that! It’s just like the \xc3\xb6 encoding we see in the regular Python strings.

    Part 3: From SQLite3 To A Web Page Via PHP
    Finally, use PHP (love it or hate it, but it’s what’s most convenient on my hosting provider) to query the string from the database and display it on a web page, completing the outlined processing pipeline.

    <?php
    $dbh = new PDO("sqlite:unicode.db");
    foreach ($dbh->query("SELECT t FROM t") as $row)
        $unicode_string = $row['t'];
    ?>

    <html>
    <head><meta http-equiv="Content-Type" content="text/html; charset=utf-8"></head>
    <body><h1><?= $unicode_string ?></h1></body>
    </html>

    I tested the foregoing PHP script on 3 separate browsers that I had handy (Firefox, Internet Explorer, and Chrome):



    I’d say that counts as success! It’s important to note that the “meta http-equiv” tag is absolutely necessary. Omit it and you’ll see something like this:



    Since we know what the UTF-8 stream looks like, it’s pretty obvious how the mapping is operating here: 0xc3 and 0xc4 correspond to ‘Ã’ and ‘Ä’, respectively. This corresponds to an encoding named ISO/IEC 8859-1, a.k.a. Latin-1. Speaking of which…
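    (The mix-up is easy to reproduce; this one-liner is my own illustration, not part of the original pipeline: encode a code point to UTF-8 bytes, then misread those bytes as Latin-1.)

    >>> print u'\xf6'.encode('utf-8').decode('latin-1')
    Ã¶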

    Part 4: Converting Binary Data To Unicode
    At the start of the experiment, I was trying to extract metadata strings from these binary multimedia files and I noticed characters like our friend ‘ö’ from above. In the bytestream, this was represented simply with 0xf6. I mistakenly believed that this was the on-disk representation of UTF-8. Wrong. Turns out it’s Latin-1.

    However, I still need to solve the problem of transforming such strings into Unicode to be shoved through the pipeline diagrammed above. For this experiment, I created a 9-byte file with the Latin-1 string ‘Üñìçôdé’ couched by 0s, to simulate yanking a string out of a binary file. Here’s unicode.file:

    00000000   00 DC F1 EC  E7 F4 64 E9  00         ......d..
    

    (Aside: this experiment uses plain ‘d’ since the ‘đ’ with a bar through it doesn’t occur in Latin-1; it shows up all over the place in Vietnamese, at least.)
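    For anyone following along at home, one way to fabricate such a file (my addition, not from the original write-up):

    >>> open('unicode.file', 'wb').write(b'\x00\xdc\xf1\xec\xe7\xf4\x64\xe9\x00')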

    I’ve been mashing around Python code via the REPL, trying to get this string into a Unicode-friendly format. This is a successful method but it’s probably not the best:

    >>> import struct
    >>> f = open('unicode.file', 'r').read()
    >>> u = u''
    >>> for c in struct.unpack("B"*7, f[1:8]):
    ...     u += unichr(c)
    ...
    >>> u
    u'\xdc\xf1\xec\xe7\xf4d\xe9'
    >>> print u
    Üñìçôdé
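    A tidier route, for comparison, would be the built-in Latin-1 codec, which collapses the whole loop into one expression:

    >>> open('unicode.file', 'r').read()[1:8].decode('latin-1')
    u'\xdc\xf1\xec\xe7\xf4d\xe9'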

    Conclusion
    Dealing with text encoding matters reminds me of dealing with integer endian-ness concerns. When you’re just dealing with one system, you probably don’t need to think too much about it because the system is usually handling everything consistently underneath the covers.

    However, when the data leaves one system and will be interpreted by another system, that’s when a programmer needs to be cognizant of matters such as integer endianness or text encoding.
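    The parallel can be made concrete with the same struct module from earlier (this example is mine, not part of the original exercises): the same 32-bit integer leaves the system as two different byte sequences depending on the declared endianness, just as the same text leaves as different byte sequences depending on the declared encoding.

    >>> struct.pack('<I', 1)
    '\x01\x00\x00\x00'
    >>> struct.pack('>I', 1)
    '\x00\x00\x00\x01'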

  • Enterprise web analytics: Quick start guide (and top tools)

    10 July, by Joe — Analytics Tips

    Without data, you’ll get lost in the sea of competition.

    This is even more important for large organisations.

    Data helps you:

    • Optimise customer experiences
    • Navigate complex business decisions
    • Create a roadmap to sustainable brand growth

    Data can power differentiation, especially within fiercely competitive sectors.

    How do you get the benefits of data in a large organisation?

    Enterprise web analytics.

    In this guide, we’ll cover everything you need to know about enterprise web analytics to enhance website performance, improve customer experiences and increase conversions.

    What is enterprise web analytics?

    Enterprise web analytics help large organisations capture, analyse, and act on website data to optimise customer experiences and make informed decisions. By providing insight into customer interactions, user behaviour and preferences, they’re vital in helping big businesses improve their websites.

    Definition of enterprise web analytics

    Enterprise web analytics can extract data from web pages and reveal a range of performance metrics, including:

    • Pageviews
    • Average time on page
    • Actions per visit
    • Bounce rate
    • Conversions
    • Traffic sources
    • Device type
    • Event tracking
    • And more

    You can track this data daily or access monthly reports, which will give you valuable insights into optimising user engagement, improving your website’s search engine traffic, and meeting business goals like increased conversion rates.

    For large organisations, web analytics isn’t just about measuring traffic. Instead, it’s an asset you can use to identify issues in your web strategy so you can gain insights that will fuel sustainable business growth.

    An advanced analytics strategy goes beyond the digital channels, page views and bounce rates of traditional analytics.

    Instead, modern web analytics incorporates behavioural analytics for deeper analysis and insight into user experiences. These advanced features include:

    • Heatmaps (or scroll maps) to track scroll behaviour on each page
    • User flow reports to see the pages your users visit in the customer journey
    • Session recordings to analyse user interactions (step-by-step)

    By taking a two-pronged approach to web analytics that includes both traditional and behavioural metrics, organisations get a clearer picture of users and their brand interactions.

    Different needs of enterprise companies

    Let’s dive deeper into the different needs of enterprise companies and how enterprise web analytics can help solve them:

    Access more storage

    Let’s face it. Large organisations have complex IT infrastructures and vast amounts of data.

    The amount of data to capture, analyse and store isn’t slowing down anytime soon.

    Enterprise web analytics can help handle and store large amounts of data in ways that serve the entire organisation.

    Enable cross-organisational data consumption

    It’s one thing to access data in a small company. You’ve got yourself and a few employees. That’s easy.

    But, it’s another thing to enable an organisation with thousands of employees with different roles to access complex data structures and large amounts of data.

    Enterprise web analytics allows big companies to enable their entire workforce to gain access to the data they need when they need it.

    Increase security

    As mentioned above, large organisations can use enterprise web analytics to help hundreds or even thousands of employees access their web data.

    However, some data shouldn’t be accessed by every type of employee. For example, some organisations may only want certain data accessed by executives, and some employees may not need to access certain types of data that may confuse or overwhelm them.

    Enterprise web analytics can help you grant access to certain types of data based on your role in the company, ensuring the security of sensitive data in your organisation.

    Improve privacy

    You can keep your data secure from internal breaches with enterprise web analytics. But, how do you protect customer data?

    With all-inclusive privacy measures.

    To ensure that your customers’ privacy and data are protected, choose a web analytics solution that’s compliant with the latest and most important privacy measures, such as GDPR, LGPD and CCPA.

    Taking a privacy-first approach to data helps ensure your protection from potential legal action or fines.

    Enterprise web analytics best practices

    Want to make sure you get the most out of your web analytics strategy?

    Be clear on what metrics you want to track

    You can track a ton of data in your organisation, but you may not need to. To ensure you’re not wasting time and resources tracking irrelevant numbers, you should make sure you’re clear from day one on the metrics you want to track.

    Start by making a list of key data points relevant to your business.

    For example, if you have an online marketplace, you’ll want to track specific ecommerce metrics like conversion rate, total visits, bounce rates, traffic source, etc.

    Don’t take data at face value

    Numbers alone can’t tell you the whole story of what’s happening in your organisation. It’s crucial you add context to your data, no matter what.

    Dozens of factors could impact your data and visitors’ interactions with your site, so you should always try to look beyond the numbers to see if there are other factors at play.

    For example, you might see that your site traffic is down and think your search engine optimisation (SEO) efforts aren’t working, when in reality there was a major Google algorithm update or some seasonality in a key market.

    On the other hand, you might see some positive signals that things are going well with your organic social media strategy because you saw a large influx of traffic from Instagram. But, there could be more to the story.

    For example, an Instagram influencer with five million followers may have just posted a reel reviewing your product or service without you knowing it, leading to a major traffic spike for your website.

    Remember to annotate your web analytics data where necessary, so that you can later tie each insight back to its context.

    Ensure your data is accurate

    With web analytics, data is everything. It will help you see where your traffic is coming from and how your users are behaving, and it will give you actionable insights into how you can improve your website and user experience.

    But if your data isn’t accurate, your efforts will be futile.

    Accurate data is crucial for launching an effective web analytics strategy. Data sampling and simple tracking errors can lead to inaccurate numbers and misleading conclusions. 

    If a tool relies on cookies to collect data, it’s relying on an incomplete data collection system: cookies give users the option to opt out of tracking, making it challenging to get a clear picture of every user interaction.

    For example, some platforms like Google Analytics use data sampling to make predictions about traffic rather than relying on accurate data collection, leading to inaccurate numbers and conclusions.

    To ensure you’re making decisions based on accurate data, find a solution that doesn’t rely on inaccurate data collection methods like data sampling or cookies.

    Lean on visual data tools to improve analysis

    Enterprise organisations deal with a ton of data. There are endless data points to follow, and it can be easy to lose sight of the bigger picture.

    One of the best ways to interpret your data is to integrate a data visualisation tool, such as Looker or Power BI, with your web analytics solution.

    Make sure your chosen platform lets you export your data easily so you can link it with a visual support tool.

    With Matomo, you can easily export your data into Google BigQuery to warehouse your customer data and visualise it through other tools (without the need for APIs, scripts or additional tools).

    Use advanced web analytics

    Web analytics is quite broad, and different tools will offer various features you can access in your analytics dashboard.

    Take advantage of advanced features that utilise both traditional and behavioural data for deeper insights.

    • Use heatmaps to better understand what parts of your web pages your visitors are focusing on to improve conversion rates.
    • Review session recordings to see the exact steps your customers take as they interact with your website.
    • Conduct A/B tests to see which call to action, headline, or image provides the optimal user experience.

    There are dozens of advanced features available, so take the time to make sure your chosen tool has everything you need.

    Choose a privacy-focused tool

    Obviously, not every tool is created equal, and most of the software on the market isn’t suitable for enterprise businesses.

    As a large organisation, the most important step is to choose a trusted enterprise web analytics tool to ensure it’s capable of fitting within a company of your size.

    It needs to have great infrastructure and be able to handle large amounts of data.

    Another crucial factor is to check that the tool is compatible with your website or app. Does it integrate easily with it? What about your other software? Will it integrate with those as well and fit into your current tech stack?

    Most importantly, you need a platform that can provide the data and insights your organisation needs.

    Make sure the tool you choose is GDPR-compliant and privacy-friendly. The last thing you want is to be sued or fined because you chose the wrong software. 

    Consumers are growing more cautious about privacy and data risks, so picking a privacy-focused tool will help build trust with customers.

    Top 5 enterprise web analytics tools

    Now that you understand enterprise web analytics and how to get the most out of it, it’s time to talk about tools.

    You need to make sure you’re using the right web analytics software to improve productivity, optimise website performance and grow your brand without compromising on the infrastructure required for large organisations to thrive.

    Here are five of the best enterprise solutions available:

    Features and pricing comparison

    Columns: GDPR compliant · On-premise option · 100% data ownership · Traditional analytics · Behavioural analytics · Awarded best enterprise software

    Matomo: ✔️ ✔️ ✔️ ✔️ ✔️ ✔️
    Amplitude: ✔️ ✔️ ✔️
    Adobe: ✔️ ✔️ ✔️
    GA360: ✔️
    Contentsquare: ✔️ ✔️ ✔️ ✔️

    Use Matomo to power your website analytics

    Web analytics help enterprise organisations reach new users, improve engagement with current users or grow their web presence.

    These advanced solutions support cross-organisational data consumption, enhance data privacy and security and allow brands to create the web experiences they know customers will love.

    Matomo can help you unlock the potential of your website strategy with traditional and behavioural analytics and accurate data. Trusted by over 1 million websites, Matomo’s open-source software is an ethical web solution that helps organisations of all sizes improve decision-making and customer experiences without compromising on privacy or security.

    Start your free 21-day trial now. No credit card required.