
Other articles (111)

  • Customizing categories

    21 June 2013

    Category creation form
    For those who know SPIP well, a category can be thought of as a section (rubrique).
    For a category-type document, the fields offered by default are: Text
    This form can be changed under:
    Administration > Configuration of form masks.
    For a media-type document, the fields not displayed by default are: Short description
    It is also in this configuration area that you can specify the (...)

  • Automatic installation script for MediaSPIP

    25 April 2011

    To work around installation difficulties, mainly due to server-side software dependencies, an all-in-one bash installation script was created to make this step easier on a server running a compatible Linux distribution.
    To use it, you need SSH access to your server and a "root" account, which allows the dependencies to be installed. Contact your hosting provider if you do not have these.
    The documentation for using the installation script (...)

  • Enabling visitor registration

    12 April 2011

    It is also possible to enable visitor registration, which lets anyone open an account on the channel in question, for example as part of an open project.
    To do so, go to the site configuration area and choose the "User management" submenu. The first form shown corresponds to this feature.
    By default, MediaSPIP created a menu item during its initialization in the top menu of the page, leading (...)

On other sites (6651)

  • Is there a way to use ffmpeg audio filters to automatically synchronize 2 streams with similar content

    29 May 2015, by user3741412

    I have a situation where I have a video capture of HD content via HDMI, with audio from a sound board that goes through an impedance drop into the microphone input of a camcorder. That same signal is split at line level to a ’line in’ jack on the same computer that is capturing the HDMI. Alternatively, I can capture the audio via USB from the soundboard, which is probably the best plan, but it carries the same issue.

    The point is that the line-in or USB capture will be much higher quality than the one on HDMI, because the line out -> impedance change -> mic in path produces inferior quality: simply brushing the mic jack on the camera while trying to change the zoom (close proximity) can put noise on the recording.

    So I can do this today:

    • Take the good sound and the camera-captured sound, load each into
      Audacity, and fairly quickly use the timeshift tool to fit the good
      audio exactly to the questionable audio from the HDMI capture, then
      cut the good audio to the exact length of the video. I can then use
      ffmpeg or other video editing software to replace the questionable
      audio with the better audio (a sketch of that last step follows).
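
    That last replacement step is easy to script. Below is a minimal sketch of the mux, driving ffmpeg from Python; the file names are placeholders, and it assumes the good audio has already been trimmed to the video's length:

      # Replace the capture's audio with the better track (file names are
      # hypothetical). "-c:v copy" leaves the video stream untouched.
      import subprocess

      subprocess.run([
          "ffmpeg", "-i", "hdmi_capture.mpg", "-i", "good_audio.wav",
          "-map", "0:v",        # video from the HDMI capture
          "-map", "1:a",        # audio from the good recording
          "-c:v", "copy", "-c:a", "aac",
          "-shortest",          # stop at the end of the shorter input
          "output.mp4",
      ], check=True)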

    But while somewhat quick and easy, it always carries with it a bit of human error and time. I’d like to automate this if possible as this process is repeated at least weekly throughout the year.

    Does anyone have a suggestion as to whether any of these ideas have merit, or could suggest another approach?

    1. I suspect, but have yet to confirm, that the system timestamp of the start time may be recorded both in audio captured with something like Audacity (or with the USB capture tool from the sound board) and in the HDMI MPEG-2 video. I tried ffprobe on a couple of Audacity-captured .wav files but didn’t see anything in the results about such a time code; perhaps other audio formats or other probing tools include this info. Can anyone advise whether this is common with any particular capture tools or file formats?

      • if so, I think I could get the best results by extracting this information and then using the simple adelay and atrim filters in ffmpeg to sync reliably, directly from the two sources, in one ffmpeg call. This is all theoretical for me right now; I’ve never tried either of these filters, and I’m just trying to avoid blind alleys by asking for advice up front (see the sketch after this list).
    2. If such timestamps are not embedded, possibly I can use the file-system timestamp for the same idea expressed in 1a, but I suspect the file-open step of the two capture tools may have different inherent delays. Possibly these delays will turn out to be nearly constant, and the approach could work with a built-in constant anticipation delay, but that sounds messy and less reliable than idea 1. Still, I’d take it if it turns out reasonably reliable.

    3. Are there any ffmpeg or general digital-audio experts out there who know of filters that can be applied to the actual data to look for similarities? For example: normalize the peak amplitudes (or normalize both streams to some RMS value), then step through a short 10-second snippet of audio, repeatedly shifting one stream 0.01 s against the other, subtracting the two, and looking for a minimum. It sounds like it could take a while, but if it could do this in less than a minute and be reliable, I suspect it could work. I have only rudimentary knowledge of audio streams, and perhaps what I suggest is simply not plausible, but since each stream starts from the same source I think there should be a chance. I am just way out of my depth as to how to go down this road, so if someone out there knows such magic or can throw me some names of filters and example calls, I can explore whether I can make it work (a cross-correlation sketch follows this list).

    4. Any hardware-level suggestions for taking a line-level output down to a mic-level input without the problems I am seeing with a simple in-line impedance-drop module, so that I can simply rely on the audio from the HDMI?
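
    For ideas 1 and 3, here is a rough, untested sketch; everything in it is an assumption to verify, not a known-good recipe. On idea 1: when a container does carry an embedded creation timestamp, ’ffprobe -show_entries format_tags=creation_time’ will print it, though plain .wav files typically carry none, which matches what you saw. On idea 3: the standard signal-processing tool for this is cross-correlation. The sketch downsamples both recordings to mono with ffmpeg, estimates the lag from the peak of the cross-correlation of a short snippet, and then applies atrim or adelay before muxing. It assumes numpy, scipy, and an ffmpeg binary on the PATH; all file names are placeholders.

      # Hypothetical sketch: estimate the offset between the two recordings
      # by cross-correlation, then shift the good audio and mux it over the
      # video. Assumes numpy, scipy, ffmpeg; file names are placeholders.
      import subprocess
      import numpy as np
      from scipy.io import wavfile
      from scipy.signal import fftconvolve

      RATE = 8000  # mono 8 kHz is plenty for alignment, keeps the FFT small

      def to_mono_wav(src, dst):
          # Decode/downsample any input to a small mono WAV for analysis.
          subprocess.run(["ffmpeg", "-y", "-i", src, "-ac", "1",
                          "-ar", str(RATE), dst], check=True)

      def find_offset(good_wav, hdmi_wav, window_s=10):
          _, a = wavfile.read(good_wav)   # the clean line-in/USB capture
          _, b = wavfile.read(hdmi_wav)   # audio from the HDMI capture
          n = window_s * RATE
          a = a[:n].astype(np.float64)
          b = b[:n].astype(np.float64)
          a /= max(np.abs(a).max(), 1.0)  # crude peak normalization
          b /= max(np.abs(b).max(), 1.0)
          # Full cross-correlation via FFT; the peak index gives the lag.
          xcorr = fftconvolve(a, b[::-1])
          lag = int(xcorr.argmax()) - (len(b) - 1)
          # Positive: the good recording started earlier (extra lead-in).
          return lag / RATE

      def mux(video, good_audio, offset_s, out):
          if offset_s >= 0:  # trim the good audio's extra lead-in
              afilter = "atrim=start=%.3f,asetpts=PTS-STARTPTS" % offset_s
          else:              # good audio starts late: pad with silence
              ms = int(-offset_s * 1000)
              afilter = "adelay=%d|%d" % (ms, ms)
          subprocess.run(["ffmpeg", "-y", "-i", video, "-i", good_audio,
                          "-filter_complex", "[1:a]%s[a]" % afilter,
                          "-map", "0:v", "-map", "[a]",
                          "-c:v", "copy", "-c:a", "aac", "-shortest", out],
                         check=True)

      # Example usage (hypothetical file names):
      #   to_mono_wav("good.wav", "good_8k.wav")
      #   to_mono_wav("capture.mpg", "hdmi_8k.wav")
      #   off = find_offset("good_8k.wav", "hdmi_8k.wav")
      #   mux("capture.mpg", "good.wav", off, "synced.mp4")

    One caveat: this assumes the two devices’ sample clocks agree, so a single constant shift is enough; if the lag drifts over a long recording, a resampling filter such as aresample=async=1 might be needed on top.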

    Thanks in advance for any pointers or suggestions!

  • Matomo Launches Global Partner Programme to Deepen Local Connections and Champion Ethical Analytics

    25 June, by Matomo Core Team — Press Releases

    Matomo introduces a global Partner Programme designed to connect organisations with trusted local experts, advancing its commitment to privacy, data sovereignty, and localisation.

    Wellington, New Zealand, 25 June 2025. Matomo, the leading web analytics platform, is proud to announce the launch of the Matomo Partner Programme. This new initiative marks a significant step in Matomo’s global growth strategy, bringing together a carefully selected network of expert partners to support customers with localised, high-trust analytics services rooted in shared values.

    As privacy concerns rise and organisations seek alternatives to mainstream analytics solutions, the need for regional expertise has never been more vital. The Matomo Partner Programme ensures that customers around the world are supported not just by a world-class platform, but by trusted local professionals who understand their specific regulatory, cultural, and business needs.

    “Matomo is evolving. As privacy regulations become more nuanced and the need for regional
    understanding grows, we’ve made localisation a central pillar of our strategy. Our partners are
    the key to helping customers navigate these complexities with confidence and care,” said
    Adam Taylor, Chief Operating Officer at Matomo.

    Local Experts, Global Values

    At the heart of the Matomo Partner Programme is a commitment to connect clients with local experts who live and breathe their markets. These partners are more than service providers; they are trusted advisors who bring deep insight into their region’s privacy legislation, cultural norms, sector-specific requirements, and digital trends.

    The programme empowers partners to act as extensions of Matomo’s core teams:

    • As Customer Success allies, delivering personalised training, support, and technical services in local languages and time zones.
    • As Sales ambassadors, raising awareness of ethical analytics in both the public and private sectors, where trust, compliance, and transparency are crucial.

    This decentralised, values-aligned approach ensures that every Matomo customer benefits from localised delivery with global consistency.

    A Programme Designed for Impactful Partnerships

    The Matomo Partner Programme is open to organisations that share a commitment to ethical, open-source analytics and can demonstrate:

    • Technical excellence in deploying, configuring, and supporting Matomo Analytics in diverse environments.
    • Deep market understanding, allowing them to tell the Matomo story in ways that resonate locally.
    • Commercial strength to position Matomo across key industries, particularly in sectors with complex compliance and data sovereignty demands.

    Partners who meet these standards will be recognised as ‘Official Matomo Partners’, a symbol of excellence, credibility, and shared purpose. With this status, they gain access to:

    • Brand alignment and trust: strengthen credibility with clients by promoting their connection to Matomo and its globally respected ethical stance.
    • Go-to-market support: access to qualified leads, joint marketing, and tools to scale their business in a privacy-first market.
    • Strategic collaboration: early insights into the product roadmap and direct engagement with Matomo’s core team.
    • Meaningful local impact: help regional organisations reclaim control of their data and embrace ethical analytics with confidence.

    Ethical Analytics for Today’s World

    Matomo was founded in 2007 with the belief that people should have full control over their data. As the first open-source web analytics platform of its kind, Matomo continues to challenge the dominance of opaque, centralised tools by offering a transparent and flexible alternative that puts users first.

    In today’s landscape, marked by increased regulatory scrutiny, data protection concerns, and rapid advancements in AI, Matomo’s approach is more relevant than ever. Open-source technology provides the adaptability organisations need to respond to local expectations while reinforcing digital trust with users.

    Whether it’s a government department, healthcare provider, educational institution, or commercial business, Matomo partners are on the ground, ready to help organisations transition to analytics that are not only powerful but principled.

  • Optical Drive Value Proposition

    28 August 2010, by Multimedia Mike — General

    I have the absolute worst luck in the optical drive department. Ever since I started building my own computers in 1995 — close to the beginning of the CD-ROM epoch — I have burned through a staggering number of optical drives. Seriously, especially in the time period between about 1995-1998, I was going through a new drive every 4-6 months or so. This was also during that CD-ROM speed race where the drive packages kept advertising loftier ‘X’ speed ratings. I didn’t play a lot of CD-ROM games during that timeframe, though I did listen to quite a few audio CDs through the computer.



    I use “optical drive” as a general term to describe CD-ROM drives, CD-R/RW drives, DVD-ROM drives, DVD-R/RW drives, and drives capable of doing any combination of reading and writing CDs and DVDs. In my observation, optical media seems to be falling out of favor somewhat, giving way to online digital distribution for things like games and software, as well as flash drives and external hard drives vs. recordable or rewritable media for backup and sneakernet duty. Somewhere along the line, I started to buy computers that didn’t even have optical drives. That’s why I have purchased at least 2 external USB drives (seen in the picture above). I don’t have much confidence that either works correctly. My main desktop until recently, a Mac Mini, has an internal optical drive that grew flaky and unreliable a few months after the unit was purchased.

    I just have really rotten luck with optical drives. The most reliable drive in my house is the one on the headless machine that, until recently, was the main workhorse on the FATE farm. The eject switch doesn’t work correctly, so I have to log in remotely, run 'sudo eject', walk to the other room, pop in the disc, walk back, and work with the disc.

    Maybe optical media is on its way out, but I still have many hundreds of CD-ROMs. Perhaps I should move forward on this brainstorm to archive all of my optical discs on hard drives (and then think of some data mining experiments, just for the academic appeal) before it’s too late; optical discs don’t last forever.

    So if I needed a good optical drive, what should I consider? I’ve always been the type to go cheap, I admit. Many of my optical drives were on the lower end of the cost spectrum, which might have played some role in their rapid replacement. However, I’m not sold on the idea that I’m getting quality just because I’m paying a higher price. That LG unit at the top of the pile up there was relatively pricey and still didn’t fare well in the long (or even medium) term.

    Come to think of it, I used to have a ridiculous stockpile of castoff (but somehow still functional) optical drives. So many, in fact, that in 2004 I had a full size PC tower that I filled with 4 working drives, just because I could. Okay, I admit that there was a period where I had some reliable drives.

    That might be an idea, actually: throw together such a computer for heavy-duty archival purposes. I visited Weird Stuff Warehouse today (needed some PC100 RAM for an old machine and they came through) and I think I could put together such a box rather cheaply.

    It’s a dirty job, but… well, you know the rest.