
Media (91)

Other articles (31)

  • Libraries and binaries specific to video and audio processing

    31 January 2010

    The following software and libraries are used by SPIPmotion in one way or another.
    Required binaries: FFmpeg, the main encoder, which transcodes almost any type of video or audio file into formats playable on the web (see this tutorial for installation); Oggz-tools, inspection utilities for Ogg files; MediaInfo, which retrieves information from most video and audio formats;
    Optional, complementary binaries: flvtool2: (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used instead.
    The HTML5 player used was created specifically for MediaSPIP: it is fully customisable graphically to match a chosen theme.
    These technologies make it possible to deliver video and audio both on conventional computers (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, you need to create a SPIP article and attach the "source" video document to it.
    When this document is attached to the article, two actions beyond the normal behaviour are executed: retrieval of the technical information of the file's audio and video streams; and generation of a thumbnail: extraction of a (...)

On other sites (5104)

  • How to Use Analytics & Reports for Marketing, Sales & More

    28 September 2023, by Erin — Analytics Tips

    By now, most professionals know they should be using analytics and reports to make better business decisions. Blogs and thought leaders talk about it all the time. But most sources don’t tell you how to use analytics and reports. So marketers, salespeople and others either skim whatever reports they come across or give up on making data-driven decisions entirely. 

    But it doesn’t have to be this way.

    In this article, we’ll cover what analytics and reports are and how they differ, and we’ll give you examples of each. Then, we’ll explain how clean data comes into play and how marketing, sales, and user experience teams can use reports and analytics to uncover actionable insights.

    What’s the difference between analytics & reports?

    Many people speak of reports and analytics as if the terms are interchangeable, but they have two distinct meanings.

    A report is a collection of data presented in one place. By tracking key metrics and providing numbers, reports tell you what is happening in your business. Analytics is the study of data and the process of generating insights from data. Both rely on data and are essential for understanding and improving your business results.


    A science experiment is a helpful analogy for how reporting and analytics work together. To conduct an experiment, scientists collect data and results and compile a report of what happened. But the process doesn’t stop there. After generating a data report, scientists analyse the data and try to understand the why behind the results.

    In a business context, you collect and organise data in reports. With analytics, you then use those reports and their data to draw conclusions about what works and what doesn’t.

    Reports examples 

    Reports are a valuable tool for just about any part of your business, from sales to finance to human resources. For example, your finance team might collect data about spending and use it to create a report. It might show how much you spend on employee compensation, real estate, raw materials and shipping.

    On the other hand, your marketing team might benefit from a report on lead sources. This would mean collecting data on where your sales leads come from (social media, email, organic search, etc.). You could collect and present lead source data over time for a more in-depth report. This shows which sources are becoming more effective over time. With advanced tools, you can create detailed, custom reports that include multiple factors, such as time, geographical location and device type.
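
    As a concrete sketch, a lead source report like this can be pulled from Matomo's Reporting API over HTTP; the host, site ID and token below are placeholders:

    curl "https://matomo.example.org/index.php?module=API&method=Referrers.getReferrerType&idSite=1&period=month&date=last6&format=JSON&token_auth=YOUR_TOKEN"

    This returns the referrer-type breakdown (search engines, websites, social networks, campaigns) for each of the last six months, ready to be combined into a custom report.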

    Analytics examples 

    Because analytics means drawing insights from data, and reports are where that data is collected and presented, analytics often begins with studying reports. 

    In our example of a report on lead sources, an analytics professional might study the report and notice that webinars are an important source of leads. To better understand this, they might look closely at the number of leads acquired compared to how often webinars occur. If they notice that the number of webinar leads has been growing, they might conclude that the business should invest in more webinars to generate more leads. This is just one kind of insight analytics can provide.

    For another example, your human resources team might study a report on employee retention. After analysing the data, they could discover valuable insights, such as which teams have the highest turnover rate. Further analysis might help them uncover why certain teams fail to keep employees and what they can do to solve the problem.

    The importance of clean data 

    Both analytics and reporting rely on data, so it’s essential your data is clean. Clean data means you’ve audited your data, removed inaccuracies and duplicate entries, and corrected mislabelled data or errors. Basically, you want to ensure that each piece of information you’re using for reports and analytics is accurate and organised correctly.

    If your data isn’t clean and accurate, neither will your reports be. And making business decisions based on bad data can come at a considerable cost. Inaccurate data might lead you to invest in a channel that appears more valuable than it actually is. Or it could cause you to overlook opportunities for growth. Moreover, poor data maintenance and the poor insights it produces will erode your team’s trust in your reports and analytics.

    The simplest way to maintain clean data is to be meticulous when inputting or transferring data. This can be as simple as ensuring that your sales team fills in every field of an account record. When you need to import or transfer data from other sources, you need to perform quality assurance (QA) checks to make sure data is appropriately labelled and organised. 
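
    A QA pass does not need heavy tooling. As a minimal sketch, assuming a hypothetical leads.csv export whose second column is an email address and whose third is the lead source:

    # flag rows with a missing lead source (column 3) in the hypothetical leads.csv
    awk -F, 'NR > 1 && $3 == "" { print "row " NR ": missing lead source" }' leads.csv

    # list duplicate email addresses (column 2), skipping the header row
    cut -d, -f2 leads.csv | tail -n +2 | sort | uniq -d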

    Another way to maintain clean data is by avoiding cookies. Many web visitors reject cookie consent banners. When this happens, analysts and marketers get no data on those visitors and only see the share of users who accept tracking. Decisions then rest on a smaller sample, which leads to poor or inaccurate data. These banners also create a poor user experience and annoy web visitors.

    Matomo can be configured to run cookieless — which, in most countries, means you don’t need to have an annoying cookie consent screen on your site. This way, you can get more accurate data and create a better user experience.

    Marketing analytics and reports 

    Analytics and reporting help you measure and improve the effectiveness of your marketing efforts. They help you learn what’s working and what you should invest more time and money into. And bolstering the effectiveness of your marketing will create more opportunities for sales.

    One common area where marketing teams use analytics and reports is to understand and improve their keyword rankings and search engine optimization. They use web analytics platforms like Matomo to report on how their website performs for specific keywords. Insights from these reports are then used to inform changes to the website and the development of new content.
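
    For example, the incoming search keywords Matomo records can be pulled straight from its Reporting API into a spreadsheet or dashboard. A sketch with placeholder host, site ID and token:

    curl "https://matomo.example.org/index.php?module=API&method=Referrers.getKeywords&idSite=1&period=month&date=today&format=JSON&token_auth=YOUR_TOKEN"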

    As we mentioned above, marketing teams often use reports on lead sources to understand how their prospects and customers are learning about the brand. They might analyse their lead sources to better understand their audience. 

    For example, if your company finds that you receive a lot of leads from LinkedIn, you might decide to study the content you post there and how it differs from other platforms. You could apply a similar content approach to other channels to see if it increases lead generation. You can then study reporting on how lead source data changes after you change content strategies. This is one example of how analysing a report can lead to marketing experimentation. 

    Email and paid advertising are also marketing channels that can be optimised with reports and analysis. By studying the data around what emails and ads your audience clicks on, you can draw insights into what topics and messaging resonate with your customers.

    Marketing teams often use A/B testing to learn about audience preferences. In an A/B test, you can test two landing page versions, such as two different types of call-to-action (CTA) buttons. Matomo will generate a report showing how many people clicked each version. From those results, you may draw an insight into the design your audience prefers.

    Sales analytics and reports 

    Sales analytics and reports are used to help teams close more deals and sell more efficiently. They also help businesses understand their revenue, set goals, and optimise sales processes. And understanding your sales and revenue allows you to plan for the future.

    One of the keys to building a successful sales strategy and team is understanding your sales cycle. That’s why it’s so important for companies to analyse their lead and sales data. For business-to-business (B2B) companies in particular, the sales cycle can be a long process. But you can use reporting and analytics to learn about the stages of the buying cycle, including how long they take and how many leads proceed to the next step.
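
    Even a flat CRM export is enough for a first pass at this. A toy sketch, assuming a hypothetical pipeline.csv with one row per lead and its current stage:

    # pipeline.csv (hypothetical export): lead_id,stage   where stage is lead, mql, sql or won
    awk -F, 'NR > 1 { n[$2]++ } END { for (s in n) print s ": " n[s] }' pipeline.csv

    Dividing each stage count by the previous one gives a rough stage-to-stage conversion rate for the cycle.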

    Analysing lead and customer data also allows you to gain insights into who your customers are. With detailed account records, you can track where your customers are, what industries they come from, what their role is and how much they spend. While you can use reports to gather customer data, you also have to use analysis and qualitative information in order to build buyer personas. 

    Many sales teams use past individual and business performance to understand revenue trends. For instance, you might study historical data reports to learn how seasonality affects your revenue. If you dive deeper, you might find that seasonal trends may depend on the country where your customers live. 
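
    If your goals and revenue are tracked in Matomo, one way to pull a month-by-month revenue report to scan for seasonal peaks is the Goals API (placeholders as before):

    curl "https://matomo.example.org/index.php?module=API&method=Goals.get&idSite=1&period=month&date=last12&format=JSON&token_auth=YOUR_TOKEN"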


    Conversely, it’s also important to analyse what internal variables are affecting revenue. You can use revenue reports to identify your top-performing sales associates. You can then try to expand and replicate that success. While sales is a field often driven by personal relationships and conversations, many types of reports allow you to learn about and improve the process.

    Website and user behaviour analytics and reports 

    More and more, businesses view their websites as an experience and user behaviour as an important part of their business. And just like sales and marketing, reporting and analytics help you better understand and optimise your web experience. 

    Many web and user behaviour metrics, like traffic source, have important implications for marketing. For example, page traffic and user flows can provide valuable insights into what your customers are interested in. This can then drive future content development and marketing campaigns.

    You can also learn about how your users navigate and use your website. A robust web analytics tool, like Matomo, can supply user session recordings and visitor tracking. For example, you could study which pages a particular user visits. But Matomo also has a feature called Transitions that provides visual reports showing where a particular page’s traffic comes from and where visitors tend to go afterward. 
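
    Transitions reports are available in the Matomo UI and are also exposed over the Reporting API. A hedged sketch for a single page (the host, page URL and token are placeholders):

    curl -G "https://matomo.example.org/index.php" \
      --data-urlencode "module=API" \
      --data-urlencode "method=Transitions.getTransitionsForPageUrl" \
      --data-urlencode "pageUrl=https://example.org/pricing" \
      --data-urlencode "idSite=1" \
      --data-urlencode "period=month" \
      --data-urlencode "date=today" \
      --data-urlencode "format=JSON" \
      --data-urlencode "token_auth=YOUR_TOKEN"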

    As you consider why people might be leaving your website, site performance is another important area for reporting. Most users are accustomed to near-instantaneous web experiences, so it’s worth monitoring your page load time and looking out for backend delays. In today’s world, your website experience is part of what you’re selling to customers. Don’t miss out on opportunities to impress and delight them.
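
    You do not need a full monitoring stack to get a first reading. curl alone can time a page, as in this sketch against a placeholder URL:

    curl -o /dev/null -s -w "DNS: %{time_namelookup}s  TTFB: %{time_starttransfer}s  total: %{time_total}s\n" https://example.org/

    Running it a few times, from more than one location, gives a rough baseline before you dig into backend delays.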

    Dive into your data

    Reporting and analytics can seem like mysterious buzzwords we’re all supposed to understand already. But, like anything else, they require definitions and meaningful examples. When you dig into the topic, though, the applications for reporting and analytics are endless.

    Use these examples to identify how you can use analytics and reports in your role and department to achieve better results, whether that means higher quality leads, bigger deal size or a better user experience.

    To see how Matomo can collect accurate and reliable data and turn it into in-depth analytics and reports, start a free 21-day trial. No credit card required.

  • Cohort Analysis 101: How-To, Examples & Top Tools

    13 November 2023, by Erin — Analytics Tips

    Imagine that a farmer is trying to figure out why certain hens are laying large brown eggs and others are laying average-sized white eggs.

    The farmer decides to group the hens into cohorts based on what kind of eggs they lay to make it easier to detect patterns in their day-to-day lives. After careful observation and analysis, she discovers that the hens laying big brown eggs eat more than the other hens in the flock.

    With this cohort analysis, the farmer deduced that a hen’s body weight directly corresponds to egg size. She can now develop a strategy to increase the body weight of her hens to sell more large brown eggs, which are very popular at the weekly farmers’ market.

    Cohort analysis has a myriad of applications in the world of web analytics. Like our farmer, you can use it to better understand user behaviour and reap the benefits of your efforts. This article will discuss the best practices for conducting an effective cohort analysis and compare the top cohort analysis tools for 2024. 

    What is cohort analysis?

    By definition, cohort analysis refers to a technique where users are grouped based on shared characteristics or behaviours and then examined over a specified period.

    Think of it as a marketing superpower, enabling you to comprehend user behaviours, craft personalised campaigns and allocate resources wisely, ultimately resulting in improved performance and better ROI.

    Why does cohort analysis matter?

    In web analytics, a cohort is a group of users who share a certain behaviour or characteristic. The goal of cohort analysis is to uncover patterns and compare the performance and behaviour of different cohorts over time.

    An example of a cohort is a group of users who made their first purchase during the holidays. By analysing this cohort, you could learn more about their behaviour and buying patterns. You may discover that this cohort is more likely to buy specific product categories as holiday gifts — you can then tailor future holiday marketing campaigns to include these categories. 

    Types of cohort analysis

    There are a few notable types of cohorts:

    1. Time-based cohorts are groups of users categorised by a specific time period, such as the month of their first purchase (see the sketch after this list).
    2. Acquisition cohorts are users acquired during a specific time frame, event or marketing channel. Analysing these cohorts can help you determine the value of different acquisition methods. 
    3. Behavioural cohorts consist of users who show similar patterns of behaviour, such as frequent purchases in your mobile app or engagement with digital content. 
    4. Demographic cohorts share common demographic characteristics like age, gender, education level and income. 
    5. Churn cohorts are buyers who have cancelled a subscription or stopped using your service within a specific time frame. Analysing churn cohorts can help you understand why customers leave.
    6. Geographic cohorts are pretty self-explanatory — you can use them to tailor your marketing efforts to specific regions. 
    7. Customer journey cohorts are based on the buyer lifecycle — from acquisition to adoption to retention. 
    8. Product usage cohorts group buyers by how they use your product or service (think basic users, power users or occasional users). 
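
    Below is a toy sketch of how a time-based cohort table can be tabulated from raw event data, assuming a hypothetical events.csv with one row per user_id,signup_month,active_month:

    # events.csv (hypothetical): user_id,signup_month,active_month
    # count distinct users per (signup cohort, active month) cell
    awk -F, 'NR > 1 { seen[$2 "," $3 "," $1] = 1 }
        END { for (k in seen) { split(k, a, ","); cell[a[1] "," a[2]]++ }
              for (c in cell) print c "," cell[c] }' events.csv | sort

    Each output line is cohort_month,active_month,count, the raw material for a retention matrix.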

    Best practices for conducting a cohort analysis 

    So, you’ve decided you want to understand your user base better but don’t know how to go about it. Perhaps you want to reduce churn and create a more engaging user experience. In this section, we’ll walk you through the dos and don’ts of conducting an effective cohort analysis. Remember that you should tailor your cohort analysis strategy for organisation-specific goals.

    [Image: line graph of product usage cohort data, with a blue line for new users and a green line for power users]

    1. Preparing for cohort analysis:

      • First, define specific goals you want your cohort analysis to achieve. Examples include improving conversion rates or reducing churn.
      • Choosing the right time frame will help you compare short-term vs. long-term data trends. 

    2. Creating effective cohorts:

      • Define your segmentation criteria — anything from demographics to location, purchase history or user engagement level. Narrowing in on your specific segments will make your cohort analysis more precise. 
      • It’s important to find a balance between cohort size and similarity. If your cohort is too small and diverse, you won’t be able to find specific behavioural patterns.

    3. Performing cohort analysis:

      • Study retention rates across cohorts to identify patterns in user behaviour and engagement over time. Pay special attention to cohorts with high retention or churn rates. 
      • Analysing cohorts can reveal interesting behavioural insights: how do specific cohorts interact with your website? Do they have certain preferences? Why? 

    4. Visualising and interpreting data:

      • Visualising your findings can be a great way to reveal patterns. Line charts can help you spot trends, while bar charts can help you compare cohorts.
      • Guide your analytics team on how to interpret patterns in cohort data. Watch for sudden drops or spikes and what they could mean. 

    5. Continue improving:

      • User behaviour is constantly evolving, so be adaptable. Continuous tracking of user behaviour will help keep your strategies up to date. 
      • Encourage iterative optimisation of your analysis based on your findings. 

    The top cohort analysis tools for 2024

    In this section, we’ll go over the best cohort analysis tools for 2024, including their key features, cohort analysis dashboards, cost and pros and cons.

    1. Matomo

    [Screenshot: a cohorts graph in Matomo]

    Matomo is an open-source, GDPR-compliant web analytics solution that offers cohort analysis as a standard feature in Matomo Cloud and is available as a plugin for Matomo On-Premise. Pairing traditional web analytics with cohort analysis will help you gain even deeper insights into understanding user behaviour over time. 

    You can use the data you get from web analytics to identify patterns in user behaviour and target your marketing strategies to specific cohorts. 

    Key features

    • Matomo offers a cohorts table that lets you compare cohorts side-by-side, and it comes with a time series.
      • All core session and conversion metrics are also available in the Cohorts report.
    • Create custom segments based on demographics, geography, referral sources, acquisition date, device types or user behaviour. 
    • Matomo provides retention analysis so you can track how many users from a specific cohort return to your website and when. 
    • Flexibly analyse your cohorts with custom reports. Customise your reports by combining metrics and dimensions specific to different cohorts. 
    • Create cohorts based on events or interactions with your website. 
    • Intuitive, colour-coded data visualisation, so you can easily spot patterns.

    Pros

    • No setup is needed if you use the JavaScript tracker
    • You can fetch cohorts without any limit
    • 100% accurate data: no AI or machine-learning gap-filling and no data sampling

    Cons

    • Matomo On-Premise (self-hosted) is free, but advanced features come with additional charges
    • Servers and technical know-how are required for Matomo On-Premise. Alternatively, for those not ready for self-hosting, Matomo Cloud presents a more accessible option and starts at $19 per month.

    Price:

    • Matomo Cloud: 21-day free trial, then starts at $19 per month (includes Cohorts).
    • Matomo On-Premise: free to self-host; Cohorts plugin: 30-day free trial, then $99 per year.

    2. Mixpanel

    Mixpanel is a product analytics tool designed to help teams better understand user behaviour. It is especially well-suited for analysing user behaviour on iOS and Android apps. It offers various cohort analytics features that can be used to identify patterns and engage your users. 

    Key features

    • Create cohorts based on criteria such as sign-up date, first purchase date, referral source, geographic location, device type or another custom event/property. 
    • Compare how different cohorts engage with your app with Mixpanel’s comparative analysis features.
    • Create interactive dashboards, charts and graphs to visualise data.
    • Mixpanel provides retention analysis tools to see how often users return to your product over time. 
    • Send targeted messages and notifications to specific cohorts to encourage user engagement, announce new features, etc. 
    • Track and analyse user behaviours within cohorts — understand how different types of users engage with your product.

    Pros

    • Easily export cohort analysis data for further analysis
    • Combined with Mixpanel reports, cohorts can be a powerful tool for improving your product

    Cons

    • With the free Mixpanel plan, you can’t save cohorts for future use
    • Enterprise-level pricing is expensive
    • Time-consuming cohort creation process

    Price: Free basic version. The growth version starts at £16/month.

    3. Amplitude

    [Screenshot: a cohorts graph in Amplitude]

    Amplitude is another product analytics solution that can help businesses track user interactions across digital platforms. Amplitude offers a standard toolkit for in-depth cohort analysis.

    Key features

    • Create cohorts based on criteria such as sign-up date, first purchase date, referral source, geographic location, device type or another custom event/property. 
    • Conduct behavioural, time-based and retention analyses.
    • Create custom reports with custom data.
    • Segment cohorts further based on additional criteria and compare multiple cohorts side-by-side.

    Pros

    • Highly customisable and flexible
    • Quick and simple setup

    Cons

    • Steep learning curve — requires significant training 
    • Slow loading speed
    • High price point compared to other tools

    Price: Free basic version. Plus version starts at £40/month (billed annually).

    4. Kissmetrics

    [Screenshot: a cohorts graph in Kissmetrics]

    Kissmetrics is a customer engagement automation platform that offers powerful analytics features. Kissmetrics provides behavioural analytics, segmentation and email campaign automation. 

    Key features

    • Create cohorts based on demographics, user behaviour, referral sources, events and specific time frames.
    • The user path tool provides path visualisation so you can identify common paths users take and spot abandonment points. 
    • Create and optimise conversion funnels.
    • Customise events, user properties, funnels, segments, cohorts and more.

    Pros

    • Powerful data visualisation options
    • Highly customisable

    Cons

    • Difficult to install
    • Not well-suited for small businesses
    • Limited integration with other tools

    Price: Starting at £21/month for 10k events (billed monthly).

    Improve your cohort analysis with Matomo

    When choosing a cohort analysis tool, consider factors such as the tool’s ease of integration with your existing systems, data accuracy, the flexibility it offers in defining cohorts, the comprehensiveness of reporting features, and its scalability to accommodate the growth of your data and analysis needs over time. Moreover, it’s essential to confirm GDPR compliance to uphold rigorous privacy standards. 

    If you’re ready to understand your users’ behaviour, take Matomo for a test drive. Pairing cohort analysis with web analytics is a powerful combination that can advance your marketing efforts. Start your 21-day free trial today — no credit card required.

  • FFmpeg : High pitched audio when converting AVI video to MP4 [closed]

    15 August 2023, by Karen S

    I'm pretty new to FFmpeg and I'm having trouble converting a video from AVI format to MP4. I'm trying to convert pcm_s16le to aac, but the resulting audio is very high-pitched and choppy.

    The AVI video has two audio streams:

    Stream #0:1: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s
Stream #0:2: Audio: pcm_s16le, 32000 Hz, stereo, s16, 1024 kb/s

    In VLC Media Player the video plays fine and it uses the second audio stream.

    When I run ffmpeg -i input.avi -vcodec libx264 -crf 27 output.mp4 in my terminal, the output audio in the MP4 file sounds extremely choppy and the voices are chipmunk-like.
    This is the codec information for the audio stream in VLC after the conversion:

    Codec: MPEG AAC Audio (mp4a)
Type: Audio
Channels: Stereo
Sample rate: 48000 Hz
Bits per sample: 32

    This is the terminal output:

    ffmpeg version 2023-06-11-git-09621fd7d9-full_build-www.gyan.dev Copyright (c) 2000-2023 the FFmpeg developers
  built with gcc 12.2.0 (Rev10, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libvpl --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
  libavutil      58. 13.100 / 58. 13.100
  libavcodec     60. 17.100 / 60. 17.100
  libavformat    60.  6.100 / 60.  6.100
  libavdevice    60.  2.100 / 60.  2.100
  libavfilter     9.  8.101 /  9.  8.101
  libswscale      7.  3.100 /  7.  3.100
  libswresample   4. 11.100 /  4. 11.100
  libpostproc    57.  2.100 / 57.  2.100
[avi @ 000001d1b05ffbc0] Switching to NI mode, due to poor interleaving
Input #0, avi, from 'VG2002-06-10 TAPE_71 001.avi':
  Duration: 00:10:40.64, start: 0.000000, bitrate: 28852 kb/s
  Stream #0:0: Video: dvvideo, yuv411p, 720x480 [SAR 32:27 DAR 16:9], 25000 kb/s, SAR 8:9 DAR 4:3, 60k fps, 29.97 tbr, 60k tbn
  Stream #0:1: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s
  Stream #0:2: Audio: pcm_s16le, 32000 Hz, stereo, s16, 1024 kb/s
File 'VG2002-06-10compress.mp4' already exists. Overwrite? [y/N] y
Stream mapping:
  Stream #0:0 -> #0:0 (dvvideo (native) -> h264 (libx264))
  Stream #0:1 -> #0:1 (pcm_s16le (native) -> aac (native))
Press [q] to stop, [?] for help
[pcm_s16le @ 000001d1b0696680] This decoder does not support parameter changes, but PARAM_CHANGE side data was sent to it.
[pcm_s16le @ 000001d1b0696680] Error applying parameter changes.
[libx264 @ 000001d1b0ab4940] using SAR=8/9
[libx264 @ 000001d1b0ab4940] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 000001d1b0ab4940] profile High 4:2:2, level 3.0, 4:2:2, 8-bit
[libx264 @ 000001d1b0ab4940] 264 - core 164 r3107 a8b68eb - H.264/MPEG-4 AVC codec - Copyleft 2003-2023 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=27.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'VG2002-06-10compress.mp4':
  Metadata:
    encoder         : Lavf60.6.100
  Stream #0:0: Video: h264 (avc1 / 0x31637661), yuv422p(tv, bottom coded first (swapped)), 720x480 [SAR 8:9 DAR 4:3], q=2-31, 29.97 fps, 30k tbn
    Metadata:
      encoder         : Lavc60.17.100 libx264
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
  Stream #0:1: Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 128 kb/s
    Metadata:
      encoder         : Lavc60.17.100 aac
[dvvideo @ 000001d1b0a26f00] Concealing bitstream errors.49 bitrate=1424.2kbits/s speed=2.51x
    Last message repeated 76 times
[dvvideo @ 000001d1b0f61680] Concealing bitstream errors.23 bitrate=1451.5kbits/s speed=2.47x
    Last message repeated 19 times
[out#0/mp4 @ 000001d1b0696c80] video:106080kB audio:6745kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.465412%
frame=19200 fps= 73 q=-1.0 Lsize=  113351kB time=00:10:40.60 bitrate=1449.5kbits/s speed=2.43x
[libx264 @ 000001d1b0ab4940] frame I:99    Avg QP:26.85  size: 42388
[libx264 @ 000001d1b0ab4940] frame P:4922  Avg QP:29.69  size: 13724
[libx264 @ 000001d1b0ab4940] frame B:14179 Avg QP:33.82  size:  2601
[libx264 @ 000001d1b0ab4940] consecutive B-frames:  1.2%  0.8%  1.1% 97.0%
[libx264 @ 000001d1b0ab4940] mb I  I16..4: 12.2% 77.3% 10.4%
[libx264 @ 000001d1b0ab4940] mb P  I16..4:  1.7%  8.1%  1.4%  P16..4: 39.9% 16.6% 12.2%  0.0%  0.0%    skip:20.1%
[libx264 @ 000001d1b0ab4940] mb B  I16..4:  0.3%  0.6%  0.1%  B16..8: 38.6%  3.8%  1.1%  direct: 3.3%  skip:52.2%  L0:42.2% L1:49.3% BI: 8.5%
[libx264 @ 000001d1b0ab4940] 8x8 transform intra:71.6% inter:71.6%
[libx264 @ 000001d1b0ab4940] coded y,uvDC,uvAC intra: 72.3% 66.8% 11.7% inter: 17.3% 10.6% 0.2%
[libx264 @ 000001d1b0ab4940] i16 v,h,dc,p: 11% 64%  7% 17%
[libx264 @ 000001d1b0ab4940] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 11% 26% 24%  5%  6%  5%  9%  5%  9%
[libx264 @ 000001d1b0ab4940] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu:  8% 52% 13%  3%  5%  4%  7%  3%  6%
[libx264 @ 000001d1b0ab4940] i8c dc,h,v,p: 58% 21% 17%  5%
[libx264 @ 000001d1b0ab4940] Weighted P-Frames: Y:3.5% UV:1.3%
[libx264 @ 000001d1b0ab4940] ref P L0: 51.9% 15.0% 23.5%  9.3%  0.3%
[libx264 @ 000001d1b0ab4940] ref B L0: 87.5%  9.7%  2.9%
[libx264 @ 000001d1b0ab4940] ref B L1: 95.1%  4.9%
[libx264 @ 000001d1b0ab4940] kb/s:1356.46
[aac @ 000001d1b0a6ab80] Qavg: 517.695

    FFmpeg automatically selects the first audio stream to encode, but then this error comes up:

    [pcm_s16le @ 000001d1b0696680] This decoder does not support parameter changes, but PARAM_CHANGE side data was sent to it.

    To bypass this error, I've edited the FFmpeg command so that the second stream is selected instead for aac encoding, but then the resulting MP4 video is silent.
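
    The command for that attempt was roughly as follows (a sketch; -map 0:a:1 selects the second audio stream):

    ffmpeg -i input.avi -map 0:v:0 -map 0:a:1 -vcodec libx264 -crf 27 -c:a aac output.mp4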

    Does anyone know why FFmpeg is unable to convert the original AVI audio to aac without it sounding higher-pitched? I think the sample rate of the first stream might be set incorrectly, but I'm not sure how to fix it so that the encoder reads it properly.