
Media (91)
-
Richard Stallman and free software
19 October 2011
Updated: May 2013
Language: French
Type: Text
-
Stereo master soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Audio
-
Elephants Dream - Cover of the soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Image
-
#7 Ambience
16 October 2011
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011
Updated: February 2013
Language: English
Type: Audio
Other articles (51)
-
Changing your graphical theme
22 February 2011
The graphical theme does not alter the actual layout of elements on the page; it only changes their appearance.
The placement can indeed be modified, but this modification is purely visual and does not affect the semantic representation of the page.
Changing the graphical theme in use
To change the graphical theme in use, the zen-garden plugin must be enabled on the site.
You then simply go to the configuration area of the (...) -
Authorisations overridden by plugins
27 April 2010
MediaSPIP core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page -
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows media playback on major mobile platforms with the above (...)
On other sites (7246)
-
Why Matomo is the top Google Analytics alternative
17 June, by Joe
You probably made the switch to Google Analytics 4 (GA4) when Google stopped collecting Universal Analytics (UA) data in July 2023. Up to that point, UA had long been the default analytics platform, despite its many limitations.
This was mostly because everyone loved its free nature and simple setup. A Google account was all you needed — even a free legacy G-Suite account worked perfectly. Looking at the analytics for just about any website was easy.
That all changed with GA4, which addressed many of UA’s shortcomings by introducing a completely new way to model website data. Unfortunately, this also meant you couldn’t transfer historical data from UA into GA4, leading to more criticism.
Then there’s the added cost. GA4 is still free, but its limited functionality encourages you to upgrade to the enterprise version, Google Analytics 360 (GA360). Sure, you get lots of great functionality, less data sampling, and longer data retention periods, but it comes at a hefty price — $50,000 per year, to be exact.
There are other options, though, and Matomo Analytics is one of the best. It’s an open-source, privacy-centric platform that offers advanced features of GA360 and more.
In this article, we’ll compare GA4, GA360, and Matomo and give you what you need to make an informed decision.
Google Analytics 4 in a nutshell
Google Analytics 4 is a great tool for getting started with web analytics. But soon enough, you’ll likely find that GA4 doesn’t quite cover all of your needs.
For example, it can’t provide a detailed view of user experiences, and Google doesn’t offer dedicated support or onboarding. There are other shortcomings, too.
Data sampling
Google only processes a selected sample of website activity rather than every individual data point. Rather than looking at the whole picture, it sets a threshold and selects a [hopefully] representative sample for analysis.
This inevitably creates gaps in data. Google attempts to fill them in using AI and machine learning, inferring the rest from data patterns. Since the results rely on assumptions and estimates, they aren’t always precise.
In practical terms, this means that the accuracy of GA4 analysis will likely decline as website traffic increases.
Data collection limits
GA4’s 25 million monthly events limit seems like a lot, but they add up quickly.
All user interactions are recorded as events, including (a minimal event payload is sketched after this list):
- Session start: User visits the site.
- Page view: User loads a page (tracked automatically).
- First visit: User accesses the site for the first time.
- User engagement: User stays on a page for a set time period.
- Scroll: User scrolls past 90% of the page (enhanced measurement).
- Click: User clicks on any element (links, buttons, etc.).
- Video start/complete: User starts or completes a video (enhanced measurement).
- File download: User downloads a file (enhanced measurement).
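These built-in events are collected automatically by the JavaScript tag, and any custom events you send server-side count against the same quota. As a rough illustration (not an official snippet; the measurement ID, API secret and client ID below are placeholders), a single custom event posted through GA4's Measurement Protocol looks something like this:

import json
import urllib.request

# Placeholder credentials: substitute your own GA4 measurement ID and API secret.
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your_api_secret"

payload = {
    "client_id": "555.1234567890",    # any stable pseudonymous client ID
    "events": [
        {
            "name": "file_download",  # counts as one event toward the monthly cap
            "params": {"file_name": "report.pdf", "file_extension": "pdf"},
        }
    ],
}

req = urllib.request.Request(
    "https://www.google-analytics.com/mp/collect"
    f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
urllib.request.urlopen(req)  # the endpoint returns an empty 2xx response on success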
For context, consider a website averaging 50 events per session per user. If every user visits every third day, on average, roughly 50,000 monthly visitors are enough to reach that 25 million; the quick calculation below spells it out.
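Here is that back-of-the-envelope calculation spelled out (illustrative figures only):

# Rough check of how quickly GA4's 25M monthly event cap is reached.
events_per_session = 50
sessions_per_user_per_month = 30 / 3          # one visit every third day, about 10 sessions
events_per_user_per_month = events_per_session * sessions_per_user_per_month   # 500
monthly_event_cap = 25_000_000

users_to_hit_cap = monthly_event_cap / events_per_user_per_month
print(users_to_hit_cap)   # 50000.0 monthly visitors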
But that’s not the problem. The problem is that collection limits in GA4 affect your ability to capture, secure, and analyse customer data effectively.
Customisation
GA4 users also face configuration limits that restrict their customisation options. For example:
- Audience limits: Since only 100 audiences are allowed, it’s necessary to combine or optimise segments rather than track too many small groups.
- Retention limits: Data retention is limited to only 14 months, so external storage solutions may be necessary in situations where historical data needs to be preserved.
- Conversion events: GA4 will only track up to 30 conversion events, so it’s best to focus on high-value interactions (e.g., purchases and lead form submissions).
- Event-scoped dimensions: Since e-commerce operations are limited to 50 event-scoped dimensions, they need to carefully consider custom dimensions and key metrics. This makes it important to be selective about which product details to track (colour, size, discount code, etc.).
Data privacy
GA4 isn’t GDPR-compliant out of the box. In fact, Google Analytics 4 is effectively banned in seven EU countries, where data protection authorities have found that the way it collects and transfers data violates GDPR.
Data privacy regulations may or may not be a big concern, depending on where your customers are. However, if some are in the UK or any of the 30 countries that make up the European Economic Area (EEA), you must comply with the General Data Protection Regulation (GDPR).
Failing to comply tells your customers that you don’t respect their data. It can also get very expensive.
Limited attribution models
Attribution models track how different marketing touchpoints lead to a conversion (such as a purchase, sign-up, or lead generation). They help businesses understand which marketing channels and strategies are most effective in driving results.
GA4 supports only two of the six standard attribution models previously supported in Universal Analytics. Organisations wanting data-driven or last-click attribution will find them in Google Analytics, but they’ll need to look elsewhere if they want to use any of these models (a toy illustration of how they split credit follows the list):
- First click attribution
- Linear attribution
- Time decay attribution
- Position-based attribution (u-shaped)
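To make the difference concrete, here is a toy sketch (not Matomo’s or Google’s implementation) of how a single conversion’s credit is split across an ordered list of touchpoints under a few of these models; time decay is omitted because it also needs timestamps:

# Illustrative credit allocation for one conversion across ordered marketing touchpoints.
def attribute(touchpoints, model):
    n = len(touchpoints)
    credit = {t: 0.0 for t in touchpoints}
    if model == "first_click":
        credit[touchpoints[0]] += 1.0
    elif model == "last_click":
        credit[touchpoints[-1]] += 1.0
    elif model == "linear":
        for t in touchpoints:
            credit[t] += 1.0 / n
    elif model == "position_based":
        # U-shaped: 40% to the first touch, 40% to the last, 20% spread over the middle.
        credit[touchpoints[0]] += 0.4
        credit[touchpoints[-1]] += 0.4
        for t in touchpoints[1:-1]:
            credit[t] += 0.2 / (n - 2)
    return credit

journey = ["organic search", "newsletter", "paid ad"]   # assumes at least three touchpoints
for model in ("first_click", "last_click", "linear", "position_based"):
    print(model, attribute(journey, model))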
GA360 isn’t a solution either
Fundamentally, GA360 is the same product as GA4, without the above limits and restrictions. For companies that pay $50,000 (or more) each year, the only changes are how much data is collected, how long it is retained, and where the data sampling thresholds sit.
Above all, the GDPR-compliance issue remains. That can be a real problem for organisations with operations that collect personal data in the EEA or the UK.
And the problem could soon be much bigger than just those 31 countries. Many countries currently implementing data privacy laws are modelling their efforts on GDPR, which may rule out both GA4 and GA360.
What makes Matomo the top alternative?
One way to overcome all these challenges is to switch to Matomo Analytics.
No data limits
There’s no data sampling and no data collection limits whatsoever with an on-premise implementation. Matomo also supports all six attribution models, is open source and fully customisable, and complies with GDPR out of the box.
Imagine trying to change your business strategy or marketing campaigns if you’re not confident that your data is reliable and accurate.
It’s no secret that data sampling can negatively affect the accuracy of the data, and inaccurate data can lead to poor decision-making.
With Matomo, there are no limits. We don’t restrict the size of containers within the Tag Manager nor the number of containers or tags within each container. You have more control over your customers’ data.
And you get to make your decisions based on all that data. That’s important because data quality is critical for high-impact decisions.
Open source
Open-source software allows anyone to inspect, audit, and improve the source code for security and efficiency. That means no hidden data collection, faster bug fixes, and no vendor lock-in. As a bonus, these things make complying with data privacy laws and regulations easier.
Matomo can also be modified in any way, which provides unlimited customisation possibilities. There’s also a very active developer community around Matomo, so you don’t have to make changes yourself — you can hire someone who has the technical knowledge and expertise. They can (see the sketch after this list):
- Modify tracking scripts for advanced analytics
- Create custom attribution models, tracking methods and dashboards
- Integrate Matomo with any system (CRM, eCommerce, CMS, etc.)
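For instance, here is a minimal sketch of recording an event server-side through Matomo’s HTTP Tracking API (the matomo.php endpoint); the base URL, site ID, product and user IDs are placeholders for your own instance:

import random
import urllib.parse
import urllib.request

# Placeholder endpoint and site ID: substitute your own Matomo instance.
MATOMO_URL = "https://analytics.example.com/matomo.php"
SITE_ID = 1

params = {
    "idsite": SITE_ID,
    "rec": 1,                              # required for the hit to be recorded
    "apiv": 1,
    "url": "https://shop.example.com/checkout",
    "action_name": "Checkout",
    "e_c": "Ecommerce",                    # event category
    "e_a": "order_placed",                 # event action
    "e_v": 59.90,                          # event value
    "uid": "customer-42",                  # optional: tie the hit to a known user
    "rand": random.randint(0, 2**31),      # cache buster
}

urllib.request.urlopen(MATOMO_URL + "?" + urllib.parse.urlencode(params))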
Data ownership
Matomo’s open-source nature also means full data ownership. No third parties can access the data, and there’s no risk of Google using that data for ads or AI training. Furthermore, Matomo follows privacy-first tracking principles, meaning that there’s:
- No third-party data sharing
- Full user consent control
- Support for cookie-less tracking
- IP anonymisation by default
- Do Not Track (DNT) support
All of that underlines the fact that Matomo collects, stores, and tracks data 100% ethically.
On-premise and cloud-based options
You can use the Matomo On-Premise web analytics solution if local data privacy laws require that you store data locally. Here’s a helpful tip: many of them do. However, this might not be necessary.
Due to GDPR, several countries recognise the EEA as an acceptable storage location for their citizens’ data. That means servers hosted in any of those 30 countries are already compliant in terms of data location.
Alternatively, you could embrace modernity and choose Matomo Cloud — our servers are also in Europe. While GA4 and GA360 are cloud-based, Google’s servers are in the US, and that’s a big problem for GDPR.
Comprehensive analytics
If you need a sophisticated web analytics platform that offers full control of your data and you have privacy concerns, Matomo is a solid choice.
It has built-in behavioural analytics features like Heatmaps, Scroll Depth and Session Recording. These tools allow you to collect and analyse data without relying on cookies or resorting to data sampling.
Those standout features can’t be found in GA4 or GA360. Google also doesn’t offer an on-premise solution.
The one area where Matomo can’t compete with Google Analytics is in its tight integration with the Google ecosystem: Google Ads, Gemini and Firebase.
Key things to consider before switching to Matomo
There are pros and cons to switching from GA4 (or even GA360) to Matomo. That’s because no software is perfect. There are always tradeoffs somewhere. With Matomo, there are a few things to consider before switching :
- Learning curve. Matomo is a full-featured analytics platform with many advanced features (session replay, custom event tracking, etc.). That can overwhelm new users and take time to understand well enough to maximise the benefits.
- Technical resources. Choosing a Matomo On-Premise solution requires technical resources, such as a server and the skills to run it.
- Third-party integration. Matomo provides pre-built integration tools for about a hundred platforms. Going beyond those requires technical resources, but because it’s open source, it is possible to add to the list of APIs and connectors.
Head-to-head: GA4 vs GA360 vs Matomo
It’s always helpful to look at how different products stack up in terms of features and capabilities:

| Feature | GA4 | GA360 | Matomo |
| --- | --- | --- | --- |
| Data ownership | | | ✔ |
| Event-based data | ✔ | ✔ | ✔ |
| Session-based data | | | ✔ |
| Unsampled data | | | ✔ |
| Real-time data | ✔ | ✔ | ✔ |
| Heatmaps | | | ✔ |
| Session recordings | | | ✔ |
| A/B testing | | | ✔ |
| Open source | | | ✔ |
| On-premise hosting | | | ✔ |
| Data privacy | Subject to Google’s data policies | Subject to Google’s data policies | GDPR, CCPA compliant; full control over data storage |
| Custom dimensions | Yes (limited in free version) | Yes (higher limits) | Yes (unlimited in self-hosted) |
| Attribution models | Last click, data-driven | Last click, data-driven, advanced Google Ads integration | Last click, first click, linear, time decay, position-based, custom |
| Data retention | Up to 14 months (free) | Up to 50 months | Unlimited (self-hosted) |
| Integrations | Google Ads, Search Console, BigQuery (limited in free version) | Advanced integrations (Google Ads, BigQuery, Salesforce, etc.) | 100+ integrations (Google Ads, WordPress, Shopify, etc.) |
| BigQuery export | Free (limited to 1M events/day) | Free (unlimited) | Paid add-on (via plugin) |
| Custom reports | Limited customisation | Advanced customisation | Fully customisable |
| Scalability | Suitable for small to medium businesses | Designed for large enterprises | Scalable without limits (self-hosted or cloud) |
| Ease of use | Simple, requires onboarding | Steeper learning curve | Flexible, setup-intensive |
| Pricing | Free | Premium (starts at $50,000/year) | Free open-source (self-hosted); Cloud starts at $29/month |

So, is Matomo the right solution for you?
That’d be a ‘yes’ if you want a Google Analytics alternative that ticks all these boxes:
- Complies natively with privacy laws and regulations
- Offers real-time data and custom event tracking
- Enables a deeper understanding of user behaviour
- Allows you to fine-tune user experiences
- Provides full control over your customers’ data
- Offers conversion funnels, session recordings and heatmaps
- Has session replay to trace user interactions
- Includes plenty of readily actionable insights
Find out why millions of websites trust Matomo
Matomo is an easy-to-use, all-in-one web analytics tool with advanced behavioural analytics functionality.
It’ll also help you future-proof your business because it supports compliance with global privacy laws in 162 countries. With an ethical alternative like Matomo, you don’t need to risk your business or customers’ private data.
It’s not just about avoiding fines. It’s also about building trust with your customers. That’s why you need a privacy-focused, ethical solution like Matomo.
See for yourself: download Matomo On-Premise today, or start your 21-day free trial of Matomo Cloud (no credit card required).
-
VideoWriter doesn’t work using OpenCV, Ubuntu, Qt
25 January 2023, by underflow223
My code:


#include <opencv2/opencv.hpp>

// strFile (a QString), nfps and nresize are set elsewhere in the Qt application.
cv::VideoWriter writer(
    strFile.toStdString(),
    cv::VideoWriter::fourcc('m','p','4','v'),   // request an MPEG-4 encoder
    nfps,
    cv::Size(1920/nresize, 1080/nresize)
);
// writer.isOpened() returns false when the encoder cannot be configured.



Error message:


[mpeg4_v4l2m2m @ 0x7f50a43c50] arm_release_ver of this libmali is 'g6p0-01eac0', rk_so_ver is '7'.
Could not find a valid device
[mpeg4_v4l2m2m @ 0x7f50a43c50] can't configure encoder



If I use the MJPG codec, it works fine though.
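For reference, one way to detect this failure programmatically and fall back to MJPG (which works here) is to check isOpened() after constructing the writer. The sketch below uses the Python cv2 bindings purely for brevity; the C++ API is analogous, and the path, fps and frame size are placeholders:

import cv2

def open_writer(base_path, fps, size):
    """Try mp4v first; fall back to MJPG if the backend cannot configure the encoder."""
    for fourcc_name, ext in (("mp4v", ".mp4"), ("MJPG", ".avi")):
        writer = cv2.VideoWriter(
            base_path + ext, cv2.VideoWriter_fourcc(*fourcc_name), fps, size
        )
        if writer.isOpened():          # False when the encoder could not be opened
            print("using", fourcc_name)
            return writer
        writer.release()
    raise RuntimeError("no usable codec found")

# writer = open_writer("/tmp/out", 30.0, (1920, 1080))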


This is the OpenCV configure info:


-- General configuration for OpenCV 4.6.0 =====================================
-- Version control: unknown
-- 
-- Extra modules:
-- Location (extra): /home/firefly/Downloads/opencv_contrib-4.6.0/modules
-- Version control (extra): unknown
-- 
-- Platform:
-- Timestamp: 2023-01-19T02:11:26Z
-- Host: Linux 5.10.110 aarch64
-- CMake: 3.16.3
-- CMake generator: Unix Makefiles
-- CMake build tool: /usr/bin/make
-- Configuration: Release
-- 
-- CPU/HW features:
-- Baseline: NEON FP16
-- 
-- C/C++:
-- Built as dynamic libs?: YES
-- C++ standard: 11
-- C++ Compiler: /usr/bin/c++ (ver 9.4.0)
-- C++ flags (Release): -fsigned-char -W -Wall -Wreturn-type -Wnon-virtual-dtor -Waddress -Wsequence-point -Wformat -Wformat-security -Wmissing-declarations -Wundef -Winit-self -Wpointer-arith -Wshadow -Wsign-promo -Wuninitialized -Wsuggest-override -Wno-delete-non-virtual-dtor -Wno-comment -Wimplicit-fallthrough=3 -Wno-strict-overflow -fdiagnostics-show-option -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -fvisibility=hidden -fvisibility-inlines-hidden -O3 -DNDEBUG -DNDEBUG
-- C++ flags (Debug): -fsigned-char -W -Wall -Wreturn-type -Wnon-virtual-dtor -Waddress -Wsequence-point -Wformat -Wformat-security -Wmissing-declarations -Wundef -Winit-self -Wpointer-arith -Wshadow -Wsign-promo -Wuninitialized -Wsuggest-override -Wno-delete-non-virtual-dtor -Wno-comment -Wimplicit-fallthrough=3 -Wno-strict-overflow -fdiagnostics-show-option -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -fvisibility=hidden -fvisibility-inlines-hidden -g -O0 -DDEBUG -D_DEBUG
-- C Compiler: /usr/bin/cc
-- C flags (Release): -fsigned-char -W -Wall -Wreturn-type -Waddress -Wsequence-point -Wformat -Wformat-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wshadow -Wuninitialized -Wno-comment -Wimplicit-fallthrough=3 -Wno-strict-overflow -fdiagnostics-show-option -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -fvisibility=hidden -O3 -DNDEBUG -DNDEBUG
-- C flags (Debug): -fsigned-char -W -Wall -Wreturn-type -Waddress -Wsequence-point -Wformat -Wformat-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wshadow -Wuninitialized -Wno-comment -Wimplicit-fallthrough=3 -Wno-strict-overflow -fdiagnostics-show-option -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -fvisibility=hidden -g -O0 -DDEBUG -D_DEBUG
-- Linker flags (Release): -Wl,--gc-sections -Wl,--as-needed -Wl,--no-undefined 
-- Linker flags (Debug): -Wl,--gc-sections -Wl,--as-needed -Wl,--no-undefined 
-- ccache: NO
-- Precompiled headers: NO
-- Extra dependencies: dl m pthread rt
-- 3rdparty dependencies:
-- 
-- OpenCV modules:
-- To be built: aruco barcode bgsegm bioinspired calib3d ccalib core datasets dnn dnn_objdetect dnn_superres dpm face features2d flann freetype fuzzy gapi hfs highgui img_hash imgcodecs imgproc intensity_transform line_descriptor mcc ml objdetect optflow phase_unwrapping photo plot quality rapid reg rgbd saliency shape stereo stitching structured_light superres surface_matching text tracking ts video videoio videostab wechat_qrcode xfeatures2d ximgproc xobjdetect xphoto
-- Disabled: world
-- Disabled by dependency: -
-- Unavailable: alphamat cudaarithm cudabgsegm cudacodec cudafeatures2d cudafilters cudaimgproc cudalegacy cudaobjdetect cudaoptflow cudastereo cudawarping cudev cvv hdf java julia matlab ovis python2 python3 sfm viz
-- Applications: tests perf_tests apps
-- Documentation: NO
-- Non-free algorithms: NO
-- 
-- GUI: GTK3
-- GTK+: YES (ver 3.24.20)
-- GThread : YES (ver 2.64.6)
-- GtkGlExt: NO
-- VTK support: NO
-- 
-- Media I/O: 
-- ZLib: /usr/lib/aarch64-linux-gnu/libz.so (ver 1.2.11)
-- JPEG: /usr/lib/aarch64-linux-gnu/libjpeg.so (ver 80)
-- WEBP: build (ver encoder: 0x020f)
-- PNG: /usr/lib/aarch64-linux-gnu/libpng.so (ver 1.6.37)
-- TIFF: /usr/lib/aarch64-linux-gnu/libtiff.so (ver 42 / 4.1.0)
-- JPEG 2000: build (ver 2.4.0)
-- OpenEXR: build (ver 2.3.0)
-- HDR: YES
-- SUNRASTER: YES
-- PXM: YES
-- PFM: YES
-- 
-- Video I/O:
-- DC1394: YES (2.2.5)
-- FFMPEG: YES
-- avcodec: YES (58.54.100)
-- avformat: YES (58.29.100)
-- avutil: YES (56.31.100)
-- swscale: YES (5.5.100)
-- avresample: YES (4.0.0)
-- GStreamer: YES (1.16.2)
-- v4l/v4l2: YES (linux/videodev2.h)
-- 
-- Parallel framework: pthreads
-- 
-- Trace: YES (with Intel ITT)
-- 
-- Other third-party libraries:
-- Lapack: NO
-- Eigen: NO
-- Custom HAL: YES (carotene (ver 0.0.1))
-- Protobuf: build (3.19.1)
-- 
-- OpenCL: YES (no extra features)
-- Include path: /home/firefly/Downloads/opencv-4.6.0/3rdparty/include/opencl/1.2
-- Link libraries: Dynamic load
-- 
-- Python (for build): /usr/bin/python2.7
-- 
-- Java: 
-- ant: NO
-- JNI: NO
-- Java wrappers: NO
-- Java tests: NO
-- 
============================================================================================



ffmpeg info:


============================================================================================
ffmpeg
ffmpeg version 4.2.4-1ubuntu1.0firefly5 Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
 configuration: --prefix=/usr --extra-version=1ubuntu1.0firefly5 --toolchain=hardened --libdir=/usr/lib/aarch64-linux-gnu --incdir=/usr/include/aarch64-linux-gnu --arch=arm64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-libdrm --enable-librga --enable-rkmpp --enable-version3 --disable-libopenh264 --disable-vaapi --disable-vdpau --disable-decoder=h264_v4l2m2m --disable-decoder=vp8_v4l2m2m --disable-decoder=mpeg2_v4l2m2m --disable-decoder=mpeg4_v4l2m2m --enable-shared --disable-doc
 libavutil 56. 31.100 / 56. 31.100
 libavcodec 58. 54.100 / 58. 54.100
 libavformat 58. 29.100 / 58. 29.100
 libavdevice 58. 8.100 / 58. 8.100
 libavfilter 7. 57.100 / 7. 57.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 5.100 / 5. 5.100
 libswresample 3. 5.100 / 3. 5.100
 libpostproc 55. 5.100 / 55. 5.100
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...
====================================================================================



-
Using PyAV to encode mono audio to file, params match docs, but still causes Errno 22
20 February 2023, by andrew8088
While trying to use PyAV to encode live mono audio from a microphone to a compressed audio stream (using mp2 or flac as the encoder), the program kept raising the exception
ValueError: [Errno 22] Invalid argument.

To rule out the live microphone source as the cause of the problem, and to make the problematic code easier for others to run and test, I have removed the mic source and now just generate a pure tone as a sequence of input buffers.


All attempts to figure out the missing or mismatched or incorrect argument have just resulted in seeing documentation and examples that are the same as my code.


I would like to know from someone who has used PyAV successfully for mono audio what the correct method and parameters are for encoding mono frames into the mono stream.


The package used is av 10.0.0 installed with

pip3 install av --no-binary av

so it uses my package-manager provided ffmpeg library, which is version 4.2.7.

The problematic Python code is:


#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Recreating an error 22 when encoding sound with PyAV.

Created on Sun Feb 19 08:10:29 2023
@author: andrewm
"""
import typing
import sys
import math
import fractions

import av
from av import AudioFrame

""" Ensure some PyAudio constants are still defined without changing
    the PyAudio recording callback function and without depending
    on PyAudio simply for reproducing the PyAV bug [Errno 22] thrown in
    File "av/filter/context.pyx", line 89, in av.filter.context.FilterContext.push
"""
class PA_Stub():
    paContinue = True
    paComplete = False

pyaudio = PA_Stub()


"""Generate pure tone at given frequency with amplitude 0...1.0 at
   sampling frequency fs and beginning at phase offset 'phase'.
   Returns the new phase after the sinusoid has cycled over the
   sampling window length.
"""
def generate_tone(
    freq: int, phase: float, amp: float, fs, samp_fmt, buffer: bytearray
) -> float:
    assert samp_fmt == "s16", "Only s16 supported atm"
    samp_size_bytes = 2
    n_samples = int(len(buffer) / samp_size_bytes)
    window = [int(0) for i in range(n_samples)]
    theta = phase
    phase_inc = 2 * math.pi * freq / fs
    for i in range(n_samples):
        v = amp * math.sin(theta)
        theta += phase_inc
        s = int((2**15 - 1) * v)
        window[i] = s
    for sample_i in range(len(window)):
        byte_i = sample_i * samp_size_bytes
        enc = window[sample_i].to_bytes(
            2, byteorder=sys.byteorder, signed=True
        )
        buffer[byte_i] = enc[0]
        buffer[byte_i + 1] = enc[1]
    return theta


channels = 1
fs = 44100  # Record at 44100 samples per second
fft_size_samps = 256
chunk_samps = fft_size_samps * 10  # Record in chunks that are multiples of fft windows.

# print(f"fft_size_samps={fft_size_samps}\nchunk_samps={chunk_samps}")

seconds = 3.0
out_filename = "testoutput.wav"

# Store data in chunks for 3 seconds
sample_limit = int(fs * seconds)
sample_len = 0
frames = []  # Initialize array to store frames

ffmpeg_codec_name = 'mp2'  # flac, mp3, or libvorbis make same error.

sample_size_bytes = 2
buffer = bytearray(int(chunk_samps * sample_size_bytes))
chunkperiod = chunk_samps / fs
total_chunks = int(math.ceil(seconds / chunkperiod))
phase = 0.0

### uncomment if you want to see the synthetic data being used as a mic input.
# with open("test.raw", "wb") as raw_out:
#     for ci in range(total_chunks):
#         phase = generate_tone(2600, phase, 0.8, fs, "s16", buffer)
#         raw_out.write(buffer)
# print("finished gen test")
# sys.exit(0)
# #----

# Using mp2 or mkv as the container format gets the same error.
with av.open(out_filename + '.mp2', "w", format="mp2") as output_con:
    output_con.metadata["title"] = "My title"
    output_con.metadata["key"] = "value"
    channel_layout = "mono"
    sample_fmt = "s16p"

    ostream = output_con.add_stream(ffmpeg_codec_name, fs, layout=channel_layout)
    assert ostream is not None, "No stream!"
    cctx = ostream.codec_context
    cctx.sample_rate = fs
    cctx.time_base = fractions.Fraction(numerator=1, denominator=fs)
    cctx.format = sample_fmt
    cctx.channels = channels
    cctx.layout = channel_layout
    print(cctx, f"layout#{cctx.channel_layout}")

    # Define PyAudio-style callback for recording plus PyAV transcoding.
    def rec_callback(in_data, frame_count, time_info, status):
        global sample_len
        global ostream
        frames.append(in_data)
        nsamples = int(len(in_data) / (channels * sample_size_bytes))

        frame = AudioFrame(format=sample_fmt, layout=channel_layout, samples=nsamples)
        frame.sample_rate = fs
        frame.time_base = fractions.Fraction(numerator=1, denominator=fs)
        frame.pts = sample_len
        frame.planes[0].update(in_data)
        print(frame, len(in_data))

        for out_packet in ostream.encode(frame):
            output_con.mux(out_packet)
        for out_packet in ostream.encode(None):
            output_con.mux(out_packet)

        sample_len += nsamples
        retflag = pyaudio.paContinue if sample_len < sample_limit else pyaudio.paComplete
        return (in_data, retflag)

    # (The listing is truncated here in the original post; per the traceback below,
    # the callback is driven from a module-level loop at line 147, e.g.
    # ret_data, ret_flag = rec_callback(buffer, ci, {}, 1).)


If you uncomment the RAW output part you will find the generated data can be imported as PCM s16 Mono 44100Hz into Audacity and plays the expected tone, so the generated audio data does not seem to be the problem.


The normal program console output up until the exception is:


mp2 at 0x7f8e38202cf0> layout#4
Beginning
 5120
. 5120



The stack trace is:


Traceback (most recent call last):

 File "Dev/multichan_recording/av_encode.py", line 147, in <module>
 ret_data, ret_flag = rec_callback(buffer, ci, {}, 1)

 File "Dev/multichan_recording/av_encode.py", line 121, in rec_callback
 for out_packet in ostream.encode(frame):

 File "av/stream.pyx", line 153, in av.stream.Stream.encode

 File "av/codec/context.pyx", line 484, in av.codec.context.CodecContext.encode

 File "av/audio/codeccontext.pyx", line 42, in av.audio.codeccontext.AudioCodecContext._prepare_frames_for_encode

 File "av/audio/resampler.pyx", line 101, in av.audio.resampler.AudioResampler.resample

 File "av/filter/graph.pyx", line 211, in av.filter.graph.Graph.push

 File "av/filter/context.pyx", line 89, in av.filter.context.FilterContext.push

 File "av/error.pyx", line 336, in av.error.err_check

ValueError: [Errno 22] Invalid argument



Edit: It’s interesting that the error happens on the 2nd AudioFrame, as apparently the first one was encoded okay. Both frames are given the same attribute values aside from the Presentation Time Stamp (pts), but leaving the pts out and letting PyAV/ffmpeg generate it does not fix the error, so an incorrect PTS does not seem to be the cause.


After a brief glance in av/filter/context.pyx, the exception must come from a bad return value from res = lib.av_buffersrc_write_frame(self.ptr, frame.ptr).

Trying to dig into av_buffersrc_write_frame in the FFmpeg source, it is not clear what could be causing this error. The only obvious candidate is a mismatch between channel layouts, but my code sets the same layout on both the Stream and the Frame. That problem had been found by an old question, "pyav - cannot save stream as mono", and its answer (that one required parameter is undocumented) is the only reason the code now has the layout='mono' argument when making the stream.

The program output shows layout #4 is being used, and from https://github.com/FFmpeg/FFmpeg/blob/release/4.2/libavutil/channel_layout.h you can see this is the value for symbol AV_CH_FRONT_CENTER which is the only channel in the MONO layout.


The mismatch is surely some other object property or an undocumented parameter requirement.


How do you encode mono audio to a compressed stream with PyAV?