
Other articles (99)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013 and is announced here.
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    As with the previous version, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to make other modifications (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to make other modifications (...)

On other sites (12107)

  • Using FFmpeg to stitch together H.264 videos and variably-spaced JPEG pictures; dealing with ffmpeg warnings

    19 October 2022, by LB2

    Context

    


    I have a process flow that may output either H.264 Annex B streams, variably-spaced JPEGs, or a mixture of the two. By variably-spaced I mean that the elapsed time between any two adjacent JPEGs may be (and likely is) different from that between any other two adjacent JPEGs. Examples of possible inputs are:

    


      

    1. stream1.h264
    2. {Set of JPEGs}
    3. stream1.h264 + stream2.h264
    4. stream1.h264 + {Set of JPEGs}
    5. stream1.h264 + {Set of JPEGs} + stream2.h264
    6. stream1.h264 + {Set of JPEGs} + stream2.h264 + {Set of JPEGs} + ...
    7. stream1.h264 + stream2.h264 + {Set of JPEGs} + ...


    


    The output needs to be a single stitched (i.e. concatenated) file in an MPEG-4 container.

    


    Requirements: no re-encoding or transcoding of the existing video compression (a one-time conversion of JPEG sets to a video format is okay).

    


    Solution Prototype

    


    To prototype the solution, I found that ffmpeg has a concat demuxer that lets me specify an ordered sequence of inputs, which ffmpeg then concatenates together, but all inputs must be of the same format. So, to meet that requirement, I:

    


      

    1. Convert every JPEG set to an .mp4 using concat (and the duration # directive to specify the time spacing between each JPEG)
    2. Convert every .h264 to .mp4 using -c copy to avoid transcoding.
    3. Stitch all generated interim .mp4 files into the single final .mp4 using -f concat and -c copy.


    


    Here's the bash script, in parts, that performs the above:

    


      

    1. Ignore the curl comment; it's left over from originally generating 100 JPEG images with numbers, which are simply saved locally. The loop generates a concat input file with file sequence#.jpeg directives and a duration # directive, where each successive JPEG delay is incremented by 0.1 seconds (0.1 between the first and second, 0.2 between the 2nd and 3rd, 0.3 between the 3rd and 4th, and so on). Then it runs the ffmpeg command to convert the set of JPEGs to an interim .mp4 file.

      


      echo "ffconcat version 1.0" >ffconcat-jpeg.txt
echo >>ffconcat-jpeg.txt

for i in {1..100}
do
    echo "file $i.jpeg" >>ffconcat-jpeg.txt
    d=$(echo "$i" | awk '{printf "%f", $1 / 10}')
    # d=$(echo "scale=2; $i/10" | bc)
    echo "duration $d" >>ffconcat-jpeg.txt
    echo "" >>ffconcat-jpeg.txt
    # curl -o "$i.jpeg" "https://math.tools/equation/get_equaimages?equation=$i&fontsize=256"
done

ffmpeg \
    -hide_banner \
    -vsync vfr \
    -f concat \
    -i ffconcat-jpeg.txt \
    -r 30 \
    -video_track_timescale 90000 \
    video-jpeg.mp4


      


    2. Convert two streams from .h264 to .mp4 via copy (no transcoding).

      


      ffmpeg \
    -hide_banner \
    -i low-motion-video.h264 \
    -c copy \
    -vsync vfr \
    -video_track_timescale 90000 \
    low-motion-video.mp4

ffmpeg \
    -hide_banner \
    -i full-video.h264 \
    -c copy \
    -video_track_timescale 90000 \
    -vsync vfr \
    full-video.mp4


      


    3. Stitch all together by generating another concat directive file.

      


      echo "ffconcat version 1.0" >ffconcat-h264.txt
echo >>ffconcat-h264.txt
echo "file low-motion-video.mp4" >>ffconcat-h264.txt
echo >>ffconcat-h264.txt
echo "file full-video.mp4" >>ffconcat-h264.txt
echo >>ffconcat-h264.txt
echo "file video-jpeg.mp4" >>ffconcat-h264.txt
echo >>ffconcat-h264.txt

ffmpeg \
    -hide_banner \
    -f concat \
    -i ffconcat-h264.txt \
    -pix_fmt yuv420p \
    -c copy \
    -video_track_timescale 90000 \
    -vsync vfr \
    video-out.mp4



      




    


    Problem (and attempted troubleshooting)

    


    The above does produce a reasonable output — it plays the first video, then the second video with no timing/rate issues AFAICT, then the JPEGs with the time between each JPEG "frame" growing successively, as expected.

    


    But the conversion process produces warnings that concern me (for compatibility with players, or potentially for other real-world streams that may run into some issue my prototyping content doesn't make obvious). Initial attempts generated hundreds of warnings, but with some added arguments I reduced them to just a handful; this handful is stubborn, though, and nothing I have tried helps.

    


    The first conversion of JPEGs to .mp4 goes fine, with the following output:

    


    Input #0, concat, from 'ffconcat-jpeg.txt':
  Duration: 00:08:25.00, start: 0.000000, bitrate: 0 kb/s
  Stream #0:0: Video: png, pal8(pc), 176x341 [SAR 3780:3780 DAR 16:31], 25 fps, 25 tbr, 25 tbn, 25 tbc
Stream mapping:
  Stream #0:0 -> #0:0 (png (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[libx264 @ 0x7fe418008e00] using SAR=1/1
[libx264 @ 0x7fe418008e00] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2
[libx264 @ 0x7fe418008e00] profile High 4:4:4 Predictive, level 1.3, 4:4:4, 8-bit
[libx264 @ 0x7fe418008e00] 264 - core 163 r3060 5db6aa6 - H.264/MPEG-4 AVC codec - Copyleft 2003-2021 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=4 threads=11 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'video-jpeg.mp4':
  Metadata:
    encoder         : Lavf58.76.100
  Stream #0:0: Video: h264 (avc1 / 0x31637661), yuv444p(tv, progressive), 176x341 [SAR 1:1 DAR 16:31], q=2-31, 30 fps, 90k tbn
    Metadata:
      encoder         : Lavc58.134.100 libx264
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
frame=  100 fps=0.0 q=-1.0 Lsize=     157kB time=00:07:55.33 bitrate=   2.7kbits/s speed=2.41e+03x    
video:155kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.800846%
[libx264 @ 0x7fe418008e00] frame I:1     Avg QP:20.88  size:   574
[libx264 @ 0x7fe418008e00] frame P:43    Avg QP:14.96  size:  2005
[libx264 @ 0x7fe418008e00] frame B:56    Avg QP:21.45  size:  1266
[libx264 @ 0x7fe418008e00] consecutive B-frames: 14.0% 24.0% 30.0% 32.0%
[libx264 @ 0x7fe418008e00] mb I  I16..4: 36.4% 55.8%  7.9%
[libx264 @ 0x7fe418008e00] mb P  I16..4:  5.1%  7.5% 11.2%  P16..4:  5.6%  8.1%  4.5%  0.0%  0.0%    skip:57.9%
[libx264 @ 0x7fe418008e00] mb B  I16..4:  2.4%  0.9%  3.9%  B16..8: 16.2%  8.8%  4.6%  direct: 1.2%  skip:62.0%  L0:56.6% L1:38.7% BI: 4.7%
[libx264 @ 0x7fe418008e00] 8x8 transform intra:28.3% inter:3.7%
[libx264 @ 0x7fe418008e00] coded y,u,v intra: 26.5% 0.0% 0.0% inter: 9.8% 0.0% 0.0%
[libx264 @ 0x7fe418008e00] i16 v,h,dc,p: 82% 13%  4%  0%
[libx264 @ 0x7fe418008e00] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 20%  8% 71%  1%  0%  0%  0%  0%  0%
[libx264 @ 0x7fe418008e00] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 41% 11% 29%  4%  2%  3%  1%  7%  1%
[libx264 @ 0x7fe418008e00] Weighted P-Frames: Y:0.0% UV:0.0%
[libx264 @ 0x7fe418008e00] ref P L0: 44.1%  4.2% 28.4% 23.3%
[libx264 @ 0x7fe418008e00] ref B L0: 56.2% 32.1% 11.6%
[libx264 @ 0x7fe418008e00] ref B L1: 92.4%  7.6%
[libx264 @ 0x7fe418008e00] kb/s:2.50


    


    The conversion of individual streams from .h264 to .mp4 generates two types of warnings each. One is [mp4 @ 0x7faee3040400] Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly, and the other is [mp4 @ 0x7faee3040400] pts has no value.

    


    Some posts on SO (I can't find my original finds now) suggested that it's safe to ignore and comes from H.264 being an elementary stream that supposedly doesn't contain timestamps. It surprises me a bit, since I produce that stream using the NVENC API and clearly supply timing information for each frame via the PIC_PARAMS structure: NV_STRUCT(PIC_PARAMS, pp); ...; pp.inputTimeStamp = _frameIndex++ * (H264_CLOCK_RATE / _params.frameRate);, where #define H264_CLOCK_RATE 9000 and _params.frameRate = 30.

    


    Input #0, h264, from 'low-motion-video.h264':
  Duration: N/A, bitrate: N/A
  Stream #0:0: Video: h264 (High), yuv420p(progressive), 1440x3040 [SAR 1:1 DAR 9:19], 30 fps, 30 tbr, 1200k tbn, 60 tbc
Output #0, mp4, to 'low-motion-video.mp4':
  Metadata:
    encoder         : Lavf58.76.100
  Stream #0:0: Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1440x3040 [SAR 1:1 DAR 9:19], q=2-31, 30 fps, 30 tbr, 90k tbn, 1200k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
[mp4 @ 0x7faee3040400] Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly
[mp4 @ 0x7faee3040400] pts has no value
[mp4 @ 0x7faee3040400] pts has no value0kB time=-00:00:00.03 bitrate=N/A speed=N/A    
    Last message repeated 17985 times
frame=17987 fps=0.0 q=-1.0 Lsize=   79332kB time=00:09:59.50 bitrate=1084.0kbits/s speed=1.59e+03x    
video:79250kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.103804%
Input #0, h264, from 'full-video.h264':
  Duration: N/A, bitrate: N/A
  Stream #0:0: Video: h264 (High), yuv420p(progressive), 1440x3040 [SAR 1:1 DAR 9:19], 30 fps, 30 tbr, 1200k tbn, 60 tbc
Output #0, mp4, to 'full-video.mp4':
  Metadata:
    encoder         : Lavf58.76.100
  Stream #0:0: Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1440x3040 [SAR 1:1 DAR 9:19], q=2-31, 30 fps, 30 tbr, 90k tbn, 1200k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
[mp4 @ 0x7f9381864600] Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly
[mp4 @ 0x7f9381864600] pts has no value
[mp4 @ 0x7f9381864600] pts has no value0kB time=-00:00:00.03 bitrate=N/A speed=N/A    
    Last message repeated 17981 times
frame=17983 fps=0.0 q=-1.0 Lsize=   52976kB time=00:09:59.36 bitrate= 724.1kbits/s speed=1.33e+03x    
video:52893kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.156232%
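
    Since the raw Annex B stream itself carries no container timestamps, one variant I have considered is to declare the frame rate up front so the mp4 muxer has timestamps to write during the stream copy. This is only a sketch (not yet verified against these exact files); -r before -i tells ffmpeg to assume a constant 30 fps input, matching the rate NVENC was configured with:

      ffmpeg \
    -hide_banner \
    -r 30 \
    -i low-motion-video.h264 \
    -c copy \
    -vsync vfr \
    -video_track_timescale 90000 \
    low-motion-video.mp4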


    


    But the most worrisome error for me is from stitching together all interim .mp4 files into one:

    


    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f9ff2010e00] Auto-inserting h264_mp4toannexb bitstream filter
Input #0, concat, from 'ffconcat-h264.txt':
  Duration: N/A, bitrate: 1082 kb/s
  Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1440x3040 [SAR 1:1 DAR 9:19], 1082 kb/s, 30 fps, 30 tbr, 90k tbn, 60 tbc
    Metadata:
      handler_name    : VideoHandler
      vendor_id       : [0][0][0][0]
Output #0, mp4, to 'video-out.mp4':
  Metadata:
    encoder         : Lavf58.76.100
  Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1440x3040 [SAR 1:1 DAR 9:19], q=2-31, 1082 kb/s, 30 fps, 30 tbr, 90k tbn, 90k tbc
    Metadata:
      handler_name    : VideoHandler
      vendor_id       : [0][0][0][0]
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f9fe1009c00] Auto-inserting h264_mp4toannexb bitstream filter
[mp4 @ 0x7f9ff2023400] Non-monotonous DTS in output stream 0:0; previous: 53954460, current: 53954460; changing to 53954461. This may result in incorrect timestamps in the output file.
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f9fd1008a00] Auto-inserting h264_mp4toannexb bitstream filter
[mp4 @ 0x7f9ff2023400] Non-monotonous DTS in output stream 0:0; previous: 107900521, current: 107874150; changing to 107900522. This may result in incorrect timestamps in the output file.
[mp4 @ 0x7f9ff2023400] Non-monotonous DTS in output stream 0:0; previous: 107900522, current: 107886150; changing to 107900523. This may result in incorrect timestamps in the output file.
frame=36070 fps=0.0 q=-1.0 Lsize=  132464kB time=00:27:54.26 bitrate= 648.1kbits/s speed=6.54e+03x    
video:132296kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.126409%


    


    I'm not sure how to deal with those non-monotonous DTS errors, and no matter what I try, nothing budges. I analyzed the interim .mp4 files using ffprobe -show_frames and found that the last frame of each interim .mp4 does not have a DTS, while previous frames do. E.g.:

    


    ...
[FRAME]
media_type=video
stream_index=0
key_frame=0
pkt_pts=53942461
pkt_pts_time=599.360678
pkt_dts=53942461
pkt_dts_time=599.360678
best_effort_timestamp=53942461
best_effort_timestamp_time=599.360678
pkt_duration=3600
pkt_duration_time=0.040000
pkt_pos=54161377
pkt_size=1034
width=1440
height=3040
pix_fmt=yuv420p
sample_aspect_ratio=1:1
pict_type=B
coded_picture_number=17982
display_picture_number=0
interlaced_frame=0
top_field_first=0
repeat_pict=0
color_range=unknown
color_space=unknown
color_primaries=unknown
color_transfer=unknown
chroma_location=left
[/FRAME]
[FRAME]
media_type=video
stream_index=0
key_frame=0
pkt_pts=53927461
pkt_pts_time=599.194011
pkt_dts=N/A
pkt_dts_time=N/A
best_effort_timestamp=53927461
...


    


    My guess is that as the concat demuxer reads the input (or somewhere else in ffmpeg's conversion pipeline), it sees no DTS set for the last frame and produces a virtual value equal to the last one seen. Then, further down the pipeline, something consumes this input, sees that the DTS value is repeated, issues a warning and offsets it by incrementing it by one, which may be a somewhat nonsensical/unrealistic timing value.
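
    To double-check this at the packet level (a sketch along the same lines as the -show_frames analysis; low-motion-video.mp4 stands in for any of the interim files), the last few packets can be dumped directly:

      ffprobe \
    -hide_banner \
    -v error \
    -select_streams v:0 \
    -show_entries packet=pts,dts,duration \
    -of csv=p=0 \
    low-motion-video.mp4 | tail -n 5

    If the final packet prints N/A for dts, that matches the frame dump above.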

    


    I tried using -fflags +genpts as suggested in this SO answer, but that doesn't change anything.

    


    Per yet other posts suggesting the issue is with incompatible tbn and tbc values and possible timebase problems, I tried adding -time_base 1:90000, -enc_time_base 1:90000 and -copytb 1, and nothing budges. The -video_track_timescale 90000 is there because it helped reduce those DTS warnings from hundreds down to 3, but it doesn't eliminate them all.
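
    One thing I have not tried yet is giving the final concat list explicit per-file durations, the same way the JPEG list does, so the demuxer does not have to infer where each segment ends from a last packet that has no DTS. A sketch (durations are read back from ffprobe; whether this silences the warnings is exactly what I am unsure about):

      echo "ffconcat version 1.0" >ffconcat-h264.txt
echo >>ffconcat-h264.txt

for f in low-motion-video.mp4 full-video.mp4 video-jpeg.mp4
do
    # format=duration prints the container duration in seconds
    d=$(ffprobe -v error -show_entries format=duration -of default=nw=1:nk=1 "$f")
    echo "file $f" >>ffconcat-h264.txt
    echo "duration $d" >>ffconcat-h264.txt
    echo "" >>ffconcat-h264.txt
done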

    


    Question

    


    What is missing and how can I get ffmpeg to perform conversions without these warnings, to be sure it produces proper, well-formed output?

    


  • Data Privacy in Business: A Risk Leading to Major Opportunities

    9 August 2022, by Erin — Privacy

    Data privacy in business is a contentious issue. 

    Claims that “big data is the new oil of the digital economy” and strong links between “data-driven personalisation and customer experience” encourage leaders to set up massive data collection programmes.

    However, many of these conversations downplay the magnitude of security, compliance and ethical risks companies face when betting too much on customer data collection. 

    In this post, we discuss the double-edged nature of privacy issues in business — the risk-ridden and the opportunity-driven.

    3 Major Risks of Ignoring Data Privacy in Business

    As the old adage goes: Just because everyone else is doing it doesn’t make it right.

    Easy data accessibility and the ubiquity of analytics tools make consumer data collection and processing sound like a "given". But the decision to do so opens your business to a spectrum of risks.

    1. Compliance and Legal Risks 

    Data collection and customer privacy are protected by a host of international laws including GDPR, CCPA, and regional regulations. Only 15% of countries (mostly developing ones) don’t have dedicated laws for protecting consumer privacy. 

    State of global data protection legislation, via the UN

    Global legislation includes provisions on:

    • Collectible data types
    • Allowed uses of obtained data 
    • Consent to data collection and online tracking 
    • Rights to request data removal 

    Personally identifiable information (PII) processing is prohibited or strictly regulated in most jurisdictions. Yet businesses repeatedly circumvent existing rules and break them on occasion.

    In Australia, for example, only 2% of brands use logos, icons or messages to transparently call out online tracking, data sharing or other specific uses of data at the sign-up stage. In Europe, around half of small businesses are still not fully GDPR-compliant — and Big Tech companies like Google, Amazon and Facebook can’t get a grip on their data collection practices even when pressed with horrendous fines. 

    Although the media mostly reports on compliance fines for "big names", smaller businesses are coming under increasing scrutiny.

    As Max Schrems, an Austrian privacy activist and founder of noyb NGO, explained in a Matomo webinar :

    “In Austria, my home country, there are a lot of €5,000 fines going out there as well [to smaller businesses]. Most of the time, they are just not reported. They just happen below the surface. [GDPR fines] are already a reality.”​

    In April 2022, the EU Court of Justice ruled that consumer groups can autonomously sue businesses for breaches of data protection — and nonprofit organisations like noyb enable more people to do so. 

    Finally, new data privacy legislation is underway across the globe. In the US, Colorado, Connecticut, Virginia and Utah have data protection acts at different stages of approval. South African authorities are working on the Protection of Personal Information (POPI) Act, and Brazil is working on a local General Data Protection Law (LGPD).

    Re-thinking your stance on user privacy and data protection now can significantly reduce the compliance burden in the future. 

    2. Security Risks 

    Data collection also mandates data protection for businesses. Yet, many organisations focus on the former and forget about the latter. 

    Lenient attitudes to consumer data protection resulted in a major spike in data breaches.

    Check Point research found that cyberattacks increased 50% year-over-year, with each organisation facing 925 cyberattacks per week globally.

    Many of these attacks end up being successful due to poor data security. As a result, billions of stolen consumer records become publicly available or get sold on dark web marketplaces.

    What’s even more troublesome is that stolen consumer records are often purchased by marketing firms or companies specialising in spam campaigns. Buyers can also use stolen emails to distribute malware, stage phishing and other social engineering attacks – and harvest even more data for sale.

    One business’s negligence creates a snowball effect of negative consequences down the line, with customers bearing the brunt of it all.

    In 2020, hackers successfully targeted a Finnish psychotherapy practice. They managed to steal hundreds of patient records — and then demanded a ransom both from the firm and from its patients for not exposing information about their mental health issues. Many patients refused to pay the hackers, and some 300 records ended up being posted online, as the Associated Press reported.

    Not only did the practice have to deal with the cyber-breach aftermath, but it also faced vocal regulatory and patient criticisms for failing to properly protect such sensitive information.

    Security negligence can carry both direct losses (heavy data breach fines) and indirect losses in the form of reputational damage. An overwhelming 90% of consumers say they wouldn’t buy from a business that doesn’t adequately protect their data. This brings us to the last point.

    3. Reputational Risks 

    Trust is the new currency. Data negligence and consumer privacy violations are the two fastest ways to lose it. 

    Globally, consumers are concerned about how businesses collect, use, and protect their data. 

    Consumer data sharing attitudes
    • According to Forrester, 47% of UK adults actively limit the amount of data they share with websites and apps. 49% of Italians express willingness to ask companies to delete their personal data. 36% of Germans use privacy and security tools to minimise online tracking of their activities. 
    • A GDMA survey also notes that globally, 82% of consumers want more control over the personal information they share with companies. 77% also expect brands to be transparent about how their data is collected and used.

    When businesses fail to hold their end of the bargain — collect just the right amount of data and use it with integrity — consumers are fast to cut ties. 

    Once the information about privacy violations becomes public, companies lose:

    • Brand equity 
    • Market share 
    • Competitive positioning 

    An AON report estimates that companies can lose as much as 25% of their initial value after a data breach. In some cases, the losses can be even higher.

    In 2015, British telecom TalkTalk suffered a major data breach. Over 150,000 customer records were stolen by hackers. TalkTalk had to spend between $60 and $70 million on containment efforts. Still, they lost over 100,000 customers in a matter of months and one-third of their company value, equivalent to $1.4 billion, by the end of the year.

    Fresher data from Infosys gives the following maximum cost estimates of the brand damage companies could experience after a data breach (accidental or malicious).

    Estimated cost of brand damage due to a data breach

    3 Major Advantages of Privacy in Business 

    Despite all the industry mishaps, a reassuring 77% of CEOs now recognise that their companies must fundamentally change their approaches to customer engagement, in particular when it comes to ensuring data privacy. 

    Many organisations take proactive steps to cultivate a privacy-centred culture and implement transparent data collection policies. 

    Here’s why gaining the “privacy advantage” pays off.

    1. Market Competitiveness 

    There’s a reason why privacy-focused companies are booming. 

    Consumers’ mounting concerns and frustrations over the lack of online privacy prompt many to look for alternative, privacy-centred products and services.

    The following B2C and B2B products are moving from the industry margins to the mainstream:

    Across the board, consumers express greater trust towards companies protective of their privacy:

    And as we well know: trust translates to higher engagement, loyalty and, ultimately, revenue.

    By embedding privacy into the core of your product, you give users more reasons to select, stay and support your business. 

    2. Higher Operational Efficiency

    Customer data protection isn’t just a policy – it’s a culture of collecting “just enough” data, protecting it and using it responsibly. 

    Sadly, that’s the area where most organisations trail behind. At present, some 90% of businesses admit to having amassed massive data silos. 

    Siloed data is expensive to maintain and operationalise. Moreover, when left unattended, it can evolve into a pressing compliance issue. 

    A recently leaked document from Facebook says the company has no idea where all of its first-party, third-party and sensitive categories data goes or how it is processed. Because of this, Facebook struggles to achieve GDPR compliance and remains under regulatory pressure. 

    Similarly, Google Analytics is riddled with privacy issues. Other products from the same company were found to be collecting and operationalising consumer data without users’ knowledge or consent. Again, this creates valid grounds for regulatory investigations.

    Smaller companies have a better chance of making things right at the onset. 

    By curbing customer data collection, you can:

    • Reduce data hosting and Cloud computation costs (aka trim your Cloud bill) 
    • Improve data security practices (since you would have fewer assets to protect) 
    • Make your staff more productive by consolidating essential data and making it easy and safe to access

    Privacy-mindful companies also have an easier time when it comes to compliance and can meet new data regulations faster. 

    3. Better Marketing Campaigns 

    The biggest counter-argument to reducing customer data collection is marketing. 

    How can we effectively sell our products if we know nothing about our customers? – your team might be asking.

    This might sound counterintuitive, but minimising data collection and usage can lead to better marketing outcomes. 

    Limiting the types of data that can be used encourages your people to become more creative and productive by focusing on fewer metrics that are more important.

    Think of it this way: every other business uses the same targeting parameters on Facebook or Google for running paid ad campaigns. As a result, we see ads everywhere — and people grow unresponsive to them or choose to limit exposure by using ad-blocking software, private browsers and VPNs. Your ad budgets get wasted on chasing mirage metrics instead of actual prospects.

    Case in point: in 2017, Marc Pritchard of Procter & Gamble decided to cut the company’s digital advertising budget by 6% (or $200 million) as a first step. Unilever made an even bolder move and reduced its ad budget by 30% in 2018.

    Guess what happened?

    P&G saw a 7.5% increase in organic sales and Unilever a 3.8% gain, as HBR reports. So how did both companies become more successful by spending less on advertising?

    They found that overexposure to online ads led to diminishing returns and annoyance among loyal customers. By minimising ad exposure and adopting alternative marketing strategies, the two companies managed to market better to new and existing customers.

    The takeaway: there are more ways to engage consumers than pestering them with repetitive retargeting messages or creepy personalisation.

    You can collect first-party data with consent to incrementally improve your product — and educate customers on the benefits of your solution in transparent terms.

    Final Thoughts 

    The definitive advantage of privacy is consumers’ trust. 

    You can’t buy it, you can’t fake it, you can only cultivate it by aligning your external appearances with internal practices. 

    Because when you fail to address privacy internally, your mishaps will quickly become apparent either as social media call-outs or worse — as a security incident, a data breach or a legal investigation. 

    By choosing to treat consumer data with respect, you build an extra layer of protection around your business, plus draw in some banging benefits too. 

    Get one step closer to becoming a privacy-centred company by choosing Matomo as your web analytics solution. We offer robust privacy controls for ensuring ethical, compliant, privacy-friendly and secure website tracking. 

  • Privacy in Business: What Is It and Why Is It Important?

    13 July 2022, by Erin — Privacy

    Privacy concerns loom large among consumers. Yet, businesses remain reluctant to change the old ways of doing things until they become an operational nuisance. 

    More and more businesses are slowly starting to feel the pressure to incorporate privacy best practices. But what exactly does privacy mean in business? And why is it important for businesses to protect users’ privacy?

    In this blog, we’ll answer all of these questions and more. 

    What is Privacy in Business?

    In the corporate world, privacy stands for the business decision to use collected consumer data in a safe, secure and compliant way. 

    Companies with a privacy-centred culture:

    • Get explicit user consent to tracking, opt-ins and data sharing 
    • Collect strictly necessary data in compliance with regulations 
    • Ask for permissions to collect, process and store sensitive data 
    • Provide transparent explanations about data operationalisation and usage 
    • Have mechanisms for data collection opt-outs and data removal requests 
    • Implement security controls for storing collected data and limit access permissions to it 

    In other words: they treat consumers’ data with the utmost integrity and security – and provide reassurances of ethical data usage.

    What Are the Ethical Business Issues Related to Privacy?

    Consumer data analytics has been around for decades. But digital technologies – ubiquitous connectivity, social media networks, data science and machine learning – increased the magnitude and sophistication of customer profiling.

    Big Tech companies like Google and Facebook, among others, capture millions of data points about users. These include general demographic data like "age" or "gender", as well as more granular insights such as "income", "past browsing history" or "recently visited geo-locations".

    When combined, such personally identifiable information (PII) can be used to approximate the user’s exact address, frequently purchased goods, political beliefs or past medical conditions. Then such information is shared with third parties such as advertisers. 

    That’s when ethical issues arise. 

    The Cambridge Analytica data scandal is a prime example of consumer data that was unethically exploited. 

    Over the years, Google also faced a series of regulatory issues surrounding consumer privacy breaches:

    • In 2021, a Google Chrome browser update put some 2.6 billion users at risk of “surveillance, manipulation and abuse” by providing third parties with data on device usage. 
    • The same year, Google was taken to court for failing to provide full disclosures on tracking performed in Google Chrome incognito mode. A $5 billion lawsuit is still pending.
    • As of 2022, Google Analytics 4 is considered GDPR non-compliant and was branded “illegal” by several European countries. 

    If you are curious, learn more about Google Analytics privacy issues.

    The bigger issue? Big Tech companies make the businesses that use their technologies (unknowingly) complicit in consumer data violations.

    In 2022, the Belgian data regulator found the official IAB Europe framework for gathering user consent in breach of GDPR. The framework was used by all major AdTech platforms to issue pop-ups for user consent to tracking. Now ad platforms must delete all data gathered through these pop-ups. The biggest advertisers, such as Procter & Gamble, Unilever, IBM and Mastercard, among others, also received a notice about data removal and a regulatory warning about further repercussions if they fail to comply.

    Big Tech firms have given brands unprecedented access to granular consumer data. Unrestricted access, however, also opened the door to data abuse and unethical use. 

    Examples of Unethical Data Usage by Businesses 

    • Data hoarding means excessively harvesting all available consumer data simply because the possibility to do so exists, often using murky consent mechanisms. Yet 85% of collected Big Data is either dark or redundant, obsolete or trivial (ROT).
    • Invasive personalisation based on sensitive user information (or second guesses), like a recent US marketing campaign congratulating women on pregnancy (even if they weren’t expecting). Overall, 75% of consumers find most forms of personalisation somewhat creepy. 22% also said they’d leave for another brand due to creepy experiences.
    • Hyper-targeted advertising campaigns based on data consumers would prefer not to share. A recent investigation found that advertising platforms often assign sensitive labels to users (as part of their ad profiles), indicative of their religion, mental health issues, history of abuse and so on. This allows advertisers to target such consumers with dubious ads.

    Ultimately, excessive data collection, paired with poor data protection in business settings, results in major data breaches and costly damage control. Given that cyber attacks are on the rise, every business is vulnerable. 

    Why Should a Business Be Concerned About Protecting the Privacy of Its Customers?

    Businesses must prioritise customer privacy because that’s what is expected of them. Globally, 89% of consumers say they care about their privacy. 

    As frequent stories about unethical data usage, excessive tracking and data breaches surface online, even more consumers grow concerned about protecting their data. Many publicly urge companies to take action. Others privately curtail their relationships with brands.

    On average, 45% of consumers feel uncomfortable about sharing personal data. According to KPMG, 78% of American consumers have fears about the amount of data being collected. 40% of them also don’t trust companies to use their data ethically. Among Europeans, 41% are unwilling to share any personal data with businesses. 

    Because the demand for online privacy is rising, progressive companies now treat privacy as a competitive advantage. 

    For example, the encrypted messaging app Signal gained over 42 million active users in a year because it offers better data security and privacy protection. 

    ProtonMail, a privacy-centred email client, also amassed a 50 million user base in several years thanks to a “fundamentally stronger definition of privacy”.

    The growth of privacy-mindful businesses speaks volumes. And even more good things happen to them:

    • Higher consumer trust and loyalty 
    • Improved attractiveness to investors
    • Less complex compliance
    • Minimum cybersecurity exposure 
    • Better agility and innovation

    It’s time to start pursuing them! Learn how to embed privacy and security into your operations.