
Other articles (50)
-
APPENDIX: The plugins used specifically for the farm
5 March 2010
The central/master site of the farm needs several plugins in addition to those used by the channels in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests for the creation of a shared instance as soon as users sign up; the verifier plugin, which provides a field-verification API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first MediaSPIP stable release.
Its official release date is June 21, 2013, and it is announced here.
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
Creating farms of unique websites
13 April 2011
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (7562)
-
How to increase engagement and convert visitors into customers
8 September 2020, by Joselyn Khor — Analytics Tips, Marketing
Long gone are the days of simply tracking page views as a measure of engagement. Now it’s about engagement analysis, which is layered and provides insight for effective data-driven decisions.
Discover how engaged people are with your website by uncovering behavioural patterns that tell you how well your site and content are or aren’t performing. This insight helps you re-evaluate, adapt and optimise your content and strategy. The more engaged visitors are, the more likely you’ll be able to guide them on a predetermined journey that results in more conversions and helps you reach the goals you’ve set for your business.
Why is visitor engagement important?
It’s vital to measure engagement if you have anything content-related that plays a role in your customer’s journey. Some websites may find more value in figuring out how engaging their entire site is, while others may only want to zero in on, say, a blogging section, e-newsletters, social media channels or sign-up pages.
In the larger scheme of things, engagement can be seen as what’s running your site. Every aspect of the buyer’s journey requires your visitors to be engaged. Whether you’re trying to attract, convert or build a loyal audience base, you need to know your content is optimised to maintain their attention and encourage them along the path to purchase, conversion or loyalty.
How to increase engagement with Matomo
You need to know what’s going right or wrong to eventually be able to deliver more riveting content your visitors can’t help but be drawn to. Learn how to apply Matomo’s easy-to-use features to increase engagement:
- The Behaviour feature
- Heatmaps
- A/B Testing
- Media Analytics
- Transitions
- Custom reports
- Other metrics to keep an eye on
1. Look at the Behaviour feature
It allows you to learn how visitors are responding to your content. This information is gathered by drawing insight from features such as site search, downloads, events and content interactions. Learn more
Matomo’s top five ways to increase engagement with the Behaviour feature:
Behaviour -> Pages
Get complete insights into which pages your users engage with, which pages provide little value to your business, and see the results of your entry and exit pages. If important content is generating low traffic, you need to place it where it can be seen. Spend time where it matters, focus on the content that will engage your users, and see how it eventually converts them into customers.
Behaviour -> Site search
Site search tracks how people use your website’s internal search engine. You can see:
- What search keywords visitors used on your website’s internal search.
- Which of those keywords resulted in no results (what content your visitors are looking for but cannot find).
- What pages visitors visited immediately after a search.
- What search categories visitors use (if your website employs search categories).
Behaviour -> Downloads
What do users want to take away with them? They could be downloading PDFs, .zip files, ebooks, infographics or other free/paid resources. For example, suppose you worked for an education institution and created valuable information packs for students, made available online in PDF format. An increase in downloads would mean students were finding the PDFs and seeing the need to download them; no downloads could mean the information packs weren’t being found, which would be problematic.
Behaviour -> Events
Tracking events is a very useful way to measure the interactions your users have with your website content that are not directly page views or downloads.
How have Events been used effectively? A great example comes from one of our customers, Catalyst. They wanted to capture and measure user interaction with accordions (areas of content that expand or collapse depending on how a user interacts with them) to see whether people were actually getting all the information available to them on that one page. By creating an Event to record which accordion had been opened, as well as creating events for other user interactions, they were able to figure out which content got the most engagement and which got the least. Being able to see how visitors navigated through their website helped them optimise the site to ensure people were getting the relevant information they were looking for.
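As a rough illustration, an event like this can also be recorded server-side through Matomo’s HTTP Tracking API. The sketch below assumes a Matomo instance at https://your-matomo.example/matomo.php with site ID 1, and the category/action/name values are just placeholders; on most websites you would use the JavaScript tracker’s trackEvent call instead.

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Hedged example: send one "accordion opened" event to a Matomo instance.
// Replace the endpoint and idsite=1 with your own values.
public final class MatomoEventSketch {
    public static void main(final String[] args) throws Exception {
        final String endpoint = "https://your-matomo.example/matomo.php";
        final String query = String.join("&",
                "idsite=1",                                        // ID of the tracked site in Matomo
                "rec=1",                                           // required so the request is actually recorded
                "e_c=" + enc("Accordion"),                         // event category
                "e_a=" + enc("Opened"),                            // event action
                "e_n=" + enc("Pricing FAQ"),                       // event name: which accordion was opened
                "url=" + enc("https://www.example.com/pricing"));  // page the event happened on

        final HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint + "?" + query)).GET().build();
        final HttpResponse<Void> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.discarding());
        System.out.println("Matomo responded with HTTP " + response.statusCode());
    }

    private static String enc(final String value) {
        return URLEncoder.encode(value, StandardCharsets.UTF_8);
    }
}

Events recorded this way appear under Behaviour -> Events, grouped by the categories, actions and names you chose – exactly the kind of data the Catalyst example relies on.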
Behaviour -> Content interactions
Content tracking allows you to track interaction within the content of your web page. Go beyond page views, bounce rates and average time spent on the page. Instead, you can analyse content interaction rates based on mouse clicks and configured scrolling or hovering behaviours to see precisely how engaged your users are. If interaction rates are low, perhaps you need to restructure your page layout to grab your users’ attention sooner. Possibly you will get more interaction when you have more images or banner ads linking to other areas of your business.
Watch this video to learn about the Behaviour feature
2. Set up Heatmaps
Effortlessly discover how your visitors truly engage with the web pages that matter most to the success of your business. Heatmaps show you visually where your visitors try to click, where they move the mouse and how far down they scroll on each page.
You don’t need to waste time digging for key metrics or worry about putting together tables of data to understand how your visitors are interacting with your website. Heatmaps make it easy and fast to discover where your users are paying their attention, where they have problems, where useless content is and how engaging your content is. Get insights that you cannot get from traditional reports. Learn more
3. Carry out A/B testing
With A/B Testing you reduce risk in your decision-making and can test what your visitors are responding well to.
Ever had discussions with colleagues about where to place content on a landing page? Or discussed what the call-to-action should be and assumed you were making the best decisions? The truth is, you never know what really works best (and what doesn’t) unless you test it. Learn more
How to increase engagement with A/B Testing: Test, test and test. This is a surefire way to learn what content is leading your visitors on a path to conversion and what isn’t.
4. Media Analytics
Media Analytics tells you how visitors are engaging with your video or audio content, and whether it’s leading to your desired conversions. Track:
- How many plays your media gets and which parts they viewed
- Finish rates
- How your media was consumed over time
- How media was consumed on specific days
- Which locations your users were viewing your content from
- Learn more
How to increase engagement with Media Analytics: These metrics give a picture of how audiences behave when it comes to your content. By showing insights such as how popular your media content is, how engaging it is and on which days content is most viewed, you can tailor content strategies to produce content people will actually find interesting and watch or listen to.
Matomo example: When we went through the metrics for the feature videos on our own site to see how they were performing, we noticed our Acquisition video had a 95% completion rate. Even though it was longer than most videos, the stats showed us it had, by far, the most engagement. By using Media Analytics to get insights on the best and worst performing videos, we gathered useful information to help us allocate resources more effectively, so that in the future we’re producing more videos that will be watched.
5. Investigate transitions
See which pages visitors enter the site from and where they exit to. Transitions shows engagement on each page and whether the content is leading visitors to the pages you want to direct them to.
This gives you a greater understanding of user pathways. You may assume visitors are finding your content through one particular pathway, but discover that users are actually coming through other channels you never thought of. Through Transitions, you may discover and capitalise on new opportunities from external sites.
How to increase engagement with Transitions: Identify clearly where users may be getting distracted and clicking away, and where other pages are creating opportunities to click through to conversion.
6. Create Custom Reports
You can choose from over 200 dimensions and metrics to get the insights you need as well as various visualisation options. This makes understanding the data incredibly easy and you can get the insights you need instantly for faster results without the need for a developer. Learn more
How to increase engagement with Custom Reports: Set up custom reports to see when content is being viewed, and figure out how engaged users are by looking at different hours of the day or which days of the week they visit your website. For example, you could be wondering what hour of the day performs best for converting your customers. Understanding these metrics helps you figure out the best time to schedule your blog posts, pay-per-click advertising, EDMs or social media posts, knowing that your visitors are more likely to convert at certain times.
7. Other metrics to keep an eye on…
A good indication of a great experience and of engagement is whether your readers, viewers or listeners want to do it again and again.
“Best” metrics are hard to determine, so you’ll need to ask yourself what you want to do or what you want your site to do. How do you want your users to behave, or what kind of buyer’s journey do you want them to have?
Want to know where to start? Look at…
- Bounce rate – a high bounce rate isn’t great as people aren’t finding what they’re looking for and are leaving without taking action. (This offers great opportunities as you can test to see why people are bouncing off your site and figure out what you need to change.)
- Time on site – a long time on site is usually a good indication that people are spending time reading, navigating and being engaged with your website.
- Frequency of visit – how often do people come back to interact with the content on your website? The higher the percentage of visitors who come back time and time again, the more engaged they are with your content.
- Session length/average session duration – how much time users spend on site each session.
- Pages per session – a great indicator of engagement because it shows visitors are happy going through your website and learning more about your business.
Key takeaway
Whichever stage of the buyer’s journey your visitors are in, you need to ensure your content is optimised for engagement so that visitors can easily spend time on your website.
“Every single visit by every single visitor is no longer judged as a success or a failure at the end of 29 min (max) session in your analytics tool. Every visit is not a ‘last-visit’, rather it becomes a continuous experience leading to a win-win outcome.” – Avinash Kaushik
As you can tell, one size does not fit all when it comes to analysing and measuring engagement, but with a toolkit of features, you can make sure you have everything you need to experiment and figure out the metrics that matter to the success of your business and website.
At the same time, these gentle nudges for visitors to consume more and more content encourage them along their path to purchase, conversion or loyalty. They get a more engaging website experience over time, and you get happy visitors/customers who end up coming back for more.
Want to learn how to increase conversions with Matomo? Look out for the final in this series: part 3! We’ll go through how you can boost conversions and meet your business goals with web analytics.
-
How to Conduct a Customer Journey Analysis (Step-by-Step)
9 May 2024, by Erin
-
Failed to decode h264 key frame with DXVA2.0 because returned buffer is too small
16 May 2023, by grill2010
I have a strange problem on Windows with DXVA2 h264 decoding. I recently figured out an FFmpeg decoding limitation for DXVA2 and D3D11VA on Windows and how to solve it; this solution completely fixes the problem with D3D11VA, but DXVA2 still has problems with certain keyframes. Upon further investigation it turned out that decoding these keyframes fails because the buffer returned from the IDirectXVideoDecoder_GetBuffer function is too small. FFmpeg prints the following logs when decoding fails:


Error: [h264 @ 0000028b2e5796c0] Buffer for type 5 was too small. size: 58752, dxva_size: 55296
Error: [h264 @ 0000028b2e5796c0] Failed to add bitstream or slice control buffer
Error: [h264 @ 0000028b2e5796c0] hardware accelerator failed to decode picture



Why is the returned buffer too small? What factors inside FFmpeg affect this buffer size, or is this a limitation of DXVA2 in general? All other decoders, like Cuvid, D3D11VA or the software decoder, are not affected by this problem and can decode all keyframes.


I have an example javacv project on github that can reproduce the problem. I also provide the source code of the main class here. The keyframe example data with prepended SPS and PPS in hex form can be downloaded here.


import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.layout.Pane;
import javafx.scene.layout.StackPane;
import javafx.stage.Stage;
import org.bytedeco.ffmpeg.avcodec.AVCodec;
import org.bytedeco.ffmpeg.avcodec.AVCodecContext;
import org.bytedeco.ffmpeg.avcodec.AVCodecHWConfig;
import org.bytedeco.ffmpeg.avcodec.AVPacket;
import org.bytedeco.ffmpeg.avutil.AVBufferRef;
import org.bytedeco.ffmpeg.avutil.AVDictionary;
import org.bytedeco.ffmpeg.avutil.AVFrame;
import org.bytedeco.ffmpeg.avutil.LogCallback;
import org.bytedeco.javacpp.BytePointer;
import org.bytedeco.javacpp.IntPointer;
import org.bytedeco.javacpp.Pointer;
import org.tinylog.Logger;

import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Objects;
import java.util.function.Consumer;

import static org.bytedeco.ffmpeg.avcodec.AVCodecContext.FF_THREAD_SLICE;
import static org.bytedeco.ffmpeg.global.avcodec.*;
import static org.bytedeco.ffmpeg.global.avutil.*;

public class App extends Application {

 /**** decoder variables ****/

 private AVHWContextInfo hardwareContext;

 private AVCodec decoder;
 private AVCodecContext m_VideoDecoderCtx;

 private AVCodecContext.Get_format_AVCodecContext_IntPointer formatCallback;

 private final int streamResolutionX = 1920;
 private final int streamResolutionY = 1080;

 // AV_HWDEVICE_TYPE_CUDA // example works with cuda
 // AV_HWDEVICE_TYPE_DXVA2 // producing Invalid data found on keyframe
 // AV_HWDEVICE_TYPE_D3D11VA // producing Invalid data found on keyframe
 private static final int HW_DEVICE_TYPE = AV_HWDEVICE_TYPE_DXVA2;

 private static final boolean USE_HW_ACCEL = true;

 private static final boolean USE_AV_EF_EXPLODE = true;

 public static void main(final String[] args) {
 //System.setProperty("prism.order", "d3d,sw");
 System.setProperty("prism.vsync", "false");
 Application.launch(App.class);
 }

 @Override
 public void start(final Stage primaryStage) {
 final Pane dummyPane = new Pane();
 dummyPane.setStyle("-fx-background-color: black");
 final Scene scene = new Scene(dummyPane, this.streamResolutionX, this.streamResolutionY);
 primaryStage.setScene(scene);
 primaryStage.show();
 primaryStage.setMinWidth(480);
 primaryStage.setMinHeight(360);

 this.initializeFFmpeg(result -> {
 if (!result) {
 Logger.error("FFmpeg could not be initialized correctly, terminating program");
 System.exit(1);
 return;
 }
 scene.setRoot(new StackPane());
 this.performTestFramesFeeding();
 });
 }

 private void initializeFFmpeg(final Consumer<Boolean> finishHandler) {
 FFmpegLogCallback.setLevel(AV_LOG_DEBUG); // Increase log level until the first frame is decoded
 FFmpegLogCallback.set();
 Pointer pointer = new Pointer((Pointer) null);
 AVCodec c;
 while ((c = av_codec_iterate(pointer)) != null) {
 if (av_codec_is_decoder(c) > 0)
 Logger.debug("{}:{} ", c.name().getString(), c.type());
 }

 this.decoder = avcodec_find_decoder(AV_CODEC_ID_H264); // usually decoder name is h264 and without hardware support it's yuv420p otherwise nv12
 if (this.decoder == null) {
 Logger.error("Unable to find decoder for format {}", "h264");
 finishHandler.accept(false);
 return;
 }
 Logger.info("Current decoder name: {}, {}", this.decoder.name().getString(), this.decoder.long_name().getString());

 if (true) {
 for (; ; ) {
 this.m_VideoDecoderCtx = avcodec_alloc_context3(this.decoder);
 if (this.m_VideoDecoderCtx == null) {
 Logger.error("Unable to find decoder for format AV_CODEC_ID_H264");
 if (this.hardwareContext != null) {
 this.hardwareContext.free();
 this.hardwareContext = null;
 }
 continue;
 }

 if (App.USE_HW_ACCEL) {
 this.hardwareContext = this.createHardwareContext();
 if (this.hardwareContext != null) {
 Logger.info("Set hwaccel support");
 this.m_VideoDecoderCtx.hw_device_ctx(this.hardwareContext.hwContext()); // comment to disable hwaccel
 }
 } else {
 Logger.info("Hwaccel manually disabled");
 }

 // Always request low delay decoding
 this.m_VideoDecoderCtx.flags(this.m_VideoDecoderCtx.flags() | AV_CODEC_FLAG_LOW_DELAY);

 // Allow display of corrupt frames and frames missing references
 this.m_VideoDecoderCtx.flags(this.m_VideoDecoderCtx.flags() | AV_CODEC_FLAG_OUTPUT_CORRUPT);
 this.m_VideoDecoderCtx.flags2(this.m_VideoDecoderCtx.flags2() | AV_CODEC_FLAG2_SHOW_ALL);

 if (App.USE_AV_EF_EXPLODE) {
 // Report decoding errors to allow us to request a key frame
 this.m_VideoDecoderCtx.err_recognition(this.m_VideoDecoderCtx.err_recognition() | AV_EF_EXPLODE);
 }

 // Enable slice multi-threading for software decoding
 if (this.m_VideoDecoderCtx.hw_device_ctx() == null) { // if not hw accelerated
 this.m_VideoDecoderCtx.thread_type(this.m_VideoDecoderCtx.thread_type() | FF_THREAD_SLICE);
 this.m_VideoDecoderCtx.thread_count(2/*AppUtil.getCpuCount()*/);
 } else {
 // No threading for HW decode
 this.m_VideoDecoderCtx.thread_count(1);
 }

 this.m_VideoDecoderCtx.width(this.streamResolutionX);
 this.m_VideoDecoderCtx.height(this.streamResolutionY);
 this.m_VideoDecoderCtx.pix_fmt(this.getDefaultPixelFormat());

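 // get_format callback: FFmpeg passes in the pixel formats it can output,
 // terminated by AV_PIX_FMT_NONE. Pick the hardware pixel format when a hw
 // device context is attached; if the preferred format is not offered, fall
 // back to AV_PIX_FMT_YUV420P, then AV_PIX_FMT_NV12, then AV_PIX_FMT_NONE.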
 this.formatCallback = new AVCodecContext.Get_format_AVCodecContext_IntPointer() {
 @Override
 public int call(final AVCodecContext context, final IntPointer pixelFormats) {
 final boolean hwDecodingSupported = context.hw_device_ctx() != null && App.this.hardwareContext != null;
 final int preferredPixelFormat = hwDecodingSupported ?
 App.this.hardwareContext.hwConfig().pix_fmt() :
 context.pix_fmt();
 int i = 0;
 while (true) {
 final int currentSupportedFormat = pixelFormats.get(i++);
 System.out.println("Supported pixel formats " + currentSupportedFormat);
 if (currentSupportedFormat == AV_PIX_FMT_NONE) {
 break;
 }
 }

 i = 0;
 while (true) {
 final int currentSupportedFormat = pixelFormats.get(i++);
 if (currentSupportedFormat == preferredPixelFormat) {
 Logger.info("[FFmpeg]: pixel format in format callback is {}", currentSupportedFormat);
 return currentSupportedFormat;
 }
 if (currentSupportedFormat == AV_PIX_FMT_NONE) {
 break;
 }
 }

 i = 0;
 while (true) { // try again and search for yuv
 final int currentSupportedFormat = pixelFormats.get(i++);
 if (currentSupportedFormat == AV_PIX_FMT_YUV420P) {
 Logger.info("[FFmpeg]: Not found in first match so use {}", AV_PIX_FMT_YUV420P);
 return currentSupportedFormat;
 }
 if (currentSupportedFormat == AV_PIX_FMT_NONE) {
 break;
 }
 }

 i = 0;
 while (true) { // try again and search for nv12
 final int currentSupportedFormat = pixelFormats.get(i++);
 if (currentSupportedFormat == AV_PIX_FMT_NV12) {
 Logger.info("[FFmpeg]: Not found in second match so use {}", AV_PIX_FMT_NV12);
 return currentSupportedFormat;
 }
 if (currentSupportedFormat == AV_PIX_FMT_NONE) {
 break;
 }
 }

 Logger.info("[FFmpeg]: pixel format in format callback is using fallback {}", AV_PIX_FMT_NONE);
 return AV_PIX_FMT_NONE;
 }
 };
 this.m_VideoDecoderCtx.get_format(this.formatCallback);

 final AVDictionary options = new AVDictionary(null);
 final int result = avcodec_open2(this.m_VideoDecoderCtx, this.decoder, options);
 if (result < 0) {
 Logger.error("avcodec_open2 was not successful");
 finishHandler.accept(false);
 return;
 }
 av_dict_free(options);
 break;
 }
 }

 if (this.decoder == null || this.m_VideoDecoderCtx == null) {
 finishHandler.accept(false);
 return;
 }
 finishHandler.accept(true);
 }

 private AVHWContextInfo createHardwareContext() {
 AVHWContextInfo result = null;
 for (int i = 0; ; i++) {
 final AVCodecHWConfig config = avcodec_get_hw_config(this.decoder, i);
 if (config == null) {
 break;
 }

 if ((config.methods() & AV_CODEC_HW_CONFIG_METHOD_HW_DEVICE_CTX) == 0) {
 continue;
 }
 final int device_type = config.device_type();
 if (device_type != App.HW_DEVICE_TYPE) {
 continue;
 }
 final AVBufferRef hw_context = av_hwdevice_ctx_alloc(device_type);
 if (hw_context == null || av_hwdevice_ctx_create(hw_context, device_type, (String) null, null, 0) < 0) {
 Logger.error("HW accel not supported for type {}", device_type);
 av_free(config);
 av_free(hw_context);
 } else {
 Logger.info("HW accel created for type {}", device_type);
 result = new AVHWContextInfo(config, hw_context);
 }
 break;
 }

 return result;
 }

 @Override
 public void stop() {
 this.releaseNativeResources();
 }

 /*****************************/
 /*** test frame processing ***/
 /*****************************/
 
 private void performTestFramesFeeding() {
 final AVPacket pkt = av_packet_alloc();
 if (pkt == null) {
 return;
 }
 try (final BytePointer bp = new BytePointer(65_535 * 15)) {


 for (int i = 0; i < 1; i++) {
 final byte[] frameData = AVTestFrames.h264KeyTestFrame;

 bp.position(0);

 bp.put(frameData);
 bp.limit(frameData.length);

 pkt.data(bp);
 pkt.capacity(bp.capacity());
 pkt.size(frameData.length);
 pkt.position(0);
 pkt.limit(frameData.length);
 //pkt.flags(AV_PKT_FLAG_KEY);
 final AVFrame avFrame = av_frame_alloc();
 System.out.println("frameData.length " + frameData.length);

 final int err = avcodec_send_packet(this.m_VideoDecoderCtx, pkt); //fill_scaling_lists
 if (err < 0) {
 final BytePointer buffer = new BytePointer(512);
 av_strerror(err, buffer, buffer.capacity());
 final String string = buffer.getString();
 System.out.println("Error on decoding test frame " + err + " message " + string);
 av_frame_free(avFrame);
 return;
 }

 final int result = avcodec_receive_frame(this.m_VideoDecoderCtx, avFrame);
 final AVFrame decodedFrame;
 if (result == 0) {
 if (this.m_VideoDecoderCtx.hw_device_ctx() == null) {
 decodedFrame = avFrame;
 System.out.println("SUCCESS with SW decoding");
 } else {
 final AVFrame hwAvFrame = av_frame_alloc();
 if (av_hwframe_transfer_data(hwAvFrame, avFrame, 0) < 0) {
 System.out.println("Failed to transfer frame from hardware");
 av_frame_unref(hwAvFrame);
 decodedFrame = avFrame;
 } else {
 av_frame_unref(avFrame);
 decodedFrame = hwAvFrame;
 System.out.println("SUCCESS with HW decoding");
 }
 }

 av_frame_unref(decodedFrame);
 } else {
 final BytePointer buffer = new BytePointer(512);
 av_strerror(result, buffer, buffer.capacity());
 final String string = buffer.getString();
 System.out.println("error " + result + " message " + string);
 av_frame_free(avFrame);
 }
 }
 } finally {
 if (pkt.stream_index() != -1) {
 av_packet_unref(pkt);
 }
 pkt.releaseReference();
 }
 }

 final Object releaseLock = new Object();
 private volatile boolean released = false;

 private void releaseNativeResources() {
 if (this.released) {
 return;
 }
 this.released = true;
 synchronized (this.releaseLock) {
 // Close the video codec
 if (this.m_VideoDecoderCtx != null) {
 avcodec_free_context(this.m_VideoDecoderCtx);
 this.m_VideoDecoderCtx = null;
 }

 // close the format callback
 if (this.formatCallback != null) {
 this.formatCallback.close();
 this.formatCallback = null;
 }

 // close hw context
 if (this.hardwareContext != null) {
 this.hardwareContext.free();
 }
 }
 }

 private int getDefaultPixelFormat() {
 return AV_PIX_FMT_YUV420P; // Always return yuv420p here
 }


 /*********************/
 /*** inner classes ***/
 /*********************/

 public static final class HexUtil {

 private HexUtil() {
 }

 public static byte[] unhexlify(final String argbuf) {
 final int arglen = argbuf.length();
 if (arglen % 2 != 0) {
 throw new RuntimeException("Odd-length string");
 } else {
 final byte[] retbuf = new byte[arglen / 2];

 for (int i = 0; i < arglen; i += 2) {
 final int top = Character.digit(argbuf.charAt(i), 16);
 final int bot = Character.digit(argbuf.charAt(i + 1), 16);
 if (top == -1 || bot == -1) {
 throw new RuntimeException("Non-hexadecimal digit found");
 }

 retbuf[i / 2] = (byte) ((top << 4) + bot);
 }

 return retbuf;
 }
 }
 }

 public static final class AVHWContextInfo {
 private final AVCodecHWConfig hwConfig;
 private final AVBufferRef hwContext;

 private volatile boolean freed = false;

 public AVHWContextInfo(final AVCodecHWConfig hwConfig, final AVBufferRef hwContext) {
 this.hwConfig = hwConfig;
 this.hwContext = hwContext;
 }

 public AVCodecHWConfig hwConfig() {
 return this.hwConfig;
 }

 public AVBufferRef hwContext() {
 return this.hwContext;
 }

 public void free() {
 if (this.freed) {
 return;
 }
 this.freed = true;
 av_free(this.hwConfig);
 av_free(this.hwContext);
 }


 @Override
 public boolean equals(Object o) {
 if (this == o) return true;
 if (o == null || getClass() != o.getClass()) return false;
 AVHWContextInfo that = (AVHWContextInfo) o;
 return freed == that.freed && Objects.equals(hwConfig, that.hwConfig) && Objects.equals(hwContext, that.hwContext);
 }

 @Override
 public int hashCode() {
 return Objects.hash(hwConfig, hwContext, freed);
 }

 @Override
 public String toString() {
 return "AVHWContextInfo[" +
 "hwConfig=" + this.hwConfig + ", " +
 "hwContext=" + this.hwContext + ']';
 }
 }

 public static final class AVTestFrames {

 private AVTestFrames() {

 }

 static {
 InputStream inputStream = null;
 try {
 inputStream = AVTestFrames.class.getClassLoader().getResourceAsStream("h264_test_key_frame.txt");
 final byte[] h264TestFrameBuffer = inputStream == null ? new byte[0] : inputStream.readAllBytes();
 final String h264TestFrame = new String(h264TestFrameBuffer, StandardCharsets.UTF_8);
 AVTestFrames.h264KeyTestFrame = HexUtil.unhexlify(h264TestFrame);
 } catch (final IOException e) {
 Logger.error(e, "Could not parse test frame");
 } finally {
 if (inputStream != null) {
 try {
 inputStream.close();
 } catch (final IOException e) {
 Logger.error(e, "Could not close test frame input stream");
 }
 }
 }
 }

 public static byte[] h264KeyTestFrame;
 }

 public static class FFmpegLogCallback extends LogCallback {

 private static final org.bytedeco.javacpp.tools.Logger logger = org.bytedeco.javacpp.tools.Logger.create(FFmpegLogCallback.class);

 static final FFmpegLogCallback instance = new FFmpegLogCallback().retainReference();

 public static FFmpegLogCallback getInstance() {
 return instance;
 }

 /**
 * Calls {@code avutil.setLogCallback(getInstance())}.
 */
 public static void set() {
 setLogCallback(getInstance());
 }

 /**
 * Returns {@code av_log_get_level()}.
 **/
 public static int getLevel() {
 return av_log_get_level();
 }

 /**
 * Calls {@code av_log_set_level(level)}.
 **/
 public static void setLevel(int level) {
 av_log_set_level(level);
 }

 @Override
 public void call(int level, BytePointer msg) {
 switch (level) {
 case AV_LOG_PANIC, AV_LOG_FATAL, AV_LOG_ERROR -> logger.error(msg.getString());
 case AV_LOG_WARNING -> logger.warn(msg.getString());
 case AV_LOG_INFO -> logger.info(msg.getString());
 case AV_LOG_VERBOSE, AV_LOG_DEBUG, AV_LOG_TRACE -> logger.debug(msg.getString());
 default -> {
 assert false;
 }
 }
 }
 }
}
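
A possible stop-gap (though not an answer to the root cause) would be to detect the failed hardware decode and retry the same packet with a plain software H.264 decoder context, since software decoding handles these keyframes fine. Below is a minimal, hedged sketch of that idea; it only uses the JavaCV calls already shown above, and the class name SoftwareFallbackDecoder is purely illustrative.

import org.bytedeco.ffmpeg.avcodec.AVCodec;
import org.bytedeco.ffmpeg.avcodec.AVCodecContext;
import org.bytedeco.ffmpeg.avcodec.AVPacket;
import org.bytedeco.ffmpeg.avutil.AVDictionary;
import org.bytedeco.ffmpeg.avutil.AVFrame;

import static org.bytedeco.ffmpeg.global.avcodec.*;
import static org.bytedeco.ffmpeg.global.avutil.*;

// Hedged sketch: a software-only H.264 decoder used to retry packets that the
// DXVA2 path rejects. Error handling is reduced to the bare minimum.
public final class SoftwareFallbackDecoder {

    private final AVCodecContext swCtx;

    public SoftwareFallbackDecoder(final int width, final int height) {
        final AVCodec decoder = avcodec_find_decoder(AV_CODEC_ID_H264);
        if (decoder == null) {
            throw new IllegalStateException("No H.264 decoder available");
        }
        this.swCtx = avcodec_alloc_context3(decoder);
        this.swCtx.width(width);
        this.swCtx.height(height);
        this.swCtx.pix_fmt(AV_PIX_FMT_YUV420P);
        if (avcodec_open2(this.swCtx, decoder, (AVDictionary) null) < 0) {
            throw new IllegalStateException("avcodec_open2 failed for the software fallback");
        }
    }

    /** Returns a decoded frame or null; the caller owns the frame and must call av_frame_free on it. */
    public AVFrame decode(final AVPacket pkt) {
        if (avcodec_send_packet(this.swCtx, pkt) < 0) {
            return null;
        }
        final AVFrame frame = av_frame_alloc();
        if (avcodec_receive_frame(this.swCtx, frame) == 0) {
            return frame;
        }
        av_frame_free(frame);
        return null;
    }

    public void close() {
        avcodec_free_context(this.swCtx);
    }
}

In performTestFramesFeeding this could be called whenever avcodec_send_packet or avcodec_receive_frame fails on the DXVA2 context, at the cost of losing hardware acceleration for those frames – which is why I would still like to understand why the DXVA2 bitstream buffer is too small in the first place.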