Advanced search

Media (0)

No media matching your criteria is available on this site.

Other articles (84)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

  • Customising categories

    21 June 2013, by

    Category creation form
    For those who know SPIP well, a category can be thought of as a rubrique (section).
    For a document of type "category", the fields offered by default are: Text
    This form can be modified under:
    Administration > Configuration of form masks.
    For a document of type "media", the fields not displayed by default are: Short description
    It is also in this configuration section that you can specify the (...)

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add your own via the form at the bottom of the page.

On other sites (11409)

  • Video encoding task not working with Django Celery Redis FFMPEG and GraphQL

    18 June 2023, by phanio

    I'm having a hard time understanding how this FFmpeg encoding works while using Django, Celery, Redis, GraphQL and Docker.

    


    I have a video/courses platform project, and what I'm trying to do with FFmpeg, Celery and Redis is to create different video resolutions so I can offer them in the video player the way YouTube does (the player is handled on the frontend by Next.js and Apollo Client). On the backend I've just learned that, to use FFmpeg properly to resize the original video, I need Celery and Redis to run the encoding as asynchronous tasks. I've found a few older posts here on Stack Overflow and on Google, but there isn't quite enough information for someone using FFmpeg, Celery and Redis for the first time (I've already worked step by step through the example that adds two numbers together with Celery, and that works fine). Now I'm not sure what is wrong with my code, because first of all I'm not really sure from which file I should trigger the task, and at the end of the task I want to send the data through the API using GraphQL (Strawberry).
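Since part of the question is where to enqueue the work: a common pattern is to keep GraphQL queries side-effect free and enqueue the encoding from the mutation that saves the upload (i.e. call something like `task_video_encoding_720p.delay(video_id)` right after `video.save()`). A minimal sketch of just that control flow, with hypothetical stand-in callables instead of the real Celery/Django wiring:

```python
# Sketch only: `save_video` stands in for persisting the upload and
# `enqueue_task` stands in for task_video_encoding_720p.delay.
def create_video(save_video, enqueue_task):
    video_id = save_video()   # persist first, so the worker can load the file
    enqueue_task(video_id)    # .delay() returns immediately; Celery does the work
    return video_id

# Usage with dummy callables:
queued = []
assert create_video(lambda: 42, queued.append) == 42
assert queued == [42]
```

The design point is that a query resolver such as `video_by_id` should only read data; kicking off encoding there means every read re-enqueues the job.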

    


    This is what I've tried so far:

    


    So, first things first, my project structure looks like this:

    


    backend/                  # root directory
    ├── backend/
    │   ├── __init__.py
    │   ├── celery.py
    │   ├── settings.py
    │   ├── urls.py
    │   └── ...
    ├── static/
    │   └── videos/
    ├── video/
    │   ├── models.py
    │   ├── schema.py
    │   ├── tasks.py
    │   ├── types.py
    │   └── ...
    ├── .env
    ├── db.sqlite3
    ├── docker-compose.yml
    ├── Dockerfile
    ├── manage.py
    └── requirements.txt


    


    here is my settings.py file :

    


    from pathlib import Path
import os

# Build paths inside the project like this: BASE_DIR / 'subdir'.
BASE_DIR = Path(__file__).resolve().parent.parent

DEBUG = True

ALLOWED_HOSTS=["localhost", "0.0.0.0", "127.0.0.1"]

DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'


# Application definition

INSTALLED_APPS = [
    "corsheaders",
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',

    "strawberry.django",
    "video"
]

etc...

STATIC_URL = '/static/'
MEDIA_URL = '/videos/'

STATICFILES_DIRS = [
    BASE_DIR / 'static',
    # BASE_DIR / 'frontend/build/static',
]

MEDIA_ROOT = BASE_DIR / 'static/videos'

STATIC_ROOT = BASE_DIR / 'staticfiles'

STATICFILES_STORAGE = 'whitenoise.storage.CompressedManifestStaticFilesStorage'

CORS_ALLOW_ALL_ORIGINS = True


CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'

# REDIS CACHE
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": f"redis://127.0.0.1:6379/1",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
        },
    }
}

# Docker
CELERY_BROKER_URL = os.environ.get("CELERY_BROKER", "redis://redis:6379/0")
# note: docker-compose sets CELERY_BACKEND (not CELERY_BROKER) for the result
# backend, so the variable name read here is probably unintended
CELERY_RESULT_BACKEND = os.environ.get("CELERY_BROKER", "redis://redis:6379/0")


    


    This is my main urls.py file :

    


    from django.contrib import admin
from django.conf import settings
from django.conf.urls.static import static
from django.urls import path
from django.urls.conf import include
from strawberry.django.views import GraphQLView

from video.schema import schema

urlpatterns = [
    path('admin/', admin.site.urls),
    path("graphql", GraphQLView.as_view(schema=schema)),
]

if settings.DEBUG:
    urlpatterns += static(settings.MEDIA_URL,
                          document_root=settings.MEDIA_ROOT)
    urlpatterns += static(settings.STATIC_URL,
                          document_root=settings.STATIC_ROOT)


    


    This is my celery.py file :

    


    from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings')

backend = Celery('backend')

backend.config_from_object('django.conf:settings', namespace="CELERY")

backend.autodiscover_tasks()

@backend.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))


    


    This is my __init__.py file:

    


    from .celery import backend as celery_backend

__all__ = ('celery_backend',)


    


    This is my Dockerfile :

    


    FROM python:3
ENV PYTHONUNBUFFERED=1

WORKDIR /usr/src/backend

RUN apt-get -y update
RUN apt-get -y upgrade
RUN apt-get install -y ffmpeg

COPY requirements.txt ./
RUN pip install -r requirements.txt


    


    This is my docker-compose.yml file :

    


    version: "3.8"

services:
  django:
    build: .
    container_name: django
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/usr/src/backend/
    ports:
      - "8000:8000"
    environment:
      - DEBUG=1
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
      - CELERY_BROKER=redis://redis:6379/0
      - CELERY_BACKEND=redis://redis:6379/0
    depends_on:
      - pgdb
      - redis

  celery:
    build: .
    command: celery -A backend worker -l INFO
    volumes:
      - .:/usr/src/backend
    depends_on:
      - django
      - redis

  pgdb:
    image: postgres
    container_name: pgdb
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    volumes:
      - pgdata:/var/lib/postgresql/data/

  redis:
    image: "redis:alpine"

volumes:
  pgdata:


    


    And now inside my video app folder :

    


    My models.py file :

    


      

    • here I've created separate fields for every resolution, from video_file_2k down to video_file_144; I was thinking that after the encoding finishes the task would populate those fields


    


    from django.db import models
from django.urls import reverse


class Video(models.Model):
    video_id = models.AutoField(primary_key=True, editable=False)
    slug = models.SlugField(max_length=255)
    title = models.CharField(max_length=150, blank=True, null=True)
    description = models.TextField(blank=True, null=True)
    video_file = models.FileField(null=False, blank=False)
    video_file_2k = models.FileField(null=True, blank=True)
    video_file_fullhd = models.FileField(null=True, blank=True)
    video_file_hd = models.FileField(null=True, blank=True)
    video_file_480 = models.FileField(null=True, blank=True)
    video_file_360 = models.FileField(null=True, blank=True)
    video_file_240 = models.FileField(null=True, blank=True)
    video_file_144 = models.FileField(null=True, blank=True)
    category = models.CharField(max_length=64, blank=False, null=False)
    created_at = models.DateTimeField(
        ("Created at"), auto_now_add=True, editable=False)
    updated_at = models.DateTimeField(("Updated at"), auto_now=True)

    class Meta:
        ordering = ("-created_at",)
        verbose_name = ("Video")
        verbose_name_plural = ("Videos")

    def get_absolute_url(self):
        return reverse("store:video_detail", args=[self.slug])

    def __str__(self):
        return self.title


    


    This is my schema.py file :

    


    import strawberry
from strawberry.file_uploads import Upload
from typing import List
from .types import VideoType
from .models import Video
from .tasks import task_video_encoding_1080p, task_video_encoding_720p


@strawberry.type
class Query:
    @strawberry.field
    def videos(self, category: str = None) -> List[VideoType]:
        if category:
            videos = Video.objects.filter(category=category)
            return videos
        return Video.objects.all()

    @strawberry.field
    def video(self, slug: str) -> VideoType:
        if slug == slug:
            video = Video.objects.get(slug=slug)
            return video

    @strawberry.field
    def video_by_id(self, video_id: int) -> VideoType:
        if video_id == video_id:
            video = Video.objects.get(pk=video_id)

          # Here I tried to trigger my tasks: when I visited the 0.0.0.0:8000/graphql url
          # and queried for a video by its id, I got the error from celery
            task_video_encoding_1080p.delay(video_id)
            task_video_encoding_720p.delay(video_id)

            return video


@strawberry.type
class Mutation:
    @strawberry.field
    def create_video(self, slug: str, title: str, description: str, video_file: Upload, video_file_2k: str, video_file_fullhd: str, video_file_hd: str, video_file_480: str, video_file_360: str, video_file_240: str, video_file_144: str, category: str) -> VideoType:

        video = Video(slug=slug, title=title, description=description,
                      video_file=video_file, video_file_2k=video_file_2k, video_file_fullhd=video_file_fullhd, video_file_hd=video_file_hd, video_file_480=video_file_480, video_file_360=video_file_360, video_file_240=video_file_240, video_file_144=video_file_144,category=category)
        
        video.save()
        return video

    @strawberry.field
    def update_video(self, video_id: int, slug: str, title: str, description: str, video_file: str, category: str) -> VideoType:
        video = Video.objects.get(video_id=video_id)
        video.slug = slug
        video.title = title
        video.description = description
        video.video_file = video_file
        video.category = category
        video.save()
        return video

    @strawberry.field
    def delete_video(self, video_id: int) -> bool:
        video = Video.objects.get(video_id=video_id)
        video.delete()  # parentheses required; a bare `video.delete` never runs
        return True


schema = strawberry.Schema(query=Query, mutation=Mutation)


    


    This is my types.py file ( strawberry graphql related ) :

    


    import strawberry

from .models import Video


@strawberry.django.type(Video)
class VideoType:
    video_id: int
    slug: str
    title: str
    description: str
    video_file: str
    video_file_2k: str
    video_file_fullhd: str
    video_file_hd: str
    video_file_480: str
    video_file_360: str
    video_file_240: str
    video_file_144: str
    category: str


    


    And this is my tasks.py file :

    


    from __future__ import absolute_import, unicode_literals
import os, subprocess
from django.conf import settings
from django.core.exceptions import ValidationError
from celery import shared_task
from celery.utils.log import get_task_logger
from .models import Video
FFMPEG_PATH = os.environ["IMAGEIO_FFMPEG_EXE"] = "/opt/homebrew/Cellar/ffmpeg/6.0/bin/ffmpeg"

logger = get_task_logger(__name__)


# CELERY TASKS
@shared_task
def add(x,y):
    return x + y


@shared_task
def task_video_encoding_720p(video_id):
    logger.info('Video Processing started')
    try:
        video = Video.objects.get(video_id=video_id)
        input_file_path = video.video_file.path
        input_file_url = video.video_file.url
        input_file_name = video.video_file.name

        # get the filename (without extension)
        filename = os.path.basename(input_file_url)

        # path to the new file, change it according to where you want to put it
        output_file_name = os.path.join('videos', 'mp4', '{}.mp4'.format(filename))
        output_file_path = os.path.join(settings.MEDIA_ROOT, output_file_name)

        # 2-pass encoding
        for i in range(1):
           new_video_720p = subprocess.call([FFMPEG_PATH, '-i', input_file_path, '-s', '1280x720', '-vcodec', 'mpeg4', '-acodec', 'libvo_aacenc', '-b', '10000k', '-pass', i, '-r', '30', output_file_path])
        #    new_video_720p = subprocess.call([FFMPEG_PATH, '-i', input_file_path, '-s', '{}x{}'.format(height * 16/9, height), '-vcodec', 'mpeg4', '-acodec', 'libvo_aacenc', '-b', '10000k', '-pass', i, '-r', '30', output_file_path])

        if new_video_720p == 0:
            # save the new file in the database
            # video.video_file_hd.name = output_file_name
            video.save(update_fields=['video_file_hd'])
            logger.info('Video Processing Finished')
            return video

        else:
            logger.info('Proceesing Failed.') # Just for now

    except Exception:
        # note: catching everything and raising a generic message hides the
        # real error; logging the original exception would make debugging easier
        raise ValidationError('Something went wrong')


@shared_task
# def task_video_encoding_1080p(video_id, height):
def task_video_encoding_1080p(video_id):
    logger.info('Video Processing started')
    try:
        video = Video.objects.get(video_id=video_id)
        input_file_path = video.video_file.url
        input_file_name = video.video_file.name

        # get the filename (without extension)
        filename = os.path.basename(input_file_path)

        # path to the new file, change it according to where you want to put it
        output_file_name = os.path.join('videos', 'mp4', '{}.mp4'.format(filename))
        output_file_path = os.path.join(settings.MEDIA_ROOT, output_file_name)

        for i in range(1):
            new_video_1080p = subprocess.call([FFMPEG_PATH, '-i', input_file_path, '-s', '1920x1080', '-vcodec', 'mpeg4', '-acodec', 'libvo_aacenc', '-b', '10000k', '-pass', i, '-r', '30', output_file_path])

        if new_video_1080p == 0:
            # save the new file in the database
            # video.video_file_hd.name = output_file_name
            video.save(update_fields=['video_file_fullhd'])
            logger.info('Video Processing Finished')
            return video
        else:
            logger.info('Proceesing Failed.') # Just for now

    except Exception:
        # note: catching everything and raising a generic message hides the
        # real error; logging the original exception would make debugging easier
        raise ValidationError('Something went wrong')


    


    On my first attempt I wasn't triggering the tasks anywhere; then I tried to trigger them from the schema.py file (GraphQL), inside video_by_id, but there I got this error:

    


    backend-celery-1  | django.core.exceptions.ValidationError: ['Something went wrong']
backend-celery-1  | [2023-06-18 16:38:52,859: ERROR/ForkPoolWorker-4] Task video.tasks.task_video_encoding_1080p[d33b1a42-5914-467c-ad5c-00565bc8be6f] raised unexpected: ValidationError(['Something went wrong'])
backend-celery-1  | Traceback (most recent call last):
backend-celery-1  |   File "/usr/src/backend/video/tasks.py", line 81, in task_video_encoding_1080p
backend-celery-1  |     new_video_1080p = subprocess.call([FFMPEG_PATH, '-i', input_file_path, '-s', '1920x1080', '-vcodec', 'mpeg4', '-acodec', 'libvo_aacenc', '-b', '10000k', '-pass', i, '-r', '30', output_file_path])
backend-celery-1  |                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-celery-1  |   File "/usr/local/lib/python3.11/subprocess.py", line 389, in call
backend-celery-1  |     with Popen(*popenargs, **kwargs) as p:
backend-celery-1  |          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-celery-1  |   File "/usr/local/lib/python3.11/subprocess.py", line 1026, in __init__
backend-celery-1  |     self._execute_child(args, executable, preexec_fn, close_fds,
backend-celery-1  |   File "/usr/local/lib/python3.11/subprocess.py", line 1883, in _execute_child
backend-celery-1  |     self.pid = _fork_exec(
backend-celery-1  |                ^^^^^^^^^^^
backend-celery-1  | TypeError: expected str, bytes or os.PathLike object, not int
backend-celery-1  | 
backend-celery-1  | During handling of the above exception, another exception occurred:
backend-celery-1  | 
backend-celery-1  | Traceback (most recent call last):
backend-celery-1  |   File "/usr/local/lib/python3.11/site-packages/celery/app/trace.py", line 477, in trace_task
backend-celery-1  |     R = retval = fun(*args, **kwargs)
backend-celery-1  |                  ^^^^^^^^^^^^^^^^^^^^
backend-celery-1  |   File "/usr/local/lib/python3.11/site-packages/celery/app/trace.py", line 760, in __protected_call__
backend-celery-1  |     return self.run(*args, **kwargs)
backend-celery-1  |            ^^^^^^^^^^^^^^^^^^^^^^^^^
backend-celery-1  |   File "/usr/src/backend/video/tasks.py", line 93, in task_video_encoding_1080p
backend-celery-1  |     raise ValidationError('Something went wrong')
backend-celery-1  | django.core.exceptions.ValidationError: ['Something went wrong']
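A hedged reading of this traceback: `subprocess.call` requires every element of the argument list to be a string (or bytes/PathLike), but the loop passes the integer `i` straight to `'-pass'`. Note also that `range(1)` only yields 0, while FFmpeg's two-pass mode expects pass numbers 1 and 2. A minimal sketch of building a valid argument list (file names are hypothetical):

```python
def build_ffmpeg_args(ffmpeg_path, input_path, output_path, pass_number):
    # every argv element must be a string; the pass number is converted
    # explicitly with str() instead of being passed as an int
    return [
        ffmpeg_path, "-i", input_path,
        "-s", "1280x720",
        "-b:v", "10000k",
        "-pass", str(pass_number),
        "-r", "30",
        output_path,
    ]

# two-pass encoding would run the command once per pass number
for n in (1, 2):
    args = build_ffmpeg_args("/usr/bin/ffmpeg", "in.mp4", "out_720p.mp4", n)
    assert all(isinstance(a, str) for a in args)
```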


    


    If anyone has done this kind of project or something similar, any suggestion or help is much appreciated.

    


    Thank you in advance !

    


  • To all Matomo plugin developers : Matomo 5 is coming, make your plugin compatible now

    5 mai 2023, par Matomo Core Team — Development

    We’re planning to release the first beta of Matomo 5 in a few weeks. To make it easy for Matomo users to upgrade to this beta, it would be great to have as many plugins on the Marketplace as possible already updated and compatible with Matomo 5, so that many users can upgrade to the first beta without any issues.

    Presumably, as you put your plugin on our Marketplace, you want people to use it. Making your plugin compatible with Matomo 5 helps ensure that people will be able to find and keep using your plugin. If your plugin is not compatible with Matomo 5, your plugin will be automatically deactivated in Matomo 5 instances. We’ll be happy to help you achieve compatibility should there be any issue.

    How do I upgrade my Matomo instance to Matomo 5 ?

    If you have installed your Matomo development environment through git, you can simply check out the Matomo 5 branch “5.x-dev” and install its dependencies by executing these commands :

    • git checkout 5.x-dev
    • composer install

    Alternatively, you can also download the latest version directly from GitHub as a zip file and run composer install afterwards.

    How do I upgrade my plugin to Matomo 5 ?

    While there are some breaking changes in Matomo 5, most of our Platform APIs remain unchanged, and almost all changes are for rarely used APIs. Quite often, making your plugin compatible with Matomo 5 will just be a matter of adjusting the “plugin.json” file (as mentioned in the migration guide).
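For illustration only (check the migration guide for the exact version constraint to use), the plugin.json adjustment typically amounts to widening the required Matomo version range, along these lines:

```json
{
  "name": "MyPlugin",
  "version": "4.1.0",
  "require": {
    "matomo": ">=5.0.0-b1,<6.0.0-b1"
  }
}
```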

    You can find all developer documentation on our developer zone which has already been updated for Matomo 5.

    How do I know my plugin changes were released successfully ?

    If you have configured an email address within your “plugin.json” file, then you will receive a confirmation or an error email within a few minutes. Alternatively, you can also check out your plugin page on the Marketplace directly. If the plugin release was successful, you will see additional links below the download button showing which versions your plugin is compatible with.

    [Screenshot: what it looks like when your plugin is compatible with multiple Matomo versions]

    How can I switch between Matomo 4 and Matomo 5 or downgrade to Matomo 4 ?

    To downgrade from Matomo 5 to Matomo 4 in your Matomo development environment :

    • check out the “4.x-dev” branch 
    • run “composer install” as usual

    When will the final Matomo 5 release be available ?

    We estimate the final stable Matomo 5.0.0 release will be released in approx. 2-3 months.

    What is new in Matomo 5 ?

    We don’t have a summary of the changes available just yet but you can see all closed issues within this release here.

    Any questions or need help ?

    If you have any questions, or experience any problems during the migration, don’t hesitate to get in touch with us. We’ll be happy to help get your plugin compatible and the update published. If you find any undocumented breaking change or find any step during the migration process not clear, please let us know as well.

    Thank you for contributing a plugin to the Marketplace and making Matomo better. We really appreciate your work !

  • FFmpeg invalid data found when processing input with D3D11VA and DXVA2 hw acceleration

    6 mai 2023, par grill2010

    I'm currently porting my Android streaming app to Windows, and for decoding the h264 video stream I use FFmpeg with optional hardware acceleration. For the last two weeks I have been reading a lot of documentation and studying many examples on the internet. My project uses JavaCV, which internally uses FFmpeg 5.1.2. On Windows I support D3D11VA, DXVA2 and Cuvid for hardware acceleration (with software decoding as a fallback). During testing I noticed some strange artefacts when using D3D11VA or DXVA2 hw acceleration. Upon further investigation I saw that I receive a lot of

    


    "Invalid data found when processing input"

    


    errors when calling avcodec_send_packet. It seems this error only occurs on certain key frames, and it is reproducible every time. The software decoder and the cuvid decoder have absolutely no problem processing and decoding such a frame, so I'm not sure why there should be invalid data in it. I played around a lot with the decoder configuration, but nothing seems to help, and at this point I think this is definitely not normal behaviour.

    


    I provided a reproducible example which can be downloaded from here. All the important parts are in the App.java class. In addition, an example of the code is posted below. The example tries to decode a key frame; the keyframe data, with SPS and PPS, is read from a file in the resource folder of the project.

    


    To run the project, just perform a .\gradlew build and afterwards a .\gradlew run. If you run the example, the last log message shown in the terminal should be "SUCESS with HW decoding". The hardware decoder can be changed via the HW_DEVICE_TYPE variable in the App.java class. To disable hw acceleration, just set USE_HW_ACCEL to false.

    


    Everything seems correct to me and I have no idea what could be wrong with the code. I looked around a lot on the internet to find the root cause of the issue; I did not really find a solution, but I did find other sources that seem to relate to the same problem:

    


    https://www.mail-archive.com/libav-user@ffmpeg.org/...

    


    https://stackoverflow.com/questions/67307397/ffmpeg-...

    


    I also found another streaming app on Windows which can use D3D11VA and DXVA2 hardware acceleration, called Chiaki (it requires a PS4 or a PS5), which seems to have the exact same problem. I used the build provided here. It fails to decode certain key frames as well when hardware acceleration with D3D11VA or DXVA2 is selected (e.g. the first key frame received by the stream). Chiaki can still output the seemingly faulty frame, which is also possible with my example by setting USE_AV_EF_EXPLODE to false.

    


    Are there any FFmpeg gurus around who can check what the problem with D3D11VA or DXVA2 is ? Is there anything else that needs to be done to make the D3D11VA and DXVA2 hardware decoders work ? I'm now completely out of ideas, and I'm not even sure this is fixable.

    


    I have Windows 11 installed on my test machine, with the latest Nvidia drivers.

    


    Edit : here is a shrunk complete example of my project (the keyframe file, which includes SPS and PPS, can be downloaded from here; it's a hex string file and can be decoded with the provided HexUtil class)

    


import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.layout.Pane;
import javafx.stage.Stage;
import org.bytedeco.ffmpeg.avcodec.AVCodec;
import org.bytedeco.ffmpeg.avcodec.AVCodecContext;
import org.bytedeco.ffmpeg.avcodec.AVCodecHWConfig;
import org.bytedeco.ffmpeg.avcodec.AVPacket;
import org.bytedeco.ffmpeg.avutil.AVBufferRef;
import org.bytedeco.ffmpeg.avutil.AVDictionary;
import org.bytedeco.ffmpeg.avutil.AVFrame;
import org.bytedeco.javacpp.BytePointer;
import org.bytedeco.javacpp.IntPointer;
import org.bytedeco.javacv.FFmpegLogCallback;
import org.tinylog.Logger;

import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Objects;
import java.util.function.Consumer;

import static org.bytedeco.ffmpeg.avcodec.AVCodecContext.AV_EF_EXPLODE;
import static org.bytedeco.ffmpeg.avcodec.AVCodecContext.FF_THREAD_SLICE;
import static org.bytedeco.ffmpeg.global.avcodec.*;
import static org.bytedeco.ffmpeg.global.avutil.*;

public class App extends Application {

    /**** decoder variables ****/

    private AVHWContextInfo hardwareContext;

    private AVCodec decoder;
    private AVCodecContext m_VideoDecoderCtx;

    private AVCodecContext.Get_format_AVCodecContext_IntPointer formatCallback;
    private final int streamResolutionX = 1920;
    private final int streamResolutionY = 1080;

    // AV_HWDEVICE_TYPE_CUDA // example works with cuda
    // AV_HWDEVICE_TYPE_DXVA2 // producing Invalid data found on keyframe
    // AV_HWDEVICE_TYPE_D3D11VA // producing Invalid data found on keyframe
    private static final int HW_DEVICE_TYPE = AV_HWDEVICE_TYPE_DXVA2;

    private static final boolean USE_HW_ACCEL = true;

    private static final boolean USE_AV_EF_EXPLODE = true;

    public static void main(final String[] args) {
        //System.setProperty("prism.order", "d3d,sw");
        System.setProperty("prism.vsync", "false");
        Application.launch(App.class);
    }

    @Override
    public void start(final Stage primaryStage) {
        final Pane dummyPane = new Pane();
        dummyPane.setStyle("-fx-background-color: black");
        final Scene scene = new Scene(dummyPane, this.streamResolutionX, this.streamResolutionY);
        primaryStage.setScene(scene);
        primaryStage.show();
        primaryStage.setMinWidth(480);
        primaryStage.setMinHeight(360);

        this.initializeFFmpeg(result -> {
            if (!result) {
                Logger.error("FFmpeg could not be initialized correctly, terminating program");
                System.exit(1);
                return;
            }
            this.performTestFramesFeeding();
        });
    }

    private void initializeFFmpeg(final Consumer<Boolean> finishHandler) {
        FFmpegLogCallback.setLevel(AV_LOG_DEBUG); // Increase log level until the first frame is decoded
        //FFmpegLogCallback.setLevel(AV_LOG_QUIET);
        this.decoder = avcodec_find_decoder(AV_CODEC_ID_H264); // usually decoder name is h264 and without hardware support it's yuv420p otherwise nv12

        if (this.decoder == null) {
            Logger.error("Unable to find decoder for format {}", "h264");
            finishHandler.accept(false);
            return;
        }
        Logger.info("Current decoder name: {}, {}", this.decoder.name().getString(), this.decoder.long_name().getString());

        if (true) {
            for (; ; ) {
                this.m_VideoDecoderCtx = avcodec_alloc_context3(this.decoder);
                if (this.m_VideoDecoderCtx == null) {
                    Logger.error("Unable to find decoder for format AV_CODEC_ID_H264");
                    if (this.hardwareContext != null) {
                        this.hardwareContext.free();
                        this.hardwareContext = null;
                    }
                    continue;
                }

                if (App.USE_HW_ACCEL) {
                    this.hardwareContext = this.createHardwareContext();
                    if (this.hardwareContext != null) {
                        Logger.info("Set hwaccel support");
                        this.m_VideoDecoderCtx.hw_device_ctx(this.hardwareContext.hwContext()); // comment to disable hwaccel
                    }
                } else {
                    Logger.info("Hwaccel manually disabled");
                }

                // Always request low delay decoding
                this.m_VideoDecoderCtx.flags(this.m_VideoDecoderCtx.flags() | AV_CODEC_FLAG_LOW_DELAY);

                // Allow display of corrupt frames and frames missing references
                this.m_VideoDecoderCtx.flags(this.m_VideoDecoderCtx.flags() | AV_CODEC_FLAG_OUTPUT_CORRUPT);
                this.m_VideoDecoderCtx.flags2(this.m_VideoDecoderCtx.flags2() | AV_CODEC_FLAG2_SHOW_ALL);

                if (App.USE_AV_EF_EXPLODE) {
                    // Report decoding errors to allow us to request a key frame
                    this.m_VideoDecoderCtx.err_recognition(this.m_VideoDecoderCtx.err_recognition() | AV_EF_EXPLODE);
                }

                // Enable slice multi-threading for software decoding
                if (this.m_VideoDecoderCtx.hw_device_ctx() == null) { // if not hw accelerated
                    this.m_VideoDecoderCtx.thread_type(this.m_VideoDecoderCtx.thread_type() | FF_THREAD_SLICE);
                    this.m_VideoDecoderCtx.thread_count(2/*AppUtil.getCpuCount()*/);
                } else {
                    // No threading for HW decode
                    this.m_VideoDecoderCtx.thread_count(1);
                }

                this.m_VideoDecoderCtx.width(this.streamResolutionX);
                this.m_VideoDecoderCtx.height(this.streamResolutionY);
                this.m_VideoDecoderCtx.pix_fmt(this.getDefaultPixelFormat());

                this.formatCallback = new AVCodecContext.Get_format_AVCodecContext_IntPointer() {
                    @Override
                    public int call(final AVCodecContext context, final IntPointer pixelFormats) {
                        final boolean hwDecodingSupported = context.hw_device_ctx() != null && App.this.hardwareContext != null;
                        final int preferredPixelFormat = hwDecodingSupported ?
                                App.this.hardwareContext.hwConfig().pix_fmt() :
                                context.pix_fmt();
                        int i = 0;
                        while (true) {
                            final int currentSupportedFormat = pixelFormats.get(i++);
                            System.out.println("Supported pixel formats " + currentSupportedFormat);
                            if (currentSupportedFormat == preferredPixelFormat) {
                                Logger.info("[FFmpeg]: pixel format in format callback is {}", currentSupportedFormat);
                                return currentSupportedFormat;
                            }
                            if (currentSupportedFormat == AV_PIX_FMT_NONE) {
                                break;
                            }
                        }

                        i = 0;
                        while (true) { // try again and search for yuv
                            final int currentSupportedFormat = pixelFormats.get(i++);
                            if (currentSupportedFormat == AV_PIX_FMT_YUV420P) {
                                Logger.info("[FFmpeg]: Not found in first match so use {}", AV_PIX_FMT_YUV420P);
                                return currentSupportedFormat;
                            }
                            if (currentSupportedFormat == AV_PIX_FMT_NONE) {
                                break;
                            }
                        }

                        i = 0;
                        while (true) { // try again and search for nv12
                            final int currentSupportedFormat = pixelFormats.get(i++);
                            if (currentSupportedFormat == AV_PIX_FMT_NV12) {
                                Logger.info("[FFmpeg]: Not found in second match so use {}", AV_PIX_FMT_NV12);
                                return currentSupportedFormat;
                            }
                            if (currentSupportedFormat == AV_PIX_FMT_NONE) {
                                break;
                            }
                        }

                        Logger.info("[FFmpeg]: pixel format in format callback is using fallback {}", AV_PIX_FMT_NONE);
                        return AV_PIX_FMT_NONE;
                    }
                };
                this.m_VideoDecoderCtx.get_format(this.formatCallback);

                final AVDictionary options = new AVDictionary(null);
                final int result = avcodec_open2(this.m_VideoDecoderCtx, this.decoder, options);
                if (result < 0) {
                    Logger.error("avcodec_open2 was not successful");
                    finishHandler.accept(false);
                    return;
                }
                av_dict_free(options);
                break;
            }
        }

        if (this.decoder == null || this.m_VideoDecoderCtx == null) {
            finishHandler.accept(false);
            return;
        }
        finishHandler.accept(true);
    }

    private AVHWContextInfo createHardwareContext() {
        AVHWContextInfo result = null;
        for (int i = 0; ; i++) {
            final AVCodecHWConfig config = avcodec_get_hw_config(this.decoder, i);
            if (config == null) {
                break;
            }

            if ((config.methods() & AV_CODEC_HW_CONFIG_METHOD_HW_DEVICE_CTX) < 0) {
                continue;
            }
            final int device_type = config.device_type();
            if (device_type != App.HW_DEVICE_TYPE) {
                continue;
            }
            final AVBufferRef hw_context = av_hwdevice_ctx_alloc(device_type);
            if (hw_context == null || av_hwdevice_ctx_create(hw_context, device_type, (String) null, null, 0) < 0) {
                Logger.error("HW accel not supported for type {}", device_type);
                av_free(config);
                av_free(hw_context);
            } else {
                Logger.info("HW accel created for type {}", device_type);
                result = new AVHWContextInfo(config, hw_context);
            }
            break;
        }

        return result;
    }

    @Override
    public void stop() {
        this.releaseNativeResources();
    }

    /************************/
    /*** video processing ***/
    /************************/

    private void performTestFramesFeeding() {
        final AVPacket pkt = av_packet_alloc();
        if (pkt == null) {
            return;
        }
        try (final BytePointer bp = new BytePointer(65_535 * 4)) {
            final byte[] frameData = 
AVTestFrames.h264KeyTestFrame;&#xA;&#xA;&#xA;            bp.position(0);&#xA;&#xA;            bp.put(frameData);&#xA;            bp.limit(frameData.length);&#xA;&#xA;            pkt.data(bp);&#xA;            pkt.capacity(bp.capacity());&#xA;            pkt.size(frameData.length);&#xA;            pkt.position(0);&#xA;            pkt.limit(frameData.length);&#xA;            final AVFrame avFrame = av_frame_alloc();&#xA;&#xA;            final int err = avcodec_send_packet(this.m_VideoDecoderCtx, pkt); // this will fail with D3D11VA and DXVA2&#xA;            if (err &lt; 0) {&#xA;                final BytePointer buffer = new BytePointer(512);&#xA;                av_strerror(err, buffer, buffer.capacity());&#xA;                final String string = buffer.getString();&#xA;                System.out.println("Error on decoding test frame " &#x2B; err &#x2B; " message " &#x2B; string);&#xA;                av_frame_free(avFrame);&#xA;                return;&#xA;            }&#xA;&#xA;            final int result = avcodec_receive_frame(this.m_VideoDecoderCtx, avFrame);&#xA;            final AVFrame decodedFrame;&#xA;            if (result == 0) {&#xA;                if (this.m_VideoDecoderCtx.hw_device_ctx() == null) {&#xA;                    decodedFrame = avFrame;&#xA;                    av_frame_unref(decodedFrame);&#xA;                    System.out.println("SUCESS with SW decoding");&#xA;                } else {&#xA;                    final AVFrame hwAvFrame = av_frame_alloc();&#xA;                    if (av_hwframe_transfer_data(hwAvFrame, avFrame, 0) &lt; 0) {&#xA;                        System.out.println("Failed to transfer frame from hardware");&#xA;                        av_frame_unref(hwAvFrame);&#xA;                        decodedFrame = avFrame;&#xA;                    } else {&#xA;                        av_frame_unref(avFrame);&#xA;                        decodedFrame = hwAvFrame;&#xA;                        System.out.println("SUCESS with HW 
decoding");&#xA;                    }&#xA;                    av_frame_unref(decodedFrame);&#xA;                }&#xA;            } else {&#xA;                final BytePointer buffer = new BytePointer(512);&#xA;                av_strerror(result, buffer, buffer.capacity());&#xA;                final String string = buffer.getString();&#xA;                System.out.println("error " &#x2B; result &#x2B; " message " &#x2B; string);&#xA;                av_frame_free(avFrame);&#xA;            }&#xA;        } finally {&#xA;            if (pkt.stream_index() != -1) {&#xA;                av_packet_unref(pkt);&#xA;            }&#xA;            pkt.releaseReference();&#xA;        }&#xA;    }&#xA;&#xA;    final Object releaseLock = new Object();&#xA;    private volatile boolean released = false;&#xA;&#xA;    private void releaseNativeResources() {&#xA;        if (this.released) {&#xA;            return;&#xA;        }&#xA;        this.released = true;&#xA;        synchronized (this.releaseLock) {&#xA;            // Close the video codec&#xA;            if (this.m_VideoDecoderCtx != null) {&#xA;                avcodec_free_context(this.m_VideoDecoderCtx);&#xA;                this.m_VideoDecoderCtx = null;&#xA;            }&#xA;&#xA;            // close the format callback&#xA;            if (this.formatCallback != null) {&#xA;                this.formatCallback.close();&#xA;                this.formatCallback = null;&#xA;            }&#xA;&#xA;            // close hw context&#xA;            if (this.hardwareContext != null) {&#xA;                this.hardwareContext.free();&#xA;            }&#xA;        }&#xA;    }&#xA;&#xA;    private int getDefaultPixelFormat() {&#xA;        return AV_PIX_FMT_YUV420P; // Always return yuv420p here&#xA;    }&#xA;&#xA;    public static final class HexUtil {&#xA;        private static final char[] hexArray = "0123456789ABCDEF".toCharArray();&#xA;&#xA;        private HexUtil() {&#xA;        }&#xA;&#xA;        public static String hexlify(final 
byte[] bytes) {&#xA;            final char[] hexChars = new char[bytes.length * 2];&#xA;&#xA;            for (int j = 0; j &lt; bytes.length; &#x2B;&#x2B;j) {&#xA;                final int v = bytes[j] &amp; 255;&#xA;                hexChars[j * 2] = HexUtil.hexArray[v >>> 4];&#xA;                hexChars[j * 2 &#x2B; 1] = HexUtil.hexArray[v &amp; 15];&#xA;            }&#xA;&#xA;            return new String(hexChars);&#xA;        }&#xA;&#xA;        public static byte[] unhexlify(final String argbuf) {&#xA;            final int arglen = argbuf.length();&#xA;            if (arglen % 2 != 0) {&#xA;                throw new RuntimeException("Odd-length string");&#xA;            } else {&#xA;                final byte[] retbuf = new byte[arglen / 2];&#xA;&#xA;                for (int i = 0; i &lt; arglen; i &#x2B;= 2) {&#xA;                    final int top = Character.digit(argbuf.charAt(i), 16);&#xA;                    final int bot = Character.digit(argbuf.charAt(i &#x2B; 1), 16);&#xA;                    if (top == -1 || bot == -1) {&#xA;                        throw new RuntimeException("Non-hexadecimal digit found");&#xA;                    }&#xA;&#xA;                    retbuf[i / 2] = (byte) ((top &lt;&lt; 4) &#x2B; bot);&#xA;                }&#xA;&#xA;                return retbuf;&#xA;            }&#xA;        }&#xA;    }&#xA;&#xA;    public static final class AVHWContextInfo {&#xA;        private final AVCodecHWConfig hwConfig;&#xA;        private final AVBufferRef hwContext;&#xA;&#xA;        private volatile boolean freed = false;&#xA;&#xA;        public AVHWContextInfo(final AVCodecHWConfig hwConfig, final AVBufferRef hwContext) {&#xA;            this.hwConfig = hwConfig;&#xA;            this.hwContext = hwContext;&#xA;        }&#xA;&#xA;        public AVCodecHWConfig hwConfig() {&#xA;            return this.hwConfig;&#xA;        }&#xA;&#xA;        public AVBufferRef hwContext() {&#xA;            return this.hwContext;&#xA;        }&#xA;&#xA;        public 
void free() {&#xA;            if (this.freed) {&#xA;                return;&#xA;            }&#xA;            this.freed = true;&#xA;            av_free(this.hwConfig);&#xA;            av_free(this.hwContext);&#xA;        }&#xA;&#xA;&#xA;        @Override&#xA;        public boolean equals(Object o) {&#xA;            if (this == o) return true;&#xA;            if (o == null || getClass() != o.getClass()) return false;&#xA;            AVHWContextInfo that = (AVHWContextInfo) o;&#xA;            return freed == that.freed &amp;&amp; Objects.equals(hwConfig, that.hwConfig) &amp;&amp; Objects.equals(hwContext, that.hwContext);&#xA;        }&#xA;&#xA;        @Override&#xA;        public int hashCode() {&#xA;            return Objects.hash(hwConfig, hwContext, freed);&#xA;        }&#xA;&#xA;        @Override&#xA;        public String toString() {&#xA;            return "AVHWContextInfo[" &#x2B;&#xA;                    "hwConfig=" &#x2B; this.hwConfig &#x2B; ", " &#x2B;&#xA;                    "hwContext=" &#x2B; this.hwContext &#x2B; &#x27;]&#x27;;&#xA;        }&#xA;&#xA;    }&#xA;&#xA;    public static final class AVTestFrames {&#xA;&#xA;        private AVTestFrames() {&#xA;&#xA;        }&#xA;&#xA;        static {&#xA;            InputStream inputStream = null;&#xA;            try {&#xA;                inputStream = AVTestFrames.class.getClassLoader().getResourceAsStream("h264_test_key_frame.txt");&#xA;                final byte[] h264TestFrameBuffer = inputStream == null ? 
new byte[0] : inputStream.readAllBytes();&#xA;                final String h264TestFrame = new String(h264TestFrameBuffer, StandardCharsets.UTF_8);&#xA;                AVTestFrames.h264KeyTestFrame = HexUtil.unhexlify(h264TestFrame);&#xA;            } catch (final IOException e) {&#xA;                Logger.error(e, "Could not parse test frame");&#xA;            } finally {&#xA;                if (inputStream != null) {&#xA;                    try {&#xA;                        inputStream.close();&#xA;                        inputStream = null;&#xA;                    } catch (final IOException e) {&#xA;                        Logger.error(e, "Could not close test frame input stream");&#xA;                    }&#xA;                }&#xA;            }&#xA;        }&#xA;&#xA;        public static byte[] h264KeyTestFrame;&#xA;    }&#xA;}&#xA;</boolean>


    The build.gradle of the project looks like this:


    plugins {
        id 'application'
        id 'org.openjfx.javafxplugin' version '0.0.13'
    }

    group 'com.test.example'
    version '1.0.0'

    repositories {
        mavenCentral()
        mavenLocal()
        maven { url 'https://jitpack.io' }
    }

    dependencies {
        implementation group: 'org.bytedeco', name: 'javacv-platform', version: '1.5.8'
        implementation group: 'com.github.oshi', name: 'oshi-core', version: '3.4.3'
        implementation 'org.tinylog:tinylog-api:2.1.0'
        implementation 'org.tinylog:tinylog-impl:2.1.0'
        implementation 'org.jcodec:jcodec:0.2.5'
    }

    test {
        useJUnitPlatform()
    }

    javafx {
        version = '17.0.6'
        modules = ['javafx.graphics', 'javafx.controls', 'javafx.fxml', 'javafx.base']
    }

    mainClassName = 'com.test.example.App'
