
Other articles (113)
-
Customizing by adding your logo, banner, or background image
5 September 2013 — Some themes take three customization elements into account: adding a logo; adding a banner; adding a background image.
-
The farm’s recurring Cron tasks
1 December 2010 — Managing the farm relies on running, at regular intervals, several repetitive tasks known as Cron tasks.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance in the shared-hosting farm on a regular basis. Coupled with a system Cron on the farm’s central site, this makes it easy to generate regular visits to the various sites and keep the tasks of rarely visited sites from being too (...)
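Reading between the lines, the system-Cron half of that coupling could be as simple as a crontab entry on the central host that visits the site once a minute, since the excerpt says visits are what trigger each instance’s Cron (the hostname below is purely illustrative, not from the article):

* * * * * wget -q -O /dev/null http://farm-central.example/

-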
Publishing on MédiaSpip
13 June 2013 — Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or higher. If in doubt, contact your MédiaSpip administrator to find out.
On other sites (32960)
-
Encode and stream from Xbox 360 Kinect using ffmpeg
17 June 2015, by user3288346 — I want to live-stream content obtained from the Kinect onto my internal network.
I have one physical machine, which is my server, running Ubuntu 14.04 Server; I connect to it remotely. I have installed ffmpeg and ffserver and can encode and stream stored video files from the server. However, I have a few problems when using the Xbox Kinect.
I have an Xbox 360 Kinect attached via USB. I followed https://bitbucket.org/samirmenon/scl-manips-v2/wiki/vision/kinect, but I couldn’t get past the OpenCV part. When I run
$ cmake-gui ..
I get
cmake-gui: cannot connect to X server
I don’t have physical access to the machine, so the error is probably due to accessing it remotely, with no X display available.
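A quick aside (a workaround I’d suggest, not part of the original question): cmake-gui needs an X display, but configuring headlessly over SSH works with the plain command-line tools, assuming the usual out-of-source build directory:

cd build
cmake ..      # non-interactive configure, no X server needed
ccmake ..     # optional: curses-based interactive configure in the terminal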
When I do
test@cloud-node-2:~/kinnect$ lsusb
Bus 002 Device 006: ID 045e:02ae Microsoft Corp. Xbox NUI Camera
Bus 002 Device 004: ID 045e:02b0 Microsoft Corp. Xbox NUI Motor
Bus 002 Device 005: ID 045e:02ad Microsoft Corp. Xbox NUI Audio
Bus 002 Device 003: ID 0409:005a NEC Corp. HighSpeed Hub
Bus 002 Device 002: ID 0bda:0181 Realtek Semiconductor Corp.
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 008 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
When I do
test@cloud-node-2:~/kinnect$ ls -ltrh /dev/video*
ls: cannot access /dev/video*: No such file or directory
Therefore, I am not able to capture the video using ffmpeg.
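One note that may explain the missing device nodes (my own reading, not from the post): the Kinect is not a UVC webcam, so it never shows up as /dev/video* on its own. On kernels that ship the gspca_kinect driver, loading it exposes the Kinect camera as a V4L2 device that ffmpeg can capture from (the ffserver feed URL below is illustrative):

sudo modprobe gspca_kinect    # kernel driver that maps the Kinect camera to V4L2
ls /dev/video*                # a /dev/video0 node should now be present
ffmpeg -f v4l2 -i /dev/video0 http://localhost:8090/feed1.ffm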
-
How do I create and initialise a DXGI_FORMAT_NV12 resource in DX12 (source is AVFrame)
5 January 2023, by mike — I’m trying to create an NV12 resource as the source for a video encoder in DX12. While I intend to eventually populate the resource from the GPU, what I’m trying to do now is take an ffmpeg AVFrame I already have (in AV_PIX_FMT_YUV420P format) and create a texture in DXGI_FORMAT_NV12 format using that data.

I understand the NV12 format (https://learn.microsoft.com/en-us/windows/win32/medfound/recommended-8-bit-yuv-formats-for-video-rendering#nv12) has U and V interleaved, while AV_PIX_FMT_YUV420P doesn’t.

My main question is: what does the D3D12_RESOURCE_DESC look like for an NV12 texture? Do I tell it I need more than one array/mip level to make it planar? Or do I just give it a single memory address with both planes laid out as per the NV12 format, and it figures out the subresources for me based on the format?

I understand that to read the data I define two SRVs, one for Y mapped to the Red channel and a second for U and V, but it's how I initialise it that's confusing me.
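For what it’s worth, my reading of the D3D12 documentation (a sketch, not verified against a running encoder): an NV12 texture is a single resource with one mip level and one array slice, and the planar split is expressed through plane slices, so Y and interleaved UV become subresources 0 and 1. The answer to the array/mip question would therefore be no on both counts:

// Sketch: describing an NV12 texture in D3D12 (width/height must be even).
D3D12_RESOURCE_DESC desc = {};
desc.Dimension        = D3D12_RESOURCE_DIMENSION_TEXTURE2D;
desc.Width            = width;
desc.Height           = height;
desc.DepthOrArraySize = 1;                  // one array slice,
desc.MipLevels        = 1;                  // one mip level: planes are neither
desc.Format           = DXGI_FORMAT_NV12;   // the format itself implies two planes
desc.SampleDesc.Count = 1;
desc.Layout           = D3D12_TEXTURE_LAYOUT_UNKNOWN;

// Plane 0 (Y) and plane 1 (UV) are separate subresources; the device reports
// how each must be laid out in an upload buffer:
D3D12_PLACED_SUBRESOURCE_FOOTPRINT fp[2];
UINT   rows[2];
UINT64 rowBytes[2], totalBytes = 0;
device->GetCopyableFootprints(&desc, 0, 2, 0, fp, rows, rowBytes, &totalBytes);
// Fill plane 0 from data[0] and plane 1 with the interleaved UV, then issue
// one CopyTextureRegion per subresource.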

-
Hardware Accelerated H264 Decode using DirectX11 in Unity Plugin for UWP
8 January 2019, by rohit n — I’ve built a Unity plugin for my UWP app which converts raw H264 packets to RGB data and renders them to a texture. I’ve used FFmpeg to do this and it works fine.
int framefinished = avcodec_send_packet(m_pCodecCtx, &packet);
framefinished = avcodec_receive_frame(m_pCodecCtx, m_pFrame);
// YUV to RGB conversion and render to texture after this

Now, I’m trying to switch to hardware-based decoding using DirectX11 and DXVA2.0.
Using this: https://docs.microsoft.com/en-us/windows/desktop/medfound/supporting-direct3d-11-video-decoding-in-media-foundation
I was able to create a decoder (ID3D11VideoDecoder), but I don’t know how to supply it with the raw H264 packets and get YUV or NV12 data as output.
(Or whether it’s possible to render the output directly to the texture, since I can get the ID3D11Texture2D pointer.) So my question is: how do you send the raw H264 packets to this decoder and get the output from it?
Also, this is for real-time operation, so I’m trying to achieve minimal latency.
Thanks in advance!
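One path worth sketching (my suggestion, untested here, not from the post): since the packets already flow through libavcodec, ffmpeg’s D3D11VA hwaccel can own the ID3D11VideoDecoder internally and hand back decoded NV12 frames as GPU textures, sidestepping the Media Foundation plumbing and avoiding system-memory round-trips:

// Assumes <d3d11.h> and libavutil/hwcontext.h are available, and that this
// runs before avcodec_open2() on the existing m_pCodecCtx.
AVBufferRef* hwDevice = nullptr;
if (av_hwdevice_ctx_create(&hwDevice, AV_HWDEVICE_TYPE_D3D11VA,
                           nullptr, nullptr, 0) == 0) {
    m_pCodecCtx->hw_device_ctx = av_buffer_ref(hwDevice);
}

// The decode loop is unchanged:
avcodec_send_packet(m_pCodecCtx, &packet);
if (avcodec_receive_frame(m_pCodecCtx, m_pFrame) == 0 &&
    m_pFrame->format == AV_PIX_FMT_D3D11) {
    // For AV_PIX_FMT_D3D11 frames, data[0] is an ID3D11Texture2D* (NV12)
    // and data[1] is the index into its texture array:
    ID3D11Texture2D* tex   = reinterpret_cast<ID3D11Texture2D*>(m_pFrame->data[0]);
    UINT             slice = static_cast<UINT>(reinterpret_cast<intptr_t>(m_pFrame->data[1]));
    // CopySubresourceRegion into the Unity-bound texture from here.
}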