
Media (1)
-
The Slip - Artworks
26 September 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (57)
-
Keeping control of your media in your hands
13 April 2011, by
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)
-
MediaSPIP v0.2
21 June 2013, by
MediaSPIP 0.2 is the first stable version of MediaSPIP.
Its official release date is 21 June 2013, and it is announced here.
The zip file provided here contains only the MediaSPIP sources in the standalone version.
As with the previous version, all software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)
-
Making files available
14 April 2011, by
By default, when first set up, MediaSPIP does not allow visitors to download files, whether they are originals or the result of their transformation or encoding. It only allows them to be viewed.
However, it is possible and easy to give visitors access to these documents in various forms.
All of this takes place on the skeleton's configuration page. You need to go to the channel's administration area and choose in the navigation (...)
On other sites (10035)
-
Use OpenH264 instead of libx264 in ffmpeg?
26 August 2020, by MING
I use the node-fluent-ffmpeg module to call ffmpeg.exe in my Node.js app.


How can I use OpenH264 when calling ffmpeg.exe?


Do I need to recompile ffmpeg?


But compiling ffmpeg looks complicated...


My actual code is roughly like this:


var FfmpegCommand = require('fluent-ffmpeg');
var FFMPEG_PATH = "C:/ffmpeg/bin/ffmpeg.exe"; // path to the ffmpeg binary
FfmpegCommand.setFfmpegPath(FFMPEG_PATH);

FfmpegCommand("dog.webm", {})
  .videoCodec('libx264')
  // extra command-line options, e.g.:
  // .addOutputOption([
  //   '-threads 8'
  // ])
  .on('start', function (commandLine) {
    // the actual ffmpeg command, e.g.:
    // ffmpeg -i C:\Users\ming\Pictures\WEBCAMCAPTURE\dog.webm -y -vcodec libx264 C:\Users\ming\Pictures\WEBCAMCAPTURE\dog_x264.mp4
    console.log('command: ' + commandLine);
  })
  .on('progress', function (progress) {
    console.log('Processing: ' + progress.percent + '% done');
  })
  .on('end', function () {
    console.log('Finished');
  })
  .save("dog_x264.mp4");




English isn’t my first language, so please excuse any mistakes.
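
A minimal sketch of what the switch could look like, assuming your ffmpeg.exe build actually includes the OpenH264 encoder (ffmpeg names it libopenh264; running ffmpeg -encoders will list it if it is there). The output file name and bitrate below are just placeholders:

var FfmpegCommand = require('fluent-ffmpeg');
FfmpegCommand.setFfmpegPath("C:/ffmpeg/bin/ffmpeg.exe"); // same binary as above

FfmpegCommand("dog.webm", {})
  // 'libopenh264' is ffmpeg's name for Cisco's OpenH264 encoder;
  // fluent-ffmpeg passes it straight through as -vcodec libopenh264
  .videoCodec('libopenh264')
  // OpenH264 has no CRF mode, so give it an explicit target bitrate (placeholder value)
  .addOutputOption(['-b:v 2M'])
  .on('start', function (commandLine) {
    console.log('command: ' + commandLine);
  })
  .on('error', function (err) {
    // this typically fires with "Unknown encoder 'libopenh264'"
    // when the build was made without OpenH264 support
    console.error('ffmpeg error: ' + err.message);
  })
  .on('end', function () {
    console.log('Finished');
  })
  .save("dog_openh264.mp4");

If the encoder is missing, the fix is on the ffmpeg side rather than in the Node.js code: use a build configured with --enable-libopenh264, or stay with libx264, which most prebuilt Windows binaries already ship.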


-
How can I render frames decoded by FFmpeg using hardware decoding with D3D11?
14 June 2024, by mercuric taylor
I have finished decoding a video frame with FFmpeg. The decoded frame's format is AV_PIX_FMT_NV12. Now I want to render this frame to the screen using D3D11. My questions are:


- What is the equivalent concept in D3D11 for a decoded frame? Is it a texture?
- I have seen many solutions that convert NV12 data to RGB, but it seems DX11 no longer requires this conversion.
- I just want to display this frame, and since it is already on the GPU, is there a more convenient way to render it directly on the GPU without copying?








Please forgive my not-so-good English. Can anyone provide a reference example?


I have already looked at this open-source project: https://github.com/balapradeepswork/D3D11NV12Rendering/tree/master/D3D11NV12Rendering
But I don't understand it very well.
Since I don't use DX11 to make games, only to display video, I am hoping for a simpler solution. This has been bothering me for weeks; can anyone give me some advice (even a good tutorial; the ones I can find are too old)? Thank you sincerely!