
Media (91)
-
Spoon - Revenge !
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
My Morning Jacket - One Big Holiday
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Zap Mama - Wadidyusay ?
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
David Byrne - My Fair Lady
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Beastie Boys - Now Get Busy
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Granite de l’Aber Ildut
9 September 2011
Updated: September 2011
Language: French
Type: Text
Other articles (35)
-
Support for all media types
10 April 2011
Unlike many software packages and other modern document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other data (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)
-
Supporting all media types
13 April 2011
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
-
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded as Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
Audio files are encoded as Ogg (supported by HTML5) and MP3 (supported by Flash).
Where possible, text is analyzed to extract the data needed for search-engine indexing, and the document is then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
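As a purely illustrative sketch (not MediaSPIP's own code), conversions of the kind described above can be performed with ffmpeg, assuming a build that includes libtheora, libvpx, libx264, libvorbis and libmp3lame, and a hypothetical source file input.avi:
ffmpeg -i input.avi -c:v libtheora -c:a libvorbis output.ogv
ffmpeg -i input.avi -c:v libvpx -c:a libvorbis output.webm
ffmpeg -i input.avi -c:v libx264 -c:a aac output.mp4
ffmpeg -i input.avi -vn -c:a libmp3lame output.mp3
Each command transcodes the same source into one of the web-friendly formats mentioned in the excerpt, while the original file is kept alongside the converted copies.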
On other sites (4553)
-
Can I test the function of the Nvidia decoder (nvdec/cuvid) on a generated ffmpeg video?
31 January 2021, by Milan Čížek
Goal: in my script, I try to check whether nvdec on my graphics card is available/functional.
I intentionally don't have any source video (H.264 / H.265) to use as input at this time, so I want to generate one.
It is also not necessary to use an encoder, because I do not need the output file.
I'm testing the exit code of the ffmpeg command ($?).
I use nvidia-smi to check the dec/enc load.


My attempt:


ffmpeg -y -hwaccel cuda -hwaccel_output_format cuda -c:v h264_cuvid -f lavfi -i testsrc="duration=5:size=1920x1080:rate=25" -c:v copy test.ts



Output of my command:


Input #0, lavfi, from 'testsrc=duration=5:size=1920x1080:rate=25':
 Duration: N/A, bitrate: N/A
 Stream #0:0: Video: h264, rgb24, 1920x1080 [SAR 1:1 DAR 16:9], 25 tbn
Stream mapping:
 Stream #0:0 -> #0:0 (h264 (h264_cuvid) -> wrapped_avframe (native))
Press [q] to stop, [?] for help
No information about the input framerate is available. Falling back to a default value of 25fps for output stream #0:0. Use the -r option if you want a different framerate.
Output #0, null, to 'pipe:':
 Metadata:
 encoder : Lavf58.65.101
 Stream #0:0: Video: wrapped_avframe, rgb24, 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 200 kb/s, 25 fps, 25 tbn
 Metadata:
 encoder : Lavc58.119.100 wrapped_avframe
frame= 0 fps=0.0 q=0.0 Lsize=N/A time=00:00:00.00 bitrate=N/A speed= 0x
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)



I tried adding -t 5 before test.ts but nothing changed.
The output ts file has zero size.


Once I'm done debugging, I expect to append "-f null - 2>/dev/null" to the end of the command; the output file is only there for debugging purposes.


Thank you.
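For reference, a minimal two-step sketch of one way to do this (an assumption-laden example, not a verified answer: it assumes libx264 is available in the local ffmpeg build and uses the arbitrary scratch path /tmp/nvdec_test.mp4). Because testsrc emits raw frames rather than an H.264 bitstream, h264_cuvid has nothing to decode in the single-command attempt above; encoding the test pattern to a real H.264 file first gives the decoder something to work on:
ffmpeg -y -f lavfi -i testsrc=duration=5:size=1920x1080:rate=25 -c:v libx264 -pix_fmt yuv420p /tmp/nvdec_test.mp4
ffmpeg -y -c:v h264_cuvid -i /tmp/nvdec_test.mp4 -f null - 2>/dev/null
echo $?
The second command forces the cuvid decoder and discards the output, so the echoed exit code (0 on success) reflects whether the decoder actually ran.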


-
Using FFmpeg with Nvidia GPU acceleration
27 January 2021, by derekc23
I'm following the "Using FFmpeg with NVIDIA GPU Hardware Acceleration" guide from the NVIDIA Video Codec SDK documentation, on Windows 10.


All goes well until I hit the last three commands:


• Go to the nv-codec-headers directory and install ffnvcodec


make install PREFIX=/usr


• Go to the FFmpeg installation folder and run the following command.


./configure --enable-nonfree --disable-shared --enable-cuda-sdk --enable-libnpp --toolchain=msvc --extra-cflags=-I../nv_sdk --extra-ldflags=-libpath:../nv_sdk


• Compile the code by executing the following command.


make -j 8


I cannot get the 'make' command to work; I get "'make' is not recognized as an internal or external command, operable program or batch file."
I also tried the ./configure command, but it was rejected as well.


The documentation is: https://docs.nvidia.com/video-technologies/video-codec-sdk/ffmpeg-with-nvidia-gpu/


Can anyone please help with this?
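A hedged sketch of the usual prerequisite on Windows (an assumption, since the question doesn't say which shell is in use): configure and make are POSIX tools, so they are normally run from an MSYS2/MinGW shell rather than cmd.exe or PowerShell. Something like the following, where the package names are the common MSYS2 ones and the ffmpeg path is hypothetical:
# start an MSYS2 shell (for --toolchain=msvc, launch it from a Visual Studio developer prompt so cl.exe is on PATH)
pacman -S --needed make diffutils
cd /c/path/to/ffmpeg
./configure --enable-nonfree --disable-shared --enable-cuda-sdk --enable-libnpp --toolchain=msvc --extra-cflags=-I../nv_sdk --extra-ldflags=-libpath:../nv_sdk
make -j 8
The error quoted above ("not recognized as an internal or external command") is the cmd.exe message for a missing program, which is consistent with make simply not being installed in the shell being used.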


-
What do I have to do to decompress MP4 video with NVDEC directly to a texture buffer on an NVIDIA Jetson AGX Xavier computer?
13 September 2020, by Alexis Wilke
What I'm trying to do is decompress two MP4 videos (one per NVDEC) and then manipulate them with OpenGL.


Right now, though, this is too slow (definitely not real time), as I have to make copies of 4K images (3840x2160x3 in RGB) and that's just too much data to copy.


I wrote a small benchmark to verify that part. I can only copy between 240 and 250 such buffers per second with memcpy(). That's too slow when the input movies are 60 fps...

I'm able to use the NVDEC chip to decompress to a buffer through ffmpeg, but to place that in a texture, I then have to get the frame out of ffmpeg (copy 1) and then send that image to a texture (copy 2). Do that for two videos, that's 4 copies... 4K is huge ! So the CPUs don't have the time to do that much work 60 times per second.


So right now I'm looking for a way to send the output of the NVDEC directly to a texture. Looking at GStreamer (gst-launch-1.0), it takes about 3% CPU and can play back a 4K video in real time. What am I doing wrong?
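For comparison, a minimal gst-launch-1.0 sketch of the kind of zero-copy Jetson pipeline that low CPU figure suggests (assuming the JetPack/L4T GStreamer plugins and a hypothetical H.264 file video.mp4; element names can vary between JetPack releases):
gst-launch-1.0 filesrc location=video.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvegltransform ! nveglglessink
Here the decoded frames stay in NVMM device memory and are handed to EGL/OpenGL without a round trip through system RAM, which is the behaviour the question is after.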