
Media (2)
-
SPIP - plugins - embed code - Example
2 September 2013
Updated: September 2013
Language: French
Type: Image
-
Publishing an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (105)
-
Customising by adding your logo, banner or background image
5 September 2013. Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Publishing on MediaSPIP
13 June 2013. Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or higher. If necessary, contact the administrator of your MediaSPIP to find out.
-
List of compatible distributions
26 April 2011. The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.
Distribution name | Version name | Version number
Debian | Squeeze | 6.x.x
Debian | Wheezy | 7.x.x
Debian | Jessie | 8.x.x
Ubuntu | The Precise Pangolin | 12.04 LTS
Ubuntu | The Trusty Tahr | 14.04
If you want to help us improve this list, you can provide us access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)
On other sites (9347)
-
vsync flag usage in ffmpeg while filtering
13 October 2022, by antortjim

I am trying to apply a threshold to an input video with ffmpeg, but I observe the following warning emitted for every processed frame:


[mp4 @ 0x56360181a200] Non-monotonous DTS in output stream 0:0; previous: 182272, current: 182272; changing to 182273. This may result in incorrect timestamps in the output file.



where the previous and current values are always 1 less than the value to which the DTS (Decoding Time Stamp) is changed.


I have noticed this warning is emitted only if I set -vsync passthrough (which I changed from the original -vsync 0 seen in many online examples).

# input.mp4 has resolution 790x790
ffmpeg -vsync passthrough -i input.mp4 -f lavfi -i color=808080:s=790x790 -f lavfi -i color=black:s=790x790 -f lavfi -i color=white:s=790x790 -filter_complex '[0:v][1:v][2:v][3:v]threshold' -an -c:v h264_nvenc threshold.mp4



Should I leave the vsync flag set to the default (auto or -1), or is -vsync passthrough essential to guarantee that the frames are displayed in the right order? In that case, how do I handle this warning? Other online examples I found of users experiencing this warning differ from mine, because in their case they are concatenating videos (see 1, 2).

From the documentation on the -vsync flag, at the end, I see:



With -map you can select from which stream the timestamps should be taken. You can leave either video or audio unchanged and sync the remaining stream(s) to the unchanged one




Maybe this warning should be handled with -map? But I don't know how.
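A possible direction, offered only as a sketch that I have not verified on this build: give the filtergraph output an explicit label and map it with -map, leaving the sync mode at its default so ffmpeg regenerates the output timestamps instead of passing the input ones through.

# sketch: map a labelled filtergraph output explicitly, default sync mode
ffmpeg -i input.mp4 -f lavfi -i color=808080:s=790x790 -f lavfi -i color=black:s=790x790 -f lavfi -i color=white:s=790x790 -filter_complex '[0:v][1:v][2:v][3:v]threshold[out]' -map '[out]' -an -c:v h264_nvenc threshold.mp4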

Sidenote: I keep getting the deprecation warning telling me to change -vsync to -fps_mode; however, doing so breaks the command.
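It may just be a placement issue on my side. As an assumption I have not confirmed: -fps_mode is a per-stream output option, while -vsync was accepted before the input, so a sketch of the command with the flag moved next to the output options would be:

# sketch: -fps_mode placed with the output options instead of before -i
ffmpeg -i input.mp4 -f lavfi -i color=808080:s=790x790 -f lavfi -i color=black:s=790x790 -f lavfi -i color=white:s=790x790 -filter_complex '[0:v][1:v][2:v][3:v]threshold' -an -c:v h264_nvenc -fps_mode passthrough threshold.mp4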

FFmpeg version:


commit 28ac2279adb860ea8b90d3073603912bf3eb6a83 from the ffmpeg master branch

ffmpeg version N-108625-g28ac2279ad Copyright (c) 2000-2022 the FFmpeg developers
built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
configuration: --enable-nonfree --enable-cuda-nvcc --enable-libnpp --enable-gpl --extra-cflags=-I/usr/local/cuda/include --extra-ldflags=-L/usr/local/cuda/lib64 --disable-static --enable-shared
libavutil 57. 39.101 / 57. 39.101
libavcodec 59. 50.100 / 59. 50.100
libavformat 59. 34.101 / 59. 34.101
libavdevice 59. 8.101 / 59. 8.101
libavfilter 8. 49.101 / 8. 49.101
libswscale 6. 8.112 / 6. 8.112
libswresample 4. 9.100 / 4. 9.100
libpostproc 56. 7.100 / 56. 7.100



OS


Ubuntu 20.04.4 LTS


-
How exactly do I extract and merge an audio file using @ffmpeg/ffmpeg
5 October 2022, by jeff.gogooma

I am having a lot of issues with transferring audio between the server and the frontend.


I am building a solution that extracts and merges an audio file received from the frontend.


I am using a local ffmpeg binary with the Node.js child_process package.


My environment:

framework: nest.js
runtime: node.js
lang: typescript


Here is console.log(file: Express.Multer.File).
I am storing these files in an S3 bucket, so I am not using multer's memory or disk storage.


// console.log(file);
 {
 fieldname: 'file',
 originalname: 'input1.mp3',
 encoding: '7bit',
 mimetype: 'audio/mpeg'
}



import cp from "child_process";

type Section = { from: number, to: number };

const files: Array = [];

export const setupAudioFileSection = async (file: Express.Multer.File, section: Section) => {
  try {
    const file1 = await cp.exec(`ffmpeg -i ${file} -y -ss 0 -t ${section.from}`);
    const file2 = await cp.exec(`ffmpeg -i ${file} -y -ss ${section.from} -t ${section.to}`);
    const file3 = await cp.exec(`ffmpeg -i ${file} -y -ss ${section.to}`);
    files.push(file1);
    files.push(file3);

    return file2;
  } catch (error) {
    console.log(error);
    throw new Error('setupAudioFileSection error');
  }
};



But my local ffmpeg prints the error below because of [object Object]. I think ffmpeg does not recognise the audio file object received from the client. I could resolve this by writing the file to disk or memory first and passing its path, but I am not sure whether that solution is right.


error Error: Command failed: ffmpeg -i [object Object] -y -ss 10 -t 20 -f mp3 ./TestOutput
ffmpeg version 5.1.1 Copyright (c) 2000-2022 the FFmpeg developers
 built with Apple clang version 13.1.6 (clang-1316.0.21.2.5)
 configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/5.1.1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-neon
 libavutil 57. 28.100 / 57. 28.100
 libavcodec 59. 37.100 / 59. 37.100
 libavformat 59. 27.100 / 59. 27.100
 libavdevice 59. 7.100 / 59. 7.100
 libavfilter 8. 44.100 / 8. 44.100
 libswscale 6. 7.100 / 6. 7.100
 libswresample 4. 7.100 / 4. 7.100
 libpostproc 56. 6.100 / 56. 6.100

[object: No such file or directory




How can I write the file on my server and read it back? If I can do that, maybe I can resolve the [object Object] ffmpeg issue. Please give me some advice on this issue. Thank you!
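For reference, a minimal sketch of the disk-based workaround I have in mind, assuming the raw bytes are available as a Buffer (for example via multer's memory storage, or after downloading the object from S3); the temp-file names and the cutSection helper are only illustrative:

import { execFile } from "child_process";
import { promisify } from "util";
import { promises as fs } from "fs";
import { tmpdir } from "os";
import { join } from "path";

const execFileAsync = promisify(execFile);

// Write the upload to a temp file, run ffmpeg on the *path* (not the object),
// then read the extracted section back into memory.
export const cutSection = async (buffer: Buffer, from: number, to: number): Promise<Buffer> => {
  const inputPath = join(tmpdir(), `input-${Date.now()}.mp3`);    // illustrative name
  const outputPath = join(tmpdir(), `section-${Date.now()}.mp3`); // illustrative name

  await fs.writeFile(inputPath, buffer);
  await execFileAsync("ffmpeg", [
    "-y",
    "-i", inputPath,
    "-ss", String(from),
    "-t", String(to - from), // duration of the section, not its end time
    outputPath,
  ]);
  return fs.readFile(outputPath);
};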


-
How to correctly use h264_amf on Windows?
27 October 2022, by Harshiv

I'm trying to use hardware encoding with FFmpeg on Windows 10 with an AMD Radeon R5 M330 dGPU in an HP-AC026-TX laptop. The FFmpeg version is 5.1.2-full_build-www.gyan.dev from gyan.dev. I'm using this:

ffmpeg ^
 -hide_banner ^
 -v verbose ^
 -i "%1" ^
 -c:v h264_amf -acodec copy -y out.mp4



but it fails with:


[h264_amf @ 000001fb84d80b80] AMF initialisation succeeded via D3D11.
[h264_amf @ 000001fb847ab3c0] CreateComponent(AMFVideoEncoderVCE_AVC) failed with error 36
Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height



The arguments seem as generic as I could hope, and FFmpeg should use sensible defaults, so I can't understand why it is failing.
The entire FFmpeg output is:


[h264 @ 000001fb84706c40] Reinit context to 1920x1088, pix_fmt: yuv420p
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '1366_pngl_20220816_161558.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: isommp42
 creation_time : 2022-08-16T10:51:42.000000Z
 com.android.version: 11
 com.android.capture.fps: 30.000000
 Duration: 00:03:41.33, start: 0.000000, bitrate: 7517 kb/s
 Stream #0:0[0x1](eng): Video: h264 (High), 1 reference frame (avc1 / 0x31637661), yuv420p(tv, bt709, progressive, left), 1920x1080 (1920x1088), 7259 kb/s, SAR 1:1 DAR 16:9, 29.96 fps, 29.99 tbr, 90k tbn (default)
 Metadata:
 creation_time : 2022-08-16T10:51:42.000000Z
 handler_name : VideoHandle
 vendor_id : [0][0][0][0]
 Stream #0:1[0x2](eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 256 kb/s (default)
 Metadata:
 creation_time : 2022-08-16T10:51:42.000000Z
 handler_name : SoundHandle
 vendor_id : [0][0][0][0]
Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> h264 (h264_amf))
 Stream #0:1 -> #0:1 (copy)
Press [q] to stop, [?] for help
[h264 @ 000001fb84787400] Reinit context to 1920x1088, pix_fmt: yuv420p
[graph 0 input from stream 0:0 @ 000001fb84cfd780] w:1920 h:1080 pixfmt:yuv420p tb:1/90000 fr:90000/3001 sar:1/1
[h264_amf @ 000001fb84d80b80] AMF initialisation succeeded via D3D11.
[h264_amf @ 000001fb847ab3c0] CreateComponent(AMFVideoEncoderVCE_AVC) failed with error 36
Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
[AVIOContext @ 000001fb846f89c0] Statistics: 0 bytes written, 0 seeks, 0 writeouts
[AVIOContext @ 000001fb846f3f40] Statistics: 371902 bytes read, 3 seeks
Conversion failed!



What should I do to make hardware encoding work?
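One check I could run to narrow this down, sketched here on the assumption that a synthetic test input is acceptable: encode a generated source with the same encoder, so that the input file's parameters are ruled out and only the driver/device initialisation is exercised.

rem sketch: minimal h264_amf smoke test on a generated 720p source
ffmpeg -hide_banner -f lavfi -i testsrc2=size=1280x720:rate=30 -t 5 -c:v h264_amf test_amf.mp4

If even this fails with the same CreateComponent error 36, the cause is presumably the AMD driver or the device FFmpeg selects rather than anything about the original file.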