
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (37)
-
HTML5 audio and video support
13 April 2011. MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
From the upload to the final video [standalone version]
31 January 2010. The path of an audio or video document in SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First of all, you need to create a SPIP article and attach the "source" video document to it.
When this document is attached to the article, two actions in addition to the normal behaviour are executed: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
On other sites (7460)
-
How exactly do I extract and merge an audio file using @ffmpeg/ffmpeg
5 October 2022, by jeff.gogooma. I am having a lot of issues with audio transfer to the server and the frontend.


I am building a solution that extracts and merges a file received from the frontend.


I am using a local ffmpeg binary with the node.js child_process package.


My environment:

framework: nest.js
runtime: node.js
lang: typescript


Here is console.log(file: Express.Multer.File). I am storing these files in an S3 bucket, so I am not using multer's memory or disk storage.


// console.log(file);
 {
 fieldname: 'file',
 originalname: 'input1.mp3',
 encoding: '7bit',
 mimetype: 'audio/mpeg'
}
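
For reference, if the disk-write approach discussed below is attempted, the file object needs a populated buffer field, which multer only provides with memory storage. A minimal sketch (multer.memoryStorage is the real multer API; the route wiring is illustrative):

import multer from "multer";

// memoryStorage keeps the upload in RAM and populates file.buffer,
// which the temp-file approach sketched further down relies on.
const upload = multer({ storage: multer.memoryStorage() });

// e.g. applied to an Express/Nest route: upload.single("file")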



import cp from "child_process";

type Section = { from: number, to: number };

const files: Array<unknown> = [];

export const setupAudioFileSection = async (file: Express.Multer.File, section: Section) => {
  try {
    const file1 = await cp.exec(`ffmpeg -i ${file} -y -ss 0 -t ${section.from}`);
    const file2 = await cp.exec(`ffmpeg -i ${file} -y -ss ${section.from} -t ${section.to}`);
    const file3 = await cp.exec(`ffmpeg -i ${file} -y -ss ${section.to}`);
    files.push(file1);
    files.push(file3);

    return file2;
  } catch (error) {
    console.log(error);
    throw new Error('setupAudioFileSection error');
  }
};



But my local ffmpeg prints the error below because of [object Object]. I think the local ffmpeg does not recognize the audio file received from the client. I could resolve this issue by writing the file to disk or memory directly, but I am not sure whether that solution is right or not.


error Error: Command failed: ffmpeg -i [object Object] -y -ss 10 -t 20 -f mp3 ./TestOutput
ffmpeg version 5.1.1 Copyright (c) 2000-2022 the FFmpeg developers
 built with Apple clang version 13.1.6 (clang-1316.0.21.2.5)
 configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/5.1.1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-neon
 libavutil 57. 28.100 / 57. 28.100
 libavcodec 59. 37.100 / 59. 37.100
 libavformat 59. 27.100 / 59. 27.100
 libavdevice 59. 7.100 / 59. 7.100
 libavfilter 8. 44.100 / 8. 44.100
 libswscale 6. 7.100 / 6. 7.100
 libswresample 4. 7.100 / 4. 7.100
 libpostproc 56. 6.100 / 56. 6.100

[object: No such file or directory




How can I write the file on my server and read it back? If that is possible, I can probably resolve the [object Object] ffmpeg issue. Please give me some answer or advice on my issue. Thank you!
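
One way to do that, sketched below under the assumption that the upload is available as file.buffer (i.e. multer memory storage, as above); the temp paths and the cutSection name are hypothetical:

import { execFile } from "child_process";
import { promisify } from "util";
import { writeFile, unlink } from "fs/promises";
import { tmpdir } from "os";
import { join } from "path";

const execFileAsync = promisify(execFile);

type Section = { from: number; to: number };

// Hypothetical helper: write the uploaded buffer to disk first, then hand
// ffmpeg the *path* instead of the Multer file object, which is what was
// being stringified to [object Object] in the original command.
export const cutSection = async (file: Express.Multer.File, section: Section) => {
  const inputPath = join(tmpdir(), file.originalname); // e.g. /tmp/input1.mp3
  const outputPath = join(tmpdir(), `cut-${file.originalname}`);

  await writeFile(inputPath, file.buffer); // requires memory storage

  // execFile takes the arguments as an array, so nothing is re-parsed by a shell.
  await execFileAsync("ffmpeg", [
    "-y",
    "-i", inputPath,
    "-ss", String(section.from),
    "-t", String(section.to - section.from),
    outputPath,
  ]);

  await unlink(inputPath); // clean up the temp input
  return outputPath; // read it back with fs.readFile if needed
};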


-
vsync flag usage in ffmpeg while filtering
13 October 2022, by antortjim. I am trying to apply a threshold to an input video with ffmpeg, but I observe the following warning, emitted for every processed frame:


[mp4 @ 0x56360181a200] Non-monotonous DTS in output stream 0:0; previous: 182272, current: 182272; changing to 182273. This may result in incorrect timestamps in the output file.



where the previous and current values are always 1 less than the value to which the DTS (Decoding Time Stamp) is changed.


I have noticed this warning is emitted only if I set -vsync passthrough (which I changed from the original -vsync 0 seen in many online examples).

# input.mp4 has resolution 790x790
ffmpeg -vsync passthrough -i input.mp4 -f lavfi -i color=808080:s=790x790 -f lavfi -i color=black:s=790x790 -f lavfi -i color=white:s=790x790 -filter_complex '[0:v][1:v][2:v][3:v]threshold' -an -c:v h264_nvenc threshold.mp4



Should I leave the vsync flag set to the default (auto or -1), or is -vsync passthrough essential to guarantee the frames are displayed in the right order? In that case, how do I handle this warning? Some other online examples I found of users experiencing this warning differ from mine, because in their case they are concatenating videos (see 1, 2).

From the documentation on the -vsync flag, at the end, I see:



With -map you can select from which stream the timestamps should be taken. You can leave either video or audio unchanged and sync the remaining stream(s) to the unchanged one




Maybe this warning should be handled with -map? But I don't know how.

Side note: I keep getting the deprecation warning telling me to change -vsync to -fps_mode; however, doing so breaks the command.
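
One possible reason for the breakage (an assumption, not verified against this exact build): -vsync was accepted as a global option, while -fps_mode is a per-output-stream option, so it may need to be placed after the inputs and before the output file, e.g.:

ffmpeg -i input.mp4 -f lavfi -i color=808080:s=790x790 -f lavfi -i color=black:s=790x790 -f lavfi -i color=white:s=790x790 -filter_complex '[0:v][1:v][2:v][3:v]threshold' -an -c:v h264_nvenc -fps_mode passthrough threshold.mp4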

FFmpeg version:


commit 28ac2279adb860ea8b90d3073603912bf3eb6a83 from the ffmpeg master branch

ffmpeg version N-108625-g28ac2279ad Copyright (c) 2000-2022 the FFmpeg developers
built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
configuration: --enable-nonfree --enable-cuda-nvcc --enable-libnpp --enable-gpl --extra-cflags=-I/usr/local/cuda/include --extra-ldflags=-L/usr/local/cuda/lib64 --disable-static --enable-shared
libavutil 57. 39.101 / 57. 39.101
libavcodec 59. 50.100 / 59. 50.100
libavformat 59. 34.101 / 59. 34.101
libavdevice 59. 8.101 / 59. 8.101
libavfilter 8. 49.101 / 8. 49.101
libswscale 6. 8.112 / 6. 8.112
libswresample 4. 9.100 / 4. 9.100
libpostproc 56. 7.100 / 56. 7.100



OS


Ubuntu 20.04.4 LTS


-
Restream an RTSP stream to an RTMP server
26 September 2022, by Sayan Chakraborty. I have a few Hikvision cameras that I want to access remotely in an app that I wrote. I am able to get the camera feed when I am connected to the network that the cameras are on. So I have set up an RTMP server on EC2 with Nginx, to which I will broadcast the camera video feed through the Raspberry Pi, and from which my app will be able to fetch the feed.


I used OBS to stream the camera feed to the RTMP server and was able to successfully get the video feed from the EC2 instance, but when using ffmpeg I get the following error:


pi@raspberrypi:~$ ffmpeg -re -i rtsp://admin:pass@192.168.0.253/ISAPI/Streaming/channels/101 -an -c:v libx264 -f flv rtmp://ec2instanceip/live
ffmpeg version 4.2.7-0ubuntu0.1 Copyright (c) 2000-2022 the FFmpeg developers
 built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
 configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
 libavutil 56. 31.100 / 56. 31.100
 libavcodec 58. 54.100 / 58. 54.100
 libavformat 58. 29.100 / 58. 29.100
 libavdevice 58. 8.100 / 58. 8.100
 libavfilter 7. 57.100 / 7. 57.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 5.100 / 5. 5.100
 libswresample 3. 5.100 / 3. 5.100
 libpostproc 55. 5.100 / 55. 5.100
[rtsp @ 0x55dd3b126300] Missing PPS in sprop-parameter-sets, ignoring
[rtsp @ 0x55dd3b126300] UDP timeout, retrying with TCP
[rtsp @ 0x55dd3b126300] method PAUSE failed: 551 Option not supported
[rtsp @ 0x55dd3b126300] Could not find codec parameters for stream 0 (Video: h264, none): unspecified size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Guessed Channel Layout for Input Stream #0.1 : mono
Input #0, rtsp, from 'rtsp://admin:pass@192.168.0.253/ISAPI/Streaming/channels/101':
 Metadata:
 title : Media Presentation
 Duration: N/A, bitrate: 64 kb/s
 Stream #0:0: Video: h264, none, 90k tbr, 90k tbn, 180k tbc
 Stream #0:1: Audio: pcm_mulaw, 8000 Hz, mono, s16, 64 kb/s
Output #0, flv, to 'rtmp://ec2instanceip/live':
Output file #0 does not contain any stream
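
The log itself hints at one possible direction ("Missing PPS in sprop-parameter-sets", "Could not find codec parameters", and the explicit analyzeduration/probesize suggestion): forcing TCP transport for RTSP and enlarging the probe window before the input sometimes lets ffmpeg pick up the H.264 parameters. A sketch with illustrative values:

ffmpeg -rtsp_transport tcp -probesize 10M -analyzeduration 10M -re -i rtsp://admin:pass@192.168.0.253/ISAPI/Streaming/channels/101 -an -c:v libx264 -f flv rtmp://ec2instanceip/live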