
Other articles (46)
-
Keeping control of your media in your hands
13 April 2011
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible, and development is based on expanding the (...)
-
Submit bugs and patches
13 April 2011
Unfortunately, no software is ever perfect.
If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including its exact version; as precise an explanation of the problem as possible; if possible, the steps that led to the problem; and a link to the site / page in question.
If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
You may also (...)
-
Adding notes and captions to images
7 February 2011
To add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, modifying and deleting notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...)
On other sites (5335)
-
How do I redirect the output of SpeechSynthesizer to a Process of ffmpeg
27 September 2020, by TheOneAndOnlyMrX
I am trying to have a SpeechSynthesizer generate some audio data, pipe it into an FFmpeg Process, and have FFmpeg save the data to a file (output.wav). Eventually, the audio data will be used for something else, which is why I am using FFmpeg.


using (MemoryStream voiceStream = new MemoryStream())
using (Process ffmpeg = new Process())
{
    SpeechSynthesizer synth = new SpeechSynthesizer();

    int samplesPerSecond = 48000;
    int bitsPerSample = 8;
    int channelCount = 2;
    int averageBytesPerSecond = samplesPerSecond * (bitsPerSample / 8) * channelCount;
    int blockalign = (bitsPerSample / 8) * channelCount;
    byte[] formatSpecificData = new byte[0];

    synth.SetOutputToAudioStream(
        voiceStream,
        new System.Speech.AudioFormat.SpeechAudioFormatInfo(
            System.Speech.AudioFormat.EncodingFormat.Pcm,
            samplesPerSecond,
            bitsPerSample,
            channelCount,
            averageBytesPerSecond,
            blockalign,
            formatSpecificData
        )
    );

    synth.Speak("Hello there");

    synth.SetOutputToNull();

    ffmpeg.StartInfo = new ProcessStartInfo
    {
        FileName = "ffmpeg",
        Arguments = $"-y -f u8 -ac 2 -ar 48000 -i pipe:0 out.wav",
        UseShellExecute = false,
        RedirectStandardOutput = true,
        RedirectStandardInput = true
    };

    ffmpeg.Start();

    using (Stream ffmpegIn = ffmpeg.StandardInput.BaseStream)
    {
        voiceStream.CopyTo(ffmpegIn);

        ffmpegIn.FlushAsync();
    }
}



When running the program, FFmpeg reports that the input stream contains no data, and it produces an empty file.
I believe I am not interfacing with the Process object correctly; however, the problem might also be an incorrect specification of the audio stream, since I do not know much about audio.
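A minimal sketch of one likely cause, reusing the variables from the snippet above: SpeechSynthesizer leaves the MemoryStream positioned at the end of the written audio, so CopyTo finds nothing to copy unless the stream is rewound first. This is only an illustration, not a confirmed fix for the original setup.

    // Hypothetical adjustment to the tail of the snippet above:
    // rewind the MemoryStream before piping it to ffmpeg, otherwise
    // CopyTo starts at the end of the buffer and copies zero bytes.
    synth.SetOutputToNull();
    voiceStream.Position = 0;      // rewind to the start of the synthesized audio

    ffmpeg.Start();

    using (Stream ffmpegIn = ffmpeg.StandardInput.BaseStream)
    {
        voiceStream.CopyTo(ffmpegIn);
        ffmpegIn.Flush();          // synchronous flush; disposing the stream then
    }                              // closes stdin so ffmpeg sees EOF and can finish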


-
what is AV_SAMPLE_FMT_FLT
21 December 2020, by Golu

SwrContext *swr_ctx = swr_alloc_set_opts(NULL,
                                         AV_CH_LAYOUT_STEREO,               /* output channel layout */
                                         AV_SAMPLE_FMT_FLT,                 /* output sample format  */
                                         sample_rate,                       /* output sample rate    */
                                         pCodecParameters->channel_layout,  /* input channel layout  */
                                         pCodecParameters->format,          /* input sample format   */
                                         pCodecParameters->sample_rate,     /* input sample rate     */
                                         0,
                                         NULL);



What exactly is AV_SAMPLE_FMT_FLT? I have already read the docs, but I want to know what a float sample format means in the context of audio, and how the binary audio data actually looks in that format.
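For illustration (not part of the original question): AV_SAMPLE_FMT_FLT means each sample is a 32-bit IEEE float, nominally in the range -1.0 to +1.0, with channels interleaved, so a stereo stream alternates left and right samples in memory; the planar variant AV_SAMPLE_FMT_FLTP instead keeps each channel in its own buffer. A small C sketch of that layout:

    /* Sketch of how interleaved 32-bit float stereo samples
     * (AV_SAMPLE_FMT_FLT) sit in memory: L0, R0, L1, R1, ...
     * one 4-byte float per sample, values nominally in [-1.0, 1.0]. */
    #include <stdio.h>

    int main(void)
    {
        float interleaved[] = { 0.25f, -0.25f, 0.50f, -0.50f };  /* two stereo sample pairs */
        int nb_samples = 2;
        int channels = 2;

        for (int i = 0; i < nb_samples; i++)
            printf("sample %d: left=%f right=%f\n",
                   i, interleaved[i * channels], interleaved[i * channels + 1]);
        return 0;
    }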


-
Firebase Function using ffmpeg successful with emulator, out of memory when deployed
25 September 2024, by flyingL123
I need some help. I have a .mov file in Firebase Storage. The file is 25 seconds long and 106 MB. I wrote a callable Firebase function that uses ffmpeg to convert the file to a .mp4 file and save it to Firebase Storage. When I test the function using the Functions emulator, it works without issue. The function returns successfully and I see the converted file appear in storage. The converted video is about 6 MB and plays correctly when downloaded.

When I deploy this function and run it in production on the exact same video file, the function fails with:




'Memory limit of 256 MiB exceeded with 407 MiB used. Consider
increasing the memory limit, see
https://cloud.google.com/functions/docs/configuring/memory'




As a test, I edited the function and changed its allocated memory to 1 GiB, then tested the function again in production. Now I receive the same error:



'Memory limit of 1024 MiB exceeded with 1029 MiB used. Consider
increasing the memory limit, see
https://cloud.google.com/functions/docs/configuring/memory'




This is my function code:


const {initializeApp} = require("firebase-admin/app");
const {onCall} = require("firebase-functions/v2/https");
const { getStorage, getDownloadURL } = require('firebase-admin/storage');
const fs = require('fs');

initializeApp();

exports.convertVideo = onCall((request) => {
  const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path;
  const ffmpeg = require('fluent-ffmpeg');
  ffmpeg.setFfmpegPath(ffmpegPath);

  const originalLocation = request.data.originalLocation;
  const convertedLocation = request.data.convertedLocation;
  const originalVideoFile = getStorage().bucket().file(originalLocation);
  const newVideoFile = getStorage().bucket().file(convertedLocation);

  return new Promise(async (resolve, reject) => {
    // Download the source video to the function's temporary directory.
    await originalVideoFile.download({destination: '/tmp/original'}).catch(console.error);

    // Convert to fragmented MP4 and stream the output straight into Cloud Storage.
    ffmpeg('/tmp/original')
      .addOutputOptions('-movflags +frag_keyframe+separate_moof+omit_tfhd_offset+empty_moov')
      .format('mp4')
      .on('error', (err) => {
        console.log(err);
      })
      .pipe(newVideoFile.createWriteStream())
      .on('error', (err) => {
        console.log(err);
      })
      .on('close', async () => {
        // Clean up the temporary file and resolve with a download URL.
        fs.unlink('/tmp/original', (err) => {
          if (err) throw err;
        });
        const convertedUrl = await getDownloadURL(newVideoFile);
        resolve([convertedLocation, convertedUrl]);
      });
  });
});



I am sending a test request to the Functions emulator using curl:


curl -d '{"data": {"originalLocation": "customer_videos/original_video.mov", "convertedLocation": "customer_videos/converted/original_video.mp4"}}' -H "Content-Type: application/json" http://127.0.0.1:5001/foo/bar/convertVideo



This works correctly. When I send the same request to the deployed function, I receive the out-of-memory error.


curl -d '{"data": {"originalLocation": "customer_videos/original_video.mov", "convertedLocation": "customer_videos/converted/original_video.mp4"}}' -H "Content-Type: application/json" https://convertvideo-foobarbaz-uc.a.run.ap



Can somebody please help me understand why this is happening? I didn't think I was doing anything too memory-intensive, especially since it works correctly using the emulator.
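One documented difference between the emulator and production, offered here only as a hedged sketch and not a confirmed diagnosis: in deployed Cloud Functions, /tmp is an in-memory filesystem, so the 106 MB source download is charged against the function's memory limit on top of the Node.js runtime and the ffmpeg child process, whereas the local emulator writes /tmp to real disk. Under that assumption, one option is to give the v2 callable function a larger allocation via its runtime options:

    // Hedged sketch: pass runtime options to the v2 callable function.
    const {onCall} = require("firebase-functions/v2/https");

    exports.convertVideo = onCall(
      { memory: "2GiB", timeoutSeconds: 540 },   // larger allocation for the deployed function
      (request) => {
        // ... same conversion logic as in the function above ...
      }
    );

If the allocation is still exhausted, streaming the source with originalVideoFile.createReadStream() instead of downloading it to /tmp would keep the file itself out of memory, though piped .mov input is not always seekable enough for ffmpeg to demux.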