
Other articles (37)
- Apache-specific configuration
4 February 2011
Specific modules
For Apache configuration, it is advisable to enable certain modules that are not specific to MediaSPIP but improve performance: mod_deflate and mod_headers, to have Apache compress pages automatically (see this tutorial); mod_expires, to handle cache expiry correctly (see this tutorial).
It is also advisable to add Apache support for the WebM MIME type, as described in this tutorial.
Creation of a (...)
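As a rough illustration of the directives this teaser refers to, here is a minimal Apache configuration sketch. It is a hedged example under assumptions: the compressed MIME types and cache lifetime are illustrative, not the tutorials' exact values.

# Illustrative only: compression, expiry headers, and the WebM MIME type
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
</IfModule>
# Serve .webm files with the correct MIME type
AddType video/webm .webm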
- HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows media playback on the major mobile platforms with the above (...)
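As context for how such a player degrades gracefully, here is a generic HTML5 video tag with a Flash fallback nested inside it. This is a hedged sketch: the file names and Flowplayer parameters are placeholders, not MediaSPIP's actual markup.

<!-- Browsers that support <video> play a native source; older ones render the nested fallback -->
<video controls width="640">
  <source src="clip.webm" type="video/webm">
  <source src="clip.mp4" type="video/mp4">
  <object type="application/x-shockwave-flash" data="flowplayer.swf">
    <param name="flashvars" value='config={"clip":"clip.mp4"}'>
  </object>
</video>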
- From upload to final video [standalone version]
31 January 2010
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, you need to create a SPIP article and attach the "source" video document to it.
When this document is attached to the article, two actions in addition to the normal behaviour are executed: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
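The two extra actions described above can be pictured with standard FFmpeg tooling. These commands illustrate the idea only, not SPIPMotion's actual internals; the file names are placeholders.

# Retrieve technical information about the audio and video streams
ffprobe -v quiet -print_format json -show_streams source.mp4

# Generate a thumbnail by extracting a single frame
ffmpeg -i source.mp4 -ss 00:00:05 -vframes 1 thumbnail.png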
On other sites (3561)
- Search pre-defined words in live stream video
31 January 2018, by Brajraj Agrawal
I want to search for some pre-defined keywords in a YouTube live stream (or some other live stream) in my application.
I have gone through many articles. I just have a URL that is a live-streaming URL, and a set of keywords. If any of those keywords appears in the video, I want to give the user a link to jump to the frame where the keyword appears, in as close to real time as possible. My understanding is that this needs the Google Speech API and FFmpeg, but I am confused about how to do playback and which HTML player to use.
E.g. a video is streaming live, and if the word "dog" is spoken in the video, the user should be notified with a playback URL starting 5 seconds before it appears in the video.
Please suggest an approach.
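For the FFmpeg half of that approach, one common pattern is to peel the audio off the live stream in short chunks that a speech-to-text API can consume. A hedged sketch, assuming an HLS input URL and 10-second chunks (the URL is a placeholder):

ffmpeg -i https://example.com/live/stream.m3u8 -vn -ac 1 -ar 16000 -f segment -segment_time 10 chunk%03d.wav

Each chunk's index then gives an approximate timestamp to link back into the player for playback.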
- How to add subtitles using FFmpeg-kit?
17 November 2024, by Mohammed Bekele
I'm running a Flutter app that uses the FFmpeg-kit package to burn subtitles into a video. I have a word list with timings and map it to generate .srt and .ass files, but executing the command didn't work.


First, here is how I generated the ASS file.


Future<String> _createAssFile() async {
  String filePath = await getAssOutputFilePath();

  final file = File(filePath); // File comes from dart:io
  final buffer = StringBuffer();

  // Write ASS headers
  buffer.writeln('[Script Info]');
  buffer.writeln('Title: Generated ASS');
  buffer.writeln('ScriptType: v4.00+');
  buffer.writeln('Collisions: Normal');
  buffer.writeln('PlayDepth: 0');
  buffer.writeln('Timer: 100,0000');
  buffer.writeln('[V4+ Styles]');
  buffer.writeln(
      'Format: Name, Fontname, Fontsize, PrimaryColour, SecondaryColour, OutlineColour, BackColour, Bold, Italic, Underline, StrikeOut, ScaleX, ScaleY, Spacing, Angle, BorderStyle, Outline, Shadow, Alignment, MarginL, MarginR, MarginV, Encoding');
  buffer.writeln(
      'Style: Default,Arial,40,&H00FFFFFF,&H000000FF,&H00000000,&H80000000,1,1,1,1,100,100,0,0,1,1,1,2,10,10,10,1');
  buffer.writeln('[Events]');
  buffer.writeln(
      'Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text');

  // Write events: one Dialogue line per timed word
  for (int i = 0; i < widget.words.length; i++) {
    final word = widget.words[i];
    final startTime = _formatAssTime(word['startTime'].toDouble());
    final endTime = _formatAssTime(word['endTime'].toDouble());
    final text = word['word'];

    buffer.writeln('Dialogue: 0,$startTime,$endTime,Default,,0,0,0,,$text');
  }

  await file.writeAsString(buffer.toString());
  return filePath;
}

// ASS timestamps use the form H:MM:SS.cc (centiseconds)
String _formatAssTime(double seconds) {
  final int hours = seconds ~/ 3600;
  final int minutes = (seconds % 3600) ~/ 60;
  final int secs = (seconds % 60).toInt();
  // Fractional second in milliseconds (the % 1000 discards whole seconds)
  final int millis = ((seconds - secs) * 1000).toInt() % 1000;

  return '$hours:${minutes.toString().padLeft(2, '0')}:${secs.toString().padLeft(2, '0')}.${(millis ~/ 10).toString().padLeft(2, '0')}';
}


Then I used this command, which is the official way of burning an ASS file into a video:


String newCmd = "-i $videoPath -vf ass=$assFilePath -c:a copy $_outputPath";



Yet executing this command did not work, so I changed it to this:


String newCmd = "-i $videoPath -i $assFilePath $_outputPath";



That command works, but the styling is not applied, so I tried yet another command to filter and position the ASS subtitles:


String newCmd = "-i $videoPath -i $assFilePath -filter_complex \"[0:v][1:s]ass=\\an5:text='%{event.text}',scale=iw*0.8:ih*0.8,setdar=16/9[outv]\" -map [outv] -c:a copy $_outputPath";



This made things even worse: the app crashed when I ran it. Here is the log printed before the crash:


F/libc (19624): Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x19 in tid 21314 (pool-4-thread-7), pid 19624 (ple.caption_app)
*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
Build fingerprint: 'samsung/a15nsxx/a15:14/UP1A.231005.007/A155FXXU1AWKA:user/release-keys'
Revision: '5'
ABI: 'arm64'
Processor: '7'
Timestamp: 2024-07-10 19:27:40.812476860+0300
Process uptime: 1746s
Cmdline: com.example.caption_app
pid: 19624, tid: 21314, name: pool-4-thread-7 >>> com.example.caption_app <<<
uid: 10377
tagged_addr_ctrl: 0000000000000001 (PR_TAGGED_ADDR_ENABLE)
signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0000000000000019
Cause: null pointer dereference
 x0 0000000000000001 x1 b400006ec98d8660 x2 0000000000000000 x3 0000000000000002
 x4 0000006f62d1ea12 x5 b400006e04f745ea x6 352f35372f64352f x7 7f7f7f7f7f7f7f7f
 x8 632ace36577a905e x9 632ace36577a905e x10 0000006e191378c0 x11 0000000000000001
 x12 0000000000000001 x13 0000000000000000 x14 0000000000000004 x15 0000000000000008
 x16 0000006e27ffa358 x17 0000006e27dbfb00 x18 0000006e00d48000 x19 0000006f62d1f0e8
 x20 0000006f62d24000 x21 0000006e27ffc5d0 x22 0000006e27ffc5f0 x23 0000000000000000
 x24 b400006f3f089248 x25 0000006f62d1f220 x26 b400006f3f057b20 x27 0000000000000002
 x28 0000000000000088 x29 0000006f62d1f100
 lr 0000006e27dbfb0c sp 0000006f62d1f0a0 pc 0000006e27dbfb10 pst 0000000060001000
1 total frames
backtrace:
 #00 pc 0000000000121b10 /data/app/~~oWqjrGA2indQhuEw6u_J2A==/com.example.caption_app-u2Ofk54FVtMc5D-i3SLC6g==/base.apk!libavfilter.so (offset 0x10a46000) (avfilter_inout_free+16) 
Lost connection to device.



I tried the equivalent command on my local Windows machine; this is what I used:


ffmpeg -i F:\ffmpeg\video.avi -vf subtitles='F\:\\ffmpeg\\caption.srt':force_style='Fontsize=24' F:\ffmpeg\new.mp4



As you can see above, when the subtitles filter is used, each path needs extra backslashes, plus an escaped colon after the drive letter. But I don't know whether the same applies on Android devices. How can I make sense of this?
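For what it's worth, the extra backslashes in the Windows command are FFmpeg's filter-argument escaping: an unescaped colon after the drive letter would be read as a filter-option separator. Android paths have no drive letter, so often quoting alone suffices. A hedged Dart sketch of that idea (the escaping line is an assumption, not FFmpeg-kit API):

// Hypothetical: escape '\' and ':' for the filter graph, then quote the value
final escapedAss = assFilePath.replaceAll('\\', '\\\\').replaceAll(':', '\\:');
String newCmd = "-i $videoPath -vf \"ass='$escapedAss'\" -c:a copy $_outputPath";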


- Fluent-ffmpeg randomly doesn't add the text I want to the video, even though it's in the filters
15 December 2022, by ToborWinner
I am making a simple script that adds some text to a 4-second video. It all works fine, but sometimes it randomly doesn't add some of the text.
Here are the relevant parts of my code:



const ffmpeg = require('fluent-ffmpeg'); // missing from the excerpt
const wrap = require('word-wrap');

// `title` and `thoughts` are defined elsewhere in the script
const video = ffmpeg('path/to/video.mp4');

let index = 0;
let left = true;

// Title drawn centred near the top of the frame
const filters = [{
  filter: 'drawtext',
  options: {
    //fontfile:'font.ttf',
    text: title,
    fontsize: 30,
    fontcolor: 'white',
    x: '(main_w/2-text_w/2)',
    y: 130,
    shadowcolor: 'black',
    shadowx: 2,
    shadowy: 2
  }
}];

for (let thought of thoughts) {
  if (thought.length == 0) {
    continue;
  }
  thought = wrap(thought, { width: 35 });
  const strings = thought.split('\n');
  let line = 0;
  for (const string of strings
    .filter(string => string.length > 0)
    .map(string => string.trim())
  ) {
    // Each thought block sits 130px below the previous; each wrapped line adds 20px
    let yoffset = 130 + (130 * (index + 1)) + (line * 20);
    if (yoffset < 0) {
      yoffset = 0;
    }
    console.log(string, yoffset);
    filters.push({
      filter: 'drawtext',
      options: {
        //fontfile:'font.ttf',
        text: string,
        fontsize: 18,
        fontcolor: 'white',
        // Alternate between 30% and 70% of the frame width
        x: `(main_w${left ? '*0.3' : '*0.7'}-text_w/2)`,
        y: yoffset,
        shadowcolor: 'black',
        shadowx: 2,
        shadowy: 2
      }
    });
    line++;
  }
  index++;
  left = !left;
}

video.videoFilters(filters);
video.noAudio();

video.save('path/to/output.mp4');




The wrap function comes from the word-wrap package (const wrap = require('word-wrap');).
Thoughts is a list of strings that aren't too long (after wrapping, each ends up being about 2 to 4 lines).
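For reference, word-wrap returns a single string with newline-separated lines, indented by two spaces by default, which is why the code above trims each piece. A small illustration (output shown approximately):

const wrap = require('word-wrap');
console.log(wrap('Nothing is on fire, fire is on things', { width: 15 }));
// -> "  Nothing is on\n  fire, fire is\n  on things"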

The whole snippet above runs inside an async function.


For some reason, only a few of the lines appear in the output video.
Sometimes, when it doesn't do that, it instead throws an error saying that one of the inputs is invalid (while processing filters).
The wrap function seems to work, and so do the yoffset values; I have printed them both.

If someone has an idea why, please help me solve this.


I tried changing the text in thoughts, and for example, this works with no problems (it shows the title, and the texts right, left, right, left, ...):


const thoughts = ["Nothing is on fire, fire is on things","Nothing is on fire, fire is on things","Nothing is on fire, fire is on things","Nothing is on fire, fire is on things","Nothing is on fire, fire is on things"]