
Media (2)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
-
Carte de Schillerkiez
13 May 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (81)
-
The farm’s regular Cron tasks
1 December 2010, by
Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance in the shared farm on a regular basis. Coupled with a system Cron on the central site of the farm, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...) -
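The system-Cron coupling described above can be pictured as a single crontab entry on the central server; the host name and the use of curl are illustrative assumptions, not the farm’s actual configuration:

```
# Hypothetical crontab line: fetch the central SPIP site every minute so
# its task scheduler (and through it the super Cron) fires even when no
# visitor shows up on the rarely visited sites.
* * * * * curl -s http://central.example.org/spip.php > /dev/null
```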
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSpip installation is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out. -
Sites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page.
On other sites (9932)
-
Capture CMOS video with FPGA, encode and send over Ethernet
23 December 2015, by ya_urock
I am planning an open-source university project for my students, based on a Zynq Xilinx FPGA, that will capture CMOS video, encode it into a transport stream, and send it over Ethernet to a remote PC. Basically, I want to design yet another IP camera. I have strong FPGA experience, but I lack knowledge regarding encoding and transferring video data. Here is my plan:
1. Connect the CMOS camera to the FPGA, receive video frames, and save them to external DDR memory, verifying via the HDMI output to a monitor. I have no problems with that.
2. I understand that I have to compress my video stream, for example into H.264 format, and put it into a transport stream. Here I have little knowledge and need some hints.
3. After I form the transport stream, I can send it over the network using UDP packets. I have a working hardware solution that reads data from a FIFO and sends it to a remote PC as UDP packets.
4. And finally, I plan to receive and play the video using the ffmpeg library:

ffplay udp://localhost:5678
My question is basically regarding step 2. How do I convert pixel frames into a transport stream? My options are:
- Use commercial IP, like
  Here I doubt that they are free to use, and we don’t have much funding.
- Use open cores, like
  - http://sourceforge.net/projects/hardh264/ : here the core generates only H.264 output, but how do I encapsulate it into a transport stream?
  - I have searched opencores.org, but with no success on this topic.
  - Maybe somebody knows some good, relevant open-source FPGA projects?
- Develop a hardware encoder myself using Vivado HLS (C language). But the problem here is that I don’t know the algorithm. Maybe I could dig into ffmpeg or Cisco’s openh264 library and find a function there that converts raw pixel frames to H.264 format and then puts them into a transport stream? Any help would be appreciated here as well.

Also, I am worried about format compatibility between the stream I might generate inside the FPGA and the one expected on the host by the ffplay utility. Any help, hints, links, and books are appreciated!
-
Create animated Gif in Kotlin
27 October 2022, by Gregor Sotošek
I am struggling with something I thought would be a piece of cake. In my Android app (Jetpack Compose variant), I want to save some screenshots (not many, 10-20 of them), then pack them into an animated GIF file, which the user would then be able to share with other users. I am having a hard time creating or saving a GIF file from the saved screenshots. The only thing I found on the web was a pretty old post here:


1: How to create an animated GIF from JPEGs in Android (development), but I just can’t get it to work. All the other posts are about displaying a GIF in Android, which is not my problem at the moment. If someone can point me in the right direction, that would be very nice.


I also tried to make it work with FFmpeg, but I don’t know how to do that either:
https://proandroiddev.com/a-story-about-ffmpeg-in-android-part-i-compilation-898e4a249422
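Since FFmpeg is already on the table, the GIF assembly itself is a two-pass palette job on the command line. The frame file names below are assumptions, and on Android the same command strings could be handed to a wrapper library such as FFmpegKit rather than a shell:

```shell
# Make ten numbered stand-in frames (on the device these would be the
# saved screenshots, e.g. frame_001.jpg ... frame_010.jpg).
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x240:rate=10 frame_%03d.jpg

# Pass 1: derive an optimized 256-color palette from the frames.
ffmpeg -y -framerate 5 -i frame_%03d.jpg -vf palettegen palette.png

# Pass 2: assemble the frames at 5 fps into an animated GIF with that palette.
ffmpeg -y -framerate 5 -i frame_%03d.jpg -i palette.png \
       -lavfi paletteuse animated.gif
```

The palette pass is optional but avoids the banding you get from the default 256-color quantization.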


Thank you


-
How to Match ASS Subtitle Font Size with Flutter Text Size for a 1080p Video?
16 December 2024, by Mostafa Fathi
I’m working on a project where I need to synchronize the font size of ASS subtitles with the text size a user sees in a Flutter application. The goal is to ensure that the text size in a 1080p video matches what the user sees in the app.


What I've Tried:


- Calculating the font size using the height ratio (PlayResY / DeviceHeight):
  I used the formula:

  FontSize_ASS = FontSize_Flutter * (PlayResY / DeviceHeight)

  While the result seemed logical, the final output in the video was smaller than expected.

- Adding a scaling factor:
  I introduced a scaling factor (around 3.0) to address the size discrepancy.
  This improved the result, but it still felt inconsistent and lacked precision.

- Using force_style in FFmpeg:
  I applied the force_style parameter to control the font size in FFmpeg directly:

  ffmpeg -i input.mp4 -vf "subtitles=subtitle.ass:force_style='FontSize=90'" -c:a copy output.mp4

  While it produced better results, it’s not an ideal solution, as it bypasses the calculations in the ASS file.

- Aligning PlayResX and PlayResY in the ASS file:
  I ensured that these parameters matched the target video resolution (1920×1080):

  PlayResX: 1920
  PlayResY: 1080

  Despite this adjustment, the font size didn’t align perfectly with the Flutter app text size.

- Reading font metrics from the font file dynamically:
  To improve precision, I wrote a function in Flutter that reads font metrics (units per EM, ascender, and descender) from the TTF font file and calculates a more accurate scaling factor:

Future<double?> readFontMetrics(
  String fontFilePath,
  double originalFontSize,
) async {
  final fontData = await File(fontFilePath).readAsBytes();
  final fontBytes = fontData.buffer.asUint8List();
  final byteData = ByteData.sublistView(fontBytes);

  int numTables = readUInt16BE(byteData, 4);
  int offsetTableStart = 12;
  Map<String, Map<String, int>> tables = {};

  for (int i = 0; i < numTables; i++) {
    int recordOffset = offsetTableStart + i * 16;
    String tag =
        utf8.decode(fontBytes.sublist(recordOffset, recordOffset + 4));
    int offset = readUInt32BE(byteData, recordOffset + 8);
    int length = readUInt32BE(byteData, recordOffset + 12);

    tables[tag] = {
      'offset': offset,
      'length': length,
    };
  }

  if (!tables.containsKey('head') || !tables.containsKey('hhea')) {
    print('Required tables not found in the font file.');
    return null;
  }

  int headOffset = tables['head']!['offset']!;
  int unitsPerEm = readUInt16BE(byteData, headOffset + 18);

  int hheaOffset = tables['hhea']!['offset']!;
  int ascender = readInt16BE(byteData, hheaOffset + 4);
  int descender = readInt16BE(byteData, hheaOffset + 6);

  print('unitsPerEm: $unitsPerEm');
  print('ascender: $ascender');
  print('descender: $descender');

  int nominalSize = unitsPerEm;
  int realDimensionSize = ascender - descender;
  double scaleFactor = realDimensionSize / nominalSize;
  double realFontSize = originalFontSize * scaleFactor;

  print('Scale Factor: $scaleFactor');
  print('Real Font Size: $realFontSize');

  return realFontSize;
}

- This function dynamically reads the font properties (ascender, descender, and unitsPerEm) and calculates a scale factor to get the real font size. Despite this effort, discrepancies persist when mapping it to the ASS font size.

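As a sanity check on the height-ratio formula from the first attempt, the arithmetic can be run in isolation; the Flutter font size of 40 and the device height of 2400 px below are invented example values:

```shell
# FontSize_ASS = FontSize_Flutter * (PlayResY / DeviceHeight)
# Example values (assumed): Flutter size 40, PlayResY 1080, device height 2400 px.
awk 'BEGIN {
  flutter_size  = 40
  play_res_y    = 1080
  device_height = 2400
  printf "FontSize_ASS = %.1f\n", flutter_size * (play_res_y / device_height)
}'
# prints: FontSize_ASS = 18.0
```

One unit mismatch worth ruling out: Flutter font sizes are in logical pixels, while a raw screen height is in physical pixels, so dividing by the physical DeviceHeight understates the ratio by the devicePixelRatio (typically 2 to 3.5 on phones). That would roughly explain both the "smaller than expected" result and the ~3.0 empirical factor from the second attempt.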
Question:
How can I ensure that the font size in the ASS file accurately reflects the size the user sees in Flutter? Is there a reliable method to calculate or align the sizes correctly across both systems? Any insights or suggestions would be greatly appreciated.


Thank you ! 🙏