
Media (2)
-
Aber Ildut granite
9 September 2011
Updated: September 2011
Language: French
Type: Text
-
Geodiversity
9 September 2011
Updated: August 2018
Language: French
Type: Text
Other articles (100)
-
Update from version 0.1 to 0.2
24 June 2013 — An explanation of the notable changes made between MediaSPIP version 0.1 and version 0.3. What are the new features?
Software dependencies: the latest versions of FFMpeg (>= v1.2.1) are now used; the dependencies for Smush are installed; MediaInfo and FFprobe are installed to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favor of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
-
Accepted formats
28 January 2010 — The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
To begin with, we (...)
-
HTML5 audio and video support
13 April 2011 — MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (6312)
-
Save FFMpeg conversion to PHP variable vs. File System for use with Whisper API?
13 April 2023, by SScotti — I just started working on a little demo to translate audio captured from the front-end as audio/webm using JS and then sent to the back-end in a Laravel app. I guess there are JS libraries that can handle the conversion, but I'd rather use a server-side solution with FFMPEG, which I am doing.


The backend code is below. It seems to be working after playing around with the plain PHP Composer package that I'm using versus the Laravel-specific one that is also installed. I'd rather use this one because I have other PHP apps that are not Laravel.


Questions:


- With the FFMpeg library, is there a way to capture the converted .mp3 file to a PHP variable in the script rather than saving it to the file system and then reading it back in later?

- For the OpenAI call, I'd like to catch exceptions there also. I just sort of have a placeholder there for now.


protected function whisper(Request $request) {

    $yourApiKey = getenv('OPENAI_API_KEY');
    $client = OpenAI::client($yourApiKey);

    $file = $request->file('file');
    $mimeType = $request->file('file')->getMimeType();
    $audioContents = $file->getContent();

    try {
        FFMpeg::open($file)
            ->export()
            ->toDisk('public')
            ->inFormat(new \FFMpeg\Format\Audio\Mp3)
            ->save('song_converted.mp3');
    }
    catch (EncodingException $exception) {
        $command = $exception->getCommand();
        $errorLog = $exception->getErrorOutput();
    }

    $mp3 = Storage::disk('public')->path('song_converted.mp3');
    try {
        $response = $client->audio()->transcribe([
            'model' => 'whisper-1',
            'file' => fopen($mp3, 'r'),
            'response_format' => 'verbose_json',
        ]);
    }
    catch (EncodingException $exception) {
        $command = $exception->getCommand();
        $errorLog = $exception->getErrorOutput();
    }

    echo json_encode($response);
}
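
One possible answer to the first question above, sketched in plain PHP rather than through the laravel-ffmpeg wrapper (which is built around writing exports to a disk): call the ffmpeg binary directly and capture its stdout in a variable. This is only a sketch, not the original code or the library's API; the helper name and $inputPath are illustrative.

// Minimal sketch: pipe the converted MP3 to stdout and keep it in memory.
// -loglevel error keeps stderr small so the pipes do not fill up and block.
function convertToMp3InMemory(string $inputPath): string
{
    $descriptors = [
        0 => ['pipe', 'r'], // stdin (unused)
        1 => ['pipe', 'w'], // stdout: the MP3 bytes
        2 => ['pipe', 'w'], // stderr: ffmpeg diagnostics
    ];

    // -vn drops any video stream; -f mp3 pipe:1 writes the encoded MP3 to stdout.
    $cmd = 'ffmpeg -hide_banner -loglevel error -i ' . escapeshellarg($inputPath) . ' -vn -f mp3 pipe:1';

    $process = proc_open($cmd, $descriptors, $pipes);
    if (!is_resource($process)) {
        throw new \RuntimeException('Could not start ffmpeg');
    }

    fclose($pipes[0]);
    $mp3Bytes = stream_get_contents($pipes[1]); // the converted audio, in a PHP variable
    $errorLog = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);

    if (proc_close($process) !== 0) {
        throw new \RuntimeException('ffmpeg failed: ' . $errorLog);
    }

    return $mp3Bytes;
}

If the OpenAI client still needs a file handle, the bytes can be written to a temporary file with tempnam()/file_put_contents() just before the transcribe call. For the second question, a generic \Exception catch (or the client's own exception class) around the transcribe call would be more appropriate than reusing EncodingException.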









-
React Native Expo File System: open failed: ENOENT (No such file or directory)
9 February 2023, by coloraday — I'm getting this error in a bare React Native project:


Possible Unhandled Promise Rejection (id: 123):
Error: /data/user/0/com.filsufius.VisionishAItest/files/image-new-♥d.jpg: open failed: ENOENT (No such file or directory)



The same code was saving to the File System with no problem yesterday, but today, as you can see, I am getting an ENOENT error, plus I am getting these funny heart shapes ♥d in the path. Any pointers as to what might be causing this, please? I use npx expo run:android to build the app locally and expo start --dev-client to run on a physical Android device connected through USB.


import { useState, useEffect } from "react";
import { Image, View, Text, StyleSheet } from "react-native";
import * as FileSystem from "expo-file-system";
import RNFFmpeg from "react-native-ffmpeg";
import * as tf from "@tensorflow/tfjs";
import * as cocossd from "@tensorflow-models/coco-ssd";
import { decodeJpeg, bundleResourceIO } from "@tensorflow/tfjs-react-native";

const Record = () => {
 const [frames, setFrames] = useState([]);
 const [currentFrame, setCurrentFrame] = useState(0);
 const [model, setModel] = useState(null);
 const [detections, setDetections] = useState([]);

 useEffect(() => {
 const fileName = "image-new-%03d.jpg";
 const outputPath = FileSystem.documentDirectory + fileName;
 RNFFmpeg.execute(
 "-y -i https://res.cloudinary.com/dannykeane/video/upload/sp_full_hd/q_80:qmax_90,ac_none/v1/dk-memoji-dark.m3u8 -vf fps=25 -f mjpeg " +
 outputPath
 )
 .then((result) => {
 console.log("Extraction succeeded:", result);
 FileSystem.readDirectoryAsync(FileSystem.documentDirectory).then(
 (files) => {
 setFrames(
 files
 .filter((file) => file.endsWith(".jpg"))
 .sort((a, b) => {
 const aNum = parseInt(a.split("-")[2].split(".")[0]);
 const bNum = parseInt(b.split("-")[2].split(".")[0]);
 return aNum - bNum;
 })
 );
 }
 );
 })
 .catch((error) => {
 console.error("Extraction failed:", error);
 });
 }, []);

 useEffect(() => {
 tf.ready().then(() => cocossd.load().then((model) => setModel(model)));
 }, []);
 useEffect(() => {
 if (frames.length && model) {
 const intervalId = setInterval(async () => {
 setCurrentFrame((currentFrame) =>
 currentFrame === frames.length - 1 ? 0 : currentFrame + 1
 );
 const path = FileSystem.documentDirectory + frames[currentFrame];
 const imageAssetPath = await FileSystem.readAsStringAsync(path, {
 encoding: FileSystem.EncodingType.Base64,
 });
 const imgBuffer = tf.util.encodeString(imageAssetPath, "base64").buffer;
 const imageData = new Uint8Array(imgBuffer);
 const imageTensor = decodeJpeg(imageData, 3);
 console.log("after decodeJpeg.");
 const detections = await model.detect(imageTensor);
 console.log(detections);
 setDetections(detections);
 }, 100);
 return () => clearInterval(intervalId);
 }
 }, [frames, model]);

 
 return (
 <View style={styles.container}>
 
 <View style={styles.predictions}>
 {detections.map((p, i) => (
 <Text key={i} style={styles.text}>
 {p.class}: {(p.score * 100).toFixed(2)}%
 </Text>
 ))}
 </View>
 </View>
 );
};

const styles = StyleSheet.create({
 container: {
 flex: 1,
 alignItems: "center",
 justifyContent: "center",
 },
 image: {
 width: 300,
 height: 300,
 resizeMode: "contain",
 },
 predictions: {
 width: 300,
 height: 100,
 marginTop: 20,
 },
 text: {
 fontSize: 14,
 textAlign: "center",
 },
});

export default Record;
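
As a debugging aid (a minimal sketch that only uses expo-file-system calls already imported above; the helper name inspectFrames is illustrative, not part of the original code), it can help to confirm what ffmpeg actually wrote to documentDirectory before readAsStringAsync is called on a frame:

// Hypothetical helper: list what is really on disk and check the first frame.
async function inspectFrames() {
  const dir = FileSystem.documentDirectory;
  const files = await FileSystem.readDirectoryAsync(dir);
  console.log("Files in documentDirectory:", files);

  const jpgs = files.filter((f) => f.endsWith(".jpg"));
  if (jpgs.length === 0) {
    console.warn("No .jpg frames found - the ffmpeg output pattern may not match");
    return;
  }

  // getInfoAsync reports existence and size, which separates a bad path
  // (ENOENT) from an empty or partially written frame.
  const info = await FileSystem.getInfoAsync(dir + jpgs[0]);
  console.log("First frame info:", info);
}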



-
HTML5 transparent video with the greatest cross-browser/system support
22 June 2022, by Will Ashworth — I'm encountering an issue getting videos with alpha transparency to load and play reliably on a web page. After some thorough research, this is the approach to encoding transparent video (not composited over a solid background color) that I ended up with.


Hoping the general community has insight into why we're noticing weirdness with MacOS Monterey in Safari 15 🤷♂️


Note: We tried Lottie as an option for the animations, but we found that the DOM became excessively bloated, which would inevitably cause performance issues for the website. So we went back to video as an option.


Convert to HEVC with alpha


ffmpeg -i "source.mov" -c:v hevc_videotoolbox -allow_sw 1 -alpha_quality 1 -vtag hvc1 output.mov



Convert to VP9 with alpha


ffmpeg -i "source.mov" -c:v libvpx-vp9 output.webm



HTML5 method of serving these files to the browser


<video autoplay="autoplay" loop="loop" muted="muted" playsinline="playsinline" class="tmpl-front-page__transition-item tmpl-front-page__transition-item--0 tmpl-front-page__transition-item--banner-video">
  <source src="path/to/video.mov" type='video/mp4; codecs="hvc1"'>
  <source src="path/to/video.webm" type="video/webm">
</video>



How it works


Essentially, we've learned the following :


- 

- Safari supports HEVC with alpha, Chrome does not
- Chrome supports VP9 with alpha, Safari does not






Now we let the browser choose which version it wants to use.
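
A small plain-JavaScript sketch (my addition, not part of the markup above) of how that choice can be inspected: canPlayType() only reports container/codec support, not alpha support, so it is a coarse check, but it shows which source a given browser is likely to pick.

// Coarse check of which encoding the current browser says it can decode.
// Note: canPlayType() cannot confirm that the alpha channel will be honored.
const probe = document.createElement("video");
const hevc = probe.canPlayType('video/mp4; codecs="hvc1"');  // Safari: "maybe"/"probably"
const vp9 = probe.canPlayType('video/webm; codecs="vp9"');   // Chrome/Firefox: "maybe"/"probably"
console.log("HEVC (hvc1):", hevc || "not supported");
console.log("VP9 (webm):", vp9 || "not supported");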


There are issues


There's inconsistency in how reliably this works in reality. For example, I'm currently running MacOS Catalina with Safari 14.0.2, and the videos started loading for me when using the above method.


While testing MacOS Monterey with Safari 15.1 inside a Parallels VM, the video doesn't load at all. That said, another developer on our team took the plunge, upgraded to Monterey with Safari 15.1, and he can see the videos loading just fine.


This is getting a little silly, and I'm not sure what else to try. Thanks for any help!