
Media (1)
-
La conservation du net art au musée. Les stratégies à l’œuvre
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (101)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
Support for all types of media
10 April 2011
Unlike many software packages and other modern document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and other formats (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer flash player is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
On other sites (13069)
-
Read ID3 tags generated using Apple's id3taggenerator
9 October 2024, by Damiaan Dufaux
Hi, I am creating an HLS stream with ID3 tags using Apple's HTTP Live Streaming (HLS) Tools, FFmpeg and NodeJS. When I try to read out the stream using an AVPlayer and AVPlayerItemMetadataOutput on macOS, I'm not able to read out the ID3 tags. When I use the same code to read out a sample stream containing ID3 tags, I do see them popping up. What am I doing wrong?

Reproduction:


Streaming


I generate an infinite HLS stream from a 5-minute-long MPEG-TS file using this command:


ffmpeg -stream_loop -1 -re -i 5m.ts -c:v copy -c:a copy -f mpegts -strict -2 - | mediastreamsegmenter -b http://DamiaanTheoPro14.local:8081/ -f /tmp/hlsId3/video -D -m -M 50000 -log /tmp/hlsId3/log.txt



I serve that HLS stream using the Node.js builtin http-server.
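For reference, here is a minimal sketch of what that serving step can look like with Node's built-in http module; the port (8081) and the segment directory (/tmp/hlsId3/video, taken from the mediastreamsegmenter -f option above) are assumptions, not the exact script used in the question.

// serve-hls.js: tiny static file server for the segmenter output (sketch).
// Assumes mediastreamsegmenter writes prog_index.m3u8 and the .ts segments
// to /tmp/hlsId3/video and that the playlist is fetched on port 8081.
const http = require('http');
const fs = require('fs');
const path = require('path');

const ROOT = '/tmp/hlsId3/video';
const TYPES = {
  '.m3u8': 'application/vnd.apple.mpegurl',
  '.ts': 'video/mp2t',
};

http.createServer((req, res) => {
  // Strip any query string and map the URL onto the segment directory.
  const urlPath = req.url.split('?')[0];
  const file = path.join(ROOT, path.normalize(urlPath));
  fs.readFile(file, (err, data) => {
    if (err) {
      res.writeHead(404);
      res.end('not found');
      return;
    }
    res.writeHead(200, { 'Content-Type': TYPES[path.extname(file)] || 'application/octet-stream' });
    res.end(data);
  });
}).listen(8081);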


ID3 tag generation


Then I emit some ID3 tags using the following commands:


id3taggenerator -title foo -artist bar -a localhost:50000
id3taggenerator -text "foo bar" -d "sample text" -r -a localhost:50000



Reading


Now, to read out the tags, I use this little SwiftUI app:


import SwiftUI
import AVKit

// Apple's sample stream, which does carry ID3 timed metadata.
let bipbopUrl = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8")!
// The locally served stream produced by mediastreamsegmenter.
let localUrl = URL(string: "http://damiaantheopro14.local:8081/prog_index.m3u8")!
let local = AVPlayerItem(url: localUrl)
let bipbop = AVPlayerItem(url: bipbopUrl)

struct ContentView: View {
    let player = AVPlayer()
    let observer = Id3Observer()
    var lastTag: AVMetadataGroup?

    var body: some View {
        VStack {
            HStack {
                Button("BipBop") {
                    player.replaceCurrentItem(with: bipbop)
                    bipbop.add(observer.metadataOutput)
                }
                Button("Local") {
                    player.replaceCurrentItem(with: local)
                    local.add(observer.metadataOutput)
                }
                Button("🅧") {
                    player.replaceCurrentItem(with: nil)
                }
            }
            VideoPlayer(player: player)
        }
        .padding()
    }
}

// Receives timed metadata groups pushed by AVPlayerItemMetadataOutput.
class Id3Observer: NSObject, AVPlayerItemMetadataOutputPushDelegate {
    let metadataOutput = AVPlayerItemMetadataOutput()

    override init() {
        super.init()
        metadataOutput.setDelegate(self, queue: .global())
    }

    func metadataOutput(_ output: AVPlayerItemMetadataOutput, didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup], from: AVPlayerItemTrack?) {
        print("metadataOutput", groups.count)
        print("\t", groups.map { group in
            group.items.map { item in
                "\(item.dataType) \(item.keySpace!) \(item.key!) \(item.time.seconds) \(item.duration.seconds.rounded())"
            }.joined(separator: "\n\t")
        }.joined(separator: "\n\t"))
    }
}



-
aarch64: Factorize code for CPU feature detection on Apple platforms
12 March 2024, by Martin Storsjö
-
Sandboxed Electron app can't use ffmpeg (Mac App Store)
3 January 2021, by Martin
I am trying to build an Electron application for the Mac App Store that uses ffmpeg.


I can use fluent-ffmpeg fine locally, and it continues to work when I build my app for Windows/Mac/Linux. But when I build a sandboxed Mac App Store (MAS) .app file and sign it, fluent-ffmpeg no longer works and throws an "ffmpeg was killed with signal SIGILL" error in the console.


Fluent-ffmpeg gets set up with this JavaScript code:


//begin get ffmpeg info
const ffmpeg = require('fluent-ffmpeg');
//Get the paths to the packaged versions of the binaries we want to use
var ffmpegPath = require('ffmpeg-static-electron').path;
ffmpegPath = ffmpegPath.replace('app.asar', 'app.asar.unpacked');
var ffprobePath = require('ffprobe-static-electron').path;
ffprobePath = ffprobePath.replace('app.asar', 'app.asar.unpacked');
//tell the ffmpeg package where it can find the needed binaries.

ffmpeg.setFfmpegPath(ffmpegPath); //("./src/ffmpeg/ffmpeg");
ffmpeg.setFfprobePath(ffprobePath); //("./src/ffmpeg/ffprobe");

//end set ffmpeg info
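A quick way to check whether the packaged binary can even be launched in a given build is to query ffmpeg's capabilities right after this setup, using fluent-ffmpeg's getAvailableFormats call. The snippet below is only a diagnostic sketch; the log wording is illustrative.

// Diagnostic sketch: confirm the resolved ffmpeg binary actually starts.
// If the binary cannot run in the sandboxed build, the callback reports an
// error instead of a format list (for example the SIGILL failure above).
const ffmpegCheck = require('fluent-ffmpeg');
const checkPath = require('ffmpeg-static-electron').path.replace('app.asar', 'app.asar.unpacked');

ffmpegCheck.setFfmpegPath(checkPath);
console.log('ffmpeg path in use:', checkPath);
ffmpegCheck.getAvailableFormats((err, formats) => {
  if (err) {
    console.error('ffmpeg could not be launched:', err.message);
  } else {
    console.log('ffmpeg launched fine;', Object.keys(formats).length, 'formats reported');
  }
});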



I have looked into this issue a bit and found similar questions, like this Stack Overflow answer (link here), which says to compile a static ffmpeg executable and use that.


So I learned how to compile ffmpeg from the command line using these commands in my Mac terminal:


git clone https://git.ffmpeg.org/ffmpeg.git

cd ffmpeg

./configure --pkg-config-flags="--static" --libdir=/usr/local/lib --extra-version=ntd_20150128 --disable-shared --enable-static --enable-gpl --enable-pthreads --enable-nonfree --enable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libx264 --enable-filters --enable-runtime-cpudetect

make




After a while, I get an ffmpeg folder, which I move to my Electron project's /src folder.


I place this ffmpeg folder inside my Electron /src folder and change my ffmpeg setup code to use my statically built folder like so:


//begin get ffmpeg info
const ffmpeg = require('fluent-ffmpeg');
//tell the ffmpeg package where it can find the statically built binaries in /src/ffmpeg
ffmpeg.setFfmpegPath("./src/ffmpeg/ffmpeg");
ffmpeg.setFfprobePath("./src/ffmpeg/ffprobe");
//end set ffmpeg info



And then I build / sign my app with these commands:


$ electron-builder build --mac

$ sudo codesign --deep --force --verbose --sign '##(my dev id)####' dist/mas/Digify-mac.app
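For context, electron-builder can also be given the MAS signing details up front; the sketch below shows the general shape of such a configuration (the globs, plist paths and app id are assumptions, not taken from the question, and these keys would normally live in the "build" section of package.json).

// Sketch of an electron-builder configuration for a MAS target, shown as a
// JS object for the sake of inline comments. All paths and globs are assumptions.
module.exports = {
  appId: 'com.example.digify', // hypothetical bundle id
  // Keep the ffmpeg/ffprobe binaries outside app.asar so they remain executable.
  asarUnpack: [
    'node_modules/ffmpeg-static-electron/**',
    'node_modules/ffprobe-static-electron/**',
  ],
  mas: {
    // Entitlements for the sandboxed app itself (must include
    // com.apple.security.app-sandbox) and for bundled helper executables.
    entitlements: 'build/entitlements.mas.plist',
    entitlementsInherit: 'build/entitlements.mas.inherit.plist',
  },
};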



But the final built .app file has the same error when trying to launch ffmpeg:


ffmpeg was killed with signal SIGILL



I've been trying to solve this issue myself but with no luck. There have been some recent posts about this on the Apple developer forums:
https://developer.apple.com/forums/thread/87849


But most of the other guides online are outdated.


Can anyone please help me get ffmpeg working in a sandboxed Electron app for the Mac App Store? Any help would be much appreciated.