Advanced search

Media (1)

Word: - Tags -/net art

Other articles (101)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • Support for all types of media

    10 April 2011

    Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and other formats (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used as a fallback.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both to conventional computers (...)

On other sites (13069)

  • Read ID3 tags generated using Apple's id3taggenerator

    9 October 2024, by Damiaan Dufaux

    Hi, I am creating an HLS stream with ID3 tags using Apple's HTTP Live Streaming (HLS) Tools, FFmpeg and Node.js. When I try to read out the stream using an AVPlayer and AVPlayerItemMetadataOutput on macOS, I'm not able to read the ID3 tags. When I use the same code to read a sample stream containing ID3 tags, I do see them popping up. What am I doing wrong?

    Reproduction:
    Streaming

    I generate an infinite HLS stream from a 5-minute-long MPEG-TS file using this command:

    ffmpeg -stream_loop -1 -re -i 5m.ts -c:v copy -c:a copy -f mpegts -strict -2 - | mediastreamsegmenter -b http://DamiaanTheoPro14.local:8081/ -f /tmp/hlsId3/video -D -m -M 50000 -log /tmp/hlsId3/log.txt

    I serve that HLS stream using the Node.js built-in http server.
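    The built-in http module does not serve files by itself, so a small static file server along these lines is enough for testing; the directory, port, and flat file layout below are assumptions based on the segmenter command above:

    ```javascript
    // Minimal static file server for the segmenter output; the root
    // directory, port, and flat file layout are assumptions.
    const http = require('http');
    const fs = require('fs');
    const path = require('path');

    const root = '/tmp/hlsId3';

    // Map HLS file extensions to MIME types so AVPlayer accepts them.
    function mimeFor(file) {
      switch (path.extname(file)) {
        case '.m3u8': return 'application/vnd.apple.mpegurl';
        case '.ts':   return 'video/mp2t';
        default:      return 'application/octet-stream';
      }
    }

    const server = http.createServer((req, res) => {
      // Playlist and segments sit flat in one directory, so the
      // basename of the request path (query string stripped) is enough.
      const file = path.join(root, path.basename(req.url.split('?')[0]));
      fs.readFile(file, (err, data) => {
        if (err) { res.writeHead(404); res.end(); return; }
        res.writeHead(200, { 'Content-Type': mimeFor(file) });
        res.end(data);
      });
    });

    // server.listen(8081);  // uncomment to serve on the port used above
    ```

    Serving the correct Content-Type for the .m3u8 playlist matters, since AVPlayer can refuse playlists delivered as plain text.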

    ID3 tag generation

    Then I emit some ID3 tags using the following commands:

    id3taggenerator -title foo -artist bar -a localhost:50000
    id3taggenerator -text "foo bar" -d "sample text" -r -a localhost:50000

    Reading

    Now to read out the tags I use this little SwiftUI app:
    import SwiftUI
import AVKit

let bipbopUrl = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8")!
let localUrl = URL(string: "http://damiaantheopro14.local:8081/prog_index.m3u8")!
let local = AVPlayerItem(url: localUrl)
let bipbop = AVPlayerItem(url: bipbopUrl)

struct ContentView: View {
    let player = AVPlayer()
    let observer = Id3Observer()
    var lastTag: AVMetadataGroup?
    
    var body: some View {
        VStack {
            HStack {
                Button("BipBop") {
                    player.replaceCurrentItem(with: bipbop)
                    bipbop.add(observer.metadataOutput)
                }
                Button("Local") {
                    player.replaceCurrentItem(with: local)
                    local.add(observer.metadataOutput)
                }
                Button("🅧") {
                    player.replaceCurrentItem(with: nil)
                }
            }
            VideoPlayer(player: player)
        }
        .padding()
    }
}

class Id3Observer: NSObject, AVPlayerItemMetadataOutputPushDelegate {
    let metadataOutput = AVPlayerItemMetadataOutput()
    
    override init() {
        super.init()
        metadataOutput.setDelegate(self, queue: .global())
    }
    
    func metadataOutput(_ output: AVPlayerItemMetadataOutput, didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup], from: AVPlayerItemTrack?) {
        print("metadataOutput", groups.count)
        print("\t", groups.map { group in
            group.items.map { item in
                "\(item.dataType) \(item.keySpace!) \(item.key!) \(item.time.seconds) \(item.duration.seconds.rounded())"
            }.joined(separator: "\n\t")
        }.joined(separator: "\n\t"))
    }
}


  • aarch64: Factorize code for CPU feature detection on Apple platforms

    12 March 2024, by Martin Storsjö
    aarch64: Factorize code for CPU feature detection on Apple platforms

    Signed-off-by: Martin Storsjö <martin@martin.st>

    • [DH] libavutil/aarch64/cpu.c
  • sandboxed Electron app can't use FFmpeg (Mac App Store)

    3 January 2021, by Martin

    I am trying to build an Electron application for the Mac App Store that uses FFmpeg.

    I can use fluent-ffmpeg locally fine, and it continues to work when I build my app for Windows/Mac/Linux, but when I build a sandboxed Mac App Store (MAS) .app file and sign the .app file, fluent-ffmpeg does not work anymore and throws an "ffmpeg was killed with signal SIGILL" error in the console.

    Fluent-ffmpeg gets set up with this JavaScript code:

    //begin get ffmpeg info
    const ffmpeg = require('fluent-ffmpeg');
    //Get the paths to the packaged versions of the binaries we want to use
    var ffmpegPath = require('ffmpeg-static-electron').path;
    ffmpegPath = ffmpegPath.replace('app.asar', 'app.asar.unpacked')
    var ffprobePath = require('ffprobe-static-electron').path;
    ffprobePath = ffprobePath.replace('app.asar', 'app.asar.unpacked')
    //tell the ffmpeg package where it can find the needed binaries.
    ffmpeg.setFfmpegPath(ffmpegPath)//("./src/ffmpeg/ffmpeg");
    ffmpeg.setFfprobePath(ffprobePath)//("./src/ffmpeg/ffprobe");
    //end set ffmpeg info


    I have looked into this issue some and found similar questions, like this Stack Overflow answer (link here), which says to compile a static ffmpeg executable and use that.

    So I learned how to compile FFmpeg from the command line using these commands in my Mac terminal:

    git clone https://git.ffmpeg.org/ffmpeg.git

    ./configure --pkg-config-flags="--static" --libdir=/usr/local/lib --extra-version=ntd_20150128 --disable-shared --enable-static --enable-gpl --enable-pthreads --enable-nonfree --enable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libx264 --enable-filters --enable-runtime-cpudetect

    make

    After a while, I get an ffmpeg folder, which I move to my Electron project's src folder /src/


    I place this ffmpeg folder inside my Electron /src folder and change my ffmpeg setup code to use my statically built folder like so:

    //begin get ffmpeg info
    const ffmpeg = require('fluent-ffmpeg');
    //Get the paths to the packaged versions of the binaries we want to use
    var ffmpegPath = require('ffmpeg-static-electron').path;
    ffmpegPath = ffmpegPath.replace('app.asar', 'app.asar.unpacked')
    var ffprobePath = require('ffprobe-static-electron').path;
    ffprobePath = ffprobePath.replace('app.asar', 'app.asar.unpacked')
    //tell the ffmpeg package where it can find the needed binaries.
    ffmpeg.setFfmpegPath(ffmpegPath)//("./src/ffmpeg/ffmpeg");
    ffmpeg.setFfprobePath(ffprobePath)//("./src/ffmpeg/ffprobe");
    //end set ffmpeg info
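    If the statically built binaries live under src/ffmpeg, the setup code presumably needs to resolve them directly instead of going through the ffmpeg-static-electron packages. A minimal sketch; the src/ffmpeg location and the helper name are assumptions, not code from the question:

    ```javascript
    const path = require('path');

    // Hypothetical helper: resolve a binary inside the app's src/ffmpeg
    // folder. __dirname is assumed to be the directory of this setup file.
    function staticBinaryPath(name) {
      return path.join(__dirname, 'ffmpeg', name);
    }

    const ffmpegPath = staticBinaryPath('ffmpeg');
    const ffprobePath = staticBinaryPath('ffprobe');

    // Then point fluent-ffmpeg at them (commented out here because the
    // module is only available inside the Electron project):
    // const ffmpeg = require('fluent-ffmpeg');
    // ffmpeg.setFfmpegPath(ffmpegPath);
    // ffmpeg.setFfprobePath(ffprobePath);
    ```

    Inside a packaged app the same app.asar / app.asar.unpacked substitution as in the original snippet would still be needed, since binaries cannot be executed from inside the asar archive.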


    And then build / sign my app with these commands:

    $ electron-builder build --mac

    $ sudo codesign --deep --force --verbose --sign '##(my dev id)####' dist/mas/Digify-mac.app

    But the final built .app file has the same error when trying to launch ffmpeg:

    ffmpeg was killed with signal SIGILL


    I've been trying to solve this issue myself but with no luck. There have been some recent posts about this on the Apple developer forums:
    https://developer.apple.com/forums/thread/87849

    but most of the other guides online are outdated.

    Can anyone please help me get FFmpeg working in a sandboxed Electron app for the Mac App Store? Any help would be much appreciated.