
Other articles (101)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Improvement of the base version

    13 September 2013

    A nicer multiple select
    The Chosen plugin improves the ergonomics of multiple-select fields. See the two images below for a comparison.
    To use it, activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-select lists (...)
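    For reference, the public-site behavior the plugin enables corresponds to Chosen's documented jQuery initialization; a sketch (the selector mirrors the one configured above, and jQuery plus the Chosen assets are assumed to be loaded on the page):

```javascript
// Enhance every multiple-select element matched by the configured
// selector with Chosen's searchable multi-select UI.
jQuery(function ($) {
  $('select[multiple]').chosen({ width: '100%' });
});
```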

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you want to use this archive for an installation in farm mode, you will also need to make other modifications (...)

On other sites (13050)

  • How do I get a video frame buffer stream from a connected GoPro camera?

    29 December 2024, by cleanrun

    I'm creating an app that connects to a GoPro camera; I want to get the frame buffer stream from the camera and use it in my iOS app (by converting the buffer data into CMSampleBuffer). I'm currently trying to use the FFmpeg library, but so far it doesn't work. Here's the logic I've implemented (I used ChatGPT to generate the code):

    import Foundation
import CoreMedia
import AVFoundation
import ffmpegkit


final class FFmpegBufferProcessor: AnyBufferProcessor {
    weak var delegate: BufferProcessorDelegate?
    
    private var pipePath: String = NSTemporaryDirectory() + "ffmpeg_pipe"
    private var isProcessing: Bool = false
    private var videoWidth = 1920
    private var videoHeight = 1080
    private let pixelFormat = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    
    init() {
        setupPipe()
    }
    
    deinit {
        cleanupPipe()
    }
    
    private func setupPipe() {
        do {
            if FileManager.default.fileExists(atPath: pipePath) {
                try FileManager.default.removeItem(atPath: pipePath)
            }
            
            // mkfifo takes a C string; Swift bridges String directly.
            // Passing the optional result of cString(using:) does not compile.
            let result = mkfifo(pipePath, 0o644)
            if result != 0 {
                print("\(#function); Pipe creation failed.")
                return
            }
        } catch {
            print("\(#function); Setup pipe error: \(error.localizedDescription)")
        }
    }
    
    private func cleanupPipe() {
        do {
            try FileManager.default.removeItem(atPath: pipePath)
        } catch {
            print("\(#function); Cleanup pipe error: \(error.localizedDescription)")
        }
    }
    
    func startProcessingStream(from udpURL: String) {
        guard !isProcessing else {
            print("\(#function); Already processing stream.")
            return
        }
        
        isProcessing = true
        let command = """
        -i \(udpURL) -f rawvideo -pix_fmt nv12 \(pipePath)
        """
        
        FFmpegKit.executeAsync(command) { [weak self] session in
            let returnCode = session?.getReturnCode()
            if ReturnCode.isSuccess(returnCode) {
                print("\(#function); FFmpeg session completed.")
            } else {
                print("\(#function); FFmpeg session error: \(String(describing: session?.getFailStackTrace())).")
            }
            
            self?.isProcessing = false
        }
        
        readFromPipe()
    }
    
    func stopProcessingStream() {
        isProcessing = false
        FFmpegKit.cancel()
    }
}

// MARK: - Private methods

private extension FFmpegBufferProcessor {
    func readFromPipe() {
        DispatchQueue.global(qos: .background).async { [weak self] in
            guard let self else { return }
            guard let fileHandle = FileHandle(forReadingAtPath: self.pipePath) else {
                print("\(#function); Failed to open a file handle at the pipe path.")
                return
            }
            
            while self.isProcessing {
                // Drain retained Data buffers every iteration, not once at the end.
                autoreleasepool {
                    let frameSize = self.videoWidth * self.videoHeight * 3 / 2
                    let rawData = fileHandle.readData(ofLength: frameSize)
                    
                    if rawData.isEmpty {
                        print("\(#function); Pipe closed / no more data to read.")
                        self.isProcessing = false
                        return
                    }
                    
                    self.handleRawFrameData(rawData)
                }
            }
            
            fileHandle.closeFile()
        }
    }
    
    func handleRawFrameData(_ data: Data) {
        let width = videoWidth
        let height = videoHeight
        
        // Creating the Pixel Buffer (if possible)
        guard let pixelBuffer = createPixelBuffer(from: data, width: width, height: height) else {
            print("\(#function); Failed to create pixel buffer")
            return
        }
        
        var timing = CMSampleTimingInfo(duration: CMTime(value: 1, timescale: 30), presentationTimeStamp: .zero, decodeTimeStamp: .invalid)
        // Creating the Sample Buffer (if possible)
        guard let sampleBuffer = createSampleBuffer(from: pixelBuffer, timing: &timing) else {
            print("\(#function); Failed to create sample buffer")
            return
        }
        
        delegate?.bufferProcessor(self, didOutput: sampleBuffer)
    }
}
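    The helpers createPixelBuffer(from:width:height:) and createSampleBuffer(from:timing:) are referenced but not shown; a minimal sketch of what they might look like for NV12 data (the copy logic and attribute choices here are my assumptions, not the actual code):

```swift
import CoreMedia
import CoreVideo
import Foundation

// Sketch: build an NV12 CVPixelBuffer by copying the raw planes row by row
// (the buffer's bytes-per-row may be padded beyond the video width).
func createPixelBuffer(from data: Data, width: Int, height: Int) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    let attrs: [String: Any] = [kCVPixelBufferIOSurfacePropertiesKey as String: [:]]
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                              attrs as CFDictionary, &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    data.withUnsafeBytes { (src: UnsafeRawBufferPointer) in
        // Luma plane: height rows of `width` bytes.
        if let yDest = CVPixelBufferGetBaseAddressOfPlane(buffer, 0) {
            let yStride = CVPixelBufferGetBytesPerRowOfPlane(buffer, 0)
            for row in 0..<height {
                memcpy(yDest + row * yStride, src.baseAddress! + row * width, width)
            }
        }
        // Interleaved CbCr plane: height/2 rows of `width` bytes.
        if let uvDest = CVPixelBufferGetBaseAddressOfPlane(buffer, 1) {
            let uvStride = CVPixelBufferGetBytesPerRowOfPlane(buffer, 1)
            let uvSrcOffset = width * height
            for row in 0..<(height / 2) {
                memcpy(uvDest + row * uvStride,
                       src.baseAddress! + uvSrcOffset + row * width, width)
            }
        }
    }
    return buffer
}

// Sketch: wrap the pixel buffer in a CMSampleBuffer with the given timing.
func createSampleBuffer(from pixelBuffer: CVPixelBuffer,
                        timing: inout CMSampleTimingInfo) -> CMSampleBuffer? {
    var formatDescription: CMVideoFormatDescription?
    guard CMVideoFormatDescriptionCreateForImageBuffer(
            allocator: kCFAllocatorDefault, imageBuffer: pixelBuffer,
            formatDescriptionOut: &formatDescription) == noErr,
          let format = formatDescription else { return nil }

    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault, imageBuffer: pixelBuffer,
        formatDescription: format, sampleTiming: &timing,
        sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}
```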

    Here are the logs I'm getting from FFmpeg:

    Debug log
    Also a quick note, I'm using AVSampleBufferDisplayLayer to enqueue and show the buffers, but obviously it doesn't show up.

    What should I do to fix this? Or is there another way to get the frame buffers from a GoPro camera and show them in iOS? Any help would be appreciated. Thank you.


  • avcodec/aactab: Make AAC encoder and decoders actually init-threadsafe

    22 November 2020, by Andreas Rheinhardt
    avcodec/aactab: Make AAC encoder and decoders actually init-threadsafe
    

    Commit 1a29804558c13ef512d9ef73a9b0d782af4fa5f2 guarded several
    initializations of static data in the AAC decoders with an AVOnce and
    set the FF_CODEC_CAP_INIT_THREADSAFE flag, believing the former to be
    sufficient for the latter. It wasn't, because several of these static
    tables are shared with other components, so that there might be data
    races if they are initialized from multiple threads. This affected
    initializing the ff_sine_* tables as well as initializing the
    ff_aac_pow*sf_tab tables (shared between both decoders and encoder) as
    well as ff_aac_kbd_* tables (shared between encoder and floating point
    decoder).

    Commit 3d62e7a30fa552be52d12b31e3e0f79153aff891 set the
    FF_CODEC_CAP_INIT_THREADSAFE flag for the AAC encoder. More explicitly,
    this commit used the same AVOnce to guard initializing ff_aac_pow*sf_tab
    in the encoder and to guard initializing the static data of each
    decoder; the ensuing catastrophe was "fixed" in commit
    ec0719264cb9a9d5cbaf225da48929aea24997a3 by using a single AVOnce
    for each codec again. But the codec cap has not been removed and
    therefore the encoder claimed to be init-threadsafe, but wasn't, because
    of the same tables as above.

    The ff_sine_* tables as well as the ff_aac_pow*sf_tab tables have already
    been fixed; this commit deals with the ff_aac_kbd_* tables, making the
    encoder as well as the floating-point decoder init-threadsafe (the
    fixed-point decoder already is).

    Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@gmail.com>

    • [DH] libavcodec/aactab.c
  • ffmpeg error: Pattern type 'glob' was selected but globbing is not supported by this libavformat build

    14 September 2017, by Aryan Naim

    I’m trying to convert a group of ".jpg" files, acting as individual frames, into a single MPEG video ".mp4".

    Example parameters I used:

    frame duration  = 2 secs
    frame rate      = 30  fps
    encoder         = libx264 (mpeg)
    input pattern   = "*.jpg"
    output pattern  = video.mp4

    Based on ffmpeg wiki instructions at (https://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images), I issued this command :

    ffmpeg -framerate 1/2 -pattern_type glob -i "*.jpg" -c:v libx264 -r 30 -pix_fmt yuv420p video.mp4

    But I’m getting this error :

    [image2 @ 049ab120] Pattern type 'glob' was selected but globbing is not
    supported by this libavformat build *.jpg: Function not implemented

    Which probably means pattern matching support differs in my build/version. By the way, this is my Windows 32-bit ffmpeg build (ffmpeg-20150702-git-03b2b40-win32-static).

    How can I choose a group of files using pattern matching with ffmpeg?
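    When a build lacks glob support, one common workaround is to rename the frames into a numbered sequence and use the image2 demuxer's sequence pattern (%03d), which is compiled into every build; a sketch, assuming the files already sort in the intended order:

```shell
# Rename the JPEGs to a zero-padded sequence the image2 demuxer understands
i=1
for f in *.jpg; do
  mv "$f" "$(printf 'img%03d.jpg' "$i")"
  i=$((i + 1))
done

# Sequence patterns work even when glob support is compiled out
ffmpeg -framerate 1/2 -i img%03d.jpg -c:v libx264 -r 30 -pix_fmt yuv420p video.mp4
```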