
Other articles (43)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.
-
Libraries and binaries specific to video and audio processing
31 January 2010
The following software and libraries are used by SPIPmotion in one way or another.
Required binaries
FFMpeg: the main encoder; it can transcode almost any type of video or audio file into formats playable on the Internet. See this tutorial for its installation.
Oggz-tools: tools for inspecting ogg files.
Mediainfo: retrieves information from most video and audio formats.
Complementary, optional binaries
flvtool2: (...)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used instead.
The HTML5 player was created specifically for MediaSPIP: its appearance can be fully customized to match a chosen theme.
These technologies make it possible to deliver video and audio both on conventional computers (...)
On other sites (5414)
-
Volume Detect on half second window sizes [duplicate]
14 June 2021, by user82395214
How can I get the max volume of individual slices of a video using FFMpeg?


Using this command, you can obtain the max volume for the whole clip:


ffmpeg -i video.avi -af "volumedetect" -vn -sn -dn -f null /dev/null


However, I want the max volume for every half-second window in the audio file. How can I achieve this with ffmpeg?
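
One approach worth trying (a sketch, not from the original post: the 24000-sample window assumes 48 kHz audio, and the exact metadata key should be checked against your FFmpeg build's astats documentation) is to cut the audio into fixed-size frames with asetnsamples and let astats report per-frame statistics:

ffmpeg -i video.avi -vn -sn -dn -af "asetnsamples=n=24000,astats=metadata=1:reset=1,ametadata=print:key=lavfi.astats.Overall.Peak_level" -f null /dev/null

Each printed line then covers one half-second window, with its timestamp and a Peak_level in dBFS, which is roughly what volumedetect reports as max_volume for the whole file.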


-
How to detect black screen at play time in ijkplayer?
31 August 2019, by seaguest
I am using ijkplayer, and I am wondering whether it is possible to know if the player is showing a black screen at play time.
Some older phone models do not support hardware decoding and end up with a black screen; I need to detect this case so I can switch to software decoding.
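
As far as I know ijkplayer does not expose a black-screen flag itself, so a common workaround is to capture a rendered frame shortly after playback starts and measure its brightness. The helper below is a minimal sketch (the function name and thresholds are made up, and it assumes you can obtain the frame as a CGImage, for example from a snapshot of the player view):

import CoreGraphics

/// Hypothetical helper (not an ijkplayer API): returns true when a captured
/// frame is essentially black.
func frameLooksBlack(_ frame: CGImage, luminanceThreshold: Double = 0.04) -> Bool {
    // Draw the frame into a tiny RGBA bitmap; 64x36 is plenty to spot a black frame.
    let width = 64, height = 36
    var pixels = [UInt8](repeating: 0, count: width * height * 4)

    return pixels.withUnsafeMutableBytes { (buffer: UnsafeMutableRawBufferPointer) -> Bool in
        guard let context = CGContext(data: buffer.baseAddress,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: width * 4,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return false }
        context.draw(frame, in: CGRect(x: 0, y: 0, width: width, height: height))

        // Average Rec. 601 luma over all pixels of the downscaled frame.
        var total = 0.0
        for i in stride(from: 0, to: buffer.count, by: 4) {
            let r = Double(buffer[i]), g = Double(buffer[i + 1]), b = Double(buffer[i + 2])
            total += (0.299 * r + 0.587 * g + 0.114 * b) / 255.0
        }
        return total / Double(width * height) < luminanceThreshold
    }
}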
-
Swift - Workaround/Alternative to M3u8 to play mp4 segment or merge segments into mp4
14 May 2020, by STerrier
I used AVAssetExportSession to download a session URL, but the issue is that you can't download a live stream. To get around that, the live stream is split into 10-second mp4 segments whose URLs are built from an m3u8 playlist. I then use AVAssetExportSession to merge those mp4 segments.



I can merge those clips one by one into a single mp4 file, which is what I want, but the bigger the file gets, the longer each merge takes; since I am dealing with thousands of segments, this becomes impractical.



I thought about using AVPlayerLooper, but then I cannot scrub, rewind or fast-forward through the mp4 segments as if they were a single video.



Is there a way to combine the mp4 clips so that they play as one video, the way the m3u8 does, without merging? Or is there a fast way to merge videos?
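
One way to get the behaviour the m3u8 gives you, without exporting anything, is to append every downloaded segment to a single AVMutableComposition and play that composition directly with an AVPlayer; the result is one continuous, scrubbable timeline. The sketch below is only an outline (the segmentURLs array and the function name are not from the original code):

import AVFoundation

/// Builds one scrubbable timeline from already-downloaded mp4 segments
/// without exporting a merged file. `segmentURLs` must be in playback order.
func makePlayerItem(from segmentURLs: [URL]) -> AVPlayerItem? {
    let composition = AVMutableComposition()
    guard
        let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                     preferredTrackID: kCMPersistentTrackID_Invalid),
        let audioTrack = composition.addMutableTrack(withMediaType: .audio,
                                                     preferredTrackID: kCMPersistentTrackID_Invalid)
    else { return nil }

    var cursor = CMTime.zero
    for url in segmentURLs {
        let asset = AVAsset(url: url)
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        do {
            if let v = asset.tracks(withMediaType: .video).first {
                try videoTrack.insertTimeRange(range, of: v, at: cursor)
            }
            if let a = asset.tracks(withMediaType: .audio).first {
                try audioTrack.insertTimeRange(range, of: a, at: cursor)
            }
        } catch {
            print("Could not append segment \(url.lastPathComponent): \(error)")
            return nil
        }
        cursor = CMTimeAdd(cursor, asset.duration)
    }
    return AVPlayerItem(asset: composition)
}

// Usage (hypothetical): let player = AVPlayer(playerItem: makePlayerItem(from: segmentURLs)!)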



Note: The server uses FFmpeg, but I am not allowed to use FFmpeg or pods in the app.



Below is the function I use to merge the videos:



// videoInt, mergeUrl and deleteCurrentSegementsInFolder() are defined elsewhere in the class.
var mp4Array: [AVAsset] = []
var avAssetExportSession: AVAssetExportSession?

var firstAsset: AVAsset?
var secondAsset: AVAsset?

func mergeVideos() {
    firstAsset = mp4Array.first
    secondAsset = mp4Array[1]

    guard let firstAsset = firstAsset, let secondAsset = secondAsset else { return }
    let mixComposition = AVMutableComposition()

    // Composition track for the first segment.
    guard let firstTrack = mixComposition.addMutableTrack(withMediaType: .video,
                                                          preferredTrackID: Int32(kCMPersistentTrackID_Invalid)) else { return }
    do {
        try firstTrack.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: firstAsset.duration),
                                       of: firstAsset.tracks(withMediaType: .video)[0],
                                       at: CMTime.zero)
    } catch {
        print("Couldn't load track 1")
        return
    }

    // Composition track for the second segment, appended after the first.
    guard let secondTrack = mixComposition.addMutableTrack(withMediaType: .video,
                                                           preferredTrackID: Int32(kCMPersistentTrackID_Invalid)) else { return }
    do {
        try secondTrack.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: secondAsset.duration),
                                        of: secondAsset.tracks(withMediaType: .video)[0],
                                        at: firstAsset.duration)
    } catch {
        print("Couldn't load track 2")
        return
    }

    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(start: CMTime.zero,
                                                duration: CMTimeAdd(firstAsset.duration, secondAsset.duration))

    // Hide the first track once its segment has finished playing.
    let firstAssetInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: firstTrack)
    firstAssetInstruction.setOpacity(0.0, at: firstAsset.duration)

    let secondAssetInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: secondTrack)

    mainInstruction.layerInstructions = [firstAssetInstruction, secondAssetInstruction]
    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
    mainComposition.renderSize = firstTrack.naturalSize

    guard let documentDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first else { return }
    let url = documentDirectory.appendingPathComponent("MergedVideos/mergeVideo\(videoInt).mp4")

    guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) else { return }

    exporter.outputURL = url
    exporter.outputFileType = AVFileType.mp4
    exporter.shouldOptimizeForNetworkUse = true
    exporter.videoComposition = mainComposition

    exporter.exportAsynchronously {
        if exporter.status == .completed {
            let avasset = AVAsset(url: url)
            self.mergeUrl = avasset
            if self.mp4Array.count > 1 {
                print("This adds the merged video to the front of the mp4Array")
                self.mp4Array.remove(at: 1)
                self.mp4Array.removeFirst()
                self.videoInt = self.videoInt + 1
                self.mp4Array.append(self.mergeUrl!)
                // bringToFront is a custom Array extension (not shown).
                self.mp4Array.bringToFront(item: self.mp4Array.last!)
            }

            if self.mp4Array.count > 1 {
                if self.mergeUrl != nil {
                    // More segments left: recurse and merge the next one into the result.
                    self.mergeVideos()
                }
            } else {
                // Done: delete the intermediate merge files.
                var numberofvideosdeleted = 0
                while numberofvideosdeleted < self.videoInt - 1 {
                    let url = documentDirectory.appendingPathComponent("MergedVideos/mergeVideo\(numberofvideosdeleted).mp4")
                    do {
                        print("deleting")
                        try FileManager.default.removeItem(at: url)
                    } catch {
                        print("Error removing videos")
                    }
                    // Always advance, otherwise a failed delete would loop forever.
                    numberofvideosdeleted = numberofvideosdeleted + 1
                }

                self.deleteCurrentSegementsInFolder()
            }
        }
    }
}
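
As a side note on speed: the recursion above re-exports everything that has already been merged on every pass, so the total work grows roughly quadratically with the number of segments. A single-pass variant (again only a sketch; exportAllSegments and its parameters are hypothetical names, and it handles video tracks only) inserts every asset into one composition and runs AVAssetExportSession exactly once:

/// Hypothetical single-pass merge: every segment is appended to one composition
/// and exported exactly once, instead of re-exporting the growing file each time.
func exportAllSegments(_ assets: [AVAsset], to outputURL: URL,
                       completion: @escaping (Bool) -> Void) {
    let composition = AVMutableComposition()
    guard let track = composition.addMutableTrack(withMediaType: .video,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid)
    else { completion(false); return }

    var cursor = CMTime.zero
    for asset in assets {
        guard let source = asset.tracks(withMediaType: .video).first,
              (try? track.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                          of: source, at: cursor)) != nil
        else { completion(false); return }
        cursor = CMTimeAdd(cursor, asset.duration)
    }

    guard let exporter = AVAssetExportSession(asset: composition,
                                              presetName: AVAssetExportPresetHighestQuality)
    else { completion(false); return }
    exporter.outputURL = outputURL
    exporter.outputFileType = .mp4
    exporter.shouldOptimizeForNetworkUse = true
    exporter.exportAsynchronously { completion(exporter.status == .completed) }
}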