
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (44)
-
Installation in farm mode
4 February 2011, by
Farm mode makes it possible to host several MediaSPIP-type sites while installing the functional core only once.
This is the method we use on this very platform.
Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge, since SPIP's usual private area is no longer used.
First of all, you must have installed the same files as the installation (...) -
Emballe médias: what is it for?
4 February 2011, by
This plugin is designed to manage sites that publish documents of all types online.
It creates "media", meaning that a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether it is audio, video, image or text; only a single document can be linked to a "media" article; -
Keeping control of your media in your hands
13 April 2011, by
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media-sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)
On other sites (5233)
-
How can I generate a metadata.mov file for higher-resolution Live Photos (e.g. 1440x2560)?
30 July, by brijesh patel
I'm generating Live Photos programmatically for use as wallpapers on iOS. I'm using a known metadata.mov file bundled with the app (likely extracted from a working Live Photo with a resolution of 1080x1920). This setup works fine when I use a video of the same resolution.


I'm using this open-source library to handle the video-to-LivePhoto conversion:

https://github.com/TouSC/Video2LivePhoto

However, when I try using a higher-resolution video (e.g. 2560x1440) to avoid black bars on high-resolution devices (like iPhone 14 Pro Max), the Photos app shows the Live Photo, but the motion component doesn't work—it just says “Motion Not Available.”


I believe the issue is that the static metadata.mov contains resolution-specific metadata, which prevents it from working correctly with other video sizes.
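
One way to check that is to dump what the bundled metadata.mov actually contains before changing anything. The snippet below is only a diagnostic sketch (the inspectMetadataMov name is illustrative, not from the linked project); it prints the video track's natural size along with the metadata tracks and top-level items, which is where any 1080x1920-specific values would show up.

import AVFoundation

// Diagnostic sketch: prints the track sizes and metadata of the bundled
// metadata.mov so any resolution-specific values can be spotted.
func inspectMetadataMov() async throws {
    guard let url = Bundle.main.url(forResource: "metadata", withExtension: "mov") else {
        print("metadata.mov not found in bundle")
        return
    }
    let asset = AVURLAsset(url: url)

    // The video track's natural size is where a hard-coded 1080x1920 would show up.
    for track in try await asset.loadTracks(withMediaType: .video) {
        let size = try await track.load(.naturalSize)
        print("video track naturalSize:", size)
    }

    // The Live Photo pairing data lives in a timed metadata track.
    for track in try await asset.loadTracks(withMediaType: .metadata) {
        let formats = try await track.load(.formatDescriptions)
        print("metadata track formats:", formats)
    }

    // Top-level items, e.g. com.apple.quicktime.content.identifier.
    for item in try await asset.load(.metadata) {
        let value = try await item.load(.value)
        print(item.identifier?.rawValue ?? "unknown key", "=", String(describing: value))
    }
}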


-
Tried changing the resolution of the video (e.g. 1440x2560, 1284x2778) – motion breaks.


-
Tried generating a new .mov file using FFmpeg with a silent video track, matching the new resolution – Live Photo not recognized or shows errors.


-
Tried modifying the existing metadata.mov with tools like FFmpeg, AtomicParsley, Bento4, and mp4box – resulting files often break the Live Photo entirely.


-
I expected to generate a valid metadata.mov (or similar track) that would support the custom resolution and restore Live Photo motion support.


// Snippet from the questioner's conversion helper type; requires
// import AVFoundation (CMTime), CoreGraphics (CGSize) and Foundation.
// Converter4Video, removeFileIfExists(at:), generateCGImage(finalPath:) and
// generateLivePhoto(...) are helpers from the Video2LivePhoto-based project.
static func convertVideo(videoURL: URL, complete: @escaping (_ success: Bool, _ errorMessage: String?) -> Void) {
    print("start converting")

    // Bundled pairing file; its metadata appears to be tied to 1080x1920.
    guard let metaURL = Bundle.main.url(forResource: "metadata", withExtension: "mov") else {
        complete(false, "metadata.mov not found")
        return
    }

    let livePhotoSize = CGSize(width: 1440, height: 2560) // <-- updated resolution
    let livePhotoDuration = CMTime(value: 550, timescale: 600)
    let assetIdentifier = UUID().uuidString

    guard let documentPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first else {
        complete(false, "Document path not found")
        return
    }

    // Intermediate files for the trim, retime and resize passes.
    let durationPath = documentPath + "/duration.mp4"
    let acceleratePath = documentPath + "/accelerate.mp4"
    let resizePath = documentPath + "/resize.mp4"
    let finalPath = resizePath

    removeFileIfExists(at: durationPath)
    removeFileIfExists(at: acceleratePath)
    removeFileIfExists(at: resizePath)

    let converter = Converter4Video(path: finalPath)

    Task {
        do {
            // Trim to 3 s, retime to the Live Photo duration, then resize.
            try await converter.durationVideo(at: videoURL, outputPath: durationPath, targetDuration: 3)
            try await converter.accelerateVideo(at: durationPath, to: livePhotoDuration, outputPath: acceleratePath)
            try await converter.resizeVideo(at: acceleratePath, outputPath: resizePath, outputSize: livePhotoSize)

            print("### resize Success")
            let image = try await generateCGImage(finalPath: finalPath)

            // Pairs the still image with the processed video via metadata.mov.
            await generateLivePhoto(
                image: image,
                documentPath: documentPath,
                assetIdentifier: assetIdentifier,
                metaURL: metaURL,
                converter: converter,
                complete: complete
            )
        } catch {
            print("Video conversion error: \(error)")
            complete(false, error.localizedDescription)
            return
        }
    }
}
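
For reference, one direction that avoids a fixed-resolution metadata.mov altogether is to write the pairing metadata directly into the resized video. The sketch below is hypothetical (LivePhotoVideoWriter is not part of Video2LivePhoto) and follows the pairing scheme commonly used by open-source Live Photo generators: a com.apple.quicktime.content.identifier item at the asset level plus a timed com.apple.quicktime.still-image-time metadata track, with the video samples passed through unchanged so the target resolution is preserved. The still image produced by generateLivePhoto would still need to carry the same assetIdentifier for Photos to pair the two files.

import AVFoundation
import CoreMedia

// Hypothetical helper (not part of Video2LivePhoto): rewrites an already
// resized video into a .mov that carries the Live Photo pairing metadata,
// so a fixed-resolution metadata.mov is no longer needed.
enum LivePhotoVideoWriter {

    // Asset-level item tying the video to the paired still image.
    static func contentIdentifierItem(_ assetIdentifier: String) -> AVMetadataItem {
        let item = AVMutableMetadataItem()
        item.key = "com.apple.quicktime.content.identifier" as NSString
        item.keySpace = .quickTimeMetadata
        item.value = assetIdentifier as NSString
        item.dataType = "com.apple.metadata.datatype.UTF-8"
        return item
    }

    // Timed metadata input that will carry the "still image time" marker.
    static func stillImageTimeAdaptor() -> AVAssetWriterInputMetadataAdaptor {
        let spec: [NSString: Any] = [
            kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier as NSString:
                "mdta/com.apple.quicktime.still-image-time",
            kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType as NSString:
                "com.apple.metadata.datatype.int8"
        ]
        var desc: CMFormatDescription?
        _ = CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
            allocator: kCFAllocatorDefault,
            metadataType: kCMMetadataFormatType_Boxed,
            metadataSpecifications: [spec] as CFArray,
            formatDescriptionOut: &desc)
        let input = AVAssetWriterInput(mediaType: .metadata, outputSettings: nil, sourceFormatHint: desc)
        return AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
    }

    // Passes the (already resized) video samples through and adds the pairing metadata.
    static func write(videoURL: URL, to outputURL: URL, assetIdentifier: String) async throws {
        let asset = AVURLAsset(url: videoURL)
        guard let videoTrack = try await asset.loadTracks(withMediaType: .video).first else {
            throw NSError(domain: "LivePhotoVideoWriter", code: -1)
        }
        let formatHint = try await videoTrack.load(.formatDescriptions).first

        let reader = try AVAssetReader(asset: asset)
        let readerOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: nil) // passthrough
        reader.add(readerOutput)

        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        writer.metadata = [contentIdentifierItem(assetIdentifier)]

        let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil, sourceFormatHint: formatHint)
        videoInput.expectsMediaDataInRealTime = false
        videoInput.transform = try await videoTrack.load(.preferredTransform)
        writer.add(videoInput)

        let adaptor = stillImageTimeAdaptor()
        writer.add(adaptor.assetWriterInput)

        guard reader.startReading(), writer.startWriting() else {
            throw writer.error ?? reader.error ?? NSError(domain: "LivePhotoVideoWriter", code: -2)
        }
        writer.startSession(atSourceTime: .zero)

        // Mark the "still image time" roughly halfway through the clip.
        let duration = try await asset.load(.duration)
        let stillItem = AVMutableMetadataItem()
        stillItem.key = "com.apple.quicktime.still-image-time" as NSString
        stillItem.keySpace = .quickTimeMetadata
        stillItem.value = 0 as NSNumber
        stillItem.dataType = "com.apple.metadata.datatype.int8"
        while !adaptor.assetWriterInput.isReadyForMoreMediaData {
            try await Task.sleep(nanoseconds: 10_000_000)
        }
        _ = adaptor.append(AVTimedMetadataGroup(
            items: [stillItem],
            timeRange: CMTimeRange(start: CMTimeMultiplyByRatio(duration, multiplier: 1, divisor: 2),
                                   duration: CMTime(value: 1, timescale: 600))))

        // Copy the video samples unchanged so the target resolution is preserved.
        while let sample = readerOutput.copyNextSampleBuffer() {
            while !videoInput.isReadyForMoreMediaData {
                try await Task.sleep(nanoseconds: 10_000_000)
            }
            if !videoInput.append(sample) { break }
        }
        videoInput.markAsFinished()
        adaptor.assetWriterInput.markAsFinished()
        await writer.finishWriting()
    }
}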



-
libav converted audio file distorted
23 July 2015, by C0dR
I am trying to implement a file converter using libav. Currently I am testing just the audio conversion (for example AAC input and mp3 output). I am using the code from this question, Conversion from mp3 to aac/mp4 container (FFmpeg/c++), but the resulting output file sounds corrupted (too slow, noisy, distorted).
This is the result of converting AAC to mp3. The AAC input format is AV_SAMPLE_FMT_FLTP and the mp3 output format is AV_SAMPLE_FMT_S16P. It looks like part of one channel is inverted onto the other channel? I am using avresample to convert the audio data. I just can't find out what is wrong; I have already looked through the examples and, as far as I can see, I am doing it just like they do.
Here is my converter class: http://pastebin.com/c6hvrRaM (.h)
http://pastebin.com/u6iAPHZ9 (.cpp)
I know this is quite a lot, but I am getting desperate...
-
Playing encrypted m4a on Android
20 November 2013, by Fixee
I'm pulling down encrypted music, decrypting it on the fly into a plaintext m4a buffer, and I then want to play this music on Android (4.x). It appears the options are bleak:
1. Write the buffer to disk and use MediaPlayer() with a FileDescriptor
2. Write the buffer to disk and use a proxy like nano to serve MediaPlayer() via a URI
3. Decode the buffer to PCM and use AudioTrack to play it
Options 1 and 2 require writing plaintext to SD, which isn't acceptable. Option 3 requires decoding in software (with, e.g., ffmpeg), which seems ridiculous: if there are hardware decoders on the device, we can't use them, and if there is a software decoder, we can't access it and have to put yet another decoder (ffmpeg's) on the device.
Note that using OpenSL doesn't help at all: you still cannot play m4a's from a buffer. Is there another way?