
Other articles (65)
-
Mediabox: opening images in the maximum space available to the user
8 February 2011 — Image display is constrained by the width allotted by the site's design (which depends on the theme in use), so images are shown at a reduced size. To take advantage of all the space available on the user's screen, a feature can be added that displays the image in a multimedia box appearing above the rest of the content.
To do this, the "Mediabox" plugin must be installed.
Configuring the multimedia box
As soon as (...) -
Standalone installation
4 February 2011 — Installing the MediaSPIP distribution involves several steps: retrieving the required files, for which two methods are possible: installing the ZIP archive containing the whole distribution, or fetching the sources of each module separately via SVN; preconfiguration; final installation.
[mediaspip_zip]Installing the MediaSPIP ZIP archive
This installation mode is the simplest way to install the whole distribution (...) -
Submitting bugs and patches
10 April 2011 — Unfortunately, no piece of software is ever perfect...
If you think you have found a bug, report it in our ticket system, taking care to include the relevant information: the type and exact version of the browser with which you encounter the anomaly; the most precise description possible of the problem; if possible, the steps to reproduce it; a link to the site/page in question.
If you think you have fixed the bug yourself (...)
On other sites (6624)
-
How to properly abort av_seek_frame in FFmpeg?
5 March 2021, by SuRGeoNix
Normally you wouldn't care about aborting av_seek_frame, as it would be really fast on a local file. However, in my case I use a custom AVIOContext for torrent streaming, with custom read/seek functions, and I'm not able to abort a single seek request!


I've already tried interrupt callbacks (they are not called at all) and various timeouts (rw_timeout, timeout, etc.), but checking FFmpeg's code I didn't find anything useful. My last resort was to return an error from the read/seek functions (I tried AVERROR_EXIT), which causes even more problems (memory leaks).


The main issue is with Matroska formats: they need to resync (level-1) and end up trying to scan the whole file.


Unfortunately, I'm using C# .NET with the FFmpeg.AutoGen bindings, which means I don't have low-level access to play around with. My workaround is to reopen the whole format context in case of a seek abort (to ensure that the player at least continues to play).


Hope you have a tip for me!


(By the way, interrupt callbacks are supported for some http/hls web formats.)


-
Have two blocking scripts interact with each other in Linux
18 November 2014, by Ortixx
I have two blocking shell scripts that I want to have interact with each other. The scripts in question are peerflix (a Node.js script) and ffmpeg (a simple bash script).
What happens: peerflix fires up and feeds data to the ffmpeg bash script, which terminates peerflix on completion.
So once peerflix starts, it outputs two lines and blocks immediately:
[08:15 PM]-[vagrant@packer-virtualbox-iso]-[/var/www/test]-[git master]
$ node /var/www/test/node/node_modules/peerflix/app.js /var/www/test/flexget/torrents/test.torrent -r -q
listening: http://10.0.2.15:38339/
process: 9601
I have to feed the listening address to the ffmpeg bash script:
#!/bin/sh
ffmpeg -ss 00:05:00 -i {THE_LISTENING_PORT} -frames:v 1 out1.jpg
ffmpeg -ss 00:10:00 -i {THE_LISTENING_PORT} -frames:v 1 out2.jpg
After the bash script is done, I have to kill the peerflix script (hence outputting the PID).
My question is: how do I achieve this?
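One way to sketch the orchestration the question describes, assuming peerflix prints the `listening:` line shown above (the `extract_url` helper, the temp-log approach, and the 30-second startup cap are our own, not part of peerflix):

```shell
#!/bin/sh
# Sketch: start peerflix in the background, poll its log for the listening
# address, run the ffmpeg thumbnail grabs, then kill peerflix.

PEERFLIX_APP=/var/www/test/node/node_modules/peerflix/app.js
TORRENT=/var/www/test/flexget/torrents/test.torrent

# extract_url LOGFILE: print the URL from a "listening: http://..." line
extract_url() {
    sed -n 's|^listening: \(http://[^ ]*\).*|\1|p' "$1"
}

run() {
    LOG=$(mktemp)
    node "$PEERFLIX_APP" "$TORRENT" -r -q > "$LOG" 2>&1 &
    PEERFLIX_PID=$!

    # Poll the log until peerflix reports its listening address (30 s cap)
    URL="" ; tries=0
    while [ -z "$URL" ] && [ "$tries" -lt 30 ]; do
        sleep 1
        URL=$(extract_url "$LOG")
        tries=$((tries + 1))
    done
    [ -n "$URL" ] || { kill "$PEERFLIX_PID" 2>/dev/null; rm -f "$LOG"; return 1; }

    # Grab the thumbnails, then kill peerflix now that ffmpeg is done
    ffmpeg -ss 00:05:00 -i "$URL" -frames:v 1 out1.jpg
    ffmpeg -ss 00:10:00 -i "$URL" -frames:v 1 out2.jpg
    kill "$PEERFLIX_PID"
    rm -f "$LOG"
}

# Only attempt the orchestration when peerflix is present at the path above
[ -f "$PEERFLIX_APP" ] && run
```

Note that `$!` gives the background PID directly, so the `process: 9601` line never needs to be parsed; only the `listening:` URL does.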
-
How can I record, encrypt (in memory), and mux audio and video on Android without the files getting out of sync?
9 February 2021, by Robert
We're attempting to save video and audio from an Android device into an encrypted file. Our current implementation pipes the outputs from the microphone and camera through the MediaCodec class. As the data is output from MediaCodec, we encrypt and write the contents of the byte buffer to disk. This approach works; however, when attempting to stitch the files back together with FFmpeg, we notice that the two streams get out of sync somewhere mid-stream. A lot of important metadata appears to be lost with this method, specifically presentation timestamps and frame-rate data, so FFmpeg has to do some guesswork to mux the files.


Are there techniques for keeping these streams in sync without using MediaMuxer? The video is encoded with H.264 and the audio with AAC.
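For the stitching step itself, the remux might be sketched as below (the filenames and the 30 fps rate are assumptions; decrypting back to raw `video.h264`/`audio.aac` elementary streams is presumed already done). Raw H.264 and AAC carry no container timestamps, so ffmpeg has to be told the frame rate and asked to generate presentation timestamps rather than guess:

```shell
# Remux raw elementary streams without re-encoding: declare the input frame
# rate (-r before -i), generate missing PTS, and stream-copy both tracks.
ffmpeg -fflags +genpts -r 30 -i video.h264 -i audio.aac \
       -c:v copy -c:a copy -shortest out.mp4
```

This only helps if the capture frame rate really was constant; with a variable rate, the original per-frame timestamps have to be preserved at recording time.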


Other approaches:
We attempted to use the MediaMuxer to mux the output data to a file, but our use case requires that we encrypt the bytes of data before they are saved to disk, which eliminates the possibility of using the default constructor.


Additionally, we tried the newly added (API 26) constructor that takes a FileDescriptor instead, and pointed it at a ParcelFileDescriptor wrapping an encrypted document (https://android.googlesource.com/platform/development/+/master/samples/Vault/src/com/example/android/vault/EncryptedDocument.java). However, this approach led to crashes at the native layer, and we believe it may have to do with this comment in the source code (https://android.googlesource.com/platform/frameworks/base.git/+/master/media/java/android/media/MediaMuxer.java#353) about the native writer trying to memory-map the output file.


import android.graphics.YuvImage
import android.media.MediaCodec
import android.media.MediaCodecInfo
import android.media.MediaFormat
import android.media.MediaMuxer
import com.callyo.video_10_21.Utils.YuvImageUtils.convertNV21toYUV420Planar
import java.io.FileDescriptor
import java.util.*
import java.util.concurrent.atomic.AtomicReference
import kotlin.properties.Delegates

class VideoEncoderProcessor(
 private val fileDescriptor: FileDescriptor,
 private val width: Int,
 private val height: Int,
 private val frameRate: Int
): MediaCodec.Callback() {
 private lateinit var videoFormat: MediaFormat
 private var trackIndex by Delegates.notNull<Int>()
 private var mediaMuxer: MediaMuxer
 private val mediaCodec = createEncoder()
 private val pendingVideoEncoderInputBufferIndices = AtomicReference<Queue<Int>>(LinkedList())

 companion object {
 private const val VIDEO_FORMAT = "video/avc"
 }

 init {
 mediaMuxer = MediaMuxer(fileDescriptor, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)
 mediaCodec.setCallback(this)
 mediaCodec.start()
 }

 private fun createEncoder(): MediaCodec {
 videoFormat = MediaFormat.createVideoFormat(VIDEO_FORMAT, width, height).apply {
 setInteger(MediaFormat.KEY_FRAME_RATE, frameRate)
 setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible)
 setInteger(MediaFormat.KEY_BIT_RATE, width * height * 5)
 setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1)
 }

 return MediaCodec.createEncoderByType(VIDEO_FORMAT).apply {
 configure(videoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
 }
 }

 override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
 // logic for handling stream end omitted for clarity

 /* Video frames come in asynchronously from input buffer availability
 * so we need to keep track of available buffers in queue */
 pendingVideoEncoderInputBufferIndices.get().add(index)
 }

 override fun onError(codec: MediaCodec, e: MediaCodec.CodecException) {}

 override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
 trackIndex = mediaMuxer.addTrack(format)
 mediaMuxer.start()
 }

 override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, bufferInfo: MediaCodec.BufferInfo) {
 val buffer = mediaCodec.getOutputBuffer(index)
 buffer?.apply {
 if (bufferInfo.size != 0) {
 limit(bufferInfo.offset + bufferInfo.size)
 rewind()
 mediaMuxer.writeSampleData(trackIndex, this, bufferInfo)
 }
 }

 mediaCodec.releaseOutputBuffer(index, false)

 if (bufferInfo.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM != 0) {
 mediaCodec.stop()
 mediaCodec.release()
 mediaMuxer.stop()
 mediaMuxer.release()
 }
 }

 // Public method that receives raw unencoded video data
 fun encode(yuvImage: YuvImage) {
 // logic for handling stream end omitted for clarity

 pendingVideoEncoderInputBufferIndices.get().poll()?.let { index ->
 val buffer = mediaCodec.getInputBuffer(index)
 buffer?.clear()
 // converting frame to correct color format
 val input =
 yuvImage.convertNV21toYUV420Planar(ByteArray(yuvImage.yuvData.size), yuvImage.width, yuvImage.height)
 buffer?.put(input)
 buffer?.let {
 mediaCodec.queueInputBuffer(index, 0, input.size, System.nanoTime() / 1000, 0)
 }
 }
 }
}





Additional info:
I'm using MediaCodec.Callback (https://developer.android.com/reference/kotlin/android/media/MediaCodec.Callback?hl=en) to handle the encoding asynchronously.