
Media (91)

Other articles (39)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. To help us fix it, please provide the following information: the browser you are using, including its exact version; as precise a description of the problem as possible; if possible, the steps that led to the problem; and a link to the site / page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • (De)activating features (plugins)

    18 February 2011, by

    To manage the addition and removal of extra features (plugins), MediaSPIP relies on SVP as of version 0.2.
    SVP makes it easy to activate plugins from the MediaSPIP configuration area.
    To access it, simply go to the configuration area and open the "Gestion des plugins" (plugin management) page.
    By default, MediaSPIP ships with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work perfectly with each (...)

  • The plugin: Podcasts

    14 July 2010, by

    The podcasting problem is, once again, a problem that reveals the state of standardization of data transport on the Internet.
    Two interesting formats exist: the one developed by Apple, strongly geared toward the use of iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "open" and is notably backed by Yahoo and the Miro software.
    File types supported in feeds
    Apple's format only allows the following types in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

On other sites (6040)

  • How to stream synchronized video and audio in real-time from an Android smartphone using HLS while preserving orientation metadata?

    6 March, by Jérôme LAROSE
    Hello,  
I am working on an Android application where I need to stream video
from one or two cameras on my smartphone, along with audio from the
microphone, in real-time via a link or web page accessible to users.
The stream should be live, allow rewinding (DVR functionality), and be
recorded simultaneously. A latency of 1 to 2 minutes is acceptable,
and the streaming is one-way.  

I have chosen HLS (HTTP Live Streaming) for its browser compatibility
and DVR support. However, I am encountering issues with audio-video
synchronization, managing camera orientation metadata, and format
conversions.


    


    Here are my attempts:

    1. MP4 segmentation with MediaRecorder

      • I used MediaRecorder with setNextOutputFile to generate short MP4 segments, then ffmpeg-kit to convert them to fMP4 for HLS.
      • Expected: Well-aligned segments for smooth HLS playback.
      • Result: Timestamp issues causing jumps or interruptions in playback.

    2. MPEG2-TS via local socket

      • I configured MediaRecorder to produce an MPEG2-TS stream sent via a local socket to ffmpeg-kit.
      • Expected: Stable streaming with preserved metadata.
      • Result: Streaming works, but orientation metadata is lost, leading to incorrectly oriented video (e.g., rotated 90°).

    3. Orientation correction with ffmpeg

      • I tested -vf transpose=1 in ffmpeg to correct the orientation.
      • Expected: Correctly oriented video without excessive latency.
      • Result: Re-encoding takes too long for real-time streaming, causing unacceptable latency.

    4. MPEG2-TS to fMP4 conversion

      • I converted the MPEG2-TS stream to fMP4 with ffmpeg to preserve orientation.
      • Expected: Perfect audio-video synchronization.
      • Result: Slight desynchronization between audio and video, affecting the user experience.

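    As an aside on the rotation attempts: rotation can also travel as metadata instead of re-encoded pixels, which avoids the latency of attempt 3. A minimal remux sketch in Python (hypothetical paths; this only applies to MP4/fMP4 outputs, since the MPEG-TS muxer carries no rotation metadata, which is also why attempt 2 loses the orientation):

import subprocess

# Remux without re-encoding and write the classic mov/mp4 rotation tag,
# letting players apply the display matrix themselves. The paths and the
# angle are illustrative.
subprocess.run([
    "ffmpeg",
    "-i", "input.ts",                 # hypothetical MPEG-TS capture
    "-c", "copy",                     # no re-encoding, so no added latency
    "-bsf:a", "aac_adtstoasc",        # ADTS AAC -> MP4-style AAC
    "-metadata:s:v:0", "rotate=90",   # rotation tag instead of transpose
    "output.mp4",
], check=True)

    Newer ffmpeg builds also expose a -display_rotation input option for the same purpose, where available.
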
    I am looking for a solution to:

    • Stream an HLS feed from Android with correctly timestamped segments.
    • Preserve orientation metadata without heavy re-encoding.
    • Ensure perfect audio-video synchronization.

    UPDATE

    package com.example.angegardien

import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import android.graphics.SurfaceTexture
import android.hardware.camera2.*
import android.media.*
import android.os.*
import android.util.Log
import android.view.Surface
import android.view.TextureView
import android.view.WindowManager
import androidx.activity.ComponentActivity
import androidx.core.app.ActivityCompat
import com.arthenica.ffmpegkit.FFmpegKit
import fi.iki.elonen.NanoHTTPD
import kotlinx.coroutines.*
import java.io.File
import java.io.IOException
import java.net.ServerSocket
import android.view.OrientationEventListener

/**
 * MainActivity class:
 * - Manages camera operations using the Camera2 API.
 * - Records video using MediaRecorder.
 * - Pipes data to FFmpeg to generate HLS segments.
 * - Hosts a local HLS server using NanoHTTPD to serve the generated HLS content.
 */
class MainActivity : ComponentActivity() {

    // TextureView used for displaying the camera preview.
    private lateinit var textureView: TextureView
    // Camera device instance.
    private lateinit var cameraDevice: CameraDevice
    // Camera capture session for managing capture requests.
    private lateinit var cameraCaptureSession: CameraCaptureSession
    // CameraManager to access camera devices.
    private lateinit var cameraManager: CameraManager
    // Directory where HLS output files will be stored.
    private lateinit var hlsDir: File
    // Instance of the HLS server.
    private lateinit var hlsServer: HlsServer

    // Camera id ("1" corresponds to the rear camera).
    private val cameraId = "1"
    // Flag indicating whether recording is currently active.
    private var isRecording = false

    // MediaRecorder used for capturing audio and video.
    private lateinit var activeRecorder: MediaRecorder
    // Surface for the camera preview.
    private lateinit var previewSurface: Surface
    // Surface provided by MediaRecorder for recording.
    private lateinit var recorderSurface: Surface

    // Port for the FFmpeg local socket connection.
    private val ffmpegPort = 8080

    // Coroutine scope to manage asynchronous tasks.
    private val scope = CoroutineScope(Dispatchers.IO + SupervisorJob())

    // Variables to track current device rotation and listen for orientation changes.
    private var currentRotation = 0
    private lateinit var orientationListener: OrientationEventListener

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Initialize the TextureView and set it as the content view.
        textureView = TextureView(this)
        setContentView(textureView)

        // Get the CameraManager system service.
        cameraManager = getSystemService(CAMERA_SERVICE) as CameraManager
        // Setup the directory for HLS output.
        setupHLSDirectory()

        // Start the local HLS server on port 8081.
        hlsServer = HlsServer(8081, hlsDir, this)
        try {
            hlsServer.start()
            Log.d("HLS_SERVER", "HLS Server started on port 8081")
        } catch (e: IOException) {
            Log.e("HLS_SERVER", "Error starting HLS Server", e)
        }

        // Initialize the current rotation.
        currentRotation = getDeviceRotation()

        // Add a listener to detect orientation changes.
        orientationListener = object : OrientationEventListener(this) {
            override fun onOrientationChanged(orientation: Int) {
                if (orientation == ORIENTATION_UNKNOWN) return // Skip unknown orientations.
                // Determine the new rotation angle.
                val newRotation = when {
                    orientation >= 315 || orientation < 45 -> 0
                    orientation >= 45 && orientation < 135 -> 90
                    orientation >= 135 && orientation < 225 -> 180
                    orientation >= 225 && orientation < 315 -> 270
                    else -> 0
                }
                // If the rotation has changed and recording is active, update the rotation.
                if (newRotation != currentRotation && isRecording) {
                    Log.d("ROTATION", "Orientation change detected: $newRotation")
                    currentRotation = newRotation
                }
            }
        }
        orientationListener.enable()

        // Set up the TextureView listener to know when the surface is available.
        textureView.surfaceTextureListener = object : TextureView.SurfaceTextureListener {
            override fun onSurfaceTextureAvailable(surface: SurfaceTexture, width: Int, height: Int) {
                // Open the camera when the texture becomes available.
                openCamera()
            }
            override fun onSurfaceTextureSizeChanged(surface: SurfaceTexture, width: Int, height: Int) {}
            override fun onSurfaceTextureDestroyed(surface: SurfaceTexture) = false
            override fun onSurfaceTextureUpdated(surface: SurfaceTexture) {}
        }
    }

    /**
     * Sets up the HLS directory in the public Downloads folder.
     * If the directory exists, it deletes it recursively and creates a new one.
     */
    private fun setupHLSDirectory() {
        val downloadsDir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS)
        hlsDir = File(downloadsDir, "HLS_Output")

        if (hlsDir.exists()) {
            hlsDir.deleteRecursively()
        }
        hlsDir.mkdirs()

        Log.d("HLS", "📂 HLS folder created: ${hlsDir.absolutePath}")
    }

    /**
     * Opens the camera after checking for necessary permissions.
     */
    private fun openCamera() {
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED ||
            ActivityCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            // Request permissions if they are not already granted.
            ActivityCompat.requestPermissions(this, arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO), 101)
            return
        }

        try {
            // Open the specified camera using its cameraId.
            cameraManager.openCamera(cameraId, object : CameraDevice.StateCallback() {
                override fun onOpened(camera: CameraDevice) {
                    cameraDevice = camera
                    // Start the recording session once the camera is opened.
                    startNextRecording()
                }
                override fun onDisconnected(camera: CameraDevice) { camera.close() }
                override fun onError(camera: CameraDevice, error: Int) { camera.close() }
            }, null)
        } catch (e: CameraAccessException) {
            e.printStackTrace()
        }
    }

    /**
     * Starts a new recording session:
     * - Sets up the preview and recorder surfaces.
     * - Creates a pipe for MediaRecorder output.
     * - Creates a capture session for simultaneous preview and recording.
     */
    private fun startNextRecording() {
        // Get the SurfaceTexture from the TextureView and set its default buffer size.
        val texture = textureView.surfaceTexture!!
        texture.setDefaultBufferSize(1920, 1080)
        // Create the preview surface.
        previewSurface = Surface(texture)

        // Create and configure the MediaRecorder.
        activeRecorder = createMediaRecorder()

        // Create a pipe to route MediaRecorder data.
        val pipe = ParcelFileDescriptor.createPipe()
        val pfdWrite = pipe[1] // Write end used by MediaRecorder.
        val pfdRead = pipe[0]  // Read end used by the local socket server.

        // Set MediaRecorder output to the file descriptor of the write end.
        activeRecorder.setOutputFile(pfdWrite.fileDescriptor)
        setupMediaRecorder(activeRecorder)
        // Obtain the recorder surface from MediaRecorder.
        recorderSurface = activeRecorder.surface

        // Create a capture request using the RECORD template.
        val captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD)
        captureRequestBuilder.addTarget(previewSurface)
        captureRequestBuilder.addTarget(recorderSurface)

        // Create a capture session including both preview and recorder surfaces.
        cameraDevice.createCaptureSession(
            listOf(previewSurface, recorderSurface),
            object : CameraCaptureSession.StateCallback() {
                override fun onConfigured(session: CameraCaptureSession) {
                    cameraCaptureSession = session
                    captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO)
                    // Start a continuous capture request.
                    cameraCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), null, null)

                    // Launch a coroutine to start FFmpeg and MediaRecorder with synchronization.
                    scope.launch {
                        startFFmpeg()
                        delay(500) // Wait for FFmpeg to be ready.
                        activeRecorder.start()
                        isRecording = true
                        Log.d("HLS", "🎥 Recording started...")
                    }

                    // Launch a coroutine to run the local socket server to forward data.
                    scope.launch {
                        startLocalSocketServer(pfdRead)
                    }
                }
                override fun onConfigureFailed(session: CameraCaptureSession) {
                    Log.e("Camera2", "❌ Configuration failed")
                }
            },
            null
        )
    }

    /**
     * Coroutine to start a local socket server.
     * It reads from the MediaRecorder pipe and sends the data to FFmpeg.
     */
    private suspend fun startLocalSocketServer(pfdRead: ParcelFileDescriptor) {
        withContext(Dispatchers.IO) {
            val serverSocket = ServerSocket(ffmpegPort)
            Log.d("HLS", "Local socket server started on port $ffmpegPort")

            // Accept connection from FFmpeg.
            val socket = serverSocket.accept()
            Log.d("HLS", "Connection accepted from FFmpeg")

            // Read data from the pipe and forward it through the socket.
            val inputStream = ParcelFileDescriptor.AutoCloseInputStream(pfdRead)
            val outputStream = socket.getOutputStream()
            val buffer = ByteArray(8192)
            var bytesRead: Int
            while (inputStream.read(buffer).also { bytesRead = it } != -1) {
                outputStream.write(buffer, 0, bytesRead)
            }
            outputStream.close()
            inputStream.close()
            socket.close()
            serverSocket.close()
        }
    }

    /**
     * Coroutine to start FFmpeg using a local TCP input.
     * Applies a video rotation filter based on device orientation and generates HLS segments.
     */
    private suspend fun startFFmpeg() {
        withContext(Dispatchers.IO) {
            // Retrieve the appropriate transpose filter based on current rotation.
            val transposeFilter = getTransposeFilter(currentRotation)

            // FFmpeg command to read from the TCP socket and generate an HLS stream.
            // Two alternative commands are commented below.
            // val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -c copy -bsf:a aac_adtstoasc -movflags +faststart -f dash -seg_duration 10 -hls_playlist 1 ${hlsDir.absolutePath}/manifest.mpd"
            // val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -c copy -bsf:a aac_adtstoasc -movflags +faststart -f hls -hls_time 5 -hls_segment_type fmp4 -hls_flags split_by_time -hls_list_size 0 -hls_playlist_type event -hls_fmp4_init_filename init.mp4 -hls_segment_filename ${hlsDir.absolutePath}/segment_%03d.m4s ${hlsDir.absolutePath}/playlist.m3u8"
            val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -vf $transposeFilter -c:v libx264 -preset ultrafast -crf 23 -c:a copy -movflags +faststart -f hls -hls_time 0.1 -hls_segment_type mpegts -hls_flags split_by_time -hls_list_size 0 -hls_playlist_type event -hls_segment_filename ${hlsDir.absolutePath}/segment_%03d.ts ${hlsDir.absolutePath}/playlist.m3u8"

            FFmpegKit.executeAsync(ffmpegCommand) { session ->
                if (session.returnCode.isValueSuccess) {
                    Log.d("HLS", "✅ HLS generated successfully")
                } else {
                    Log.e("FFmpeg", "❌ Error generating HLS: ${session.allLogsAsString}")
                }
            }
        }
    }

    /**
     * Gets the current device rotation using the WindowManager.
     */
    private fun getDeviceRotation(): Int {
        val windowManager = getSystemService(Context.WINDOW_SERVICE) as WindowManager
        return when (windowManager.defaultDisplay.rotation) {
            Surface.ROTATION_0 -> 0
            Surface.ROTATION_90 -> 90
            Surface.ROTATION_180 -> 180
            Surface.ROTATION_270 -> 270
            else -> 0
        }
    }

    /**
     * Returns the FFmpeg transpose filter based on the rotation angle.
     * Used to rotate the video stream accordingly.
     */
    private fun getTransposeFilter(rotation: Int): String {
        return when (rotation) {
            90 -> "transpose=1" // 90° clockwise
            180 -> "transpose=2,transpose=2" // 180° rotation
            270 -> "transpose=2" // 90° counter-clockwise
            else -> "transpose=0" // No rotation
        }
    }

    /**
     * Creates and configures a MediaRecorder instance.
     * Sets up audio and video sources, formats, encoders, and bitrates.
     */
    private fun createMediaRecorder(): MediaRecorder {
        return MediaRecorder().apply {
            setAudioSource(MediaRecorder.AudioSource.MIC)
            setVideoSource(MediaRecorder.VideoSource.SURFACE)
            setOutputFormat(MediaRecorder.OutputFormat.MPEG_2_TS)
            setVideoEncodingBitRate(5000000)
            setVideoFrameRate(24)
            setVideoSize(1080, 720)
            setVideoEncoder(MediaRecorder.VideoEncoder.H264)
            setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
            setAudioSamplingRate(16000)
            setAudioEncodingBitRate(96000) // 96 kbps
        }
    }

    /**
     * Prepares the MediaRecorder and logs the outcome.
     */
    private fun setupMediaRecorder(recorder: MediaRecorder) {
        try {
            recorder.prepare()
            Log.d("HLS", "✅ MediaRecorder prepared")
        } catch (e: IOException) {
            Log.e("HLS", "❌ Error preparing MediaRecorder", e)
        }
    }

    /**
     * Custom HLS server class extending NanoHTTPD.
     * Serves HLS segments and playlists from the designated HLS directory.
     */
    private inner class HlsServer(port: Int, private val hlsDir: File, private val context: Context) : NanoHTTPD(port) {
        override fun serve(session: IHTTPSession): Response {
            val uri = session.uri.trimStart('/')

            // Intercept the request for `init.mp4` and serve it from assets.
            /*
            if (uri == "init.mp4") {
                Log.d("HLS Server", "📡 Intercepting init.mp4, sending file from assets...")
                return try {
                    val assetManager = context.assets
                    val inputStream = assetManager.open("init.mp4")
                    newFixedLengthResponse(Response.Status.OK, "video/mp4", inputStream, inputStream.available().toLong())
                } catch (e: Exception) {
                    Log.e("HLS Server", "❌ Error reading init.mp4 from assets: ${e.message}")
                    newFixedLengthResponse(Response.Status.INTERNAL_ERROR, MIME_PLAINTEXT, "Server error")
                }
            }
            */

            // Serve all other HLS files normally from the hlsDir.
            val file = File(hlsDir, uri)
            return if (file.exists()) {
                newFixedLengthResponse(Response.Status.OK, getMimeTypeForFile(uri), file.inputStream(), file.length())
            } else {
                newFixedLengthResponse(Response.Status.NOT_FOUND, MIME_PLAINTEXT, "File not found")
            }
        }
    }

    /**
     * Clean up resources when the activity is destroyed.
     * Stops recording, releases the camera, cancels coroutines, and stops the HLS server.
     */
    override fun onDestroy() {
        super.onDestroy()
        if (isRecording) {
            activeRecorder.stop()
            activeRecorder.release()
        }
        cameraDevice.close()
        scope.cancel()
        hlsServer.stop()
        orientationListener.disable()
        Log.d("HLS", "🛑 Activity destroyed")
    }
}
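
    As a quick diagnostic for the timestamp jumps seen in attempt 1, the packet timestamps of the generated segments can be inspected directly. A small ffprobe sketch in Python (the segment names assume the -hls_segment_filename pattern used above):

import json
import subprocess

# Print the first and last video packet timestamps of each segment so
# gaps or overlaps between consecutive segments become visible.
def segment_span(path: str) -> tuple[float, float]:
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-select_streams", "v:0", "-show_entries", "packet=pts_time", path],
        capture_output=True, text=True, check=True,
    ).stdout
    pts = [float(p["pts_time"])
           for p in json.loads(out).get("packets", [])
           if "pts_time" in p]
    return min(pts), max(pts)

for i in range(3):  # first few segments, illustrative
    name = f"segment_{i:03d}.ts"
    print(name, segment_span(name))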


    


    I have three examples of ffmpeg commands.

    • One command segments into DASH, but the camera does not have the correct rotation.
    • One command segments into HLS without re-encoding, with 5-second segments; it's fast but does not have the correct rotation.
    • One command segments into HLS with re-encoding, which applies a rotation. It's too slow for 5-second segments, so a 1-second segment was chosen.



    Note:

    • In the second command (HLS without re-encoding), the output is fMP4. To get the correct rotation, I serve a preconfigured init.mp4 file when it is requested over HTTP (see the commented-out block in the HlsServer class above).
    • In the third command (HLS with re-encoding), the output is TS.


  • How to fix the ffmpeg python issue "Error applying option 'original_size' to filter 'ass': Invalid argument"?

    16 February, by Gauthier Buttez

    ENVIRONMENT:

    Python 3.10
    Windows 11
    ffmpeg-python==0.2.0

    CONTEXT:

    I am trying to add hardcoded subtitles to a video with ffmpeg and Python.

    PROBLEM:

    ffmpeg is not able to find the path of my .ass subtitle file, so I tried different paths written in different ways. I also asked several AIs (ChatGPT, DeepSeek, Claude.ai) for a solution; they proposed various fixes, none of which worked. The issue is so stubborn that I even tested the code from a "test" folder at the root of C: in the Windows command line.

    THE CODE:

    import os
import ffmpeg
import subprocess

def add_subtitles_with_ass(video_path, ass_path, output_path):

    video_path = os.path.abspath(video_path)
    ass_path = os.path.abspath(ass_path)
    output_path = os.path.abspath(output_path)

    print(f"🚀 Add susbtitiles with ASS : {ass_path}")
    ass_path_escaped = rf'"{ass_path}"'

    try:
        ffmpeg.input(video_path).output(output_path, vf=f"ass='{ass_path_escaped}'").run(overwrite_output=True)
        print(f"✅ Subtitles added with success : {output_path}")
    except Exception as e:
        print(f"❌ ERROR when adding subtitles : {e}")


def add_subtitles_with_ass_with_subprocess(video_path, ass_path, output_path):
    video_path = os.path.abspath(video_path)
    ass_path = os.path.abspath(ass_path)
    output_path = os.path.abspath(output_path)

    print(f"🚀 Add subtitles with ASS : {ass_path}")

    try:
        command = [
            'ffmpeg',
            '-i', video_path,
            '-vf', f"ass='{ass_path}'",
            '-c:a', 'copy',
            output_path,
            '-y'  # Overwrite output file without asking
        ]

        subprocess.run(command, check=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        print(f"✅ Subtitles added with success : {output_path}")
    except subprocess.CalledProcessError as e:
        print(f"❌ ERROR when adding subtitles : {e.stderr.decode('utf-8')}")


print(f"Test 1 -----------------------------------------------------------------------")
ass_path="G:\Mi unidad\Python\Scripts\shorts_videos\10000views_and_higher\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.ass"
video_path="G:\Mi unidad\Python\Scripts\shorts_videos\10000views_and_higher\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo.mp4"
output_path="G:\Mi unidad\Python\Scripts\shorts_videos\10000views_and_higher\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.mp4"
add_subtitles_with_ass(video_path, ass_path, output_path)
print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 2 -----------------------------------------------------------------------")
ass_path2="G:\\Mi unidad\\Python\\Scripts\\shorts_videos\\10000views_and_higher\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.ass"
video_path2="G:\\Mi unidad\\Python\\Scripts\\shorts_videos\\10000views_and_higher\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo.mp4"
output_path2="G:\\Mi unidad\\Python\\Scripts\\shorts_videos\\10000views_and_higher\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.mp4"
add_subtitles_with_ass(video_path2, ass_path2, output_path2)

print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 3 -----------------------------------------------------------------------")
ass_path3="G:/\Mi unidad/\Python/\Scripts/\shorts_videos/\10000views_and_higher/\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.ass"
video_path3="G:/\Mi unidad/\Python/\Scripts/\shorts_videos/\10000views_and_higher/\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo.mp4"
output_path3="G:/\Mi unidad/\Python/\Scripts/\shorts_videos/\10000views_and_higher/\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.mp4"
add_subtitles_with_ass(video_path3, ass_path3, output_path3)

print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 4 -----------------------------------------------------------------------")
ass_path4="G:/\\Mi unidad/\\Python/\\Scripts/\\shorts_videos/\\10000views_and_higher/\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.ass"
video_path4="G:/\\Mi unidad/\\Python/\\Scripts/\\shorts_videos/\\10000views_and_higher/\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo.mp4"
output_path4="G:/\\Mi unidad/\\Python/\\Scripts/\\shorts_videos/\\10000views_and_higher/\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.mp4"
add_subtitles_with_ass(video_path4, ass_path4, output_path4)




print(f"Test 1 with subprocess -----------------------------------------------------------------------")
add_subtitles_with_ass_with_subprocess(video_path, ass_path, output_path)
print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 2 with subprocess -----------------------------------------------------------------------")
add_subtitles_with_ass_with_subprocess(video_path2, ass_path2, output_path2)
print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 3 with subprocess -----------------------------------------------------------------------")
add_subtitles_with_ass_with_subprocess(video_path3, ass_path3, output_path3)
print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 4 with subprocess -----------------------------------------------------------------------")
add_subtitles_with_ass_with_subprocess(video_path4, ass_path4, output_path4)
print(50*"*")
print(50*"*")
print(50*"*")

#======================= IT DIDN'T WORK, SO LET'S DO WITH RELATIVE PATH ===================================


print(f"Test 1 with relative path -----------------------------------------------------------------------")
ass_path_relative="10000views_and_higher\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.ass"
video_path_relative="10000views_and_higher\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo.mp4"
output_path_relative="10000views_and_higher\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.mp4"
add_subtitles_with_ass(video_path_relative, ass_path_relative, output_path_relative)
print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 2  with relative path -----------------------------------------------------------------------")
ass_path2_relative="10000views_and_higher\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.ass"
video_path2_relative="10000views_and_higher\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo.mp4"
output_path2_relative="10000views_and_higher\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.mp4"
add_subtitles_with_ass(video_path2_relative, ass_path2_relative, output_path2_relative)

print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 3  with relative path -----------------------------------------------------------------------")
ass_path3_relative="10000views_and_higher/\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.ass"
video_path3_relative="10000views_and_higher/\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo.mp4"
output_path3_relative="10000views_and_higher/\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.mp4"
add_subtitles_with_ass(video_path3_relative, ass_path3_relative, output_path3_relative)

print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 4  with relative path -----------------------------------------------------------------------")
ass_path4_relative="10000views_and_higher/\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.ass"
video_path4_relative="10000views_and_higher/\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo.mp4"
output_path4_relative="10000views_and_higher/\\Li2zY7XcOf0_Matthew_Hussey_flip_h_BW_with_logo_EN.mp4"
add_subtitles_with_ass(video_path4_relative, ass_path4_relative, output_path4_relative)




print(f"Test 1 with subprocess  with relative path -----------------------------------------------------------------------")
add_subtitles_with_ass_with_subprocess(video_path, ass_path, output_path)
print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 2 with subprocess  with relative path -----------------------------------------------------------------------")
add_subtitles_with_ass_with_subprocess(video_path2_relative, ass_path2_relative, output_path2_relative)
print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 3 with subprocess  with relative path -----------------------------------------------------------------------")
add_subtitles_with_ass_with_subprocess(video_path3_relative, ass_path3_relative, output_path3_relative)
print(50*"*")
print(50*"*")
print(50*"*")

print(f"Test 4 with subprocess  with relative path -----------------------------------------------------------------------")
add_subtitles_with_ass_with_subprocess(video_path4_relative, ass_path4_relative, output_path4_relative)
print(50*"*")
print(50*"*")
print(50*"*")


    


    THE OUTPUT:

    The output was too large to post here. I uploaded it to my GitHub:

    https://github.com/gauthierbuttez/public/blob/master/output_ffmpeg.log

    I also zipped and uploaded all the files to make reproducing the issue on your computer easy:

    https://github.com/gauthierbuttez/public/blob/master/test.zip

    Is there any kind super Python Master in the place?

    

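    For reference, the error in the title points at filter-option parsing rather than file existence: ffmpeg splits filter arguments on ':', so in ass=G:/... the drive letter "G" is taken as the filename and the rest of the path is fed to the next positional option, original_size. A hedged sketch of the commonly reported Windows fix, using a hypothetical short path:

import ffmpeg  # ffmpeg-python==0.2.0

# Escape the drive colon and use forward slashes so the filter parser
# keeps the whole path as the filename argument. Paths are illustrative.
ass_path = "G:/Mi unidad/subs.ass"
escaped = ass_path.replace("\\", "/").replace(":", "\\:")  # -> G\:/Mi unidad/subs.ass
(
    ffmpeg
    .input("input.mp4")
    .output("output.mp4", vf=f"ass='{escaped}'")
    .run(overwrite_output=True)
)
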

  • Merged Video Contains Inverted Clips After First Video Ends

    3 February, by Nikunj Agrawal

    I am working on a Flutter application that merges multiple videos using ffmpeg_kit_flutter. However, after merging, I notice that the second video (and any subsequent ones) appears inverted or rotated in the final output.

    


    Issue Details:

    1. The first video appears normal.
    2. The videos can be recorded using both front and back cameras.
    3. The second (and later) videos are flipped or rotated upside down.
    4. This happens after merging using ffmpeg_kit_flutter.



    Question:
How can I correctly merge multiple videos in Flutter without rotation issues? Is there a way to normalize video orientation before merging using ffmpeg_kit_flutter?

    


    Any help would be appreciated! 🚀

    


    Code:

    


    import 'dart:io';
import 'dart:math';

import 'package:camera/camera.dart';
import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/return_code.dart';
import 'package:flutter/material.dart';
import 'package:path_provider/path_provider.dart';
import 'package:permission_handler/permission_handler.dart';
import 'package:record/record.dart';
import 'package:videotest/video_player.dart';

class MergeVideoRecording extends StatefulWidget {
  const MergeVideoRecording({super.key});

  @override
  State<MergeVideoRecording> createState() => _MergeVideoRecordingState();
}

class _MergeVideoRecordingState extends State<MergeVideoRecording> {
  CameraController? _cameraController;
  final AudioRecorder _audioRecorder = AudioRecorder();

  bool _isRecording = false;
  String? _videoPath;
  String? _audioPath;
  List<CameraDescription> _cameras = [];
  int _currentCameraIndex = 0;
  final List<String> _recordedVideos = [];

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Column(
        mainAxisAlignment: MainAxisAlignment.center,
        children: [
          _cameraController != null && _cameraController!.value.isInitialized
              ? SizedBox(
                  width: MediaQuery.of(context).size.width * 0.4,
                  height: MediaQuery.of(context).size.height * 0.3,
                  child: Stack(
                    children: [
                      ClipRRect(
                        borderRadius: BorderRadius.circular(16),
                        child: SizedBox(
                          width: MediaQuery.of(context).size.width * 0.4,
                          height: MediaQuery.of(context).size.height * 0.3,
                          child: Transform(
                            alignment: Alignment.center,
                            transform:
                                _cameras[_currentCameraIndex].lensDirection ==
                                        CameraLensDirection.front
                                    ? Matrix4.rotationY(pi)
                                    : Matrix4.identity(),
                            child: CameraPreview(_cameraController!),
                          ),
                        ),
                      ),
                      Align(
                        alignment: Alignment.topRight,
                        child: InkWell(
                          onTap: _switchCamera,
                          child: const Padding(
                            padding: EdgeInsets.all(8.0),
                            child: CircleAvatar(
                              radius: 18,
                              backgroundColor: Colors.white,
                              child: Icon(
                                Icons.flip_camera_android,
                                color: Colors.black,
                              ),
                            ),
                          ),
                        ),
                      ),
                    ],
                  ),
                )
              : const CircularProgressIndicator(),
          const SizedBox(height: 16),
          Row(
            mainAxisAlignment: MainAxisAlignment.center,
            children: [
              FloatingActionButton(
                heroTag: 'record_button',
                onPressed: _toggleRecording,
                child: Icon(
                  _isRecording ? Icons.stop : Icons.video_camera_back,
                ),
              ),
              const SizedBox(
                width: 50,
              ),
              FloatingActionButton(
                heroTag: 'merge_button',
                onPressed: _mergeVideos,
                child: const Icon(
                  Icons.merge,
                ),
              ),
            ],
          ),
          if (!_isRecording)
            ListView.builder(
              shrinkWrap: true,
              itemCount: _recordedVideos.length,
              itemBuilder: (context, index) => InkWell(
                onTap: () {
                  Navigator.push(
                    context,
                    MaterialPageRoute(
                      builder: (context) => VideoPlayerScreen(
                        videoPath: _recordedVideos[index],
                      ),
                    ),
                  );
                },
                child: ListTile(
                  title: Text('Video ${index + 1}'),
                  subtitle: Text('Path ${_recordedVideos[index]}'),
                  trailing: const Icon(Icons.play_arrow),
                ),
              ),
            ),
        ],
      ),
    );
  }

  @override
  void dispose() {
    _cameraController?.dispose();
    _audioRecorder.dispose();
    super.dispose();
  }

  @override
  void initState() {
    super.initState();
    _initializeDevices();
  }

  Future<void> _initializeCameraController(CameraDescription camera) async {
    _cameraController = CameraController(
      camera,
      ResolutionPreset.high,
      enableAudio: true,
      imageFormatGroup: ImageFormatGroup.yuv420, // Add this line
    );

    await _cameraController!.initialize();
    await _cameraController!.setExposureMode(ExposureMode.auto);
    await _cameraController!.setFocusMode(FocusMode.auto);
    setState(() {});
  }

  Future<void> _initializeDevices() async {
    final cameraStatus = await Permission.camera.request();
    final micStatus = await Permission.microphone.request();

    if (!cameraStatus.isGranted || !micStatus.isGranted) {
      _showError('Camera and microphone permissions required');
      return;
    }

    _cameras = await availableCameras();
    if (_cameras.isNotEmpty) {
      final frontCameraIndex = _cameras.indexWhere(
          (camera) => camera.lensDirection == CameraLensDirection.front);
      _currentCameraIndex = frontCameraIndex != -1 ? frontCameraIndex : 0;
      await _initializeCameraController(_cameras[_currentCameraIndex]);
    }
  }

  // Merge video
  Future<void> _mergeVideos() async {
    if (_recordedVideos.isEmpty) {
      _showError('No videos to merge');
      return;
    }

    try {
      // Debug logging
      print('Starting merge process');
      print('Number of videos to merge: ${_recordedVideos.length}');
      for (var i = 0; i < _recordedVideos.length; i++) {
        final file = File(_recordedVideos[i]);
        final exists = await file.exists();
        final size = exists ? await file.length() : 0;
        print('Video $i: ${_recordedVideos[i]}');
        print('Exists: $exists, Size: $size bytes');
      }

      final Directory appDir = await getApplicationDocumentsDirectory();
      final String outputPath =
          '${appDir.path}/merged_${DateTime.now().millisecondsSinceEpoch}.mp4';
      final String listFilePath = '${appDir.path}/list.txt';

      print('Output path: $outputPath');
      print('List file path: $listFilePath');

      // Create and verify list file
      final listFile = File(listFilePath);
      final fileContent = _recordedVideos
          .map((path) => "file '${path.replaceAll("'", "'\\''")}'")
          .join('\n');
      await listFile.writeAsString(fileContent);

      print('List file content:');
      print(await listFile.readAsString());

      // Simpler FFmpeg command for testing
      final command = '''
      -f concat
      -safe 0
      -i "$listFilePath"
      -c copy
      -y
      "$outputPath"
    '''
          .trim()
          .replaceAll('\n', ' ');

      print('Executing FFmpeg command: $command');

      final session = await FFmpegKit.execute(command);
      final returnCode = await session.getReturnCode();
      final logs = await session.getAllLogsAsString();
      final failStackTrace = await session.getFailStackTrace();

      print('FFmpeg return code: ${returnCode?.getValue() ?? "null"}');
      print('FFmpeg logs: $logs');
      if (failStackTrace != null) {
        print('FFmpeg fail stack trace: $failStackTrace');
      }

      if (ReturnCode.isSuccess(returnCode)) {
        final outputFile = File(outputPath);
        final outputExists = await outputFile.exists();
        final outputSize = outputExists ? await outputFile.length() : 0;

        print('Output file exists: $outputExists');
        print('Output file size: $outputSize bytes');

        if (outputExists && outputSize > 0) {
          setState(() => _recordedVideos.add(outputPath));
          _showSuccess('Videos merged successfully');
        } else {
          _showError('Merged file is empty or not created');
        }
      } else {
        _showError('Failed to merge videos. Check logs for details.');
      }

      // Clean up
      try {
        await listFile.delete();
        print('List file cleaned up successfully');
      } catch (e) {
        print('Failed to delete list file: $e');
      }
    } catch (e, s) {
      print('Error during merge: $e');
      print('Stack trace: $s');
      _showError('Error merging videos: ${e.toString()}');
    }
  }

  void _showError(String message) {
    ScaffoldMessenger.of(context).showSnackBar(
      SnackBar(content: Text(message), backgroundColor: Colors.red),
    );
  }

  void _showSuccess(String message) {
    ScaffoldMessenger.of(context).showSnackBar(
      SnackBar(content: Text(message), backgroundColor: Colors.green),
    );
  }

  Future<void> _startAudioRecording() async {
    try {
      final Directory tempDir = await getTemporaryDirectory();
      final audioPath = '${tempDir.path}/recording.wav';
      await _audioRecorder.start(const RecordConfig(), path: audioPath);
      setState(() => _isRecording = true);
    } catch (e) {
      _showError('Recording start error: $e');
    }
  }

  Future<void> _startVideoRecording() async {
    try {
      await _cameraController!.startVideoRecording();
      setState(() => _isRecording = true);
    } catch (e) {
      _showError('Recording start error: $e');
    }
  }

  Future<void> _stopAndSaveAudioRecording() async {
    _audioPath = await _audioRecorder.stop();
    if (_audioPath != null) {
      final Directory appDir = await getApplicationDocumentsDirectory();
      final timestamp = DateTime.now().millisecondsSinceEpoch;
      final String audioFileName = 'audio_$timestamp.wav';
      await File(_audioPath!).copy('${appDir.path}/$audioFileName');
      _showSuccess('Saved: $audioFileName');
    }
  }

  Future<void> _stopAndSaveVideoRecording() async {
    try {
      final video = await _cameraController!.stopVideoRecording();
      _videoPath = video.path;

      if (_videoPath != null) {
        final Directory appDir = await getApplicationDocumentsDirectory();
        final timestamp = DateTime.now().millisecondsSinceEpoch;
        final String videoFileName = 'video_$timestamp.mp4';
        final savedVideoPath = '${appDir.path}/$videoFileName';
        await File(_videoPath!).copy(savedVideoPath);

        setState(() {
          _recordedVideos.add(savedVideoPath);
          _isRecording = false;
        });

        _showSuccess('Saved: $videoFileName');
      }
    } catch (e) {
      _showError('Recording stop error: $e');
    }
  }

  Future<void> _switchCamera() async {
    if (_cameras.length <= 1) return;

    if (_isRecording) {
      await _stopAndSaveVideoRecording();
      _currentCameraIndex = (_currentCameraIndex + 1) % _cameras.length;
      await _initializeCameraController(_cameras[_currentCameraIndex]);
      await _startVideoRecording();
    } else {
      _currentCameraIndex = (_currentCameraIndex + 1) % _cameras.length;
      await _initializeCameraController(_cameras[_currentCameraIndex]);
    }
  }

  Future<void> _toggleRecording() async {
    if (_cameraController == null) return;

    if (_isRecording) {
      await _stopAndSaveVideoRecording();
      await _stopAndSaveAudioRecording();
    } else {
      _startVideoRecording();
      _startAudioRecording();
      setState(() => _recordedVideos.clear());
    }
  }
}
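
    For context on the symptom: concatenating with -c copy keeps only the first clip's display-matrix rotation tag on the merged track, so clips recorded with a different orientation play flipped. A hedged pre-merge normalization sketch in Python (hypothetical file names; ffmpeg honours each input's rotation tag automatically when re-encoding, so the normalized clips end up with one upright geometry):

import subprocess

# Re-encode every clip onto one common portrait canvas before the concat,
# baking each clip's rotation into the pixels. Names are illustrative.
clips = ["video_1.mp4", "video_2.mp4"]
for i, clip in enumerate(clips):
    subprocess.run([
        "ffmpeg", "-y", "-i", clip,
        # Fit onto a 1080x1920 canvas, padding instead of distorting.
        "-vf",
        "scale=1080:1920:force_original_aspect_ratio=decrease,"
        "pad=1080:1920:(ow-iw)/2:(oh-ih)/2",
        "-c:v", "libx264", "-preset", "veryfast",
        "-c:a", "aac",
        f"normalized_{i}.mp4",
    ], check=True)
# The normalized_*.mp4 files can then go into the concat list file.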
