
Other articles (96)
-
Supporting all media types
13 April 2011 — Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (HTML, CSS), LaTeX, Google Earth and (...)
-
Authorizations overridden by plugins
27 April 2010 — MediaSPIP core:
autoriser_auteur_modifier(), so that visitors can modify their own information on the authors page
-
Libraries and binaries specific to video and audio processing
31 January 2010 — The following software and libraries are used by SPIPmotion in one way or another.
Required binaries: FFMpeg: the main encoder, able to transcode almost all video and audio file types into formats readable on the Internet (see this tutorial for its installation); Oggz-tools: inspection tools for ogg files; Mediainfo: retrieves information from most video and audio formats;
Complementary, optional binaries: flvtool2: (...)
On other sites (5955)
-
Revision 31019: - multilingual support! - new "Se déconnecter" entry - ...
20 August 2009, by vincent@… — Log:
multilingual support!
new "Se déconnecter" (log out) entry
option to specify an extra CSS class on entries that generate only one line
-
How to stream synchronized video and audio in real-time from an Android smartphone using HLS while preserving orientation metadata?
6 March, by Jérôme LAROSE

Hello,
I am working on an Android application where I need to stream video from one or two cameras on my smartphone, along with audio from the microphone, in real-time via a link or web page accessible to users. The stream should be live, allow rewinding (DVR functionality), and be recorded simultaneously. A latency of 1 to 2 minutes is acceptable, and the streaming is one-way.

I have chosen HLS (HTTP Live Streaming) for its browser compatibility and DVR support. However, I am encountering issues with audio-video synchronization, managing camera orientation metadata, and format conversions.



Here are my attempts:

1. MP4 segmentation with MediaRecorder (see the sketch after this list)
   - I used MediaRecorder with setNextOutputFile to generate short MP4 segments, then ffmpeg-kit to convert them to fMP4 for HLS.
   - Expected: well-aligned segments for smooth HLS playback.
   - Result: timestamp issues causing jumps or interruptions in playback.

2. MPEG2-TS via local socket
   - I configured MediaRecorder to produce an MPEG2-TS stream sent via a local socket to ffmpeg-kit.
   - Expected: stable streaming with preserved metadata.
   - Result: streaming works, but orientation metadata is lost, leading to incorrectly oriented video (e.g., rotated 90°).

3. Orientation correction with ffmpeg
   - I tested -vf transpose=1 in ffmpeg to correct the orientation.
   - Expected: correctly oriented video without excessive latency.
   - Result: re-encoding takes too long for real-time streaming, causing unacceptable latency.

4. MPEG2-TS to fMP4 conversion
   - I converted the MPEG2-TS stream to fMP4 with ffmpeg to preserve orientation.
   - Expected: perfect audio-video synchronization.
   - Result: slight desynchronization between audio and video, affecting the user experience.
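
To illustrate attempt 1, here is a simplified Java sketch of the MediaRecorder segmentation pattern I am referring to (the SegmentedRecorder class, segmentDir field and nextSegment() helper are purely illustrative, not code from this project; encoder settings and the hand-off of each finished segment to ffmpeg-kit are omitted):

import android.media.MediaRecorder;
import java.io.File;
import java.io.IOException;

class SegmentedRecorder {
    private final File segmentDir;   // directory that receives the short MP4 segments
    private int segmentIndex = 0;

    SegmentedRecorder(File segmentDir) {
        this.segmentDir = segmentDir;
    }

    MediaRecorder build() throws IOException {
        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setMaxFileSize(5_000_000);        // roll over to a new segment every ~5 MB
        recorder.setOutputFile(nextSegment());     // first segment
        recorder.setOnInfoListener((mr, what, extra) -> {
            // When the current segment is about to reach its size limit, hand MediaRecorder
            // the next file; recording continues into it without stopping the encoder.
            if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_FILESIZE_APPROACHING) {
                try {
                    mr.setNextOutputFile(nextSegment());
                } catch (IOException e) {
                    e.printStackTrace();
                }
            } else if (what == MediaRecorder.MEDIA_RECORDER_INFO_NEXT_OUTPUT_FILE_STARTED) {
                // The previous segment is now complete and could be handed to ffmpeg-kit here.
            }
        });
        recorder.prepare();
        return recorder;
    }

    private File nextSegment() {
        return new File(segmentDir, "segment_" + (segmentIndex++) + ".mp4");
    }
}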










I am looking for a solution to:

- Stream an HLS feed from Android with correctly timestamped segments.
- Preserve orientation metadata without heavy re-encoding.
- Ensure perfect audio-video synchronization.

UPDATE


package com.example.angegardien

import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import android.graphics.SurfaceTexture
import android.hardware.camera2.*
import android.media.*
import android.os.*
import android.util.Log
import android.view.Surface
import android.view.TextureView
import android.view.WindowManager
import androidx.activity.ComponentActivity
import androidx.core.app.ActivityCompat
import com.arthenica.ffmpegkit.FFmpegKit
import fi.iki.elonen.NanoHTTPD
import kotlinx.coroutines.*
import java.io.File
import java.io.IOException
import java.net.ServerSocket
import android.view.OrientationEventListener

/**
 * MainActivity class:
 * - Manages camera operations using the Camera2 API.
 * - Records video using MediaRecorder.
 * - Pipes data to FFmpeg to generate HLS segments.
 * - Hosts a local HLS server using NanoHTTPD to serve the generated HLS content.
 */
class MainActivity : ComponentActivity() {

 // TextureView used for displaying the camera preview.
 private lateinit var textureView: TextureView
 // Camera device instance.
 private lateinit var cameraDevice: CameraDevice
 // Camera capture session for managing capture requests.
 private lateinit var cameraCaptureSession: CameraCaptureSession
 // CameraManager to access camera devices.
 private lateinit var cameraManager: CameraManager
 // Directory where HLS output files will be stored.
 private lateinit var hlsDir: File
 // Instance of the HLS server.
 private lateinit var hlsServer: HlsServer

 // Camera id ("1" corresponds to the rear camera).
 private val cameraId = "1"
 // Flag indicating whether recording is currently active.
 private var isRecording = false

 // MediaRecorder used for capturing audio and video.
 private lateinit var activeRecorder: MediaRecorder
 // Surface for the camera preview.
 private lateinit var previewSurface: Surface
 // Surface provided by MediaRecorder for recording.
 private lateinit var recorderSurface: Surface

 // Port for the FFmpeg local socket connection.
 private val ffmpegPort = 8080

 // Coroutine scope to manage asynchronous tasks.
 private val scope = CoroutineScope(Dispatchers.IO + SupervisorJob())

 // Variables to track current device rotation and listen for orientation changes.
 private var currentRotation = 0
 private lateinit var orientationListener: OrientationEventListener

 override fun onCreate(savedInstanceState: Bundle?) {
 super.onCreate(savedInstanceState)

 // Initialize the TextureView and set it as the content view.
 textureView = TextureView(this)
 setContentView(textureView)

 // Get the CameraManager system service.
 cameraManager = getSystemService(CAMERA_SERVICE) as CameraManager
 // Setup the directory for HLS output.
 setupHLSDirectory()

 // Start the local HLS server on port 8081.
 hlsServer = HlsServer(8081, hlsDir, this)
 try {
 hlsServer.start()
 Log.d("HLS_SERVER", "HLS Server started on port 8081")
 } catch (e: IOException) {
 Log.e("HLS_SERVER", "Error starting HLS Server", e)
 }

 // Initialize the current rotation.
 currentRotation = getDeviceRotation()

 // Add a listener to detect orientation changes.
 orientationListener = object : OrientationEventListener(this) {
 override fun onOrientationChanged(orientation: Int) {
 if (orientation == ORIENTATION_UNKNOWN) return // Skip unknown orientations.
 // Determine the new rotation angle.
 val newRotation = when {
 orientation >= 315 || orientation < 45 -> 0
 orientation >= 45 && orientation < 135 -> 90
 orientation >= 135 && orientation < 225 -> 180
 orientation >= 225 && orientation < 315 -> 270
 else -> 0
 }
 // If the rotation has changed and recording is active, update the rotation.
 if (newRotation != currentRotation && isRecording) {
 Log.d("ROTATION", "Orientation change detected: $newRotation")
 currentRotation = newRotation
 }
 }
 }
 orientationListener.enable()

 // Set up the TextureView listener to know when the surface is available.
 textureView.surfaceTextureListener = object : TextureView.SurfaceTextureListener {
 override fun onSurfaceTextureAvailable(surface: SurfaceTexture, width: Int, height: Int) {
 // Open the camera when the texture becomes available.
 openCamera()
 }
 override fun onSurfaceTextureSizeChanged(surface: SurfaceTexture, width: Int, height: Int) {}
 override fun onSurfaceTextureDestroyed(surface: SurfaceTexture) = false
 override fun onSurfaceTextureUpdated(surface: SurfaceTexture) {}
 }
 }

 /**
 * Sets up the HLS directory in the public Downloads folder.
 * If the directory exists, it deletes it recursively and creates a new one.
 */
 private fun setupHLSDirectory() {
 val downloadsDir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS)
 hlsDir = File(downloadsDir, "HLS_Output")

 if (hlsDir.exists()) {
 hlsDir.deleteRecursively()
 }
 hlsDir.mkdirs()

 Log.d("HLS", "📂 HLS folder created: ${hlsDir.absolutePath}")
 }

 /**
 * Opens the camera after checking for necessary permissions.
 */
 private fun openCamera() {
 if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED ||
 ActivityCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
 // Request permissions if they are not already granted.
 ActivityCompat.requestPermissions(this, arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO), 101)
 return
 }

 try {
 // Open the specified camera using its cameraId.
 cameraManager.openCamera(cameraId, object : CameraDevice.StateCallback() {
 override fun onOpened(camera: CameraDevice) {
 cameraDevice = camera
 // Start the recording session once the camera is opened.
 startNextRecording()
 }
 override fun onDisconnected(camera: CameraDevice) { camera.close() }
 override fun onError(camera: CameraDevice, error: Int) { camera.close() }
 }, null)
 } catch (e: CameraAccessException) {
 e.printStackTrace()
 }
 }

 /**
 * Starts a new recording session:
 * - Sets up the preview and recorder surfaces.
 * - Creates a pipe for MediaRecorder output.
 * - Creates a capture session for simultaneous preview and recording.
 */
 private fun startNextRecording() {
 // Get the SurfaceTexture from the TextureView and set its default buffer size.
 val texture = textureView.surfaceTexture!!
 texture.setDefaultBufferSize(1920, 1080)
 // Create the preview surface.
 previewSurface = Surface(texture)

 // Create and configure the MediaRecorder.
 activeRecorder = createMediaRecorder()

 // Create a pipe to route MediaRecorder data.
 val pipe = ParcelFileDescriptor.createPipe()
 val pfdWrite = pipe[1] // Write end used by MediaRecorder.
 val pfdRead = pipe[0] // Read end used by the local socket server.

 // Set MediaRecorder output to the file descriptor of the write end.
 activeRecorder.setOutputFile(pfdWrite.fileDescriptor)
 setupMediaRecorder(activeRecorder)
 // Obtain the recorder surface from MediaRecorder.
 recorderSurface = activeRecorder.surface

 // Create a capture request using the RECORD template.
 val captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD)
 captureRequestBuilder.addTarget(previewSurface)
 captureRequestBuilder.addTarget(recorderSurface)

 // Create a capture session including both preview and recorder surfaces.
 cameraDevice.createCaptureSession(
 listOf(previewSurface, recorderSurface),
 object : CameraCaptureSession.StateCallback() {
 override fun onConfigured(session: CameraCaptureSession) {
 cameraCaptureSession = session
 captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO)
 // Start a continuous capture request.
 cameraCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), null, null)

 // Launch a coroutine to start FFmpeg and MediaRecorder with synchronization.
 scope.launch {
 startFFmpeg()
 delay(500) // Wait for FFmpeg to be ready.
 activeRecorder.start()
 isRecording = true
 Log.d("HLS", "🎥 Recording started...")
 }

 // Launch a coroutine to run the local socket server to forward data.
 scope.launch {
 startLocalSocketServer(pfdRead)
 }
 }
 override fun onConfigureFailed(session: CameraCaptureSession) {
 Log.e("Camera2", "❌ Configuration failed")
 }
 },
 null
 )
 }

 /**
 * Coroutine to start a local socket server.
 * It reads from the MediaRecorder pipe and sends the data to FFmpeg.
 */
 private suspend fun startLocalSocketServer(pfdRead: ParcelFileDescriptor) {
 withContext(Dispatchers.IO) {
 val serverSocket = ServerSocket(ffmpegPort)
 Log.d("HLS", "Local socket server started on port $ffmpegPort")

 // Accept connection from FFmpeg.
 val socket = serverSocket.accept()
 Log.d("HLS", "Connection accepted from FFmpeg")

 // Read data from the pipe and forward it through the socket.
 val inputStream = ParcelFileDescriptor.AutoCloseInputStream(pfdRead)
 val outputStream = socket.getOutputStream()
 val buffer = ByteArray(8192)
 var bytesRead: Int
 while (inputStream.read(buffer).also { bytesRead = it } != -1) {
 outputStream.write(buffer, 0, bytesRead)
 }
 outputStream.close()
 inputStream.close()
 socket.close()
 serverSocket.close()
 }
 }

 /**
 * Coroutine to start FFmpeg using a local TCP input.
 * Applies a video rotation filter based on device orientation and generates HLS segments.
 */
 private suspend fun startFFmpeg() {
 withContext(Dispatchers.IO) {
 // Retrieve the appropriate transpose filter based on current rotation.
 val transposeFilter = getTransposeFilter(currentRotation)

 // FFmpeg command to read from the TCP socket and generate an HLS stream.
 // Two alternative commands are commented below.
 // val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -c copy -bsf:a aac_adtstoasc -movflags +faststart -f dash -seg_duration 10 -hls_playlist 1 ${hlsDir.absolutePath}/manifest.mpd"
 // val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -c copy -bsf:a aac_adtstoasc -movflags +faststart -f hls -hls_time 5 -hls_segment_type fmp4 -hls_flags split_by_time -hls_list_size 0 -hls_playlist_type event -hls_fmp4_init_filename init.mp4 -hls_segment_filename ${hlsDir.absolutePath}/segment_%03d.m4s ${hlsDir.absolutePath}/playlist.m3u8"
 val ffmpegCommand = "-fflags +genpts -i tcp://localhost:$ffmpegPort -vf $transposeFilter -c:v libx264 -preset ultrafast -crf 23 -c:a copy -movflags +faststart -f hls -hls_time 0.1 -hls_segment_type mpegts -hls_flags split_by_time -hls_list_size 0 -hls_playlist_type event -hls_segment_filename ${hlsDir.absolutePath}/segment_%03d.ts ${hlsDir.absolutePath}/playlist.m3u8"

 FFmpegKit.executeAsync(ffmpegCommand) { session ->
 if (session.returnCode.isValueSuccess) {
 Log.d("HLS", "✅ HLS generated successfully")
 } else {
 Log.e("FFmpeg", "❌ Error generating HLS: ${session.allLogsAsString}")
 }
 }
 }
 }

 /**
 * Gets the current device rotation using the WindowManager.
 */
 private fun getDeviceRotation(): Int {
 val windowManager = getSystemService(Context.WINDOW_SERVICE) as WindowManager
 return when (windowManager.defaultDisplay.rotation) {
 Surface.ROTATION_0 -> 0
 Surface.ROTATION_90 -> 90
 Surface.ROTATION_180 -> 180
 Surface.ROTATION_270 -> 270
 else -> 0
 }
 }

 /**
 * Returns the FFmpeg transpose filter based on the rotation angle.
 * Used to rotate the video stream accordingly.
 */
 private fun getTransposeFilter(rotation: Int): String {
 return when (rotation) {
 90 -> "transpose=1" // 90° clockwise
 180 -> "transpose=2,transpose=2" // 180° rotation
 270 -> "transpose=2" // 90° counter-clockwise
 else -> "transpose=0" // No rotation
 }
 }

 /**
 * Creates and configures a MediaRecorder instance.
 * Sets up audio and video sources, formats, encoders, and bitrates.
 */
 private fun createMediaRecorder(): MediaRecorder {
 return MediaRecorder().apply {
 setAudioSource(MediaRecorder.AudioSource.MIC)
 setVideoSource(MediaRecorder.VideoSource.SURFACE)
 setOutputFormat(MediaRecorder.OutputFormat.MPEG_2_TS)
 setVideoEncodingBitRate(5000000)
 setVideoFrameRate(24)
 setVideoSize(1080, 720)
 setVideoEncoder(MediaRecorder.VideoEncoder.H264)
 setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
 setAudioSamplingRate(16000)
 setAudioEncodingBitRate(96000) // 96 kbps
 }
 }

 /**
 * Prepares the MediaRecorder and logs the outcome.
 */
 private fun setupMediaRecorder(recorder: MediaRecorder) {
 try {
 recorder.prepare()
 Log.d("HLS", "✅ MediaRecorder prepared")
 } catch (e: IOException) {
 Log.e("HLS", "❌ Error preparing MediaRecorder", e)
 }
 }

 /**
 * Custom HLS server class extending NanoHTTPD.
 * Serves HLS segments and playlists from the designated HLS directory.
 */
 private inner class HlsServer(port: Int, private val hlsDir: File, private val context: Context) : NanoHTTPD(port) {
 override fun serve(session: IHTTPSession): Response {
 val uri = session.uri.trimStart('/')

 // Intercept the request for `init.mp4` and serve it from assets.
 /*
 if (uri == "init.mp4") {
 Log.d("HLS Server", "📡 Intercepting init.mp4, sending file from assets...")
 return try {
 val assetManager = context.assets
 val inputStream = assetManager.open("init.mp4")
 newFixedLengthResponse(Response.Status.OK, "video/mp4", inputStream, inputStream.available().toLong())
 } catch (e: Exception) {
 Log.e("HLS Server", "❌ Error reading init.mp4 from assets: ${e.message}")
 newFixedLengthResponse(Response.Status.INTERNAL_ERROR, MIME_PLAINTEXT, "Server error")
 }
 }
 */

 // Serve all other HLS files normally from the hlsDir.
 val file = File(hlsDir, uri)
 return if (file.exists()) {
 newFixedLengthResponse(Response.Status.OK, getMimeTypeForFile(uri), file.inputStream(), file.length())
 } else {
 newFixedLengthResponse(Response.Status.NOT_FOUND, MIME_PLAINTEXT, "File not found")
 }
 }
 }

 /**
 * Clean up resources when the activity is destroyed.
 * Stops recording, releases the camera, cancels coroutines, and stops the HLS server.
 */
 override fun onDestroy() {
 super.onDestroy()
 if (isRecording) {
 activeRecorder.stop()
 activeRecorder.release()
 }
 cameraDevice.close()
 scope.cancel()
 hlsServer.stop()
 orientationListener.disable()
 Log.d("HLS", "🛑 Activity destroyed")
 }
}



I have three examples of ffmpeg commands.


- One command segments into DASH, but the camera does not have the correct rotation.
- One command segments into HLS without re-encoding, with 5-second segments; it's fast but does not have the correct rotation.
- One command segments into HLS with re-encoding, which applies the rotation. It's too slow for 5-second segments, so a 1-second segment was chosen.








Note:

- In the second command (HLS without re-encoding, 5-second segments), the output is fMP4. To get the correct rotation, I serve a preconfigured init.mp4 file when it is requested over HTTP (see the commented-out code in the HLS server).
- In the third command (HLS with re-encoding, which applies the rotation), the output is TS.






-
VLC dead input for RTP stream
27 March, by CaptainCheese

I'm working on creating an RTP stream that's meant to display live waveform data from Pioneer prolink players. The motivation for sending this video out is to be able to receive it in a Flutter frontend. I was initially just sending a base-24 encoding of the raw packed ARGB ints per frame across a Kafka topic, but processing that data in Flutter proved untenable and was bogging down the main UI thread. I'm not sure this is the most optimal way of going about it, but I'm just trying to get anything to work if it means some speedup on the frontend. The issue with the following implementation is that when I run

vlc --rtsp-timeout=120000 --network-caching=30000 -vvvv stream_1.sdp

where

% cat stream_1.sdp
v=0
o=- 0 1 IN IP4 127.0.0.1
s=RTP Stream
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat
m=video 5007 RTP/AVP 96
a=rtpmap:96 H264/90000



I see (among other questionable logs) the following:


[0000000144c44d10] live555 demux error: no data received in 10s, aborting
[00000001430ee2f0] main input debug: EOF reached
[0000000144b160c0] main decoder debug: killing decoder fourcc `h264'
[0000000144b160c0] main decoder debug: removing module "videotoolbox"
[0000000144b164a0] main packetizer debug: removing module "h264"
[0000000144c44d10] main demux debug: removing module "live555"
[0000000144c45bb0] main stream debug: removing module "record"
[0000000144a64960] main stream debug: removing module "cache_read"
[0000000144c29c00] main stream debug: removing module "filesystem"
[00000001430ee2f0] main input debug: Program doesn't contain anymore ES
[0000000144806260] main playlist debug: dead input
[0000000144806260] main playlist debug: changing item without a request (current 0/1)
[0000000144806260] main playlist debug: nothing to play
[0000000142e083c0] macosx interface debug: Playback has been ended
[0000000142e083c0] macosx interface debug: Releasing IOKit system sleep blocker (37463)



This is sort of confusing because when I run
ffmpeg -protocol_whitelist file,crypto,data,rtp,udp -i stream_1.sdp -vcodec libx264 -f null -

I see a number of logs like:

[h264 @ 0x139304080] non-existing PPS 0 referenced
 Last message repeated 1 times
[h264 @ 0x139304080] decode_slice_header error
[h264 @ 0x139304080] no frame!



After that, I see the stream is received and I start getting telemetry on it:


Input #0, sdp, from 'stream_1.sdp':
 Metadata:
 title : RTP Stream
 Duration: N/A, start: 0.016667, bitrate: N/A
 Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 1200x200, 60 fps, 60 tbr, 90k tbn
Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[libx264 @ 0x107f04f40] using cpu capabilities: ARMv8 NEON
[libx264 @ 0x107f04f40] profile High, level 3.1, 4:2:0, 8-bit
Output #0, null, to 'pipe:':
 Metadata:
 title : RTP Stream
 encoder : Lavf61.7.100
 Stream #0:0: Video: h264, yuv420p(tv, progressive), 1200x200, q=2-31, 60 fps, 60 tbn
 Metadata:
 encoder : Lavc61.19.101 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
[out#0/null @ 0x60000069c000] video:144KiB audio:0KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: unknown
frame= 1404 fps= 49 q=-1.0 Lsize=N/A time=00:00:23.88 bitrate=N/A speed=0.834x



Not sure why VLC is turning me down like some kind of Berghain bouncer that lets nobody in the entire night.


I initially tried just converting the ARGB ints to a YUV420p buffer and using that to create the Frame objects, but I couldn't for the life of me figure out how to initialize it properly; my attempts kept spitting out garbled junk.
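
To illustrate what I was attempting, here is a rough sketch of the kind of ARGB-to-I420 conversion I had in mind (BT.601 integer math, one chroma sample per 2x2 block; this is illustrative rather than working code from my project, and wiring the resulting buffer into FFmpegFrameRecorder with the right strides is exactly where I got stuck):

// Packed ARGB ints -> planar YUV420p (I420) bytes: full Y plane, then quarter-size U and V planes.
// Assumes width and height are even.
public static byte[] argbToI420(int[] argb, int width, int height) {
    byte[] yuv = new byte[width * height * 3 / 2];
    int yIndex = 0;
    int uIndex = width * height;                    // U plane starts after Y
    int vIndex = uIndex + (width * height) / 4;     // V plane starts after U

    for (int row = 0; row < height; row++) {
        for (int col = 0; col < width; col++) {
            int p = argb[row * width + col];
            int r = (p >> 16) & 0xFF;
            int g = (p >> 8) & 0xFF;
            int b = p & 0xFF;

            // BT.601 integer approximation.
            int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
            yuv[yIndex++] = (byte) Math.max(0, Math.min(255, y));

            // One U/V sample per 2x2 block (top-left pixel of the block).
            if (row % 2 == 0 && col % 2 == 0) {
                int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                yuv[uIndex++] = (byte) Math.max(0, Math.min(255, u));
                yuv[vIndex++] = (byte) Math.max(0, Math.min(255, v));
            }
        }
    }
    return yuv;
}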


Please go easy on me; I've made an unhealthy habit of resolving nearly all of my coding questions by simply lurking the internet for answers, but that's not really helping me solve this issue.


Here's the Java I'm working on (the meat of the RTP comms occurs within updateWaveformForPlayer()):

package com.bugbytz.prolink;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FFmpegLogCallback;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameGrabber;
import org.deepsymmetry.beatlink.CdjStatus;
import org.deepsymmetry.beatlink.DeviceAnnouncement;
import org.deepsymmetry.beatlink.DeviceAnnouncementAdapter;
import org.deepsymmetry.beatlink.DeviceFinder;
import org.deepsymmetry.beatlink.Util;
import org.deepsymmetry.beatlink.VirtualCdj;
import org.deepsymmetry.beatlink.data.BeatGridFinder;
import org.deepsymmetry.beatlink.data.CrateDigger;
import org.deepsymmetry.beatlink.data.MetadataFinder;
import org.deepsymmetry.beatlink.data.TimeFinder;
import org.deepsymmetry.beatlink.data.WaveformDetail;
import org.deepsymmetry.beatlink.data.WaveformDetailComponent;
import org.deepsymmetry.beatlink.data.WaveformFinder;

import java.awt.*;
import java.awt.image.BufferedImage;
import java.io.File;
import java.nio.ByteBuffer;
import java.text.DecimalFormat;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Properties;
import java.util.Set;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_RGB24;

public class App {
 public static ArrayList<Track> tracks = new ArrayList<>();
 public static boolean dbRead = false;
 public static Properties props = new Properties();
 private static Map<Integer, FFmpegFrameRecorder> recorders = new HashMap<>();
 private static Map<Integer, Integer> frameCount = new HashMap<>();

 private static final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
 private static final int FPS = 60;
 private static final int FRAME_INTERVAL_MS = 1000 / FPS;

 private static Map<Integer, ScheduledFuture<?>> schedules = new HashMap<>();

 private static Set<Integer> streamingPlayers = new HashSet<>();

 public static String byteArrayToMacString(byte[] macBytes) {
 StringBuilder sb = new StringBuilder();
 for (int i = 0; i < macBytes.length; i++) {
 sb.append(String.format("%02X%s", macBytes[i], (i < macBytes.length - 1) ? ":" : ""));
 }
 return sb.toString();
 }

 private static void updateWaveformForPlayer(int player) throws Exception {
 Integer frame_for_player = frameCount.get(player);
 if (frame_for_player == null) {
 frame_for_player = 0;
 frameCount.putIfAbsent(player, frame_for_player);
 }

 if (!WaveformFinder.getInstance().isRunning()) {
 WaveformFinder.getInstance().start();
 }
 WaveformDetail detail = WaveformFinder.getInstance().getLatestDetailFor(player);

 if (detail != null) {
 WaveformDetailComponent component = (WaveformDetailComponent) detail.createViewComponent(
 MetadataFinder.getInstance().getLatestMetadataFor(player),
 BeatGridFinder.getInstance().getLatestBeatGridFor(player)
 );
 component.setMonitoredPlayer(player);
 component.setPlaybackState(player, TimeFinder.getInstance().getTimeFor(player), true);
 component.setAutoScroll(true);
 int width = 1200;
 int height = 200;
 Dimension dimension = new Dimension(width, height);
 component.setPreferredSize(dimension);
 component.setSize(dimension);
 component.setScale(1);
 component.doLayout();

 // Create a fresh BufferedImage and clear it before rendering
 BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
 Graphics2D g = image.createGraphics();
 g.clearRect(0, 0, width, height); // Clear any old content

 // Draw waveform into the BufferedImage
 component.paint(g);
 g.dispose();

 int port = 5004 + player;
 String inputFile = port + "_" + frame_for_player + ".mp4";
 // Initialize the FFmpegFrameRecorder for YUV420P
 FFmpegFrameRecorder recorder_file = new FFmpegFrameRecorder(inputFile, width, height);
 FFmpegLogCallback.set(); // Enable FFmpeg logging for debugging
 recorder_file.setFormat("mp4");
 recorder_file.setVideoCodec(avcodec.AV_CODEC_ID_H264);
 recorder_file.setPixelFormat(avutil.AV_PIX_FMT_YUV420P); // Use YUV420P format directly
 recorder_file.setFrameRate(FPS);

 // Set video options
 recorder_file.setVideoOption("preset", "ultrafast");
 recorder_file.setVideoOption("tune", "zerolatency");
 recorder_file.setVideoOption("x264-params", "repeat-headers=1");
 recorder_file.setGopSize(FPS);
 try {
 recorder_file.start(); // Ensure this is called before recording any frames
 System.out.println("Recorder started successfully for player: " + player);
 } catch (org.bytedeco.javacv.FFmpegFrameRecorder.Exception e) {
 e.printStackTrace();
 }

 // Get all pixels in one call
 int[] pixels = new int[width * height];
 image.getRGB(0, 0, width, height, pixels, 0, width);
 recorder_file.recordImage(width,height,Frame.DEPTH_UBYTE,1,3 * width, AV_PIX_FMT_RGB24, ByteBuffer.wrap(argbToByteArray(pixels, width, height)));
 recorder_file.stop();
 recorder_file.release();
 final FFmpegFrameRecorder recorder = recorders.get(player);
 FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inputFile);


 try {
 grabber.start();
 } catch (Exception e) {
 e.printStackTrace();
 }
 if (recorder == null) {
 try {
 String outputStream = "rtp://127.0.0.1:" + port;
 FFmpegFrameRecorder initial_recorder = new FFmpegFrameRecorder(outputStream, grabber.getImageWidth(), grabber.getImageHeight());
 initial_recorder.setFormat("rtp");
 initial_recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
 initial_recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
 initial_recorder.setFrameRate(grabber.getFrameRate());
 initial_recorder.setGopSize(FPS);
 initial_recorder.setVideoOption("x264-params", "keyint=60");
 initial_recorder.setVideoOption("rtsp_transport", "tcp");
 initial_recorder.start();
 recorders.putIfAbsent(player, initial_recorder);
 frameCount.putIfAbsent(player, 0);
 putToRTP(player, grabber, initial_recorder);
 }
 catch (Exception e) {
 e.printStackTrace();
 }
 }
 else {
 putToRTP(player, grabber, recorder);
 }
 File file = new File(inputFile);
 if (file.exists() && file.delete()) {
 System.out.println("Successfully deleted file: " + inputFile);
 } else {
 System.out.println("Failed to delete file: " + inputFile);
 }
 }
 }

 public static void putToRTP(int player, FFmpegFrameGrabber grabber, FFmpegFrameRecorder recorder) throws FrameGrabber.Exception {
 final Frame frame = grabber.grabFrame();
 int frameCount_local = frameCount.get(player);
 frame.keyFrame = frameCount_local++ % FPS == 0;
 frameCount.put(player, frameCount_local);
 try {
 recorder.record(frame);
 } catch (FFmpegFrameRecorder.Exception e) {
 throw new RuntimeException(e);
 }
 }
 public static byte[] argbToByteArray(int[] argb, int width, int height) {
 int totalPixels = width * height;
 byte[] byteArray = new byte[totalPixels * 3]; // 3 bytes per pixel (RGB)

 for (int i = 0; i < totalPixels; i++) {
 int argbPixel = argb[i];

 byteArray[i * 3] = (byte) ((argbPixel >> 16) & 0xFF); // Red
 byteArray[i * 3 + 1] = (byte) ((argbPixel >> 8) & 0xFF); // Green
 byteArray[i * 3 + 2] = (byte) (argbPixel & 0xFF); // Blue
 }

 return byteArray;
 }


 public static void main(String[] args) throws Exception {
 VirtualCdj.getInstance().setDeviceNumber((byte) 4);
 CrateDigger.getInstance().addDatabaseListener(new DBService());
 props.put("bootstrap.servers", "localhost:9092");
 props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
 props.put("value.serializer", "com.bugbytz.prolink.CustomSerializer");
 props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "20971520");

 VirtualCdj.getInstance().addUpdateListener(update -> {
 if (update instanceof CdjStatus) {
 try (Producer<String, DeviceStatus> producer = new KafkaProducer<>(props)) {
 DecimalFormat df_obj = new DecimalFormat("#.##");
 DeviceStatus deviceStatus = new DeviceStatus(
 update.getDeviceNumber(),
 ((CdjStatus) update).isPlaying() || !((CdjStatus) update).isPaused(),
 ((CdjStatus) update).getBeatNumber(),
 update.getBeatWithinBar(),
 Double.parseDouble(df_obj.format(update.getEffectiveTempo())),
 Double.parseDouble(df_obj.format(Util.pitchToPercentage(update.getPitch()))),
 update.getAddress().getHostAddress(),
 byteArrayToMacString(DeviceFinder.getInstance().getLatestAnnouncementFrom(update.getDeviceNumber()).getHardwareAddress()),
 ((CdjStatus) update).getRekordboxId(),
 update.getDeviceName()
 );
 ProducerRecord<String, DeviceStatus> record = new ProducerRecord<>("device-status", "device-" + update.getDeviceNumber(), deviceStatus);
 try {
 producer.send(record).get();
 } catch (InterruptedException ex) {
 throw new RuntimeException(ex);
 } catch (ExecutionException ex) {
 throw new RuntimeException(ex);
 }
 producer.flush();
 if (!WaveformFinder.getInstance().isRunning()) {
 try {
 WaveformFinder.getInstance().start();
 } catch (Exception ex) {
 throw new RuntimeException(ex);
 }
 }
 }
 }
 });
 DeviceFinder.getInstance().addDeviceAnnouncementListener(new DeviceAnnouncementAdapter() {
 @Override
 public void deviceFound(DeviceAnnouncement announcement) {
 if (!streamingPlayers.contains(announcement.getDeviceNumber())) {
 streamingPlayers.add(announcement.getDeviceNumber());
 schedules.putIfAbsent(announcement.getDeviceNumber(), scheduler.scheduleAtFixedRate(() -> {
 try {
 Runnable task = () -> {
 try {
 updateWaveformForPlayer(announcement.getDeviceNumber());
 } catch (InterruptedException e) {
 System.out.println("Thread interrupted");
 } catch (Exception e) {
 throw new RuntimeException(e);
 }
 System.out.println("Lambda thread work completed!");
 };
 task.run();
 } catch (Exception e) {
 e.printStackTrace();
 }
 }, 0, FRAME_INTERVAL_MS, TimeUnit.MILLISECONDS));
 }
 }

 @Override
 public void deviceLost(DeviceAnnouncement announcement) {
 if (streamingPlayers.contains(announcement.getDeviceNumber())) {
 schedules.get(announcement.getDeviceNumber()).cancel(true);
 streamingPlayers.remove(announcement.getDeviceNumber());
 }
 }
 });
 BeatGridFinder.getInstance().start();
 MetadataFinder.getInstance().start();
 VirtualCdj.getInstance().start();
 TimeFinder.getInstance().start();
 DeviceFinder.getInstance().start();
 CrateDigger.getInstance().start();

 try {
 LoadCommandConsumer consumer = new LoadCommandConsumer("localhost:9092", "load-command-group");
 Thread consumerThread = new Thread(consumer::startConsuming);
 consumerThread.start();

 Runtime.getRuntime().addShutdownHook(new Thread(() -> {
 consumer.shutdown();
 try {
 consumerThread.join();
 } catch (InterruptedException e) {
 Thread.currentThread().interrupt();
 }
 }));
 Thread.sleep(60000);
 } catch (InterruptedException e) {
 System.out.println("Interrupted, exiting.");
 }
 }
}