
Other articles (106)
-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page -
Adding notes and captions to images
7 February 2011
To add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights to create, edit and delete notes. By default, only the site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...) -
Configurable image and logo sizes
9 February 2011
In many places on the site, logos and images are resized to fit the slots defined by the themes. Since all of these sizes can change from one theme to another, they can be defined directly in the theme, which spares the user from having to configure them manually after changing the appearance of the site.
These image sizes are also available in the MediaSPIP Core specific configuration. The maximum size of the site logo, in pixels (...)
On other sites (9298)
-
Unable to scrub video after remuxing multiple mpeg-ts to mp4
21 July 2022, by Jona
I have written custom code to concatenate multiple MPEG-TS files into an MP4 video file, using the remuxing code sample as a reference.


I'm having issues with the final output. I'm unable to scrub the video on QuickTime. The final output seems to be missing something.


Using ffprobe, I compared the results of my custom remuxer code with the output of the following terminal command:


ffmpeg -i "concat:input1.ts|input2.ts|input3.ts" -c copy output.mp4



To my surprise, the videos look almost identical, but I'm noticing some differences in the reported distance and keyframe values. I also did a hex comparison and see minor differences at the end of the file, which tells me I've muxed the frames correctly. What could I be missing? Any tips or ideas would be really helpful!
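
One way to narrow down the difference (a hedged suggestion, using only standard ffprobe options) is to dump the per-packet timestamps and keyframe flags of both MP4 files and diff the two listings:

ffprobe -v error -select_streams v:0 -show_entries packet=pts_time,dts_time,flags -of csv output.mp4

A mismatch in the first dts_time values, or in which packets are flagged as keyframes, would point at the timestamp and seek-table handling in the custom remuxer; that is an assumption about a likely cause, not a diagnosis.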


FFmpeg terminal remuxed video [WORKING VIDEO]

Custom remuxer code video [SCRUBBING BROKEN]

App.mp4 vs ffmpeg.mp4 remuxed video

-
Merged Video Contains Inverted Clips After First Video Ends
3 February, by Nikunj Agrawal
I am working on a Flutter application that merges multiple videos using ffmpeg_kit_flutter. However, after merging, I notice that the second video (and any subsequent ones) appears inverted or rotated in the final output.

Issue Details:

- The first video appears normal.
- The videos can be recorded using both the front and back cameras.
- The second (and later) videos are flipped or rotated upside down.
- This happens after merging using ffmpeg_kit_flutter.

Question:
How can I correctly merge multiple videos in Flutter without rotation issues? Is there a way to normalize video orientation before merging using ffmpeg_kit_flutter?

Any help would be appreciated! 🚀


Code:


import 'dart:io';
import 'dart:math';

import 'package:camera/camera.dart';
import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/return_code.dart';
import 'package:flutter/material.dart';
import 'package:path_provider/path_provider.dart';
import 'package:permission_handler/permission_handler.dart';
import 'package:record/record.dart';
import 'package:videotest/video_player.dart';

class MergeVideoRecording extends StatefulWidget {
 const MergeVideoRecording({super.key});

 @override
 State<MergeVideoRecording> createState() => _MergeVideoRecordingState();
}

class _MergeVideoRecordingState extends State<MergeVideoRecording> {
 CameraController? _cameraController;
 final AudioRecorder _audioRecorder = AudioRecorder();

 bool _isRecording = false;
 String? _videoPath;
 String? _audioPath;
 List<CameraDescription> _cameras = [];
 int _currentCameraIndex = 0;
 final List<String> _recordedVideos = [];

 @override
 Widget build(BuildContext context) {
 return Scaffold(
 body: Column(
 mainAxisAlignment: MainAxisAlignment.center,
 children: [
 _cameraController != null && _cameraController!.value.isInitialized
 ? SizedBox(
 width: MediaQuery.of(context).size.width * 0.4,
 height: MediaQuery.of(context).size.height * 0.3,
 child: Stack(
 children: [
 ClipRRect(
 borderRadius: BorderRadius.circular(16),
 child: SizedBox(
 width: MediaQuery.of(context).size.width * 0.4,
 height: MediaQuery.of(context).size.height * 0.3,
 child: Transform(
 alignment: Alignment.center,
 transform:
 _cameras[_currentCameraIndex].lensDirection ==
 CameraLensDirection.front
 ? Matrix4.rotationY(pi)
 : Matrix4.identity(),
 child: CameraPreview(_cameraController!),
 ),
 ),
 ),
 Align(
 alignment: Alignment.topRight,
 child: InkWell(
 onTap: _switchCamera,
 child: const Padding(
 padding: EdgeInsets.all(8.0),
 child: CircleAvatar(
 radius: 18,
 backgroundColor: Colors.white,
 child: Icon(
 Icons.flip_camera_android,
 color: Colors.black,
 ),
 ),
 ),
 ),
 ),
 ],
 ),
 )
 : const CircularProgressIndicator(),
 const SizedBox(height: 16),
 Row(
 mainAxisAlignment: MainAxisAlignment.center,
 children: [
 FloatingActionButton(
 heroTag: 'record_button',
 onPressed: _toggleRecording,
 child: Icon(
 _isRecording ? Icons.stop : Icons.video_camera_back,
 ),
 ),
 const SizedBox(
 width: 50,
 ),
 FloatingActionButton(
 heroTag: 'merge_button',
 onPressed: _mergeVideos,
 child: const Icon(
 Icons.merge,
 ),
 ),
 ],
 ),
 if (!_isRecording)
 ListView.builder(
 shrinkWrap: true,
 itemCount: _recordedVideos.length,
 itemBuilder: (context, index) => InkWell(
 onTap: () {
 Navigator.push(
 context,
 MaterialPageRoute(
 builder: (context) => VideoPlayerScreen(
 videoPath: _recordedVideos[index],
 ),
 ),
 );
 },
 child: ListTile(
 title: Text('Video ${index + 1}'),
 subtitle: Text('Path ${_recordedVideos[index]}'),
 trailing: const Icon(Icons.play_arrow),
 ),
 ),
 ),
 ],
 ),
 );
 }

 @override
 void dispose() {
 _cameraController?.dispose();
 _audioRecorder.dispose();
 super.dispose();
 }

 @override
 void initState() {
 super.initState();
 _initializeDevices();
 }

 Future<void> _initializeCameraController(CameraDescription camera) async {
 _cameraController = CameraController(
 camera,
 ResolutionPreset.high,
 enableAudio: true,
 imageFormatGroup: ImageFormatGroup.yuv420, // Add this line
 );

 await _cameraController!.initialize();
 await _cameraController!.setExposureMode(ExposureMode.auto);
 await _cameraController!.setFocusMode(FocusMode.auto);
 setState(() {});
 }

 Future<void> _initializeDevices() async {
 final cameraStatus = await Permission.camera.request();
 final micStatus = await Permission.microphone.request();

 if (!cameraStatus.isGranted || !micStatus.isGranted) {
 _showError('Camera and microphone permissions required');
 return;
 }

 _cameras = await availableCameras();
 if (_cameras.isNotEmpty) {
 final frontCameraIndex = _cameras.indexWhere(
 (camera) => camera.lensDirection == CameraLensDirection.front);
 _currentCameraIndex = frontCameraIndex != -1 ? frontCameraIndex : 0;
 await _initializeCameraController(_cameras[_currentCameraIndex]);
 }
 }

 // Merge video
 Future<void> _mergeVideos() async {
 if (_recordedVideos.isEmpty) {
 _showError('No videos to merge');
 return;
 }

 try {
 // Debug logging
 print('Starting merge process');
 print('Number of videos to merge: ${_recordedVideos.length}');
 for (var i = 0; i < _recordedVideos.length; i++) {
 final file = File(_recordedVideos[i]);
 final exists = await file.exists();
 final size = exists ? await file.length() : 0;
 print('Video $i: ${_recordedVideos[i]}');
 print('Exists: $exists, Size: $size bytes');
 }

 final Directory appDir = await getApplicationDocumentsDirectory();
 final String outputPath =
 '${appDir.path}/merged_${DateTime.now().millisecondsSinceEpoch}.mp4';
 final String listFilePath = '${appDir.path}/list.txt';

 print('Output path: $outputPath');
 print('List file path: $listFilePath');

 // Create and verify list file
 final listFile = File(listFilePath);
 final fileContent = _recordedVideos
 .map((path) => "file '${path.replaceAll("'", "'\\''")}'")
 .join('\n');
 await listFile.writeAsString(fileContent);

 print('List file content:');
 print(await listFile.readAsString());

 // Simpler FFmpeg command for testing
 final command = '''
 -f concat
 -safe 0
 -i "$listFilePath"
 -c copy
 -y
 "$outputPath"
 '''
 .trim()
 .replaceAll('\n', ' ');

 print('Executing FFmpeg command: $command');

 final session = await FFmpegKit.execute(command);
 final returnCode = await session.getReturnCode();
 final logs = await session.getAllLogsAsString();
 final failStackTrace = await session.getFailStackTrace();

 print('FFmpeg return code: ${returnCode?.getValue() ?? "null"}');
 print('FFmpeg logs: $logs');
 if (failStackTrace != null) {
 print('FFmpeg fail stack trace: $failStackTrace');
 }

 if (ReturnCode.isSuccess(returnCode)) {
 final outputFile = File(outputPath);
 final outputExists = await outputFile.exists();
 final outputSize = outputExists ? await outputFile.length() : 0;

 print('Output file exists: $outputExists');
 print('Output file size: $outputSize bytes');

 if (outputExists && outputSize > 0) {
 setState(() => _recordedVideos.add(outputPath));
 _showSuccess('Videos merged successfully');
 } else {
 _showError('Merged file is empty or not created');
 }
 } else {
 _showError('Failed to merge videos. Check logs for details.');
 }

 // Clean up
 try {
 await listFile.delete();
 print('List file cleaned up successfully');
 } catch (e) {
 print('Failed to delete list file: $e');
 }
 } catch (e, s) {
 print('Error during merge: $e');
 print('Stack trace: $s');
 _showError('Error merging videos: ${e.toString()}');
 }
 }

 void _showError(String message) {
 ScaffoldMessenger.of(context).showSnackBar(
 SnackBar(content: Text(message), backgroundColor: Colors.red),
 );
 }

 void _showSuccess(String message) {
 ScaffoldMessenger.of(context).showSnackBar(
 SnackBar(content: Text(message), backgroundColor: Colors.green),
 );
 }

 Future<void> _startAudioRecording() async {
 try {
 final Directory tempDir = await getTemporaryDirectory();
 final audioPath = '${tempDir.path}/recording.wav';
 await _audioRecorder.start(const RecordConfig(), path: audioPath);
 setState(() => _isRecording = true);
 } catch (e) {
 _showError('Recording start error: $e');
 }
 }

 Future<void> _startVideoRecording() async {
 try {
 await _cameraController!.startVideoRecording();
 setState(() => _isRecording = true);
 } catch (e) {
 _showError('Recording start error: $e');
 }
 }

 Future<void> _stopAndSaveAudioRecording() async {
 _audioPath = await _audioRecorder.stop();
 if (_audioPath != null) {
 final Directory appDir = await getApplicationDocumentsDirectory();
 final timestamp = DateTime.now().millisecondsSinceEpoch;
 final String audioFileName = 'audio_$timestamp.wav';
 await File(_audioPath!).copy('${appDir.path}/$audioFileName');
 _showSuccess('Saved: $audioFileName');
 }
 }

 Future<void> _stopAndSaveVideoRecording() async {
 try {
 final video = await _cameraController!.stopVideoRecording();
 _videoPath = video.path;

 if (_videoPath != null) {
 final Directory appDir = await getApplicationDocumentsDirectory();
 final timestamp = DateTime.now().millisecondsSinceEpoch;
 final String videoFileName = 'video_$timestamp.mp4';
 final savedVideoPath = '${appDir.path}/$videoFileName';
 await File(_videoPath!).copy(savedVideoPath);

 setState(() {
 _recordedVideos.add(savedVideoPath);
 _isRecording = false;
 });

 _showSuccess('Saved: $videoFileName');
 }
 } catch (e) {
 _showError('Recording stop error: $e');
 }
 }

 Future<void> _switchCamera() async {
 if (_cameras.length <= 1) return;

 if (_isRecording) {
 await _stopAndSaveVideoRecording();
 _currentCameraIndex = (_currentCameraIndex + 1) % _cameras.length;
 await _initializeCameraController(_cameras[_currentCameraIndex]);
 await _startVideoRecording();
 } else {
 _currentCameraIndex = (_currentCameraIndex + 1) % _cameras.length;
 await _initializeCameraController(_cameras[_currentCameraIndex]);
 }
 }

 Future<void> _toggleRecording() async {
 if (_cameraController == null) return;

 if (_isRecording) {
 await _stopAndSaveVideoRecording();
 await _stopAndSaveAudioRecording();
 } else {
 _startVideoRecording();
 _startAudioRecording();
 setState(() => _recordedVideos.clear());
 }
 }
}
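
Since the question asks about normalizing orientation before merging, here is a minimal sketch of one possible approach, not a confirmed fix: re-encode each recorded clip before writing list.txt, so ffmpeg applies any rotation metadata to the frames themselves, and the -c copy concat then sees uniformly oriented inputs. _normalizeClip is a hypothetical helper meant to live inside _MergeVideoRecordingState; the mpeg4/aac codec choices are assumptions, picked only because they are ffmpeg-native encoders that should not require external libraries.

// Hypothetical helper (a sketch, not a verified fix): re-encode one clip so
// that its rotation metadata is baked into the pixels before concatenation.
Future<String> _normalizeClip(String inputPath, int index) async {
  final Directory appDir = await getApplicationDocumentsDirectory();
  final String outputPath = '${appDir.path}/normalized_$index.mp4';

  // Re-encoding (instead of stream copy) lets ffmpeg honour the display
  // rotation of each recording, so front/back camera clips end up upright.
  // mpeg4/aac are assumed here only because they are ffmpeg-native encoders.
  final command = '-i "$inputPath" -c:v mpeg4 -q:v 3 -c:a aac -y "$outputPath"';

  final session = await FFmpegKit.execute(command);
  final returnCode = await session.getReturnCode();
  if (!ReturnCode.isSuccess(returnCode)) {
    throw Exception('Failed to normalize clip: $inputPath');
  }
  return outputPath;
}

_mergeVideos could then map _recordedVideos through this helper and write the normalized paths into list.txt, keeping -c copy only for the final concat step. If the two cameras record at different resolutions, a scale filter would also be needed before a -c copy concatenation is safe.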


-
Flutter: How to use "ffmpeg_kit_flutter" to merge videos?
28 May 2024, by Hani Kanakri
I am using "ffmpeg_kit_flutter" to merge two videos with this code:



import 'dart:io';

import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/abstract_session.dart';
import 'package:ffmpeg_kit_flutter/return_code.dart';
import 'package:flutter/material.dart';
import 'package:flutter_bloc/flutter_bloc.dart';
import 'package:wechat_assets_picker/wechat_assets_picker.dart';

import '/features/merge_videos/cubit/merge_videos_state.dart';

class MergeVideosCubit extends Cubit<MergeVideosState> {
 MergeVideosCubit(this.originalFile) : super(InitialMergeVideos());
 final File? originalFile;

 Future<void> selectVideo(BuildContext context) async {
 final List<AssetEntity>? videos = await AssetPicker.pickAssets(
 context,
 pickerConfig: const AssetPickerConfig(requestType: RequestType.video),
 );

 if (videos != null && videos.isNotEmpty) {
 for (AssetEntity asset in videos) {
 File? videoFile = await asset.file;
 if (videoFile != null) {
 print('Selected Asset Path: ${videoFile.path}');
 mergeVideos(originalFile!.path, videoFile.path);
 }
 }
 }
 }

 Future<void> mergeVideos(String inputPath1, String inputPath2) async {
 final String outputPath = "/storage/emulated/0/merged_video_${now()}.mp4";
 // final String command =
 // '-i $inputPath1 -i $inputPath2 -filter_complex "[0:v][0:a][1:v][1:a] concat=n=2:v=1:a=1[outv][outa]" -map "[outv]" -map "[outa]" -y $outputPath';
 final String command =
 '-i $inputPath1 -i $inputPath2 -filter_complex "[0:v][1:v]concat=n=2:v=1:a=0[outv]" -map "[outv]" -y $outputPath';
 print("FFmpeg process starting with command: $command");
 print(command);
 print("LOADING LOADING LOADING LOADING LOADING LOADING LOADING MERGE");
 emit(LoadMergeVideos());
 await FFmpegKit.execute(command).then((value) async {
 await value.getDuration();
 var id = await value.getSessionId();

 print(value);
 print(id);
 print(await value.getDuration());
 });
 await FFmpegKit.executeAsync(command, (session) async {
 final returnCode = await session.getReturnCode();
 await session.getSessionId();
 print(await session.getSessionId());

 if (ReturnCode.isSuccess(returnCode)) {
 print("SUCCESS: Video merged successfully at $outputPath");
 print("SUCCESS SUCCESS SUCCESS SUCCESS SUCCESS SUCCESS MERGE");
 emit(SuccessMergeVideos());

 } else if (ReturnCode.isCancel(returnCode)) {
 print("CANCELLED: Video merging was cancelled.");
 print("CANCEL CANCEL CANCEL CANCEL CANCEL CANCEL CANCEL MERGE");
 emit(CancelMergeVideos());

 } else {
 print("ERROR: Failed to merge videos.");
 print("ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR MERGE");
 emit(ErrorMergeVideos());

 final failLog = await session.getFailStackTrace();

 print("FFmpeg Failure Log: $failLog");
 }
 });
 }

 String now() {
 final DateTime now = DateTime.now();
 return "${now.year}${now.month}${now.day}_${now.hour}${now.minute}${now.second}";
 }
}


The console output when I run the code:



D/EGL_emulation(23858): app_time_stats: avg=2379.66ms min=5.87ms max=23160.09ms count=10
I/PhotoManager(23858): uri: content://media/external/file
I/PhotoManager(23858): projection: _display_name, _data, _id, title, bucket_id, bucket_display_name, width, height, orientation, date_added, date_modified, mime_type, datetaken, duration, media_type, relative_path
I/PhotoManager(23858): selection: _id = ?
I/PhotoManager(23858): selectionArgs: 1000000039
I/PhotoManager(23858): sortOrder: null
I/PhotoManager(23858): sql: _id = 1000000039
I/PhotoManager(23858): cursor count: 1
I/flutter (23858): Selected Asset Path: /storage/emulated/0/Movies/VID_20240512_115116.mp4
I/flutter (23858): FFmpeg process starting with command: -i /storage/emulated/0/Movies/VID_20240512_115128.mp4 -i /storage/emulated/0/Movies/VID_20240512_115116.mp4 -filter_complex "[0:v][1:v]concat=n=2:v=1:a=0[outv]" -map "[outv]" -y /storage/emulated/0/merged_video_2024513_122719.mp4
I/flutter (23858): -i /storage/emulated/0/Movies/VID_20240512_115128.mp4 -i /storage/emulated/0/Movies/VID_20240512_115116.mp4 -filter_complex "[0:v][1:v]concat=n=2:v=1:a=0[outv]" -map "[outv]" -y /storage/emulated/0/merged_video_2024513_122719.mp4
I/flutter (23858): LOADING LOADING LOADING LOADING LOADING LOADING LOADING MERGE
I/flutter (23858): Instance of 'FFmpegSession'
I/flutter (23858): 1
I/flutter (23858): 246
I/flutter (23858): 2
I/flutter (23858): ERROR: Failed to merge videos.
I/flutter (23858): ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR MERGE
D/EGL_emulation(23858): app_time_stats: avg=87.27ms min=4.92ms max=319.92ms count=13
I/flutter (23858): FFmpeg Failure Log: null



In the mergeVideos function, the returnCode returns the value "1". When I look at the package code:


getState() async {
 try {
 return _platform
 .abstractSessionGetState(this.getSessionId())
 .then((state) {
 switch (state) {
 case 0:
 return SessionState.created;
 case 1:
 return SessionState.running;
 case 2:
 return SessionState.failed;
 case 3:
 default:
 return SessionState.completed;
 }
 });
 } on PlatformException catch (e, stack) {
 print("Plugin getState error: ${e.message}");
 return Future.error("getState failed.", stack);
 }
}



This means the session is still running, but my code does not wait until the merge completes.


How can I fix that?
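
For the waiting part specifically, a minimal sketch (assuming the ffmpeg_kit_flutter API already used above, where FFmpegKit.execute completes only once the run has finished) would be to use a single execute call and check the return code after awaiting it, instead of also starting a second run with executeAsync:

// Hypothetical wrapper (a sketch): run one FFmpeg session and wait for it.
Future<bool> runMergeCommand(String command) async {
  final session = await FFmpegKit.execute(command); // resolves when ffmpeg is done
  final returnCode = await session.getReturnCode();

  if (ReturnCode.isSuccess(returnCode)) {
    return true;
  }
  // The full log usually explains why the filtergraph or output path failed.
  print(await session.getAllLogsAsString());
  return false;
}

Calling both execute and executeAsync with the same command, as the code above does, starts two separate sessions, which is why two different session ids (1 and 2) appear in the log.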


But I think my problem is in the command (concat):



-i /storage/emulated/0/Movies/VID_20240512_115128.mp4 -i /storage/emulated/0/Movies/VID_20240512_115116.mp4 -filter_complex "[0:v][1:v]concat=n=2:v=1:a=0[outv]" -map "[outv]" -y /storage/emulated/0/merged_video_2024513_122719.mp4



(This is the command when I run the code.)
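
As for the concat filter itself, it requires all video inputs to share the same resolution, pixel format and SAR, so a hedged variant of the command that first scales both inputs to a common size (1280x720 is only an assumption here) could look like:

-i /storage/emulated/0/Movies/VID_20240512_115128.mp4 -i /storage/emulated/0/Movies/VID_20240512_115116.mp4 -filter_complex "[0:v]scale=1280:720,setsar=1[v0];[1:v]scale=1280:720,setsar=1[v1];[v0][v1]concat=n=2:v=1:a=0[outv]" -map "[outv]" -y /storage/emulated/0/merged_video.mp4

Writing directly to /storage/emulated/0 can also fail on recent Android versions without the right storage permissions, so the full ffmpeg log from getAllLogsAsString() is the first thing to check.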