
Media (91)
-
DJ Z-trip - Victory Lap: The Obama Mix Pt. 2
15 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Matmos - Action at a Distance
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
DJ Dolores - Oslodum 2004 (includes (cc) sample of “Oslodum” by Gilberto Gil)
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Danger Mouse & Jemini - What U Sittin’ On? (starring Cee Lo and Tha Alkaholiks)
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Cornelius - Wataridori 2
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
The Rapture - Sister Saviour (Blackstrobe Remix)
15 September 2011
Updated: September 2011
Language: English
Type: Audio
Other articles (90)
-
List of compatible distributions
26 April 2011
The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

Distribution name    Version name            Version number
Debian               Squeeze                 6.x.x
Debian               Wheezy                  7.x.x
Debian               Jessie                  8.x.x
Ubuntu               The Precise Pangolin    12.04 LTS
Ubuntu               The Trusty Tahr         14.04
If you want to help us improve this list, you can give us access to a machine whose distribution is not mentioned above, or send us the fixes needed to add (...) -
MediaSPIP Core: Configuration
9 November 2010
By default, MediaSPIP Core provides three different configuration pages (these pages rely on the CFG configuration plugin to work): a page for the general configuration of the templates; a page for configuring the site's home page; and a page for configuring the sections.
It also provides an additional page, which only appears when certain plugins are enabled, for controlling their specific display options and features (...) -
User profiles
12 April 2011
Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised; it is visible only when the visitor is logged in to the site.
Users can also edit their profile from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)
On other sites (9285)
-
Concatenating/Splicing overlapping video clips with ffmpeg
16 December 2020, by Jimbo1987
I'm trying to concatenate multiple short .mp4 video clips from a security camera. The camera records short clips, with a few seconds of padding on either end of each timespan in which motion was detected. For example, two minutes of video will often be broken up into four 35-second clips, with the first/last few seconds of each clip duplicating the last/first few seconds of the previous/next clip.


I simply concatenate the clips using the ffmpeg concat demuxer, as described here: How to concatenate two MP4 files using FFmpeg?, with


(echo file 'first file.mp4' & echo file 'second file.mp4' )>list.txt
ffmpeg -safe 0 -f concat -i list.txt -c copy output.mp4



Or else I transcode them into intermediate MPEG-2 transport streams, which I can then concatenate with the file-level concat protocol, as described here: https://trac.ffmpeg.org/wiki/Concatenate#protocol, with


ffmpeg -i "first file.mp4" -c copy -bsf:v h264_mp4toannexb -f mpegts intermediate1.ts
ffmpeg -i "second file.mp4" -c copy -bsf:v h264_mp4toannexb -f mpegts intermediate2.ts
ffmpeg -i "concat:intermediate1.ts|intermediate2.ts" -c copy -bsf:a aac_adtstoasc output.mp4



But either way, the resulting video (output.mp4) jumps backward in time by a few seconds every half-minute or so because of the duplicated frames.


I want to throw out the duplicate frames and tie the clips together based on timestamps, to achieve smooth playback of the concatenated full-length video. I'd strongly prefer to do this on Windows with ffmpeg if possible. Surely this has been done before, right? Are there timestamps in the .mp4 files that I can use to determine how much overlap there is, and then splice at the proper point in time? And if so, how do I read them, how do I splice at an exact point in time, and how do I get around the keyframes issue if I can splice at the exact point in time?
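One possible starting point, sketched here rather than taken from the question: if the camera stamps each clip with a creation_time tag (many security cameras do, but this is an assumption to verify first), ffprobe can read each clip's wall-clock start time and duration, the overlap falls out by subtraction, and the head of the later clip can be trimmed before concatenating as before. The file names and the 4.2-second offset below are placeholders:

ffprobe -v error -show_entries format=duration:format_tags=creation_time -of default=noprint_wrappers=1 "first file.mp4"
ffprobe -v error -show_entries format=duration:format_tags=creation_time -of default=noprint_wrappers=1 "second file.mp4"

rem overlap = (start1 + duration1) - start2; trim that much off the second clip
ffmpeg -ss 4.2 -i "second file.mp4" -c copy "second file trimmed.mp4"
(echo file 'first file.mp4' & echo file 'second file trimmed.mp4')>list.txt
ffmpeg -safe 0 -f concat -i list.txt -c copy output.mp4

Note that with -c copy the -ss cut snaps to the nearest preceding keyframe; a frame-accurate splice requires re-encoding (dropping -c copy), which is exactly the keyframe trade-off the question anticipates.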


-
Flutter record front facing camera at exact same time as playing video
27 November 2020, by xceed
I've been playing around with Flutter, trying to record the front-facing camera (using the camera plugin [https://pub.dev/packages/camera]) while simultaneously playing a video to the user (using the video_player plugin [https://pub.dev/packages/video_player]).


Next I use ffmpeg to horizontally stack the videos together. This all works fine, but when I play back the final output there is a slight delay in the audio. I'm calling
Future.wait([cameraController.startVideoRecording(filePath), videoController.play()]);
but there is a slight delay before these tasks actually start. I don't even need them to fire at exactly the same time (which, I'm realising, is probably impossible); instead, if I knew exactly when each task began, I could use the time difference to sync the audio using ffmpeg or similar.

I've tried adding a listener on the videoController to see when isPlaying first returns true, and also watching the output directory for when the recorded video appears on the filesystem:


listener = () {
  if (videoController.value.isPlaying) {
    isPlaying = DateTime.now().microsecondsSinceEpoch;
    log('isPlaying ' + isPlaying.toString());
  }
  videoController.removeListener(listener);
};
videoController.addListener(listener);

var watcher = DirectoryWatcher('${extDir.path}/');
watcher.events.listen((event) {
  if (event.type == ChangeType.ADD) {
    fileAdded = DateTime.now().microsecondsSinceEpoch;
    log('added ' + fileAdded.toString());
  }
});



Then likewise for checking if the camera is recording:


var listener;
listener = () {
  if (cameraController.value.isRecordingVideo) {
    log('isRecordingVideo ' + DateTime.now().microsecondsSinceEpoch.toString());
    //cameraController.removeListener(listener);
  }
};
cameraController.addListener(listener);



This results in (for example) the following order and microsecond timestamps for each event:


is playing: 1606478994797247
is recording: 1606478995492889 (695642 microseconds later)
added: 1606478995839676 (346787 microseconds later)



However, when I play back the video, the syncing is off by approximately 0.152 seconds, which doesn't marry up with the time differences reported above.


Does anyone have any idea how I could accomplish near-perfect syncing when combining two videos? Thanks.
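For what it's worth, once a reliable offset has been measured, compensating for it at the ffmpeg stacking step is the easy part. A rough sketch, assuming the played-back video starts 0.152 s before the camera recording, that both clips share the same height (hstack requires it), and the hypothetical file names video_playback.mp4 and camera_recording.mp4:

rem trim the measured head start off the earlier clip, then stack side by side
ffmpeg -ss 0.152 -i video_playback.mp4 -i camera_recording.mp4 -filter_complex "[0:v][1:v]hstack=inputs=2[v]" -map "[v]" -map 1:a? -shortest -c:v libx264 -c:a aac synced.mp4

The harder part, as the question says, is obtaining that offset reliably; the sketch only shows how to apply it once known.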


-
How do I know the time left to convert to an audio file, so that I can display the remaining time to the user
6 November 2020, by Kerols afifi
I got this code from GitHub, and it suits me better than the FFmpeg library, because that library requires a minimum API level of 24. But with the code I got, I don't know exactly how to tell when the conversion has finished, nor how much time is left, so that I can display the progress to the user.


@SuppressLint("NewApi")
public void genVideoUsingMuxer(Context context, String srcPath, String dstPath, int startMs, int endMs, boolean useAudio, boolean useVideo, long time) throws IOException {
    // Set up MediaExtractor to read from the source.
    extractor = new MediaExtractor();
    extractor.setDataSource(srcPath);
    int trackCount = extractor.getTrackCount();
    // Set up MediaMuxer for the destination.
    mediaMuxer = new MediaMuxer(dstPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    // Set up the tracks and retrieve the max buffer size for selected tracks.
    HashMap<Integer, Integer> indexMap = new HashMap<>(trackCount);
    int bufferSize = -1;
    for (int i = 0; i < trackCount; i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        String mime = format.getString(MediaFormat.KEY_MIME);
        boolean selectCurrentTrack = false;
        if (mime.startsWith("audio/") && useAudio) {
            selectCurrentTrack = true;
        } else if (mime.startsWith("video/") && useVideo) {
            selectCurrentTrack = true;
        }
        if (selectCurrentTrack) {
            extractor.selectTrack(i);
            int dstIndex = mediaMuxer.addTrack(format);
            indexMap.put(i, dstIndex);
            if (format.containsKey(MediaFormat.KEY_MAX_INPUT_SIZE)) {
                int newSize = format.getInteger(MediaFormat.KEY_MAX_INPUT_SIZE);
                bufferSize = newSize > bufferSize ? newSize : bufferSize;
            }
        }
    }
    if (bufferSize < 0) {
        bufferSize = DEFAULT_BUFFER_SIZE;
    }
    // Set up the orientation and starting time for the extractor.
    MediaMetadataRetriever retrieverSrc = new MediaMetadataRetriever();
    retrieverSrc.setDataSource(srcPath);
    String degreesString = retrieverSrc.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION);
    if (degreesString != null) {
        int degrees = Integer.parseInt(degreesString);
        if (degrees >= 0) {
            mediaMuxer.setOrientationHint(degrees);
        }
    }
    if (startMs > 0) {
        extractor.seekTo(startMs * 1000L, MediaExtractor.SEEK_TO_CLOSEST_SYNC);
    }
    // Copy the samples from MediaExtractor to MediaMuxer. We loop over each
    // sample and stop when we reach the end of the source file or exceed the
    // end time of the trim.
    int offset = 0;
    int trackIndex = -1;
    time = bufferSize; // Note: reassigning this parameter has no effect outside the method.
    ByteBuffer dstBuf = ByteBuffer.allocate(bufferSize);
    mediaMuxer.start();
    while (true) {
        bufferInfo.offset = offset;
        bufferInfo.size = extractor.readSampleData(dstBuf, offset);
        if (bufferInfo.size < 0) {
            Log.d(TAG, "Saw input EOS.");
            bufferInfo.size = 0;
            break;
        } else {
            bufferInfo.presentationTimeUs = extractor.getSampleTime();
            if (endMs > 0 && bufferInfo.presentationTimeUs > (endMs * 1000L)) {
                Log.d(TAG, "The current sample is over the trim end time.");
                break;
            } else {
                bufferInfo.flags = extractor.getSampleFlags();
                trackIndex = extractor.getSampleTrackIndex();
                mediaMuxer.writeSampleData(indexMap.get(trackIndex), dstBuf, bufferInfo);
                extractor.advance();
            }
        }
    }

    mediaMuxer.stop();
    mediaMuxer.release();
}
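A sketch of how progress could be surfaced from this loop (the ProgressListener interface and the progressPercent helper are invented here, not part of the original code): the duration of the copied range is known up front from startMs and endMs (or, when endMs is 0, from MediaMetadataRetriever.METADATA_KEY_DURATION), and bufferInfo.presentationTimeUs says how far the copy has advanced, so the ratio of the two is the percentage done:

// Hypothetical callback for reporting progress to the UI.
public interface ProgressListener {
    void onProgress(int percent); // 0..100, called from the muxing thread
    void onFinished();            // called once the muxer has been released
}

final class ProgressMath {
    // Percentage of the trimmed range already copied.
    static int progressPercent(long presentationTimeUs, long startMs, long endMs) {
        long totalUs = (endMs - startMs) * 1000L;
        if (totalUs <= 0) return 0;
        long doneUs = presentationTimeUs - startMs * 1000L;
        return (int) Math.max(0, Math.min(100, doneUs * 100 / totalUs));
    }
}

Calling listener.onProgress(ProgressMath.progressPercent(bufferInfo.presentationTimeUs, startMs, endMs)) inside the copy loop, and listener.onFinished() right after mediaMuxer.release(), would give the UI something to display; the stream itself carries no "time remaining", so it has to be derived this way.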