
Other articles (36)
-
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors can modify their own information on the authors page -
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
From upload to the final video [standalone version]
31 January 2010
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
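
For illustration only, the two extra steps described above (reading the technical stream information and extracting a thumbnail frame) roughly correspond to commands like the following; this is a sketch, not SPIPMotion's actual code, and source.mp4 is a placeholder file name:

# read the technical information of the file's audio and video streams
ffprobe -v quiet -show_streams source.mp4
# generate a thumbnail by extracting a single frame (here at the 5-second mark)
ffmpeg -i source.mp4 -ss 5 -frames:v 1 thumbnail.png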
On other sites (4044)
-
How do I sync 4 videos in a grid to play the same frame at the same time?
28 December 2022, by PirateApp

- Four of us have recorded ourselves playing a game and want to build a 2 × 2 grid of the four videos
- The game has a cutscene at the beginning, followed by each person's unique part for the rest of the video
- I am looking to synchronize the grid so that it starts at the same place in the cutscene for everyone
- Kindly take a look at what is happening currently: the cutscene is off by a few seconds for everyone
- Imagine time offsets a, b, c, d such that when I add these offsets to the videos, the entire grid is in sync
- How do I find a, b, c, d and, more importantly, how do I apply them in filter_complex? (See the sketch after the grid command below.)

I used the ffmpeg command below to generate the 2 × 2 video grid, and it seems to work.


ffmpeg \
 -i nano_prologue.mkv -i macko_nimble_guardian.mkv -i nano_nimble_guardian.mkv -i ghost_nimble_guardian_subtle_arrow_1.mp4 \
 -filter_complex "
 nullsrc=size=1920x1080 [base];
 [0:v] setpts=PTS-STARTPTS, scale=960x540 [upperleft];
 [1:v] setpts=PTS-STARTPTS, scale=960x540 [upperright];
 [2:v] setpts=PTS-STARTPTS, scale=960x540 [lowerleft];
 [3:v] setpts=PTS-STARTPTS, scale=960x540 [lowerright];
 [base][upperleft] overlay=shortest=1 [tmp1];
 [tmp1][upperright] overlay=shortest=1:x=960 [tmp2];
 [tmp2][lowerleft] overlay=shortest=1:y=540 [tmp3];
 [tmp3][lowerright] overlay=shortest=1:x=960:y=540
 " \
 -c:v libx264 output.mkv
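
Assuming the offsets a, b, c, d from the list above are known (the values below are purely hypothetical placeholders), one common way to apply them is to seek each input with -ss before the grid is built, so that all four recordings start at the same moment in the cutscene. This is only a sketch: the inputs are the same files as in the command above, and a single xstack filter replaces the four overlays just to keep it short (the overlay chain in the command above works equally well):

ffmpeg \
 -ss 7.250 -i nano_prologue.mkv \
 -ss 0 -i macko_nimble_guardian.mkv \
 -ss 4.500 -i nano_nimble_guardian.mkv \
 -ss 2.125 -i ghost_nimble_guardian_subtle_arrow_1.mp4 \
 -filter_complex "
 [0:v] setpts=PTS-STARTPTS, scale=960:540 [ul];
 [1:v] setpts=PTS-STARTPTS, scale=960:540 [ur];
 [2:v] setpts=PTS-STARTPTS, scale=960:540 [ll];
 [3:v] setpts=PTS-STARTPTS, scale=960:540 [lr];
 [ul][ur][ll][lr] xstack=inputs=4:layout=0_0|w0_0|0_h0|w0_h0:shortest=1 [v]
 " \
 -map "[v]" -c:v libx264 output.mkv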



My problem, though, is that since each of us started recording at a slightly different time, the cutscenes are out of sync.


As per the screenshot below, you can see that each video has the same scene starting at a slightly different time.


Is there a way to find where the same frame starts in all the videos and then sync each video to start from that frame, or 20 seconds before it?




UPDATE 1


I have figured out the offset for each video with millisecond precision using the following technique.


Take a screenshot of the first video at a particular point in the cutscene, save the image as a PNG, and run the command below against the remaining 3 videos to find out where this screenshot appears in each of them.


ffmpeg -i "video2.mp4" -r 1 -loop 1 -i screenshot.png -an -filter_complex "blend=difference:shortest=1,blackframe=90:32" -f null -



Use the command above to search for the offset in every video for that cutscene
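
For example, a small shell loop can run that search over the other three recordings in one go and keep only the blackframe matches; the file names here (video2.mp4, video3.mp4, video4.mp4) are placeholders for the actual recordings:

for f in video2.mp4 video3.mp4 video4.mp4; do
  echo "== $f =="
  # blend the still frame against each video frame and report near-black (i.e. matching) frames
  ffmpeg -hide_banner -i "$f" -r 1 -loop 1 -i screenshot.png -an \
    -filter_complex "blend=difference:shortest=1,blackframe=90:32" -f null - 2>&1 \
    | grep blackframe
done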


It gave me this


VIDEO 3 OFFSET


[Parsed_blackframe_1 @ 0x600003af00b0] frame:3144 pblack:92 pts:804861 t:52.399805 type:P last_keyframe:3120

[Parsed_blackframe_1 @ 0x600003af00b0] frame:3145 pblack:96 pts:805117 t:52.416471 type:P last_keyframe:3120



VIDEO 2 OFFSET


[Parsed_blackframe_1 @ 0x6000014dc0b0] frame:3629 pblack:91 pts:60483 t:60.483000 type:P last_keyframe:3500



VIDEO 4 OFFSET


[Parsed_blackframe_1 @ 0x600002f84160] frame:2885 pblack:93 pts:48083 t:48.083000 type:P last_keyframe:2880

[Parsed_blackframe_1 @ 0x600002f84160] frame:2886 pblack:96 pts:48100 t:48.100000 type:P last_keyframe:2880



Now how do I use filter_complex to start each video at either the frame or the timestamp above? I would like to include, say, 10 seconds before that frame in each video so that it starts from the beginning.


UPDATE 2


This command currently gives me a 100% synced video. How do I make it start 15 seconds before the specified frames, and how do I make it use the audio track from video 2 instead?


ffmpeg \
 -i v_nimble_guardian.mkv -i macko_nimble_guardian.mkv -i ghost_nimble_guardian_subtle_arrow_1.mp4 -i nano_nimble_guardian.mkv \
 -filter_complex "
 nullsrc=size=1920x1080 [base];
 [0:v] trim=start_pts=49117,setpts=PTS-STARTPTS, scale=960x540 [upperleft];
 [1:v] trim=start_pts=50483,setpts=PTS-STARTPTS, scale=960x540 [upperright];
 [2:v] trim=start_pts=795117,setpts=PTS-STARTPTS, scale=960x540 [lowerleft];
 [3:v] trim=start_pts=38100,setpts=PTS-STARTPTS, scale=960x540 [lowerright];
 [base][upperleft] overlay=shortest=1 [tmp1];
 [tmp1][upperright] overlay=shortest=1:x=960 [tmp2];
 [tmp2][lowerleft] overlay=shortest=1:y=540 [tmp3];
 [tmp3][lowerright] overlay=shortest=1:x=960:y=540
 " \
 -c:v libx264 output.mkv
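
One possible way to get both, not a definitive answer: trim in seconds instead of pts (which sidesteps the fact that the inputs use different timebases), start each input 15 seconds before its matched cutscene frame, and trim the audio of the second input by the same amount and map it explicitly. In the sketch below, the three known timestamps come from the blackframe output above (60.483, 52.416 and 48.100 seconds, each minus 15); the 34.117 value for the first input is a hypothetical placeholder, since its reference timestamp is not shown, and "video 2" is assumed to be the second input (macko_nimble_guardian.mkv), as its start_pts above suggests:

ffmpeg \
 -i v_nimble_guardian.mkv -i macko_nimble_guardian.mkv -i ghost_nimble_guardian_subtle_arrow_1.mp4 -i nano_nimble_guardian.mkv \
 -filter_complex "
 nullsrc=size=1920x1080 [base];
 [0:v] trim=start=34.117, setpts=PTS-STARTPTS, scale=960:540 [upperleft];
 [1:v] trim=start=45.483, setpts=PTS-STARTPTS, scale=960:540 [upperright];
 [2:v] trim=start=37.416, setpts=PTS-STARTPTS, scale=960:540 [lowerleft];
 [3:v] trim=start=33.100, setpts=PTS-STARTPTS, scale=960:540 [lowerright];
 [1:a] atrim=start=45.483, asetpts=PTS-STARTPTS [audio];
 [base][upperleft] overlay=shortest=1 [tmp1];
 [tmp1][upperright] overlay=shortest=1:x=960 [tmp2];
 [tmp2][lowerleft] overlay=shortest=1:y=540 [tmp3];
 [tmp3][lowerright] overlay=shortest=1:x=960:y=540 [video]
 " \
 -map "[video]" -map "[audio]" -c:v libx264 -c:a aac output.mkv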



-
Rails 5 with Carrierwave and S3 - creating multiple video formats for DASH streaming works but mpd file breaks
22 November 2019, by Milind
What I am doing:
I have a Rails 5 app for video streaming (DASH MPEG) that uses FFMPEG to produce encoded streams by converting any single video into multiple videos of different bit rates/sizes and, most importantly, an MPD file that can be played easily in an HTML video player. I have already tested this by manually running the ffmpeg scripts on the console, which generates all the files. However, I want to automate this process, and that is where carrierwave comes into the picture.
Here, I use carrierwave to generate different versions (size/bitrate) of the videos (mp4/webm) to upload to S3. While the versions run, all of them are successfully created in the tmp folder, but for the last version (mpd), which needs to create the .mpd file, carrierwave creates an mp4 video file and just replaces the extension instead of actually creating the mpd file. So in AWS S3 (screenshot added below), I can see all my versions and the mpd file, but that mpd file, which should be an XML file, is actually an mp4 video file (the uploaded version file itself).
I have also tried to create a new file during the process, but it never works.
Has someone encountered this problem? Any help will be greatly appreciated.
My code snippets are below: model, uploader, output of the script on the console during upload, S3 screenshot.
##### models/video.rb ##########
mount_uploader :video, VideoUploader

####### uploaders/video_uploader.rb #########
class VideoUploader < CarrierWave::Uploader::Base
  include CarrierWave::MiniMagick
  include CarrierWave::Video
  include CarrierWave::Video::Thumbnailer
  include ::CarrierWave::Backgrounder::Delay

  ####### for streaming: first get the audio, then convert the input video into multiple bitrates/scales #######
  ### first get the audio, then get all the different versions of the same video
  version :video_audio do
    process :get_audio

    def get_audio
      `ffmpeg -y -i "#{file.path}" -c:a aac -ac 2 -ab 128k -vn video_audio.mp4`
    end

    def full_filename(for_file)
      "video_audio.mp4"
    end

    def filename
      "video_audio.mp4"
    end
  end

  ####### similar to the above, I have various versions like... #########
  version :video_1080 do...end
  version :video_720 do... end
  version :video_480 do ...end
  ### ...and so on. All these versions are successfully created and uploaded to s3; however, the next
  ### version below also produces a video file, whereas I need a plain mpd file ONLY.

  ### this is where, even after everything works, S3 shows a video file for the :mpd version and not an actual mpd file
  version :mpd do
    process :get_manifest

    ### the video.mpd file is successfully produced by the command below, but what gets uploaded under that
    ### name is the added/uploaded video file itself, not the newly generated mpd file
    ### tried with `ffmpeg -f webm_dash_manifest -i` too, but s3 still shows an mp4 file
    def get_manifest
      `MP4Box -dash 1000 -rap -frag-rap -profile onDemand -out video.mpd video_1080.mp4 video_720.mp4 video_480.mp4 video_360.mp4 video_240.mp4 video_audio.mp4`
    end
  end
end
######### sidekiq console output - successful mpd is generated ################
DASH-ing files - single segment
Subsegment duration 1.000 - Fragment duration: 1.000 secs
Splitting segments and fragments at GOP boundaries
DASHing file video_1080.mp4
DASHing file video_720.mp4
DASHing file video_480.mp4
DASHing file video_360.mp4
DASHing file video_240.mp4
DASHing file video_audio.mp4
[DASH] Generating MPD at time 2019-11-22T00:01:59.872Z
mpd_1mb.mp4
mpd_video.mpd

This is what the uploaded files look like on S3. Notice video.mpd: it is an mp4 video file just like the others, when it should have been a plain mpd file of no more than 2 KB.
Is there something that I am missing?
Can Carrierwave do this, or is it not made for this?
Do I have to write a callback and then programmatically upload the files to S3 if carrierwave does not help here? Kindly provide any suggestions or useful advice so that I can move ahead.
-
Flutter_ffmpeg: At least one output file must be specified
22 March 2020, by Jehonadab Okpukoro
I'm trying to crop a video with the Flutter_ffmpeg package using this:
-i $inputPath -filter:v "crop=80:60:200:100" -c $outputPath
but I'm getting the error message below.
Running FFmpeg with arguments: [-i, /data/user/0/com.timz/app_flutter/Movies/flutter_test/1584827688309.mp4, -filter:v, crop=80:60:200:100, -c, /data/user/0/com.timz/cache/output.mp4].
I/flutter (20728): ffmpeg version git-2020-01-25-fd11dd500
I/flutter (20728): Copyright (c) 2000-2020 the FFmpeg developers
I/flutter (20728):
I/flutter (20728): built with Android (5220042 based on r346389c) clang version 8.0.7 (https://android.googlesource.com/toolchain/clang b55f2d4ebfd35bf643d27dbca1bb228957008617) (https://android.googlesource.com/toolchain/llvm 3c393fe7a7e13b0fba4ac75a01aa683d7a5b11cd) (based on LLVM 8.0.7svn)
I/flutter (20728): configuration: --cross-prefix=aarch64-linux-android- --sysroot=/files/android-sdk/ndk-bundle/toolchains/llvm/prebuilt/linux-x86_64/sysroot --prefix=/home/taner/Projects/mobile-ffmpeg/prebuilt/android-arm64/ffmpeg --pkg-config=/usr/bin/pkg-config --enable-version3 --arch=aarch64 --cpu=armv8-a --cc=aarch64-linux-android24-clang --cxx=aarch64-linux-android24-clang++ --target-os=android --enable-neon --enable-asm --enable-inline-asm --enable-cross-compile --enable-pic --enable-jni --enable-optimizations --enable-swscale --enable-shared --disable-v4l2-m2m --disable-outdev=v4l2 --disable-outdev=fbdev --disable-indev=v4l2 --disable-indev=fbdev --enable-small --disable-openssl --disable-xmm-clobber-test --disable-debug --enable-lto --disable-neon-clobber-test --disable-programs --disable-postproc --disable-doc --disable-htmlpages --disable-manpages --disable-podpages --disable-txtpages --disable-static --disable-sndio --disable-schannel --disable-securetransport --disable-xlib --disable-cuda --disable-cuvid --disa
I/flutter (20728): libavutil 56. 38.100 / 56. 38.100
I/flutter (20728): libavcodec 58. 65.102 / 58. 65.102
I/flutter (20728): libavformat 58. 35.101 / 58. 35.101
I/flutter (20728): libavdevice 58. 9.103 / 58. 9.103
I/flutter (20728): libavfilter 7. 70.101 / 7. 70.101
I/flutter (20728): libswscale 5. 6.100 / 5. 6.100
I/flutter (20728): libswresample 3. 6.100 / 3. 6.100
I/flutter (20728): Trailing option(s) found in the command: may be ignored.
D/flutter-ffmpeg(20728): FFmpeg exited with rc: 1
I/flutter (20728): Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/data/user/0/com.timz/app_flutter/Movies/flutter_test/1584827688309.mp4':
I/flutter (20728): Metadata:
I/flutter (20728): major_brand :
I/flutter (20728): mp42
I/flutter (20728):
I/flutter (20728): minor_version :
I/flutter (20728): 0
I/flutter (20728):
I/flutter (20728): compatible_brands:
I/flutter (20728): isommp42
I/flutter (20728):
I/flutter (20728): creation_time :
I/flutter (20728): 2020-03-21T21:54:56.000000Z
I/flutter (20728):
I/flutter (20728): com.android.version:
I/flutter (20728): 9
I/flutter (20728):
I/flutter (20728): Duration:
I/flutter (20728): 00:00:07.17
I/flutter (20728): , start:
I/flutter (20728): 0.000000
I/flutter (20728): , bitrate:
I/flutter (20728): 3870 kb/s
I/flutter (20728):
I/flutter (20728): Stream #0:0
I/flutter (20728): (eng)
I/flutter (20728): : Video: h264 (avc1 / 0x31637661), yuv420p(tv, bt709), 720x480, 3854 kb/s
I/flutter (20728): , SAR 1:1 DAR 3:2
I/flutter (20728): ,
I/flutter (20728): 29.44 fps,
I/flutter (20728): 29.83 tbr,
I/flutter (20728): 90k tbn,
I/flutter (20728): 180k tbc
I/flutter (20728): (default)
I/flutter (20728):
I/flutter (20728): Metadata:
I/flutter (20728): rotate :
I/flutter (20728): 270
I/flutter (20728):
I/flutter (20728): creation_time :
I/flutter (20728): 2020-03-21T21:54:56.000000Z
I/flutter (20728):
I/flutter (20728): handler_name :
I/flutter (20728): VideoHandle
I/flutter (20728):
I/flutter (20728): Side data:
I/flutter (20728):
I/flutter (20728): displaymatrix: rotation of 90.00 degrees
I/flutter (20728):
I/flutter (20728): Stream #0:1
I/flutter (20728): (eng)
I/flutter (20728): : Audio: aac (mp4a / 0x6134706D), 48000 Hz, mono, fltp, 12 kb/s
I/flutter (20728): (default)
I/flutter (20728):
I/flutter (20728): Metadata:
I/flutter (20728): creation_time :
I/flutter (20728): 2020-03-21T21:54:56.000000Z
I/flutter (20728):
I/flutter (20728): handler_name :
I/flutter (20728): SoundHandle
I/flutter (20728):
I/flutter (20728): At least one output file must be specified

I've been cracking my head over this for the past two days. Kindly share your thoughts on what might be wrong.
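
Not an authoritative diagnosis, but the "Trailing option(s) found in the command: may be ignored." line in the log points at a likely cause: -c expects a codec name, so it consumes $outputPath as its value and FFmpeg is left with no output file at all. A minimal variant of the argument string that keeps the crop and names the output explicitly might look like this (a sketch, untested):

-i $inputPath -filter:v "crop=80:60:200:100" $outputPath

If a codec option is really needed, it has to be given a value of its own, for example -c:a copy to keep the original audio while the cropped video is re-encoded.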