
Other articles (41)
-
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded as MP4, OGV and WebM (supported by HTML5), with MP4 also used for the Flash fallback.
Audio files are encoded as MP3 and Ogg (supported by HTML5), with MP3 also used for the Flash fallback.
Where possible, text is analysed to extract the data needed for search-engine indexing, and is then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both on conventional computers (...)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (6561)
-
Ramp slow motion in iOS
26 June 2013, by 2vision2
I have to do a "ramp slow motion" effect in a video file, along with its audio, between certain frames, and I need to store the ramped video as a new video.
Ref: http://www.youtube.com/watch?v=BJ3_xMGzauk (watch from 0 to 10s)
From my analysis, I've found that the AVFoundation framework can be helpful.
Copied and pasted from the above link:
"
Editing
AV Foundation uses compositions to create new assets from existing pieces of media (typically, one or more video and audio tracks). You use a mutable composition to add and remove tracks, and adjust their temporal orderings. You can also set the relative volumes and ramping of audio tracks; and set the opacity, and opacity ramps, of video tracks. A composition is an assemblage of pieces of media held in memory. When you export a composition using an export session, it's collapsed to a file.
On iOS 4.1 and later, you can also create an asset from media such as sample buffers or still images using an asset writer."
Questions:
Can I "ramp slow motion" the video/audio file using the AVFoundation framework? Or is there any other package available? If I want to handle audio and video separately, please guide me on how to do that.
-
ffmpeg got stuck while trying to crossfade-merge two videos
30 June 2017, by Jeflopo
I'm trying to do a crossfade merge (1 s) between two videos: an intro (39 seconds long) and the main video. When I executed the command it started working without throwing errors, but at some frame ffmpeg gets stuck.
I've read a lot of Q&As here on Stack Overflow, and the official docs, but I can't solve this, so:
This is the command:
ffmpeg -i "inputs/intro.mp4" -i "inputs/240p.mp4" -an -filter_complex \
"[0:v]trim=start=0:end=38,setpts=PTS-STARTPTS[firstclip]; \
[0:v]trim=start=38:end=39,setpts=PTS-STARTPTS[fadeoutsrc]; \
[1:v]trim=start=1,setpts=PTS-STARTPTS[secondclip]; \
[1:v]trim=start=0:end=1,setpts=PTS-STARTPTS[fadeinsrc]; \
[fadeinsrc]format=pix_fmts=yuva420p, fade=t=in:st=0:d=1:alpha=1[fadein]; \
[fadeoutsrc]format=pix_fmts=yuva420p, fade=t=out:st=0:d=1:alpha=1[fadeout]; \
[fadein]fifo[fadeinfifo]; \
[fadeout]fifo[fadeoutfifo]; \
[fadeoutfifo][fadeinfifo]overlay[crossfade]; \
[firstclip][crossfade][secondclip]concat=n=3[output]; \
[0:a][1:a] acrossfade=d=1 [audio]" -vcodec libx264 -map "[output]" -map "[audio]" "outputs/240p.mp4"
Here's the raw command (the exact command I used):
ffmpeg -i "inputs/intro.mp4" -i "inputs/240p.mp4" -an -filter_complex "[0:v]trim=start=0:end=38,setpts=PTS-STARTPTS[firstclip]; [0:v]trim=start=38:end=39,setpts=PTS-STARTPTS[fadeoutsrc]; [1:v]trim=start=1,setpts=PTS-STARTPTS[secondclip]; [1:v]trim=start=0:end=1,setpts=PTS-STARTPTS[fadeinsrc]; [fadeinsrc]format=pix_fmts=yuva420p, fade=t=in:st=0:d=1:alpha=1[fadein]; [fadeoutsrc]format=pix_fmts=yuva420p, fade=t=out:st=0:d=1:alpha=1[fadeout]; [fadein]fifo[fadeinfifo]; [fadeout]fifo[fadeoutfifo]; [fadeoutfifo][fadeinfifo]overlay[crossfade]; [firstclip][crossfade][secondclip]concat=n=3[output]; [0:a][1:a] acrossfade=d=1 [audio]" -vcodec libx264 -map "[output]" -map "[audio]" "outputs/240p.mp4"
The "error" is reproducible with and without the
-an
and theacrossfade
filters.This is the output :
PS C:\scripts\ffmpeg> ffmpeg -i "inputs/intro.mp4" -i "inputs/240p.mp4" -an -filter_complex "[0:v]trim=start=0:end=38,setpts=PTS-STARTPTS[firstclip]; [0:v]trim=start=38:end=39,setpts=PTS-STARTPTS[fadeoutsrc]; [1:v]trim=start=1,setpts=PTS-STARTPTS[secondclip]; [1:v]trim=start=0:end=1,setpts=PTS-STARTPTS[fadeinsrc]; [fadeinsrc]format=pix_fmts=yuva420p, fade=t=in:st=0:d=1:alpha=1[fadein]; [fadeoutsrc]format=pix_fmts=yuva420p, fade=t=out:st=0:d=1:alpha=1[fadeout]; [fadein]fifo[fadeinfifo]; [fadeout]fifo[fadeoutfifo]; [fadeoutfifo][fadeinfifo]overlay[crossfade]; [firstclip][crossfade][secondclip]concat=n=3[output]; [0:a][1:a] acrossfade=d=1 [audio]" -map "[output]" -map "[audio]" "outputs/240p.mp4"
ffmpeg version N-86669-gc1d1274 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 7.1.0 (GCC)
configuration: --enable-gpl --enable-version3 --enable-cuda --enable-cuvid --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-zlib
libavutil 55. 67.100 / 55. 67.100
libavcodec 57.100.102 / 57.100.102
libavformat 57. 75.100 / 57. 75.100
libavdevice 57. 7.100 / 57. 7.100
libavfilter 6. 94.100 / 6. 94.100
libswscale 4. 7.101 / 4. 7.101
libswresample 2. 8.100 / 2. 8.100
libpostproc 54. 6.100 / 54. 6.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'inputs/intro.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
encoder : Lavf57.72.101
Duration: 00:06:24.45, start: 0.000000, bitrate: 491 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 426x240 [SAR 1:1 DAR 71:40], 353 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
Metadata:
handler_name : VideoHandler
Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 130 kb/s (default)
Metadata:
handler_name : SoundHandler
Input #1, mov,mp4,m4a,3gp,3g2,mj2, from 'inputs/240p.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
encoder : Lavf56.40.101
Duration: 00:06:24.43, start: 0.000000, bitrate: 375 kb/s
Stream #1:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 426x240 [SAR 1:1 DAR 71:40], 243 kb/s, 25 fps, 25 tbr, 90k tbn, 50 tbc (default)
Metadata:
handler_name : VideoHandler
Stream #1:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 125 kb/s (default)
Metadata:
handler_name : SoundHandler
Stream mapping:
Stream #0:0 (h264) -> trim
Stream #0:0 (h264) -> trim
Stream #0:1 (aac) -> acrossfade:crossfade0
Stream #1:0 (h264) -> trim
Stream #1:0 (h264) -> trim
Stream #1:1 (aac) -> acrossfade:crossfade1
concat -> Stream #0:0 (libx264)
acrossfade -> Stream #0:1 (aac)
Press [q] to stop, [?] for help
[libx264 @ 00000000026b2240] using SAR=1/1
[libx264 @ 00000000026b2240] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 00000000026b2240] profile High, level 2.1
[libx264 @ 00000000026b2240] 264 - core 152 r2851 ba24899 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=7 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'outputs/240p.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
encoder : Lavf57.75.100
Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 426x240 [SAR 1:1 DAR 71:40], q=-1--1, 25 fps, 12800 tbn, 25 tbc (default)
Metadata:
encoder : Lavc57.100.102 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
Stream #0:1: Audio: aac (LC) ([64][0][0][0] / 0x0040), 44100 Hz, stereo, fltp, 128 kb/s (default)
Metadata:
encoder : Lavc57.100.102 aac
frame=10369 fps=503 q=28.0 size= 24064kB time=00:06:55.68 bitrate= 474.2kbits/s speed=20.2x
At around frame 10000 it gets stuck... I waited for an hour but it stays stuck.
I've updated ffmpeg:
ffmpeg -version
ffmpeg version N-86669-gc1d1274 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 7.1.0 (GCC)
configuration: --enable-gpl --enable-version3 --enable-cuda --enable-cuvid --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-zlib
libavutil 55. 67.100 / 55. 67.100
libavcodec 57.100.102 / 57.100.102
libavformat 57. 75.100 / 57. 75.100
libavdevice 57. 7.100 / 57. 7.100
libavfilter 6. 94.100 / 6. 94.100
libswscale 4. 7.101 / 4. 7.101
libswresample 2. 8.100 / 2. 8.100
libpostproc 54. 6.100 / 54. 6.100
I used these references:
-
FFmpeg concat streams
9 July 2017, by chourizo
I am trying to receive two H264 UDP streams from two cameras and save them to one file (so they are always synchronized). I have tried a lot of things, but it always says that there is no video in the second stream (although I can watch it).
ffmpeg -probesize 20M -analyzeduration 20M -i udp://@127.0.0.1:1234 -probesize 20M -analyzeduration 20M -i udp://@127.0.0.1:1235 -filter_complex "[0:v]fps=15,scale=320:240,setsar=1/1,setpts=PTS-STARTPTS[v0]; [1:v]fps=15,scale=320:240,setsar=1/1,setpts=PTS-STARTPTS[v1]; [v0][v1]concat=n=2:v=1:a=0 [v0] [v1]" -map "[v0]" -map "[v1]" -threads 0 -y kk.ts
Is it possible to apply concat to real-time streams, so that the result is a video with two programs?