
Media (1)
-
SWFUpload Process
6 September 2011, by
Updated: September 2011
Language: French
Type: Text
Other articles (31)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer flash player is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both on conventional computers (...)
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
From upload to the final video [standalone version]
31 January 2010, by
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First of all, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
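
As an illustration only (the excerpt does not show how SPIPMotion actually implements these steps), the two extra actions it describes roughly correspond to the following ffprobe/ffmpeg commands, with a hypothetical source.mov:

# Retrieve the technical information of the file's audio and video streams.
ffprobe -v error -show_streams source.mov

# Generate a thumbnail by extracting a single frame (here at the 5-second mark).
ffmpeg -ss 5 -i source.mov -frames:v 1 thumbnail.jpg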
On other sites (5567)
-
Does ffmpeg 3.0 support hardware acceleration for video streams on iOS, and if it does, how to enable it?
30 March 2016, by Gaojin Hsu
Do I have to use Apple's VideoToolBox.framework and CoreVideo.framework to enable hardware acceleration on iOS? Is that the only way? Does the newest ffmpeg have the ability? If it does, how do I enable it?
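
Not an answer from the thread, but for orientation: FFmpeg builds configured with --enable-videotoolbox expose Apple's VideoToolbox as a decode hwaccel and, in later releases, as an H.264 encoder. A command-line sketch with hypothetical file names; inside an iOS app the same would be done through the libavcodec API:

# Decode through VideoToolbox and encode with the hardware H.264 encoder;
# both require an FFmpeg build configured with --enable-videotoolbox.
ffmpeg -hwaccel videotoolbox -i input.mov -c:v h264_videotoolbox -b:v 2M -c:a copy output.mp4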
-
CMTimeGetSeconds doesn't get the right video duration
26 February 2015, by JLCastillo
Some users don't get the right duration for videos they capture with their own device. The funny thing is that others do see it right, using the same device models and OS version. Anyway, we observed it on an iPhone 5c running 7.1.2 and an iPhone 5s running 8.1.3.
This code works for most users, but not all:
ALAssetRepresentation *representation = [mediaObject.asset defaultRepresentation];
NSURL *url = [representation url];
NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES };
AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:options];
videoDurationTime = CMTimeGetSeconds(avAsset.duration);

I asked them to send the input videos, and this is the output from "ffmpeg -i":
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'bug_duration1.MOV':
Metadata:
major_brand : qt
minor_version : 0
compatible_brands: qt
creation_time : 2015-02-23 08:30:01
encoder : 8.1.3
encoder-eng : 8.1.3
date : 2015-02-23T16:30:01+0800
date-eng : 2015-02-23T16:30:01+0800
model : iPhone 5s
model-eng : iPhone 5s
make : Apple
make-eng : Apple
Duration: 00:00:03.67, start: 0.000000, bitrate: 16793 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080, 16719 kb/s, 29.99 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)
Metadata:
creation_time : 2015-02-23 08:30:01
handler_name : Core Media Data Handler
encoder : H.264
Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 61 kb/s (default)
Metadata:
creation_time : 2015-02-23 08:30:01
handler_name : Core Media Data Handler

The video is detected with a duration of several minutes. Did anybody face this problem before?
Thanks in advance.
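
Not part of the original question, but a hedged diagnostic sketch that may help narrow it down: compare the container-level duration with the per-stream durations the file reports (file name taken from the dump above):

# Print the container duration next to each stream's own duration so a
# mismatch between the two becomes visible.
ffprobe -v error \
  -show_entries format=duration:stream=index,codec_type,duration \
  -of default=noprint_wrappers=1 \
  bug_duration1.MOV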
-
FFMPEG fails while processing MOV files
29 September 2020, by Tom
I'm trying to convert video files to DASH format. All videos work great except MOV videos.


I'm using the following command:


/usr/local/bin/ffmpeg -y -i /path/to/mov/video.mov -c:v libx264 -c:a aac -bf 1 -keyint_min 25 -g 250 -sc_threshold 40 -use_timeline 1 -use_template 1 -init_seg_name 'video_init_$RepresentationID$.$ext$' -media_seg_name 'video_chunk_$RepresentationID$_$Number%05d$.$ext$' -seg_duration 10 -hls_playlist 0 -f dash -adaptation_sets -0:s -map 0 -s:v:0 854x480 -b:v:0 750k -strict -2 -threads 12 /output/path/video.mpd



I get the error:



Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument ... Error initializing output stream 0:1




The full command output is:


ffmpeg version 4.3.1 Copyright (c) 2000-2020 the FFmpeg developers
built with Apple clang version 11.0.3 (clang-1103.0.32.62)
configuration: --prefix=/usr/local/Cellar/ffmpeg/4.3.1 --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --disable-libjack --disable-indev=jack
libavutil 56. 51.100 / 56. 51.100
libavcodec 58. 91.100 / 58. 91.100
libavformat 58. 45.100 / 58. 45.100
libavdevice 58. 10.100 / 58. 10.100
libavfilter 7. 85.100 / 7. 85.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 7.100 / 5. 7.100
libswresample 3. 7.100 / 3. 7.100
libpostproc 55. 7.100 / 55. 7.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/path/to/file.mov':
Metadata:
major_brand : qt
minor_version : 0
compatible_brands: qt
creation_time : 2020-09-21T09:45:27.000000Z
com.apple.quicktime.make: Apple
com.apple.quicktime.model: iPhone 7
com.apple.quicktime.software: 13.4.1
com.apple.quicktime.creationdate: 2020-06-15T11:59:36+0200
com.apple.photos.originating.signature: AXfhZgW4nrUdSusOMUuJRarfxD7R
Duration: 00:01:13.40, start: 0.000000, bitrate: 10616 kb/s
Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 96 kb/s (default)
Metadata:
creation_time : 2020-09-21T09:45:27.000000Z
handler_name : Core Media Audio
Stream #0:1(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 10514 kb/s, 30 fps, 30 tbr, 600 tbn, 1200 tbc (default)
Metadata:
creation_time : 2020-09-21T09:45:27.000000Z
handler_name : Core Media Video
encoder : H.264
Stream #0:2(und): Data: none (mebx / 0x7862656D) (default)
Metadata:
creation_time : 2020-09-21T09:45:27.000000Z
handler_name : Core Media Metadata
Stream #0:3(und): Data: none (mebx / 0x7862656D), 0 kb/s (default)
Metadata:
creation_time : 2020-09-21T09:45:27.000000Z
handler_name : Core Media Metadata
Stream mapping:
Stream #0:0 -> #0:0 (aac (native) -> aac (native))
Stream #0:1 -> #0:1 (h264 (native) -> h264 (libx264))
Stream #0:2 -> #0:2 (copy)
Stream #0:3 -> #0:3 (copy)
Press [q] to stop, [?] for help
[libx264 @ 0x7f7f2600e000] using SAR=1280/1281
[libx264 @ 0x7f7f2600e000] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x7f7f2600e000] profile High, level 3.1, 4:2:0, 8-bit
[libx264 @ 0x7f7f2600e000] 264 - core 160 r3011 cde9a93 - H.264/MPEG-4 AVC codec - Copyleft 2003-2020 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=1 b_pyramid=0 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=abr mbtree=1 bitrate=750 ratetol=1.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
Error initializing output stream 0:1 --
[aac @ 0x7f7f2600c000] Qavg: 880.111
[aac @ 0x7f7f2600c000] 2 frames left in the queue on closing
[libx264 @ 0x7f7f2600e000] final ratefactor: 28.97
Conversion failed!



I guess the problem is that the file contains two streams that are neither audio nor video:

Stream #0:2 -> #0:2 (copy)
Stream #0:3 -> #0:3 (copy)

I cannot find a way to exclude, ignore, or copy those last two streams (#2 and #3) without processing them.


Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'IMG_3599.mov':
 Metadata:
 major_brand : qt
 minor_version : 0
 compatible_brands: qt
 creation_time : 2020-09-21T09:45:27.000000Z
 com.apple.quicktime.make: Apple
 com.apple.quicktime.model: iPhone 7
 com.apple.quicktime.software: 13.4.1
 com.apple.quicktime.creationdate: 2020-06-15T11:59:36+0200
 com.apple.photos.originating.signature: AXfhZgW4nrUdSusOMUuJRarfxD7R
 Duration: 00:01:13.40, start: 0.000000, bitrate: 10616 kb/s
 Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 96 kb/s (default)
 Metadata:
 creation_time : 2020-09-21T09:45:27.000000Z
 handler_name : Core Media Audio
 Stream #0:1(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 10514 kb/s, 30 fps, 30 tbr, 600 tbn, 1200 tbc (default)
 Metadata:
 creation_time : 2020-09-21T09:45:27.000000Z
 handler_name : Core Media Video
 encoder : H.264
 Stream #0:2(und): Data: none (mebx / 0x7862656D) (default)
 Metadata:
 creation_time : 2020-09-21T09:45:27.000000Z
 handler_name : Core Media Metadata
 Stream #0:3(und): Data: none (mebx / 0x7862656D), 0 kb/s (default)
 Metadata:
 creation_time : 2020-09-21T09:45:27.000000Z
 handler_name : Core Media Metadata
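
Not from the original post, but a hedged sketch of one way around this: map only the video and audio streams instead of everything with -map 0, so the two mebx data streams are never selected. A trimmed-down version of the command (most of the poster's other options omitted for brevity):

# Keep only the video and audio streams; the mebx data streams #0:2 and
# #0:3 are simply not mapped (passing -dn would be another way to drop
# data streams).
ffmpeg -y -i /path/to/mov/video.mov -map 0:v -map 0:a \
       -c:v libx264 -c:a aac -seg_duration 10 -f dash /output/path/video.mpd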