
Media (1)

Keyword: - Tags - /publishing

Other articles (90)

  • Updating from version 0.1 to 0.2

    24 June 2013

    An explanation of the notable changes made when moving MediaSPIP from version 0.1 to version 0.3. What is new?
    Regarding software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe for retrieving metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favor of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Customizing by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present changes to your MediaSPIP, or news about your projects, on your MediaSPIP using the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorial items.
    You can customize the news creation form.
    News creation form: for a document of the news type, the fields offered by default are: publication date (customize the publication date) (...)

On other sites (9082)

  • Extracting metadata from incomplete video files

    17 July 2013, by npgall

    Can anyone tell me where metadata is stored in common video file formats? And whether it tends to be located towards the start of the file, or scattered throughout?

    I'm working with a remote object store containing a lot of video files and I want to extract metadata, in particular video duration and video dimensions from those files, without streaming the entire file contents to the local machine.

    I'm hoping that this metadata will be stored in the first X bytes of files, and so I can just fetch a byte range starting at the beginning instead of the whole file, passing this partial file data to ffprobe.

    For testing purposes I created a 22MB MP4 file, and used the following command to supply only the first 1MB of data to ffprobe:

    head -c1024K '2013-07-04 12.20.07.mp4' | ffprobe -

    It prints:

    avprobe version 0.8.6-4:0.8.6-0ubuntu0.12.04.1, Copyright (c) 2007-2013 the Libav developers
     built on Apr  2 2013 17:02:36 with gcc 4.6.3
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x1a6b7a0] stream 0, offset 0x10beab: partial file
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'pipe:':
     Metadata:
       major_brand     : isom
       minor_version   : 0
       compatible_brands: isom3gp4
       creation_time   : 1947-07-04 11:20:07
     Duration: 00:00:09.84, start: 0.000000, bitrate: N/A
       Stream #0.0(eng): Video: h264 (High), yuv420p, 1920x1080, 20028 kb/s, PAR 65536:65536 DAR 16:9, 29.99 fps, 30 tbr, 90k tbn, 180k tbc
       Metadata:
         creation_time   : 1947-07-04 11:20:07
       Stream #0.1(eng): Audio: aac, 48000 Hz, stereo, s16, 189 kb/s
       Metadata:
         creation_time   : 1947-07-04 11:20:07

    So I see the first 1MB was enough to extract video duration 9.84 seconds and video dimensions 1920x1080, even though ffprobe printed the warning about detecting a partial file. If I supply less than 1MB, it fails completely.

    Would this approach work for other common video file formats to reliably extract metadata, or do any common formats scatter metadata throughout the file?

    I'm aware of the concept of container formats, and that various codecs may be used to represent the audio/video data inside those containers. I'm not familiar with the details, though, so I guess the question may apply to common combinations of containers + codecs? Thanks in advance.
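
    For reference, a minimal sketch of the byte-range approach described above, assuming the object store is reachable over HTTP and honors Range requests (the URL below is a hypothetical placeholder):

    # fetch only the first 1 MiB of the remote object and probe it from stdin
    curl -s -r 0-1048575 'https://example.com/store/2013-07-04%2012.20.07.mp4' | ffprobe -

    curl -r 0-1048575 asks the server for only the first 1 MiB, so nothing beyond that range is transferred. Whether 1 MB is actually enough depends on where the container keeps its index, which is exactly the uncertainty the question raises.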

  • Failed to build Android-FFmpeg-Android with make

    5 July 2013, by Tsurumaru Makoto

    I am having trouble compiling Android-FFmpeg-Android with the make command.

    I'm using the NDK android-ndk-r8e-linux-x86.tar.bz2, and I have ffmpeg installed on my machine:

    [root@sv ffmpeg]#cat /proc/version
    Linux version 2.6.18-8.1.14.el5 (brewbuilder@hs20-bc2-2.build.redhat.com)
    (gcc version 4.1.1 20070105 (Red Hat 4.1.1-52)) #1 SMP Tue Sep 25 11:45:53 EDT 2007

    [root@sv ffmpeg]# ffmpeg
    FFmpeg version SVN-r26402, Copyright (c) 2000-2011 the FFmpeg developers
     built on Jul  5 2013 11:45:24 with gcc 4.1.2 20080704 (Red Hat 4.1.2-54)
     configuration: --enable-shared --enable-gpl --enable-version3 --enable-nonfree --enable-postproc --enable-x11grab --enable-libfaac --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libfaac --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-libxvid --enable-libvpx --enable-pic
     libavutil     50.36. 0 / 50.36. 0
     libavcore      0.16. 1 /  0.16. 1
     libavcodec    52.108. 0 / 52.108. 0
     libavformat   52.93. 0 / 52.93. 0
     libavdevice   52. 2. 3 / 52. 2. 3
     libavfilter    1.74. 0 /  1.74. 0
     libswscale     0.12. 0 /  0.12. 0
     libpostproc   51. 2. 0 / 51. 2. 0
    Hyper fast Audio and Video encoder
    usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...

    Use -h to get full help or, even better, run 'man ffmpeg'

    I'm using the following repository: https://github.com/chu888chu888/Android-FFmpeg-Android

    When I run its build script, I get:

    [homepage@sv FFmpeg-Android-master]$ ./FFmpeg-Android.sh

    /tmp/vplayer/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/as: /lib/libz.so.1: no version information available (required by
    /tmp/vplayer/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/as)
    LD libavutil/libavutil.so.51
    /tmp/vplayer/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: error: cannot open /tmp/vplayer/bin/../sysroot/usr/lib/libm.so: Unknown error 530
    /tmp/vplayer/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: error: cannot open /tmp/vplayer/bin/../sysroot/usr/lib/libz.so: Unknown error 530
    /tmp/vplayer/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: error: cannot open /tmp/vplayer/bin/../lib/gcc/arm-linux-androideabi/4.6/libgcc.a: Unknown error 530
    /tmp/vplayer/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: error: cannot open /tmp/vplayer/bin/../sysroot/usr/lib/libc.so: Unknown error 530
    /tmp/vplayer/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: error: cannot open /tmp/vplayer/bin/../sysroot/usr/lib/libdl.so: Unknown error 530
    /tmp/vplayer/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: error: cannot open /tmp/vplayer/bin/../lib/gcc/arm-linux-androideabi/4.6/libgcc.a: Unknown error 530
    collect2: ld returned 1 exit status
    make: *** [libavutil/libavutil.so.51] Error 1

    So the build fails, even though the libraries the linker cannot open do exist:

    [homepage@sv FFmpeg-Android-master]$ ls /tmp/vplayer/sysroot/usr/lib
    crtbegin_dynamic.o  libEGL.so        libandroid.so      liblog.so     libthread_db.so
    crtbegin_so.o       libGLESv1_CM.so  libc.a             libm.a        libz.so
    crtbegin_static.o   libGLESv2.so     libc.so            libm.so
    crtend_android.o    libOpenMAXAL.so  libdl.so           libstdc++.a
    crtend_so.o         libOpenSLES.so   libjnigraphics.so  libstdc++.so

    You can see that the files are present. Can anyone advise a solution to this issue?
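
    Two quick checks that might narrow this down (a sketch only; it assumes the standalone toolchain's gcc lives in /tmp/vplayer/bin, as the paths in the log suggest). Since ld reports "Unknown error 530" rather than "No such file or directory", the open itself seems to fail with an unusual errno, so it is worth confirming which sysroot the toolchain resolves and what kind of filesystem /tmp is on:

    # which sysroot does the cross toolchain actually resolve?
    /tmp/vplayer/bin/arm-linux-androideabi-gcc -print-sysroot

    # can the files be read at all, and what filesystem is /tmp mounted on?
    stat /tmp/vplayer/sysroot/usr/lib/libm.so
    df -T /tmp

    If the printed sysroot differs from /tmp/vplayer/sysroot, or /tmp turns out to be a network or otherwise unusual mount, that would be a plausible reason the linker cannot open files that appear to exist.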

  • How to concatenate two videos with ffmpeg: documented code not working

    2 March 2014, by Jim Miller

    I'm trying to concatenate two videos with ffmpeg. Nothing fancy; I just want one video that consists of video A immediately followed by video B.

    I've tried the code from "How to concatenate (join, merge) media files" on a freshly built and otherwise-working-fine install of ffmpeg 1.2.1 on Fedora 17, but the following error message appears:

    $ ffmpeg -i video_a.mov -i video_b.mov -filter_complex '[0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a]' -map '[v]' -map '[a]' output.mp4

    ffmpeg version N-54271-g7f866c1 Copyright (c) 2000-2013 the FFmpeg developers
     built on Jun 29 2013 11:05:42 with gcc 4.7.2 (GCC) 20120921 (Red Hat 4.7.2-2)
     configuration: --enable-gpl --enable-nonfree --enable-pthreads --enable-libx264 --enable-libfaac --extra-cflags=-I/usr/local/include --extra-ldflags=-L/usr/local/lib
     libavutil      52. 37.101 / 52. 37.101
     libavcodec     55. 17.100 / 55. 17.100
     libavformat    55. 10.100 / 55. 10.100
     libavdevice    55.  2.100 / 55.  2.100
     libavfilter     3. 77.101 /  3. 77.101
     libswscale      2.  3.100 /  2.  3.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  3.100 / 52.  3.100

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'v1221-MTQxMzcyNTIxODU2.mov':
     Metadata:
       major_brand     : qt  
       minor_version   : 0
       compatible_brands: qt  
       creation_time   : 2013-03-28 20:34:59
       encoder         : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
       encoder-eng     : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
     Duration: 00:00:05.34, start: 0.000000, bitrate: 15837 kb/s
       Stream #0:0(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 221 kb/s
       Metadata:
         creation_time   : 2013-03-28 20:34:59
         handler_name    : Core Media Data Handler
       Stream #0:1(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 15512 kb/s, 29.81 fps, 30 tbr, 600 tbn, 1200 tbc
       Metadata:
         creation_time   : 2013-03-28 20:34:59
         handler_name    : Core Media Data Handler

    Input #1, mov,mp4,m4a,3gp,3g2,mj2, from 'v1224-MTQxMzcyNTIxODg5.mov':
     Metadata:
       major_brand     : qt  
       minor_version   : 0
       compatible_brands: qt  
       creation_time   : 2013-03-28 20:36:28
       encoder         : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
       encoder-eng     : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
     Duration: 00:00:04.13, start: 0.000000, bitrate: 15689 kb/s
       Stream #1:0(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 221 kb/s
       Metadata:
         creation_time   : 2013-03-28 20:36:28
         handler_name    : Core Media Data Handler
       Stream #1:1(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 15446 kb/s, 29.79 fps, 30 tbr, 600 tbn, 1200 tbc
       Metadata:
         creation_time   : 2013-03-28 20:36:28
         handler_name    : Core Media Data Handler

    Stream specifier ':0' in filtergraph description [0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a] matches no streams.

    A few other things to note:

    • The two videos I'm working with were shot with the same camera, so there shouldn't be any problems with aspect ratio or other gory video details.
    • I'm able to do other things with my ffmpeg installation, like convert one of those videos from .mov to .mp4 (yes, I had to recompile with faac...), which seems to vouch for both the ffmpeg install and the videos.
    • I've tried modifying the above invocation to produce a .mov file at the end, but I get the same error as before.
    • I've tried some stupid hacking tricks on the request above, like concatenating two copies of the same video, as well as some other invocations from other places around the web that involve filter_complex. Even on ones that were cited as working, I get the "matches no streams" message.
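
    One variant worth trying, not verified against this exact build, is to select the streams by type rather than by raw index. In both inputs above the audio stream comes first (Stream #0:0 and #1:0 are audio), so [0:0] and [1:0] feed audio into pads that the concat filter, with v=1:a=1, expects to be video. Using ffmpeg's type-based stream specifiers keeps the ordering right regardless of index order:

    ffmpeg -i video_a.mov -i video_b.mov -filter_complex '[0:v:0] [0:a:0] [1:v:0] [1:a:0] concat=n=2:v=1:a=1 [v] [a]' -map '[v]' -map '[a]' output.mp4

    Whether this also explains the "matches no streams" message is unclear, so treat it as a sketch rather than a confirmed fix.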