
Media (91)

Other articles (72)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to make other manual (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in farm mode, you will also need to make other modifications (...)

  • Improving the base version

    13 September 2013

    Nice multiple selection
    The Chosen plugin improves the ergonomics of multiple-selection fields. See the following two images to compare.
    To do this, simply activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen), enabling the use of Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-selection lists (...)

On other sites (14849)

  • Is there something equivalent to FilterKit for Android?

    29 October 2014, by Varun Jain

    I am looking to put filters on a video after it has been recorded. I have explored FFmpeg, but it is far too slow (it takes 10-15 seconds for a 20-second video), while the Instagram app does it in under 1-2 seconds every time. Sample FFmpeg command I am using:

    ffmpeg -y -i vid_0.mp4 -preset ultrafast  -vf format=gray,format=yuv422p -vsync 2  webcam.mp4

    Is there a library similar to FilterKit for Android? Is there a feasible way to use FilterKit on Android? Any other approach to applying filters to video would also be welcome.

    Note: I am not talking about giving the user a preview; that is easily done with a surface view. I need to convert the video itself.
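
    Since the complaint is raw conversion speed, one way to compare filter and preset settings is to time the exact command from Python. A minimal sketch, reusing the illustrative file names from the command above:

    import subprocess
    import time

    # Time the exact filter command quoted above (file names are illustrative).
    cmd = ["ffmpeg", "-y", "-i", "vid_0.mp4",
           "-preset", "ultrafast",
           "-vf", "format=gray,format=yuv422p",
           "-vsync", "2",
           "webcam.mp4"]

    start = time.perf_counter()
    subprocess.run(cmd, check=True, capture_output=True)
    print(f"conversion took {time.perf_counter() - start:.2f} s")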

  • ffmpeg "End mismatch 1" warning, jpeg2000 to avi

    11 April 2023, by jklebes

    Trying to convert a directory of jpeg2000 grayscale images to a video with ffmpeg, I get warnings:

    [0;36m[jpeg2000 @ 0x55d8fa1b68c0] [0m[0;33mEnd mismatch 1

    (and lots of

        Last message repeated <n> times

    )

    The command was:

    ffmpeg -y -r 10 -start_number 1 -i <path>/surface_30///img_000%01d.jp2 -vcodec msmpeg4 -vf scale=1920:-1 -q:v 8 <path>//surface_30///surface_30.avi

    The output is:

    ffmpeg version 4.2.2 Copyright (c) 2000-2019 the FFmpeg developers
      built with gcc 7.3.0 (crosstool-NG 1.23.0.449-a04d0)
      configuration: --prefix=/home/jklebes001/miniconda3 --cc=/tmp/build/80754af9/ffmpeg_1587154242452/_build_env/bin/x86_64-conda_cos6-linux-gnu-cc --disable-doc --enable-avresample --enable-gmp --enable-hardcoded-tables --enable-libfreetype --enable-libvpx --enable-pthreads --enable-libopus --enable-postproc --enable-pic --enable-pthreads --enable-shared --enable-static --enable-version3 --enable-zlib --enable-libmp3lame --disable-nonfree --enable-gpl --enable-gnutls --disable-openssl --enable-libopenh264 --enable-libx264
      libavutil      56. 31.100 / 56. 31.100
      libavcodec     58. 54.100 / 58. 54.100
      libavformat    58. 29.100 / 58. 29.100
      libavdevice    58.  8.100 / 58.  8.100
      libavfilter     7. 57.100 /  7. 57.100
      libavresample   4.  0.  0 /  4.  0.  0
      libswscale      5.  5.100 /  5.  5.100
      libswresample   3.  5.100 /  3.  5.100
      libpostproc    55.  5.100 / 55.  5.100
    [0;36m[jpeg2000 @ 0x55cb44144480] [0m[0;33mEnd mismatch 1
    [0m    Last message repeated 1 times
        Last message repeated 2 times
        Last message repeated 3 times

    ...

        Last message repeated 73 times
    Input #0, image2, from '<path>//surface_30///img_000%01d.jp2':
      Duration: 00:00:00.20, start: 0.000000, bitrate: N/A
        Stream #0:0: Video: jpeg2000, gray, 6737x4869, 25 tbr, 25 tbn, 25 tbc
    Stream mapping:
      Stream #0:0 -> #0:0 (jpeg2000 (native) -> msmpeg4v3 (msmpeg4))
    Press [q] to stop, [?] for help
    [0;36m[jpeg2000 @ 0x55cb4418e200] [0m[0;33mEnd mismatch 1
    [0m[0;36m[jpeg2000 @ 0x55cb441900c0] [0m[0;33mEnd mismatch 1

    ...

    (about 600 lines of "End mismatch" and "Last message repeated" cut)

    ...

    [0m[0;36m[jpeg2000 @ 0x55cb4418e8c0] [0m[0;33mEnd mismatch 1
    [0mOutput #0, avi, to '<path>/surface_30///surface_30.avi':
      Metadata:
        ISFT            : Lavf58.29.100
        Stream #0:0: Video: msmpeg4v3 (msmpeg4) (MP43 / 0x3334504D), yuv420p, 1920x1388, q=2-31, 200 kb/s, 10 fps, 10 tbn, 10 tbc
        Metadata:
          encoder         : Lavc58.54.100 msmpeg4
        Side data:
          cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
    frame=    2 fps=0.8 q=8.0 size=       6kB time=00:00:00.20 bitrate= 227.1kbits/s speed=0.0844x
    frame=    5 fps=1.7 q=8.0 size=       6kB time=00:00:00.50 bitrate=  90.8kbits/s speed=0.172x
    frame=    5 fps=1.7 q=8.0 Lsize=     213kB time=00:00:00.50 bitrate=3494.7kbits/s speed=0.172x
    video:208kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.732246%

    What is the meaning of character sequences like [0;33m here?
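
    For reference, sequences like [0;33m look like ANSI terminal colour codes (yellow, in this case) whose leading escape byte was lost when the log was pasted; ffmpeg emits them when logging to a terminal, and setting the AV_LOG_FORCE_NOCOLOR environment variable suppresses them. A minimal Python sketch of stripping such codes from a captured log:

    import re

    # ANSI "select graphic rendition" codes have the form ESC[...m; in the
    # pasted log the ESC byte (\x1b) was dropped, leaving bare "[0;33m" text.
    ansi_sgr = re.compile(r"\x1b?\[[0-9;]*m")

    line = "\x1b[0;36m[jpeg2000 @ 0x55cb44144480] \x1b[0m\x1b[0;33mEnd mismatch 1"
    print(ansi_sgr.sub("", line))  # -> [jpeg2000 @ 0x55cb44144480] End mismatch 1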


    I thought it might have something to do with bit depth and color format. Setting -pix_fmt gray had no effect, and indeed the format of the jp2 images is already detected as 8-bit gray.


    The output .avi exists and seems fine.


    The same command was previously used on jpeg files and works fine there. With jpeg input, the output has the lines


    Input #0, image2, from '<path>/surface_30///img_000%01d.jpeg':
      Duration: 00:00:00.16, start: 0.000000, bitrate: N/A
        Stream #0:0: Video: mjpeg (Baseline), gray(bt470bg/unknown/unknown), 6737x4869 [SAR 1:1 DAR 6737:4869], 25 tbr, 25 tbn, 25 tbc
    Stream mapping:
      Stream #0:0 -> #0:0 (mjpeg (native) -> msmpeg4v3 (msmpeg4))
    Press [q] to stop, [?] for help
    Output #0, avi, to '<path>/surface_30///surface_30.avi':
      Metadata:
        ISFT            : Lavf58.29.100
        Stream #0:0: Video: msmpeg4v3 (msmpeg4) (MP43 / 0x3334504D), yuv420p, 6737x4869 [SAR 1:1 DAR 6737:4869], q=2-31, 200 kb/s, 10 fps, 10 tbn, 10 tbc
        Metadata:
          encoder         : Lavc58.54.100 msmpeg4
        Side data:
          cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
    frame=    2 fps=0.0 q=8.0 size=    6662kB time=00:00:00.20 bitrate=272859.9kbits/s speed=0.334x
    frame=    3 fps=2.2 q=10.0 size=   10502kB time=00:00:00.30 bitrate=286764.2kbits/s speed=0.22x
    frame=    4 fps=1.9 q=12.3 size=   13574kB time=00:00:00.40 bitrate=277987.7kbits/s speed=0.19x
    frame=    4 fps=1.4 q=12.3 size=   13574kB time=00:00:00.40 bitrate=277987.7kbits/s speed=0.145x
    frame=    4 fps=1.4 q=12.3 Lsize=   13657kB time=00:00:00.40 bitrate=279702.3kbits/s speed=0.145x
    video:13652kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.041926%


    detecting the mjpeg format and a similar, though more detailed, pixel format: gray(bt470bg/unknown/unknown), 6737x4869 [SAR 1:1 DAR 6737:4869].


    What is the difference when switching the input to jp2?
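
    One way to pin down the difference is to compare what ffprobe reports for a single frame of each type. A minimal sketch; the file names follow the question's pattern and are illustrative:

    import subprocess

    # Compare how ffprobe describes a .jpeg frame versus a .jp2 frame.
    for name in ["img_0001.jpeg", "img_0001.jp2"]:  # illustrative names
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "stream=codec_name,pix_fmt,color_range,color_space",
             "-of", "default=noprint_wrappers=1", name],
            capture_output=True, text=True,
        )
        print(name)
        print(out.stdout)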


  • Why is YUV data read via ffmpeg different from the original input YUV?

    13 May 2021, by sianyi Huang

    I use ffmpeg to make an HDR test video. My approach is to write an image, convert the image to yuv420p, and then use ffmpeg to make the HDR test video.


    But I found that the YUV data read back from the mp4 is different from the original input. I have been stuck here for a while; does anyone know how to read the correct YUV data from an mp4?


    import os
    import subprocess
    from time import sleep

    import cv2 as cv
    import imageio
    import numpy as np

    def kill_existing_file(path):
        # remove a stale output file if present
        if os.path.exists(path):
            os.remove(path)

    # ffmpeg encode command
    ffmpeg_encode_mp4 = \
    "ffmpeg -y -s 100*100 -pix_fmt yuv420p -threads 4 -r 1 -stream_loop -1 -f rawvideo -i write_yuv.yuv -vf \
    scale=out_h_chr_pos=0:out_v_chr_pos=0,format=yuv420p10le \
    -c:v libx265 -tag:v hvc1 -t 10 -pix_fmt yuv420p10le -preset medium -x265-params \
    crf=12:colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:master-display=\"G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)\":max-cll=\"1000,400\" \
    -an test.mp4"

    # ffmpeg command to read yuv back from the mp4
    ffmpeg_extract_yuv = "ffmpeg -i test.mp4 -vframes 1 -c:v rawvideo -pix_fmt yuv420p read_yuv.yuv"

    # make a 100x100 raw yuv frame of uniform gray
    w, h = 100, 100
    test_gray = 255
    test = np.full((h, w, 3), test_gray, dtype=np.uint8)
    yuv_cv = cv.cvtColor(test, cv.COLOR_RGB2YUV_I420)
    yuv_cv.tofile("write_yuv.yuv")

    # encode the raw yuv to mp4 with HDR metadata
    print(ffmpeg_encode_mp4)
    result = subprocess.check_output(ffmpeg_encode_mp4, shell=True)
    print(result)
    sleep(0.5)

    # extract yuv from the mp4
    kill_existing_file("read_yuv.yuv")
    print(ffmpeg_extract_yuv)
    result = subprocess.check_output(ffmpeg_extract_yuv, shell=True)
    print(result)
    sleep(0.5)

    write_yuv = np.fromfile("write_yuv.yuv", dtype='uint8')
    read_yuv = np.fromfile("read_yuv.yuv", dtype='uint8')

    print("input gray:", test_gray)
    print("write_yuv", write_yuv[:10])
    print("read_yuv", read_yuv[:10])

    reader = imageio.get_reader("test.mp4")
    img = reader.get_data(0)
    print("imgeio read:", img[50, 50])

    '''
    output result:
    input gray: 255
    write_yuv [235 235 235 235 235 235 235 235 235 235]
    read_yuv [234 235 234 235 234 235 234 235 234 235]
    imgeio read: [253 253 253]
    '''


    I have no idea how to validate that the video I made is correct. Any feedback will be very much appreciated!
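
    For what it is worth, a difference of one code value is what a lossy round trip would produce here: libx265 at crf=12 is not lossless, and extracting with -pix_fmt yuv420p rounds the decoded 10-bit samples back down to 8 bits. A minimal validation sketch under that assumption, reusing the file names above; the factor of 4 assumes the usual bit-shift when 8-bit codes are upconverted to 10 bits:

    import subprocess
    import numpy as np

    # Extract at the encoded 10-bit depth to skip the extra 10-to-8-bit rounding.
    subprocess.check_output(
        "ffmpeg -y -i test.mp4 -vframes 1 -c:v rawvideo -pix_fmt yuv420p10le read_yuv10.yuv",
        shell=True)

    write_yuv = np.fromfile("write_yuv.yuv", dtype=np.uint8)
    read_yuv10 = np.fromfile("read_yuv10.yuv", dtype="<u2")  # 10-bit samples in little-endian 16-bit words

    # An 8-bit code c becomes roughly the 10-bit code 4*c on upconversion, so
    # compare against write_yuv * 4 and treat small deviations as encoder loss.
    expected = write_yuv.astype(np.int32) * 4
    diff = np.abs(expected - read_yuv10[:expected.size].astype(np.int32))
    print("max deviation in 10-bit steps:", diff.max())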
