
Other articles (71)

  • User profiles

    12 April 2011, by

    Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in on the site.
    The user can access profile editing from their author page; a "Modifier votre profil" (Edit your profile) link in the navigation is (...)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" (Administer) section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" (Language management) section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language. In that case, it becomes greyed out in the configuration and (...)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, or XMP, an XML-based metadata format used in PDF, photography, and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use in the context of the Semantic Web.
    XMP can store, as an XML document, information about a file: title, author, history (...) A minimal packet sketch follows below.
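    (As a small, hand-written illustration of the format described above, not taken from the article: a minimal XMP packet is an XML document rooted at x:xmpmeta, wrapping an rdf:Description that carries properties such as a Dublin Core title and creator. The Python snippet below builds such a simplified packet as a string and reads the title back; the title and author values are made up for the example.)

    import xml.etree.ElementTree as ET

    # Minimal, simplified XMP packet: an rdf:Description carrying a Dublin Core
    # title and creator (real packets typically hold many more properties).
    xmp_packet = """<x:xmpmeta xmlns:x="adobe:ns:meta/">
      <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
        <rdf:Description rdf:about="">
          <dc:title>
            <rdf:Alt><rdf:li xml:lang="x-default">Sample image</rdf:li></rdf:Alt>
          </dc:title>
          <dc:creator>
            <rdf:Seq><rdf:li>Jane Doe</rdf:li></rdf:Seq>
          </dc:creator>
        </rdf:Description>
      </rdf:RDF>
    </x:xmpmeta>"""

    ns = {
        "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
        "dc": "http://purl.org/dc/elements/1.1/",
    }
    root = ET.fromstring(xmp_packet)
    print(root.find(".//dc:title/rdf:Alt/rdf:li", ns).text)  # -> Sample image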

On other sites (10510)

  • How to combine 100 video clips with transitions using FFMPEG? [closed]

    17 October 2024, by Muhammad Lutfi Rahmawan

    I've been working on a project that uses FFMPEG to process video. The goal is to produce a single video from several clips, combined with a transition between each pair of clips. We've succeeded in creating a video from 20 clips, with 19 transitions applied to merge them. But with a larger number of clips, say 50, it fails.

    


    The first thing we do is split the original video into several clips based on the user's selection, and that command succeeds (see the splitting log). Then we combine the clips with transitions using the command shown further below.
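    (The split command itself isn't shown here. Purely as an illustration, and not the author's actual command, a per-clip cut with an explicit start time and duration might look like the Python sketch below; the file names and timings are made up.)

    import subprocess

    # Hypothetical sketch of the splitting step: cut one clip out of the source,
    # re-encoding so the clip starts cleanly at the requested time.
    def cut_clip(source, start, duration, out_path):
        subprocess.run(
            [
                "ffmpeg", "-y",
                "-ss", str(start),       # clip start time (seconds)
                "-t", str(duration),     # clip length (seconds)
                "-i", source,
                "-c:v", "libx264", "-c:a", "aac",
                out_path,
            ],
            check=True,
        )

    cut_clip("/tmp/source.mp4", 10.0, 25.0, "/tmp/0.mp4")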

    


    Here is the sample FFMPEG command that failed:

    


    ffmpeg -y -i /tmp/0.mp4 -i /tmp/1.mp4 -i /tmp/2.mp4 -i /tmp/3.mp4 -i /tmp/4.mp4 -i /tmp/5.mp4 -i /tmp/6.mp4 -i /tmp/7.mp4 -i /tmp/8.mp4 -i /tmp/9.mp4 -i /tmp/10.mp4 -i /tmp/11.mp4 -i /tmp/12.mp4 -i /tmp/13.mp4 -i /tmp/14.mp4 -i /tmp/15.mp4 -i /tmp/16.mp4 -i /tmp/17.mp4 -i /tmp/18.mp4 -i /tmp/19.mp4 -i /tmp/20.mp4 -i /tmp/21.mp4 -i /tmp/22.mp4 -i /tmp/23.mp4 -i /tmp/24.mp4 -i /tmp/25.mp4 -i /tmp/26.mp4 -i /tmp/27.mp4 -i /tmp/28.mp4 -i /tmp/29.mp4 -i /tmp/30.mp4 -i /tmp/31.mp4 -i /tmp/32.mp4 -i /tmp/33.mp4 -i /tmp/34.mp4 -i /tmp/35.mp4 -i /tmp/36.mp4 -i /tmp/37.mp4 -i /tmp/38.mp4 -i /tmp/39.mp4 -i /tmp/40.mp4 -i /tmp/41.mp4 -i /tmp/42.mp4 -i /tmp/43.mp4 -i /tmp/44.mp4 -i /tmp/45.mp4 -i /tmp/46.mp4 -i /tmp/47.mp4 -i /tmp/48.mp4 -i /tmp/49.mp4 -filter_complex "[0:v][1:v]xfade=transition=circlecrop:duration=0.5:offset=24.474[tv0];[0:a][1:a]acrossfade=d=0.5[ta0];[tv0][2:v]xfade=transition=circlecrop:duration=0.5:offset=31.674[tv1];[ta0][2:a]acrossfade=d=0.5[ta1];[tv1][3:v]xfade=transition=circlecrop:duration=0.5:offset=37.234[tv2];[ta1][3:a]acrossfade=d=0.5[ta2];[tv2][4:v]xfade=transition=circlecrop:duration=0.5:offset=55.348[tv3];[ta2][4:a]acrossfade=d=0.5[ta3];[tv3][5:v]xfade=transition=circlecrop:duration=0.5:offset=76.07[tv4];[ta3][5:a]acrossfade=d=0.5[ta4];[tv4][6:v]xfade=transition=circlecrop:duration=0.5:offset=82.622[tv5];[ta4][6:a]acrossfade=d=0.5[ta5];[tv5][7:v]xfade=transition=circlecrop:duration=0.5:offset=103.122[tv6];[ta5][7:a]acrossfade=d=0.5[ta6];[tv6][8:v]xfade=transition=circlecrop:duration=0.5:offset=114.502[tv7];[ta6][8:a]acrossfade=d=0.5[ta7];[tv7][9:v]xfade=transition=circlecrop:duration=0.5:offset=122.258[tv8];[ta7][9:a]acrossfade=d=0.5[ta8];[tv8][10:v]xfade=transition=circlecrop:duration=0.5:offset=130.094[tv9];[ta8][10:a]acrossfade=d=0.5[ta9];[tv9][11:v]xfade=transition=circlecrop:duration=0.5:offset=134.33[tv10];[ta9][11:a]acrossfade=d=0.5[ta10];[tv10][12:v]xfade=transition=circlecrop:duration=0.5:offset=141.85[tv11];[ta10][12:a]acrossfade=d=0.5[ta11];[tv11][13:v]xfade=transition=circlecrop:duration=0.5:offset=145.59[tv12];[ta11][13:a]acrossfade=d=0.5[ta12];[tv12][14:v]xfade=transition=circlecrop:duration=0.5:offset=154.314[tv13];[ta12][14:a]acrossfade=d=0.5[ta13];[tv13][15:v]xfade=transition=circlecrop:duration=0.5:offset=155.998[tv14];[ta13][15:a]acrossfade=d=0.5[ta14];[tv14][16:v]xfade=transition=circlecrop:duration=0.5:offset=164.924[tv15];[ta14][16:a]acrossfade=d=0.5[ta15];[tv15][17:v]xfade=transition=circlecrop:duration=0.5:offset=168.184[tv16];[ta15][17:a]acrossfade=d=0.5[ta16];[tv16][18:v]xfade=transition=circlecrop:duration=0.5:offset=174.796[tv17];[ta16][18:a]acrossfade=d=0.5[ta17];[tv17][19:v]xfade=transition=circlecrop:duration=0.5:offset=186.724[tv18];[ta17][19:a]acrossfade=d=0.5[ta18];[tv18][20:v]xfade=transition=circlecrop:duration=0.5:offset=191.23[tv19];[ta18][20:a]acrossfade=d=0.5[ta19];[tv19][21:v]xfade=transition=circlecrop:duration=0.5:offset=195.778[tv20];[ta19][21:a]acrossfade=d=0.5[ta20];[tv20][22:v]xfade=transition=circlecrop:duration=0.5:offset=198.118[tv21];[ta20][22:a]acrossfade=d=0.5[ta21];[tv21][23:v]xfade=transition=circlecrop:duration=0.5:offset=201.506[tv22];[ta21][23:a]acrossfade=d=0.5[ta22];[tv22][24:v]xfade=transition=circlecrop:duration=0.5:offset=204.422[tv23];[ta22][24:a]acrossfade=d=0.5[ta23];[tv23][25:v]xfade=transition=circlecrop:duration=0.5:offset=210.243[tv24];[ta23][25:a]acrossfade=d=0.5[ta24];[tv24][26:v]xfade=transition=circlecrop:duration=0.5:offset=215.417[tv25];[ta24][26:a]acrossfade=d=0.5[ta25];[tv25][27:v]xfade=transition=circlecrop:duration=0.5:offset=219.057[tv26];[ta25][27:a]acrossfade=d=0.5[ta26];[tv26][28:v]xfade=transition=circlecrop:duration=0.5:offset=221.277[tv27];[ta26][28:a]acrossfade=d=0.5[ta27];[tv27][29:v]xfade=transition=circlecrop:duration=0.5:offset=224.875[tv28];[ta27][29:a]acrossfade=d=0.5[ta28];[tv28][ta28][30:v][30:a][31:v][31:a][32:v][32:a][33:v][33:a][34:v][34:a][35:v][35:a][36:v][36:a][37:v][37:a][38:v][38:a][39:v][39:a][40:v][40:a][41:v][41:a][42:v][42:a][43:v][43:a][44:v][44:a][45:v][45:a][46:v][46:a][47:v][47:a][48:v][48:a][49:v][49:a]concat=n=21:v=1:a=1[v][a]" -map "[v]" -map "[a]" /tmp/export-clip-94989d271066ace00459-9ba57089-d833-4cef-beb7-3c847e9958af.mp4
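    (For reference, not from the original post: the filter graph above follows a regular pattern, where each clip is joined to the running result with an xfade step for video and an acrossfade step for audio. A rough Python sketch of how such a chain can be generated for N clips is shown below; it assumes the per-clip durations are known, and the offset formula is one common choice, not necessarily the one used to produce the command above.)

    # Rough sketch (not the author's generator): build a chained xfade/acrossfade
    # command for N clips, given their durations in seconds.
    def build_xfade_command(paths, durations, transition="circlecrop", fade=0.5):
        inputs = []
        for p in paths:
            inputs += ["-i", p]

        filters = []
        v_prev, a_prev = "[0:v]", "[0:a]"
        offset = 0.0
        for i in range(1, len(paths)):
            # Common choice: each transition starts `fade` seconds before the cut.
            offset += durations[i - 1] - fade
            v_out, a_out = f"[tv{i-1}]", f"[ta{i-1}]"
            filters.append(
                f"{v_prev}[{i}:v]xfade=transition={transition}"
                f":duration={fade}:offset={offset:.3f}{v_out}"
            )
            filters.append(f"{a_prev}[{i}:a]acrossfade=d={fade}{a_out}")
            v_prev, a_prev = v_out, a_out

        return ["ffmpeg", "-y", *inputs,
                "-filter_complex", ";".join(filters),
                "-map", v_prev, "-map", a_prev, "out.mp4"]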


    


    The error is as follows:

    


    ffmpeg version 7.1 Copyright (c) 2000-2024 the FFmpeg developers
      built with gcc 13 (Ubuntu 13.2.0-23ubuntu4)
      configuration: --disable-debug --disable-doc --disable-ffplay --enable-alsa --enable-cuda-llvm --enable-cuvid --enable-ffprobe --enable-gpl --enable-libaom --enable-libass --enable-libdav1d --enable-libfdk_aac --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libkvazaar --enable-liblc3 --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libplacebo --enable-librav1e --enable-librist --enable-libshaderc --enable-libsrt --enable-libsvtav1 --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpl --enable-libvpx --enable-libvvenc --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-nonfree --enable-nvdec --enable-nvenc --enable-opencl --enable-openssl --enable-stripping --enable-vaapi --enable-vdpau --enable-version3 --enable-vulkan
      libavutil      59. 39.100 / 59. 39.100
      libavcodec     61. 19.100 / 61. 19.100
      libavformat    61.  7.100 / 61.  7.100
      libavdevice    61.  3.100 / 61.  3.100
      libavfilter    10.  4.100 / 10.  4.100
      libswscale      8.  3.100 /  8.  3.100
      libswresample   5.  3.100 /  5.  3.100
      libpostproc    58.  3.100 / 58.  3.100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x56102f0e5580] moov atom not found
    [in#0 @ 0x56102f0ed480] Error opening input: Invalid data found when processing input
    Error opening input file /tmp/0.mp4.
    Error opening input files: Invalid data found when processing input


    


    It states that /tmp/0.mp4 contains invalid data ("moov atom not found").

    


    This only occurs when the number of clips is greater than 20.
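    (A diagnostic suggestion, not from the original post: since the error points at a specific input file rather than at the filter graph, it can help to probe every split clip before the combining step. A missing moov atom usually means the MP4 was never fully written; ffprobe reports this per file. A minimal sketch, assuming the clips live at /tmp/0.mp4 .. /tmp/49.mp4 as above:)

    import subprocess

    # Hypothetical pre-check: run ffprobe on each split clip before combining.
    # Clips that were truncated or never finalised will report errors here.
    def find_broken_clips(paths):
        broken = []
        for path in paths:
            result = subprocess.run(
                ["ffprobe", "-v", "error", path],
                capture_output=True, text=True,
            )
            if result.returncode != 0 or result.stderr.strip():
                broken.append((path, result.stderr.strip()))
        return broken

    for path, err in find_broken_clips([f"/tmp/{i}.mp4" for i in range(50)]):
        print(path, err)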

    


    Additional info:

    


      

    • It runs without any error in my local environment on Windows 11 (FFMPEG version N-114902-g277f051ff6-20240421).
    • It fails in an AWS Fargate environment using an Ubuntu-based Docker image (FFMPEG version 7.1).


    


    I hope I can get a workable answer for my case, or at least understand what is wrong with my approach.

    


  • avutil: Fix linking x86 asm constants with Clang in MSVC mode

    12 June, by Martin Storsjö
    avutil: Fix linking x86 asm constants with Clang in MSVC mode
    

    This fixes building with Clang in MSVC mode, for x86, which was
    broken in 6e49b8699657b808b7dc80033f2c3f2d0e029fa3 (in Nov 2024);
    previously it failed with undefined symbols for the constants
    defined with DECLARE_ASM_CONST, accessed via inline assembly.

    Before 57861911a34e1c33796be97f2b2f44e05fffd647, there was an
    #elif defined(__GNUC__) || defined(__clang__)
    case before the
    #elif defined(_MSC_VER)
    case for defining DECLARE_ASM_CONST, which included av_used.
    (This case included the explicit "defined(__clang__)" since
    f637046d3134a331e4b5a7243ac3dfb92735b8a5.)

    After 57861911a34e1c33796be97f2b2f44e05fffd647, it used the
    generic definition of DECLARE_ASM_CONST that also included
    av_used - which also worked for Clang in MSVC mode. But after
    6e49b8699657b808b7dc80033f2c3f2d0e029fa3, Clang in MSVC mode
    ended up using the MSVC specific variant which lacked the
    av_used declaration, causing linker errors due to undefined
    symbols.

    Signed-off-by: Martin Storsjö <martin@martin.st>

    • [DH] libavutil/mem_internal.h
  • I created Python code to capture live video using FFmpeg, but the output screen only shows noise

    16 October 2024, by chun3 hyun

    The code below is Python code that captures my computer's screen in real time via ffmpeg.


    When I run the code, everything seems to work up to the point where a new window named 'Captured Frame' is created. But this 'Captured Frame' window doesn't show my computer's screen; instead it shows a gray image full of noise.


    import cv2
    import numpy as np
    import subprocess

    def frame_capture():
        # Set FFmpeg command (capture desired window or area)
        ffmpeg_command = [
            'ffmpeg',
            '-f', 'gdigrab',  # Windows screen capture (using gdigrab)
            '-framerate', '30',  # Setting the Frame Speed
            '-i', 'desktop',  # What to capture (for example, full screen)
            '-pix_fmt', 'bgr0',
            '-vcodec', 'rawvideo',  # Video codec settings
            '-tune', 'zerolatency',
            '-an',  # Disable audio
            '-sn',  # Disable Caption
            '-f', 'rawvideo', '-'
        ]

        # Running the FFmpeg process
        process = subprocess.Popen(ffmpeg_command, stdout=subprocess.PIPE, bufsize=10**8)

        while True:
            # Read Frame from FFmpeg (Resolution Example: 1920x1080)
            raw_frame = process.stdout.read(1920 * 1080 * 3)  # 1920x1080 resolution, BGR format
            if not raw_frame:
                break  # Shut down the loop when you can no longer receive frames

            # Converting frame data to a numpy array
            frame = np.frombuffer(raw_frame, np.uint8).reshape((1080, 1920, 3))

            # Add frame processing code here
            # Example: Showing a frame on the screen
            cv2.imshow('Captured Frame', frame)

            # Press the 'q' key to end
            if cv2.waitKey(1) & 0xFF == ord('q'):
                break

        # End of process and release of resources
        process.stdout.close()
        process.wait()
        cv2.destroyAllWindows()
    frame_capture()
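    (A minimal sketch, not part of the original question: -pix_fmt bgr0 yields 4 bytes per pixel, while the loop above reads 1920 * 1080 * 3 bytes and reshapes to 3 channels. A raw-video pipe only decodes cleanly when the requested pixel format, the bytes read per frame, and the reshape all agree; below is one consistent pairing, assuming a fixed 1920x1080 frame size.)

    import numpy as np
    import subprocess

    WIDTH, HEIGHT = 1920, 1080  # assumed capture size; must match the actual frames

    # Ask FFmpeg for bgr24 (3 bytes per pixel) so the read size and reshape agree.
    cmd = [
        'ffmpeg',
        '-f', 'gdigrab', '-framerate', '30', '-i', 'desktop',
        '-vf', f'scale={WIDTH}:{HEIGHT}',  # force the assumed frame size
        '-pix_fmt', 'bgr24',               # 3 bytes per pixel
        '-f', 'rawvideo', '-',
    ]
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, bufsize=10**8)

    frame_bytes = WIDTH * HEIGHT * 3       # exactly one bgr24 frame
    raw = proc.stdout.read(frame_bytes)    # a robust reader would loop until full
    frame = np.frombuffer(raw, np.uint8).reshape((HEIGHT, WIDTH, 3))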


    What could I have done wrong? When I enter the FFmpeg command directly in the Windows command prompt (cmd) as shown below to save the video (in .mp4 format), the screen is recorded normally in the saved file. So FFmpeg itself seems to be installed correctly, but I don't know what the cause is.


    hwnd=132554 -pix_fmt yuv420p -vf "scale=iw-mod(iw\,2):ih-mod(ih\,2)" -draw_mouse 1 -t 10 output.mp4


    The handle number written above was the handle of the active Chrome window on my computer.


    My ffmpeg version is 2024-10-10-git-0f5592cfc7-full_build-www.gyan.dev. My Python version is 3.12.4. My Windows version and build: Windows 11 Home, 10.0.22631.


    I tried capturing the computer screen with FFmpeg, but the output shows only noise.
