Advanced search

Media (1)

Keyword: - Tags - / video

Other articles (27)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer flash player is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The journey of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First of all, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are carried out in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)

On other sites (5624)

  • How can I render frames decoded by FFmpeg using hardware decoding with D3D11?

    14 June 2024, by mercuric taylor

    I have completed the process of decoding a video frame using FFmpeg. The format of the decoded frame is AV_PIX_FMT_NV12. Now I want to render this frame to the screen using D3D11. My questions are:

    1. What is the equivalent concept in D3D11 for a decoded frame? Is it a texture?
    2. I have seen many solutions that convert NV12 data to RGB, but it seems that DX11 no longer requires this conversion.
    3. I just want to display this frame, and since my frame is on the GPU, is there a more convenient way to render it directly on the GPU without copying?

    Please forgive my not-so-good English. Can anyone provide a reference example?

    I have already looked at this open-source project: https://github.com/balapradeepswork/D3D11NV12Rendering/tree/master/D3D11NV12Rendering
    But I don't understand it very well. Since I don't use DX11 to make games, just to show video, I am hoping for a simpler solution. This has been bothering me for weeks; can anyone give me some advice (even a good tutorial; the ones I can find are too old)? Thanks sincerely!
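
    In D3D11 terms a decoded frame ends up as an ID3D11Texture2D, and NV12 does not need to be converted to RGB on the CPU: you can create two shader resource views over one NV12 texture (R8 for luma, R8G8 for chroma) and do the YUV-to-RGB conversion in the pixel shader, which is the approach the linked D3D11NV12Rendering project appears to take. Below is a rough sketch, not a drop-in answer: the device, context and frame variables are assumptions, error handling is minimal, and DXGI_FORMAT_NV12 support should be checked with CheckFormatSupport. If you decode with the d3d11va hwaccel so that frame->format is AV_PIX_FMT_D3D11, frame->data[0] is already an ID3D11Texture2D* and the CPU upload below can be skipped entirely.

    // Sketch under the assumptions above; "device" and "context" are your existing
    // ID3D11Device / ID3D11DeviceContext, "frame" is the decoded NV12 AVFrame.
    #include <cstdint>
    #include <cstring>
    #include <d3d11.h>
    extern "C" {
    #include <libavutil/frame.h>
    }

    // Upload a CPU-side NV12 AVFrame into a single DXGI_FORMAT_NV12 texture.
    ID3D11Texture2D* UploadNV12(ID3D11Device* device, ID3D11DeviceContext* context,
                                const AVFrame* frame)
    {
        D3D11_TEXTURE2D_DESC desc = {};
        desc.Width            = frame->width;
        desc.Height           = frame->height;
        desc.MipLevels        = 1;
        desc.ArraySize        = 1;
        desc.Format           = DXGI_FORMAT_NV12;      // Y and interleaved UV in one resource
        desc.SampleDesc.Count = 1;
        desc.Usage            = D3D11_USAGE_DYNAMIC;
        desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;
        desc.CPUAccessFlags   = D3D11_CPU_ACCESS_WRITE;

        ID3D11Texture2D* tex = nullptr;
        if (FAILED(device->CreateTexture2D(&desc, nullptr, &tex)))
            return nullptr;

        // Copy row by row: AVFrame linesize and the mapped RowPitch usually differ.
        D3D11_MAPPED_SUBRESOURCE mapped;
        if (SUCCEEDED(context->Map(tex, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped))) {
            uint8_t* dstY = static_cast<uint8_t*>(mapped.pData);
            for (int y = 0; y < frame->height; ++y)
                memcpy(dstY + y * mapped.RowPitch,
                       frame->data[0] + y * frame->linesize[0], frame->width);
            uint8_t* dstUV = dstY + frame->height * mapped.RowPitch;  // UV plane follows Y
            for (int y = 0; y < frame->height / 2; ++y)
                memcpy(dstUV + y * mapped.RowPitch,
                       frame->data[1] + y * frame->linesize[1], frame->width);
            context->Unmap(tex, 0);
        }
        return tex;
    }

    // Two views over the same texture: the pixel shader samples luma (R8) and
    // chroma (R8G8) and converts YUV to RGB there, so no CPU conversion is needed.
    void CreateNV12Views(ID3D11Device* device, ID3D11Texture2D* tex,
                         ID3D11ShaderResourceView** luma,
                         ID3D11ShaderResourceView** chroma)
    {
        D3D11_SHADER_RESOURCE_VIEW_DESC sd = {};
        sd.ViewDimension       = D3D11_SRV_DIMENSION_TEXTURE2D;
        sd.Texture2D.MipLevels = 1;
        sd.Format = DXGI_FORMAT_R8_UNORM;
        device->CreateShaderResourceView(tex, &sd, luma);
        sd.Format = DXGI_FORMAT_R8G8_UNORM;
        device->CreateShaderResourceView(tex, &sd, chroma);
    }

    Bind both views, draw a full-screen quad, and apply a BT.601 or BT.709 YUV-to-RGB matrix in the pixel shader. In the d3d11va case, a GPU-side CopySubresourceRegion from the decoder's texture array slice into a shader-visible NV12 texture replaces the Map/memcpy above, so nothing crosses back to the CPU.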

    


  • Finding a minimal configuration for ffmpeg

    28 November 2022, by Clemens3000

    I am currently developing a video player for Windows and Mac using JUCE and FFmpeg. For this I want to include FFmpeg as a shared library and compile it myself in a minimal configuration, to get rid of features and dependencies I don't need. This turns out to be quite difficult, since FFmpeg has a HUGE load of dependencies, and I am pretty new to this compiling game.

    Of course I checked all the configuration options, but there are so many, and I don't have much knowledge about what they all actually do. Some are obvious, but many are not.

    The minimal configuration I am looking for should enable playback of video files encoded with H.264, and it should be LGPL compliant. So just decoding, no encoding.

    What I did:
    • I installed some of the dependencies I might need via MacPorts (under /opt/local)
    • I downloaded the FFmpeg source code, played around with the configure options and compiled it
    • I managed to build my app and it even runs
    • I created a configuration with "--disable-everything" and then tried to add features one after another until I built something that works. But that's like looking for a needle in a haystack.

    For Mac, my first attempt at this is:
    ./configure --disable-everything --prefix='../../osx_arm' --install-name-dir='@executable_path/../libs' --extra-ldflags=-Wl,-rpath,'@executable_path/../libs' --enable-version3 --enable-shared --disable-static --enable-swscale --enable-avfilter --enable-libfdk-aac --enable-audiotoolbox --enable-videotoolbox --enable-pthreads --cc=/usr/bin/clang --extra-ldflags=-L/opt/local/lib

    


    I can compile and link my program, and as expected I cannot do much more, which is totally fine for now.

    My questions now are:

    1. How can I find out which configure options are essential to enable simple playback? How do I find out which options can be discarded?

    2. If I run otool on the resulting libs in my current minimal build, I still get some dependencies:

    otool -L ../../osx_arm/lib/libavcodec.59.37.100.dylib
    ../../osx_arm/lib/libavcodec.59.37.100.dylib:
        @executable_path/../libs/libavcodec.59.dylib (compatibility version 59.0.0, current version 59.37.100)
        @executable_path/../libs/libavutil.57.dylib (compatibility version 57.0.0, current version 57.28.100)
        /opt/local/lib/libiconv.2.dylib (compatibility version 9.0.0, current version 9.1.0)
        /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1292.100.5)
        /System/Library/Frameworks/VideoToolbox.framework/Versions/A/VideoToolbox (compatibility version 1.0.0, current version 1.0.0)
        /opt/local/lib/libX11.6.dylib (compatibility version 11.0.0, current version 11.0.0)
        /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation (compatibility version 150.0.0, current version 1775.118.101)
        /System/Library/Frameworks/CoreMedia.framework/Versions/A/CoreMedia (compatibility version 1.0.0, current version 1.0.0)
        /System/Library/Frameworks/CoreVideo.framework/Versions/A/CoreVideo (compatibility version 1.2.0, current version 1.5.0)
        /System/Library/Frameworks/CoreServices.framework/Versions/A/CoreServices (compatibility version 1.0.0, current version 1122.33.0)

    Why does FFmpeg actually need libiconv or libX11? I believe those don't have much to do with video/audio, right? Well, I guess FFmpeg needs them anyway... So do I need to compile them myself too and add them to my application? I guess later I'll have even more dependencies.

    3. Any other tips on how to accomplish what I want?
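
    As a rough starting point rather than a verified recipe: for an LGPL, decode-only H.264 build most people start from --disable-everything together with --disable-autodetect and then re-enable only the pieces they actually use. The demuxer and protocol choices below (mov for MP4/MOV files, matroska, file) are assumptions about your inputs, not requirements:

    ./configure --prefix='../../osx_arm' \
        --disable-everything --disable-autodetect --disable-programs --disable-doc \
        --enable-shared --disable-static \
        --enable-decoder=h264 --enable-parser=h264 \
        --enable-demuxer=mov --enable-demuxer=matroska \
        --enable-protocol=file \
        --enable-videotoolbox --enable-hwaccel=h264_videotoolbox

    On the libiconv/libX11 question: both are pulled in by configure's autodetection (iconv for character-set conversion, xlib for the X11 device code), not by H.264 decoding itself, so --disable-autodetect, or --disable-iconv and --disable-xlib individually, should make those links disappear. Since only decoding is needed, --enable-libfdk-aac can probably be dropped as well; FFmpeg's built-in AAC decoder (--enable-decoder=aac) covers playback and keeps the build pure LGPL.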

  • How to fix an FFmpeg crash on Android 4/5/6 in the sws_setColorspaceDetails function?

    8 February 2023, by hachman

    I am using FFmpeg (version 4.2) with the cocos2d-x game engine, and it works well on all iOS versions and on Android API level 24 and above. However, it can crash on Android below level 24 (for example Android 6.0.1).

    I am using Bugly to record this problem; it locates the crash in the sws_setColorspaceDetails function, as shown here:

    


    #00 pc 01297b38 sws_setColorspaceDetails + 2704 [armeabi-v7a]

    java:
    org.cocos2dx.lib.Cocos2dxRenderer.onDrawFrame(Cocos2dxRenderer.java:94)
    android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1535)
    android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1240)


    


    So, how can I solve this problem? The function is in "libswscale/utils.c", and I can't use a try-catch to get more information.
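
    Two hedged suggestions rather than a confirmed diagnosis: the Java frames show the call running on the GL render thread, and an SwsContext is not thread-safe, so make sure each context is created, configured and used from a single thread only; and because the fault is inside libswscale's armeabi-v7a code, rebuilding FFmpeg for that ABI with --disable-asm (or just --disable-neon) is a quick way to rule out the hand-written assembly paths that only those older 32-bit devices hit. A defensive calling pattern looks roughly like this (the name sws_ctx and the colorspace choice are assumptions, not taken from your code):

    extern "C" {
    #include <libswscale/swscale.h>
    }

    // Call on the same thread that created the context, before the first sws_scale().
    static void configure_colorspace(SwsContext* sws_ctx)
    {
        if (!sws_ctx)
            return;

        const int* coeffs = sws_getCoefficients(SWS_CS_ITU709); // or SWS_CS_ITU601 for SD content
        const int srcRange = 0;            // 0 = limited (MPEG) range, 1 = full (JPEG) range
        const int dstRange = 0;
        const int brightness = 0;          // defaults, in libswscale's fixed-point form
        const int contrast   = 1 << 16;
        const int saturation = 1 << 16;

        if (sws_setColorspaceDetails(sws_ctx, coeffs, srcRange,
                                     coeffs, dstRange,
                                     brightness, contrast, saturation) < 0) {
            // Some unscaled special-case converters don't support custom colorspace
            // details; keep the defaults instead of retrying.
        }
    }

    If the crash still points inside sws_setColorspaceDetails after that, the --disable-asm build for armeabi-v7a is the fastest way to confirm or exclude a NEON code path as the cause.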