Media (0)

No media matching your criteria is available on the site.

Other articles (38)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player used by MediaSPIP was created specifically for it and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • Contribute to translation

    13 April 2011

    You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, allowing it to spread to new linguistic communities.
    To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
    MediaSPIP is currently available in French and English (...)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (4084)

  • dnn: add openvino as one of dnn backend

    25 May 2020, by Guo, Yejun
    dnn: add openvino as one of dnn backend
    

    OpenVINO is a Deep Learning Deployment Toolkit at
    https://github.com/openvinotoolkit/openvino. It supports CPU, GPU
    and heterogeneous plugins to accelerate deep learning inference.

    Please refer to https://github.com/openvinotoolkit/openvino/blob/master/build-instruction.md
    to build openvino (the C library is built at the same time). Please add the
    option -DENABLE_MKL_DNN=ON to cmake to enable the CPU path. The header
    files and libraries are installed to /usr/local/deployment_tools/inference_engine/
    with the default options on my system.
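
    A minimal build sketch along those lines (the clone and submodule steps and the install prefix are assumptions, not part of the commit message; build-instruction.md is the authoritative reference):

    $ git clone --recursive https://github.com/openvinotoolkit/openvino.git
    $ cd openvino && mkdir build && cd build
    $ cmake -DENABLE_MKL_DNN=ON ..
    $ make && sudo make install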

    To build FFmpeg with openvino, take my system as an example, run with:
    $ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/deployment_tools/inference_engine/lib/intel64/:/usr/local/deployment_tools/inference_engine/external/tbb/lib/
    $ ../ffmpeg/configure --enable-libopenvino --extra-cflags=-I/usr/local/deployment_tools/inference_engine/include/ --extra-ldflags=-L/usr/local/deployment_tools/inference_engine/lib/intel64
    $ make

    Here are the features provided by the OpenVINO inference engine:
    - support for more DNN model formats
    It supports TensorFlow, Caffe, ONNX, MXNet and Kaldi by converting them
    into the OpenVINO format with a Python script, and a Torch model
    can first be converted into ONNX and then to the OpenVINO format
    (a conversion sketch follows this list).

    See the script at https://github.com/openvinotoolkit/openvino/tree/master/model-optimizer/mo.py
    which also performs some optimization at the model level.

    - optimization at the inference stage
    It optimizes for x86 CPUs with SSE, AVX, etc.

    It also optimizes for Intel GPUs via OpenCL.
    (only Intel GPUs are supported because the Intel OpenCL extension is used for optimization)
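
    As referenced under the model-formats item, a conversion sketch (the model path, output directory and exact mo.py flags are assumptions; see the Model Optimizer documentation for the full option list):

    $ python3 mo.py --input_model my_model.pb --output_dir ./openvino_ir

    The result is the .xml/.bin IR pair that the OpenVINO backend loads.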

    Signed-off-by: Guo, Yejun <yejun.guo@intel.com>
    Signed-off-by: Pedro Arthur <bygrandao@gmail.com>

    • [DH] configure
    • [DH] libavfilter/dnn/Makefile
    • [DH] libavfilter/dnn/dnn_backend_openvino.c
    • [DH] libavfilter/dnn/dnn_backend_openvino.h
    • [DH] libavfilter/dnn/dnn_interface.c
    • [DH] libavfilter/dnn_interface.h
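
    Not part of the commit message, but a hedged usage sketch of how the new backend might be selected through the dnn_processing filter (assuming an FFmpeg build where the filter exposes the openvino constant; the model file and the tensor names x/y are placeholders):

    $ ffmpeg -i input.png -vf dnn_processing=dnn_backend=openvino:model=model.xml:input=x:output=y output.png
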
  • Error: No such property: targetPlatform for class: com.android.build.gradle.managed.NdkConfig

    30 July 2016, by Alder

    I am trying to build FFmpeg into my JNI code with Gradle in Android Studio. I have built FFmpeg as a .so file; in order to support different platforms, I build it for each ABI (arm64-v8a, armeabi-v7a, mips, etc.). Then I need to determine the ABI of the current build in the build.gradle file.

    Referring to the Experimental Plugin User Guide, my build.gradle looks like this:

    apply plugin: 'com.android.model.native'    
    model{
       repositories {
           prebuilt(PrebuiltLibraries){
               ffmpeg{
                   headers.srcDir "src/main/jni/build/${targetPlatform.getName()}/include"
                   binaries.withType(SharedLibraryBinary) {
                       sharedLibraryFile = file("src/main/jni/build/${targetPlatform.getName()}/libvflibrary.so")
                   }
               }
           }
       }
       android {
           compileSdkVersion = 24
           buildToolsVersion = "23.0.3"

           defaultConfig {
               minSdkVersion.apiLevel = 15
               targetSdkVersion.apiLevel = 24
               versionCode = 1
               versionName = "1.0"
           }

           ndk{
               //platformVersion = 21
               moduleName = "library-jni"
               stl = 'gnustl_static'
               toolchain = "clang"
               abiFilters.addAll(['armeabi', 'armeabi-v7a', 'arm64-v8a', 'mips'])
               cppFlags.addAll(['-std=c++11', '-D__STDC_CONSTANT_MACROS'])
               ldLibs.addAll(['log', 'android', 'z', 'EGL', 'GLESv2'])
           }

           sources {
               main {
                   jni {
                       source{
                           srcDirs 'src/main/jni'
                       }
                       dependencies {
                           library 'ffmpeg' linkage 'shared'
                       }
                   }
               }
           }
       }
    }

    I am getting an error:

    Error: No such property: targetPlatform for class:
    com.android.build.gradle.managed.NdkConfig

    Does anyone have an idea of how I can solve this, please?

  • CMTimeGetSeconds doesn't get the right video duration

    26 February 2015, by JLCastillo

    Some users don't get the right duration for the videos they capture with their own device. The funny thing is that others do see it correctly, using the same device models and OS versions. Anyway, we observed it on an iPhone 5c running 7.1.2 and an iPhone 5s running 8.1.3.

    This code works for most users, but not all:

    // Resolve the asset URL from the library representation, then ask
    // AVFoundation for a precise duration before converting it to seconds.
    ALAssetRepresentation *representation = [mediaObject.asset defaultRepresentation];
    NSURL *url = [representation url];
    NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES };
    AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:options];
    videoDurationTime = CMTimeGetSeconds(avAsset.duration);

    I asked them to send the input videos, and this is the output from "ffmpeg -i":

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'bug_duration1.MOV':
     Metadata:
       major_brand     : qt
       minor_version   : 0
       compatible_brands: qt
       creation_time   : 2015-02-23 08:30:01
       encoder         : 8.1.3
       encoder-eng     : 8.1.3
       date            : 2015-02-23T16:30:01+0800
       date-eng        : 2015-02-23T16:30:01+0800
       model           : iPhone 5s
       model-eng       : iPhone 5s
       make            : Apple
       make-eng        : Apple
     Duration: 00:00:03.67, start: 0.000000, bitrate: 16793 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080, 16719 kb/s, 29.99 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)
       Metadata:
         creation_time   : 2015-02-23 08:30:01
         handler_name    : Core Media Data Handler
         encoder         : H.264
       Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 61 kb/s (default)
       Metadata:
         creation_time   : 2015-02-23 08:30:01
         handler_name    : Core Media Data Handler

    The video is reported with a duration of several minutes. Did anybody face this problem before?

    Thanks in advance.