Advanced search

Media (0)

Keyword: - Tags -/gis

No media matching your criteria is available on this site.

Other articles (106)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

  • Libraries and binaries specific to video and audio processing

    31 January 2010

    The following software and libraries are used by SPIPmotion in one way or another.
    Required binaries FFMpeg: the main encoder; it transcodes almost every type of video and audio file into formats readable on the web. See this tutorial for its installation; Oggz-tools: inspection tools for Ogg files; Mediainfo: retrieves information from most video and audio formats;
    Additional, optional binaries flvtool2: (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used.
    The HTML5 player was created specifically for MediaSPIP: it is fully graphically customisable to match a chosen theme.
    These technologies make it possible to deliver video and audio both on conventional computers (...)

On other sites (7999)

  • avutil: remove deprecated FF_API_HDR_VIVID_THREE_SPLINE

    19 February, by James Almer
    avutil: remove deprecated FF_API_HDR_VIVID_THREE_SPLINE
    

    Deprecated since 2023-03-17.

    Signed-off-by: James Almer <jamrial@gmail.com>

    • [DH] libavcodec/dynamic_hdr_vivid.c
    • [DH] libavutil/hdr_dynamic_vivid_metadata.h
    • [DH] libavutil/version.h
  • ffmpeg conversion failed error while splitting a 1.57GB .tif file

    12 January 2023, by Minai

    I am running ffmpeg to split a large .tif image of dimensions 25966 * 64114 into a grid of 256 * 256 pixel tiles, but I am getting an error.

    I ran the ffmpeg command on a 1883 * 1361 pixel .jpg image named coral2.

    Here is the image

    Using the following command:

    ffmpeg -i coral2.jpg -qscale:v 1 -vf "crop=256:256:0:256" coral2-0-256.jpg

    Here is the crop

    When I run the same command on a 1.57GB .tif image:

    ffmpeg -i image.tif -qscale:v 1 -vf "crop=256:256:0:256" image%01d.tif

    I get the following error:

    C:\Users\gwmin\Downloads\coral\southbayapal>ffmpeg -i image.tif -qscale:v 1 -vf "crop=256:256:0:256" image%01d.tif
    ffmpeg version 2023-01-12-git-fc263f073e-full_build-www.gyan.dev Copyright (c) 2000-2023 the FFmpeg developers
    built with gcc 12.2.0 (Rev7, Built by MSYS2 project)
    configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libvpl --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
    libavutil 57. 43.100 / 57. 43.100
    libavcodec 59. 56.100 / 59. 56.100
    libavformat 59. 35.100 / 59. 35.100
    libavdevice 59. 8.101 / 59. 8.101
    libavfilter 8. 53.100 / 8. 53.100
    libswscale 6. 8.112 / 6. 8.112
    libswresample 4. 9.100 / 4. 9.100
    libpostproc 56. 7.100 / 56. 7.100
    [tiff @ 000002c4799b6b00] [IMGUTILS @ 000000e4045fed40] Picture size 25966x64115 is invalid
    [tiff_pipe @ 000002c4799a3700] Could not find codec parameters for stream 0 (Video: tiff, rgba): unspecified size
    Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
    Input #0, tiff_pipe, from 'SouthBayApal.tif':
      Duration: N/A, bitrate: N/A
      Stream #0:0: Video: tiff, rgba, 25 fps, 25 tbr, 25 tbn
    Stream mapping:
      Stream #0:0 -> #0:0 (tiff (native) -> tiff (native))
    Press [q] to stop, [?] for help
    [tiff @ 000002c4799a5d40] [IMGUTILS @ 000000e404dff650] Picture size 25966x64115 is invalid
    Error while decoding stream #0:0: Invalid argument
    [graph 0 input from stream 0:0 @ 000002c479e0cd00] Unable to parse option value "0x0" as image size
    [graph 0 input from stream 0:0 @ 000002c479e0cd00] Error setting option video_size to value 0x0.
    [graph 0 input from stream 0:0 @ 000002c479e0cd00] Error applying generic filter options.
    Error reinitializing filters!
    Error while filtering: Invalid argument
    Finishing stream 0:0 without any data written to it.
    [graph 0 input from stream 0:0 @ 000002c479e0c800] Unable to parse option value "0x0" as image size
    [graph 0 input from stream 0:0 @ 000002c479e0c800] Error setting option video_size to value 0x0.
    [graph 0 input from stream 0:0 @ 000002c479e0c800] Error applying generic filter options.
    Error configuring filter graph
    Conversion failed!

    Please help. My command only works for cropping small images, but I cannot crop large .tif images. How can I split them using ffmpeg?

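    As a rough illustration of the grid-splitting approach described in the question (a sketch only: it assumes ffmpeg is on PATH and that the input is small enough for ffmpeg to decode, which the 1.57GB .tif in the log above is not; the file names and the 256-pixel tile size are taken from the question):

    # Sketch: tile an image into 256x256 crops by looping ffmpeg's crop filter over offsets.
    # Partial tiles at the right and bottom edges are skipped for simplicity.
    import subprocess

    WIDTH, HEIGHT = 1883, 1361   # dimensions of coral2.jpg from the question
    TILE = 256

    for y in range(0, HEIGHT - TILE + 1, TILE):
        for x in range(0, WIDTH - TILE + 1, TILE):
            out = f"coral2-{x}-{y}.jpg"
            subprocess.run(
                ["ffmpeg", "-y", "-i", "coral2.jpg", "-qscale:v", "1",
                 "-vf", f"crop={TILE}:{TILE}:{x}:{y}", out],
                check=True,
            )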

  • Introducing the Data Warehouse Connector feature

    30 January, by Matomo Core Team

    Matomo is built on a simple truth: your data belongs to you, and you should have complete control over it. That’s why we’re excited to launch our new Data Warehouse Connector feature for Matomo Cloud, giving you even more ways to work with your analytics data.

    Until now, getting raw data from Matomo Cloud required APIs and custom scripts, or waiting for engineering help.  

    Our new Data Warehouse Connector feature removes those barriers. You can now access your raw, unaggregated data and schedule regular exports straight to your data warehouse. 

    The feature works with all major data warehouses including (but not limited to):

    • Google BigQuery 
    • Amazon Redshift 
    • Snowflake 
    • Azure Synapse Analytics 
    • Apache Hive 
    • Teradata 

    You can schedule exports, combine your Matomo data with other data sources in your data warehouse, and easily query data with SQL-like queries. 

    Direct raw data access for greater data portability 

    Waiting for engineering support can delay your work. Managing API connections and writing scripts can be time-consuming. This keeps you from focusing on what you do best—analysing data. 

    With the Data Warehouse Connector feature, you get direct access to your raw Matomo data without the technical setup. So, you can spend more time analysing data and finding insights that matter. 

    Bringing your data together 

    Answering business questions often requires data from multiple sources. A single customer interaction might span your CRM, web analytics, sales systems, and more. Piecing this data together manually is time-consuming—what starts as a seemingly simple question from stakeholders can turn into hours of work collecting and comparing data across different tools. 

    This feature lets you combine your Matomo data with data from other business systems in your data warehouse. Instead of switching between tools or manually comparing spreadsheets, you can analyse all your data in one place to better understand how customers interact with your business. 

    Easy, custom analysis with SQL-like queries 

    Standard, pre-built reports often don’t address the specific, detailed questions that analysts need to answer.  

    When you use the Data Warehouse Connector feature, you can use SQL-like queries in your data warehouse to do detailed, customised analysis. This flexibility allows you to explore your data in depth and uncover specific insights that aren’t possible with pre-built reports. 

    Here is an example of how you might use a SQL-like query to compare the behaviour of paying vs. non-paying users:

    SELECT
      custom_dimension_value AS user_type, -- Assuming 'user_type' is stored in a custom dimension
      COUNT(*) AS total_visits,
      AVG(visit_total_time) AS avg_duration,
      SUM(conversion.revenue) AS total_spent
    FROM
      `your_project.your_dataset.matomo_log_visit` AS visit
    LEFT JOIN
      `your_project.your_dataset.matomo_log_conversion` AS conversion
    ON
      visit.idvisit = conversion.idvisit
    GROUP BY
      custom_dimension_value;

    This query helps you compare metrics such as the number of visits, average session duration, and total amount spent between paying and non-paying users. It provides a full view of behavioural differences between these groups. 
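
    As a rough sketch of how such a query might be run programmatically (assuming a BigQuery export and the google-cloud-bigquery Python client; the project and dataset names are placeholders, as in the query above):

    # Minimal sketch: run the comparison query against a BigQuery export of Matomo raw data.
    # Assumes google-cloud-bigquery is installed and credentials are configured;
    # `your_project.your_dataset` is a placeholder, not a name provided by Matomo.
    from google.cloud import bigquery

    client = bigquery.Client()

    QUERY = """
    SELECT
      custom_dimension_value AS user_type,
      COUNT(*) AS total_visits,
      AVG(visit_total_time) AS avg_duration,
      SUM(conversion.revenue) AS total_spent
    FROM `your_project.your_dataset.matomo_log_visit` AS visit
    LEFT JOIN `your_project.your_dataset.matomo_log_conversion` AS conversion
      ON visit.idvisit = conversion.idvisit
    GROUP BY custom_dimension_value
    """

    # Execute the query and print one row per user type.
    for row in client.query(QUERY).result():
        print(row["user_type"], row["total_visits"], row["avg_duration"], row["total_spent"])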

    Advanced data manipulation and visualisation 

    When you need to create detailed reports or dive deep into data analysis, working within the constraints of a fixed user interface (UI) can limit your ability to draw insights. 

    Exporting your Matomo data to a data warehouse like BigQuery provides greater flexibility for in-depth manipulation and advanced visualisations, enabling you to uncover deeper insights and tailor your reports more effectively. 
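
    For instance (a sketch only; the DataFrame below is placeholder data standing in for the output of the example query, and pandas/matplotlib are assumed to be installed), further manipulation and a quick chart might look like this:

    # Sketch: manipulate exported Matomo results with pandas and plot a quick comparison.
    import pandas as pd
    import matplotlib.pyplot as plt

    # Placeholder rows mirroring the columns of the example query above.
    df = pd.DataFrame({
        "user_type": ["paying", "non_paying"],
        "total_visits": [1200, 5400],
        "avg_duration": [310.5, 142.0],
        "total_spent": [25800.0, 0.0],
    })

    # Derive a per-visit spend column and chart average session duration by user type.
    df["spend_per_visit"] = df["total_spent"] / df["total_visits"]
    df.plot.bar(x="user_type", y="avg_duration", legend=False)
    plt.ylabel("avg session duration (s)")
    plt.tight_layout()
    plt.show()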

    Getting started 

    To set up data warehouse exports in your Matomo:

    1. Go to System Admin (cog icon in the top right corner) 
    2. Select ‘Export’ from the left-hand menu 
    3. Choose ‘Data Warehouse Connector’ 

    You’ll find detailed instructions in our data warehouse exports guide.

    Please note, enabling this feature will cost an additional 10% of your current subscription. You can view the exact cost by following the steps above. 

    New to Matomo? Start your 21-day free trial now (no credit card required), or request a demo.