
Other articles (67)

  • List of compatible distributions

    26 April 2011

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.
    Distribution name | Version name | Version number
    Debian | Squeeze | 6.x.x
    Debian | Wheezy | 7.x.x
    Debian | Jessie | 8.x.x
    Ubuntu | The Precise Pangolin | 12.04 LTS
    Ubuntu | The Trusty Tahr | 14.04
    If you want to help us improve this list, you can provide us with access to a machine whose distribution is not mentioned above, or send the necessary fixes to add (...)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, no software is ever perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information:
    - the browser you are using, including the exact version
    - as precise an explanation of the problem as possible
    - if possible, the steps taken that lead to the problem
    - a link to the site / page in question
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • Final creation of the channel

    12 March 2010

    Once your request has been validated, you can proceed with the actual creation of the channel. Each channel is a fully-fledged site placed under your responsibility. The platform administrators have no access to it.
    Upon validation, you receive an email inviting you to create your channel.
    To do so, simply go to its address, in our example "http://votre_sous_domaine.mediaspip.net".
    At that point you will be asked for a password; you simply need to (...)

On other sites (7390)

  • How Media Analytics for Piwik gives you the insights you need to measure how effective your video and audio marketing is – Part 1

    31 January 2017, by InnoCraft (Community)

    Do you have video or audio content on your website or in your app? If so, keep reading to learn everything about our Media Analytics premium feature.

    When you produce video or audio content, you are spending money, time, or often both, in the hope of increasing conversions or sales. This means you have to know how your media is being used: when it is used, for how long, and by whom. You simply cannot afford not to know how this content affects your overall business goals, as you are likely losing money and time by not making the most of it. Would you be able to answer these questions? Do you know whether you can justify the cost and time of producing your media, which videos work better than others, and how they support your marketing strategy? Luckily, getting all these insights is now so easy that it would almost be a crime not to measure them.

    Getting Media Analytics and Installation

    Media Analytics can be purchased from the Piwik Marketplace, where you will find all sorts of free plugins as well as several premium features such as A/B Testing or Funnel. After the purchase you will receive a license key that you can enter in your Piwik installation to install and update the plugin with just one click.

    In most cases the feature will automatically start tracking your media content, without you even needing to change the tracking code on your website. Supported players currently include YouTube, Vimeo, HTML5 video, JW Player, VideoJS and many more. You can also extend it easily by adding a custom media player, or simply let us know which player you use and we will add support for it for you.

    By activating this feature, you get more than 15 new media reports, numerous exportable widgets, new segments, APIs, and more. We will cover some of these features in this blog post and in part 2. For a full list of features, check out the Media Analytics page on the Piwik Marketplace.
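
    Since the plugin also adds new APIs, the reports can be fetched over HTTP through the Piwik Reporting API like any other report. The line below is only an illustration: the method name MediaAnalytics.get and the exact parameters are assumptions rather than names taken from the plugin's documentation, so check the Media Analytics API reference for the real ones.

    % curl "https://your-piwik.example.org/index.php?module=API&method=MediaAnalytics.get&idSite=1&period=day&date=yesterday&format=JSON&token_auth=YOUR_TOKEN"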

    Media Overview

    As the name says, it gives you an overview of your media usage and how it performs over time. You can choose any media metric in the big evolution graph, and the sparklines below give you an overview of all important metrics at a glance.

    It lets you see, for example, how often media was shown to your users, how often users started playing your media, for how long they watched it, how often they finished it, and more. If you see spikes there, you should definitely take a deeper look at the other reports. When you hover over a metric, a tooltip explains how the data is collected and what it means.

    Real-Time Media

    On the Real-Time page you can see how your content is being used by your visitors right now, for example within the last 30 minutes, last 60 minutes and last 24 hours.

    It shows you how many plays you had in the last few minutes, for how long visitors played them, and the currently most popular media titles. This is great for discovering which media content performs best right now, and lets you make decisions based on user behaviour as it happens.

    Below you can see our Audience Real-Time Map that shows you from where in the world your media is being played. A bigger circle indicates that a media play happened more recently and of course you can zoom in down to countries and regions.

    All the reports update every few seconds, so you can glance at them at any time and see how your content is doing and how certain marketing campaigns affect it. All these real-time reports can also be added as widgets to any of your Piwik Dashboards, and they can be exported, for example as an iframe.

    Video, Audio and Media Player reports

    Those reports come with so many features that they need a separate blog post; we cover them in part 2.

    Events

    Media Analytics automatically tracks events, so you can see, for example, how often users pressed play or pause, how often they resumed a video, and how often they finished one. This helps you better understand how your media is being used.

    For example, in the past we noticed a couple of videos with lots of pause and resume events. We had a look at the Audience Log (which we will cover next) to better understand why visitors paused the videos so often. We realized this happened mainly with videos served from a specific server: because those videos loaded so slowly, users often pressed pause to let the media buffer, played for a few seconds, then paused again to wait for the video to load. Moving those videos to another, faster server immediately brought the number of pauses down, and on average visitors watched the videos for much longer.

    Audience Log

    At InnoCraft, we understand that aggregated metrics are not all that matters: you often need the ability to dig into your data and "debug" certain behaviours to understand the cause of unusually high or low metrics. For example, you may find out that many of your users often pause a video and then wonder how each individual user behaved, so you can better understand why.

    The audience log shows you a detailed log of every visitor. You can chronologically see every action a visitor has performed during their whole visit. If you click on the visitor profile link, you can even see all visits of a specific visitor, and all actions they have ever performed on your website.

    This lets you ultimately debug and understand your visitors and see exactly which actions they performed before playing your media, which media they played, how they played your media, and how they behaved after playing your media.

    The visitor log of course also shows important information about each visitor, such as where they came from (referrer), their location, software, device, and much more.

    Audience Map

    The Audience Map is similar to the Real-Time Map but it shows you the locations of your visitors based on a selected date range and not in real time. The darker the blue, the more visitors from that country, region or city have interacted with your media.

    Coming in part 2

    In the next part we will cover which video, audio and media player reports Media Analytics provides, how segmenting gives you insights into different personas, and how nicely it integrates into Piwik.

    How to get Media Analytics and related features

    You can get Media Analytics on the Piwik Marketplace. If you want to learn more about this feature, you might also be interested in the Media Analytics User Guide and the Media Analytics FAQ.

  • ffmpeg filter_complex concat never completes

    10 July 2020, by seawolf

    I am trying to concatenate mp4 files into a single file. I am attempting to use the concat filter directly, for reasons independent of this question (so -i list.txt is not a valid solution). All the video files in question are between 4 and 20 minutes in length.

    What I am executing:

    % ffmpeg -i f01.mp4 -i f02.mp4 -filter_complex "[0:v][0:a][1:v][1:a]concat=n=2:v=1:a=1[v][a]" -map "[v]" -map "[a]" output.mp4

    This is my understanding of the form of this command from references such as the FFmpeg Filters Documentation on concat and "Concatenate Videos Together Using FFMPEG!".

    What happens:

    ffmpeg prints a lot of output as it checks the metadata for each stream and then begins processing. After a short duration (several seconds to a minute or so, seemingly proportional to the duration of the first video) I start seeing messages like this:

    More than 1000 frames duplicated
    More than 10000 frames duplicated     512kB time=00:00:00.12 bitrate=32771.0kbits/s dup=33365 drop=0 speed=0.00449x
    More than 100000 frames duplicated   1280kB time=00:00:00.17 bitrate=61442.1kbits/s dup=66730 drop=0 speed=0.00272x

    ... and then the process never completes. If I leave my computer running for 24 hours, ffmpeg is still using the maximum available CPU (200-300%). The output file is 48 bytes in length.

    Note: the inputs are only a few minutes long each, so re-encoding them individually would take only a few minutes.

    Note: if I change the command to use f01.mp4 for both source 0 and source 1, the command completes as expected in under 5 minutes:

    % # example that works:
    % # note, however, that this just repeats the same source twice
    % ffmpeg -i f01.mp4 -i f01.mp4 -filter_complex "[0:v][0:a][1:v][1:a]concat=n=2:v=1:a=1[v][a]" -map "[v]" -map "[a]" output.mp4

    Full output:

    Here is the complete output of a run in case it's helpful:

    % ffmpeg -i f01.mp4 -i f02.mp4 -filter_complex "[0:v][0:a][1:v][1:a]concat=n=2:v=1:a=1[v][a]" -map "[v]" -map "[a]" output.mp4
ffmpeg version 4.3 Copyright (c) 2000-2020 the FFmpeg developers
  built with Apple clang version 11.0.3 (clang-1103.0.32.62)
  configuration: --prefix=/usr/local/Cellar/ffmpeg/4.3_2 --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --disable-libjack --disable-indev=jack
  libavutil      56. 51.100 / 56. 51.100
  libavcodec     58. 91.100 / 58. 91.100
  libavformat    58. 45.100 / 58. 45.100
  libavdevice    58. 10.100 / 58. 10.100
  libavfilter     7. 85.100 /  7. 85.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  7.100 /  5.  7.100
  libswresample   3.  7.100 /  3.  7.100
  libpostproc    55.  7.100 / 55.  7.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'f01.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    creation_time   : 2020-06-04T21:34:26.000000Z
    encoder         : HandBrake 1.3.2 2020050300
  Duration: 00:04:14.66, start: 0.000000, bitrate: 525 kb/s
    Stream #0:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, smpte170m/smpte170m/bt709), 708x478 [SAR 8:9 DAR 944:717], 366 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 180k tbc (default)
    Metadata:
      creation_time   : 2020-06-04T21:34:26.000000Z
      handler_name    : VideoHandler
    Stream #0:1(jpn): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 149 kb/s (default)
    Metadata:
      creation_time   : 2020-06-04T21:34:26.000000Z
      handler_name    : Stereo
Input #1, mov,mp4,m4a,3gp,3g2,mj2, from 'f02.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    creation_time   : 2020-06-04T21:38:15.000000Z
    encoder         : HandBrake 1.3.2 2020050300
  Duration: 00:06:30.95, start: 0.000000, bitrate: 1328 kb/s
    Stream #1:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, smpte170m/smpte170m/bt709), 708x478 [SAR 8:9 DAR 944:717], 1179 kb/s, 29.97 fps, 30 tbr, 90k tbn, 180k tbc (default)
    Metadata:
      creation_time   : 2020-06-04T21:38:15.000000Z
      handler_name    : VideoHandler
    Stream #1:1(jpn): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 140 kb/s (default)
    Metadata:
      creation_time   : 2020-06-04T21:38:15.000000Z
      handler_name    : Stereo
File 'output.mp4' already exists. Overwrite? [y/N] y
Stream mapping:
  Stream #0:0 (h264) -> concat:in0:v0
  Stream #0:1 (aac) -> concat:in0:a0
  Stream #1:0 (h264) -> concat:in1:v0
  Stream #1:1 (aac) -> concat:in1:a0
  concat:out:v0 -> Stream #0:0 (libx264)
  concat:out:a0 -> Stream #0:1 (aac)
Press [q] to stop, [?] for help
[mp4 @ 0x7ff130014000] Frame rate very high for a muxer not efficiently supporting it.
Please consider specifying a lower framerate, a different muxer or -vsync 2
[libx264 @ 0x7ff130021200] using SAR=8/9
[libx264 @ 0x7ff130021200] MB rate (1350000000) > level limit (16711680)
[libx264 @ 0x7ff130021200] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x7ff130021200] profile High, level 6.2, 4:2:0, 8-bit
[libx264 @ 0x7ff130021200] 264 - core 160 r3011 cde9a93 - H.264/MPEG-4 AVC codec - Copyleft 2003-2020 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'output.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf58.45.100
    Stream #0:0: Video: h264 (libx264) (avc1 / 0x31637661), yuv420p(progressive), 708x478 [SAR 8:9 DAR 944:717], q=-1--1, 1000k tbn, 1000k tbc (default)
    Metadata:
      encoder         : Lavc58.91.100 libx264
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
    Stream #0:1: Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 128 kb/s (default)
    Metadata:
      encoder         : Lavc58.91.100 aac
More than 1000 frames duplicated
More than 10000 frames duplicated     512kB time=00:00:00.12 bitrate=32771.0kbits/s dup=33365 drop=0 speed=0.00449x
More than 100000 frames duplicated   1280kB time=00:00:00.17 bitrate=61442.1kbits/s dup=66730 drop=0 speed=0.00272x
frame=667333 fps=1079 q=33.0 size=   14848kB time=00:00:00.76 bitrate=158379.2kbits/s dup=667312 drop=0 speed=0.00124x
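
    The muxer warning in that log ("Frame rate very high for a muxer not efficiently supporting it. Please consider specifying a lower framerate, a different muxer or -vsync 2") suggests the 29.97 fps vs 30 tbr mismatch between the two inputs may be what triggers the runaway frame duplication. The variant below is only a sketch of that idea, not a verified fix: it normalises each input's frame rate and timebase before the concat and lets -vsync 2 drop duplicate frames.

    % ffmpeg -i f01.mp4 -i f02.mp4 -filter_complex "[0:v]fps=30000/1001,settb=AVTB[v0];[1:v]fps=30000/1001,settb=AVTB[v1];[v0][0:a][v1][1:a]concat=n=2:v=1:a=1[v][a]" -map "[v]" -map "[a]" -vsync 2 output.mp4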


    


  • H.264 to DNxHR 444 issue. Colors are not transcoded correctly (HDR project)

    21 January 2020, by Raulo1985

    I'm having an issue transcoding an H.264 UHD HDR file to a DNxHR file in an MXF container with FFmpeg. The issue is that the two files don't look the same at all: the colors look washed out on the DNxHR video, even though I tried to make the transcoding as lossless as possible (DNxHR 444 flavor). The original file is a movie I ripped a while ago: H.264, UHD, HDR, in an MKV container.

    My goal: to create an almost lossless DNxHR file to use as the source file in Adobe Premiere Pro, and another, lower-quality DNxHR file as a proxy for editing. I wanted to do it that way, rather than use the original H.264 as the source file, because it's out of sync with the proxy file (I mean, when I toggle the proxy icon on and off, you can tell there's a short delay between them, which defeats the purpose of editing with proxies). My guess is that this may be because H.264 is compressed and DNxHR isn't, and since I edit with a lot of fast cuts, I need the source file and the proxy file to be as closely synced as possible. When the source file and the proxy file are both DNxHR, no matter the flavor, they are perfectly synced. I don't want to go with ProRes for the proxies, because the sync problem is a lot worse (several seconds of delay between files), maybe because it's a VBR codec while my original file and DNxHR are CBR (for the record, I always prefer CBR).

    Well, the thing is that when I import the original H.264 file into Premiere Pro, use a DNxHR proxy, edit a little, and export directly from the original file (H.264 10-bit, with all the settings required for HDR output enabled), the colors look as they should. When I do the same with the high-quality DNxHR as the source file, with the exact same export settings, the colors look washed out. The same happens with any DNxHR flavor.

    Then I opened both files (the original H.264 and the high-quality DNxHR transcoded from it) with VLC, and I can also tell that the MXF file looks washed out while the H.264 file doesn't. So it's not an export issue on Premiere's side; it's something to do with the original transcoding.

    I understand that DNxHR 444 is as lossless as you can get with that codec, preserving all the required HDR data, and I believe that the MXF container has some advantages over MOV, which is the other container that supports DNxHD/DNxHR. So I really don't know what's happening.

    The command I used was:

    ffmpeg -channel_layout 63 -i input.mkv -map 0:0 -c:v dnxhd -vf "scale=in_range=limited:out_range=full" -color_range 2 -profile:v dnxhr_444 -pix_fmt yuv444p10le -acodec pcm_s24le -ar 48000 -ac 6 -channel_layout 63 -map 0:2 -hide_banner output.mxf

    Like I said, after the transcoding the two video files look very different from each other, color-wise. And after using them in Premiere and exporting with the exact same settings, the output files show the same difference.

    Mediainfo shows the expected data for both files:
    - 10 bits, main 10, level 5, 4:2:0, CBR, BT.2020 for the original h.264 file.
    - 10 bits, 4:4:4, CBR for the DNxHR 444 file.

    One thing I noticed in Mediainfo is that both have YUV as the color space, but the DNxHR 444 video has an extra field that says ColorSpace_Original : RGB. Honestly, I don't know what that means, since the original is YUV. Color range is fine, from 0 to 1023 (and chroma range 1023). The other thing is that it says "limited" in the color range field of the H.264 file, but I've read that that could be a bug or a misinterpretation of the file by Mediainfo.
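
    As a cross-check on what Mediainfo reports, the colour metadata can also be read directly from each file with ffprobe (a sketch; it only queries standard stream fields and should work with any recent FFmpeg build):

    % ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt,color_range,color_space,color_transfer,color_primaries -of default=noprint_wrappers=1 input.mkv
    % ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt,color_range,color_space,color_transfer,color_primaries -of default=noprint_wrappers=1 output.mxf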

    Well, that's it, any help would be appreciated. I'd really like to edit with DNxHR 444 as the source file and DNxHR LB for the proxies, so I can edit at a fast pace and without sync issues, but the color is just not acceptable. I do understand that I'm adding an extra transcoding step (from the original to DNxHR), but the sync issue between the original and the DNxHR proxies, even if the delay is just a fraction of a second, makes my workflow a lot harder, since I would have to export many times to check whether the cuts land exactly where I want them. Not ideal by any means. And ProRes is apparently not an option, as the sync issue there is a lot worse. For me, it all comes down to being able to get a DNxHR 444 file that looks, well, as close to lossless as it can be, and that goal obviously involves the colors.

    Thanks in advance.

    PS: file size is not an issue for me, so having an entire UHD HDR movie transcoded to DNxHR 444 is not a problem.

    PS2: I tried with a different chroma subsampling (like DNxHR HQX 10-bit, which is 4:2:2), with the same result. I haven't tried 8-bit yet, but I don't see the point since this is an HDR project.
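
    One variation that could be tried (only a sketch; whether the DNxHR/MXF path actually carries these tags through to Premiere is an assumption) is to drop the limited-to-full range expansion and instead tag the BT.2020 / PQ colour information explicitly on the output:

    % ffmpeg -i input.mkv -map 0:0 -c:v dnxhd -profile:v dnxhr_444 -pix_fmt yuv444p10le -color_range tv -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc -acodec pcm_s24le -ar 48000 -ac 6 -map 0:2 output.mxf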

    EXTRA INFO:

    1) FFprobe output of the MXF DNxHR file (this one is 4:2:2; the only difference from the command stated above is -pix_fmt yuv422p10le instead of -pix_fmt yuv444p10le):

     libavutil      56. 31.100 / 56. 31.100
     libavcodec     58. 54.100 / 58. 54.100
     libavformat    58. 29.100 / 58. 29.100
     libavdevice    58.  8.100 / 58.  8.100
     libavfilter     7. 57.100 /  7. 57.100
     libswscale      5.  5.100 /  5.  5.100
     libswresample   3.  5.100 /  3.  5.100
     libpostproc    55.  5.100 / 55.  5.100
    [mxf @ 000001f4d17fbac0] Stream #0: not enough frames to estimate rate; consider increasing probesize
    Input #0, mxf, from 'Interstellar_Master_DNxHR_444_UHD_422_PCM24_5.1.mxf':
     Metadata:
       operational_pattern_ul: 060e2b34.04010101.0d010201.01010900
       uid             : adab4424-2f25-4dc7-92ff-29bd000c0000
       generation_uid  : adab4424-2f25-4dc7-92ff-29bd000c0001
       company_name    : FFmpeg
       product_name    : OP1a Muxer
       product_version : 58.29.100
       product_uid     : adab4424-2f25-4dc7-92ff-29bd000c0002
       material_package_umid: 0x060A2B340101010501010D001393EE79529471348D93EE7900529471348D9300
       timecode        : 00:00:00:00
     Duration: 02:49:03.97, start: 0.000000, bitrate: 1404833 kb/s
       Stream #0:0: Video: dnxhd (DNXHR 444), yuv444p10le(bt709/unknown/unknown, progressive), 3840x2160, SAR 1:1 DAR 16:9, 23.98 tbr, 23.98 tbn, 23.98 tbc
       Metadata:
         file_package_umid: 0x060A2B340101010501010D001393EE79529471348D93EE7900529471348D9301
       Stream #0:1: Audio: pcm_s24le, 48000 Hz, 6 channels, s32 (24 bit), 6912 kb/s
       Metadata:
         file_package_umid: 0x060A2B340101010501010D001393EE79529471348D93EE7900529471348D9301

    2) FFprobe output of the MP4 H.264 source file (this one is 4:2:0, 10 bits, HDR):

       Stream #0:0(eng): Video: hevc (Main 10) (hev1 / 0x31766568), yuv420p10le(tv, bt2020nc/bt2020/smpte2084), 3840x2160 [SAR 1:1 DAR 16:9], 15584 kb/s, 23.98 fps, 23.98 tbr, 16k tbn, 23.98 tbc (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: ac3 (ac-3 / 0x332D6361), 48000 Hz, 5.1(side), fltp, 640 kb/s (default)
       Metadata:
         handler_name    : SoundHandler
       Side data:
         audio service type: main
       Stream #0:2(eng): Data: bin_data (text / 0x74786574)
       Metadata:
         handler_name    : SubtitleHandler
    Unsupported codec with id 100359 for input stream 2