Other articles (95)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
    To use it, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Emballe médias: what is it for?

    4 February 2011

    This plugin is designed to manage sites for publishing documents of all types online.
    It creates "media" items, that is: a "media" item is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a "media" article;

On other sites (15016)

  • H.264 to DNxHR 444 issue. Colors are not transcoded correctly (HDR project). Note: issue not solved yet

    22 January 2020, by Raulo1985

    I’m having an issue transcoding an H.264 UHD HDR file to a DNxHR file in an MXF container with FFmpeg. The issue is that the two files don’t look the same at all: the colors look washed out on the DNxHR video, even though I tried to make the transcoding as lossless as possible (DNxHR 444 flavor). The original file is a movie I ripped a while ago: H.264, UHD, HDR, in an MKV container.

    My goal: to create an almost lossless DNxHR file to use as the source file in Adobe Premiere Pro, and another lower-quality DNxHR file as the proxy for editing. I wanted to do it that way rather than use the original H.264 as the source file because it’s out of sync with the proxy file (when I toggle the proxy icon on and off, you can tell there’s a short delay between them, which defeats the purpose for editing). My guess is that this may be because H.264 is compressed and DNxHR isn’t, and since I edit with a lot of fast cuts, I need the source file and the proxy file to be as closely synced as possible. When the source file and the proxy file are both DNxHR, no matter the flavor, they are perfectly synced. I don’t want to go with ProRes for the proxies, because the sync problem is a lot worse (several seconds of delay between files), maybe because it’s a VBR codec while my original file and DNxHR are CBR (for the record, I always prefer CBR).

    Well, the thing is that when I import the original H.264 file into Premiere Pro, use a DNxHR proxy, edit a little, and export directly from the original file (H.264, 10-bit, with all the settings required for HDR output enabled), the colors look as they should. When I do the same with the high-quality DNxHR as the source file, with the exact same export settings, the colors look washed out. The same happens with any DNxHR flavor.

    Then I opened both files (the original H.264 and the high-quality DNxHR transcoded from it) with VLC, and I can also tell that the MXF file looks washed out while the H.264 file doesn’t. So it’s not an export issue on Premiere’s side; it’s something to do with the original transcoding.

    I understand that DNxHR 444 is as lossless as you can get with that codec, preserving all the data required for HDR, and I believe that the MXF container has some advantages over MOV, which is the other container that supports DNxHD/DNxHR. So I don’t know what’s really happening.

    The command I used was:

    ffmpeg -channel_layout 63 -i input.mkv -map 0:0 -c:v dnxhd -vf "scale=in_range=limited:out_range=full" -color_range 2 -profile:v dnxhr_444 -pix_fmt yuv444p10le -acodec pcm_s24le -ar 48000 -ac 6 -channel_layout 63 -map 0:2 -hide_banner output.mxf
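
    One variant worth noting, purely as an untested sketch, skips the range conversion and instead tags the source’s BT.2020 / SMPTE 2084 (PQ) colour metadata explicitly on the output stream; the flags below are standard FFmpeg options, but whether they change anything for this particular file is an assumption:

    # Sketch only: keep limited range and tag the HDR colour metadata explicitly
    ffmpeg -i input.mkv -map 0:0 -map 0:2 \
      -c:v dnxhd -profile:v dnxhr_444 -pix_fmt yuv444p10le \
      -color_range tv -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc \
      -c:a pcm_s24le -ar 48000 -ac 6 \
      output.mxf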

    Like I said, after the transcoding the two video files look very different from each other, color-wise. And after using them in Premiere and exporting with the exact same settings, the output files show the same difference.

    MediaInfo shows the expected data for both files:
    - 10 bits, Main 10, Level 5, 4:2:0, CBR, BT.2020 for the original H.264 file.
    - 10 bits, 4:4:4, CBR for the DNxHR 444 file.

    One thing I noticed in MediaInfo is that both have YUV as the color space, but the DNxHR 444 video has an extra field that says ColorSpace_Original : RGB. Honestly, I don’t know what that means, since the original is YUV. Color range is fine, from 0 to 1023 (and chroma range 1023). The other thing is that it says "limited" in the color range field of the H.264 file, but I’ve read that that could be a bug or a misinterpretation of the file by MediaInfo.
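
    For what it’s worth, the colour tags that actually ended up in each file can also be read back with ffprobe; this only inspects the metadata (output.mxf stands in for the transcoded file), it is not a fix:

    ffprobe -v error -select_streams v:0 \
      -show_entries stream=pix_fmt,color_range,color_space,color_transfer,color_primaries \
      output.mxf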

    Well, that’s it; any help would be appreciated. I’d really like to edit with DNxHR 444 as the source file and DNxHR LB for the proxies, so I can edit at a fast pace and without sync issues, but the color is just not acceptable. I do understand that I’m adding an extra transcoding step (from the original to DNxHR), but the sync issue between the original and the DNxHR proxies, even if it’s only a fraction of a second, makes my workflow a lot harder, since I’d have to export many times to check whether the cuts land exactly where I want them. Not ideal by any means. And ProRes is apparently not an option; the sync issue is a lot worse there. For me, it all comes down to getting a DNxHR 444 file that looks, well, as close to lossless as possible, and that goal obviously involves the colors.

    Thanks in advance.

    PS: file size is not an issue for me, so having an entire UHD HDR movie transcoded to DNxHR 444 is not a problem.

    PS2: I tried a different chroma subsampling (DNxHR HQX 10-bit, which is 4:2:2), with the same result. Haven’t tried 8-bit yet, but I don’t see the point since this is an HDR project.

    UPDATE:

    I tried transcoding from the H.264 source file to a DNxHR video file in an MXF container using Adobe Media Encoder instead of FFmpeg, and the colors are again not transcoded correctly, but this time they seem to be oversaturated instead of washed out. Adobe Media Encoder doesn’t give much room for tweaking, but I made sure to select the 444 10-bit profile, the same resolution (UHD), the same frame rate, and to render with maximum quality and maximum bit depth. FFprobe output of the resulting file again shows BT709 as the color space (the same thing happens with the output file after transcoding using FFmpeg). So it seems to be something not related to FFmpeg, apparently. Any ideas? It’s like there’s no way I can transcode from H.264 to DNxHR and retain the colors correctly, even using its highest-quality flavor and what look to me like correct command settings. How can I post this so that maybe developers or people with lots of experience can give us a clue about what’s happening? Thanks.

    PS: More potentially useful info in the comments below.

    EXTRA INFO:

    1) FFprobe output of the MXF DNxHR file (this one is 4:2:2; the only difference from the command in the OP is -pix_fmt yuv422p10le instead of -pix_fmt yuv444p10le):

     libavutil      56. 31.100 / 56. 31.100
     libavcodec     58. 54.100 / 58. 54.100
     libavformat    58. 29.100 / 58. 29.100
     libavdevice    58.  8.100 / 58.  8.100
     libavfilter     7. 57.100 /  7. 57.100
     libswscale      5.  5.100 /  5.  5.100
     libswresample   3.  5.100 /  3.  5.100
     libpostproc    55.  5.100 / 55.  5.100
    [mxf @ 000001f4d17fbac0] Stream #0: not enough frames to estimate rate; consider increasing probesize
    Input #0, mxf, from 'Interstellar_Master_DNxHR_444_UHD_422_PCM24_5.1.mxf':
     Metadata:
       operational_pattern_ul: 060e2b34.04010101.0d010201.01010900
       uid             : adab4424-2f25-4dc7-92ff-29bd000c0000
       generation_uid  : adab4424-2f25-4dc7-92ff-29bd000c0001
       company_name    : FFmpeg
       product_name    : OP1a Muxer
       product_version : 58.29.100
       product_uid     : adab4424-2f25-4dc7-92ff-29bd000c0002
       material_package_umid: 0x060A2B340101010501010D001393EE79529471348D93EE7900529471348D9300
       timecode        : 00:00:00:00
     Duration: 02:49:03.97, start: 0.000000, bitrate: 1404833 kb/s
       Stream #0:0: Video: dnxhd (DNXHR 444), yuv444p10le(bt709/unknown/unknown, progressive), 3840x2160, SAR 1:1 DAR 16:9, 23.98 tbr, 23.98 tbn, 23.98 tbc
       Metadata:
         file_package_umid: 0x060A2B340101010501010D001393EE79529471348D93EE7900529471348D9301
       Stream #0:1: Audio: pcm_s24le, 48000 Hz, 6 channels, s32 (24 bit), 6912 kb/s
       Metadata:
         file_package_umid: 0x060A2B340101010501010D001393EE79529471348D93EE7900529471348D9301

    2) FFprobe output of the MP4 H.264 source file (this one is 4:2:0, 10-bit, HDR):

       Stream #0:0(eng): Video: hevc (Main 10) (hev1 / 0x31766568), yuv420p10le(tv, bt2020nc/bt2020/smpte2084), 3840x2160 [SAR 1:1 DAR 16:9], 15584 kb/s, 23.98 fps, 23.98 tbr, 16k tbn, 23.98 tbc (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: ac3 (ac-3 / 0x332D6361), 48000 Hz, 5.1(side), fltp, 640 kb/s (default)
       Metadata:
         handler_name    : SoundHandler
       Side data:
         audio service type: main
       Stream #0:2(eng): Data: bin_data (text / 0x74786574)
       Metadata:
         handler_name    : SubtitleHandler
    Unsupported codec with id 100359 for input stream 2
  • How to keep personally identifiable information safe

    23 January 2020, by Joselyn Khor

    The protection of personally identifiable information (PII) is important both for individuals, whose privacy may be compromised, and for businesses that may have their reputation ruined or be liable if PII is wrongly accessed, used, or shared.

    Curious about what PII is? Here’s your introduction to personally identifiable information.

    Due to hacking, data leaks or data theft, acquired PII can be combined with other pieces of information to form a more complete picture of you. On an individual level, this puts you at risk of identity theft, credit card theft or other harm caused by the fraudulent use of your personal information.

    On a business level, companies that breach data privacy laws – as in Cambridge Analytica’s harvesting of millions of Facebook profiles – face an erosion of trust. It can also impact your financial position, as heavy fines can be imposed for the illegal use and processing of personally identifiable information.

    So what can you do to ensure PII compliance?

    On an individual level:

    1. Don’t give your data away so easily. Although they’re long, it’s worthwhile to read through privacy policies to make sure you know what you’re getting yourself into.
    2. Don’t just click ‘agree’ when faced with consent screens, as consent screens are majorly flawed. Users almost always opt in without reading and without being properly informed about what they’re opting in to.
    3. Did you know you’re most likely being tracked from website to website? For example, Google can identify you across visits and websites. One of the things you can do is disable third-party cookies by default. Businesses can also use privacy-friendly analytics, which halt such tracking.
    4. Use strong passwords.
    5. Be wary of public Wi-Fi – hackers can easily access your PII or sensitive data. Use a VPN (virtual private network), which lets you create a secure connection to a server of your choosing and browse the internet safely.

    A PII compliance checklist for businesses/organisations:

    1. Identify where all PII exists and is stored – review it and make sure it is kept in a safe environment.
    2. Identify the laws that apply to you (GDPR, California privacy law, HIPAA) and follow your legal obligations.
    3. Create operational safeguards – policies and procedures for handling PII at an organisation level – and build awareness focused on the protection of PII.
    4. Encrypt databases and repositories where such info is kept.
    5. Create privacy-specific safeguards in the way your organisation collects, maintains, uses, and disseminates data, so you protect the confidentiality of the data.
    6. Minimise the use, collection, and retention of PII – only collect and keep PII if it’s necessary for you to perform your legal business function.
    7. Conduct privacy impact assessments (PIA) to find and prevent privacy risks (identify what is to be collected and why, how the information will be secured, etc.).
    8. De-identify within the scope of your data collection and analytics tools.
    9. Anonymise data.
    10. Keep your privacy policy updated.
    11. Pseudonymise identifiers (see the sketch after this list).
    12. A more comprehensive guide for businesses can be found here: https://iapp.org/media/pdf/knowledge_center/NIST_Protecting_PII.pdf
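
    As a minimal illustration of point 11, pseudonymisation can be as simple as replacing a direct identifier with a keyed hash before the data reaches your analytics or reporting tools. The sketch below is a hypothetical example using openssl; the secret key and the email address are placeholders, and this is not a complete compliance recipe:

    # Replace a direct identifier with a keyed (HMAC-SHA256) pseudonym.
    # The key must be stored separately from the pseudonymised data.
    SECRET_KEY="replace-with-a-strong-secret"
    echo -n "jane.doe@example.org" | openssl dgst -sha256 -hmac "$SECRET_KEY"
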
  • Analytics for the Internet of Things: collecting all your things’ data with Piwik to stay in control?

    25 November 2015, by Matthieu Aubry

    At Piwik our mission is to create the leading free and open source analytics platform, and to support global organisations and communities in keeping full control over their data.

    Our broad mission started 8 years ago. We focused at first on helping people liberate their website analytics data, then their mobile app analytics data. But it is clear that there is much more than web + mobile: data is everywhere, and a lot more of it is being generated by software, people and their activities, robots, sensors…

    I’d like to share an interesting article that highlights one of the growing trends in technology, the rise of the Internet of Things: 6 Ways Analytics And The Internet Of Things Will Transform Business.

    Here is an extract:

    The tech industry is no stranger to change, but the data derived from the IoT is taking disruption to a new level.

    At IBM’s Insight conference last month, Bob Picciano, senior vice president of IBM Analytics, talked about the rise of the “cognitive business”, or an enterprise that engages with analytics to improve its customer relations, business processes, and decision-making capabilities.

    There are dueling predictions over how ubiquitous the Internet of Things will be, but most indicate that the marketplace will host between 50 and 75 billion connected objects by 2020, signaling novel challenges for hardware manufacturing and development. Software engineers, likewise, may need to completely revamp programs to better exploit the influx of data, while innovators need to wrestle with the changes wrought by analytics.

    IBM’s Insight event unfolded in light of this wave of disruption. The lineup of corporate presenters converged on the same message: Analytics is for everyone, and your viability in the marketplace depends on it.

    […]

    IBM’s Insight 2015 conference sounded off on the most important trends in data usage and management. It also served as a wake-up call for developers, engineers, and tech leaders. As the Internet of Things alters the landscape of analytics, hardware design needs to change, software development requires novel approaches, and tech management must become more agile in order to realize data’s greatest benefits.

    So far there are 1 million websites using Piwik… but what if there could be 10 or 50 million things (sensors, devices) being measured by Piwik?

    Together we will create the best open-source, general-purpose analytics platform, one that is engineered to last and designed to help humanity keep control and gain freedom.

    We aim for Piwik to be the ideal platform for measuring the Internet of Things.

    We’re still at the beginning of this journey and it will take the best of all of us to get there.

    See you on the way!

    PS: if you’d like to get involved with Piwik, we would be glad to welcome you!