Advanced search

Media (0)

Keyword: - Tags -/protocoles

No media matching your criteria is available on the site.

Other articles (48)

  • Accepted formats

    28 January 2010

    The following commands provide information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    As a first step, we (...)
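
    For instance, the same check can be scripted (a minimal Node.js sketch, assuming an ffmpeg binary is on the PATH; filtering for h264 is only an illustration):

    // Query the local ffmpeg build and report whether it lists h264 support.
    const { execFile } = require('child_process');

    execFile('ffmpeg', ['-codecs'], (err, stdout) => {
        if (err) throw err;
        // Each supported codec appears on its own line of the listing.
        const h264Lines = stdout.split('\n').filter((line) => line.includes(' h264 '));
        console.log(h264Lines.length ? h264Lines.join('\n') : 'h264 not listed');
    });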

  • The SPIPmotion queue

    28 November 2010

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
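
    As an illustration only (the remaining fields are truncated in this excerpt, and the values below are invented), a queue row shaped after these fields might look like this:

    const attente = {
        id_spipmotion_attente: 42,  // unique numeric id of the task to process
        id_document: 128,           // numeric id of the original document to encode
        id_objet: 7,                // id of the object the encoded document is attached to
        objet: 'article',           // type of that object (example value)
    };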

  • Using and configuring the script

    19 January 2011

    Information specific to the Debian distribution
    If you use this distribution, you will need to enable the "debian-multimedia" repositories as explained here:
    Since version 0.3.1 of the script, the repository can be enabled automatically after a prompt.
    Getting the script
    The installation script can be retrieved in two different ways.
    Via svn, using the following command to fetch the up-to-date source code:
    svn co (...)

On other sites (5843)

  • Dreamcast Finds

    15 April 2022, by Multimedia Mike — Sega Dreamcast

    Pursuant to my recent post about finally understanding how Sega Dreamcast GD-ROM rips are structured, I was able to prepare the contents of various demo discs in a manner that makes exploration easy via the Internet Archive. This is due to the way that IA makes it easy to browse archives such as ZIP or ISO files (anything that 7zip knows how to unpack), and also presents the audio tracks for native playback directly through the web browser.

    These are some of the interesting things I have found while perusing the various Dreamcast sampler discs.

    Multimedia Formats
    First and foremost: Multimedia-wise, SFD and ADX files abound on all the discs. SFD files are Sofdec, a middleware format used for a lot of FMV in Dreamcast games. These were little more than MPEG video files with a non-MPEG audio codec (ADPCM instead). VLC will usually play the video portions of these files but has trouble detecting the audio. It's not for lack of audio codec support, because it can play the ADX files just fine.

    It should be noted that Dreamcast Magazine Disc 11 has an actual .mpg file (as opposed to a .sfd file) that has proper MPEG audio instead of ADX ADPCM.

    The only other multimedia format I know of that was used in any Dreamcast game was 4XM, used in Alone In The Dark: The New Nightmare. I wrote a simple C tool a long time ago to recover these files from a disc image I extracted myself. Rather than interpreting the ISO-9660 filesystem, the tool just crawled through the binary blob, searching for '4XMV' file signatures and using length data within the files for extraction.
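
    Roughly the same idea, sketched here in JavaScript rather than the original C (the length-field offset and endianness are assumptions for illustration, not the actual 4XM layout):

    const fs = require('fs');

    function carve4xm(imagePath) {
        const blob = fs.readFileSync(imagePath);
        const magic = Buffer.from('4XMV');
        let offset = 0;
        let count = 0;
        while ((offset = blob.indexOf(magic, offset)) !== -1) {
            // Assumed: a 32-bit little-endian length follows the signature.
            const length = blob.readUInt32LE(offset + 4);
            fs.writeFileSync(`movie_${count++}.4xm`, blob.subarray(offset, offset + length));
            offset += Math.max(length, magic.length);
        }
    }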

    Also, there are plentiful PVR files (in reference to the PowerVR2 GPU hardware that the DC uses) which ‘file’ dutifully identifies as “Sega PVR image”. There are probably tools to view them. It doesn’t appear to be a complicated format.

    Scripting
    I was fascinated to see Lua files on at least one of the discs. It turns out that MDK 2 leverages the language, as several other games do. But it was still interesting to see the .lua files show up in the Dreamcast version as well.

    That Windows CE Logo
    Every Sega Dreamcast is famously emblazoned with a logo mentioning Microsoft Windows CE:


    Windows CE Logo on Dreamcast

    It has confused many folks. It also confused me until this exploratory exercise. Many would wonder if the Dreamcast booted up into some Windows CE OS environment that then ran the game, but that certainly wasn’t it. Indeed, Dreamcast was one of the last consoles that really didn’t have any kind of hypervisor operating system managing everything.

    I found a file called rt2dc.exe on one sampler disc. At first, I suspected that this was a development utility for Windows to convert some “RT” graphical format into a format more suitable for the Dreamcast. Then ‘file’ told me that it was actually a Windows EXE, but compiled for the Hitachi SH-4 CPU (the brain inside the DC). Does the conversion utility run on the Dreamcast itself? Then I analyzed the strings inside the binary and saw references to train stations. That's when it started to click for me that this was the binary executable for the demo version of Railroad Tycoon 2: Gold Edition, hence “rt2dc.exe”. Still, this provides some insight about whether the Dreamcast “runs” Windows. This binary was built against a series of Windows CE libraries. The symbols also imply DirectX compatibility.

    Here is a page with more info about the WinCE/DirectX variant for the Sega Dreamcast. It seems that this was useful for closing the gap between PC and DC ports of games (i.e., being able to re-use more code between the 2 platforms). I guess this was part of what made Dreamcast a dry run for the DirectXbox (later Xbox).

    Here is a list of all the Dreamcast games that are known to use Windows CE.

    Suddenly, I am curious if tools such as IDA Pro or Ghidra can possibly open up Windows CE binaries that contain SH-4 code. Not that I’m particularly interested in reverse engineering any algorithms locked up in Dreamcast land.

    Tomb Raider Easter Egg
    The volume 6 sampler disc has a demo of Tomb Raider: The Last Revelation. While inspecting the strings, I found an Easter egg. I was far from the first person to discover it, though, as seen on this The Cutting Room Floor wiki page (look under “Developer Message”). It looks like I am the first person to notice it on the Dreamcast version. It shows up at offset 0xE3978 in the Dreamcast (demo version) binary, if anyone with permissions wants to update the page.

    Web Browser
    Then there’s the Web Browser for Sega Dreamcast. It seemed to be included on a lot of these sampler discs. But calling it just a web browser undersells it: it also bundled an email client and an IRC client. It’s important to remember that the Dreamcast also had a keyboard peripheral.

    I need to check the timeline for when the web browser first became available vs. when the MIL-CD hack became known. My thinking is that there is no way the web browser program didn’t have some security issues (buffer overflows and the like). It seems like this would have been a good method of breaking the security of the system.

    Ironically, I suddenly can think of a reason why one might want to use advanced reverse engineering tools on Dreamcast binaries, something I struggled with just a few paragraphs ago.

    Odds ‘n Ends
    It’s always fun to find plain text files among video game assets and to speculate on their precise meaning… while also marveling at how long people have been struggling to correctly spell “length”.

    Internationalization via plain text files.

    Another game (Slave Zero) saw fit to zip its assets. Maybe this was to save space in order to fit everything on the magazine sampler disc. Quizzically, this didn’t really save an appreciable amount of space.

    Finally, all the discs have an audio track 2 that advises that the disc must be played in a Dreamcast console. Not unusual. However, volume 4 also has a Japanese lady saying the same thing on track 4. This is odd because track 4 is one of the GD area audio tracks and is not accessible with normal CD hardware. Further, she identifies the disc as a “Windows CE disc”.

    The post Dreamcast Finds first appeared on Breaking Eggs And Making Omelettes.

  • 10 Key Google Analytics Limitations You Should Be Aware Of

    9 May 2022, by Erin

    Google Analytics (GA) is the biggest player in the web analytics space. But is it as “universal” as its brand name suggests?

    Over the years users have pointed out a number of major Google Analytics limitations. Many of these are even more visible in Google Analytics 4. 

    Introduced in 2020, Google Analytics 4 (GA4) has been received with scepticism. As the 1 July 2023 sunset date for the current version, Google Universal Analytics (UA), approaches, the dismay grows stronger.

    To the point where people are pleading with others to intervene:

    GA4 Elon Musk Tweet
    Source: Chris Tweten via Twitter

    Main limitations of Google Analytics

    Google Analytics 4 is advertised as a more privacy-centred, comprehensive and “intelligent” web analytics platform. 

    According to Google, the newest version touts:

    • Machine learning at its core provides better segmentation and fast-track access to granular insights 
    • Privacy-by-design controls, addressing restrictions on cookies and new regulatory demands 
    • More complete understanding of customer journeys across channels and devices 

    Some of these claims hold true. Others crumble upon deeper investigation. Newly advertised Google Analytics capabilities such as ‘custom events’, ‘predictive insights’ and ‘privacy consent mode’ offer only marginal improvements.

    A complex setup, a poor UI and a lack of migration support also leave many users frustrated with GA4.

    Let’s unpack all the current (and legacy) limitations of Google Analytics you should account for. 

    1. No Historical Data Imports 

    Google rushed users to migrate from Universal Analytics to Google Analytics 4. But they overlooked one important precondition — backwards compatibility. 

    You have no way to import data from Google Universal Analytics to Google Analytics 4. 

    Historical records are essential for analysing growth trends and creating benchmarks for new marketing campaigns. Effectively, you are cut off from past insights — and forced to start strategising from scratch.

    At present, Google offers two feeble solutions:

    • Run data collection in parallel and have separate reporting for GA4 and UA until the latter is shut down. Then your UA records are gone. 
    • For Ecommerce data, manually duplicate events from UA at a new GA4 property while trying to figure out the new event names and parameters. 

    Google’s new data collection model is the reason for migration difficulties. 

    In Google Analytics 4, all analytics hit types — page hits, social hits, app/screen views, etc. — are recorded as events. Accordingly, the ‘event’ parameter in GA4 differs from the one in Google Universal Analytics, as the company explains:

    GA4 vs Universal Analytics event parameters
    Source: Google

    This change makes migration tedious — and Google offers little assistance with setting up proper events and custom dimensions.
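
    For a concrete (invented) example of the shift, the same interaction could be sent with the old analytics.js syntax versus GA4’s gtag.js event model like so:

    // Universal Analytics (analytics.js): fixed category / action / label fields.
    ga('send', 'event', 'Videos', 'play', 'Fall Campaign');

    // Google Analytics 4 (gtag.js): everything is an event with arbitrary parameters.
    gtag('event', 'video_play', {
      video_title: 'Fall Campaign',
      video_provider: 'youtube'
    });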

    2. Data Collection Limits 

    If you’ve wrapped your head around new GA4 events, congrats! You did a great job, but the hassle isn’t over.

    You still need to pay attention to new Google Analytics limits on data collection for event parameters and user properties. 

    GA4 Event limits
    Source: Google

    These apply to:

    • Automatically collected events
    • Enhanced measurement events
    • Recommended events 
    • Custom events 

    When it comes to custom events, GA4 also has a limit of 25 custom parameters per event. Even though that seems like a lot, it may not be enough for bigger websites.

    You can get higher limits by upgrading to Google Analytics 360, but the costs are steep. 

    3. Limited GDPR Compliance 

    Google Analytics has a complex history with European GDPR compliance.

    A 2020 ruling by the Court of Justice of the European Union (CJEU) invalidated the Privacy Shield framework Google leaned upon. This framework allowed the company to regulate EU-US data transfers of sensitive user data. 

    But after this loophole was closed, Google faced a series of heavy privacy-related fines:

    • French data protection authority CNIL ruled that “the transfers to the US of personal data collected through Google Analytics are illegal” — and proceeded to fine Google a record-setting €150 million at the beginning of 2022. 
    • Austrian regulators also deemed Google in breach of GDPR requirements and branded the analytics tool illegal. 

    Other EU-member states might soon proceed with similar rulings. These, in turn, can directly affect Google Analytics users, whose businesses could face brand damage and regulatory fines for non-compliance. In fact, companies cannot select where the collected analytics data will be stored — on European servers or abroad — nor can they obtain this information from Google.

    Getting a web analytics platform that allows you to keep data on your own servers or select specific Cloud locations is a great alternative. 

    Google also has been lax with its cookie consent policy and doesn’t properly inform consumers about data collection, storage or subsequent usage. Google Analytics 4 addresses this issue to an extent. 

    By default, GA4 relies on first-party cookies instead of third-party ones — which is a step forward. But the user privacy controls are hard to configure without losing most of the GA4 functionality, and applying user consent mode to the different types of data collection also requires a heavy setup.
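
    For instance, the basic consent mode wiring looks roughly like this (a simplified sketch; a real setup still has to connect these calls to a consent banner and cover the other storage types):

    // Before any measurement runs: default analytics and ad storage to denied.
    gtag('consent', 'default', {
      analytics_storage: 'denied',
      ad_storage: 'denied'
    });

    // Later, once the visitor accepts analytics cookies in the banner:
    gtag('consent', 'update', {
      analytics_storage: 'granted'
    });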

    4. Strong Reliance on Sampled Data 

    To compensate for ditching third-party cookies, GA4 more heavily leans on sampled data and machine learning to fill the gaps in reporting. 

    In GA4, sampling automatically applies when you:

    • Perform advanced analysis such as cohort analysis, exploration, segment overlap or funnel analysis without enough data 
    • Have over 10,000,000 data rows and generate any type of non-default report 

    Google also notes that data sampling can occur at lower thresholds when you are trying to get granular insights, if there isn’t enough data or if Google deems the query too complex to retrieve.

    In their words:

    Source: Google

    Data sampling adds “guesswork” to your reports, meaning you can’t be 100% sure of data accuracy. The divergence from actual data depends on the size and quality of sampled data. Again, this isn’t something you can control. 

    Unlike Google Analytics 4, Matomo applies no data sampling. Your reports are always accurate and fully representative of actual user behaviours. 

    5. No Proper Data Anonymization 

    Data anonymization allows you to collect basic analytics about users — visits, clicks, page views — but without personally identifiable information (PII) such as geo-location, assigned tracking IDs or other cookie-based data.

    This reduces your ability to:

    • Remarket 
    • Identify repeating visitors
    • Do advanced conversion attribution 

    But you still get basic data from users who ignored or declined consent to data collection. 

    By default, Google Analytics 4 anonymizes all user IP addresses — an upgrade from UA. However, it still assigns a unique user ID to each user, and these IDs count as personal data under GDPR.

    For comparison, Matomo provides more advanced privacy controls. You can anonymize :

    • Previously tracked raw data 
    • Visitor IP addresses
    • Geo-location information
    • User IDs 

    This can ensure compliance, especially if you operate in a sensitive industry — and delight privacy-minded users!

    6. No Roll-Up Reporting

    Getting a bird’s-eye view of all your data is helpful when you need at-a-glance access to the main stats — global traffic volume, user count or percentage of returning visitors.

    With Roll-Up Reporting, you can see global performance metrics for multiple localised properties (.co.nz, .co.uk, .com, etc.) on one screen, then zoom in on specific localised sites when you need to.

    7. Report Processing Latency 

    The average data processing latency is 24-48 hours with Google Analytics. 

    Accounts with over 200,000 daily sessions get data refreshes only once a day, so you won’t be seeing the latest data on core metrics. This can be a bummer during one-day promo events like Black Friday or Cyber Monday, when real-time information can prove to be game-changing!

    Matomo processes data with lower latency even for high-traffic websites. Currently, we have 6-24 hour latency for cloud deployments. On-premises web analytics can be refreshed even faster — within an hour or instantly, depending on the traffic volumes. 

    8. No Native Conversion Optimisation Features

    Google Analytics users have to use third-party tools to get deeper insights like how people are interacting with your webpage or call-to-action.

    You can use the free Google Optimize tool, but it comes with limits:

    • No segmentation is available 
    • Only 10 simultaneously running experiments are allowed 

    There isn’t a native integration between Google Optimize and Google Analytics 4. Instead, you have to manually link an Optimize Container to an analytics account. Also, you can’t select experiment dimensions in Google Analytics reports.

    What’s more, Google Optimize is a basic CRO tool, best suited for split testing (A/B testing) of copy, visuals, URLs and page layouts. If you want to get more advanced data, you need to pay for extra tools. 

    Matomo comes with a set of built-in conversion optimisation features:

    • Heatmaps 
    • User session recording 
    • Sales funnel analysis 
    • A/B testing 
    • Form submission analytics 
    A/B test hypothesis testing on Matomo

    9. Deprecated Annotations

    Annotations come in handy when you need to provide extra context to other team members. For example, point out unusual traffic spikes or highlight a leak in the sales funnel. 

    This feature was available in Universal Analytics but is now gone in Google Analytics 4. But you can still quickly capture, comment and share knowledge with your team in Matomo. 

    You can add annotations to any graph that shows statistics over time including visitor reports, funnel analysis charts or running A/B tests. 

    10. No White Label Option 

    This might be a minor limitation of Google Analytics, but a tangible one for agency owners. 

    Offering an on-brand, embedded web analytics platform can elevate your customer experience. But white label analytics were never a thing with Google Analytics, unlike Matomo. 

    Wrap Up 

    Google set a high bar for web analytics. But Google Analytics’ inherent limitations around privacy, reporting and deployment options prompt more users to consider Google Analytics alternatives, like Matomo.

    With Matomo, you can easily migrate your historical data records and store customer data locally or in a designated cloud location. We operate by a 100% unsampled data principle and provide an array of privacy controls for advanced compliance. 

    Start your 21-day free trial (no credit card required) to see how Matomo compares to Google Analytics ! 

  • ffmpeg command never works in a Lambda function using Node.js [closed]

    4 December 2022, by Santosh swain

    I am trying to implement FFmpeg video streaming functionality similar to Instagram's countdown feature. In this code, I first get records (URLs) from the S3 bucket, split them according to my needs, then build the ffmpeg command and execute it with exec() from child_process. I am trying to store the output in a specific folder inside the Lambda function, but it is never stored. I thought Lambda allowed writing files locally, so I am also trying to upload directly to the S3 bucket using the stdout parameter of exec()'s callback. Guys, please help me do that. My question: does Lambda allow writing content to its local folders? If not, what is the way to do it? I am sharing my code below; please guide me.

    // dependencies
var AWS = require('aws-sdk');
var s3 = new AWS.S3();
var { exec } = require('child_process');
var path = require('path')
var AWS_ACCESS_KEY = '';
var AWS_SECRET_ACCESS_KEY = '';
var fs = require('fs')

s3 = new AWS.S3({
    accessKeyId: AWS_ACCESS_KEY,
    secretAccessKey: AWS_SECRET_ACCESS_KEY
});

exports.handler = async function (event, context) {

    var bucket_name = "sycu-game";
    var bucketName = "sycu-test";

    //CREATE OVERLAY AND BG_VALUE PATH TO GET VALUE FROM S3
    const bgValue = (event.Records[0].bg_value).split('/');
    const overlayImage = (event.Records[0].overlay_image_url).split('/');


    var s3_bg_value = bgValue[3] + "/" + bgValue[4];
    var s3_overlay_image = overlayImage[4] + "/" + overlayImage[5] + "/" + overlayImage[6];
    const signedUrlExpireSeconds = 60 * 5;


    //RETREIVE BG_VALUE FROM S3 AND CREATE URL FOR FFMPEG INPUT VALUE
    var bg_value_url = s3.getSignedUrl('getObject', {
        Bucket: bucket_name,
        Key: s3_bg_value,
        Expires: signedUrlExpireSeconds
    });
    bg_value_url = bg_value_url.split("?");
    bg_value_url = bg_value_url[0];


    //RETREIVE OVERLAY IMAGE FROM S3 AND CREATE URL FOR FFMPEG INPUT VALUE   
    var overlay_image_url = s3.getSignedUrl('getObject', {
        Bucket: bucket_name,
        Key: s3_overlay_image,
        Expires: signedUrlExpireSeconds
    });
    overlay_image_url = overlay_image_url.split("?");
    overlay_image_url = overlay_image_url[0];


    //MANUAL ASSIGN VARIABLE FOR FFMPEG COMMAND 
    var command,
        ExtraTimerSec = event.Records[0].timer_seconds + 5,
        TimerSec = event.Records[0].timer_seconds + 1,
        BackgroundWidth = 1080,
        BackgroundHeight = 1920,
        videoPath = (__dirname + '/tmp/' + event.Records[0].name);
    console.log("path", videoPath)
    //TEMP DIRECTORY

    var videoPath = '/media/volume-d/generatedCountdownS3/tmp/' + event.Records[0].name
    var tmpFile = fs.createWriteStream(videoPath)
    //FFMPEG COMMAND 
    if (event.Records[0].bg_type == 2) {
        if (event.Records[0].is_rotate) {
            command = ' -stream_loop -1 -t ' + ExtraTimerSec + ' -i ' + bg_value_url + ' -i ' + overlay_image_url + ' -filter_complex "color=color=0x000000@0.0:s= ' + event.Records[0].resized_box_width + 'x' + event.Records[0].resized_box_height + ',drawtext=fontcolor=' + event.Records[0].time_text_color + ':fontsize=' + event.Records[0].time_text_size + ':x=' + event.Records[0].minute_x + ':y=' + event.Records[0].minute_y + ':text=\'%{eif\\:trunc(mod(((' + TimerSec + '-if(between(t, 0, 1),1,if(gte(t,' + TimerSec + '),' + TimerSec + ',t)))/60),60))\\:d\\:2}\',drawtext=fontcolor=' + event.Records[0].time_text_color + ':fontsize=' + event.Records[0].time_text_size + ':x=' + event.Records[0].second_x + ':y=' + event.Records[0].second_y + ':text=\'%{eif\\:trunc(mod(' + TimerSec + '-if(between(t, 0, 1),1,if(gte(t,' + TimerSec + '),' + TimerSec + ',t))\,60))\\:d\\:2}\'[txt]; [txt] rotate=' + event.Records[0].box_angle + '*PI/180:fillcolor=#00000000 [rotated];[0] scale=w=' + BackgroundWidth + ':h=' + BackgroundHeight + '[t];[1] scale=w=' + BackgroundWidth + ':h=' + BackgroundHeight + '[ot];[t][ot] overlay = :x=0 :y=0 [m1];[m1][rotated]overlay = :x=' + event.Records[0].flat_box_coordinate_x + ' :y=' + event.Records[0].flat_box_coordinate_x + ' [m2]" -map "[m2]" -pix_fmt yuv420p -t ' +
                ExtraTimerSec + ' -r 24 -c:a copy ' + videoPath + "";
        }
        else {
            command = ' -stream_loop -1 -t ' + ExtraTimerSec + ' -i ' + bg_value_url + ' -i ' + overlay_image_url + ' -filter_complex "color=color=0x000000@0.0:s= ' + event.Records[0].resized_box_width + 'x' + event.Records[0].resized_box_height + ',drawtext=fontcolor=' + event.Records[0].time_text_color + ':fontsize=' + event.Records[0].time_text_size + ':x=' + event.Records[0].minute_x + ':y=' + event.Records[0].minute_y + ':text=\'%{eif\\:trunc(mod(((' + TimerSec + '-if(between(t, 0, 1),1,if(gte(t,' + TimerSec + '),' + TimerSec + ',t)))/60),60))\\:d\\:2}\',drawtext=fontcolor=' + event.Records[0].time_text_color + ':fontsize=' + event.Records[0].time_text_size + ':x=' + event.Records[0].second_x + ':y=' + event.Records[0].second_y + ':text=\'%{eif\\:trunc(mod(' + TimerSec + '-if(between(t, 0, 1),1,if(gte(t,' + TimerSec + '),' + TimerSec + ',t))\,60))\\:d\\:2}\'[txt]; [txt] rotate=' + event.Records[0].box_angle + '*PI/180:fillcolor=#00000000 [rotated];[0] scale=w=' + BackgroundWidth + ':h=' + BackgroundHeight + '[t];[1] scale=w=' + BackgroundWidth + ':h=' + BackgroundHeight + '[ot];[t][ot] overlay = :x=0 :y=0 [m1];[m1][rotated]overlay = :x=' + event.Records[0].flat_box_coordinate_x + ' :y=' + event.Records[0].flat_box_coordinate_x + ' [m2]" -map "[m2]" -pix_fmt yuv420p -t ' +
                ExtraTimerSec + ' -r 24 -c:a copy ' + videoPath + "";
        }
    }
    var final_command = '/usr/bin/ffmpeg' + command;


    //COMMAND EXECUTE HERE

    await exec(final_command, function (err, stdout, stderr) {
        console.log("data is here")
        console.log('err:', err);
        console.log('stdout:', stdout);
        console.log('stderr:', stderr);
        const params = {
            Bucket: bucketName,
            Key: "countdown/output.mp4",
            Body: stdout,
        }
        s3.upload(params).promise().then(data => {
            console.log("data is here -->", data)
        });
    });
    var tmpFile = fs.createReadStream(videoPath)
    console.log('temp file data:', tmpFile.toString())
};
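
    For reference, Lambda functions can only write to /tmp (other paths, including __dirname and /media/..., are read-only), so a minimal sketch of the intended flow might look like the following: build the output path under /tmp, wait for ffmpeg via a promisified exec, and upload the finished file from disk rather than stdout. The bucket and key names and the pre-built ffmpeg arguments are assumed from the code above.

const { promisify } = require('util');
const { exec } = require('child_process');
const fs = require('fs');
const AWS = require('aws-sdk');

const execAsync = promisify(exec);
const s3 = new AWS.S3();

// `ffmpegArgs` is assumed to be the argument string built as above, minus the
// output path, which is redirected to /tmp here.
async function encodeAndUpload(ffmpegArgs, outputName, bucketName) {
    // Lambda only allows writes under /tmp.
    const outputPath = '/tmp/' + outputName;

    // promisify(exec) returns a real promise, so `await` waits for ffmpeg to
    // finish (awaiting exec() with a callback does not).
    const { stderr } = await execAsync('/usr/bin/ffmpeg ' + ffmpegArgs + ' ' + outputPath);
    console.log('ffmpeg log:', stderr);

    // Upload the finished file from /tmp; the encoded video goes to the output
    // file, not to stdout, so uploading stdout would send an empty body.
    return s3.upload({
        Bucket: bucketName,
        Key: 'countdown/' + outputName,
        Body: fs.createReadStream(outputPath),
    }).promise();
}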