
Other articles (100)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
    Once it is activated, MediaSPIP init automatically applies a preconfiguration so that the new feature is immediately operational. No configuration step is therefore required.

  • Improvements to the base version

    13 September 2013

    A nicer multiple select
    The Chosen plugin improves the ergonomics of multiple-select fields. Compare the two images below.
    To use it, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-select lists (...)

  • Managing creation and editing rights for objects

    8 February 2011

    By default, many features are restricted to administrators, but each remains configurable independently so the minimum status required to use it can be changed, notably: writing content on the site, adjustable in the form-template management; adding notes to articles; adding captions and annotations to images;

On other sites (11344)

  • ISO-9660 Compromise, Part 2: Finding Root

    25 October 2021, by Multimedia Mike — General

    A long time ago, I dashed off a quick blog post with a curious finding after studying the ISO-9660 spec: The format stores multi-byte numbers in a format I termed “omni-endian”– the committee developing the format apparently couldn’t come to an agreement on this basic point regarding big- vs. little-endian encoding (I’m envisioning something along the lines of “tastes great! … less filling!” in the committee meetings).

    I recently discovered another bit of compromise in the ISO-9660 spec: It seems that there are 2 different methods for processing the directory structure. That means it’s incumbent upon ISO-9660 creation software to fill in the data structures to support both methods, because some ISO-reading programs out there rely on one set of data structures while the rest prefer to read the other set.

    Background

    As a refresher, the “ISO” extension of an ISO file refers to the ISO-9660 specification. This is a type of read-only filesystem (i.e., the filesystem is created once and never updated after initial creation) for the purpose of storing on a read-only medium, often an optical disc (CD-ROM, DVD-ROM). The level of nostalgic interest I display for the ISO-9660 filesystem reminds me of my computer science curriculum professors from the mid-90s reminiscing about ye olden days of punchcard programming, but such is my lot. I’m probably also alone in my frustration of seeing rips of, e.g., GameCube or Xbox or 3DO games being tagged with the extension .ISO since those systems use different read-only filesystems.

    I recently fell in with an odd bunch called the eXoDOS project and was trying to help fill in a few gaps. One request was a 1994 game called Power Drive for DOS.


    Power Drive CD-ROM

    My usual CD-ROM ripping method (for the data track) is a simple ‘dd’ command from a Linux command line to copy the string of raw sectors. However, it turned out to be unusually difficult to open the resulting ISO. A few of the options I know of worked but most didn’t. What’s the difference?

    Methods that work:

    • Mounting the file with the Linux iso9660 kernel module, i.e.,
      mount -t iso9660 /dev/optical-drive /mnt

      or

      mount -t iso9660 -o loop /path/to/Power-Drive.iso /mnt
    • Directory Opus
    • Windows 10 can read the filesystem when reading the physical disc
    • Windows 10 can burn the ISO image to a new CD (“right click” -> “Burn disc image”); this method does not modify any of the existing sectors but did append 149 additional empty sectors

    Methods that don’t work:

    Understanding The Difference

    I think I might have a handle on why some tools are able to process this disc while most can’t. There appear to be 2 sets of data structures describing the base of the filesystem: a root directory, and a path table. These both occur in the first substantive sector of the ISO-9660 filesystem, usually sector 16.

    A compact disc can be abstractly visualized as a long string of sectors, each one 2,352 bytes long. (See my Grand Unified Theory of Compact Disc post for deeper discussion.) A CD-ROM data track will contain 2048 bytes of data. Thus, sector 16 appears at 0x8000 of an ISO filesystem. I like the clarity of this description of the ISO-9660 spec. It shows that the path table is defined at byte 140 (little-endian; big comes later) and location of the root directory is at byte 158. Thus, these locations generally occur at 0x808c and 0x809e.


    Primary Volume Descriptor

    The path table is highlighted in green and the root directory record is highlighted in red. These absolute locations are specified in sectors. So the path table is located at sector 0x12 = offset 0x9000 in the image, while the root directory record is supposed to be at sector 0x62 = 0x31000. Checking into those sectors, it turns out that the path table is valid while the root directory record is invalid. Thus, any tool that relies on the path table will be successful in interpreting the disc, while tools that attempt to recursively traverse starting from the root directory record are gonna have a bad time.
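
    For the curious, here is a minimal sketch (Python; the filename is a placeholder, and a plain 2048-byte-per-sector image is assumed, not a raw 2352-byte rip) that reads both pointers from the Primary Volume Descriptor at the offsets just described:

        # Minimal sketch: read the path table and root directory pointers
        # from an ISO-9660 Primary Volume Descriptor.
        import struct

        SECTOR = 2048

        with open("Power-Drive.iso", "rb") as f:   # placeholder filename
            f.seek(16 * SECTOR)                    # PVD normally lives at sector 16 (0x8000)
            pvd = f.read(SECTOR)

        assert pvd[1:6] == b"CD001", "not an ISO-9660 volume descriptor"

        # Byte 140: LBA of the little-endian path table (the big-endian copy comes later).
        path_table_lba = struct.unpack_from("<I", pvd, 140)[0]

        # Byte 156 starts the 34-byte root directory record; its extent LBA sits at
        # offset +2, stored both-endian (little at +2, big at +6), hence byte 158.
        root_dir_lba = struct.unpack_from("<I", pvd, 158)[0]

        print(f"path table at sector {path_table_lba:#x} = offset {path_table_lba * SECTOR:#x}")
        print(f"root directory at sector {root_dir_lba:#x} = offset {root_dir_lba * SECTOR:#x}")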

    Since I was able to view the filesystem with a few different tools, I know what the root directory contains. Searching for those filenames reveals that the root directory was supposed to point to the next sector, number 0x63. So this was a bizarre off-by-1 error on the part of the ISO creation tool. Maybe. I manually corrected 0x62 -> 0x63 and that fixed the interaction with fuseiso, but not with other tools. So there may have been some other errors. Note that a quick spot-check of another, functional ISO revealed that this root directory sector is supposed to be exact, not 1-indexed.
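
    For anyone wanting to replicate that manual fix, a sketch under the same assumptions as the snippet above (and remembering that the extent LBA is stored both-endian, so both copies need patching):

        # Sketch of the manual patch described above: bump the root directory
        # extent LBA by 1 and keep the little- and big-endian copies in sync.
        import struct

        SECTOR = 2048
        ROOT_RECORD = 16 * SECTOR + 156       # root directory record inside the PVD

        with open("Power-Drive.iso", "r+b") as f:
            f.seek(ROOT_RECORD + 2)
            old = struct.unpack("<I", f.read(4))[0]
            new = old + 1                     # 0x62 -> 0x63 for the disc discussed here
            f.seek(ROOT_RECORD + 2)
            f.write(struct.pack("<I", new))   # little-endian copy
            f.seek(ROOT_RECORD + 6)
            f.write(struct.pack(">I", new))   # big-endian copy

        print(f"patched root extent {old:#x} -> {new:#x}")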

    Upon further inspection, I noticed that, while fuseiso appeared to work with that one patch, none of the files returned correct data, and none of the directories contained anything. That’s when I noticed that ALL of the sector locations described in the various directory and file records are off by 1!

    Further Investigation

    I have occasionally run across ISO images on the Internet Archive that return the error about not being able to read the contents when trying to “View contents” (error text: “failed to obtain file list from xyz.iso”, as seen with this ISO). Too bad I didn’t make a record of them because I would be interested to see if they have the same corruption.

    Eventually, I’ll probably be able to compile an archive of deviant ISO-9660 images. A few months ago, I was processing a large collection from IA and found a corrupted ISO which had a cycle, i.e., the subdirectory pointed to a parent directory, which caused various ISO tools to loop forever. Just one of those things that is “never supposed to happen”, so why write code to deal with it gracefully?

    See Also

    The post ISO-9660 Compromise, Part 2: Finding Root first appeared on Breaking Eggs And Making Omelettes.

  • How to Choose a GDPR Compliant Web Analytics Solution

    2 March 2022, by Matthieu Aubry — Privacy

    Since the launch of GDPR, one big question has lingered around with uncertainty – is Google Analytics GDPR compliant? The current GDPR enforcement trend happening across the EU is certainly shedding some light on this question.

    It started with the Austrian Data Protection Authority’s ruling on Google Analytics, and more recently CNIL (the French Data Protection Authority) followed suit by also ruling Google Analytics illegal to use. Organisations with EU-based web visitors are now scrambling to find a compliant solution.

    The French Data Protection Authority (CNIL) has already started delivering formal notices to websites using Google Analytics, so now is the time to act. According to CNIL, organisations have two options:

    1. Cease using the Google Analytics functionality (under the current conditions)
    2. Use a compliant web analytics tool that does not transfer data outside the EU

    Getting started 

    For organisations considering migrating to a compliant web analytics tool, I’ve outlined below the things you need to consider when weighing up compliant web analytics tools. Once you’ve made a choice, I’ve also included a step-by-step guide to migrating away from Google Analytics. This guide is useful regardless of which GDPR compliant analytics provider you choose.

    Before getting started, I recommend that you document your findings against the following considerations while reviewing GDPR compliant Google Analytics alternatives. This document can then be shared with your Data Protection Officer (DPO) to get their final recommendation.

    10 key considerations when selecting a GDPR compliant web analytics tool

    Many tools will claim to be GDPR compliant, so it’s important that you do your due diligence and review tools against the following considerations.

    1. Where does the tool store data?

    The rulings in France and Austria were based on the fact that Google Analytics stores data in the US, which does not have an adequate level of data protection. Your safest option is to find a tool that legally stores data in the EU.

    You should be able to find out where the data is stored in the organisation’s privacy policy. Generally, data storage information can be found under sections titled “Subprocessors” and “Third-party services”. Check out the Matomo Privacy Policy as an example. 

    If you’re unable to easily find this information or it’s unclear, reach out to the organisation for more information.

    2. Does the tool offer anonymous tracking?

    Anonymous tracking comes with many benefits, including:

    • The ability to track visitors without a cookie consent screen. Due to the privacy-respecting aspect of cookieless tracking, you don’t need to worry about the extra steps involved with compliant cookie banners.
    • More accurate data. When visitors deny tracking cookies, you lose out on valuable data. With anonymous tracking, no data is lost as you don’t need consent to track.
    • Simplified GDPR compliance. With this enabled, there are fewer steps you need to take to get GDPR compliant and stay GDPR compliant.

    For those reasons, it may be important for you to select a tool that offers anonymous tracking functionalities. The level of anonymous tracking you require will depend on your situation but you should look out for tools that allow you to:

    • Disable fingerprinting
    • Disable user profiles
    • Anonymise data
    • Track without cookies

    If you want to read more about data anonymization, check out this guide on data anonymization in web analytics.

    3. Does the tool integrate with my existing tech stack?

    You’ll want to ensure that a new web analytics tool will play well with other tools in your tech stack including things like your CMS (content management system), eCommerce shop, etc. You should list out all the existing tools that currently integrate with your Google Analytics and check that the same integrations can be re-created with the new tool, via integrations or APIs.

    If not, it could become costly trying to connect your existing tech stack to a new solution.

    4. Does the tool offer the same features and insights you are currently using in Google Analytics? Or more, if necessary?

    Just because you are moving to a new web analytics platform doesn’t mean you have to give up the insights, reports and features you’ve grown accustomed to with Google Analytics. Ensuring that a new platform provides the same features and reports that you value the most will result in a smoother transition away from Google Analytics.

    It’s unlikely that a new tool will have all of the same features as Google Analytics, so I’d recommend listing out and prioritising your business-critical features and reports. 

    If I had to guess, you probably set up Google Analytics years ago because it was the default option. Now is your chance to make the most of this switch from Google Analytics and find a tool that offers additional reports and features that better align with your business. If time permits, I’d highly recommend that you consider other features or reports that you might have been missing out on while using Google Analytics.

    Check out this comparison of Google Analytics vs Matomo to see a side-by-side feature comparison.

    5. Does the tool accept Google Analytics data imports?

    The historical data in Google Analytics is a critical asset for many businesses. Fortunately, some tools accept Google Analytics data imports so you don’t lose all of the data you’ve generated over time.

    However, it’s important to note that any data you import from Google Analytics to a new tool needs to be compliant data. I’ll cover this more below.

    6. Does the tool provide conversion tracking exports?

    Do you invest in paid advertising? If you do, then tracking the conversions from people clicking on these paid ads is critical in assessing your return on investment. Since sending IP addresses or other personal information to the US is illegal under GDPR, we can only assume that this will also apply to advertising pixel/conversion tracking (e.g., Facebook pixel, Google Ads conversion tracking, etc.).

    As an example, Matomo offers conversion tracking exports so you can get a better understanding of ad performance while meeting privacy laws and without requiring consent from users. See how it works with Matomo’s conversion tracking exports.

    7. How will you train up your in-house team? Or can you hire a contractor?

    This is a common concern of many, and rightfully so. You’ll want to confirm what resources are readily available so you can hit the ground running with your new web analytics tool. If you’d prefer to train up your in-house team, check the provider’s site for training resources, videos, guides, etc.

    If you’d rather hire an external contractor, we recommend heading to LinkedIn, reaching out to your community or asking the provider if they have any recommendations for contractors.

    In addition, check that the provider offers technical support or a forum, in case you have specific questions and need help.   

    8. Does the tool offer self-hosting? (optional)

    For organisations that want full control over their data and storage location, an on-premise web analytics tool will be the preferred option. From a GDPR perspective, this is also the easiest option for compliance.

    Keep in mind that this requires resources, regular maintenance, technical knowledge and/or technical consultants. If you’re unsure which option is best for your organisation, check out our on-premise vs cloud web analytics comparison breakdown.

    Find out more about self-hosting Matomo.

    9. Is the tool approved by the CNIL for tracking without consent?

    This is an important step for websites with French users. This step will help narrow down your selection of tools. The CNIL offers a programme to identify web analytics solutions that can be used without tracking consent. The CNIL’s list of recommended web analytics tools can act as your starting point for solutions to review.

    While this step is specific to sites with French users, it can also be helpful for websites with visitors from any other EU country.

    Benefits of consent-free tracking

    There are many benefits of tracking without consent.

    For one, it simplifies GDPR compliance and reduces the chances of GDPR breaches and fines. Cookie consent screens have recently been the target for EU Data Protection Authorities because many websites are unknowingly serving cookie consent screens that do not meet GDPR requirements. 

    Yet another benefit, and quite possibly the most important, is more accurate data. Even if a website displays a user-friendly, lawful consent screen, the majority of users will either ignore or reject cookie consent. Legally, website owners can’t track anything unless the visitor gives consent. So not having a cookie consent screen ensures that every visit is tracked and your web analytics data is 100% accurate.

    Lastly, many visitors have grown fatigued and frustrated with invasive cookie consent screens. Not having one on your site creates a user-friendly experience, which will likely result in longer user sessions and lower bounce rates.

    10. Does the tool offer a Data Processing Agreement (DPA)?

    Technically, any GDPR compliant web analytics tool should offer a DPA, but for the sake of completeness I’ve added this as a consideration. Double check that any tools you are looking at provide this legally binding document. It should be located in the web analytics provider’s Privacy Policy; if not, reach out to request it.

    As an example, here’s Matomo’s Data Processing Agreement which can be found in our Privacy Policy under Subprocessors. 

    That wraps up the key considerations. When it comes to compliance, privacy and customer data, Matomo leads the way. We are looking forward to helping you achieve GDPR compliance easily. Start your free 21-day trial of Matomo now – no credit card required.

    A step-by-step guide to migrating from Google Analytics

    Once you’ve identified a tool that suits your needs and your Data Protection Officer (DPO) has approved, you’re ready to get started. Here’s a simple step-by-step guide with all the important steps for you to follow :

    1. Before getting started, you should sign or download the Data Processing Agreement (DPA) offered by your new web analytics provider.

    2. Register for the new tool and configure it for compliance. The provider should offer guides on how to configure for GDPR compliance. This will include things like giving your users an easy way to opt out of all tracking, turning on cookieless tracking (or asking users for consent), and anonymising data and IP addresses.

    3. Inform your organisation about the change. Whether your colleagues use the tool or not, it’s important that you share information about the new tool with your staff. Let them know what the tool will be used for, who will use the tool and how it complies with GDPR. 

    4. Let your DPO know that you’ve removed Google Analytics and have implemented the new tool.

    5. Update your records of processing activities to include the new tool.

    6. Update your privacy policy. You’ll need to include details about the web analytics provider, where the data is stored, what data is being collected, how long the data will be stored and why the data is being collected. The web analytics tool should readily have this information for you.

    As an example, if you decide to use Matomo as your web analytics tool, we provide a Privacy Policy template for you to use on your site and a guide on how to complete your privacy policy under GDPR with Matomo. Note that these are only applicable if you are using Matomo.

    In addition, if the tool has an opt-out feature, you will also need to put the opt-out into the privacy policy (e.g., when using cookieless tracking).

    7. Now, the exciting part. Add the tracking code to your site by following the steps provided by the web analytics tool. 

    If you’re not comfortable with this step, the provider should offer steps to do this and you can share this with your web developer.
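
    For most sites this simply means pasting the JavaScript snippet shown in your analytics tool’s admin interface. Purely as an illustration, here is a hedged sketch of the alternative server-side route using Matomo’s HTTP Tracking API (the endpoint and parameters follow the Matomo documentation; the hostname and site ID are placeholders):

        # Illustrative sketch only: record one pageview via Matomo's HTTP
        # Tracking API instead of the JavaScript snippet.
        import requests

        MATOMO_URL = "https://your-matomo.example/matomo.php"   # placeholder host

        params = {
            "idsite": 1,        # your Matomo site ID (placeholder)
            "rec": 1,           # required: actually record the request
            "url": "https://example.org/pricing",
            "action_name": "Pricing",
        }
        requests.get(MATOMO_URL, params=params, timeout=10).raise_for_status()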

    8. Once added, log in to your tool and check to see if traffic is being tracked.

    9. If your tool does not offer Google Analytics data imports or you do not need the historical data in your new tool, go to step 11. 

    To plan for your Google Analytics data migration, you’ll first need to establish what historical data is compliant with GDPR.

    For example, you shouldn’t import any data stored beyond the retention period established in your Privacy Policy or any personally identifiable information (PII) like IP addresses that aren’t anonymised. Discuss this further with your DPO.

    10. Once you’ve established what data you can legally import, then you can begin the import. Follow the steps provided by your new web analytics solution provider.

    11. Remove the Google Analytics tracking code from your site. This will stop the collection of your visitors’ data by Google as well as slightly increase the page load speed.

    If you still haven’t made a choice yet, try Matomo free for 21-days and see why over 1 million websites choose Matomo. 

  • How to debug ffmpeg reliability for long-running rtsp streams

    13 September 2022, by Mark

    I have a long running ffmpeg background process that "watches" an rtsp stream and takes snapshots every 7 minutes.

    It's being run like this

    C:\Windows\System32\cmd.exe /c C:\ffmpeg\bin\ffmpeg.exe -nostdin -rtsp_transport tcp -y -timeout 5000000 -i rtsp://someurl -q:v 1 -an -vf fps=0.002381,scale="1280:720" -strftime 1 -f image2 C:\somelocalfolder\%Y-%m-%d_%H-%M-%S.jpg > c:\ffmpeglog.txt 2>&1

    This process runs for days but intermittently, for hours at a time, seems to miss taking snapshots, until eventually it starts to take them again - then fail again, etc. The logs at info level are not helpful. I checked the stream during times when it was not taking snapshots and the stream was up. What’s happening here? How can I debug this?
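
    One way to gather more evidence (a sketch, not a verified fix): run the same command at trace log level and have ffmpeg write a report file, so that a silent stall at least leaves a trail. -loglevel and the FFREPORT environment variable are standard ffmpeg facilities; the paths and URL below are placeholders, and the command is wrapped in Python only for convenience:

        # Sketch: relaunch the same ffmpeg command with trace-level logging.
        import os
        import subprocess

        env = dict(os.environ)
        # FFREPORT makes ffmpeg write a complete log file; level=56 is trace.
        env["FFREPORT"] = "file=ffmpeg-report-%t.log:level=56"

        cmd = [
            r"C:\ffmpeg\bin\ffmpeg.exe", "-nostdin",
            "-loglevel", "repeat+level+trace",
            "-rtsp_transport", "tcp", "-y", "-timeout", "5000000",
            "-i", "rtsp://someurl",
            "-q:v", "1", "-an", "-vf", "fps=0.002381,scale=1280:720",
            "-strftime", "1", "-f", "image2",
            r"C:\somelocalfolder\%Y-%m-%d_%H-%M-%S.jpg",
        ]
        subprocess.run(cmd, env=env, check=False)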

    Below is an image of successful snapshots per hour. There should always be between 8 and 9.

    metrics on successful snapshots taken

    Logs look like this

        ffmpeg version 2022-03-31-git-e301a24fa1-full_build-www.gyan.dev Copyright (c) 2000-2022 the FFmpeg developers
  built with gcc 11.2.0 (Rev7, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
  libavutil      57. 24.101 / 57. 24.101
  libavcodec     59. 25.100 / 59. 25.100
  libavformat    59. 20.101 / 59. 20.101
  libavdevice    59.  6.100 / 59.  6.100
  libavfilter     8. 29.100 /  8. 29.100
  libswscale      6.  6.100 /  6.  6.100
  libswresample   4.  6.100 /  4.  6.100
  libpostproc    56.  5.100 / 56.  5.100
Input #0, rtsp, from 'rtsp://somerul':
  Metadata:
    title           : HIK Media Server V4.21.005
    comment         : HIK Media Server Session Description : standard
  Duration: N/A, start: 0.033000, bitrate: N/A
  Stream #0:0: Video: h264 (High), yuv420p(progressive), 704x576, 30 tbr, 90k tbn
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))
[swscaler @ 000002a1c2c20680] [swscaler @ 000002a1c2c2e0c0] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 000002a1c2c20680] [swscaler @ 000002a1c2c67c40] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 000002a1c2c20680] [swscaler @ 000002a1c2cc6700] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to 'C:\somelocalfolder\Temp\stream_2\StreamedImages\%Y-%m-%d_%H-%M-%S.jpg':
  Metadata:
    title           : HIK Media Server V4.21.005
    comment         : HIK Media Server Session Description : standard
    encoder         : Lavf59.20.101
  Stream #0:0: Video: mjpeg, yuvj420p(pc, progressive), 1280x720, q=2-31, 200 kb/s, 0.0024 fps, 0.0024 tbn
    Metadata:
      encoder         : Lavc59.25.100 mjpeg
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
frame=    1 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed=   0x   

    Update

    I got some trace logs. ffmpeg seems to fail silently at some point and stops taking snapshots.

    After about 3 million log lines (which is really only a couple of hours in my case) I get the following

    rtsp://192.168.15.195:554/streaming/channels/904: Unknown error

    But ffmpeg silently continues. Here is a bit more of the log

        [Parsed_fps_0 @ 00000248e7d50e40] Read frame with in pts 1074443040, out pts 28
[Parsed_fps_0 @ 00000248e7d50e40] Dropping frame with pts 28
frame=   28 fps=0.0 q=1.0 size=N/A time=03:08:59.77 bitrate=N/A speed=0.95x    
[rtsp @ 00000248e765cf00] tcp_read_packet:
[h264 @ 00000248e7d59880] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 3
[rtsp @ 00000248e765cf00] ret=-138 c=24 [$]
rtsp://192.168.15.195:554/streaming/channels/904: Unknown error
[Parsed_fps_0 @ 00000248e7d50e40] Read frame with in pts 1074446100, out pts 28
[Parsed_fps_0 @ 00000248e7d50e40] Dropping frame with pts 28
frame=   28 fps=0.0 q=1.0 size=N/A time=03:08:59.77 bitrate=N/A speed=0.95x    
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=696
[rtsp @ 00000248e765cf00] Sending:
GET_PARAMETER rtsp://192.168.15.195:554/streaming/channels/904 RTSP/1.0

CSeq: 402

User-Agent: Lavf59.20.101

Session: 931848797

Authorization: Digest username="******", realm="709382dda4ccb674edf093d3", nonce="13fca62fc", uri="rtsp://192.168.15.195:554/streaming/channels/904", response="74341df9611f0ac3dc247b402424735b", algorithm="MD5"



--
[NULL @ 00000248e7662640] nal_unit_type: 7(SPS), nal_ref_idc: 3
[NULL @ 00000248e7662640] nal_unit_type: 8(PPS), nal_ref_idc: 3
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=756
[Parsed_fps_0 @ 00000248e7d50e40] Read frame with in pts 1074449070, out pts 28
[Parsed_fps_0 @ 00000248e7d50e40] Dropping frame with pts 28
[Parsed_fps_0 @ 00000248e7d50e40] Read frame with in pts 1074449070, out pts 28
[Parsed_fps_0 @ 00000248e7d50e40] Dropping frame with pts 28
frame=   28 fps=0.0 q=1.0 size=N/A time=03:08:59.77 bitrate=N/A speed=0.949x    
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
frame=   28 fps=0.0 q=1.0 size=N/A time=03:08:59.77 bitrate=N/A speed=0.949x    
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1228
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
[NULL @ 00000248e7662640] reference count 1 overflow
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=804
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
[NULL @ 00000248e7662640] illegal memory management control operation 11
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=836

    Basically it appears to be an issue of ffmpeg silently failing. If it crashed, my software could detect it and I could rerun it, but if it fails silently like this, I need another solution.
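
    One pragmatic workaround in the meantime is to supervise ffmpeg externally and restart it whenever no new snapshot appears within the expected window. A sketch under stated assumptions (placeholder paths/URL, and staleness of the newest snapshot file as the health signal):

        # Sketch of an external watchdog: restart ffmpeg whenever no new
        # snapshot has been written within the expected interval.
        import subprocess
        import time
        from pathlib import Path

        SNAP_DIR = Path(r"C:\somelocalfolder")   # placeholder
        INTERVAL = 7 * 60              # one snapshot every 7 minutes
        GRACE = 3 * INTERVAL           # tolerate a couple of missed frames

        CMD = [
            r"C:\ffmpeg\bin\ffmpeg.exe", "-nostdin", "-rtsp_transport", "tcp",
            "-y", "-timeout", "5000000", "-i", "rtsp://someurl",
            "-q:v", "1", "-an", "-vf", "fps=0.002381,scale=1280:720",
            "-strftime", "1", "-f", "image2",
            str(SNAP_DIR / "%Y-%m-%d_%H-%M-%S.jpg"),
        ]

        def newest_mtime() -> float:
            # Timestamp of the most recent snapshot, or 0.0 if none exist yet.
            return max((f.stat().st_mtime for f in SNAP_DIR.glob("*.jpg")), default=0.0)

        while True:
            proc = subprocess.Popen(CMD)
            while proc.poll() is None:
                time.sleep(60)
                if time.time() - newest_mtime() > GRACE:
                    proc.kill()        # silent stall: force a restart
                    proc.wait()
                    break
            time.sleep(5)              # brief backoff, then relaunch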