
Other articles (48)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

  • Emballe médias: what is it for?

    4 February 2011, by

    This plugin manages sites for publishing documents of all types.
    It creates "media", meaning: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a given "media" article;

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in MP3 and Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed in order to retrieve the data needed for search engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
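The conversion step described above amounts to invoking an encoder once per target format. A minimal, hypothetical sketch of how such ffmpeg command lines could be assembled; the codec choices here are assumptions for illustration, not MediaSPIP's actual configuration:

```python
# Hypothetical sketch: build one ffmpeg command line per target format.
# Codec choices (libvpx, libtheora, libx264) are assumptions, not
# MediaSPIP's actual settings.
def ffmpeg_command(src, fmt):
    codecs = {
        "webm": ["-c:v", "libvpx", "-c:a", "libvorbis"],
        "ogv":  ["-c:v", "libtheora", "-c:a", "libvorbis"],
        "mp4":  ["-c:v", "libx264", "-c:a", "aac"],
    }
    out = src.rsplit(".", 1)[0] + "." + fmt
    return ["ffmpeg", "-y", "-i", src] + codecs[fmt] + [out]

for fmt in ("webm", "ogv", "mp4"):
    print(" ".join(ffmpeg_command("upload.avi", fmt)))
```

Building the argument list rather than one shell string keeps filenames with spaces safe when the command is eventually passed to a subprocess call.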

On other sites (7910)

  • MPEG backwards frame decoding using FFmpeg

    24 March 2016, by Gediminas

    I have so-called "blocks" that store MPEG4 frames (I,P,P,P,P...).
    Every "block" starts with an "I" frame and ends just before the next "I" frame.
    (The VOL, "visual_object_sequence_start_code", is always included before the "I" frame.)

    I need to be able to play those "block" frames backwards.
    The trick is that:

    1. It’s not possible to just take the last frame of my block and decode it, because it’s a "P" frame and it needs an intra frame ("I") to be decoded correctly.

    2. I can’t just take my first "I" frame, pass it to ffmpeg’s "avcodec_decode_video" function, and then pass only my last "P" frame, because that last "P" frame depends on the "P" frame before it, right? (As far as I’ve tested this method, my last decoded P frame had artifacts.)

    The way I currently perform backwards playback is to first decode all of my "block" frames to RGB and store them in memory (in most cases 25 frames per block at most). But this method requires a lot of memory, especially if the frame resolution is high, and I have a feeling that this is not the right way to do it...

    So I would like to ask: does anyone have any suggestions on how this "backwards" frame decoding/playing could be performed using FFmpeg?

    Thanks

  • Android RTSP streaming buffer size for delayed playback

    19 November 2012, by frijj2k

    I can access and view RTSP streams from IP cameras on Android via the VideoView component without problems.

    Now I need to play the RTSP stream with a delay (i.e. with a 30 second delay, on-screen playback should run 30 seconds behind the source; the delay needs to be variable, though only at the point of connecting to the source, not during playback).

    I originally thought this would not be a problem, as I could simply change the RTSP buffer duration before connecting to the camera, but unfortunately it seems the buffer size is baked into the firmware and cannot be changed in software. Now I have a horrible feeling that the way forward is to compile a version of FFmpeg for Android, somehow get the stream data out of the library, buffer it and then render it myself, and I have no experience with FFmpeg.

    I am unsure how I would now go about solving this problem and any help or pointers in the right direction would be greatly appreciated.

    Update:

    Sorry, I forgot to mention: the RTSP stream is being accessed over WiFi on a LAN, so there are no huge latency issues from going over the Internet.
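The buffering side of such a solution can be sketched independently of the demuxing (which would still need FFmpeg or similar). The idea, under assumed names, is to stamp each received packet with its arrival time and release it only after the configured delay has elapsed:

```python
import time
from collections import deque

class DelayBuffer:
    """Release pushed packets only after `delay` seconds have elapsed
    since they were received. A hypothetical stand-in for buffering
    demuxed RTSP packets before handing them to a renderer."""
    def __init__(self, delay):
        self.delay = delay
        self._queue = deque()

    def push(self, packet, now=None):
        now = time.monotonic() if now is None else now
        self._queue.append((now, packet))

    def pop_ready(self, now=None):
        now = time.monotonic() if now is None else now
        out = []
        while self._queue and now - self._queue[0][0] >= self.delay:
            out.append(self._queue.popleft()[1])
        return out

buf = DelayBuffer(delay=30)
buf.push("frame-0", now=0.0)
print(buf.pop_ready(now=10.0))  # [] - still inside the 30 s window
print(buf.pop_ready(now=30.0))  # ['frame-0']
```

Since the delay is fixed at connection time, the buffer's memory use is bounded by delay times the stream bitrate, which is worth checking against the device's available RAM for high-bitrate cameras.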

  • asyncio.create_subprocess_shell error : Future exception was never retrieved - BrokenPipeError

    10 June 2021, by Abdelmalek

    Whenever I use asyncio.create_subprocess_shell() with sp.communicate(), I get this error at the end of my program.

    If I run multiple subprocesses using asyncio, the error is printed at the end for each one of them.

    Although it doesn't affect my program, I want to find the source and fix the issue. Thanks for the help!

    Traceback:

    Future exception was never retrieved
    future: <Future finished exception=BrokenPipeError(32, 'The pipe has been ended')>
    Traceback (most recent call last):
      File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\asyncio\subprocess.py", line 153, in _feed_stdin
        await self.stdin.drain()
      File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\asyncio\streams.py", line 387, in drain
        await self._protocol._drain_helper()
      File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\asyncio\streams.py", line 197, in _drain_helper
        await waiter
      File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\asyncio\proactor_events.py", line 379, in _loop_writing
        f.result()
      File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\asyncio\windows_events.py", line 812, in _poll
        value = callback(transferred, key, ov)
      File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\asyncio\windows_events.py", line 538, in finish_send
        return ov.getresult()
    BrokenPipeError: [WinError 109] The pipe has been ended


    Code:


    async def get(cs, url):
        async with cs.get(url) as r:
            b = b''
            while True:
                chunk = await r.content.read(4000000)
                b += chunk
                if not chunk:
                    break

            if int(r.headers['content-length']) < 8000000:
                result = BytesIO(b)
                return [result, 'full']
            else:
                command = f"ffmpeg -y -i - -c:v copy -fs 8000000 -f matroska -"
                sp = await asyncio.create_subprocess_shell(command, stdin=asyncio.subprocess.PIPE,
                    stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE)
                stdout, stderr = await sp.communicate(b)
                sp.wait()

                print(stderr.decode())
                print(len(stdout))
                output = str(stderr)
                index_d = output.find('Duration: ')
                index_t = output.rfind('time=')
                duration = await get_sec(output[index_d + 10:].split(",")[0])
                time_ = await get_sec(output[index_t + 5:].split(" ")[0])
                percentage = f"{round((time_ / duration) * 100)}%"

                result = BytesIO(stdout)
                return [result, 'preview', percentage]

    async def main(urls):
        async with aiohttp.ClientSession() as cs:

            tasks = []
            for url in urls:
                task = asyncio.create_task(get(cs, url))
                tasks.append(task)

            results = []
            for task in tasks:
                result = await task
                results.append(result)

            return results

    loop = asyncio.get_event_loop()
    results = loop.run_until_complete(main(urls))

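The traceback points at communicate()'s internal stdin feeder: with -fs 8000000, ffmpeg stops reading stdin as soon as the size limit is reached and exits, so writing the unread tail of the input breaks the pipe, and on the Windows proactor loop that failure can surface as an unretrieved future. One way around it, sketched under assumed names with create_subprocess_exec and a toy child in place of ffmpeg, is to feed stdin by hand so the BrokenPipeError can be caught where it happens:

```python
import asyncio
import sys

async def run_filter(args, data):
    """Feed `data` to a child process and return (stdout, stderr),
    tolerating the child exiting before it has read all of stdin, as
    ffmpeg does when -fs truncates the output. Sketch only: stdin is
    fed manually instead of through communicate() so a broken pipe is
    handled here rather than ending up in an unretrieved future."""
    sp = await asyncio.create_subprocess_exec(
        *args,
        stdin=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    try:
        sp.stdin.write(data)
        await sp.stdin.drain()
    except (BrokenPipeError, ConnectionResetError):
        pass  # child stopped reading early; expected here, not an error
    sp.stdin.close()
    stdout, stderr = await asyncio.gather(sp.stdout.read(), sp.stderr.read())
    await sp.wait()
    return stdout, stderr

# Toy child that reads only 5 bytes of stdin and exits, like ffmpeg
# stopping once -fs is hit; the large input forces the pipe to break.
child = [sys.executable, "-c",
         "import sys; sys.stdout.buffer.write(sys.stdin.buffer.read(5))"]
out, err = asyncio.run(run_filter(child, b"x" * 1_000_000))
print(out)
```

Note also that `sp.wait()` in the original code is a coroutine that is never awaited; it should be `await sp.wait()`, though communicate() already waits for the process, so the call can simply be dropped.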