Other articles (107)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
    Once it has been activated, a preconfiguration is automatically put in place by MediaSPIP init, allowing the new feature to be operational right away. It is therefore not necessary to go through a configuration step for this.

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

On other sites (18992)

  • ffmpeg - Poll folder for files, and stream as video with rtp

    17 January 2019, by Omer

    (I’m a newbie when it comes to ffmpeg).
    I have an image source which saves files to a given folder at a rate of 30 fps. I want to wait for every (let’s say) 30-frame chunk, encode it to h264 and stream it with RTP to some other app.

    I thought about writing a Python app which just waits for the images and then executes an ffmpeg command. For that I wrote the following code:

    main.py:

     import os
     import Helpers
     import argparse
     import IniParser
     import subprocess
     from functools import partial

     from Queue import Queue
     from threading import Semaphore, Thread


     def Run(config):
         os.chdir(config.Workdir)
         iteration = 1

         q = Queue()
         # Worker thread that launches the queued ffmpeg commands.
         Thread(target=RunProcesses, args=(q, config.AllowedParallelRuns)).start()

         while True:
             # Block until the next chunk of frames has been fully written to disk.
             Helpers.FileCount(config.FramesPathPattern, config.ChunkSize * iteration)

             command = config.FfmpegCommand.format(startNumber=(iteration - 1) * config.ChunkSize,
                                                   vFrames=config.ChunkSize)

             # Queue the ffmpeg invocation for the worker thread to execute.
             runFunction = partial(subprocess.Popen, command)
             q.put(runFunction)

             iteration += 1


     def RunProcesses(queue, semaphoreSize):
         semaphore = Semaphore(semaphoreSize)

         while True:
             runFunction = queue.get()
             Thread(target=HandleProcess, args=(runFunction, semaphore)).start()


     def HandleProcess(runFunction, semaphore):
         # The semaphore caps how many ffmpeg processes run at once.
         semaphore.acquire()

         p = runFunction()
         p.wait()

         semaphore.release()


     if __name__ == '__main__':
         argparser = argparse.ArgumentParser()
         argparser.add_argument("config", type=str, help="Path for the config file")
         args = argparser.parse_args()

         iniFilePath = args.config
         config = IniParser.Parse(iniFilePath)

         Run(config)
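
    For completeness: assuming the config file shown further down is saved under an arbitrary name such as config.ini, the script is invoked as:

     python main.py config.ini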

    Helpers.py (not really relevant):

     import os
     import time
     from glob import glob


     def FileCount(pattern, count):
         count = int(count)

         lastCount = 0
         while True:
             currentCount = glob(pattern)

             if lastCount != currentCount:
                 lastCount = currentCount

             # Proceed once enough frames exist and none of them are still open.
             if len(currentCount) >= count and all([CheckIfClosed(f) for f in currentCount]):
                 break

             time.sleep(0.05)


     def CheckIfClosed(filePath):
         # Renaming a file onto itself fails on Windows while another process
         # still holds it open, so this acts as a "writer is finished" check.
         try:
             os.rename(filePath, filePath)
             return True
         except:
             return False

    I used the following config file:

    Workdir = "C:\Developer\MyProjects\Streaming\OutputStream\PPM"
     ; Workdir is the reference directory against which all relative paths are resolved.
    ; You may still use full paths if you wish.

    FramesPathPattern = "F*.ppm"
    ; The path pattern (wildcards allowed) where the rendered images are stored to.
    ; We use this pattern to detect how many rendered images are available for streaming.
     ; When a chunk of frames is ready, we stream it (or store it to disk).

     ChunkSize = 30 ; Number of frames per chunk.
     ; ChunkSize sets the number of frames we need to wait for in order to execute the ffmpeg command.
     ; If the folder already contains several chunks, it will process the first chunk, then the second, and so on...

     AllowedParallelRuns = 1 ; Number of ffmpeg processes allowed to run in parallel.
     ; This sets how many parallel ffmpeg processes are allowed.
     ; If more than one chunk is available in the folder for processing, we will execute several ffmpeg processes in parallel.
     ; Only when one of the processes finishes will another process be allowed to start.

     FfmpegCommand = "ffmpeg -re -r 30 -start_number {startNumber} -i F%08d.ppm -vframes {vFrames} -vf vflip -f rtp rtp://127.0.0.1:1234" ; Command to execute when a chunk is ready for streaming.
     ; Once a chunk is ready for processing, this is the command that will be executed (same as running it from the terminal).
     ; There is however a minor difference. Since every chunk starts with a different frame number, you can use the
     ; expression "{startNumber}", which automatically takes the value of the matching start frame number.
     ; You can also use "{vFrames}" as an expression for the ChunkSize which was set above in the "ChunkSize" entry.
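
    For illustration, with ChunkSize = 30 the second chunk (iteration 2) gets startNumber = 30 and vFrames = 30, so the command that actually gets executed should be:

     ffmpeg -re -r 30 -start_number 30 -i F%08d.ppm -vframes 30 -vf vflip -f rtp rtp://127.0.0.1:1234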

    Please note that if I set "AllowedParallelRuns = 2" then it allows multiple ffmpeg processes to run simultaneously.

    I then tried to play it with ffplay to see if I’m doing it right.
    The first chunk was streamed fine, but the following chunks weren’t so great: I got a lot of "[sdp @ 0000006de33c9180] RTP: dropping old packet received too late" messages.

    What should I do to get ffplay to play the stream in the order of the incoming images? Is it right to run parallel ffmpeg processes? Is there a better solution to my problem?

    Thank you!
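
    (For reference, one commonly suggested alternative, sketched below, is to keep a single long-lived ffmpeg process and feed it the PPM frames over stdin with -f image2pipe, so the RTP stream is never restarted between chunks; launching a new ffmpeg per chunk starts a fresh RTP session, which ffplay can perceive as old or out-of-order packets. The names stream_frames and FFMPEG_CMD are hypothetical, the exact image2pipe options may need adjusting for the ffmpeg version in use, and the sketch omits the "file still being written" check from Helpers.py for brevity.)

     import glob
     import subprocess
     import time

     # Hypothetical sketch: one long-lived ffmpeg process reading concatenated
     # PPM frames from stdin, so RTP sequence numbers and timestamps stay
     # monotonic instead of restarting with every chunk.
     FFMPEG_CMD = [
         "ffmpeg", "-re", "-f", "image2pipe", "-vcodec", "ppm", "-framerate", "30",
         "-i", "-", "-vf", "vflip", "-f", "rtp", "rtp://127.0.0.1:1234",
     ]

     def stream_frames(pattern="F*.ppm", poll_interval=0.05):
         proc = subprocess.Popen(FFMPEG_CMD, stdin=subprocess.PIPE)
         sent = set()
         while True:
             for path in sorted(glob.glob(pattern)):
                 if path in sent:
                     continue
                 with open(path, "rb") as f:
                     proc.stdin.write(f.read())  # PPM frames can simply be concatenated
                 sent.add(path)
             time.sleep(poll_interval)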

  • lavfi/drawtext: add expansion function eif()

    4 July 2014, by Andrey Utkin
    lavfi/drawtext: add expansion function eif()
    

    It evaluates an expression and outputs it as an integer value, using the specified format.

    Address trac ticket #3699.

    Signed-off-by: Stefano Sabatini <stefasab@gmail.com>

    • [DH] doc/filters.texi
    • [DH] libavfilter/version.h
    • [DH] libavfilter/vf_drawtext.c
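
    By way of illustration (not part of the commit), a drawtext invocation using eif() to print the frame number as a decimal integer might look like the line below; the font path is an assumption, and the exact escaping of the colons depends on how the filtergraph is quoted:

     ffmpeg -i in.mp4 -vf "drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf:text='frame %{eif\:n\:d}':x=10:y=10" out.mp4
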
  • ffmpeg resize maintain aspect - filter not found

    4 December 2015, by paul

    I am trying to create image thumbnails that maintain the aspect ratio of the original but crop it if the dimensions are too far out.
    I found a likely code snippet here:

    ffmpeg -i Einstein_500_459.jpg -vf scale="'if(gt(a,320/240),320,-1)':'if(gt(a,320/240),-1,240)'" Einstein_320x240_fit.png

    but I cannot get ffmpeg to execute it. As is, it generates this error in the terminal:

    ffmpeg -i Einstein_500_459.jpg -vf scale="'if(gt(a,320/240),320,-1)':'if(gt(a,320/240),-1,240)'" Einstein_320x240_fit.png
    ffmpeg version 0.10.12-7:0.10.12-1~precise1 Copyright (c) 2000-2014 the FFmpeg developers
     built on Apr 26 2014 09:49:36 with gcc 4.6.3
     configuration: --arch=amd64 --disable-stripping --enable-pthreads --enable-runtime-cpudetect --extra-version='7:0.10.12-1~precise1' --libdir=/usr/lib/x86_64-linux-gnu --prefix=/usr --enable-bzlib --enable-libdc1394 --enable-libfreetype --enable-frei0r --enable-gnutls --enable-libgsm --enable-libmp3lame --enable-librtmp --enable-libopencv --enable-libopenjpeg --enable-libpulse --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-vaapi --enable-vdpau --enable-libvorbis --enable-libvpx --enable-zlib --enable-gpl --enable-postproc --enable-libcdio --enable-x11grab --enable-libx264 --shlibdir=/usr/lib/x86_64-linux-gnu --enable-shared --disable-static
     libavutil      51. 35.100 / 51. 35.100
     libavcodec     53. 61.100 / 53. 61.100
     libavformat    53. 32.100 / 53. 32.100
     libavdevice    53.  4.100 / 53.  4.100
     libavfilter     2. 61.100 /  2. 61.100
     libswscale      2.  1.100 /  2.  1.100
     libswresample   0.  6.100 /  0.  6.100
     libpostproc    52.  0.100 / 52.  0.100
    Input #0, image2, from 'Einstein_500_459.jpg':
     Duration: 00:00:00.04, start: 0.000000, bitrate: N/A
       Stream #0:0: Video: mjpeg, yuvj444p, 500x459 [SAR 300:300 DAR 500:459], 25 tbr, 25 tbn, 25 tbc
    Incompatible pixel format 'yuvj444p' for codec 'png', auto-selecting format 'rgb24'
    [buffer @ 0x236d720] w:500 h:459 pixfmt:yuvj444p tb:1/1000000 sar:300/300 sws_param:
    [scale @ 0x235d5e0] [Eval @ 0x7fffcb68f270] Missing ')' or too many args in 'if(gt(a,320/240),320,-1)'
    [scale @ 0x235d5e0] [Eval @ 0x7fffcb68f270] Missing ')' or too many args in 'if(gt(a,320/240),-1,240)'
    Error when evaluating the expression 'if(gt(a,320/240),-1,240)'.
    Maybe the expression for out_w:'if(gt(a,320/240),320,-1)' or for out_h:'if(gt(a,320/240),-1,240)' is self-referencing.
    Error opening filters!

    After escaping the single quotes

    ffmpeg -i Einstein_500_459.jpg -vf scale="\'if(gt(a,320/240),320,-1)\':\'if(gt(a,320/240),-1,240)\'" Einstein_320x240_fit.png

    it generates this error:

    ffmpeg -i Einstein_500_459.jpg -vf scale="\'if(gt(a,320/240),320,-1)\':\'if(gt(a,320/240),-1,240)\'" Einstein_320x240_fit.png
    ffmpeg version 0.10.12-7:0.10.12-1~precise1 Copyright (c) 2000-2014 the FFmpeg developers
     built on Apr 26 2014 09:49:36 with gcc 4.6.3
     configuration: --arch=amd64 --disable-stripping --enable-pthreads --enable-runtime-cpudetect --extra-version='7:0.10.12-1~precise1' --libdir=/usr/lib/x86_64-linux-gnu --prefix=/usr --enable-bzlib --enable-libdc1394 --enable-libfreetype --enable-frei0r --enable-gnutls --enable-libgsm --enable-libmp3lame --enable-librtmp --enable-libopencv --enable-libopenjpeg --enable-libpulse --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-vaapi --enable-vdpau --enable-libvorbis --enable-libvpx --enable-zlib --enable-gpl --enable-postproc --enable-libcdio --enable-x11grab --enable-libx264 --shlibdir=/usr/lib/x86_64-linux-gnu --enable-shared --disable-static
     libavutil      51. 35.100 / 51. 35.100
     libavcodec     53. 61.100 / 53. 61.100
     libavformat    53. 32.100 / 53. 32.100
     libavdevice    53.  4.100 / 53.  4.100
     libavfilter     2. 61.100 /  2. 61.100
     libswscale      2.  1.100 /  2.  1.100
     libswresample   0.  6.100 /  0.  6.100
     libpostproc    52.  0.100 / 52.  0.100
    Input #0, image2, from 'Einstein_500_459.jpg':
     Duration: 00:00:00.04, start: 0.000000, bitrate: N/A
       Stream #0:0: Video: mjpeg, yuvj444p, 500x459 [SAR 300:300 DAR 500:459], 25 tbr, 25 tbn, 25 tbc
    Incompatible pixel format 'yuvj444p' for codec 'png', auto-selecting format 'rgb24'
    [buffer @ 0x1897720] w:500 h:459 pixfmt:yuvj444p tb:1/1000000 sar:300/300 sws_param:
    No such filter: '320/240)'
    Error opening filters!

    Is this just a syntax problem, or is my ffmpeg missing something?
    This is Ubuntu 12.04 with what I believe is a full install of ffmpeg, not the libav default. Any help is appreciated.
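
    For reference, a quick worked check of what the expression is meant to do with the input shown in the log: the source is 500x459, so a = 500/459 ≈ 1.089, which is less than 320/240 ≈ 1.333; the expressions should therefore select out_w = -1 and out_h = 240, giving a roughly 261x240 output that fits inside 320x240 while keeping the aspect ratio.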