
Other articles (36)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation by users as well as developers, including:
    • critique of existing features and functions
    • articles contributed by developers, administrators, content producers and editors
    • screenshots to illustrate the above
    • translations of existing documentation into other languages
    To contribute, register to the project users’ mailing (...)

  • Selection of projects using MediaSPIP

    2 May 2011

    The examples below are representative of how MediaSPIP is used in specific projects.
    MediaSPIP farm @ Infini
    The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and at the national level, among the half-dozen such associations. Its members (...)

On other sites (4957)

  • VP9 encoding limited to 4 threads?

    14 July 2020, by kellerkindt

    I am considering using VP9 to encode my BluRays in the future, since it's an open-source codec. But I cannot get Handbrake or ffmpeg to use more than 50% (4) of my (8) cores. The encoding time is therefore much worse than with x264/x265, which use all cores.

    In Handbrake I just set the encoder to VP9 and CQ 19. It makes no difference whether I add threads 8, threads 16, or threads 64 in the parameters field.

    Testing ffmpeg on the command line (-c:v libvpx-vp9 -crf 19 -threads 16 -tile-columns 6 -frame-parallel 1 -speed 0) also does not use any more CPU threads.

    Is the current encoder not capable of encoding on more than 4 threads, or am I doing something wrong?

    • Linux Mint 18
    • handbrake 0.10.2+ds1-2build1
    • ffmpeg 2.8.10-0ubuntu0.16.04.1
    • libvpx3 1.5.0-2ubuntu1

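    For what it's worth, libvpx-vp9's -threads value is only an upper bound: without row-based multithreading, the encoder can roughly use one thread per tile column, and the number of tile columns is capped by the frame width, so 1080p content typically tops out at 4 tile columns. That matches the 4-core ceiling described above. Row-based multithreading (-row-mt 1) lifts this limit, but it needs a newer stack (roughly libvpx 1.7+ with FFmpeg 3.3+) than the ffmpeg 2.8 / libvpx 1.5 listed here. A minimal sketch of an invocation that should scale past 4 threads on a recent build, with placeholder file names, in the Python subprocess style used elsewhere on this page:

    import subprocess

    # Hypothetical file names. -row-mt requires a reasonably new stack
    # (roughly libvpx >= 1.7 with FFmpeg >= 3.3); the versions listed
    # above predate it, which would explain the 4-thread cap.
    subprocess.run([
        "ffmpeg", "-i", "input.mkv",
        "-c:v", "libvpx-vp9", "-crf", "19", "-b:v", "0",
        "-tile-columns", "2",  # log2 value: 2^2 = 4 tile columns
        "-row-mt", "1",        # row-based multithreading within tiles
        "-threads", "8",
        "output.webm",
    ], check=True)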
  • Error while opening encoder for output stream #0:1 WebM VP9 live streaming

    29 August 2019, by Salem

    The source video I use is an H.264 m3u8 live stream, and this is the command I tried:

    ffmpeg -re -i "http://sorce.com/live.m3u8" -c:v libvpx-vp9 -s 480x360 -keyint_min 60 \
    -g 60 -speed 5 -tile-columns 4 -frame-parallel 1 -threads 8 -static-thresh 0 \
    -max-intra-rate 300 -deadline realtime -lag-in-frames 0 -error-resilient 1 \
    -b:v 300k -c:a libvorbis  -b:a 64k -ar 44100 -f webm_chunk -audio_chunk_duration 2000 \
    -header "/var/www/example.com/live/glass_360.hdr" -chunk_start_index 1  \
    /var/www/example.com/live/glass_360_%d.chk  

    I picked up this code from wiki.webmproject.org. Most of the time, I get this error message:

    [libvorbis @ 0x5617bae0c240] encoder setup failed
    Error initializing output stream 0:1 -- Error while opening encoder for output stream #0:1 - maybe incorrect parameters such as bit_rate, rate, width or height

    Here is the FFmpeg command output:

    Input #0, mpegts, from 'http://sorce.com/live.m3u8':
     Duration: N/A, start: 1369.000978, bitrate: N/A
     Program 1
       Metadata:
         service_name    : Service01
         service_provider: FFmpeg
       Stream #0:0[0x100]: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(progressive), 1280x720 [SAR 1:1 DAR 16:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
       Stream #0:1[0x101]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, mono, fltp, 127 kb/s
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> vp9 (libvpx-vp9))
     Stream #0:1 -> #0:1 (aac (native) -> vorbis (libvorbis))

    I already added the -map options before the video and audio codec arguments, but I got the same error:

    -map 0:v:0 -c:v libvpx-vp9 and -map 0:a:0 -c:a libvorbis

    I tried a new broadcast command; it works only after I disable the audio:

    VP9_DASH_PARAMS="-tile-columns 4 -frame-parallel 1 -speed 6" && \
    ffmpeg -y -re -i http://sorce.com/live.m3u8 -c:v libvpx-vp9 -s 480x360 -b:v 150k \
    -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 \
    video_1280x720_500k.webm && sleep 1 && ffmpeg -f webm_dash_manifest \
    -i video_1280x720_500k.webm -c copy -f webm_dash_manifest \
    -adaptation_sets "id=0" manifest.mpd

    This command didn't create manifest.mpd; it only created video_1280x720_500k.webm.
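
    Two observations, offered as guesses rather than a confirmed fix. First, since the source is a live stream, the first ffmpeg in the && chain never exits, so the sleep 1 && ffmpeg ... manifest.mpd part is never reached, which would explain why only the .webm appears. Second, the WebM wiki's live-streaming example writes audio and video through two separate webm_chunk outputs rather than muxing both into one chunked output. A sketch of that two-output layout (the audio header and chunk names are made up for illustration, and none of this is tested against this source):

    import subprocess

    cmd = [
        "ffmpeg", "-re", "-i", "http://sorce.com/live.m3u8",
        # video-only chunked output
        "-map", "0:v:0", "-c:v", "libvpx-vp9", "-s", "480x360", "-b:v", "300k",
        "-keyint_min", "60", "-g", "60", "-speed", "5", "-deadline", "realtime",
        "-f", "webm_chunk",
        "-header", "/var/www/example.com/live/glass_360.hdr",
        "-chunk_start_index", "1",
        "/var/www/example.com/live/glass_360_%d.chk",
        # audio-only chunked output (hypothetical .hdr/.chk names)
        "-map", "0:a:0", "-c:a", "libvorbis", "-b:a", "64k", "-ar", "44100",
        "-f", "webm_chunk", "-audio_chunk_duration", "2000",
        "-header", "/var/www/example.com/live/glass_360_audio.hdr",
        "-chunk_start_index", "1",
        "/var/www/example.com/live/glass_360_audio_%d.chk",
    ]
    subprocess.run(cmd, check=True)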

  • How to pipe multiple images, being created in parallel with an index, to ffmpeg so that it can match the speed of image creation?

    23 September 2020, by vishwas.mittal

    We have a system that spews out 4-channel PNG images frame by frame (we control the output format of these images as well, so we can use something else as long as it supports transparency). Right now, we wait for all the images and then encode them with ffmpeg into a WebM video file with VP8 (the libvpx encoder). But we now want to pipe these images to FFmpeg as they are produced, so that the WebM video is encoded simultaneously and we don't wait for ffmpeg to encode all the images afterwards.

    This is the current command, in Python syntax:

    import subprocess

    args = ['/usr/bin/ffmpeg', '-hide_banner', '-y', '-loglevel', 'info', '-f', 'rawvideo', '-pix_fmt', 'bgra', '-s', '1573x900', '-framerate', '30', '-i', '-', '-i', 'audio.wav', '-c:v', 'libvpx', '-b:v', '0', '-crf', '30', '-tile-columns', '2', '-quality', 'good', '-speed', '4', '-threads', '16', '-auto-alt-ref', '0', '-g', '300000', '-map', '0:v:0', '-map', '1:a:0', '-shortest', 'video.webm']
    # for ease of reading:
    # /usr/bin/ffmpeg -hide_banner -y -loglevel info -f rawvideo -pix_fmt bgra -s 1573x900 -framerate 30 -i - -i audio.wav -c:v libvpx -b:v 0 -crf 30 -tile-columns 2 -quality good -speed 4 -threads 16 -auto-alt-ref 0 -g 300000 -map 0:v:0 -map 1:a:0 -shortest video.webm

    proc = subprocess.Popen(args, stdin=subprocess.PIPE)


    Here is an example of passing a frame to the FFmpeg process's stdin:

    import os
    import time

    import cv2
    import numpy as np

    for frame_path in frame_path_list:
        # wait for the next frame to get ready
        while not os.path.exists(frame_path):
            time.sleep(0.25)
        frame = cv2.imread(frame_path, cv2.IMREAD_UNCHANGED)

        # write the raw BGRA bytes to ffmpeg's stdin
        proc.stdin.write(frame.astype(np.uint8).tobytes())


    The current speed of this process is 0.135x, which is a huge bottleneck for us. Earlier, when we took input as -pattern_type glob -i images/*.png, we got around 1x-1.2x on a single core. So our conclusion is that we're bottlenecked by stdin, and we're looking for ways to pass input through multiple sources or somehow help ffmpeg parallelize this effort. A few options we're considering (see the sketch below):


    • Somehow feed the frames to different pipes and make ffmpeg read from them.
    • Append a new image to ffmpeg without re-encoding the whole video, but we didn't find a way to do this while giving input images directly.

    But we haven't been able to get either of these working, and we're open to any other solutions as well. We'd really appreciate help on this. Thanks!
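
    One thing worth testing before reaching for multiple pipes: in the loop above, the busy-wait and the PNG decode (cv2.imread) run serially with the stdin writes, so ffmpeg may be starving on input rather than stdin itself being slow. Below is a minimal sketch that decodes frames in a small thread pool while a single writer keeps ffmpeg's stdin in frame order; the frame naming and the trimmed ffmpeg arguments are assumptions, not the asker's exact setup.

    import os
    import subprocess
    import time
    from concurrent.futures import ThreadPoolExecutor

    import cv2
    import numpy as np

    # Assumed frame naming; substitute the real frame_path_list.
    frame_path_list = [f"images/frame_{i:05d}.png" for i in range(3000)]

    # Trimmed version of the ffmpeg command from the question above.
    args = ["/usr/bin/ffmpeg", "-y", "-f", "rawvideo", "-pix_fmt", "bgra",
            "-s", "1573x900", "-framerate", "30", "-i", "-",
            "-c:v", "libvpx", "-b:v", "0", "-crf", "30", "video.webm"]
    proc = subprocess.Popen(args, stdin=subprocess.PIPE)

    def load_frame(path):
        # Wait for the producer, then decode; cv2.imread releases the GIL,
        # so several decodes genuinely overlap across threads.
        while not os.path.exists(path):
            time.sleep(0.05)
        return cv2.imread(path, cv2.IMREAD_UNCHANGED).astype(np.uint8).tobytes()

    with ThreadPoolExecutor(max_workers=4) as pool:
        # map() yields results in submission order, so the single writer
        # below keeps frames in order even though decoding is concurrent.
        for payload in pool.map(load_frame, frame_path_list):
            proc.stdin.write(payload)

    proc.stdin.close()
    proc.wait()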