Other articles (4)

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • Keeping control of your media in your hands

    13 April 2011, by

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media-sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • The plugin: Podcasts

    14 July 2010, by

    The problem of podcasting is once again a revealing example of the standardisation of data transport on the Internet.
    Two interesting formats exist: the one developed by Apple, strongly geared towards the use of iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "free" and is notably supported by Yahoo and the Miro software.
    File types supported in the feeds
    Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

On other sites (3525)

  • FFmpeg -ss parameter is the video duration, Output file is empty

    16 April 2024, by noun nil

    The main function is to obtain the corresponding video frame based on the input seconds. Before processing, the duration of the video is obtained to determine whether the input is within the video's duration range; if so, the corresponding command is executed.

    


    > ffprobe input.mp4
  ...
  Duration: 00:00:28.05, start: 0.000000, bitrate: 1136 kb/s


    


    > ffmpeg -ss 00:00:28 -i input.mp4 -frames:v 1 output.png
 ffmpeg version 7.0 Copyright (c) 2000-2024 the FFmpeg developers
  built with Apple clang version 15.0.0 (clang-1500.3.9.4)
  configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopenvino --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon
  libavutil      59.  8.100 / 59.  8.100
  libavcodec     61.  3.100 / 61.  3.100
  libavformat    61.  1.100 / 61.  1.100
  libavdevice    61.  1.100 / 61.  1.100
  libavfilter    10.  1.100 / 10.  1.100
  libswscale      8.  1.100 /  8.  1.100
  libswresample   5.  1.100 /  5.  1.100
  libpostproc    58.  1.100 / 58.  1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'input.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf58.29.100
    description     : Packed by Bilibili XCoder v2.0.2
  Duration: 00:00:28.05, start: 0.000000, bitrate: 1136 kb/s
  Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709, progressive), 720x1280 [SAR 1:1 DAR 9:16], 1005 kb/s, 25.27 fps, 25.25 tbr, 16k tbn (default)
      Metadata:
        handler_name    : VideoHandler
        vendor_id       : [0][0][0][0]
  Stream #0:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
      Metadata:
        handler_name    : SoundHandler
        vendor_id       : [0][0][0][0]
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> png (native))
Press [q] to stop, [?] for help
[swscaler @ 0x1187a0000] [swscaler @ 0x110e48000] No accelerated colorspace conversion found from yuv420p to rgb24.
[swscaler @ 0x1187a0000] [swscaler @ 0x110e58000] No accelerated colorspace conversion found from yuv420p to rgb24.
[swscaler @ 0x1187a0000] [swscaler @ 0x110e68000] No accelerated colorspace conversion found from yuv420p to rgb24.
[swscaler @ 0x1187a0000] [swscaler @ 0x110e78000] No accelerated colorspace conversion found from yuv420p to rgb24.
[swscaler @ 0x1187a0000] [swscaler @ 0x110e88000] No accelerated colorspace conversion found from yuv420p to rgb24.
[swscaler @ 0x1187a0000] [swscaler @ 0x110e98000] No accelerated colorspace conversion found from yuv420p to rgb24.
[swscaler @ 0x1187a0000] [swscaler @ 0x110ea8000] No accelerated colorspace conversion found from yuv420p to rgb24.
[swscaler @ 0x1187a0000] [swscaler @ 0x110eb8000] No accelerated colorspace conversion found from yuv420p to rgb24.
[swscaler @ 0x1187a0000] [swscaler @ 0x110ec8000] No accelerated colorspace conversion found from yuv420p to rgb24.
[vost#0:0/png @ 0x12ce06840] No filtered frames for output stream, trying to initialize anyway.
Output #0, image2, to 'output.png':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    description     : Packed by Bilibili XCoder v2.0.2
    encoder         : Lavf61.1.100
  Stream #0:0(und): Video: png, rgb24(progressive), 720x1280 [SAR 1:1 DAR 9:16], q=2-31, 200 kb/s, 25.25 fps, 16k tbn (default)
      Metadata:
        handler_name    : VideoHandler
        vendor_id       : [0][0][0][0]
        encoder         : Lavc61.3.100 png
[out#0/image2 @ 0x600000e983c0] video:0KiB audio:0KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: unknown
[out#0/image2 @ 0x600000e983c0] Output file is empty, nothing was encoded(check -ss / -t / -frames parameters if used)
frame=    0 fps=0.0 q=0.0 Lsize=N/A time=N/A bitrate=N/A speed=N/A    



    


    -ss 00:00:28 is within the video's duration range, but still no frame can be obtained...

    


    With -ss 00:00:27, a video frame can be obtained.
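
    A minimal Java sketch of the workflow described above: read the duration with ffprobe, reject out-of-range requests, and extract one frame with ffmpeg. It assumes the reason -ss 00:00:28 yields nothing is that the last frame's timestamp lies below the reported 28.05 s duration, so the seek time is clamped slightly below the end; the class and method names, the file names and the 0.1 s margin are placeholders, not part of the original question:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    public class FrameAtSecond {
        // Run a command and return the first line it prints (enough for ffprobe's
        // single-value output below).
        static String firstLineOf(String... cmd) throws Exception {
            Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
            try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
                String line = r.readLine();
                p.waitFor();
                return line;
            }
        }

        public static void main(String[] args) throws Exception {
            String input = "input.mp4";   // placeholder file name
            double requested = 28.0;      // seconds requested by the caller

            // "format=duration" prints a plain number such as "28.053000"
            double duration = Double.parseDouble(firstLineOf(
                    "ffprobe", "-v", "error", "-show_entries", "format=duration",
                    "-of", "default=noprint_wrappers=1:nokey=1", input));

            if (requested > duration) {
                System.err.println("Requested time is past the end of the video");
                return;
            }

            // Clamp just below the end so the seek cannot land past the last frame
            // (0.1 s is an arbitrary margin, roughly two frames at ~25 fps).
            double seek = Math.min(requested, duration - 0.1);

            new ProcessBuilder("ffmpeg", "-y", "-ss", String.valueOf(seek),
                    "-i", input, "-frames:v", "1", "output.png")
                    .inheritIO().start().waitFor();
        }
    }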

    


  • JavaCV error AAC with no global headers is currently not supported

    19 July 2023, by zhoutian

    I'm trying to transcode DHAV (a container format) to RTSP with JavaCV (FFmpegFrameGrabber + FFmpegFrameRecorder). It works fine when transcoding DHAV to RTMP, but when I switch to RTSP, the following error occurs:

    


    
Error: [rtsp @ 0000002318df7c30] AAC with no global headers is currently not supported.

Exception in thread "pool-1-thread-2" java.lang.RuntimeException: org.bytedeco.javacv.FFmpegFrameRecorder$Exception: avformat_write_header error() error -1094995529: Could not write header to 'rtsp://127.0.0.1:8554/myapp/orange2' (For more details, make sure FFmpegLogCallback.set() has been called.)
    at org.jfjy.jvc.GetBytes2PipedStreamAndPushRTMP$2.run(GetBytes2PipedStreamAndPushRTMP.java:116)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: org.bytedeco.javacv.FFmpegFrameRecorder$Exception: avformat_write_header error() error -1094995529: Could not write header to 'rtsp://127.0.0.1:8554/myapp/orange2' (For more details, make sure FFmpegLogCallback.set() has been called.)
    at org.bytedeco.javacv.FFmpegFrameRecorder.startUnsafe(FFmpegFrameRecorder.java:969)
    at org.bytedeco.javacv.FFmpegFrameRecorder.start(FFmpegFrameRecorder.java:437)
    at org.bytedeco.javacv.FFmpegFrameRecorder.start(FFmpegFrameRecorder.java:432)
    at org.jfjy.jvc.GetBytes2PipedStreamAndPushRTMP.grabAndPush(GetBytes2PipedStreamAndPushRTMP.java:215)
    at org.jfjy.jvc.GetBytes2PipedStreamAndPushRTMP$2.run(GetBytes2PipedStreamAndPushRTMP.java:100)
    ... 3 more


    


    After googling, I tried the following:

    


      

    • set avFormatContext.flags(avformat.AVFMT_GLOBALHEADER); no use

    • add "-flags +global_header" or "-rtpflags latm" to the command line, but I don't know how to do this in JavaCV (see the sketch after this list)

    • set recorder.setAudioOption("flags", "+global_header"); no use

    • set recorder.setAudioOption("flags", "global_header"); recorder.setVideoOption("flags", "global_header"); no use
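
    Regarding the second item above: JavaCV's FrameRecorder also has a generic setOption() (alongside setVideoOption()/setAudioOption()) whose key/value pairs FFmpegFrameRecorder passes on to avformat_write_header(), so a muxer option such as "-rtpflags latm" might be expressible that way. A minimal sketch under that assumption (the class and method names are made up for illustration); it is not a verified fix:

    import org.bytedeco.ffmpeg.global.avcodec;
    import org.bytedeco.javacv.FFmpegFrameRecorder;

    public class RtspLatmSketch {
        // Hypothetical helper: build a recorder that passes the RTP/RTSP muxer
        // option "rtpflags=latm" via FrameRecorder.setOption(). The option is the
        // assumed equivalent of "-rtpflags latm" on the ffmpeg command line.
        static FFmpegFrameRecorder buildRtspRecorder(String url, int width, int height, int channels) {
            FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(url, width, height, channels);
            recorder.setFormat("rtsp");
            recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
            recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
            recorder.setOption("rtpflags", "latm"); // forwarded to avformat_write_header()
            return recorder;
        }
    }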


    


    Could someone guide me on this? It would be much appreciated.

    


    About the DHAV input (a part of the probe output), maybe this can help:

    


    Input #0, dhav, from '.\videostream':
  Duration: 00:00:25.00, start: 1689678599.000000, bitrate: 2360 kb/s
  Stream #0:0: Audio: pcm_s16le, 16000 Hz, 1 channels, s16, 256 kb/s
  Stream #0:1: Video: h264 (High), yuvj420p(pc, bt470bg/bt470bg/bt709), 720x1280, 25 fps, 50 tbr, 1k tbn


    


    The key code is:

    


    public static synchronized void grabAndPush(InputStream inputStream, String pushAddress, String pushPotocol) throws Exception {
        avutil.av_log_set_level(AV_LOG_DEBUG);
        FFmpegLogCallback.set();

        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inputStream,0);

        long startTime = System.currentTimeMillis();
      
        grabber.start();

     

        AVFormatContext avFormatContext = grabber.getFormatContext();


        int streamNum = avFormatContext.nb_streams();

        if (streamNum < 1) {
            log.error("no media!");
            return;
        }

        int frameRate = (int) grabber.getVideoFrameRate();
        if (0 == frameRate) {
            frameRate = 15;
        }
        log.info("frameRate[{}],duration[{}]secs,number streams[{}]",
                frameRate,
                avFormatContext.duration() / 1000000,
                avFormatContext.nb_streams());

        for (int i = 0; i < streamNum; i++) {
            AVStream avStream = avFormatContext.streams(i);
            AVCodecParameters avCodecParameters = avStream.codecpar();
            log.info("stream index[{}],codec type[{}],codec ID[{}]", i, avCodecParameters.codec_type(), avCodecParameters.codec_id());
        }

        int frameWidth = grabber.getImageWidth();
        int frameHeight = grabber.getImageHeight();
        int audioChannels = grabber.getAudioChannels();

        log.info("frameWidth[{}],frameHeight[{}],audioChannels[{}]",
                frameWidth,
                frameHeight,
                audioChannels);

        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(pushAddress,
                frameWidth,
                frameHeight,
                audioChannels);

        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setInterleaved(true);

        switch (pushPotocol) {
            case "rtsp" -> {
                recorder.setFormat("rtsp");
            }
            case "rtmp" -> {
                recorder.setFormat("flv");
            }
        }
        log.info("push protocol:{}| grabber format:{} | recorder format:{}",pushPotocol,grabber.getFormat(),recorder.getFormat());

        recorder.setFrameRate(frameRate);
        recorder.setAudioCodec(AV_CODEC_ID_AAC);
        log.info("grabber audio codec name :{}|recorder audio codec name :{}",grabber.getAudioCodecName(),recorder.getAudioCodecName());

        recorder.setGopSize(frameRate * 2);

        recorder.setAudioChannels(grabber.getAudioChannels());

        startTime = System.currentTimeMillis();


        avFormatContext.max_interleave_delta(0);
        avFormatContext.flags(avformat.AVFMT_TS_NONSTRICT);
        recorder.setTimestamp(0);
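        // start(avFormatContext) below prepares the recorder for packet copy:
        // stream parameters come from the grabber's input context so that
        // grabPacket()/recordPacket() can remux the DHAV streams without re-encoding.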
  
        recorder.start(avFormatContext);

     

        Frame frame;


        int videoFrameNum = 0;
        int audioFrameNum = 0;
        int dataFrameNum = 0;
    
        AVPacket packet;
        long lastDTS = 0;
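        // Fix up timestamps while copying packets: reuse dts when pts is missing,
        // synthesize increasing values when both are missing, and keep dts
        // monotonically increasing so the muxer does not reject out-of-order packets.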
        while ((packet = grabber.grabPacket()) != null) {
            if (packet.pts() == AV_NOPTS_VALUE) {
                if (packet.dts() != AV_NOPTS_VALUE) {
                    packet.pts(packet.dts());
                    lastDTS = packet.dts();
                } else {
                    packet.pts(lastDTS + 1);
                    packet.dts(packet.pts());
                    lastDTS = packet.pts();
                }
            } else {
                if (packet.dts() != AV_NOPTS_VALUE) {
                    if (packet.dts() < lastDTS) {
                        packet.dts(lastDTS + 1);
                    }
                    lastDTS = packet.dts();
                } else {
                    packet.dts(packet.pts());
                    lastDTS = packet.dts();
                }
            }

            if (packet.pts() < packet.dts()) {
                packet.pts(packet.dts());
            }


            recorder.recordPacket(packet);
            Thread.sleep(1);
        }

        log.info("push complete,videoFrameNum[{}],audioFrameNum[{}],dataFrameNum[{}],耗时[{}]秒",
                videoFrameNum,
                audioFrameNum,
                dataFrameNum,
                (System.currentTimeMillis() - startTime) / 1000);

   
        recorder.close();
        grabber.close();
    }


    


  • aarch64 : hevc : Produce plain neon versions of qpel_uni_hv

    22 March 2024, by Martin Storsjö
    aarch64 : hevc : Produce plain neon versions of qpel_uni_hv
    

    As the plain neon qpel_h functions process two rows at a time,
    we need to allocate storage for h+8 rows instead of h+7.

    By allocating storage for h+8 rows, incrementing the stack
    pointer won't end up at the right spot in the end. Store the
    intended final stack pointer value in a register x14 which we
    store on the stack.

    AWS Graviton 3 :
    put_hevc_qpel_uni_hv4_8_c : 384.2
    put_hevc_qpel_uni_hv4_8_neon : 127.5
    put_hevc_qpel_uni_hv4_8_i8mm : 85.5
    put_hevc_qpel_uni_hv6_8_c : 705.5
    put_hevc_qpel_uni_hv6_8_neon : 224.5
    put_hevc_qpel_uni_hv6_8_i8mm : 176.2
    put_hevc_qpel_uni_hv8_8_c : 1136.5
    put_hevc_qpel_uni_hv8_8_neon : 216.5
    put_hevc_qpel_uni_hv8_8_i8mm : 214.0
    put_hevc_qpel_uni_hv12_8_c : 2259.5
    put_hevc_qpel_uni_hv12_8_neon : 498.5
    put_hevc_qpel_uni_hv12_8_i8mm : 410.7
    put_hevc_qpel_uni_hv16_8_c : 3824.7
    put_hevc_qpel_uni_hv16_8_neon : 670.0
    put_hevc_qpel_uni_hv16_8_i8mm : 603.7
    put_hevc_qpel_uni_hv24_8_c : 8113.5
    put_hevc_qpel_uni_hv24_8_neon : 1474.7
    put_hevc_qpel_uni_hv24_8_i8mm : 1351.5
    put_hevc_qpel_uni_hv32_8_c : 14744.5
    put_hevc_qpel_uni_hv32_8_neon : 2599.7
    put_hevc_qpel_uni_hv32_8_i8mm : 2266.0
    put_hevc_qpel_uni_hv48_8_c : 32800.0
    put_hevc_qpel_uni_hv48_8_neon : 5650.0
    put_hevc_qpel_uni_hv48_8_i8mm : 5011.7
    put_hevc_qpel_uni_hv64_8_c : 57856.2
    put_hevc_qpel_uni_hv64_8_neon : 9863.5
    put_hevc_qpel_uni_hv64_8_i8mm : 8767.7

    Signed-off-by : Martin Storsjö <martin@martin.st>

    • [DH] libavcodec/aarch64/hevcdsp_init_aarch64.c
    • [DH] libavcodec/aarch64/hevcdsp_qpel_neon.S