
Media (1)

Keyword: - Tags -/Christian Nold

Other articles (76)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources in standalone form.
    For a working installation, all the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two following images for a comparison.
    To use it, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Updating from version 0.1 to 0.2

    24 June 2013, by

    An explanation of the notable changes when moving from MediaSPIP version 0.1 to version 0.2. What's new?
    Software dependencies: the latest versions of FFMpeg (>= v1.2.1) are used; the dependencies for Smush are installed; MediaInfo and FFprobe are installed for metadata extraction; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

On other sites (13762)

  • ffmpeg GRAY16 stream over network

    28 November 2023, by Norbert P.

    I'm working on a school project where we need to use depth cameras. The camera produces color and depth (in other words, a 16-bit grayscale image). We decided to use ffmpeg, as compression could be very useful later on. For now we have a basic stream running from one PC to another. These settings include:

    • rtmp
    • flv as container
    • pixel format AV_PIX_FMT_YUV420P
    • codec AV_CODEC_ID_H264

    The problem we are having is with the grayscale image. Not every codec can cope with this format, and not every protocol works with a given codec. I got some settings "working", but the receiver side just gets stuck in the avformat_open_input() call. I have also tested it from the command line, with ffmpeg listening for a connection, and the same thing happens.
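
    A minimal sketch, not from the code below and with illustrative names and timeout, of how the blocking open could be bounded with an AVIOInterruptCB, so a missing stream makes avformat_open_input() return an error instead of hanging:

extern "C" {
#include <libavformat/avformat.h>
#include <libavutil/time.h>
}

// Deadline-based interrupt: any blocking libavformat call aborts with
// AVERROR_EXIT once the deadline has passed.
struct Deadline { int64_t expire_us; };

static int interrupt_cb(void* opaque) {
    const Deadline* d = static_cast<const Deadline*>(opaque);
    return av_gettime_relative() > d->expire_us;  // non-zero aborts the call
}

static int open_input_with_deadline(AVFormatContext** fmt_ctx, const char* url,
                                    AVDictionary** opts, int64_t timeout_us) {
    Deadline d{ av_gettime_relative() + timeout_us };
    *fmt_ctx = avformat_alloc_context();
    (*fmt_ctx)->interrupt_callback.callback = interrupt_cb;
    (*fmt_ctx)->interrupt_callback.opaque = &d;
    int ret = avformat_open_input(fmt_ctx, url, nullptr, opts);
    if (*fmt_ctx)  // clear the callback so later calls do not see a dangling pointer
        (*fmt_ctx)->interrupt_callback = AVIOInterruptCB{};
    return ret;
}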

    


    I include a minimal "working" example of the client code. The server can be tested with "ffmpeg.exe -f apng -listen 1 -i rtmp://localhost:9999/stream/stream1 -c copy -f apng -listen 1 rtmp://localhost:2222/live/l" or with the code below. I get no warnings; ffmpeg is the newest version, installed with "vcpkg install --triplet x64-windows ffmpeg[ffmpeg,ffprobe,zlib]" on Windows or via the package manager on Linux.

    


    The question: did I miss something? How do I get it to work? If you have any better ideas I would gladly consider them. In the end I need 16 bits of lossless transmission; it could be split between channels etc., which I also tried with the same effect.

    


    Client code that would have the camera and connect to the server:

    


    extern "C" {&#xA;#include <libavutil></libavutil>opt.h>&#xA;#include <libavcodec></libavcodec>avcodec.h>&#xA;#include <libavutil></libavutil>channel_layout.h>&#xA;#include <libavutil></libavutil>common.h>&#xA;#include <libavformat></libavformat>avformat.h>&#xA;#include <libavcodec></libavcodec>avcodec.h>&#xA;#include <libavutil></libavutil>imgutils.h>&#xA;}&#xA;&#xA;int main() {&#xA;&#xA;    std::string container = "apng";&#xA;    AVCodecID codec_id = AV_CODEC_ID_APNG;&#xA;    AVPixelFormat pixFormat = AV_PIX_FMT_GRAY16BE;&#xA;&#xA;    AVFormatContext* format_ctx;&#xA;    AVCodec* out_codec;&#xA;    AVStream* out_stream;&#xA;    AVCodecContext* out_codec_ctx;&#xA;    AVFrame* frame;&#xA;    uint8_t* data;&#xA;&#xA;    std::string server = "rtmp://localhost:9999/stream/stream1";&#xA;&#xA;    int width = 1280, height = 720, fps = 30, bitrate = 1000000;&#xA;&#xA;    //initialize format context for output with flv and no filename&#xA;    avformat_alloc_output_context2(&amp;format_ctx, nullptr, container.c_str(), server.c_str());&#xA;    if (!format_ctx) {&#xA;        return 1;&#xA;    }&#xA;&#xA;    //AVIOContext for accessing the resource indicated by url&#xA;    if (!(format_ctx->oformat->flags &amp; AVFMT_NOFILE)) {&#xA;        int avopen_ret = avio_open(&amp;format_ctx->pb, server.c_str(),&#xA;            AVIO_FLAG_WRITE);// , nullptr, nullptr);&#xA;        if (avopen_ret &lt; 0) {&#xA;            fprintf(stderr, "failed to open stream output context, stream will not work\n");&#xA;            return 1;&#xA;        }&#xA;    }&#xA;&#xA;&#xA;    const AVCodec* tmp_out_codec = avcodec_find_encoder(codec_id);&#xA;    //const AVCodec* tmp_out_codec = avcodec_find_encoder_by_name("hevc");&#xA;    out_codec = const_cast(tmp_out_codec);&#xA;    if (!(out_codec)) {&#xA;        fprintf(stderr, "Could not find encoder for &#x27;%s&#x27;\n",&#xA;            avcodec_get_name(codec_id));&#xA;&#xA;        return 1;&#xA;    }&#xA;&#xA;    out_stream = avformat_new_stream(format_ctx, out_codec);&#xA;    if (!out_stream) {&#xA;        fprintf(stderr, "Could not allocate stream\n");&#xA;        return 1;&#xA;    }&#xA;&#xA;    out_codec_ctx = avcodec_alloc_context3(out_codec);&#xA;&#xA;    const AVRational timebase = { 60000, fps };&#xA;    const AVRational dst_fps = { fps, 1 };&#xA;    av_log_set_level(AV_LOG_VERBOSE);&#xA;    //codec_ctx->codec_tag = 0;&#xA;    //codec_ctx->codec_id = codec_id;&#xA;    out_codec_ctx->codec_type = AVMEDIA_TYPE_VIDEO;&#xA;    out_codec_ctx->width = width;&#xA;    out_codec_ctx->height = height;&#xA;    out_codec_ctx->gop_size = 1;&#xA;    out_codec_ctx->time_base = timebase;&#xA;    out_codec_ctx->pix_fmt = pixFormat;&#xA;    out_codec_ctx->framerate = dst_fps;&#xA;    out_codec_ctx->time_base = av_inv_q(dst_fps);&#xA;    out_codec_ctx->bit_rate = bitrate;&#xA;    //if (fctx->oformat->flags &amp; AVFMT_GLOBALHEADER)&#xA;    //{&#xA;    //    codec_ctx->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;&#xA;    //}&#xA;&#xA;    out_stream->time_base = out_codec_ctx->time_base; //will be set afterwards by avformat_write_header to 1/1000&#xA;&#xA;    int ret = avcodec_parameters_from_context(out_stream->codecpar, out_codec_ctx);&#xA;    if (ret &lt; 0)&#xA;    {&#xA;        fprintf(stderr, "Could not initialize stream codec parameters!\n");&#xA;        return 1;&#xA;    }&#xA;&#xA;    AVDictionary* codec_options = nullptr;&#xA;    av_dict_set(&amp;codec_options, "tune", "zerolatency", 0);&#xA;&#xA;    // open video encoder&#xA;    ret = avcodec_open2(out_codec_ctx, out_codec, 
&amp;codec_options);&#xA;    if (ret &lt; 0)&#xA;    {&#xA;        fprintf(stderr, "Could not open video encoder!\n");&#xA;        return 1;&#xA;    }&#xA;    av_dict_free(&amp;codec_options);&#xA;&#xA;    out_stream->codecpar->extradata_size = out_codec_ctx->extradata_size;&#xA;    out_stream->codecpar->extradata = static_cast(av_mallocz(out_codec_ctx->extradata_size));&#xA;    memcpy(out_stream->codecpar->extradata, out_codec_ctx->extradata, out_codec_ctx->extradata_size);&#xA;&#xA;    av_dump_format(format_ctx, 0, server.c_str(), 1);&#xA;&#xA;    frame = av_frame_alloc();&#xA;&#xA;    int sz = av_image_get_buffer_size(pixFormat, width, height, 32);&#xA;#ifdef _WIN32&#xA;    data = (uint8_t*)_aligned_malloc(sz, 32);&#xA;    if (data == NULL)&#xA;        return ENOMEM;&#xA;#else&#xA;    ret = posix_memalign(reinterpret_cast(&amp;data), 32, sz);&#xA;#endif&#xA;    av_image_fill_arrays(frame->data, frame->linesize, data, pixFormat, width, height, 32);&#xA;    frame->format = pixFormat;&#xA;    frame->width = width;&#xA;    frame->height = height;&#xA;    frame->pts = 1;&#xA;    if (avformat_write_header(format_ctx, nullptr) &lt; 0) //Header making problems!!!&#xA;    {&#xA;        fprintf(stderr, "Could not write header!\n");&#xA;        return 1;&#xA;    }&#xA;&#xA;    printf("stream time base = %d / %d \n", out_stream->time_base.num, out_stream->time_base.den);&#xA;&#xA;    double inv_stream_timebase = (double)out_stream->time_base.den / (double)out_stream->time_base.num;&#xA;    printf("Init OK\n");&#xA;    /*  Init phase end*/&#xA;    int dts = 0;&#xA;    int frameNo = 0;&#xA;&#xA;    while (true) {&#xA;        //Fill dummy frame with something&#xA;        for (int y = 0; y &lt; height; y&#x2B;&#x2B;) {&#xA;            uint16_t color = ((y &#x2B; frameNo) * 256) % (256 * 256);&#xA;            for (int x = 0; x &lt; width; x&#x2B;&#x2B;) {&#xA;                data[x&#x2B;y*width] = color;&#xA;            }&#xA;        }&#xA;&#xA;        memcpy(frame->data[0], data, 1280 * 720 * sizeof(uint16_t));&#xA;        AVPacket* pkt = av_packet_alloc();&#xA;&#xA;        int ret = avcodec_send_frame(out_codec_ctx, frame);&#xA;        if (ret &lt; 0)&#xA;        {&#xA;            fprintf(stderr, "Error sending frame to codec context!\n");&#xA;            return ret;&#xA;        }&#xA;        while (ret >= 0) {&#xA;            ret = avcodec_receive_packet(out_codec_ctx, pkt);&#xA;            if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)&#xA;                break;&#xA;            else if (ret &lt; 0) {&#xA;                fprintf(stderr, "Error during encoding\n");&#xA;                break;&#xA;            }&#xA;            pkt->dts = dts;&#xA;            pkt->pts = dts;&#xA;            dts &#x2B;= 33;&#xA;            av_write_frame(format_ctx, pkt);&#xA;            frameNo&#x2B;&#x2B;;&#xA;            av_packet_unref(pkt);&#xA;        }&#xA;        printf("Streamed %d frames\n", frameNo);&#xA;    }&#xA;    return 0;&#xA;}&#xA;

    &#xA;

    And the part of the server that should receive; this is the code where it stops and waits:

    extern "C" {&#xA;#include <libavcodec></libavcodec>avcodec.h>&#xA;#include <libavformat></libavformat>avformat.h>&#xA;#include <libavformat></libavformat>avio.h>&#xA;}&#xA;&#xA;int main() {&#xA;    AVFormatContext* fmt_ctx = NULL;&#xA;    av_log_set_level(AV_LOG_VERBOSE);&#xA;    AVDictionary* options = nullptr;&#xA;    av_dict_set(&amp;options, "protocol_whitelist", "file,udp,rtp,tcp,rtmp,rtsp,hls", 0);&#xA;    av_dict_set(&amp;options, "timeout", "500000", 0); // Timeout in microseconds &#xA;&#xA;//Next Line hangs   &#xA;    int ret = avformat_open_input(&amp;fmt_ctx, "rtmp://localhost:9999/stream/stream1", NULL, &amp;options);&#xA;    if (ret != 0) {&#xA;        fprintf(stderr, "Could not open RTMP stream\n");&#xA;        return -1;&#xA;    }&#xA;&#xA;    // Find the first video stream&#xA;    ret = avformat_find_stream_info(fmt_ctx, nullptr);&#xA;    if (ret &lt; 0) {&#xA;        return ret;&#xA;    }&#xA;    //...&#xA;} &#xA;&#xA;

    &#xA;

    Edit: I tried to just create an animated PNG and stream that from one console window to another, to rule out any programming mistakes on my side. It was the same; I just could not get the 16-bit PNG-encoded stream to work. It hung trying to receive and closed when the file ended, with zero frames received in total.

    I managed to get something else working: to avoid encoding gray frames as YUV420, I installed ffmpeg with libx264 support (I thought it was the same as H264, which in code it is, but it adds support for more pixel formats). I used H264 again, but with GRAY8 and a doubled image width, reconstructing the image on the other side (see the sketch below).
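
    A minimal sketch of that GRAY8 double-width trick (the helper names are illustrative, not from my code): each 16-bit depth sample is carried as two consecutive 8-bit gray pixels, so it only survives bit-exactly if the encode itself is lossless (for example libx264 with qp 0).

extern "C" {
#include <libavutil/frame.h>
}
#include <cstdint>
#include <cstring>

// Sender: copy a width x height GRAY16 image into an AVFrame allocated as
// AV_PIX_FMT_GRAY8 with width * 2 (linesize-aware row copy).
static void fill_gray8_doublewidth(AVFrame* dst, const uint16_t* depth,
                                   int width, int height) {
    for (int y = 0; y < height; ++y) {
        std::memcpy(dst->data[0] + y * dst->linesize[0],
                    depth + static_cast<size_t>(y) * width,
                    static_cast<size_t>(width) * sizeof(uint16_t));
    }
}

// Receiver: reinterpret the decoded 2 * width GRAY8 rows as uint16_t again.
static void unpack_gray8_doublewidth(const AVFrame* src, uint16_t* depth,
                                     int width, int height) {
    for (int y = 0; y < height; ++y) {
        std::memcpy(depth + static_cast<size_t>(y) * width,
                    src->data[0] + y * src->linesize[0],
                    static_cast<size_t>(width) * sizeof(uint16_t));
    }
}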


    Maybe as a side note, I could not get any other formats to work. Is "flv" the only option here? Could I get more performance if I changed it to... what?

  • How to debug ffmpeg reliability for long running rtsp streams

    13 September 2022, by Mark

    I have a long-running ffmpeg background process that "watches" an RTSP stream and takes a snapshot every 7 minutes (hence fps=0.002381, roughly 1/420, in the command below).

    It's being run like this


    C:\Windows\System32\cmd.exe /c C:\ffmpeg\bin\ffmpeg.exe -nostdin -rtsp_transport tcp -y -timeout 5000000 -i rtsp://someurl -q:v 1 -an -vf fps=0.002381,scale="1280:720" -strftime 1 -f image2 C:\somelocalfolder\%Y-%m-%d_%H-%M-%S.jpg > c:\ffmpeglog.txt 2>&1

    This process runs for days, but intermittently, for hours at a time, it seems to miss taking snapshots, until eventually it starts taking them again, then fails again, and so on. The logs at info level are not helpful. I checked the stream during times when snapshots were not being taken and the stream was up. What's happening here? How can I debug this?

    Below is an image of successful snapshots per hour. There should always be between 8 and 9.
    [Image: metrics on successful snapshots taken]

    Logs look like this


ffmpeg version 2022-03-31-git-e301a24fa1-full_build-www.gyan.dev Copyright (c) 2000-2022 the FFmpeg developers
  built with gcc 11.2.0 (Rev7, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
  libavutil      57. 24.101 / 57. 24.101
  libavcodec     59. 25.100 / 59. 25.100
  libavformat    59. 20.101 / 59. 20.101
  libavdevice    59.  6.100 / 59.  6.100
  libavfilter     8. 29.100 /  8. 29.100
  libswscale      6.  6.100 /  6.  6.100
  libswresample   4.  6.100 /  4.  6.100
  libpostproc    56.  5.100 / 56.  5.100
Input #0, rtsp, from 'rtsp://somerul':
  Metadata:
    title           : HIK Media Server V4.21.005
    comment         : HIK Media Server Session Description : standard
  Duration: N/A, start: 0.033000, bitrate: N/A
  Stream #0:0: Video: h264 (High), yuv420p(progressive), 704x576, 30 tbr, 90k tbn
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))
[swscaler @ 000002a1c2c20680] [swscaler @ 000002a1c2c2e0c0] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 000002a1c2c20680] [swscaler @ 000002a1c2c67c40] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 000002a1c2c20680] [swscaler @ 000002a1c2cc6700] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to 'C:\somelocalfolder\Temp\stream_2\StreamedImages\%Y-%m-%d_%H-%M-%S.jpg':
  Metadata:
    title           : HIK Media Server V4.21.005
    comment         : HIK Media Server Session Description : standard
    encoder         : Lavf59.20.101
  Stream #0:0: Video: mjpeg, yuvj420p(pc, progressive), 1280x720, q=2-31, 200 kb/s, 0.0024 fps, 0.0024 tbn
    Metadata:
      encoder         : Lavc59.25.100 mjpeg
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
frame=    1 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed=   0x

    Update: I got some trace logs. ffmpeg seems to fail silently at some point and stops taking snapshots.

    After about 3 million log lines (which is really only a couple of hours in my case) I get the following


    rtsp://192.168.15.195:554/streaming/channels/904: Unknown error

    But ffmpeg silently continues. Here is a bit more of the log


[Parsed_fps_0 @ 00000248e7d50e40] Read frame with in pts 1074443040, out pts 28
[Parsed_fps_0 @ 00000248e7d50e40] Dropping frame with pts 28
frame=   28 fps=0.0 q=1.0 size=N/A time=03:08:59.77 bitrate=N/A speed=0.95x
[rtsp @ 00000248e765cf00] tcp_read_packet:
[h264 @ 00000248e7d59880] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 3
[rtsp @ 00000248e765cf00] ret=-138 c=24 [$]
rtsp://192.168.15.195:554/streaming/channels/904: Unknown error
[Parsed_fps_0 @ 00000248e7d50e40] Read frame with in pts 1074446100, out pts 28
[Parsed_fps_0 @ 00000248e7d50e40] Dropping frame with pts 28
frame=   28 fps=0.0 q=1.0 size=N/A time=03:08:59.77 bitrate=N/A speed=0.95x
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=696
[rtsp @ 00000248e765cf00] Sending:
GET_PARAMETER rtsp://192.168.15.195:554/streaming/channels/904 RTSP/1.0
CSeq: 402
User-Agent: Lavf59.20.101
Session: 931848797
Authorization: Digest username="******", realm="709382dda4ccb674edf093d3", nonce="13fca62fc", uri="rtsp://192.168.15.195:554/streaming/channels/904", response="74341df9611f0ac3dc247b402424735b", algorithm="MD5"
--
[NULL @ 00000248e7662640] nal_unit_type: 7(SPS), nal_ref_idc: 3
[NULL @ 00000248e7662640] nal_unit_type: 8(PPS), nal_ref_idc: 3
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=756
[Parsed_fps_0 @ 00000248e7d50e40] Read frame with in pts 1074449070, out pts 28
[Parsed_fps_0 @ 00000248e7d50e40] Dropping frame with pts 28
[Parsed_fps_0 @ 00000248e7d50e40] Read frame with in pts 1074449070, out pts 28
[Parsed_fps_0 @ 00000248e7d50e40] Dropping frame with pts 28
frame=   28 fps=0.0 q=1.0 size=N/A time=03:08:59.77 bitrate=N/A speed=0.949x
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
frame=   28 fps=0.0 q=1.0 size=N/A time=03:08:59.77 bitrate=N/A speed=0.949x
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1228
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
[NULL @ 00000248e7662640] reference count 1 overflow
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=804
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=1352
[NULL @ 00000248e7662640] illegal memory management control operation 11
[rtsp @ 00000248e765cf00] tcp_read_packet:
[rtsp @ 00000248e765cf00] ret=1 c=24 [$]
[rtsp @ 00000248e765cf00] id=0 len=836

    Basically it appears to be an issue of ffmpeg failing silently. If it crashed, my software could detect that and I could rerun it, but if it fails silently like this, I need another solution.

  • Streaming RTP with ffmpeg and node.js to voip phone

    5 July 2023, by Nik Hendricks

    I am trying to implement SIP in node.js. Here is the library I am working on.

    Upon receiving an invite request such as


Received INVITE
INVITE sip:201@192.168.1.2:5060 SIP/2.0
Via: SIP/2.0/UDP 192.168.1.39:5062;branch=z9hG4bK1534941205
From: "Nik" <sip:nik@192.168.1.2>;tag=564148403
To: <sip:201@192.168.1.2>
Call-ID: 2068254636@192.168.1.39
CSeq: 2 INVITE
Contact: <sip:nik@192.168.1.39:5062>
Authorization: Digest username="Nik", realm="NRegistrar", nonce="1234abcd", uri="sip:201@192.168.1.2:5060", response="7fba16dafe3d60c270b774bd5bba524c", algorithm=MD5
Content-Type: application/sdp
Allow: INVITE, INFO, PRACK, ACK, BYE, CANCEL, OPTIONS, NOTIFY, REGISTER, SUBSCRIBE, REFER, PUBLISH, UPDATE, MESSAGE
Max-Forwards: 70
User-Agent: Yealink SIP-T42G 29.71.0.120
Supported: replaces
Allow-Events: talk,hold,conference,refer,check-sync
Content-Length: 306

v=0
o=- 20083 20083 IN IP4 192.168.1.39
s=SDP data
c=IN IP4 192.168.1.39
t=0 0
m=audio 11782 RTP/AVP 0 8 18 9 101
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:18 G729/8000
a=fmtp:18 annexb=no
a=rtpmap:9 G722/8000
a=fmtp:101 0-15
a=rtpmap:101 telephone-event/8000
a=ptime:20
a=sendrecv

    I can then parse the SDP into an object like this


{
    "session":{
        "version":"0",
        "origin":"- 20084 20084 IN IP4 192.168.1.39",
        "sessionName":"SDP data"
    },
    "media":[
        {
            "media":"audio",
            "port":11784,
            "protocol":"RTP/AVP",
            "format":"0",
            "attributes":[
                "rtpmap:0 PCMU/8000",
                "rtpmap:8 PCMA/8000",
                "rtpmap:18 G729/8000",
                "fmtp:18 annexb=no",
                "rtpmap:9 G722/8000",
                "fmtp:101 0-15",
                "rtpmap:101 telephone-event/8000",
                "ptime:20",
                "sendrecv"
            ]
        }
    ]
}

    After sending the 100 and 180 responses with my library, I attempt to start an RTP stream with ffmpeg:

var port = SDPParser.parse(res.message.body).media[0].port
var s = new STREAMER('output.wav', '192.168.1.39', port)

    with the following STREAMER class


const { spawn } = require('child_process'); // child_process is needed for spawn below

class Streamer{
    constructor(inputFilePath, rtpAddress, rtpPort){
        this.inputFilePath = inputFilePath;
        this.rtpAddress = rtpAddress;
        this.rtpPort = rtpPort;
    }

    start(){
        return new Promise((resolve) => {
            const ffmpegCommand = `ffmpeg -re -i ${this.inputFilePath} -ar 8000 -f mulaw -f rtp rtp://${this.rtpAddress}:${this.rtpPort}`;
            const ffmpegProcess = spawn(ffmpegCommand, { shell: true });

            ffmpegProcess.stdout.on('data', (data) => {
                data = data.toString()
                //replace all instances of 127.0.0.1 with our local ip address
                data = data.replace(new RegExp('127.0.0.1', 'g'), '192.168.1.3');

                resolve(data.toString())
            });

            ffmpegProcess.stderr.on('data', (data) => {
              // Handle stderr data if required
              console.log(data.toString())
            });

            ffmpegProcess.on('close', (code) => {
              // Handle process close event if required
              console.log('close')
              console.log(code.toString())
            });

            ffmpegProcess.on('error', (error) => {
              // Handle process error event if required
              console.log(error.toString())
            });
        })
    }

}

    The start() function resolves with the SDP that ffmpeg generates. I am starting to think that ffmpeg can't generate proper SDP for VoIP calls.

    So when I create the 200 response with the following SDP

v=0
o=- 0 0 IN IP4 192.168.1.3
s=Impact Moderato
c=IN IP4 192.168.1.39
t=0 0
a=tool:libavformat 58.29.100
m=audio 12123 RTP/AVP 97
b=AS:128
a=rtpmap:97 PCMU/8000/2

    the other line never picks up. From my understanding, the first INVITE from the caller provides SDP that tells me where to send the RTP stream, along with the correct codecs and everything. I know that currently my wav file is PCMU, and I can listen to it with ffplay and the provided SDP. What is required to make the other line pick up, specifically a Yealink T42G?

    My full attempt looks like this:

Client.on('INVITE', (res) => {
    console.log("Received INVITE")
    var d = Client.Dialog(res).then(dialog => {
        dialog.send(res.CreateResponse(100))
        dialog.send(res.CreateResponse(180))
        var port = SDPParser.parse(res.message.body).media[0].port

        var s = new STREAMER('output.wav', '192.168.1.39', port)
        s.start().then(sdp => {
            console.log(sdp.split('SDP:')[1])
            var ok = res.CreateResponse(200)
            ok.body = sdp.split('SDP:')[1]
            dialog.send(ok)
        })

        dialog.on('BYE', (res) => {
            console.log("BYE")
            dialog.send(res.CreateResponse(200))
            dialog.kill()
        })
    })
})

    I have provided a link to my library at the top of this message. My current problem is in the examples/Client folder.


    I'm not sure what could be going wrong here. Maybe I'm not using the right format or codec for the VoIP phone; I don't see what's wrong with the SDP, especially since I can listen to the SDP generated by ffmpeg if I stream RTP back to the same computer I use ffplay on. Any help is greatly appreciated.

    Update


    As a test I decided to send the caller back SDP that was generated by a Yealink phone like itself, but with some modifications:

v=0
o=- ${this.output_port} ${this.output_port} IN IP4 192.168.1.39
s=SDP data
c=IN IP4 192.168.1.39
t=0 0
m=audio ${this.output_port} RTP/AVP 0 8 18 9 101
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:18 G729/8000
a=fmtp:18 annexb=no
a=rtpmap:9 G722/8000
a=fmtp:101 0-15
a=rtpmap:101 telephone-event/8000
a=ptime:20
a=sendrecv

    Finally, the phone that makes the call in the first place will fully answer, but there is still no audio stream. I notice that if I change the IP address or port to something wrong, the other phone will hear its own audio instead of just silence, so this leads me to believe I am headed in the right direction, and maybe the problem lies in not sending the right audio format for what I'm describing.

    Additionally, whenever I use ffmpeg to stream my audio with RTP, I notice that it sees the file format as pcm_alaw, 8000 Hz, mono, s16, 64 kb/s. My new SDP describes using both ulaw and alaw, but I'm not sure which one it is saying it prefers.

v=0
o=- ${this.output_port} ${this.output_port} IN IP4 192.168.1.39
s=SDP data
c=IN IP4 192.168.1.39
t=0 0
m=audio ${this.output_port} RTP/AVP 0 101
a=rtpmap:0 PCMU/8000
a=fmtp:101 0-15
a=rtpmap:101 telephone-event/8000
a=ptime:0
a=sendrecv

    I have been able to simplify the SDP down to this. This lets the other phone actually pick up and not hear its own audio; it's just a completely dead-air stream.