
Other articles (71)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MediaSPIP is at version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out.

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation from users as well as developers, including:
      • critiques of existing features and functions
      • articles contributed by developers, administrators, content producers and editors
      • screenshots to illustrate the above
      • translations of existing documentation into other languages
    To contribute, register to the project users’ mailing (...)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information:
      • the browser you are using, including the exact version
      • as precise an explanation as possible of the problem
      • if possible, the steps taken that resulted in the problem
      • a link to the site / page in question
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

On other sites (5196)

  • FFmpeg: Decoding AVPackets received from a UDP socket

    8 August 2022, by strong_kobayashi

    I am trying to stream video frames encoded with FFmpeg from a Unity game to a client UI via UDP. To be specific, I am sending AVPackets (which contain the compressed frames, as far as I understand) from the server; on the client side they are extracted as follows:

inline UDPpacket* SDLGameClient::receiveData()
{
    if(SDLNet_UDP_Recv(socket, packet))
        return packet;
    else
        return NULL;
}

int main()
{
    ...
    // Initialize UDP
    ...
    UDPpacket *udpPacket;

    int i = 0;

    for(;;)
    {
        udpPacket = client->receiveData();

        if(udpPacket != nullptr)
        {
            i++;
            std::cout << "Packet " << i << " received!" << std::endl;

            AVPacket packet;
            av_init_packet(&packet);

            // Copy the UDP payload into the packet's data buffer
            packet.data = new uint8_t[udpPacket->len];
            memcpy(packet.data, udpPacket->data, udpPacket->len);
            packet.size = udpPacket->len;

            ...
        }
    }
}

    To realize networking, I use the SDL_net library. Fragmenting, sending and receiving the packets seem to work without any problem. My question is: how do I decode the incoming AVPackets and stream the frames recorded in Unity to the client? I am aware that I need to use avcodec_send_packet and avcodec_receive_frame for decoding (since avcodec_decode_video2, which is used in much of the example decoding code, is deprecated), but when I do something like this:

int ret = avcodec_send_packet(codecContext, &packet);
if(ret < 0)   // AVERROR(EAGAIN) and AVERROR_EOF are negative, so this covers them
    std::cout << "avcodec_send_packet: " << ret << std::endl;
else
{
    while(ret >= 0)
    {
        ret = avcodec_receive_frame(codecContext, frame);
        if(ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
            break;   // no more frames can be produced from this packet

        std::cout << "Frame: " << codecContext->frame_number << std::endl;
    }
}

av_packet_unref(&packet);

    ret is always a negative value (-22), so perhaps something is wrong with the AVPackets, or I'm sending the frames way too fast; I really have no clue :/
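
    For what it's worth, -22 is AVERROR(EINVAL), and with avcodec_send_packet that usually points at the decoder setup rather than at the network code: the codec context must be opened as a decoder before any packet is sent, and the packet buffer should be allocated by FFmpeg with the required zeroed padding. Below is a minimal sketch of such a setup; the H.264 codec choice and the helper names are assumptions for illustration, not taken from the question:

extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/mem.h>
}
#include <cstring>

// Hypothetical helper: wrap a raw UDP payload in a decodable AVPacket.
// The buffer must come from av_malloc and carry
// AV_INPUT_BUFFER_PADDING_SIZE zeroed bytes at the end.
static bool makePacket(AVPacket *packet, const uint8_t *data, int len)
{
    uint8_t *buf = (uint8_t*)av_malloc(len + AV_INPUT_BUFFER_PADDING_SIZE);
    if(!buf)
        return false;
    memcpy(buf, data, len);
    memset(buf + len, 0, AV_INPUT_BUFFER_PADDING_SIZE);
    // Hands ownership of buf to the packet; av_packet_unref will free it.
    return av_packet_from_data(packet, buf, len) == 0;
}

// Hypothetical helper: allocate and open a decoder context.
static AVCodecContext* makeDecoder(AVCodecID id /* e.g. AV_CODEC_ID_H264 */)
{
    const AVCodec *codec = avcodec_find_decoder(id);
    AVCodecContext *ctx = codec ? avcodec_alloc_context3(codec) : nullptr;
    if(!ctx)
        return nullptr;
    // Skipping this call makes avcodec_send_packet fail with AVERROR(EINVAL).
    if(avcodec_open2(ctx, codec, nullptr) < 0)
    {
        avcodec_free_context(&ctx);
        return nullptr;
    }
    return ctx;
}

    If the context is opened correctly, it is also worth verifying that each reassembled AVPacket contains exactly the bytes of one packet produced by the encoder, with nothing dropped or reordered along the way.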

    Thanks in advance for any help!

  • Unity: Converting Texture2D to YUV420P using FFmpeg

    23 July 2021, by strong_kobayashi

    I'm trying to create a game in Unity where each frame is rendered into a texture and then put together into a video using FFmpeg. The output created by FFmpeg should eventually be sent over the network to a client UI. However, I'm struggling mainly with the part where a frame is captured and passed as a byte array to an unsafe method, where it should be processed further by FFmpeg. The wrapper I'm using is FFmpeg.AutoGen.

    The render-to-texture method:

private IEnumerator CaptureFrame()
{
    // Wait until rendering is finished before reading pixels
    yield return new WaitForEndOfFrame();

    RenderTexture.active = rt;
    frame.ReadPixels(rect, 0, 0);
    frame.Apply();

    // Raw RGBA bytes of the captured frame
    bytes = frame.GetRawTextureData();

    EncodeAndWrite(bytes, bytes.Length);
}

    The unsafe encoding method so far:

    private unsafe void EncodeAndWrite(byte[] bytes, int size)
{
    GCHandle pinned = GCHandle.Alloc(bytes, GCHandleType.Pinned);
    IntPtr address = pinned.AddrOfPinnedObject();

    sbyte** inData = (sbyte**)address;
    fixed(int* lineSize = new int[1])
    {
        lineSize[0] = 4 * textureWidth;
        // Convert RGBA to YUV420P
        ffmpeg.sws_scale(sws, inData, lineSize, 0, codecContext->width, inputFrame->extended_data, inputFrame->linesize);
    }

    inputFrame->pts = frameCounter++;

    if(ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0)
        throw new ApplicationException("Error sending a frame for encoding!");

    pkt = new AVPacket();
    fixed(AVPacket* packet = &pkt)
        ffmpeg.av_init_packet(packet);
    pkt.data = null;
    pkt.size = 0;

    pinned.Free();
    ...
}

    sws_scale takes a sbyte** as its second parameter, so I'm trying to convert the input byte array to sbyte** by first pinning it with GCHandle and doing an explicit type conversion afterwards. I don't know if that's the correct way, though.
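
    A hedged aside on that cast: pinning the array yields a pointer to the first pixel byte, but sws_scale's second parameter is a pointer to an array of per-plane pointers, so casting the pinned address itself to sbyte** makes sws_scale interpret pixel bytes as plane pointers. Packed RGBA is a single plane, so only one entry is needed. A minimal sketch, assuming the sbyte**/int* binding signature used above:

private unsafe void EncodeAndWrite(byte[] bytes, int size)
{
    fixed (byte* pixels = bytes)
    {
        // One plane pointer for packed RGBA; the remaining entries are unused.
        sbyte** inData = stackalloc sbyte*[4];
        inData[0] = (sbyte*)pixels;
        inData[1] = inData[2] = inData[3] = null;

        int* lineSize = stackalloc int[4];
        lineSize[0] = 4 * textureWidth;   // 4 bytes per RGBA pixel
        lineSize[1] = lineSize[2] = lineSize[3] = 0;

        // The fifth argument is the source slice height, not the width.
        ffmpeg.sws_scale(sws, inData, lineSize, 0, textureHeight,
                         inputFrame->extended_data, inputFrame->linesize);
    }
    ...
}

    Note that the original call also passes codecContext->width as the slice height; for a non-square texture that alone would produce wrong output.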

    Moreover, the condition if(ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0) always throws an ApplicationException, and I really don't know why this happens. codecContext and inputFrame are my AVCodecContext and AVFrame objects, respectively, and the fields are defined as follows:

    codecContext

    codecContext = ffmpeg.avcodec_alloc_context3(codec);
codecContext->bit_rate = 400000;
codecContext->width = textureWidth;
codecContext->height = textureHeight;

AVRational timeBase = new AVRational();
timeBase.num = 1;
timeBase.den = (int)fps;
codecContext->time_base = timeBase;
videoAVStream->time_base = timeBase;

AVRational frameRate = new AVRational();
frameRate.num = (int)fps;
frameRate.den = 1;
codecContext->framerate = frameRate;

codecContext->gop_size = 10;
codecContext->max_b_frames = 1;
codecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;

    inputFrame

    inputFrame = ffmpeg.av_frame_alloc();
inputFrame->format = (int)codecContext->pix_fmt;
inputFrame->width = textureWidth;
inputFrame->height = textureHeight;
inputFrame->linesize[0] = inputFrame->width;

    Any help in fixing the issue would be greatly appreciated :)
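
    A hedged observation on the snippets above: inputFrame's data planes are never allocated in the excerpt (there is no av_frame_get_buffer call), and linesize[0] is set to the pixel width, which is not a valid stride layout for YUV420P; avcodec_send_frame rejecting the frame would be consistent with that. A minimal sketch of the frame setup, assuming the same fields as above:

inputFrame = ffmpeg.av_frame_alloc();
inputFrame->format = (int)codecContext->pix_fmt;
inputFrame->width = textureWidth;
inputFrame->height = textureHeight;

// Let FFmpeg allocate the Y, U and V planes and fill in linesize[]
// with correct per-plane strides for YUV420P.
if (ffmpeg.av_frame_get_buffer(inputFrame, 32) < 0)
    throw new ApplicationException("Could not allocate frame buffers!");

    It is also worth double-checking that avcodec_open2 is called on codecContext before the first frame is sent; the excerpt above stops short of that point.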

  • Evolution #3964: minimal styling of forms

    6 June 2018, by nico d_

    That’s tetue’s opinion, but not mine :)

    Following that logic, to be consistent all the way through we would also have to move h3.legend, .editer, .choix .saisie_* etc., which are explicitly tied to SPIP, into spip.css
    And move what remains of the margin and font-weight rules, plus all the notification elements of the succes, info, notice and alerte boxes, into theme.css

    In short, empty out forms.css :)

    The way I see it, forms.css should contain all the base styling for SPIP forms, as well as the buttons, responses and errors.
    Everything in that one file, so it can easily be overridden in full.

    And in theme.css, the margin and colour adjustments for the dist, as now.