Advanced search

Media (0)

Keyword: - Tags -/diogene

No media matching your criteria is available on this site.

Other articles (41)

  • What is a form mask

    13 June 2013, by

    A form mask is a customization of the publication form used to post media, sections, news items, editorials and links to other sites.
    Each object's publication form can therefore be customized.
    To customize the form fields, go to the administration area of your MediaSPIP and select "Configuration des masques de formulaires".
    Then select the form to modify by clicking on its object type. (...)

  • Authorizations overridden by plugins

    27 April 2010, by

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to modify their own information on the authors page

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

On other sites (10417)

  • Java - Stream OpenGL Display to Android

    24 October 2016, by Intektor

    I have tried to solve this problem for days now, but I couldn't find a working solution. I am trying to stream my game screen (lwjgl) to my Android smartphone (I have a frame buffer with the texture), and I have already built a fully working packet system and all that stuff. But there are several problems I have no idea how to solve. First of all, I don't know in which format I should send the frame buffer; for example, I can't send it as a BufferedImage, because that class doesn't exist on Android. I tried using the jcodec library, but there is no documentation for it, and I didn't find any examples that fit my case. I think I have to encode and decode it with H.264 to make it a real-time live stream (that's very important). I also heard about ffmpeg (and I found a Java library for it: https://github.com/bramp/ffmpeg-cli-wrapper), but there is again no documentation on how to use it to stream to my mobile. I also have the problem that, once the frames reach my smartphone, I don't know how to get them loaded by the graphics card.

    Here is what I have done so far:
    My packet:

    import java.awt.image.BufferedImage;
    import java.io.ByteArrayOutputStream;
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import javax.imageio.ImageIO;

    public class ImagePacketToClient implements Packet {

        public byte[] jpgInfo;
        public int width;
        public int height;

        BufferedImage image;

        public ImagePacketToClient() {
        }

        public ImagePacketToClient(BufferedImage image, int width, int height) {
            this.image = image;
            this.width = width;
            this.height = height;
        }

        @Override
        public void write(DataOutputStream out) throws IOException {
            // Encode the frame as JPEG and send it length-prefixed.
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            ImageIO.write(image, "jpg", baos);
            baos.flush();
            byte[] bytes = baos.toByteArray();
            baos.close();
            out.writeInt(bytes.length);
            // Write the raw bytes in one call; writing each byte with writeInt()
            // would emit 4 bytes per JPEG byte and would not match read() below.
            out.write(bytes);
        }

        @Override
        public void read(DataInputStream in) throws IOException {
            int length = in.readInt();
            jpgInfo = new byte[length];
            // Read exactly 'length' bytes of JPEG data.
            in.readFully(jpgInfo);
        }
    }

    The code that gets called after the rendering has finished; mc.getFramebuffer() is the frame buffer I can use:

    ScaledResolution resolution = new ScaledResolution(mc);
    BufferedImage screenshot = ScreenShotHelper.createScreenshot(resolution.getScaledWidth(), resolution.getScaledHeight(), mc.getFramebuffer());
    ImagePacketToClient packet = new ImagePacketToClient(screenshot, screenshot.getWidth(), screenshot.getHeight());
    PacketHelper.sendPacket(packet, CardboardMod.communicator.connectedSocket);
    screenshot.flush();

    public static BufferedImage createScreenshot(int width, int height, Framebuffer framebufferIn)
    {
       if (OpenGlHelper.isFramebufferEnabled())
       {
           width = framebufferIn.framebufferTextureWidth;
           height = framebufferIn.framebufferTextureHeight;
       }

       int i = width * height;

       if (pixelBuffer == null || pixelBuffer.capacity() < i)
       {
           pixelBuffer = BufferUtils.createIntBuffer(i);
           pixelValues = new int[i];
       }

       GlStateManager.glPixelStorei(3333, 1); // 3333 = GL_PACK_ALIGNMENT
       GlStateManager.glPixelStorei(3317, 1); // 3317 = GL_UNPACK_ALIGNMENT
       pixelBuffer.clear();

       if (OpenGlHelper.isFramebufferEnabled())
       {
           GlStateManager.bindTexture(framebufferIn.framebufferTexture);
           // 3553 = GL_TEXTURE_2D, 32993 = GL_BGRA, 33639 = GL_UNSIGNED_INT_8_8_8_8_REV
           GlStateManager.glGetTexImage(3553, 0, 32993, 33639, pixelBuffer);
       }
       else
       {
           // 32993 = GL_BGRA, 33639 = GL_UNSIGNED_INT_8_8_8_8_REV
           GlStateManager.glReadPixels(0, 0, width, height, 32993, 33639, pixelBuffer);
       }

       pixelBuffer.get(pixelValues);
       TextureUtil.processPixelValues(pixelValues, width, height);
       BufferedImage bufferedimage;

       if (OpenGlHelper.isFramebufferEnabled())
       {
           bufferedimage = new BufferedImage(framebufferIn.framebufferWidth, framebufferIn.framebufferHeight, 1); // 1 = BufferedImage.TYPE_INT_RGB
           int j = framebufferIn.framebufferTextureHeight - framebufferIn.framebufferHeight;

           for (int k = j; k < framebufferIn.framebufferTextureHeight; ++k)
           {
               for (int l = 0; l < framebufferIn.framebufferWidth; ++l)
               {
                   bufferedimage.setRGB(l, k - j, pixelValues[k * framebufferIn.framebufferTextureWidth + l]);
               }
           }
       }
       else
       {
           bufferedimage = new BufferedImage(width, height, 1); // 1 = BufferedImage.TYPE_INT_RGB
           bufferedimage.setRGB(0, 0, width, height, pixelValues, 0, width);
       }

       return bufferedimage;
    }

    Honestly, I don't really want to use this BufferedImage approach, because it halves my frame rate, and that's not good.
    And I don't have any code for my Android application yet, because I couldn't figure out how to recreate this image on Android and how to load it after that.
    I hope you understand my problem and I'm happy about every tip you can give me :)
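
    One possible sketch of the receiving side on Android (not from the original question): it assumes the length-prefixed JPEG framing produced by ImagePacketToClient arrives over a plain socket, and the FrameReceiver class and receiveFrame method are made-up names; BitmapFactory.decodeByteArray, View.post and ImageView.setImageBitmap are standard Android APIs.

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;
    import android.widget.ImageView;
    import java.io.DataInputStream;
    import java.io.IOException;

    // Hypothetical receiver: reads one length-prefixed JPEG frame per call and
    // displays it in an ImageView. A real low-latency stream would use H.264
    // instead; this only shows how to recreate the image on Android.
    public class FrameReceiver {

        public static void receiveFrame(DataInputStream in, final ImageView view) throws IOException {
            int length = in.readInt();   // length prefix written by ImagePacketToClient.write()
            byte[] jpg = new byte[length];
            in.readFully(jpg);           // exactly one JPEG-encoded frame

            // Decode the JPEG bytes into a Bitmap; the upload to the GPU happens
            // when the view is drawn.
            final Bitmap frame = BitmapFactory.decodeByteArray(jpg, 0, length);

            // Views may only be modified on the UI thread.
            view.post(new Runnable() {
                @Override
                public void run() {
                    view.setImageBitmap(frame);
                }
            });
        }
    }

    Like the BufferedImage path above, JPEG-per-frame pays a full encode and decode for every frame, so an H.264 pipeline (for example MediaCodec on the Android side) would still be needed to reach real-time frame rates.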

  • html display video file binary buffer

    15 May 2022, by Martin

    I am working on an example of ffmpeg-wasm which accepts 3 files as input [song1.mp3, song2.mp3, image.png] and combines them into a video file [cool-output-video.mkv], which I want to display as a playable video for the user on the HTML webpage.

    https://github.com/MartinBarker/ffmpeg-wasm-node

    My code is hosted in the above repo. Right now I have a placeholder video element, which I want to change to the finished video after the user uploads the 3 files and clicks the render button.

    Inside server.js, I have the route app.post('/render'), which renders the video using ffmpeg and returns it using the line res.end(Buffer.from(outputData, 'binary')):

    // Read the rendered file out of ffmpeg.wasm's in-memory filesystem.
    outputData = ffmpeg.FS('readFile', outputFileName);
    ffmpeg.FS('unlink', outputFileName);

    res.writeHead(200, {
        'Content-Type': 'image/png', // note: the payload is actually a video (see EDIT below)
        'Content-Disposition': `attachment;filename=${outputFileName}`,
        'Content-Length': outputData.length
    });
    //res.end(btoa(Buffer.from(outputData, 'binary')));
    //res.end(outputData)
    res.end(Buffer.from(outputData, 'binary'));

    I can see in my node server console log that the ffmpeg command finishes and successfully renders the mkv video file.

    Inside client.js is where my code receives the binary buffer for my output video file and tries to make it appear on the webpage by changing the src attribute of the HTML video element, but no matter what code I try, I can't get the video file to appear on the page:

    <video width="320" height="240" controls="controls">
        <source src="" type="video/mp4">
        Your browser does not support the video tag.
    </video>

    ...

    const mp4source = document.querySelector('#mp4source')

    ...

    console.log('call renderVideo()')
    const renderResult = await renderVideo(files);
    console.log('renderVideo() finished. renderResult=', renderResult)
    mp4source.src = renderResult

    I can see in my Chrome devtools console that data is being returned and printed out; I'm just not sure how to handle this string data so that my video file appears on the webpage:

    renderVideo() finished. renderResult= data:image/png;base64,GkXfo6NChoEBQveBAULygQRC84EIQoKIbWF0cm9za2FCh4EEQoWBAhhTgGcBAAAAC0APWxFNm3TCv4T+wiwATbuLU6uEFUmpZlOsgaFNu4tTq4QWVK5rU6yB8U27jFOrhBJUw2dTrIIB1E27jlOrhBxTu2tTrIQLQA797AEAAAAAAABRAA... (long base64 string truncated)

    EDIT: I changed my ffmpeg command to export an mp4 video, changed the response header to video/mp4, and verified that it is getting added to the HTML src attribute, but the video still does not appear.
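
    One possible client-side sketch (not from the original question): instead of assigning the base64 data URL returned by renderVideo(), the /render response can be read as raw bytes and handed to the <video> element through an object URL. The renderAndShow name and the 'files' form-field name are assumptions; fetch, Response.arrayBuffer, Blob and URL.createObjectURL are standard browser APIs.

    // Hypothetical sketch: POST the three uploaded files to /render and play the
    // returned bytes without converting them to a base64 data URL.
    async function renderAndShow(files, videoElement) {
      const formData = new FormData();
      for (const file of files) {
        formData.append('files', file);          // field name is an assumption
      }

      const response = await fetch('/render', { method: 'POST', body: formData });
      if (!response.ok) {
        throw new Error('render failed: ' + response.status);
      }

      // Wrap the raw bytes in a Blob with a video MIME type (the EDIT above
      // switched the export to MP4, so video/mp4 is assumed here).
      const bytes = await response.arrayBuffer();
      const videoBlob = new Blob([bytes], { type: 'video/mp4' });

      // Assign an object URL to the <video> element itself and start playback.
      videoElement.src = URL.createObjectURL(videoBlob);
      await videoElement.play();
    }

    Assigning the URL to the <video> element directly (rather than the empty <source> child) avoids having to call load() after swapping sources; and since browsers generally won't play arbitrary .mkv containers, exporting MP4 as in the EDIT is the safer target.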

  • How can I display the video on SDL converted to emscripten?

    3 February 2021, by pleasehelp

    I am trying to view video in a browser using ffmpeg's decoder.

    So, I made the decoder into a js file using emscripten.

    By the way, the decoder seems to work, but only the last scene is displayed.

    How can I get the video to play from start to finish?

    Here is my code

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #include <libavcodec/avcodec.h>

    #include <SDL2/SDL.h>

    #define INBUF_SIZE 128

    static void decode(AVCodecContext *dec_ctx, AVFrame *frame, AVPacket *pkt,
                       const char *filename,
                       SDL_Window * screen, SDL_Renderer * renderer, SDL_Texture * texture)
    {
        char buf[1024];
        int ret;

        ret = avcodec_send_packet(dec_ctx, pkt);
        if (ret < 0) {
            fprintf(stderr, "Error sending a packet for decoding\n");
            exit(1);
        }

        while (ret >= 0) {
            ret = avcodec_receive_frame(dec_ctx, frame);
            if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
                return;
            else if (ret < 0) {
                fprintf(stderr, "Error during decoding\n");
                exit(1);
            }

            printf("saving frame %3d\n", dec_ctx->frame_number);
            fflush(stdout);

            SDL_Rect rect;
            rect.x = 0;
            rect.y = 0;
            rect.w = dec_ctx->width;
            rect.h = dec_ctx->height;

            SDL_UpdateYUVTexture(
                texture,             // the texture to update
                &rect,               // a pointer to the rectangle of pixels to update, or NULL to update the entire texture
                frame->data[0],      // the raw pixel data for the Y plane
                frame->linesize[0],  // the number of bytes between rows of pixel data for the Y plane
                frame->data[1],      // the raw pixel data for the U plane
                frame->linesize[1],  // the number of bytes between rows of pixel data for the U plane
                frame->data[2],      // the raw pixel data for the V plane
                frame->linesize[2]   // the number of bytes between rows of pixel data for the V plane
            );

            SDL_RenderClear(renderer);

            SDL_RenderCopy(
                renderer,   // the rendering context
                texture,    // the source texture
                NULL,       // the source SDL_Rect structure or NULL for the entire texture
                NULL        // the destination SDL_Rect structure or NULL for the entire rendering
                            // target; the texture will be stretched to fill the given rectangle
            );

            SDL_RenderPresent(renderer);
            SDL_UpdateWindowSurface(screen);
        }
    }

    int main(int argc, char **argv)
    {
        const char *filename, *outfilename;
        const AVCodec *codec;
        AVCodecParserContext *parser;
        AVCodecContext *c = NULL;
        FILE *f;
        AVFrame *frame;
        uint8_t inbuf[INBUF_SIZE + AV_INPUT_BUFFER_PADDING_SIZE];
        uint8_t *data;
        size_t   data_size;
        int ret;
        AVPacket *pkt;

        if (argc <= 2) {
            fprintf(stderr, "Usage: %s <input file> <output file>\n"
                    "And check your input file is encoded by mpeg1video please.\n", argv[0]);
            exit(0);
        }
        filename    = argv[1];
        outfilename = argv[2];

        pkt = av_packet_alloc();
        if (!pkt)
            exit(1);

        /* set end of buffer to 0 (this ensures that no overreading happens for damaged MPEG streams) */
        memset(inbuf + INBUF_SIZE, 0, AV_INPUT_BUFFER_PADDING_SIZE);

        ret = SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER);   // [1]
        if (ret != 0)
        {
            // error while initializing SDL
            printf("Could not initialize SDL - %s\n.", SDL_GetError());

            // exit with error
            // return -1;
        }

        /* find the MPEG-1 video decoder */
        codec = avcodec_find_decoder(AV_CODEC_ID_H265);
        if (!codec) {
            fprintf(stderr, "Codec not found\n");
            exit(1);
        }

        parser = av_parser_init(codec->id);
        if (!parser) {
            fprintf(stderr, "parser not found\n");
            exit(1);
        }

        c = avcodec_alloc_context3(codec);
        if (!c) {
            fprintf(stderr, "Could not allocate video codec context\n");
            exit(1);
        }

        /* open it */
        if (avcodec_open2(c, codec, NULL) < 0) {
            fprintf(stderr, "Could not open codec\n");
            exit(1);
        }

        f = fopen(filename, "rb");
        if (!f) {
            fprintf(stderr, "Could not open %s\n", filename);
            exit(1);
        }

        frame = av_frame_alloc();
        if (!frame) {
            fprintf(stderr, "Could not allocate video frame\n");
            exit(1);
        }

        // Create a window with the specified position, dimensions, and flags.
        SDL_Window * screen = SDL_CreateWindow( // [2]
                                "SDL Video Player",
                                SDL_WINDOWPOS_UNDEFINED,
                                SDL_WINDOWPOS_UNDEFINED,
                                640,
                                360,
                                SDL_WINDOW_OPENGL | SDL_WINDOW_ALLOW_HIGHDPI
        );

        if (!screen)
        {
            // could not set video mode
            printf("SDL: could not set video mode - exiting.\n");

            // exit with Error
            // return -1;
        }

        // //
        // SDL_GL_SetSwapInterval(1);

        // A structure that contains a rendering state.
        SDL_Renderer * renderer = NULL;

        // Use this function to create a 2D rendering context for a window.
        renderer = SDL_CreateRenderer(screen, -1, SDL_RENDERER_ACCELERATED);   // [3]

        // A structure that contains an efficient, driver-specific representation
        // of pixel data.
        SDL_Texture * texture = NULL;

        // Use this function to create a texture for a rendering context.
        texture = SDL_CreateTexture(  // [4]
                    renderer,
                    SDL_PIXELFORMAT_YV12,
                    SDL_TEXTUREACCESS_STREAMING,
                    640,
                    360
                );

        while (!feof(f)) {
            /* read raw data from the input file */
            data_size = fread(inbuf, 1, INBUF_SIZE, f);
            if (!data_size)
                break;

            /* use the parser to split the data into frames */
            data = inbuf;
            while (data_size > 0) {
                ret = av_parser_parse2(parser, c, &pkt->data, &pkt->size,
                                       data, data_size, AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
                if (ret < 0) {
                    fprintf(stderr, "Error while parsing\n");
                    exit(1);
                }
                data      += ret;
                data_size -= ret;

                if (pkt->size)
                    decode(c, frame, pkt, outfilename, screen, renderer, texture);
            }
        }

        /* flush the decoder */
        decode(c, frame, NULL, outfilename, screen, renderer, texture);

        fclose(f);

        av_parser_close(parser);
        avcodec_free_context(&c);
        av_frame_free(&frame);
        av_packet_free(&pkt);

        return 0;
    }

    Is it possible to continuously play SDL in a browser?
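
    For what it's worth, a common cause of "only the last frame shows" under emscripten is that the whole while (!feof(f)) loop runs to completion before the browser ever gets a chance to repaint the canvas. A minimal sketch of the usual restructuring, assuming a hypothetical decode_one_frame() helper that wraps one iteration of the existing fread / av_parser_parse2 / decode() logic; emscripten_set_main_loop and emscripten_cancel_main_loop are real Emscripten APIs:

    #include <emscripten.h>
    #include <stdbool.h>

    /* Hypothetical helper: move one iteration of the existing fread /
     * av_parser_parse2 / decode() logic in here and return false at end of file. */
    static bool decode_one_frame(void)
    {
        return false; /* placeholder body */
    }

    /* Called once per browser animation frame instead of blocking inside main(). */
    static void main_loop(void)
    {
        if (!decode_one_frame()) {
            /* flush the decoder here, then stop the loop */
            emscripten_cancel_main_loop();
        }
        /* returning hands control back to the browser, which repaints the canvas */
    }

    int main(int argc, char **argv)
    {
        /* ...open the input, set up the decoder and the SDL window/renderer/texture
         * exactly as in the code above... */

        /* fps = 0 means "pace with requestAnimationFrame"; the last argument (1)
         * makes the call behave like an infinite loop, so code after it never runs. */
        emscripten_set_main_loop(main_loop, 0, 1);
        return 0;
    }

    With pacing handled by requestAnimationFrame, one frame is presented per repaint instead of all frames in one blocking pass, which is why only the final frame was visible before.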
