
Other articles (106)

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add your own via the form at the bottom of the page.

  • Adding users manually as an administrator

    12 April 2011, by

    A channel administrator can add one or more other users at any time from the site's configuration area by choosing the "Gestion des utilisateurs" (user management) submenu.
    On this page it is possible to:
    1. decide how users register, via two options: accept registration by visitors to the public site, or refuse visitor registration;
    2. add, modify or delete a user.
    In the second form shown, an administrator can add (...)

  • Automated installation script of MediaSPIP

    25 April 2011, by

    To work around the difficulties caused mainly by installing server-side software dependencies, an "all-in-one" installation script written in bash was created to make this step easier on a server running a compatible Linux distribution.
    To use it you must have SSH access to your server and a root account, which the script uses to install the dependencies. Contact your provider if you do not have these.
    The documentation for this installation script is available here.
    The code of this (...)

On other sites (13934)

  • Send H.264 encoded stream through RTMP using FFmpeg

    15 November 2016, by Galaxy

    I followed this to encode a sequence of images into an H.264 video.

    Here is the output part of my code:

    // Convert the source frame (RGBA, 4 bytes per pixel) into the encoder's picture
    int srcstride = outwidth * 4;
    sws_scale(convertCtx, src_data, &srcstride, 0, outheight, pic_in.img.plane, pic_in.img.i_stride);

    // Encode one frame; the return value is the total size of the NAL units produced
    // (negative on error, so check for > 0 before writing)
    x264_nal_t* nals;
    int i_nals;
    int frame_size = x264_encoder_encode(encoder, &nals, &i_nals, &pic_in, &pic_out);
    if (frame_size > 0) {
       // x264 lays the NAL payloads out contiguously, so one write covers them all
       fwrite(nals[0].p_payload, frame_size, 1, fp);
    }

    This is in a loop to process frames and write them into a file.

    Now, I’m trying to stream these encoded frames over RTMP. As far as I know, the container for RTMP is FLV, so I tried this command line:

    ffmpeg -i test.h264 -vcodec copy -f flv rtmp://localhost:1935/hls/test

    This works well for streaming an H.264-encoded video file.

    But how can I implement the same thing in C++ code, streaming the frames as they are generated, just as I did to stream my FaceTime camera?

    ffmpeg -f avfoundation -pix_fmt uyvy422  -video_size 1280x720 -framerate 30 -i "1:0" -pix_fmt yuv420p -vcodec libx264 -preset veryfast -acodec libvo_aacenc -f flv -framerate 30 rtmp://localhost:1935/hls/test

    This seems like a common and practical topic, but I have been stuck here for days and really need some relevant experience. Thank you!
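    In code, the usual route for this is FFmpeg's libavformat (open an output context with the "flv" muxer on the rtmp:// URL and feed packets to av_interleaved_write_frame), but the container framing RTMP carries is simple enough to sketch. The helper below, written in Java to match the other code on this page and with all names hypothetical, only illustrates the byte layout of one FLV tag: an 11-byte header (type, 24-bit data size, 24-bit timestamp plus an extension byte, stream ID 0), the payload, and a 32-bit previous-tag-size field.

    ```java
    import java.io.ByteArrayOutputStream;

    // Hypothetical sketch of FLV tag framing (the payload format RTMP carries).
    // A real implementation would use libavformat; this only shows the layout.
    public class FlvTag {

        public static final int TAG_TYPE_VIDEO = 9;

        // Builds one complete FLV tag: 11-byte header + payload + 4-byte PreviousTagSize.
        public static byte[] wrap(int tagType, int timestampMs, byte[] payload) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            out.write(tagType);                       // TagType: 8 = audio, 9 = video
            writeUInt24(out, payload.length);         // DataSize
            writeUInt24(out, timestampMs & 0xFFFFFF); // Timestamp (lower 24 bits)
            out.write((timestampMs >>> 24) & 0xFF);   // TimestampExtended (upper 8 bits)
            writeUInt24(out, 0);                      // StreamID, always 0
            out.write(payload, 0, payload.length);
            int tagSize = 11 + payload.length;        // PreviousTagSize = header + data
            out.write((tagSize >>> 24) & 0xFF);
            out.write((tagSize >>> 16) & 0xFF);
            out.write((tagSize >>> 8) & 0xFF);
            out.write(tagSize & 0xFF);
            return out.toByteArray();
        }

        private static void writeUInt24(ByteArrayOutputStream out, int v) {
            out.write((v >>> 16) & 0xFF);
            out.write((v >>> 8) & 0xFF);
            out.write(v & 0xFF);
        }
    }
    ```

    One caveat: FLV expects H.264 in AVCC form (length-prefixed NAL units plus a decoder configuration record), not the Annex B byte stream x264 emits by default, which is one more reason to let libavformat do the repackaging rather than framing tags by hand.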

  • Java - Stream OpenGL Display to Android

    24 October 2016, by Intektor

    I have tried to solve this problem for days now, but I couldn’t find a working solution. I am trying to stream my game screen (lwjgl) to my Android smartphone (I have a framebuffer with the texture), and I have already built a fully working packet system and all that stuff. But there are several problems I have no idea how to solve.
    First of all, I don’t know in which format I should send the framebuffer; for example, I can’t send it as a BufferedImage, because that class doesn’t exist on Android. I tried using the jcodec library, but there is no documentation for it, and I didn’t find any examples that fit my case. I think I have to encode and decode it with H.264 to make it a realtime live stream (that’s very important). I also heard about ffmpeg (and I found a Java wrapper for it: https://github.com/bramp/ffmpeg-cli-wrapper), but again there is no documentation on how to use it to stream to my mobile. There is also the problem that once the frames reach my smartphone, I don’t know how to have the graphics card load them.

    Here is what I have done so far:
    My packet:

    public class ImagePacketToClient implements Packet {

    public byte[] jpgInfo;
    public int width;
    public int height;

    BufferedImage image;

    public ImagePacketToClient() {
    }

    public ImagePacketToClient(BufferedImage image, int width, int height) {
       this.image = image;
       this.width = width;
       this.height = height;
    }

    @Override
    public void write(DataOutputStream out) throws IOException {
       ByteArrayOutputStream baos = new ByteArrayOutputStream();
       ImageIO.write(image, "jpg", baos);
       baos.flush();
       byte[] bytes = baos.toByteArray();
       baos.close();
       out.writeInt(bytes.length);
       for (byte aByte : bytes) {
           out.writeInt(aByte);
       }
    }

    @Override
    public void read(DataInputStream in) throws IOException {
       int length = in.readInt();
       jpgInfo = new byte[length];
       for (int i = 0; i < length; i++) {
           jpgInfo[i] = in.readByte();
       }
    }
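    The packet's serialization boils down to length-prefixed framing over a stream. A minimal, self-contained round trip of that idea (helper names hypothetical, a plain byte array standing in for the JPEG data):

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;

    // Hypothetical sketch: length-prefixed framing of an opaque byte payload,
    // the same shape the packet above needs for its JPEG bytes.
    public class Framing {

        // Writes a 4-byte big-endian length followed by the raw payload.
        public static byte[] frame(byte[] payload) throws IOException {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(buf);
            out.writeInt(payload.length);
            out.write(payload);          // raw bytes, not one writeInt per byte
            out.flush();
            return buf.toByteArray();
        }

        // Reads one frame back: the length first, then exactly that many bytes.
        public static byte[] unframe(byte[] wire) throws IOException {
            DataInputStream in = new DataInputStream(new ByteArrayInputStream(wire));
            int length = in.readInt();
            byte[] payload = new byte[length];
            in.readFully(payload);       // loops internally until all bytes arrive
            return payload;
        }
    }
    ```

    readFully matters on a real socket: a single read may return fewer bytes than requested, so reading byte by byte works but is slow, while a bare in.read(buf) would silently truncate.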

    The code that gets called after rendering has finished (mc.framebuffer is the framebuffer I can use):

    ScaledResolution resolution = new ScaledResolution(mc);
    BufferedImage screenshot = ScreenShotHelper.createScreenshot(resolution.getScaledWidth(), resolution.getScaledHeight(), mc.getFramebuffer());
    ImagePacketToClient packet = new ImagePacketToClient(screenshot, screenshot.getWidth(), screenshot.getHeight());
    PacketHelper.sendPacket(packet, CardboardMod.communicator.connectedSocket);
    screenshot.flush();

    public static BufferedImage createScreenshot(int width, int height, Framebuffer framebufferIn)
    {
       if (OpenGlHelper.isFramebufferEnabled())
       {
           width = framebufferIn.framebufferTextureWidth;
           height = framebufferIn.framebufferTextureHeight;
       }

       int i = width * height;

       if (pixelBuffer == null || pixelBuffer.capacity() < i)
       {
           pixelBuffer = BufferUtils.createIntBuffer(i);
           pixelValues = new int[i];
       }

       GlStateManager.glPixelStorei(3333, 1); // 3333 = GL_PACK_ALIGNMENT
       GlStateManager.glPixelStorei(3317, 1); // 3317 = GL_UNPACK_ALIGNMENT
       pixelBuffer.clear();

       if (OpenGlHelper.isFramebufferEnabled())
       {
           GlStateManager.bindTexture(framebufferIn.framebufferTexture);
           // 3553 = GL_TEXTURE_2D, 32993 = GL_BGRA, 33639 = GL_UNSIGNED_INT_8_8_8_8_REV
           GlStateManager.glGetTexImage(3553, 0, 32993, 33639, pixelBuffer);
       }
       else
       {
           // Same GL_BGRA / GL_UNSIGNED_INT_8_8_8_8_REV pair as above
           GlStateManager.glReadPixels(0, 0, width, height, 32993, 33639, pixelBuffer);
       }

       pixelBuffer.get(pixelValues);
       TextureUtil.processPixelValues(pixelValues, width, height);
       BufferedImage bufferedimage;

       if (OpenGlHelper.isFramebufferEnabled())
       {
           bufferedimage = new BufferedImage(framebufferIn.framebufferWidth, framebufferIn.framebufferHeight, 1);
           int j = framebufferIn.framebufferTextureHeight - framebufferIn.framebufferHeight;

           for (int k = j; k < framebufferIn.framebufferTextureHeight; ++k)
           {
               for (int l = 0; l < framebufferIn.framebufferWidth; ++l)
               {
                   bufferedimage.setRGB(l, k - j, pixelValues[k * framebufferIn.framebufferTextureWidth + l]);
               }
           }
       }
       else
       {
           bufferedimage = new BufferedImage(width, height, 1);
           bufferedimage.setRGB(0, 0, width, height, pixelValues, 0, width);
       }

       return bufferedimage;
    }

    Honestly, I don’t want to use this BufferedImage stuff, because it halves my framerate, and that’s not good.
    And I don’t have any code for my Android application yet, because I couldn’t figure out how to get this image recreated on Android and how to load it after that.
    I hope you understand my problem, and I am happy about every tip you can give me :)
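    Since H.264 encoders generally want planar YUV 4:2:0 rather than BufferedImage pixels, one step that will be needed whichever library is chosen is a color-space conversion. A rough sketch (integer BT.601 full-range coefficients, chroma taken from the top-left pixel of each 2x2 block rather than averaged, all names hypothetical):

    ```java
    // Hypothetical sketch: pack ARGB pixels (as produced by a glReadPixels-style
    // capture) into an I420 buffer: full-size Y plane, then quarter-size U and V.
    public class Argb2I420 {

        // width and height are assumed even, as 4:2:0 subsampling requires.
        public static byte[] convert(int[] argb, int width, int height) {
            byte[] yuv = new byte[width * height * 3 / 2];
            int uOff = width * height;
            int vOff = uOff + uOff / 4;
            for (int y = 0; y < height; y++) {
                for (int x = 0; x < width; x++) {
                    int p = argb[y * width + x];
                    int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
                    // Integer BT.601 full-range luma: Y = 0.299R + 0.587G + 0.114B
                    yuv[y * width + x] = (byte) ((77 * r + 150 * g + 29 * b) >> 8);
                    if ((y & 1) == 0 && (x & 1) == 0) {
                        // One chroma sample per 2x2 block (nearest, not averaged)
                        int ci = (y / 2) * (width / 2) + (x / 2);
                        yuv[uOff + ci] = (byte) (((-43 * r - 85 * g + 128 * b) >> 8) + 128);
                        yuv[vOff + ci] = (byte) (((128 * r - 107 * g - 21 * b) >> 8) + 128);
                    }
                }
            }
            return yuv;
        }
    }
    ```

    This layout (one Y plane followed by quarter-resolution U and V planes) is what encoders typically call I420 or yuv420p, and it avoids going through BufferedImage at all on the capture side.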