
Other articles (38)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page

  • Emballe médias: what is it for?

    4 February 2011, by

    This plugin is designed to manage sites that publish documents of all types.
    It creates "media" items; that is, a "media" item is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a "media" article;

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats:
    images: png, gif, jpg, bmp and more
    audio: MP3, Ogg, Wav and more
    video: AVI, MP4, OGV, mpg, mov, wmv and more
    text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

On other sites (7652)

  • Java - Stream OpenGL Display to Android

    24 October 2016, by Intektor

    I have been trying to solve this problem for days now, but I couldn't find a working solution. I am trying to stream my game screen (lwjgl) to my Android smartphone (I have a framebuffer with the texture), and I have already built a fully working packet system. But there are several problems I have no idea how to solve.

    First of all, I don't know in which format I should send the framebuffer contents; for example, I can't send it as a BufferedImage, because that class doesn't exist on Android. I tried using the jcodec library, but there is no documentation for it, and I didn't find any examples that fit my case. I think I have to encode and decode the frames with H.264 to make it a real-time live stream (that's very important). I also heard about ffmpeg (and I found a Java wrapper for it: https://github.com/bramp/ffmpeg-cli-wrapper), but again there is no documentation on how to use it to stream to my mobile. Finally, once the frames reach my smartphone, how can I have them loaded by the graphics card?

    Here is what I have done so far:
    My packet:

    import java.awt.image.BufferedImage;
    import java.io.ByteArrayOutputStream;
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import javax.imageio.ImageIO;

    public class ImagePacketToClient implements Packet {

        public byte[] jpgInfo;
        public int width;
        public int height;

        BufferedImage image;

        public ImagePacketToClient() {
        }

        public ImagePacketToClient(BufferedImage image, int width, int height) {
            this.image = image;
            this.width = width;
            this.height = height;
        }

        @Override
        public void write(DataOutputStream out) throws IOException {
            // Compress the frame to JPEG in memory.
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            ImageIO.write(image, "jpg", baos);
            baos.flush();
            byte[] bytes = baos.toByteArray();
            baos.close();
            // Length prefix followed by the raw JPEG bytes, so read()
            // below knows exactly how much to consume.
            out.writeInt(bytes.length);
            out.write(bytes);
        }

        @Override
        public void read(DataInputStream in) throws IOException {
            int length = in.readInt();
            jpgInfo = new byte[length];
            // readFully blocks until the whole JPEG payload has arrived.
            in.readFully(jpgInfo);
        }
    }

    The code that gets called after the rendering has finished (mc.getFramebuffer() is the framebuffer I can use):

    ScaledResolution resolution = new ScaledResolution(mc);
    BufferedImage screenshot = ScreenShotHelper.createScreenshot(resolution.getScaledWidth(), resolution.getScaledHeight(), mc.getFramebuffer());
    ImagePacketToClient packet = new ImagePacketToClient(screenshot, screenshot.getWidth(), screenshot.getHeight());
    PacketHelper.sendPacket(packet, CardboardMod.communicator.connectedSocket);
    screenshot.flush();

    public static BufferedImage createScreenshot(int width, int height, Framebuffer framebufferIn)
    {
        if (OpenGlHelper.isFramebufferEnabled())
        {
            width = framebufferIn.framebufferTextureWidth;
            height = framebufferIn.framebufferTextureHeight;
        }

        int i = width * height;

        // pixelBuffer / pixelValues are static fields of the helper class,
        // reused between calls so a full-frame buffer is not reallocated
        // on every capture.
        if (pixelBuffer == null || pixelBuffer.capacity() < i)
        {
            pixelBuffer = BufferUtils.createIntBuffer(i);
            pixelValues = new int[i];
        }

        GlStateManager.glPixelStorei(3333, 1); // 3333 = GL_PACK_ALIGNMENT
        GlStateManager.glPixelStorei(3317, 1); // 3317 = GL_UNPACK_ALIGNMENT
        pixelBuffer.clear();

        if (OpenGlHelper.isFramebufferEnabled())
        {
            // Read back the FBO's color texture:
            // 3553 = GL_TEXTURE_2D, 32993 = GL_BGRA, 33639 = GL_UNSIGNED_INT_8_8_8_8_REV
            GlStateManager.bindTexture(framebufferIn.framebufferTexture);
            GlStateManager.glGetTexImage(3553, 0, 32993, 33639, pixelBuffer);
        }
        else
        {
            // No FBO support: read straight from the default framebuffer.
            GlStateManager.glReadPixels(0, 0, width, height, 32993, 33639, pixelBuffer);
        }

        pixelBuffer.get(pixelValues);
        // Flips the rows so the image is no longer upside-down.
        TextureUtil.processPixelValues(pixelValues, width, height);
        BufferedImage bufferedimage;

        if (OpenGlHelper.isFramebufferEnabled())
        {
            // The FBO texture may be padded larger than the visible area,
            // so copy out only the framebufferWidth x framebufferHeight region.
            bufferedimage = new BufferedImage(framebufferIn.framebufferWidth, framebufferIn.framebufferHeight, BufferedImage.TYPE_INT_RGB);
            int j = framebufferIn.framebufferTextureHeight - framebufferIn.framebufferHeight;

            for (int k = j; k < framebufferIn.framebufferTextureHeight; ++k)
            {
                for (int l = 0; l < framebufferIn.framebufferWidth; ++l)
                {
                    bufferedimage.setRGB(l, k - j, pixelValues[k * framebufferIn.framebufferTextureWidth + l]);
                }
            }
        }
        else
        {
            bufferedimage = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
            bufferedimage.setRGB(0, 0, width, height, pixelValues, 0, width);
        }

        return bufferedimage;
    }

    Honestly I don't want to use this BufferedImage stuff, because it halves my framerate, and that's not good.
    I also don't have any code for my Android application yet, because I couldn't figure out how to recreate the image on Android and how to display it after that.
    I hope you understand my problem, and I am happy about every tip you can give me :)
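
    One way around the JPEG-per-frame cost described above is to hand the raw pixels to an external ffmpeg process and let it produce a proper H.264 stream. The following is only a sketch under stated assumptions: ffmpeg is on the PATH, frames arrive as the ARGB int[] that createScreenshot() above produces, the frame rate is pinned at 30, and PHONE_IP:5000 stands in for a socket the phone is actually listening on.

    import java.io.IOException;
    import java.io.OutputStream;
    import java.nio.ByteBuffer;

    // Sketch: feeds raw ARGB frames into an ffmpeg child process that
    // encodes H.264 and streams MPEG-TS over TCP to the phone.
    public class FfmpegStreamer {

        private final Process ffmpeg;
        private final OutputStream video; // ffmpeg's stdin
        private final ByteBuffer frameBytes;
        private final int width;
        private final int height;

        public FfmpegStreamer(int width, int height, String phoneAddress) throws IOException {
            this.width = width;
            this.height = height;
            this.frameBytes = ByteBuffer.allocate(width * height * 4);
            // -f rawvideo: uncompressed frames on stdin; -framerate is assumed fixed.
            // -preset ultrafast and -tune zerolatency keep encoder latency low.
            this.ffmpeg = new ProcessBuilder(
                    "ffmpeg",
                    "-f", "rawvideo", "-pix_fmt", "argb",
                    "-s", width + "x" + height, "-framerate", "30", "-i", "-",
                    "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
                    "-f", "mpegts", "tcp://" + phoneAddress) // e.g. "PHONE_IP:5000" (placeholder)
                    .redirectErrorStream(true)
                    .start();
            this.video = ffmpeg.getOutputStream();
        }

        // Call once per rendered frame with that frame's ARGB pixels.
        public void sendFrame(int[] argbPixels) throws IOException {
            frameBytes.clear();
            // Big-endian int -> bytes A,R,G,B, matching ffmpeg's "argb" format.
            frameBytes.asIntBuffer().put(argbPixels);
            video.write(frameBytes.array(), 0, width * height * 4);
            video.flush();
        }

        public void close() throws IOException {
            video.close(); // EOF lets ffmpeg flush its buffers and exit cleanly
            ffmpeg.destroy();
        }
    }

    On the receiving side, Android's MediaCodec can decode an H.264 stream directly to a Surface, which would also cover the "loaded by the graphics card" part, since Surface-backed decoding stays on the GPU.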

  • Capturing video over USB/HDMI/Thunderbolt

    12 November 2016, by Yatko

    Looking for a solution for capturing video over USB/HDMI/Thunderbolt from a digital output (e.g. a digital camera) to a computer, on Mac and/or Windows.

    The goal is to have a URL to a real-time video stream (e.g. IP/PATH/?.mp4) that we can further process, transcode, or send to a media server.

    I'm looking for tips and ideas, similar to the method below; maybe someone has come across a newer project focused on capture cards and devices that doesn't need a custom FFmpeg build. Something different.

    • we can capture the HDMI stream from a GoPro, using a Blackmagic Intensity Shuttle and the DeckLink SDK with a custom FFmpeg build (using --extra-cflags and --extra-ldflags), and the rest is straightforward

    Is there any tool or open-source project made for this purpose? Maybe something that also supports the Elgato Game Capture HD60? Any experimental projects for capturing and processing incoming video over USB/HDMI/Thunderbolt?

    The ultimate goal is live streaming to Wowza, using Cameleon live and a Sony Alpha a7S.
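
    For reference, once FFmpeg is compiled with --enable-decklink, the capture-and-push step described above is roughly a one-liner. Treat this as a sketch rather than a tested pipeline: the device name and the Wowza application/stream names are placeholders, and older builds expect the input spelled as "DeviceName@mode-number" instead of a plain name:

    ffmpeg -f decklink -i "Intensity Shuttle" -c:v libx264 -preset veryfast -tune zerolatency -c:a aac -f flv "rtmp://WOWZA_HOST/live/STREAM_NAME"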

  • ffmpeg - Making a Clean WAV file

    24 September 2016, by Edward

    I'm looking to batch convert a number of files to audio files using ffmpeg, for a game called Star Wars: Jedi Knight: Dark Forces II. The problem I'm having is that ffmpeg seems to be doing something that prevents Jedi Knight from playing the resulting sound files.

    Jedi Knight accepts plain old PCM WAV files over a wide range of parameters: 5 kHz to 96 kHz sample rates, 8- and 16-bit, mono and stereo. This sounds plain and simple, except that if one creates a WAV file using MS Sound Recorder, Jedi Knight cannot play it. Speculation was that it added something extra to the header. But the game can play WAV files created by Audacity, GoldWave or ModPlug Tracker, to name a few.

    So why not ffmpeg? Am I using the wrong codec or params? I took an original sound file from the game and performed the following:

    ffmpeg -i "orig_thrmlpu2.wav" -f wav -acodec pcm_s16le -ar 22050 -ac 1 "ffmpeg_thrmlpu2.wav"

    The ffmpeg version does not play in the game. ffprobe shows that the ffmpeg version has some metadata the original doesn't have. What params should I use to get the same WAV format as the original? Mind you, -ar, -ac and bit depth aren't the important parts.

    Here are the files for you to examine: http://www.edwardleuf.org/Games/JK/thrmlpu2.zip
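
    A plausible culprit, offered as a guess rather than a confirmed fix: by default ffmpeg writes a LIST/INFO metadata chunk (an encoder tag) into the WAV header, and a strict RIFF parser such as an old game's can trip over it. Asking ffmpeg for bit-exact output and dropping all metadata should yield a plain PCM file, which is easy to verify with ffprobe; the output filename here is arbitrary:

    ffmpeg -i "orig_thrmlpu2.wav" -map_metadata -1 -fflags +bitexact -acodec pcm_s16le -ar 22050 -ac 1 "clean_thrmlpu2.wav"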