
Media (91)
-
Collections - Quick creation form
19 February 2013, by
Updated: February 2013
Language: French
Type: Image
-
Les Miserables
4 June 2012, by
Updated: February 2013
Language: English
Type: Text
-
Do not display certain information: home page
23 November 2011, by
Updated: November 2011
Language: French
Type: Image
-
The Great Big Beautiful Tomorrow
28 October 2011, by
Updated: October 2011
Language: English
Type: Text
-
Richard Stallman et la révolution du logiciel libre - Une biographie autorisée (epub version)
28 October 2011, by
Updated: October 2011
Language: English
Type: Text
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (23)
-
Accepted formats
28 January 2010, by
The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use:
h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
m4v: raw MPEG-4 video format
flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263
Theora
wmv:
Possible output video formats
As a first step, we (...) -
Frequent problems
10 March 2010, by
PHP with safe_mode enabled
One of the main sources of problems is the PHP configuration, in particular having safe_mode enabled.
The solution would be either to disable safe_mode, or to place the script in a directory accessible by Apache for the site -
List of compatible distributions
26 April 2011, by
The table below lists the Linux distributions compatible with the MediaSPIP automatic installation script.
Distribution name    Version name            Version number
Debian               Squeeze                 6.x.x
Debian               Wheezy                  7.x.x
Debian               Jessie                  8.x.x
Ubuntu               The Precise Pangolin    12.04 LTS
Ubuntu               The Trusty Tahr         14.04
If you would like to help us improve this list, you can give us access to a machine whose distribution is not listed above, or send us the (...)
On other sites (3149)
-
Grab frame from video (as InputStream) using JavaCV in Java
23 January 2020, by Praveen Gopal
I am using JavaCV to grab frames from a video.
I can grab frames when the video is at an absolute path, but when the video is served over HTTP, JavaCV throws an error.
url = new URL("http://www.sample-videos.com/video/mp4/720/SampleVideo.mp4");
urlConnection = (HttpURLConnection) url.openConnection();
InputStream inputStream = urlConnection.getInputStream();
Java2DFrameConverter bimConverter = new Java2DFrameConverter();
FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(inputStream);
String output = "C:\\Users\\xxxx\\Downloads\\Test";
frameGrabber.start();
Frame frame;
double frameRate=frameGrabber.getFrameRate();
int imgNum=5;
System.out.println("Video has "+frameGrabber.getLengthInFrames()+" frames and has frame rate of "+frameRate);
try {
frameGrabber.setFrameNumber(1000);
frame = frameGrabber.grabKeyFrame();
BufferedImage bi = bimConverter.convert(frame);
String path = output+File.separator+imgNum+".jpg";
ImageIO.write(bi,"jpg", new File(path)); // format matches the .jpg extension
frameGrabber.flush(); // flush before stopping; after close() this fails
frameGrabber.stop();
frameGrabber.close();
} catch (Exception e) {
e.printStackTrace();
}
Any help would be appreciated.
Thanks in advance.
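A possible fix, as an untested sketch (the output filename here is a placeholder): FFmpegFrameGrabber also accepts a plain URL string, and FFmpeg's own protocol layer can open HTTP sources directly, so the HttpURLConnection/InputStream wrapping may be what breaks:
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.Java2DFrameConverter;

public class HttpFrameGrab {
    public static void main(String[] args) throws Exception {
        // Let FFmpeg open the HTTP stream itself instead of wrapping an InputStream
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(
                "http://www.sample-videos.com/video/mp4/720/SampleVideo.mp4");
        grabber.start();
        grabber.setFrameNumber(1000);   // seek, then grab the next key frame
        Frame frame = grabber.grabKeyFrame();
        BufferedImage bi = new Java2DFrameConverter().convert(frame);
        ImageIO.write(bi, "jpg", new File("frame.jpg"));
        grabber.stop();
        grabber.release();
    }
}
-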
Java - Stream OpenGL Display to Android
24 October 2016, by Intektor
I have tried to solve this problem for days now, but I couldn't find a working solution. I am trying to stream my game screen (lwjgl) to my Android smartphone (I have a framebuffer with the texture), and I have already built a fully working packet system and all that. But there are several problems I have no idea how to solve.
First of all, I don't know in which format I should send the framebuffer; e.g. I can't send it as a BufferedImage, because that class doesn't exist on Android. I tried using the jcodec library, but there is no documentation for it, and I didn't find any examples that fit my case. I think I have to encode and decode it with H.264 to make it a real-time live stream (that's very important). I also heard about ffmpeg (and I found a Java library for it: https://github.com/bramp/ffmpeg-cli-wrapper), but again there is no documentation on how to use it to stream to my mobile. I also have the problem that, once I get the frames onto my smartphone, I don't know how to have them loaded by the graphics card.
Here is what I have done so far:
My packet:
public class ImagePacketToClient implements Packet {
public byte[] jpgInfo;
public int width;
public int height;
BufferedImage image;
public ImagePacketToClient() {
}
public ImagePacketToClient(BufferedImage image, int width, int height) {
this.image = image;
this.width = width;
this.height = height;
}
@Override
public void write(DataOutputStream out) throws IOException {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(image, "jpg", baos);
baos.flush();
byte[] bytes = baos.toByteArray();
baos.close();
out.writeInt(bytes.length);
// Write the raw bytes in one call. The original wrote each byte with
// writeInt() (four bytes per byte), which doesn't match the byte-wise
// read on the receiving side and corrupts the image data.
out.write(bytes);
}
@Override
public void read(DataInputStream in) throws IOException {
int length = in.readInt();
jpgInfo = new byte[length];
in.readFully(jpgInfo); // reads exactly 'length' bytes, matching write() above
}
}
The code that gets called after the rendering has finished (mc.framebuffer is the framebuffer I can use):
ScaledResolution resolution = new ScaledResolution(mc);
BufferedImage screenshot = ScreenShotHelper.createScreenshot(resolution.getScaledWidth(), resolution.getScaledHeight(), mc.getFramebuffer());
ImagePacketToClient packet = new ImagePacketToClient(screenshot, screenshot.getWidth(), screenshot.getHeight());
PacketHelper.sendPacket(packet, CardboardMod.communicator.connectedSocket);
screenshot.flush();
public static BufferedImage createScreenshot(int width, int height, Framebuffer framebufferIn)
{
if (OpenGlHelper.isFramebufferEnabled())
{
width = framebufferIn.framebufferTextureWidth;
height = framebufferIn.framebufferTextureHeight;
}
int i = width * height;
if (pixelBuffer == null || pixelBuffer.capacity() < i)
{
pixelBuffer = BufferUtils.createIntBuffer(i);
pixelValues = new int[i];
}
GlStateManager.glPixelStorei(3333, 1); // GL_PACK_ALIGNMENT = 1
GlStateManager.glPixelStorei(3317, 1); // GL_UNPACK_ALIGNMENT = 1
pixelBuffer.clear();
if (OpenGlHelper.isFramebufferEnabled())
{
GlStateManager.bindTexture(framebufferIn.framebufferTexture);
GlStateManager.glGetTexImage(3553, 0, 32993, 33639, pixelBuffer); // GL_TEXTURE_2D, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV
}
else
{
GlStateManager.glReadPixels(0, 0, width, height, 32993, 33639, pixelBuffer); // GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV
}
pixelBuffer.get(pixelValues);
TextureUtil.processPixelValues(pixelValues, width, height);
BufferedImage bufferedimage;
if (OpenGlHelper.isFramebufferEnabled())
{
bufferedimage = new BufferedImage(framebufferIn.framebufferWidth, framebufferIn.framebufferHeight, 1);
int j = framebufferIn.framebufferTextureHeight - framebufferIn.framebufferHeight;
for (int k = j; k < framebufferIn.framebufferTextureHeight; ++k)
{
for (int l = 0; l < framebufferIn.framebufferWidth; ++l)
{
bufferedimage.setRGB(l, k - j, pixelValues[k * framebufferIn.framebufferTextureWidth + l]);
}
}
}
else
{
bufferedimage = new BufferedImage(width, height, 1);
bufferedimage.setRGB(0, 0, width, height, pixelValues, 0, width);
}
return bufferedimage;
}
Honestly, I don't want to use this BufferedImage stuff, because it halves my framerate, and that's not good.
And I don't have any code for my Android application yet, because I couldn't figure out how to get this image recreated on Android, and how to load it after that.
I hope you understand my problem, and I am happy about every tip you can give me :)
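One way to avoid the BufferedImage round-trip, as a sketch only (the address, port, and frame rate are assumptions, and the avcodec import path depends on the JavaCV version): JavaCV's FFmpegFrameRecorder can encode H.264 and push it over the network, e.g. as an MPEG-TS stream that an Android player or a MediaCodec-based decoder could consume:
import org.bytedeco.javacpp.avcodec; // org.bytedeco.ffmpeg.global.avcodec in JavaCV 1.5+
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

public class GameStreamer {
    private final FFmpegFrameRecorder recorder;

    public GameStreamer(int width, int height) throws Exception {
        // Hypothetical endpoint: the phone (or a relay) connects to this TCP port
        recorder = new FFmpegFrameRecorder("tcp://192.168.0.42:5000?listen", width, height);
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setFormat("mpegts");                   // streamable container
        recorder.setFrameRate(60);
        recorder.setVideoOption("preset", "ultrafast"); // x264: trade size for speed
        recorder.setVideoOption("tune", "zerolatency"); // x264: no look-ahead buffering
        recorder.start();
    }

    // Call once per rendered frame with a Frame wrapping the framebuffer pixels
    public void push(Frame frame) throws Exception {
        recorder.record(frame);
    }

    public void close() throws Exception {
        recorder.stop();
        recorder.release();
    }
}
The pixel data could be wrapped in a Frame straight from the glReadPixels buffer (Frame stores its image as a Buffer), skipping BufferedImage entirely; on the Android side, MediaCodec can decode an H.264 stream directly onto a Surface, so the frames are uploaded to the GPU by the platform rather than by hand.
-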
I want to convert a Node.js HTTP-to-WebSocket relay server to Java with Netty and Spring WebSocket
20 August 2019, by rura6502
I want to rewrite this example (https://github.com/phoboslab/jsmpeg/blob/master/websocket-relay.js) in Java using Netty and Spring WebSocket.
The HTTP server in this Node.js example gets the media data from FFmpeg and relays it to the WebSocket, and a JavaScript library then draws it on the HTML canvas.
But my problem is that when I use Netty and Spring WebSocket, some of the data cannot be read by the JavaScript library, and there is a lot of data loss.
I think this is the main part of the Node.js code in the example:
http = require('http'),
WebSocket = require('ws');
// setting websocket server ..............
var streamServer = http.createServer( function(request, response) {
// ....................
response.connection.setTimeout(0);
request.on('data', function(data){
socketServer.broadcast(data);
// .....
});
// .................
}).listen(STREAM_PORT);
So I already tried to change this. I just used the Netty code from the official documentation (https://netty.io/wiki/user-guide-for-4.x.html) and changed the sending part to go to the WebSocket:
// this code is in channelRead method
ByteBuf buf = (ByteBuf) msg;
try {
while (buf.isReadable()) { // (1)
byte[] bytes = new byte[buf.readableBytes()];
buf.readBytes(bytes);
WSHandler.wsSessions.stream().forEach(wsSession -> {
try {
wsSession.sendMessage(new BinaryMessage(bytes));
} catch (IOException e) {
e.printStackTrace();
};
});
}
} finally {
ReferenceCountUtil.release(msg); // (2)
}
Please tell me what I missed. Help me, thanks.
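One likely cause, as an educated guess rather than a confirmed diagnosis: the Node.js version uses http.createServer, which parses the HTTP request and emits only body chunks in 'data' events, while a raw Netty channelRead handler sees every byte on the socket, including the request line and headers, which the jsmpeg client cannot decode. A sketch of the same relay using Netty's HTTP codec so that only body content is forwarded (WSHandler.wsSessions is kept from the code in the question):
import java.io.IOException;
import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.handler.codec.http.HttpContent;
import io.netty.handler.codec.http.HttpObject;
import org.springframework.web.socket.BinaryMessage;
import org.springframework.web.socket.WebSocketSession;

// In the ChannelInitializer, decode HTTP before this handler sees the data:
// pipeline.addLast(new HttpServerCodec());
// pipeline.addLast(new StreamRelayHandler());

public class StreamRelayHandler extends SimpleChannelInboundHandler<HttpObject> {
    @Override
    protected void channelRead0(ChannelHandlerContext ctx, HttpObject msg) {
        // Forward only HTTP body chunks, mirroring request.on('data', ...)
        if (msg instanceof HttpContent) {
            ByteBuf buf = ((HttpContent) msg).content();
            byte[] bytes = new byte[buf.readableBytes()];
            buf.readBytes(bytes);
            for (WebSocketSession session : WSHandler.wsSessions) {
                try {
                    session.sendMessage(new BinaryMessage(bytes));
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
        // SimpleChannelInboundHandler releases msg automatically
    }
}
If the pipeline cannot use HttpServerCodec (for example, if FFmpeg posts to a raw TCP port instead), the handler would instead have to strip everything up to the first blank line before forwarding.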