
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (67)
-
MediaSPIP Player: potential problems
22 February 2011, by
The player does not work on Internet Explorer
On Internet Explorer (8 and 7 at least), the plugin uses the Flash player flowplayer to play video and sound. If the player does not seem to work, the cause may be the configuration of Apache's mod_deflate module.
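For illustration only (the original snippet further down is truncated, so this exact directive is an assumption rather than a quote from the article), the kind of blanket mod_deflate line that can interfere with a Flash player is one that compresses every response:

```apache
# Hypothetical example: compresses all responses, including the video and
# audio served to the Flash player. If present, try commenting it out and
# reloading Apache.
SetOutputFilter DEFLATE
```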
If the configuration of that Apache module contains a line resembling the following, try deleting or commenting it out to see whether the player then works correctly: /** * GeSHi (C) 2004 - 2007 Nigel McNie, (...)
-
Request to create a channel
12 March 2010, by
Depending on the platform's configuration, the user may have two different ways to request the creation of a channel. The first is at the moment of registration; the second, after registration, by filling in a request form.
Both methods ask for the same things and work in much the same way: the prospective user must fill in a series of form fields that first of all give the administrators information about (...)
-
Farm management
2 March 2010, by
The farm as a whole is managed by "super admins".
Certain settings can be adjusted to accommodate the needs of the different channels.
To begin with, it relies on the "Gestion de mutualisation" plugin.
On other sites (8867)
-
avcodec/crystalhd : Adapt to new new decode API
22 April 2017, by Philip Langdale
avcodec/crystalhd : Adapt to new new decode API
The new new decode API requires the decoder to ask for the next input packet, and it cannot just return EAGAIN if that packet cannot be processed yet. This means we must finally confront how we get this decoder to block when the input buffer is full and no output frames are ready yet.
In the end, that isn't too hard to achieve - the main trick seems to be that you have to aggressively poll the hardware - it doesn't seem to make any forward progress if you sleep.
Signed-off-by: James Almer <jamrial@gmail.com>
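The contract described in that commit message can be sketched as a toy model - plain Java (matching the rest of the code on this page), not the real FFmpeg or CrystalHD API: rather than the caller pushing one packet at a time and handling an EAGAIN-style error, the decoder pulls input itself and keeps polling its output queue until a frame is ready or input runs out.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Toy model of a pull-style decoder; all names here are illustrative,
// not FFmpeg's actual API.
public class PullDecoder {
    private final Queue<int[]> input = new ArrayDeque<>();      // pending "packets"
    private final Queue<Integer> hardware = new ArrayDeque<>(); // decoded "frames"

    public void queueInput(int[] packet) {
        input.add(packet);
    }

    // The decoder drives the loop: it polls for output and, when none is
    // ready, asks for the next input packet itself instead of bouncing an
    // error back to the caller.
    public Integer receiveFrame() {
        while (true) {
            if (!hardware.isEmpty()) {
                return hardware.poll();   // a frame is ready
            }
            int[] packet = input.poll();  // pull the next input packet
            if (packet == null) {
                return null;              // no input left, nothing buffered
            }
            for (int sample : packet) {
                hardware.add(sample);     // "submit" to the hardware, then poll again
            }
        }
    }
}
```

This also mirrors why the commit has to poll aggressively: output readiness is only discovered by asking the hardware again, not by sleeping.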
-
Can't play at a good framerate with JavaAV
31 March 2017, by TW2
I have a problem with the framerate when I use JavaAV (which uses JavaCPP + ffmpeg). When I set my framerate as mentioned in DemuxerExample.java, the frame changes every 47000 ms, which is far too long and wrong. It is better when I set 1000 / demuxer.getFrameRate() * 1000 (~47 ms), but that is also wrong. I cannot get a correct framerate. Here is my player class:
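Before the class itself, the timing arithmetic in isolation - a sketch in plain Java, assuming demuxer.getFrameRate() returns frames per second: the delay between frames is simply 1000 / fps milliseconds. Multiplying the rate by 1000 first, as the video loop in the class does, makes (long)(1000d / FPS) truncate to 0 for any realistic rate, and the ~47 ms figure is consistent with a clip of roughly 21 fps.

```java
// Minimal frame-pacing sketch, independent of JavaAV.
// Assumption: fps is a plain frames-per-second value (e.g. 25.0).
public class FramePacing {

    // Milliseconds to wait between two consecutive frames.
    public static long delayMillis(double fps) {
        return (long) (1000d / fps); // no extra factor of 1000 on either side
    }

    public static void main(String[] args) {
        System.out.println(delayMillis(25.0)); // prints 40
        System.out.println(delayMillis(21.3)); // prints 46
    }
}
```

(Real players pace against stream timestamps rather than a fixed sleep, since decode time eats into each frame's budget, but the arithmetic above is the immediate issue with the printed values.)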
package gr.av;
import hoary.javaav.Audio;
import hoary.javaav.AudioFrame;
import hoary.javaav.Demuxer;
import hoary.javaav.Image;
import hoary.javaav.JavaAVException;
import hoary.javaav.MediaFrame;
import hoary.javaav.VideoFrame;
import java.awt.Color;
import java.awt.Graphics;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;
import javax.sound.sampled.UnsupportedAudioFileException;
import javax.swing.JPanel;
public class Player extends JPanel {

    VideoThread videoTHREAD;
    AudioThread audioTHREAD;
    BufferedImage img = null;

    public Player() {
        init();
    }

    private void init() {
        videoTHREAD = new VideoThread(this);
        audioTHREAD = new AudioThread();
    }

    public void setImage(BufferedImage img) {
        this.img = img;
        repaint();
    }

    @Override
    public void paint(Graphics g) {
        if (img != null) {
            g.drawImage(img, 0, 0, null);
        } else {
            g.setColor(Color.blue);
            g.fillRect(0, 0, getWidth(), getHeight());
        }
    }

    public void setFilename(String filename) {
        videoTHREAD.setFilename(filename);
        audioTHREAD.setFilename(filename);
    }

    public void play() {
        videoTHREAD.playThread();
        audioTHREAD.playThread();
    }

    public void stop() {
        videoTHREAD.stopThread();
        audioTHREAD.stopThread();
    }

    public static class VideoThread extends Thread {

        //The video filename and a controller
        String filename = null;
        private volatile boolean active = false;
        //Panel to see video
        Player player;

        public VideoThread(Player player) {
            this.player = player;
        }

        public void setFilename(String filename) {
            this.filename = filename;
        }

        public void playThread() {
            if (filename != null && active == false) {
                active = true;
                this.start();
            }
        }

        public void stopThread() {
            if (active == true) {
                active = false;
                this.interrupt();
            }
        }

        public void video() throws JavaAVException, InterruptedException, IOException, UnsupportedAudioFileException {
            Demuxer demuxer = new Demuxer();
            demuxer.open(filename);
            MediaFrame mediaFrame;
            while (active && (mediaFrame = demuxer.readFrame()) != null) {
                if (mediaFrame.getType() == MediaFrame.Type.VIDEO) {
                    VideoFrame videoFrame = (VideoFrame) mediaFrame;
                    player.setImage(Image.createImage(videoFrame, BufferedImage.TYPE_3BYTE_BGR));
                    double FPS = demuxer.getFrameRate() * 1000d;
                    long ms = (long) (1000d / FPS);
                    System.out.println("FPS = " + FPS + " ; Milliseconds = " + ms);
                    java.util.concurrent.TimeUnit.MILLISECONDS.sleep(ms);
                }
            }
            demuxer.close();
        }

        @Override
        public void run() {
            if (filename != null) {
                try {
                    video();
                } catch (JavaAVException | InterruptedException | IOException | UnsupportedAudioFileException ex) {
                    Logger.getLogger(Player.class.getName()).log(Level.SEVERE, null, ex);
                }
            }
        }
    }

    public static class AudioThread extends Thread {

        //The video filename and a controller
        String filename = null;
        private volatile boolean active = false;
        //Audio
        AudioFormat format = new AudioFormat(44000, 16, 2, true, false);
        AudioInputStream ais;
        DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
        SourceDataLine soundLine;

        public AudioThread() {
        }

        public void setFilename(String filename) {
            this.filename = filename;
        }

        public void playThread() {
            if (filename != null && active == false) {
                active = true;
                soundON();
                this.start();
            }
        }

        public void stopThread() {
            if (active == true) {
                active = false;
                soundOFF();
                this.interrupt();
            }
        }

        public void audio() throws JavaAVException, IOException {
            Demuxer demuxer = new Demuxer();
            demuxer.open(filename);
            MediaFrame mediaFrame;
            while (active && (mediaFrame = demuxer.readFrame()) != null) {
                if (mediaFrame.getType() == MediaFrame.Type.AUDIO) {
                    AudioFrame audioFrame = (AudioFrame) mediaFrame;
                    byte[] bytes = Audio.getAudio16(audioFrame);
                    try (ByteArrayInputStream bais = new ByteArrayInputStream(bytes)) {
                        ais = new AudioInputStream(bais, format, bytes.length / format.getFrameSize());
                        try (ByteArrayOutputStream baos = new ByteArrayOutputStream()) {
                            int nBufferSize = bytes.length * format.getFrameSize();
                            byte[] abBuffer = new byte[nBufferSize];
                            while (true) {
                                int nBytesRead = ais.read(abBuffer);
                                if (nBytesRead == -1)
                                    break;
                                baos.write(abBuffer, 0, nBytesRead);
                            }
                            byte[] abAudioData = baos.toByteArray();
                            soundLine.write(abAudioData, 0, abAudioData.length);
                        }
                        ais.close();
                    }
                }
            }
            demuxer.close();
        }

        @Override
        public void run() {
            if (filename != null) {
                try {
                    audio();
                } catch (JavaAVException | IOException ex) {
                    Logger.getLogger(Player.class.getName()).log(Level.SEVERE, null, ex);
                }
            }
        }

        private void soundON() {
            try {
                soundLine = (SourceDataLine) AudioSystem.getLine(info);
                soundLine.open(format);
                soundLine.start();
            } catch (LineUnavailableException ex) {
                Logger.getLogger(Player.class.getName()).log(Level.SEVERE, null, ex);
            }
        }

        private void soundOFF() {
            soundLine.drain();
            soundLine.stop();
            soundLine.close();
        }
    }
}
-
Add audio to Xuggler video stream (ffmpeg)
11 April 2017, by zholmes1
I am trying to set up Facebook live video streaming in Java. I maintain a BufferedImage separately from this method which contains the image that is being streamed. I am connecting successfully and streaming the video, but Facebook takes the video down after two minutes because I am not sending audio as well. How can I add audio to this stream?
IContainer container = IContainer.make();
IContainerFormat containerFormat_live = IContainerFormat.make();
containerFormat_live.setOutputFormat("flv", streamUrl, null);
container.setInputBufferLength(0);
int retVal = container.open(streamUrl, IContainer.Type.WRITE, containerFormat_live);
if (retVal < 0) {
    System.err.println("Could not open output container for live stream");
    System.exit(1);
}
IStream videoStream = container.addNewStream(0);
IStreamCoder videoCoder = videoStream.getStreamCoder();
ICodec videoCodec = ICodec.findEncodingCodec(ICodec.ID.CODEC_ID_H264);
videoCoder.setNumPicturesInGroupOfPictures(5);
videoCoder.setCodec(videoCodec);
videoCoder.setBitRate(200000);
videoCoder.setPixelType(IPixelFormat.Type.YUV420P);
videoCoder.setHeight(IMAGE_HEIGHT_PX_OUTPUT);
videoCoder.setWidth(IMAGE_WIDTH_PX_OUTPUT);
System.out.println("[ENCODER] video size is " + IMAGE_HEIGHT_PX_OUTPUT + "x" + IMAGE_WIDTH_PX_OUTPUT);
videoCoder.setFlag(IStreamCoder.Flags.FLAG_QSCALE, true);
videoCoder.setGlobalQuality(0);
IRational frameRate = IRational.make(30, 1);
videoCoder.setFrameRate(frameRate);
IRational timeBase = IRational.make(frameRate.getDenominator(), frameRate.getNumerator());
videoCoder.setTimeBase(timeBase);
// IStream audioStream = container.addNewStream(1);
// IStreamCoder audioCoder = audioStream.getStreamCoder();
// ICodec audioCodec = ICodec.findEncodingCodec(ICodec.ID.CODEC_ID_AAC);
// audioCoder.setCodec(audioCodec);
// audioCoder.setBitRate(128 * 1024);
// audioCoder.setChannels(1);
// audioCoder.setSampleRate(44100);
// audioCoder.setFrameRate(IRational.make(1, 1));
// audioCoder.setTimeBase(timeBase);
//
// IAudioResampler audioResampler = IAudioResampler.make(audioCoder.getChannels(), audioCoder.getChannels(), audioCoder.getSampleRate(), audioCoder.getSampleRate(), IAudioSamples.Format.FMT_S32, audioCoder.getSampleFormat());
Properties props = new Properties();
InputStream is = XugglerRtmpReferenceImpl.class.getResourceAsStream("/libx264-normal.ffpreset");
try {
    props.load(is);
} catch (IOException e) {
    System.err.println("You need the libx264-normal.ffpreset file from the Xuggle distribution in your classpath.");
    System.exit(1);
}
Configuration.configure(props, videoCoder);
// Configuration.configure(props, audioCoder);
videoCoder.open();
// audioCoder.open();
container.writeHeader();
// IAudioSamples audioSamples = IAudioSamples.make(512, audioCoder.getChannels());
// audioSamples.setComplete(true, 1024, audioCoder.getSampleRate(), audioCoder.getChannels(), IAudioSamples.Format.FMT_S32, 0);
//
// IAudioSamples resampledAudio = IAudioSamples.make(512, audioCoder.getChannels(), IAudioSamples.Format.FMT_S32);
// audioResampler.resample(resampledAudio, audioSamples, 0);
long firstTimeStamp = System.currentTimeMillis();
long lastKeyFrameTimestamp = 0;
long lastTimeStamp = System.currentTimeMillis();
int i = 0;
while (streaming) {
    //long iterationStartTime = System.currentTimeMillis();
    long now = System.currentTimeMillis();
    //convert it for Xuggler
    BufferedImage currentScreenshot = new BufferedImage(bufferedImage.getWidth(), bufferedImage.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
    currentScreenshot.getGraphics().drawImage(bufferedImage, 0, 0, null);
    //start the encoding process
    IPacket packet = IPacket.make();
    IConverter converter = ConverterFactory.createConverter(currentScreenshot, IPixelFormat.Type.YUV420P);
    long timeStamp = (now - firstTimeStamp) * 1000;
    IVideoPicture outFrame = converter.toPicture(currentScreenshot, timeStamp);
    // make sure there is a keyframe at least every 2 seconds
    if (System.currentTimeMillis() - lastKeyFrameTimestamp > 1500) {
        outFrame.setKeyFrame(true);
        lastKeyFrameTimestamp = System.currentTimeMillis();
    }
    outFrame.setQuality(0);
    videoCoder.encodeVideo(packet, outFrame, 0);
    // audioCoder.encodeAudio(packet, IAudioSamples.make(0, audioCoder.getChannels()), 0);
    outFrame.delete();
    if (packet.isComplete()) {
        container.writePacket(packet);
        System.out.println("[ENCODER] writing packet of size " + packet.getSize() + " for elapsed time " + ((timeStamp - lastTimeStamp) / 1000));
        lastTimeStamp = System.currentTimeMillis();
    }
    System.out.println("[ENCODER] encoded image " + i + " in " + (System.currentTimeMillis() - now));
    i++;
    try {
        // sleep for framerate milliseconds
        Thread.sleep(Math.max((long) (1000 / frameRate.getDouble()) - (System.currentTimeMillis() - now), 0));
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}
container.writeTrailer();
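A common way around Facebook's audio requirement is to mux a silent audio track alongside the video. Without guessing at Xuggler encoder signatures beyond the calls already commented out in the question, the buffer arithmetic can be sketched in plain Java; the sample rate, channel count and frame rate below are assumptions matching the commented-out coder settings.

```java
// Sketch: sizing one video-frame's worth of silent 16-bit PCM.
// A zero-filled byte array is digital silence in signed PCM.
public class SilentAudio {

    public static byte[] silentFrame(int sampleRate, int channels, int fps) {
        int samplesPerFrame = sampleRate / fps; // e.g. 44100 / 30 = 1470
        int bytesPerSample = 2;                 // 16-bit samples
        return new byte[samplesPerFrame * channels * bytesPerSample];
    }

    public static void main(String[] args) {
        // 1470 samples * 1 channel * 2 bytes = 2940 bytes per video frame
        System.out.println(silentFrame(44100, 1, 30).length); // prints 2940
    }
}
```

Each such zero-filled buffer, wrapped in IAudioSamples and fed to audioCoder.encodeAudio() once per video frame (the calls commented out in the question), would give the container a continuous audio stream.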