
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (48)
-
Updating from version 0.1 to 0.2
24 June 2013 — An explanation of the notable changes made in moving from MediaSPIP version 0.1 to 0.2. What's new?
Software dependencies: the latest versions of FFMpeg (>= v1.2.1) are now used; the dependencies for Smush are installed; MediaInfo and FFprobe are installed to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
-
Customizing by adding your logo, banner, or background image
5 September 2013 — Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Publishing on MediaSPIP
13 June 2013 — Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If in doubt, contact your MediaSPIP administrator to find out.
On other sites (10430)
-
h264pred: added AVX2 implementation for tm_vp8 16x16.
18 March 2017, by Mirage Abeysekara
checkasm --bench results with 5000 runs:
pred16x16_tm_vp8_c: 302.8
pred16x16_tm_vp8_mmx: 101.4
pred16x16_tm_vp8_mmxext: 95.5
pred16x16_tm_vp8_sse2: 95.1
pred16x16_tm_vp8_avx2: 38.2
Signed-off-by: Ronald S. Bultje <rsbultje@gmail.com>
-
Android JavaCV FFmpeg webstream to local static website
26 March 2017, by Thomas Devoogdt — For my integration test, I'm working on an application that needs to provide a live stream to a locally hosted website. I've already built a working site that runs on NanoHTTPD. The application also performs special image processing, for which I use JavaCV. The library is working perfectly and all the C++ bindings are working too.
My question: how do I set up a live stream that can be played directly in a static site hosted by NanoHTTPD? Am I on the right track?
My code:
init:
private void initLiveStream() throws FrameRecorder.Exception {
/* ~~~ https://github.com/bytedeco/javacv/issues/598 ~~~ */
frameRecorder = new FFmpegFrameRecorder("http://localhost:9090", imageWidth, imageHeight, 0);
frameRecorder.setVideoOption("preset", "ultrafast");
frameRecorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
frameRecorder.setAudioCodec(0);
frameRecorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
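// Note (editorial): the WebM container officially carries VP8/VP9 video, not H.264,
// so the codec/format pair used here may itself be rejected by the muxer.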
frameRecorder.setFormat("webm");
frameRecorder.setGopSize(10);
frameRecorder.setFrameRate(frameRate);
frameRecorder.setVideoBitrate(5000);
frameRecorder.setOption("content_type","video/webm");
frameRecorder.setOption("listen", "1");
frameRecorder.start();
}

In my CameraView:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
Camera.Size size = camera.getParameters().getPreviewSize();
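// Convert the raw preview bytes (NV21 by default on Android) into a JavaCV Frame.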
Frame frame = new AndroidFrameConverter().convert(data, size.width, size.height);
try {
if(frameRecorder!=null){
frameRecorder.record(frame);
}
} catch (FrameRecorder.Exception e) {
e.printStackTrace();
}
}

Here is one of the stack traces that appeared frequently while I was searching for a solution:
org.bytedeco.javacv.FrameRecorder$Exception: avio_open error() error -111: Could not open 'http://localhost:9090'
I couldn’t find any other thread addressing this specific issue.
Thanks in advance
EDIT
Thanks to Chester Cobus, here is the code I ended up using:
WebSocket:
//Constructor
AsyncHttpServer serverStream = new AsyncHttpServer();
List<WebSocket> sockets = new ArrayList<>();
//http://stackoverflow.com/a/33021907/5500092
//I'm planning to use more sockets. This is the only uniform expression I found.
serverStream.websocket("/((?:[^/]*/)*)(.*)", new AsyncHttpServer.WebSocketRequestCallback() {
@Override
public void onConnected(final WebSocket webSocket, AsyncHttpServerRequest request) {
String uri = request.getPath();
if (uri.equals("/live")) {
sockets.add(webSocket);
//Use this to clean up any references to your websocket
webSocket.setClosedCallback(new CompletedCallback() {
@Override
public void onCompleted(Exception ex) {
try {
if (ex != null)
Log.e("WebSocket", "Error");
} finally {
sockets.remove(webSocket);
}
}
});
}
}
});
//Updater (Observer pattern)
@Override
public void updated(byte[] data) {
for (WebSocket socket : sockets) {
socket.write(new ByteBufferList(data));
}
}
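The updated(byte[]) callback above and the MainActivity.getWebStreamer().updated(...) call below assume a small observer interface that the post does not show; a minimal sketch, with illustrative names, might be:

public interface WebStreamer {
    // Receives a JPEG-encoded frame and pushes it to all connected WebSocket clients.
    void updated(byte[] jpegData);
}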
Record Activity:
private long start_time = System.currentTimeMillis();
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
long now_time = System.currentTimeMillis();
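// Throttle: only push a frame when at least 250 ms have elapsed (~4 fps).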
if ((now_time - start_time) > 250) {
start_time = now_time;
//https://forums.xamarin.com/discussion/40991/onpreviewframe-issue-converting-preview-byte-to-android-graphics-bitmap
Camera.Size size = camera.getParameters().getPreviewSize();
YuvImage image = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
image.compressToJpeg(new Rect(0, 0, size.width, size.height), 60, byteArrayOutputStream);
MainActivity.getWebStreamer().updated(byteArrayOutputStream.toByteArray());
}
}

JavaScript:
var socket;
var imageElement;
/**
* path - String.Format("ws://{0}:8090/live", Window.Location.HostName)
* image - HTMLImageElement
*/
function imageStreamer(path, image) {
imageElement = image;
socket = new WebSocket(path);
socket.onmessage = function(msg) {
var arrayBuffer = msg.data;
var reader = new FileReader();
reader.onload = function(e) {
imageElement.src = e.target.result;
};
reader.readAsDataURL(arrayBuffer);
};
}
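For reference, wiring this function up might look like the following (a sketch; the element id and the port are assumptions taken from the comment above):

window.onload = function () {
    // Hypothetical wiring: assumes an <img id="live"> element in the page
    // and the WebSocket server listening on port 8090.
    imageStreamer("ws://" + window.location.hostname + ":8090/live",
                  document.getElementById("live"));
};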
-
MPEG-DASH Livestreaming using ffmpeg, avconv, MP4Box and Dashjs
28 April 2017, by Sushant Mongia — I'm working on delivering live streaming with DASH capabilities. Long story short, it's a very crude testbed setup, so I might be off the mark in some respects. I'm also posting this as a simple setup for the community and for people struggling to find a live-streaming-with-DASH tutorial.
Setup:
OS: Ubuntu 16.04
Encoding tools:
ffmpeg: to record a live stream from a desktop webcam, in MPEG-2 format
avconv: to convert the MPEG-2 file to MPEG-4 format
MP4Box: to DASH it, i.e. produce the .mpd, some conf files, the seg_init, and the segments
Dash.js: reference client 2.4.1
Server: Apache

Process:
I've written three bash scripts, each basically an infinite while loop around one of the ffmpeg, avconv, and MP4Box commands. I first run the ffmpeg script, which records video from the desktop webcam, then the avconv script, which kills the ffmpeg command and converts the file from MPEG-2 to MPEG-4 format. Since the ffmpeg command is in an infinite while loop, it restarts. I then run the MP4Box command, which DASH-es the avconv command's output. Everything is then served to the Dash.js client, and pretty much the whole setup repeats every 5 seconds (a sketch of the capture loop follows the commands below).

Commands:
ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 -f mpegts -codec:v mpeg1video -s 640x480 -b:v 1000k -bf 0 livestream
avconv -i livestream out.mp4
MP4Box (run in a loop): MP4Box -dash-live 4000 -fps 24 -frag 6000 -profile dashavc264:live -dynamic -mpd-refresh 5000 -dash-ctx dashtest.txt -time-shift -1 -inter 0 -segment-name output-seg -bs-switching no out.mp4
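As described above, each command lives in its own looping script; a minimal sketch of the ffmpeg capture script (the original scripts are not shown in the post, so this structure is an assumption):

#!/bin/bash
# Re-run the webcam capture forever; avconv kills ffmpeg each cycle,
# after which this loop simply starts a fresh recording.
while true; do
    ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 \
        -f mpegts -codec:v mpeg1video -s 640x480 -b:v 1000k -bf 0 livestream
done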
Problem:
ffmpeg sends a 5-second chunk (due to the sleep command in my avconv bash script), and MP4Box reads that 5-second chunk and loops it. When the next chunk comes in, newer segments are produced, but the player keeps playing the older segments, typically just the very first few, in a loop.

Questions:
1) Am I missing some core concept here? Do the commands and their respective attributes have the right parameters and values?
2) I believe there should be a way to pipeline these processes in a better manner; should I look into writing a Python script, maybe?

Happy to provide more info! Cheers