
Other articles (42)
-
The SPIPmotion queue
28 November 2010 — A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
-
Contribute to documentation
13 April 2011 — Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation by users as well as developers, including:
- critique of existing features and functions
- articles contributed by developers, administrators, content producers and editors
- screenshots to illustrate the above
- translations of existing documentation into other languages
To contribute, register to the project users’ mailing (...)
-
Supporting all media types
13 April 2011 — Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats:
- images: png, gif, jpg, bmp and more
- audio: MP3, Ogg, Wav and more
- video: AVI, MP4, OGV, mpg, mov, wmv and more
- text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
On other sites (4456)
-
How to have multiple WebSocket RTSP streams?
6 October 2020, by kinx — After spending some time reading various open-source projects on how to develop RTSP and WebSocket streams, I've almost built a simple project that allows me to display multiple streams on the page.


I have a working example of just one stream working with the code below. A single URL in an array is sent to the client via WebSocket and with JSMPeg, it displays it with some success.


However, I'm not sure how to build this so that I have multiple sockets, one RTSP stream in each, and how to give each socket URL its own ID. The idea is to encrypt the URL and, when the client requests the list of streams, send that back as a socket ID, and have JSMpeg request that data.


Server:


const { EventEmitter } = require("events");
const child_process = require("child_process");
const WebSocket = require("ws"); // assumed: the "ws" package, which the WebSocket.Server usage below matches

class Stream extends EventEmitter {
 constructor() {
 super();
 this.urls = ["rtsp://someIPAddress:554/1"];
 this.urls.map((url) => {
 this.start(url);
 });
 }
 start(url) {
 this.startStream(url);
 }
 setOptions(url) {
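 // ffmpeg arguments: read the RTSP stream over TCP and transcode it to
 // MPEG-TS with mpeg1video video and mp2 audio, the combination JSMpeg decodes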
 const options = {
 "-rtsp_transport": "tcp",
 "-i": url,
 "-f": "mpegts",
 "-codec:v": "mpeg1video",
 "-codec:a": "mp2",
 "-stats": "",
 "-b:v": "1500k",
 "-ar": "44100",
 "-r": 30,
 };
 let params = [];
 for (let key in options) {
 params.push(key);
 if (String(options[key]) !== "") {
 params.push(String(options[key]));
 }
 }
 params.push("-");
 return params;
 }
 startStream(url) {
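 // note: the port is hard-coded, so this can serve only one stream;
 // a second WebSocket.Server on 8080 would fail with EADDRINUSE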
 const wss = new WebSocket.Server({ port: 8080 });
 this.child = child_process.spawn("ffmpeg", this.setOptions(url));
 this.child.stdout.on("data", (data) => {
 wss.clients.forEach((client) => {
 client.send(data);
 });
 return this.emit("data", data);
 });
 }
}

const s = new Stream();
s.on("data", (data) => {
 console.log(data);
});



In the constructor there's an array of URLs; while I only have one here, I'd like to add multiple. I create a websocket and send that back. What I'd like to do is encrypt that URL with
Crypto.createHash('md5').update(url).digest('hex')
to give it its own ID, create a websocket based on that ID, send the data to that websocket, and send it along with a list of the other IDs to the client.
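A minimal sketch of that idea (untested; the BASE_PORT range, the second URL, and the Express /api/streams endpoint are assumptions, not part of the original code) could give each URL its own WebSocket.Server on its own port, keyed by the md5 hash of the URL:

const crypto = require("crypto");
const child_process = require("child_process");
const WebSocket = require("ws");
const express = require("express"); // assumption: any HTTP framework would do

const BASE_PORT = 9000; // hypothetical first port of the per-stream range
const urls = ["rtsp://someIPAddress:554/1", "rtsp://someIPAddress:554/2"];
const streams = {}; // id -> { port, wss }

urls.forEach((url, index) => {
 // md5 gives a stable id that hides the URL (a hash, not encryption)
 const id = crypto.createHash("md5").update(url).digest("hex");
 const port = BASE_PORT + index; // one port per stream avoids EADDRINUSE
 const wss = new WebSocket.Server({ port });
 streams[id] = { port, wss };

 // setOptions(url) as defined in the class above
 const child = child_process.spawn("ffmpeg", setOptions(url));
 child.stdout.on("data", (data) => {
 wss.clients.forEach((client) => {
 if (client.readyState === WebSocket.OPEN) client.send(data);
 });
 });
});

// the client fetches this list first, then opens one socket per entry
const app = express();
app.get("/api/streams", (req, res) => {
 res.json(Object.entries(streams).map(([id, s]) => ({ id, port: s.port })));
});
app.listen(3000);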

Client:

<canvas id="video" style="width: 100%"></canvas>
<script type="text/javascript">
 var player = new JSMpeg.Player("ws://localhost:8080", {
 loop: true,
 autoplay: true,
 canvas: document.getElementById("video"),
 });
</script>



What I'd like to do here is request /api/streams, get back an array of stream/socket IDs, and then request each stream from that array.


But how do I open up multiple sockets with multiple URLs?
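A matching client-side sketch (again hypothetical: it assumes the /api/streams endpoint above returns { id, port } pairs and that the page has one canvas per stream, e.g. id="video-<id>"):

fetch("/api/streams")
 .then((res) => res.json())
 .then((streams) => {
 streams.forEach(({ id, port }) => {
 // one JSMpeg player per stream, each on its own socket
 new JSMpeg.Player("ws://localhost:" + port, {
 loop: true,
 autoplay: true,
 canvas: document.getElementById("video-" + id),
 });
 });
 });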


-
FFmpeg cannot open a video file after adding the GLSurfaceView to render frames
4 April 2016, by Kyle Lo — The source code works perfectly without any modification.
I successfully use the function below to play the specified video:
playview.openVideoFile("/sdcard/Test/mv.mp4");
For research purposes I need to display the frames using OpenGL ES, so I removed the original rendering code below.
ANativeWindow* window = ANativeWindow_fromSurface(env, javaSurface);
ANativeWindow_Buffer buffer;
if (ANativeWindow_lock(window, &buffer, NULL) == 0) {
 memcpy(buffer.bits, pixels, w * h * 2);
 ANativeWindow_unlockAndPost(window);
}
ANativeWindow_release(window);
And I added a FrameRenderer class to my project:
public class FrameRenderer implements GLSurfaceView.Renderer {
 public long time = 0;
 public short framerate = 0;
 public long fpsTime = 0;
 public long frameTime = 0;
 public float avgFPS = 0;
 private PlayNative mNative = null; // note: never assigned in the code shown

 @Override
 public void onSurfaceCreated(GL10 gl, EGLConfig config) { /* do nothing */ }

 @Override
 public void onSurfaceChanged(GL10 gl, int width, int height) {
 }

 @Override
 public void onDrawFrame(GL10 gl) {
 mNative.render();
 }
}
On the native side I created a corresponding method in VideoPlay.cpp, using only glClearColor to test whether the OpenGL calls work:
void VideoPlay::render() {
 glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
 glClear(GL_COLOR_BUFFER_BIT);
}
And the onCreate is as below:
protected void onCreate(Bundle savedInstanceState) {
 // TODO Auto-generated method stub
 super.onCreate(savedInstanceState);
 setContentView(R.layout.main_layout);
 playview = new PlayView(this);
 playview.openVideoFile("/sdcard/test_tt_racing.mp4");
 //playview.openVideoFile("/sdcard/big_buck_bunny.mp4");
 GLSurfaceView surface = (GLSurfaceView) findViewById(R.id.surfaceviewclass);
 surface.setRenderer(new FrameRenderer());
 ...
Then I tested it on the phone: the screen turns red, which means the GLSurfaceView and OpenGL work fine.
But after I press the play button, the whole app gets stuck, and errors show up in the log.
My question is: why can I no longer open a video whose path is exactly the same as before, now that I have added the GLSurfaceView renderer, and how can I fix it?
-
Displaying 450 image files from the SD card at 30 fps on Android
11 December 2013, by nikhilkerala — I am trying to develop an app that takes 15 seconds of video, lets the user apply different filters, shows a preview of the effect, then saves the processed video to the SD card. I use ffmpeg to split the video into JPEG frames, apply the desired filter to all the frames using GPUImage, then use ffmpeg to encode the frames back into a video. Everything works fine except the part where the user selects a filter. When the user selects a filter, the app is supposed to display a preview of the video with the filter applied. Though the 450 frames get the filter applied fairly quickly, displaying the images sequentially at 30 fps (to make the user feel the video is being played) performs poorly. I tried different approaches, but the maximum frame rate I could attain, even on the fastest devices, is 10 to 12 fps.
The AnimationDrawable technique doesn't work in this case because it requires all the images to be buffered into memory, which in this case is huge; the app crashes.
The code below is the best performing one so far (10 to 12 fps).
package com.example.animseqvideo;
import ......

public class MainActivity extends Activity {
 Handler handler;
 Runnable runnable;
 final int interval = 33; // ~30.3 FPS
 ImageView myImage;
 int i = 0;

 @Override
 protected void onCreate(Bundle savedInstanceState) {
 super.onCreate(savedInstanceState);
 setContentView(R.layout.activity_main);
 myImage = (ImageView) findViewById(R.id.imageView1);
 handler = new Handler();
 runnable = new Runnable() {
 public void run() {
 i++;
 if (i > 450) i = 1;
 File imgFile = new File(Environment.getExternalStorageDirectory().getPath()
 + "/com.example.animseqvideo/image" + String.format("%03d", i) + ".jpg");
 if (imgFile.exists()) {
 Bitmap myBitmap = BitmapFactory.decodeFile(imgFile.getAbsolutePath());
 myImage.setImageBitmap(myBitmap);
 }
 // SOLUTION EDIT - MOVE THE BELOW LINE OF CODE AS THE FIRST LINE OF run() AND FPS=30 !!!
 // (posting the next run() before decoding keeps the 33 ms cadence independent of decode time)
 handler.postDelayed(runnable, interval);
 }
 };
 handler.postAtTime(runnable, System.currentTimeMillis() + interval);
 handler.postDelayed(runnable, interval);
 }
}
I understand that getting an image from the SD card, decoding it, and displaying it on the screen involves the SD card's read performance and the device's CPU and graphics performance. But I am wondering if there is a way I could save a few milliseconds in each iteration. Any suggestion would be of great help at this point.