
Media (1)
-
Richard Stallman and free software
19 October 2011, by
Updated: May 2013
Language: French
Type: Text
Other articles (69)
-
Websites made with MediaSPIP
2 May 2011, by — This page lists some websites based on MediaSPIP.
-
The SPIPmotion queue
28 November 2010, by — A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database, named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...)
-
Creating farms of unique websites
13 April 2011, by — MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (7161)
-
HTML5 Video/Audio to Nodejs via Socket.io but with a twist - FFMPEG
9 January 2014, by user1840958 — I'm writing this very simple "skype clone". I tried a variety of other languages and stacks - Python, and layering Meteor and WebRTC over Node.js - but Node.js + socket.io seems to be working the best and cleanest. However, I've hit a roadblock and I can't get it all to work correctly.
I have two issues.
1. I think I'm sending real data from the HTML5 getUserMedia, but I might not, and I don't know how to test or find out. I think that using video.src = window.URL.createObjectURL(stream); makes the Blob stream an actual data stream... but I don't know. This is my Broadcast.html.
It's a very simple getUserMedia grab of the camera and microphone. Then I connect to the socket, and on click of the Broadcast button it fires the emit to 'join' and sends over the 'webcamstream' data.
<video autoplay="autoplay" height="280"></video>
<button class="recordbutton">Broadcast</button>
<script language="javascript" type="text/javascript">
var socket = io.connect('http://video.domain.com:3031');
socket.on('connect', function() {
    $('#conversation').append('Connected <br />');
});

function onVideoFail(e) {
    console.log('webcam fail!', e);
}

function hasGetUserMedia() {
    return !!(navigator.getUserMedia ||
              navigator.webkitGetUserMedia ||
              navigator.mozGetUserMedia ||
              navigator.msGetUserMedia);
}

if (hasGetUserMedia()) {
    alert('It is working...');
} else {
    alert('getUserMedia() is not supported in your browser');
}

window.URL = window.URL || window.webkitURL;
navigator.getUserMedia = navigator.getUserMedia ||
                         navigator.webkitGetUserMedia ||
                         navigator.mozGetUserMedia ||
                         navigator.msGetUserMedia;

var video = document.querySelector('video');
var streamRecorder;
var webcamstream;

if (navigator.getUserMedia) {
    navigator.getUserMedia({audio: true, video: true}, function(stream) {
        video.src = window.URL.createObjectURL(stream);
        webcamstream = stream;
    }, onVideoFail);
} else {
    alert('failed');
}

function startBroadcasting() {
    alert('Broadcast Now Clicked');
    console.log(webcamstream);
    socket.emit('join', webcamstream);
    socket.emit('echo', 'echo1 echo2 echo3 <br />');
}

socket.on('echo', function(data) {
    $('#conversation').append(data);
});
</script>
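For reference on issue 1: socket.emit('join', webcamstream) only sends whatever of the MediaStream object can be serialized, which is not the actual audio/video data, so nothing usable arrives at the server. A minimal sketch of one way to ship real bytes instead, assuming a browser with MediaRecorder support (it was not broadly available in 2014); the 'video-chunk' event name and the one-second timeslice are illustrative choices, not part of the original code:

// Sketch only: record the getUserMedia stream into chunks and emit the
// raw Blob data over socket.io instead of the MediaStream object itself.
var recorder;

function startBroadcastingChunks() {
    // MediaRecorder availability varies by browser; check before using it.
    recorder = new MediaRecorder(webcamstream);
    recorder.ondataavailable = function(event) {
        if (event.data && event.data.size > 0) {
            // socket.io transports Blob/ArrayBuffer payloads as binary frames.
            socket.emit('video-chunk', event.data);
        }
    };
    // Deliver a chunk roughly every second.
    recorder.start(1000);
}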
This is the app.js.
2. What I'm trying to do here is consume the 'stream' from the socket, but in its place I have a test video to see if FFmpeg is actually working. I'm using https://github.com/schaermu/node-fluent-ffmpeg.
When I run this test with my myth.mp4 file, I do get an out.avi - however it's 0 bytes??
var express = require('express');
var socket = require('socket.io');
var ffmpeg = require('fluent-ffmpeg');
var fs = require('fs');

var app = express();

app.configure(function(req, res){
    app.use(express.static(__dirname + '/'));
});

var server = app.listen(3031);
var io = socket.listen(server);

io.sockets.on('connection', function(socket) {
    socket.on('join', function(stream) {
        socket.stream = stream;
        socket.emit('echo', socket.stream + '<br />');
        var proc = new ffmpeg({source: '/srv/www/domain.com/video/red/myth.mp4'})
            .withAspect('4:3')
            .withSize('640x480')
            .applyAutopadding(true, 'white')
            .saveToFile('/srv/www/domain.com/video/red/out.avi', function(retcode, error){
                socket.emit('echo', 'file has been converted succesfully <br />');
            });
    });
    socket.on('echo', function(data) {
        socket.emit('echo', data);
    });
});
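A zero-byte output with no visible error usually means ffmpeg itself printed the reason to stderr and it was swallowed. A minimal debugging sketch, not from the question and not using fluent-ffmpeg, that runs the same conversion with Node's built-in child_process and prints ffmpeg's stderr; it assumes the ffmpeg binary is on the PATH and reuses the paths from the code above:

var spawn = require('child_process').spawn;

// Mirror the conversion above: 4:3 aspect, 640x480, overwrite the output file.
var args = [
    '-i', '/srv/www/domain.com/video/red/myth.mp4',
    '-aspect', '4:3',
    '-s', '640x480',
    '-y', '/srv/www/domain.com/video/red/out.avi'
];

var proc = spawn('ffmpeg', args);

proc.stderr.on('data', function(chunk) {
    // ffmpeg writes its progress and error messages to stderr.
    console.error(chunk.toString());
});

proc.on('close', function(code) {
    // A non-zero exit code here explains the empty out.avi.
    console.log('ffmpeg exited with code ' + code);
});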
I get no errors on Node startup and no errors while running. I do get a 0-byte out.avi file, freshly created every time I run this.
I have a Linode VPS with CentOS/Nginx.

node -v
v0.10.21

FFMPEG
ffmpeg version 1.2 Copyright (c) 2000-2013 the FFmpeg developers
built on Nov 23 2013 17:43:13 with gcc 4.4.7 (GCC) 20120313 (Red Hat 4.4.7-3)
configuration: --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvpx --enable-libfaac --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-libvo-aacenc --enable-libxvid --disable-ffplay --enable-shared --enable-gpl --enable-postproc --enable-nonfree --enable-avfilter --enable-pthreads --extra-cflags=-fPIC
libavutil 52. 18.100 / 52. 18.100
libavcodec 54. 92.100 / 54. 92.100
libavformat 54. 63.104 / 54. 63.104
libavdevice 54. 3.103 / 54. 3.103
libavfilter 3. 42.103 / 3. 42.103
libswscale 2. 2.100 / 2. 2.100
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 2.100 / 52. 2.100
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...

Thanks in advance for your help.
-
GC and onTouch cause Fatal signal 11 (SIGSEGV) error in app using ffmpeg through ndk
30 January 2015, by grzebyk — I am getting a nasty but well-known error while working with FFmpeg and the NDK:
A/libc(9845): Fatal signal 11 (SIGSEGV), code 1, fault addr 0xa0a9f000 in tid 9921 (AsyncTask #4)
UPDATE
After a couple of hours I found out that there might be two sources of the problem. One was related to multithreading; I checked it and fixed it. Now the app crashes ONLY when the video playback (NDK) is on.
I put a "counter" in the touch event:
surfaceSterowanieKamera.setOnTouchListener(new View.OnTouchListener() {
    int counter = 0;
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if ((event.getAction() == MotionEvent.ACTION_MOVE)){
            Log.i(TAG, "counter = " + counter);
            //cameraMover.setPanTilt(some parameters);
            counter++;
        }
Then I started disabling the other app functionalities one by one, but not the video. I found out that with each functionality removed, it takes the app longer to crash - counter reaches higher values. After turning off everything besides video playback and the touch interface (cameraMover.setPanTilt() commented out), the app usually crashes when counter is between 1600 and 1700. In such cases logcat shows the above error and GC-related info. To me it seems like the GC is interfering with the NDK part.
01-23 12:27:13.163: I/Display Activity(20633): n = 1649
01-23 12:27:13.178: I/art(20633): Background sticky concurrent mark sweep GC freed 158376(6MB) AllocSpace objects, 1(3MB) LOS objects, 17% free, 36MB/44MB, paused 689us total 140.284ms
01-23 12:27:13.169: A/libc(20633): Fatal signal 11 (SIGSEGV), code 1, fault addr 0x9bd6ec0c in tid 20734 (AsyncTask #3)
Why is the GC causing a problem with the NDK part of the application?
ORIGINAL PROBLEM
What am I doing?
I am developing an application that streams a live video feed from a webcam and enables the user to pan and tilt the remote camera. I am using the FFmpeg library, built with the NDK, to achieve smooth playback with little delay.
I am using the FFmpeg library to connect to the video stream. The NDK part then creates a bitmap, does the image processing and renders the frames on the SurfaceView videoSurfaceView object, which is located in the Android activity (Java part).
To move the webcam I created a separate class - public class CameraMover implements Runnable {/**/}. This class is a separate thread that connects through sockets to the remote camera and manages tasks connected ONLY with pan-tilt movement.
Next, in the main activity I created a touch listener
videoSurfaceView.setOnTouchListener(new View.OnTouchListener() {/**/
    cameraMover.setPanTilt(some parameters);
/**/}
which reads the user's finger movements and sends commands to the camera.
All tasks - moving the camera around, the touch interface and video playback - work perfectly when one of the others is disabled, i.e. when I disable the possibility of moving the camera, I can watch the video stream and register touch events until the end of time (or of the battery, at least). The problem occurs only when the tasks are configured to work simultaneously.
I am unable to find steps to reproduce the problem. It just happens, but only after the user touches the screen to move the camera. It can be 15 seconds after the first interaction, but sometimes it takes the app 10 or more minutes to crash. Usually it is something around a minute.
What have I done to fix it?
- I tried to display millions of logs in logcat to find an error, but the last log was always different.
- I created a transparent surface that I put over the videoSurfaceView and assigned the touch listener to it. It all ended in the same error.
- As I mentioned before, I turned off some functionalities one by one to find which one produces the error, but it appears that the error occurs only when everything is working simultaneously.
Types of the error
Almost every time the error looks like this:
A/libc(11528): Fatal signal 11 (SIGSEGV), code 1, fault addr 0x9aa9f00c in tid 11637 (AsyncTask #4)
The difference between two occurrences is the number right after libc, the addr value and the tid number. Rarely the AsyncTask number varies - I received #1 a couple of times but I was unable to reproduce it.
Question
How can I avoid this error? What can be its source?
-
Revision 34313: textebrut on #NOM_SITE_SPIP
8 January 2010, by fil@… — Log: textebrut on #NOM_SITE_SPIP