
Other articles (59)
-
Contribute to translation
13 April 2011
You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows it to spread to new linguistic communities.
To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...) -
Add notes and captions to images
7 February 2011
To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area in order to change the rights for creating, modifying and deleting notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...) -
Write a news item
21 June 2013
Present the changes to your MediaSPIP, or news about your projects, on your MediaSPIP using the news section.
In the default MediaSPIP theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News item creation form: in the case of a document of type "news item", the fields offered by default are: Publication date (customize the publication date) (...)
On other sites (6369)
-
GC and onTouch cause Fatal signal 11 (SIGSEGV) error in app using ffmpeg through ndk
30 January 2015, by grzebyk
I am getting a nasty but well-known error while working with FFmpeg and the NDK:
A/libc(9845): Fatal signal 11 (SIGSEGV), code 1, fault addr 0xa0a9f000 in tid 9921 (AsyncTask #4)
UPDATE
After a couple of hours I found out that there might be two sources of the problem. One was related to multithreading; I checked it and fixed it. Now the app crashes ONLY when the video playback (NDK) is on.
I put a "counter" in the touch event:

    surfaceSterowanieKamera.setOnTouchListener(new View.OnTouchListener() {
        int counter = 0;

        @Override
        public boolean onTouch(View v, MotionEvent event) {
            if (event.getAction() == MotionEvent.ACTION_MOVE) {
                Log.i(TAG, "counter = " + counter);
                //cameraMover.setPanTilt(some parameters);
                counter++;
            }
            return true;
        }
    });

Then I started disabling the other app functionalities one by one, leaving only the video on. I found that with every functionality removed, it takes the app longer to crash: the counter reaches higher values. After turning off everything besides video playback and the touch interface (with cameraMover.setPanTilt() commented out), the app usually crashes when the counter is between 1600 and 1700. In that case logcat shows the above error and GC-related info. To me it seems like the GC is interfering with the NDK part.
01-23 12:27:13.163: I/Display Activity(20633): n = 1649
01-23 12:27:13.178: I/art(20633): Background sticky concurrent mark sweep GC freed 158376(6MB) AllocSpace objects, 1(3MB) LOS objects, 17% free, 36MB/44MB, paused 689us total 140.284ms
01-23 12:27:13.169: A/libc(20633): Fatal signal 11 (SIGSEGV), code 1, fault addr 0x9bd6ec0c in tid 20734 (AsyncTask #3)

Why is the GC causing a problem with the NDK part of the application?
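A crash whose timing tracks GC activity, as in the log above, is commonly caused by native code keeping a pointer to a Java object (such as the frame bitmap) that the collector is free to move or reclaim. A minimal Java-side sketch of the usual defense follows; the nativeSetFrameBitmap() binding and the class name are hypothetical, not taken from the question:

    import android.graphics.Bitmap;

    // Sketch: pin the frame bitmap in a long-lived field so the GC cannot
    // reclaim it while native code is still drawing into it.
    public class VideoRenderer {
        private Bitmap frameBitmap; // a field, not a local: stays strongly reachable

        public void initPlayback(int width, int height) {
            frameBitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
            nativeSetFrameBitmap(frameBitmap); // hypothetical JNI binding
        }

        // On the native side, any jobject stored across JNI calls must be
        // wrapped in a NewGlobalRef; a local reference is invalid once the
        // call that received it returns.
        private native void nativeSetFrameBitmap(Bitmap bitmap);
    }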
ORIGINAL PROBLEM
What am I doing?
I am developing an application that streams a live video feed from a webcam and enables the user to pan and tilt the remote camera. I am using the FFmpeg library built with the NDK to achieve smooth playback with little delay.
I am using the FFmpeg library to connect to the video stream. The NDK part then creates a bitmap, does the image processing and renders the frames on the SurfaceView videoSurfaceView object, which is located in the Android activity (Java part).

To move the webcam I created a separate class:

    public class CameraMover implements Runnable {/**/}

This class is a separate thread that connects through sockets to the remote camera and manages tasks connected ONLY with pan-tilt movement.

Next, in the main activity, I created a touch listener:

    videoSurfaceView.setOnTouchListener(new View.OnTouchListener() {/**/
        cameraMover.setPanTilt(some parameters);
    /**/});

which reads the user's finger movements and sends commands to the camera.
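For context, here is a minimal sketch of what such a command thread could look like so that touch events never perform socket I/O themselves; the PanTilt holder and the method signatures are assumptions for illustration, not the asker's actual code:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // Sketch: the UI thread only enqueues commands; all socket I/O stays on
    // this single worker thread, so playback and touch events never race on it.
    public class CameraMover implements Runnable {

        // Illustrative command holder; the real wire protocol is not shown above.
        public static final class PanTilt {
            final float pan, tilt;
            PanTilt(float pan, float tilt) { this.pan = pan; this.tilt = tilt; }
        }

        private final BlockingQueue<PanTilt> commands = new LinkedBlockingQueue<>();

        // Called from onTouch(); never blocks on the network.
        public void setPanTilt(float pan, float tilt) {
            commands.offer(new PanTilt(pan, tilt));
        }

        @Override
        public void run() {
            try {
                while (!Thread.currentThread().isInterrupted()) {
                    PanTilt cmd = commands.take(); // wait for the next command
                    sendOverSocket(cmd);           // socket write, details omitted
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt(); // exit cleanly on interrupt
            }
        }

        private void sendOverSocket(PanTilt cmd) {
            // The camera's actual protocol is not part of the question.
        }
    }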
All tasks (moving the camera around, the touch interface and video playback) work perfectly when one of the others is disabled, i.e. when I disable the possibility to move the camera, I can watch the video stream and register touch events till the end of time (or of the battery, at least). The problem occurs only when the tasks are configured to work simultaneously.
I am unable to find steps to reproduce the problem. It just happens, but only after the user touches the screen to move the camera. It can be 15 seconds after the first interaction, but sometimes it takes the app 10 or more minutes to crash. Usually it is around a minute.
What have I done to fix it?
- I tried to display millions of logs in logcat to find an error, but the last log was always different.
- I created a transparent surface, put it over the videoSurfaceView and assigned the touch listener to it. It all ended in the same error.
- As I mentioned before, I turned off some functionalities to find which one produces the error, but it appears that the error occurs only when everything is working simultaneously.
Types of the error
Almost every time the error looks like this:

A/libc(11528): Fatal signal 11 (SIGSEGV), code 1, fault addr 0x9aa9f00c in tid 11637 (AsyncTask #4)

The difference between occurrences is the number right after libc, the addr value and the tid number. Rarely the AsyncTask number varies: I received #1 a couple of times, but I was unable to reproduce that.
Question
How can I avoid this error? What can be the source of it?
-
Live streaming video on multiple platforms [closed]
4 March 2019, by Nikolay Nikolov
I want to build an application similar to Twitch/YouTube, which mainly offers two things (and a couple of others, but they are not related to the question): recording and sending live streams, and watching other people's live streams. Basically, if I wanted to build Twitch, where would I start in terms of protocols and back-end libraries for processing and sending/receiving video segments (more detailed questions follow)? I am new to video streaming software development and need a bit of guidance on where and how to start.
Here are the details/requirements:
- Video and audio
- Scalability and low latency are more important than supreme quality
- Adaptive bit-rate
- No services like Wowza and such (I am willing to build the whole structure)
- Has to work on iOS and Android (Desktop support is not as important)
- Every user should be able to watch every stream, and every user should be able to stream from their camera
- VOD is not as important
- Going back in the stream is not as important
If I have it right, this is how the whole process should work:
- The Android/iOS camera records video
- Simultaneously, the app saves every x seconds of video as a single segment and sends it to the server (a sketch of this upload step follows the list)
- The server processes the video into different bit rates and saves them
- Another user requests a stream matching the bandwidth of their internet connection
- The server responds with a playlist of segments and sends each new chunk of video to the user
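As a rough sketch of the upload half of step 2, assuming fixed-duration segment files are already being produced on the device (e.g. by MediaCodec/MediaMuxer) and assuming a hypothetical ingest endpoint; this is not a complete HLS/DASH packager:

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Sketch: POST one finished video segment to an ingest server.
    // The endpoint and the "seq" parameter are illustrative, not a real API.
    public class SegmentUploader {
        private final String ingestUrl; // e.g. "https://example.com/ingest"

        public SegmentUploader(String ingestUrl) {
            this.ingestUrl = ingestUrl;
        }

        public void upload(File segment, int sequenceNumber) throws Exception {
            URL url = new URL(ingestUrl + "?seq=" + sequenceNumber);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "video/mp2t"); // e.g. an HLS .ts chunk
            conn.setFixedLengthStreamingMode(segment.length());    // stream, don't buffer in memory
            try (InputStream in = new FileInputStream(segment);
                 OutputStream out = conn.getOutputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            }
            if (conn.getResponseCode() != HttpURLConnection.HTTP_OK) {
                throw new IllegalStateException("Upload failed: " + conn.getResponseCode());
            }
            conn.disconnect();
        }
    }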
Questions:
- What protocols should I be using (HLS, MPEG-DASH, WebRTC, RTSP, etc.), and do these protocols have implementations on Android/iOS, or do I have to implement them myself?
- What books or other resources would you recommend?
Thank you very much for reading my question, and I look forward to reading your answers!
-
Resize videos and keep codecs using PHP-FFMpeg
30 December 2019, by tmp_hallenser
I'm trying to resize a couple of videos (with different codecs) using PHP-FFMpeg. The official documentation gives the following example (see https://github.com/PHP-FFMpeg/PHP-FFMpeg):

    $ffmpeg = FFMpeg\FFMpeg::create();
    $video = $ffmpeg->open('video.mpg');
    $video
        ->filters()
        ->resize(new FFMpeg\Coordinate\Dimension(320, 240))
        ->synchronize();
    $video
        ->frame(FFMpeg\Coordinate\TimeCode::fromSeconds(10))
        ->save('frame.jpg');
    $video
        ->save(new FFMpeg\Format\Video\X264(), 'export-x264.mp4')
        ->save(new FFMpeg\Format\Video\WMV(), 'export-wmv.wmv')
        ->save(new FFMpeg\Format\Video\WebM(), 'export-webm.webm');

Since I don't want to specify the codec (I want to keep the same one as the source file), I was trying to create a copy format and use this format instead for the save command:

    class MOVFormat extends FFMpeg\Format\Video\DefaultVideo
    {
        public function __construct($audioCodec = 'copy', $videoCodec = 'copy')
        {
            $this
                ->setAudioCodec($audioCodec)
                ->setVideoCodec($videoCodec);
        }

        public function supportBFrames()
        {
            return false;
        }

        public function getAvailableAudioCodecs()
        {
            return ['copy'];
        }

        public function getAvailableVideoCodecs()
        {
            return ['copy'];
        }
    }

This results in:

Filtergraph '[in]scale=320:240 [out]' was defined for video output stream 0:0 but codec copy was selected.

and

Filtering and streamcopy cannot be used together.
Technically, it should be possible to resize without specifying the codec, judging by this reply: https://superuser.com/a/624564
Any help is appreciated!
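For reference, the two errors above point at an ffmpeg constraint rather than a PHP-FFMpeg one: a scale filter requires re-encoding the video stream, while the audio stream can still be stream-copied. A minimal sketch of the equivalent command line, driven from Java purely for illustration; the file names and the libx264 choice are placeholders:

    import java.io.IOException;

    // Sketch: resize by re-encoding the video only, stream-copying the audio.
    // Filtering and "-c:v copy" can never be combined in ffmpeg.
    public class ResizeKeepAudio {
        public static void main(String[] args) throws IOException, InterruptedException {
            Process p = new ProcessBuilder(
                    "ffmpeg", "-i", "input.mov",
                    "-vf", "scale=320:240",  // the filter forces a video re-encode
                    "-c:v", "libx264",       // placeholder encoder choice
                    "-c:a", "copy",          // audio can still be copied verbatim
                    "output.mov")
                    .inheritIO()             // show ffmpeg's own log output
                    .start();
            System.exit(p.waitFor());
        }
    }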