
Other articles (32)
-
The SPIPmotion queue
28 November 2010
A queue stored in the database.
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table consists of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
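For illustration only, the structure described above corresponds to something like the following SQL; the column types are assumptions, and since the excerpt is truncated, any remaining columns are omitted:

-- Hypothetical sketch of spip_spipmotion_attentes; types are assumed.
CREATE TABLE spip_spipmotion_attentes (
  id_spipmotion_attente BIGINT NOT NULL AUTO_INCREMENT, -- unique id of the task to process
  id_document BIGINT NOT NULL, -- id of the original document to encode
  id_objet BIGINT NOT NULL,    -- id of the object the encoded document is attached to
  objet VARCHAR(25) NOT NULL,  -- type of that object
  PRIMARY KEY (id_spipmotion_attente)
);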
-
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation from users as well as developers, including:
- critiques of existing features and functions
- articles contributed by developers, administrators, content producers and editors
- screenshots to illustrate the above
- translations of existing documentation into other languages
To contribute, register for the project users' mailing (...)
-
Selection of projects using MediaSPIP
2 May 2011
The examples below are representative of specific uses of MediaSPIP for specific projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and, at the national level, among the half-dozen associations of its kind. Its members (...)
On other sites (4733)
-
WebRTC Multi-Stream recording
11 January 2021, by Tim Specht
I'm currently trying to build a WebRTC streaming architecture in which multiple users stream content from their cameras into the same "room", while an SFU/MCU on the server side "records" the incoming video packets, merges them into one image and redistributes the result to the viewers as either RTMP or HLS for added scalability.


After some initial research, Janus Gateway seems like a good fit for this, given its wide adoption across the space and its (seemingly) extensible plugin architecture. Thus, I'm currently trying to figure out what a recommended architecture for my use case would look like.
I looked at the following plugins:


- Janus Streaming
- Janus Recordings
While Janus and the Streaming plugin look like a good start for the broadcasting aspect within the group of casters in the room, I'm trying to piece together how I could combine the different video sources into a single one (split horizontally, for example, when two casters are active) and retransmit the final result as something optimized for broadcast consumption, such as HLS. Some of the ways I could imagine doing that:


- Implement a custom Janus plugin that transcodes the incoming buffers on the gateway itself.
- Forward the incoming packets via RTP to a transcoding server.
  - In this specific case I am not sure how best to implement that. Are the video frames different tracks? Could I stream all of them to the same port and have ffmpeg or something similar take care of the merging for me? (A sketch follows this list.)
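For the second option, here is a minimal sketch of the merging step, assuming each caster's video is forwarded over RTP (for example via the Janus VideoRoom plugin's rtp_forward request) to local ports described by two hypothetical SDP files; the file names, ports and output settings are illustrative assumptions, not a tested setup:

# caster1.sdp / caster2.sdp describe the two forwarded RTP streams
# (e.g. H.264 video on UDP ports 5002 and 5004); audio omitted for brevity.
ffmpeg -protocol_whitelist file,udp,rtp -i caster1.sdp \
       -protocol_whitelist file,udp,rtp -i caster2.sdp \
       -filter_complex "[0:v][1:v]hstack=inputs=2[v]" \
       -map "[v]" -c:v libx264 -preset veryfast -g 50 \
       -f hls -hls_time 4 -hls_list_size 6 stream.m3u8
# hstack requires both inputs to have the same height; insert a scale
# filter before it if the casters send different resolutions.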






-
Delphi Android: deploying AND dynamically loading (external) libraries
13 December 2020, by coban
I am trying to create an application to test the FFmpeg libraries, a kind of media player application, for Android, using Delphi 10.3/10.4.


I am getting some (strange?) behaviors on different machines, depending on the location of the files on the phone/tablet.


The very first question is: which folder is the right one to put (external) libraries in for dynamic/static loading?


I tried 2 locations: '.\assets\internal', which maps to the 'files' folder of the app, and 'library\lib\armeabi-v7a', which maps to the 'bin' folder (if I'm right).


Behavior on the Android 8 mobile phone


When I choose to place the (FFmpeg) libraries in the 'files' folder ('.\assets\internal') and try to load them, 3 of the 7 libraries load successfully while the others do not; it is the same libraries that fail and succeed every time. The ones that load successfully are 'libavutil.so', 'libswresample.so' and 'libswscale.so'.


When I choose to place the libraries in the 'bin' folder ('library\lib\armeabi-v7a'), all libraries load successfully.


Behavior on the Android 4.4.4 tablet


When I put the libraries in the 'files' folder, I get exactly the same behavior as on the Android 8 phone.


The strange thing is: when I choose the 'bin' folder, none of the libraries load.


I did not compile/build the (FFmpeg) libraries myself; I downloaded them.
I tried libraries from different places.
In every attempt I checked that the files actually existed.
I used the 'LoadLibrary' function; after some reading and suggestions on the internet I also tried the 'dlopen' function directly, which in the end seems unnecessary to call directly.
I was not able to debug using D10.4 and the Android 4.4.4 tablet because of the minsdk version. Using D10.3 I am able to try on both machines.


Delphi 10.3: 'Android SDK 25.2.5 32bit', 'jdk1.8.0_60'


Delphi 10.4: 'Android SDK 25.2.5 32bit', 'AdoptOpenJDK jdk-8.0.242.08-hotspot'


Any idea why 3 of the libraries load when they are in the 'files' folder, while all of them load when they are in the 'bin' folder (Android 8)?
And why does nothing load on Android 4.4.4 when the files are in the 'bin' folder, while 3 of them load when they are placed in the 'files' folder?


I've been using the FFmpeg libraries on Windows (almost) without issues, so my question should not be FFmpeg-specific but Delphi + Android + (external) libraries specific, unless this behavior turns out to be FFmpeg-specific.


Both are Samsung machines:


Android 4.4 tablet CPU (using 'SysCheck', which Embarcadero recommends):

family = ARM
processor = ARMv7 processor rev 5 (v7l)
CPU cores = 4
NEON supported = yes
armv7 (ARMv7 compatible architecture) = yes

Android 8 phone CPU:

family = ARM
processor = unknown
CPU cores = 8
NEON = yes
armv7 = ARM
armv7 (ARMv7 compatible architecture) = yes



Edit


Test on Android 10, Redmi Note 10 Lite


None of the library files load from the 'files' folder ('.\assets\internal'). All of them load successfully from the 'bin' folder ('library\lib\armeabi-v7a').


I need a reasonable explanation for this. It looks like Android-specific behavior.


Edit 2


One of the reasons seems to be that some of those FFmpeg libraries load other FFmpeg libraries. Even if they all sit in the same directory, when that directory is outside the application's own library folder and not a (library) folder the OS searches by default, they cannot find/load each other.


This looks like the explanation for why only some of them load from the 'files' folder ('.\assets\internal'): the three that succeed are precisely the ones at the bottom of FFmpeg's dependency chain.
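If that is the cause, loading the libraries by absolute path, in dependency order, should let each library resolve against the ones already loaded into the process. Below is a minimal sketch in Delphi, assuming a standard FFmpeg library set deployed to the app-internal 'files' folder (TPath.GetDocumentsPath maps to the '.\assets\internal' deployment target on Android); note that whether already-loaded libraries satisfy a dependency lookup varies between Android linker versions, which may be part of the difference between Android 4.4, 8 and 10:

uses
  System.SysUtils, System.IOUtils;

procedure LoadFFmpegLibraries;
const
  // Assumed library set, ordered so dependencies load first
  // (libavutil has no FFmpeg dependencies, everything else needs it).
  LibNames: array [0..6] of string = (
    'libavutil.so', 'libswresample.so', 'libswscale.so',
    'libavcodec.so', 'libavformat.so', 'libavfilter.so',
    'libavdevice.so');
var
  LibName, FullPath: string;
  Handle: HMODULE;
begin
  for LibName in LibNames do
  begin
    // Build an absolute path into the app-internal 'files' directory,
    // i.e. the '.\assets\internal' deployment target.
    FullPath := TPath.Combine(TPath.GetDocumentsPath, LibName);
    Handle := LoadLibrary(PChar(FullPath));
    if Handle = 0 then
      raise Exception.CreateFmt('Failed to load %s', [FullPath]);
  end;
end;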


-
Is it possible to send a temporary slate (image or video) into a running Azure Live Event RTMP stream?
15 November 2020, by Brian Frisch
I'm currently building a video streaming app which leverages Azure Media Services Live Events.


It consists of:


- a mobile app that can stream live video,
- a web client that plays the live event video,
- a producer screen with controls to start and stop the web client's access to the video,
- a server that handles various operations around the entire system.
It's working very well, but I would like to add a feature that lets the producer add some elegance to the experience. I'm therefore trying to work out how the producer could switch the incoming source of the stream to a pre-recorded video, or even a still image, at any point during the recording, and also switch back to the live video. A kill switch of some kind that would cover waiting time if there are technical difficulties on the set; it could also be used for pre-/post-roll branding slates when introing and outroing a video event. I would like this source switch to be embedded in the video stream, also so that it is possible to get it into the final video product if I need it in an archive for later playback.


I'm trying to do it in a way where the producer can set a timestamp for when the video override should come in and when it should stop. Then I want my server to respond to these timestamps and send the instructions over RTMP to the Azure Live Event. Is it possible to send such an instruction ("Hey, play this video bit / show this image in the stream for x seconds") in the RTMP protocol? I've tried to figure it out, and I've read about SCTE-35 markers and such, but I have not been able to find any examples of how to do it, so I'm a bit stuck.


My plan B is to make it possible to stream an image from the mobile application that already handles the live video stream, but I'm initially targeting an architecture where the mobile app is unaware of anything other than live streaming; this override switch should preferably be handled by the server, which is a Firebase Functions setup.


If you are able to see other ways of doing it, I'm all ears.


I've already tried to build an ffmpeg method that listens for updates to the producer-set state and then streams an image to the same RTMP URL that the mobile app sends its video to. But it only works when the live video isn't already streaming: it seems I cannot take over an RTMP stream that is already running.
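For reference, a slate push of that kind boils down to something like the sketch below, with a placeholder ingest URL and stream key. An RTMP ingest normally accepts only one publisher per stream key at a time, which matches the takeover problem described above; the switch would therefore have to happen inside the one encoder process that owns the ingest connection, rather than in a second process pushing to the same URL.

# Push a looping still image as a live RTMP source, with silent AAC audio
# so the feed carries both tracks; the URL and key are placeholders.
ffmpeg -re -loop 1 -i slate.png \
       -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 \
       -c:v libx264 -preset veryfast -pix_fmt yuv420p -r 25 -g 50 \
       -c:a aac -b:a 128k \
       -f flv rtmp://example.channel.media.azure.net:1935/live/streamkey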