
Media (1)
-
Collections - Quick creation form
19 February 2013
Updated: February 2013
Language: French
Type: Image
Other articles (33)
-
Contribute to translation
13 April 2011. You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, allowing it to spread to new linguistic communities.
To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...) -
The farm's regular Cron tasks
1 December 2010. Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of all the instances of the shared hosting setup on a regular basis. Combined with a system Cron on the central site of the shared hosting, it generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...) -
(De)Activating features (plugins)
18 February 2011. To manage adding and removing extra features (plugins), MediaSPIP uses SVP as of version 0.2.
SVP makes it easy to activate plugins from the MediaSPIP configuration area.
To access it, go to the configuration area and open the "Plugin management" page.
MediaSPIP ships by default with the full set of so-called "compatible" plugins; they have been tested and integrated to work perfectly with each (...)
On other sites (7950)
-
Scheduling an RTMP stream remotely - using an intermediary server for storing + sending video stream packets before deploying to streaming service
25 February 2020, by hedgehog90. This is more of a curiosity than something I really need, but I thought I'd ask anyway.
If I just wanted to set up a normal livestream, I would use something like OBS: capture my input, encode it into something manageable for my network, and send it to a remote RTMP server.
But I'd like to know if it's possible to put an intermediary remote server between myself and the streaming service's server, basically so I can manage the stream (if it's a playlist) and schedule when it is sent for broadcast on the streaming service.
It's also worth noting that there may be limited bandwidth on the client side (my computer). Assuming the intermediary has greater bandwidth, this approach should eliminate the common issue of dropped frames while streaming.
Now for an example:
To make it simpler, instead of using OBS plus capture hardware, I'm using a video file.
I want to encode that video with ffmpeg in the same way OBS does when streaming to a remote server over the RTMP protocol. I then upload that data, at my own rate, to a remote server that I control (running Ubuntu) for storage and eventual deployment. Importantly, I do not want or require any video processing on the intermediary server, as the data has already been encoded for deployment on the client side. The server simply manages and stores the data.
A day later, I want to run a script on my intermediary server that sends the processed stream data, packet by packet, to the target streaming server.
I'm an experienced coder with plenty of experience in data handling and video encoding. It should be simple, but I'm not all that clued up on how video streaming over RTMP works.
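For what it's worth, the two stages described above can be sketched with plain ffmpeg commands. This is only a sketch under assumptions: the file names and the RTMP URL are placeholders, and the encoder settings are typical OBS-like values, not a verified match for any particular service.

```shell
#!/bin/sh
# Sketch of the two-stage workflow. Input file, staging path, and
# RTMP URL below are placeholders (assumptions).

STAGED="staged.flv"
RTMP_URL="rtmp://live.example.com/app/STREAM_KEY"

# Stage 1, on the client: encode once into FLV (the container RTMP carries),
# uploaded to the intermediary at whatever speed the local connection allows.
encode_cmd="ffmpeg -i input.mp4 -c:v libx264 -preset veryfast -b:v 2500k -c:a aac -b:a 160k -f flv $STAGED"

# Stage 2, on the intermediary at the scheduled time: read the staged file
# at its native rate (-re) and remux it to the target without re-encoding.
send_cmd="ffmpeg -re -i $STAGED -c copy -f flv $RTMP_URL"

echo "$encode_cmd"
echo "$send_cmd"
```

Because stage 2 uses `-c copy`, the intermediary never re-encodes: it only paces the already-encoded packets out at real-time speed, which is what makes both the scheduling and the bandwidth decoupling possible.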
-
Can we stream live audio from an Android phone using FFmpeg?
17 August 2023, by Uday. I'm using the ffmpeg_kit_flutter package to stream data to the RTSP server in Flutter.


iOS: working


Android: not working


Command used:


'ffmpeg -f avfoundation -i ":0" -acodec aac -f rtsp -rtsp_transport tcp "$Url"'



When I ran the "ffmpeg -devices" command on Android, it returned the following response, from which I learned that Android doesn't support avfoundation but does have android_camera.
Does this android_camera device support audio too?

Command: 'ffmpeg -devices'


Response:


I/flutter (10620): logs: libavutil 57. 28.100 / 57. 28.100
I/flutter (10620): logs: libavcodec 59. 37.100 / 59. 37.100
I/flutter (10620): logs: libavformat 59. 27.100 / 59. 27.100
I/flutter (10620): logs: libavdevice 59. 7.100 / 59. 7.100
I/flutter (10620): logs: libavfilter 8. 44.100 / 8. 44.100
I/flutter (10620): logs: libswscale 6. 7.100 / 6. 7.100
I/flutter (10620): logs: libswresample 4. 7.100 / 4. 7.100
I/flutter (10620): logs: Devices:
I/flutter (10620): D. = Demuxing supported
I/flutter (10620): .E = Muxing supported
I/flutter (10620): --
I/flutter (10620): logs: D android_camera 
I/flutter (10620): logs: D lavfi 
I/flutter (10620): logs: DE video4linux2,v4l2



Commands I tried on Android:


FFmpegKit.execute('-y -f android_camera -i 0:1 -r 30 -c:a aac -f rtsp -rtsp_transport tcp "$Url"');

FFmpegKit.execute('-y -f android_camera -i 0:1 -r 30 -c:a libmp3lame -qscale:a 2 "/storage/emulated/0/Download/androidvideo.mp3"');

FFmpegKit.execute('-y -f android_camera -i 0:0 -r 30 -c:a wavpack -b:a 64k "/storage/emulated/0/Download/androidvideo.wav"');



The following command records video, but there is no audio in it.


FFmpegKit.execute('-video_size hd720 -f android_camera -camera_index 1 -i anything -r 10 -t 00:00:15 "$dir/androidvideo.mp4"');
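One way to check the audio question directly is to ask ffmpeg which options the android_camera input device accepts: if nothing audio-related is listed, the device is video-only and the microphone would have to be captured through the platform's own APIs and muxed separately. A sketch (the command string is built here only so it can be shown; it would be passed to FFmpegKit.execute the same way as the commands above):

```shell
#!/bin/sh
# Print the private options of the android_camera input device.
# Input devices share the demuxer help namespace in ffmpeg.
probe_cmd="ffmpeg -hide_banner -h demuxer=android_camera"
echo "$probe_cmd"
```

If the printed option list only covers things like video_size and camera_index, that confirms the device exposes no audio stream.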



-
Cannot use FFmpeg in Xcode iOS Project (file .h not found)
19 March 2016, by BlackBox. I followed this guide almost step by step.
Almost, because I downloaded, as a user suggested, the prebuilt ffmpeg iOS library from here.
I followed from "Linking static libraries in Xcode" onwards, but I still cannot import the ffmpeg header files. (So I do have the .a files.)
For example:
#include "avformat.h"
// or
#import "libavformat/avformat.h"
// or
#import <libavformat/avformat.h>
Nothing I use works.
I should specify that those .a files are currently in my project directory; indeed, if I import a .a file, the compiler doesn't complain that it isn't found, but when compiling it complains about UTF-8 stuff, because .a files are object libraries and cannot be imported that way.
I also set the Header Search Paths for the project as suggested, and checked the config.log file, but nothing.
Also, I see libraries are missing from every FFmpeg project example I was able to find on GitHub.
Any ideas?
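A frequent cause of this exact "file not found" error is a Header Search Paths entry that points at the wrong level: it must point at the directory that contains libavformat/, libavcodec/, and so on, not at those directories themselves. Assuming the prebuilt package was unpacked into an ffmpeg/ folder inside the project (an assumed layout; adjust the paths to yours), the settings and includes would look roughly like this:

```c
// Xcode Build Settings (assumed layout, adjust to your project):
//   Header Search Paths:  $(PROJECT_DIR)/ffmpeg/include
//   Library Search Paths: $(PROJECT_DIR)/ffmpeg/lib
//   Other Linker Flags:   -lavformat -lavcodec -lavutil -lswscale -lswresample

// With that in place, the headers resolve relative to the search path:
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
```

The .a files themselves are linked (added under "Link Binary With Libraries" or via linker flags), never #import-ed; only the headers are included in source files.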