
Other articles (60)
-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors can edit their own information on the authors page
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, ask your MediaSPIP administrator to find out.
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows media playback on major mobile platforms with the above (...)
On other sites (13073)
-
what is AV_SAMPLE_FMT_FLT
21 December 2020, by Golu
SwrContext *swr_ctx = swr_alloc_set_opts(NULL,
 AV_CH_LAYOUT_STEREO,
 AV_SAMPLE_FMT_FLT,
 sample_rate,
 pCodecParameters->channel_layout,
 pCodecParameters->format,
 pCodecParameters->sample_rate,
 0,
 NULL);



What exactly is AV_SAMPLE_FMT_FLT? I have already read the docs, but I want to know what a float layout means in the context of audio, and what the binary audio data actually looks like in that format.


-
How do I redirect the output of SpeechSynthesizer to a Process of ffmpeg
27 September 2020, by TheOneAndOnlyMrX
I am trying to have a SpeechSynthesizer generate some audio data, pipe it into an FFmpeg Process, and have FFmpeg save the data to a file (output.wav). Eventually, the audio data will be used for something else, which is why I am using FFmpeg.


using (MemoryStream voiceStream = new MemoryStream())
using (Process ffmpeg = new Process())
{
    SpeechSynthesizer synth = new SpeechSynthesizer();

    int samplesPerSecond = 48000;
    int bitsPerSample = 8;
    int channelCount = 2;
    int averageBytesPerSecond = samplesPerSecond * (bitsPerSample / 8) * channelCount;
    int blockAlign = (bitsPerSample / 8) * channelCount;
    byte[] formatSpecificData = new byte[0];

    synth.SetOutputToAudioStream(
        voiceStream,
        new System.Speech.AudioFormat.SpeechAudioFormatInfo(
            System.Speech.AudioFormat.EncodingFormat.Pcm,
            samplesPerSecond,
            bitsPerSample,
            channelCount,
            averageBytesPerSecond,
            blockAlign,
            formatSpecificData));

    synth.Speak("Hello there");
    synth.SetOutputToNull();

    ffmpeg.StartInfo = new ProcessStartInfo
    {
        FileName = "ffmpeg",
        Arguments = "-y -f u8 -ac 2 -ar 48000 -i pipe:0 out.wav",
        UseShellExecute = false,
        RedirectStandardOutput = true,
        RedirectStandardInput = true
    };

    ffmpeg.Start();

    using (Stream ffmpegIn = ffmpeg.StandardInput.BaseStream)
    {
        voiceStream.CopyTo(ffmpegIn);
        ffmpegIn.FlushAsync();
    }
}



When running the program, FFmpeg reports that the input stream contains no data and produces an empty file.
I believe I am not interfacing properly with the Process object; however, the problem might also be my incorrect specification of the audio stream, since I do not know much about audio.


-
How to load a custom Java module into Wowza Streaming Engine?
27 October 2018, by kw3rti
I've followed the tutorial below step by step; however, the module I've created does not appear to load or execute, as I'm not seeing any log entries related to the getLogger calls in Wowza Streaming Engine. More specifically, I have created a new Wowza project containing a new module (see code below). Eclipse has then created a jar file in the lib folder of the install directory. I have added the module to a live application on the streaming server. I have also edited the Application.xml file to include the new module.
To hopefully run the module I've written, I am streaming an mp4 file using ffmpeg (according to the documentation here) to the streaming engine (via the live application), which I can see in the test players. My understanding was that this would trigger at least one of the event listeners in the module. However, nothing appears in the logs. The only entries related to the stream that I can see are shown below.
I’ve been trying to debug what’s going wrong for quite a while now, so I’d appreciate any suggestions of what might fix the issue.
https://www.wowza.com/docs/How-to-extend-Wowza-Streaming-Engine-using-Java
public class GCStreamModule extends ModuleBase {

    public void onAppStart(IApplicationInstance appInstance) {
        String fullname = appInstance.getApplication().getName() + "/" + appInstance.getName();
        getLogger().info("onAppStart: " + fullname);
    }

    public void onAppStop(IApplicationInstance appInstance) {
        String fullname = appInstance.getApplication().getName() + "/" + appInstance.getName();
        getLogger().info("onAppStop: " + fullname);
    }

    public void onConnect(IClient client, RequestFunction function, AMFDataList params) {
        getLogger().info("onConnect: " + client.getClientId());
    }

    public void onConnectAccept(IClient client) {
        getLogger().info("onConnectAccept: " + client.getClientId());
    }

    public void onConnectReject(IClient client) {
        getLogger().info("onConnectReject: " + client.getClientId());
    }

    public void onDisconnect(IClient client) {
        getLogger().info("onDisconnect: " + client.getClientId());
    }

    public void onStreamCreate(IMediaStream stream) {
        getLogger().info("onStreamCreate");
    }

    public void onMediaStreamCreate(IMediaStream stream) {
        getLogger().info("onMediaStreamCreate: " + stream.getSrc());
    }
}