
Other articles (31)
-
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP's development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users. -
MediaSPIP Player: potential problems
22 February 2011
The player does not work on Internet Explorer
On Internet Explorer (8 and 7 at least), the plugin uses the Flowplayer Flash player to play video and audio. If the player does not seem to work, the cause may lie in the configuration of Apache's mod_deflate module.
If the configuration of this Apache module contains a line resembling the following, try removing it or commenting it out to see whether the player then works correctly: (...) -
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (4465)
-
Error in converting audio file format from ogg to wav [on hold]
9 June 2014, by Sumit Bisht
I am trying to convert an Ogg file, created with WebRTC (HTML5 getUserMedia content recorded on Firefox) and then transferred to and decoded on the server, into a WAV file with ffmpeg, but I get this error on the command line when converting:
$ ffmpeg -i 2014-6-5_16-17-54.ogg res1.wav
ffmpeg version 2.0.1 Copyright (c) 2000-2013 the FFmpeg developers
built on May 1 2014 13:12:12 with gcc 4.4.7 (GCC) 20120313 (Red Hat 4.4.7-4)
configuration: --enable-gpl --enable-version3 --enable-shared --enable-nonfree --enable-postproc --enable-libfaac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid
libavutil 52. 38.100 / 52. 38.100
libavcodec 55. 18.102 / 55. 18.102
libavformat 55. 12.100 / 55. 12.100
libavdevice 55. 3.100 / 55. 3.100
libavfilter 3. 79.101 / 3. 79.101
libswscale 2. 3.100 / 2. 3.100
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 3.100 / 52. 3.100
Guessed Channel Layout for Input Stream #0.0 : mono
Input #0, ogg, from '2014-6-5_16-17-54.ogg':
Duration: 00:00:01.84, start: 0.000000, bitrate: 18 kb/s
Stream #0:0: Audio: opus, 48000 Hz, mono
Metadata:
ENCODER : Mozilla29.0.1
[graph 0 input from stream 0:0 @ 0x18dca20] Invalid sample format (null)
Error opening filters!
However, I am able to play the file on the server, and using the same command I am able to convert .ogg files generated elsewhere. What might I be missing?
Edit:
Here is the source code that is used to write the file.
1) During startup, use the methods of the getUserMedia API:
navigator.getUserMedia({
audio: true,
video: false
}, function(stream) {
audioStream = RecordRTC(stream, {
bufferSize: 16384
});
audioStream.startRecording();
});
2) During stopping of the recording, extracting the recorded information:
function(audioDataURL) {
var audioFile = {};
audioFile = {
contents: audioDataURL
};
}
On the server end, the following code creates a file from this data:
dataURL = dataURL.split(',').pop(); // dataURL is the audioDataURL as defined above
fileBuffer = new Buffer(dataURL, 'base64');
fs.writeFileSync(filePath, fileBuffer);
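The server-side decode step above can be sanity-checked in isolation. The sketch below (the 'data:audio/ogg' header is illustrative, not necessarily the exact one RecordRTC emits) shows why split(',').pop() yields the base64 payload; it uses Buffer.from, the non-deprecated replacement for new Buffer:

```javascript
// Build a data URL around known bytes; "OggS" is the 4-byte capture pattern
// that starts every Ogg page, so it makes a handy sentinel.
const dataURL = 'data:audio/ogg;base64,' + Buffer.from('OggS').toString('base64');

// Everything after the first comma is the base64 payload.
const payload = dataURL.split(',').pop();

// Buffer.from(payload, 'base64') replaces the deprecated new Buffer(payload, 'base64').
const fileBuffer = Buffer.from(payload, 'base64');

console.log(fileBuffer.toString('utf8')); // "OggS"
```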
No audio output using FFmpeg
26 March 2022, by John Mergene Arellano
I am having a problem with the live stream output. I am streaming from a mobile app to a Node.js server and on to RTMP. The video output of the live stream works, but there is no audio output.


From the client side, I am sending a stream using the Socket.IO library. I captured the video and audio using the getUserMedia API.


navigator.mediaDevices.getUserMedia(constraints).then((stream) => {
 window.videoStream = video.srcObject = stream;
 let mediaRecorder = new MediaRecorder(stream, {
 videoBitsPerSecond : 3 * 1024 * 1024
 });
 mediaRecorder.addEventListener('dataavailable', (e) => {
 let data = e.data;
 socket.emit('live', data);
 });
 mediaRecorder.start(1000);
});
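One thing worth checking when the published stream is silent (an assumption on my part, not something confirmed above): MediaRecorder only records an audio track if the chosen container/codec combination carries one. Passing an explicit mimeType that names both codecs, plus an audio bitrate, makes that intent visible; 'video/webm;codecs=vp8,opus' is a commonly supported type that can be checked with MediaRecorder.isTypeSupported:

```javascript
// Hypothetical recorder options: the mimeType names both a video (VP8) and an
// audio (Opus) codec, and an audio bitrate is set alongside the video one.
const recorderOptions = {
  mimeType: 'video/webm;codecs=vp8,opus',
  videoBitsPerSecond: 3 * 1024 * 1024,
  audioBitsPerSecond: 64 * 1024
};

// In the browser, guard with MediaRecorder.isTypeSupported(recorderOptions.mimeType)
// before constructing: new MediaRecorder(stream, recorderOptions)
console.log(recorderOptions.mimeType); // "video/webm;codecs=vp8,opus"
```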



My server then receives the stream and writes it to FFmpeg's stdin.


client.on('live', (stream)=>{
 if(ffmpeg)
 ffmpeg.stdin.write(stream);
});



I tried watching the live video in VLC media player. There is a 5-second delay and no audio output.


Please see below for the FFmpeg options I used:


ffmpeg = this.CHILD_PROCESS.spawn("ffmpeg", [
 '-f',
 'lavfi',
 '-i', 'anullsrc',
 '-i','-',
 '-c:v', 'libx264', '-preset', 'veryfast', '-tune', 'zerolatency',
 '-c:a', 'aac', '-ar', '44100', '-b:a', '64k',
 '-y', //force to overwrite
 '-use_wallclock_as_timestamps', '1', // used for audio sync
 '-async', '1', // used for audio sync
 '-bufsize', '1000',
 '-f',
 'flv',
 `rtmp://127.0.0.1:1935/live/stream` ]);
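A possible culprit, offered as an assumption rather than a confirmed diagnosis: with two inputs (the anullsrc silence generator and the piped WebM), ffmpeg's default stream selection chooses one audio stream for the FLV output, and it can choose the silent anullsrc one. Explicitly mapping both output streams from the pipe removes that ambiguity; also note that -use_wallclock_as_timestamps is an input option and belongs before -i. This is only a sketch of the argument list, not a tested command:

```javascript
// Illustrative argument list: drop anullsrc entirely and map video and audio
// from input 0 (the WebM chunks arriving on stdin).
const ffmpegArgs = [
  '-use_wallclock_as_timestamps', '1', // input option: must precede the -i it applies to
  '-i', '-',                           // input 0: stdin
  '-map', '0:v:0',                     // output video = first video stream of input 0
  '-map', '0:a:0',                     // output audio = first audio stream of input 0
  '-c:v', 'libx264', '-preset', 'veryfast', '-tune', 'zerolatency',
  '-c:a', 'aac', '-ar', '44100', '-b:a', '64k',
  '-f', 'flv',
  'rtmp://127.0.0.1:1935/live/stream'
];
```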



What is wrong with my setup? I need to fix the command so that the live stream outputs both video and audio.


I tried streaming to YouTube RTMP but still got no audio. I am expecting an output of both video and audio from the getUserMedia API.


-
ffmpeg legitimate decoding errors
20 July 2017, by Gideon Oduro
My issue is as follows: I'm sending H.264-encoded video, captured with the help of WebRTC, over a WebSocket. The idea is to perform server-side analysis and object tracking.
navigator.mediaDevices.getUserMedia(constraint).then((stream) => {
isVideoElement(target, stream)
mediaRecorder = recorder(stream, {mimeType: 'video/webm; codecs=H264'})
mediaRecorder.ondataavailable = (blob) => socket.send(blob.data)
mediaRecorder.start('2000');
})
On the server side, the data is received as a ByteBuffer:
override fun handleBinaryMessage(session: WebSocketSession, msg: BinaryMessage) {
analysis(msg.payload)
}
I'm using the following resources (resource_1, resource_2) to try to convert my ByteBuffer into an OpenCV frame:
fun startPreview(data: ByteBuffer) {
avcodec_register_all()
val pack = avcodec.AVPacket()
pack.data(BytePointer(data))
avcodec.av_init_packet(pack)
val videoData = BytePointer(data)
val codec = avcodec.avcodec_find_decoder(avcodec.AV_CODEC_ID_H264)
val videoCodecContext = avcodec.avcodec_alloc_context3(codec)
videoCodecContext.width(1280)
videoCodecContext.height(720)
videoCodecContext.pix_fmt(avutil.AV_PIX_FMT_YUV420P)
videoCodecContext.codec_type(avutil.AVMEDIA_TYPE_VIDEO)
videoCodecContext.extradata(videoData)
videoCodecContext.extradata_size(data.capacity())
videoCodecContext.flags2(videoCodecContext.flags2() or avcodec.CODEC_FLAG2_CHUNKS)
avcodec.avcodec_open2(videoCodecContext, codec, null as PointerPointer<*>?)
val decodedFrameLength = avcodec.avcodec_receive_frame(videoCodecContext, avutil.AVFrame())
println(decodedFrameLength)
}
I'm then receiving a decodedFrameLength of -35, indicating a decoding error, and can't figure out how to proceed from here.