
Advanced search
Media (1)
-
Sintel MP4 Surround 5.1 Full
13 May 2011
Updated: February 2012
Language: English
Type: Video
Other articles (42)
-
The SPIPmotion queue
28 November 2010
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table consists of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...)
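Purely as an illustration of that structure (SPIPmotion itself is a SPIP plugin whose real table lives in the site's MySQL database; the column types and sample values below are assumptions, only the field names come from the description above), the queue table could be sketched like this:

import sqlite3

# Illustrative sketch only: approximates the spip_spipmotion_attentes table
# described above. Column types and sample values are assumptions, not
# SPIPmotion's actual schema.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE spip_spipmotion_attentes (
        id_spipmotion_attente INTEGER PRIMARY KEY,  -- unique id of the task to process
        id_document           INTEGER,              -- id of the original document to encode
        id_objet              INTEGER,              -- id of the object the encoded document is attached to
        objet                 TEXT                  -- type of that object
    )
""")

# Queuing an encoding job is then a single INSERT (hypothetical values):
db.execute(
    "INSERT INTO spip_spipmotion_attentes (id_document, id_objet, objet) VALUES (?, ?, ?)",
    (42, 7, "article"),
)
db.commit()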
-
Submit bugs and patches
13 April 2011
Unfortunately, software is never perfect.
If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps taken that resulted in the problem; and a link to the site/page in question.
If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
You may also (...)
-
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation by users as well as developers, including: critique of existing features and functions; articles contributed by developers, administrators, content producers and editors; screenshots to illustrate the above; translations of existing documentation into other languages.
To contribute, register for the project users’ mailing (...)
On other sites (4157)
-
Extended client ownership of MediaCodec encoder output buffers for RTMP streaming
13 February 2014, by dbro
Background:
I've connected Android's MediaCodec to FFmpeg for muxing a variety of formats not supported by MediaMuxer, including rtmp:// output via a .flv container. Such streaming muxers require longer, unpredictable ownership of MediaCodec's output buffers, as they may perform networking I/O on any packet processing step. For my video stream, I'm using MediaCodec configured for Surface input. To decouple muxing from encoding, I queue MediaCodec's ByteBuffer output buffers to my muxer via a Handler.
All works splendidly if I mux the .flv output to file, rather than to the rtmp endpoint.
Problem:
When muxing to an rtmp://... endpoint I notice my streaming application begins to block on calls to eglSwapBuffers(mEGLDisplay, mEncodingEGLSurface) at dequeueOutputBuffer() once I'm retaining even a few MediaCodec output buffers in my muxing queue, as MediaCodec seems to be locked to only 4 output buffers.
Any tricks to avoid copying all encoder output returned by MediaCodec#dequeueOutputBuffers and immediately calling releaseOutputBuffer(...)?
The full source of my project is available on Github. Specifically, see:
- AndroidEncoder.java: Abstract Encoder class with shared behavior between Audio and Video encoders: mainly drainEncoder(). Writes data to a Muxer instance.
- FFmpegMuxer.java: Implements Muxer.
- CameraEncoder.java: Sends camera frames to an AndroidEncoder subclass configured for Video encoding.
Systrace
Here's some systrace output streaming 720p @ 2Mbps video to Zencoder.
Solved
Copying then releasing the MediaCodec encoder output ByteBuffers as soon as they're available solves the issue without significantly affecting performance. I recycle the ByteBuffer copies in an ArrayDeque<ByteBuffer> for each muxer track, which limits the number of allocations.
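The recycling idea is not tied to Android: keep a small pool of reusable buffers, copy the encoder output into a pooled buffer right away so the codec's own buffer can be released immediately, and return the copy to the pool once the muxer has written the packet. A rough stand-in for that per-track ArrayDeque, sketched here in plain Python with illustrative names (not code from the project):

from collections import deque

class BufferPool:
    """Reuses byte buffers so copying encoder output does not allocate every time."""

    def __init__(self):
        self._free = deque()  # stands in for the per-track ArrayDeque of ByteBuffer copies

    def acquire(self, size):
        # Hand out a recycled buffer that is large enough, or allocate a new one.
        for _ in range(len(self._free)):
            buf = self._free.popleft()
            if len(buf) >= size:
                return buf
            self._free.append(buf)
        return bytearray(size)

    def release(self, buf):
        # Return the copy to the pool once the muxer has written the packet.
        self._free.append(buf)

pool = BufferPool()

def copy_encoder_output(packet: bytes) -> bytearray:
    # Copy immediately so the codec's output buffer can be released right away;
    # track the real packet length separately, since pooled buffers may be larger.
    buf = pool.acquire(len(packet))
    buf[:len(packet)] = packet
    return buf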
-
Stream microphone from client browser to remote server and pass audio in real time to ffmpeg to combine with a second video source
4 May 2021, by fakeguybrushthreepwood
As a beginner at working with these kinds of real-time streaming services, I've spent hours trying to work out how this is possible, but can't seem to work out how precisely I'd go about it.


I'm prototyping a personal basic web app that does the following :


- In a web browser, the web application has a button that says 'Stream Microphone' - when pressed it streams the audio from the user's microphone (the user obviously has to consent to give permission to send their microphone audio) through to the server, which I was presuming would be running node.js (no specific reason at this point, just thought this is how I'd go about doing it).

- The server receives the audio close enough to real time somehow (not sure how I'd do this).

- I can then run ffmpeg on the command line, take the audio coming in in real time, and add it as the sound to a video file (let's just say I'm going to play testmovie.mp4) that I want to play.

I've looked at various solutions - such as maybe using WebRTC, RTP/RTSP, piping audio into ffmpeg, GStreamer, Kurento, Flashphoner and/or Wowza - but somehow they look overly complicated and usually seem to focus on video along with audio. I just need to work with audio.
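For the ffmpeg half of this, one workable pattern is to spawn ffmpeg on the server and pipe the incoming audio chunks into its stdin while taking the video track from the existing file. The sketch below uses Python only to illustrate the piping (the question suggests node.js); the file names, the assumption that the browser sends a streamable container such as WebM/Opus from MediaRecorder, and the output settings are all placeholders, not a tested solution for this exact stack:

import subprocess

# Sketch: combine audio arriving in (near) real time with the video track of
# an existing file. Assumes the browser sends WebM/Opus chunks (typical for
# MediaRecorder); "testmovie.mp4" and "combined.mp4" are placeholders.
ffmpeg = subprocess.Popen(
    [
        "ffmpeg",
        "-re", "-i", "testmovie.mp4",   # video source, read at its native rate
        "-i", "pipe:0",                 # audio arriving on stdin
        "-map", "0:v", "-map", "1:a",   # video from the file, audio from the pipe
        "-c:v", "copy", "-c:a", "aac",
        "-shortest",
        "combined.mp4",
    ],
    stdin=subprocess.PIPE,
)

def on_audio_chunk(chunk: bytes) -> None:
    # Called by whatever transport delivers the microphone audio
    # (a WebSocket message handler, a WebRTC track sink, etc.).
    ffmpeg.stdin.write(chunk)
    ffmpeg.stdin.flush()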


-
TypeError: parameter 'url' is missing a type annotation in callback 'play' + ffmpeg
10 December 2022, by morichh
I can't understand why the Python interpreter doesn't like the fact that I use url in the last line:

@client.tree.command()
async def play(interaction: Interaction, url = str):
    global voice
    channel = interaction.user.voice.channel
    voice = get(client.voice_clients, Guild=interaction.guild)
    if voice and voice.is_connected():
        await voice.move_to(channel)
        await interaction.response.send_message(f"*1*")
    else:
        voice = await channel.connect()
        await interaction.response.send_message(f"*2*")
    voice.play(discord.FFmpegPCMAudio(executable="C:/ffmpeg/bin/ffmpeg.exe", source = url))

By the way, ffmpeg also does not work.
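The TypeError in the title comes from the signature itself: url = str assigns a default value (the str type object) rather than a type annotation, and discord.py's application commands require an annotation for every parameter, so the parameter should be written url: str. A lightly corrected sketch of the same command (it also lowercases the guild keyword passed to discord.utils.get, which matches on attribute names, and assumes client is the existing bot/client with a command tree, as in the question):

import discord
from discord import Interaction
from discord.utils import get

@client.tree.command()
async def play(interaction: Interaction, url: str):  # annotation, not "url = str"
    global voice
    channel = interaction.user.voice.channel
    # discord.utils.get matches attribute names, so the keyword is lowercase "guild"
    voice = get(client.voice_clients, guild=interaction.guild)
    if voice and voice.is_connected():
        await voice.move_to(channel)
        await interaction.response.send_message("*1*")
    else:
        voice = await channel.connect()
        await interaction.response.send_message("*2*")
    voice.play(discord.FFmpegPCMAudio(executable="C:/ffmpeg/bin/ffmpeg.exe", source=url))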