
Other articles (26)
-
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded in OGV and WebM (supported by HTML5) and in MP4 (supported by Flash).
Audio files are encoded in Ogg (supported by HTML5) and in MP3 (supported by both HTML5 and Flash).
Where possible, text is analyzed to extract the data needed for search-engine indexing, and the document is then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
-
Supporting all media types
13 April 2011
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
-
Managing the farm
2 March 2010
The farm as a whole is managed by "super admins".
Certain settings can be adjusted to meet the needs of the different channels.
Initially, it uses the "Gestion de mutualisation" plugin.
On other sites (3807)
-
Discord FFMPEG audio won't play from yt-dlp
19 March 2023, by user21236822
My question is this: why isn't my bot playing audio?


I want the bot to join, play audio from the queue, then disconnect without downloading an MP3 file.


I tried using youtube-dl, but I switched to the yt-dlp library after getting errors I couldn't fix.
I am running on Windows 10 locally. All my libraries are up to date.


Here are my ydl_opts and FFMPEG_OPTIONS:


ydl_opts = {
    'format': 'bestaudio/best',
    'postprocessors': [{
        'key': 'FFmpegExtractAudio',
        'preferredcodec': 'mp3',
        'preferredquality': '192',
    }],
}

FFMPEG_OPTIONS = {
    'before_options': '-reconnect 1 -reconnect_streamed 1 -reconnect_delay_max 5',
    'options': '-vn'
}



Here is where I believe the problem is.


async def play():
    print("Play Called")
    musicPlay()
    # Get message object from initial request
    message = ytLinkQue.get()
    print(f"Message object recieved: {message}")
    voiceChannel = message.author.voice.channel
    vc = await voiceChannel.connect()
    songsPlayed = 0

    while not ytLinkQue.empty():
        # Get current song
        currentSong = ytLinkQue.get()[0]
        print(f"Current song: {currentSong}")

        # Get song from Youtube
        with yt_dlp.YoutubeDL(ydl_opts) as ydl:
            # song = ydl.download(currentSong)
            info = ydl.extract_info(currentSong, download=False)
            song = info['formats'][0]['url']

        # Play Song
        vc.play(discord.FFmpegPCMAudio(song, **FFMPEG_OPTIONS), after=lambda e: print('Song done'))

        # Wait until the song has finished playing
        while vc.is_playing():
            print("playing rn")
            await asyncio.sleep(1)

    await vc.disconnect()
    musicStop()
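As an aside, since extract_info is called with download=False, the FFmpegExtractAudio postprocessor in ydl_opts never runs, and info['formats'][0] is usually the lowest-quality entry (often without usable audio). A minimal sketch of how the direct audio URL could be picked instead (illustrative only; get_audio_url is a hypothetical helper, not part of the asker's code):

import yt_dlp

def get_audio_url(page_url: str) -> str:
    # Let yt-dlp's format selector pick an audio stream; with download=False
    # nothing is written to disk and no postprocessor runs.
    opts = {'format': 'bestaudio/best', 'quiet': True}
    with yt_dlp.YoutubeDL(opts) as ydl:
        info = ydl.extract_info(page_url, download=False)
        # When a single format is selected, its direct URL is exposed on the
        # top-level info dict.
        return info['url']

The returned URL could then be handed to discord.FFmpegPCMAudio(song, **FFMPEG_OPTIONS) exactly as in the loop above.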



When play() is called, here is the terminal output, with my annotations shown as **** text ****:


>python main.py
2023-02-17 15:21:09 INFO discord.client logging in using static token
2023-02-17 15:21:10 INFO discord.gateway Shard ID None has connected to Gateway (Session ID: 60b9fce14faa5daa4aed9eb6db01a74d).
Max que: 50
Text Channel: 828698708123451434
Testing Bot#4591 is ready.
Passing message object
**** play() function is called ****
Play Called
Message object recieved: <Message channel=<TextChannel position=7 nsfw=False news=False> type= author=<Member discriminator='0199' bot=False nick='Fragnk7?' guild=<Guild chunked=True>> flags=<MessageFlags value=0>>
2023-02-17 15:21:16 INFO discord.voice_client Connecting to voice...
2023-02-17 15:21:16 INFO discord.voice_client Starting voice handshake... (connection attempt 1)
2023-02-17 15:21:17 INFO discord.voice_client Voice handshake complete. Endpoint found seattle2004.discord.media
Current song: https://www.youtube.com/watch?v=vcAp4nmTZCA
[youtube] Extracting URL: https://www.youtube.com/watch?v=vcAp4nmTZCA 
[youtube] vcAp4nmTZCA: Downloading webpage 
[youtube] vcAp4nmTZCA: Downloading android player API JSON 
**** Does not play any audio ****
Playing rn
Song done
2023-02-17 15:21:18 INFO discord.player ffmpeg process 20700 successfully terminated with return code of 1.
2023-02-17 15:21:19 INFO discord.voice_client The voice handshake is being terminated for Channel ID 400178308467392513 (Guild ID 261601676941721602)
2023-02-17 15:21:19 INFO discord.voice_client Disconnecting from voice normally, close code 1000.


On Discord's end, the bot successfully connects and then disconnects after 2 seconds.


Note: I've only included the code I think is relevant. Please let me know if I should add anything else to the post. Otherwise, here is the GitHub repo for the project; the code is in main.py.
https://github.com/LukeLeimbach/wallMomentMusic


Thank you in advance!


I've applied the advice from these posts but it still will not play audio:


-
https://stackoverflow.com/questions/45770016/how-do-i-make-my-discord-bot-play-audio-from-youtube


-
https://stackoverflow.com/questions/66070749/how-to-fix-discord-music-bot-that-stops-playing-before-the-song-is-actually-over?newreg=c70dd786cf5844e490045494223c0381


-
https://stackoverflow.com/questions/57688808/playing-music-with-a-bot-from-youtube-without-downloading-the-file


-
ffmpeg: horizontally flip webcam to virtual video camera
30 May 2023, by Kaiser Schwarcz
I need to horizontally flip my webcam image for a meeting.
I tried the instructions on this site, https://wiki.archlinux.org/index.php/Webcam_setup#Applications, which use v4l2 and v4l2loopback to generate a virtual camera.


# modprobe v4l2loopback



Check the name of the newly created camera:


$ v4l2-ctl --list-devices

Dummy video device (0x0000) (platform:v4l2loopback-000):
 /dev/video1



Then you can run ffmpeg to read from your actual webcam (here /dev/video0), flip it, and feed it to the virtual camera:


$ ffmpeg -f v4l2 -i /dev/video0 -vf "vflip" -f v4l2 /dev/video1



You can use the "Dummy" camera in your applications instead of the "Integrated" camera.


With these settings I successfully flipped my video vertically. But that is not what I want: I want it flipped horizontally.


So I tried this:


$ ffmpeg -f v4l2 -i /dev/video0 -vf "hflip" -f v4l2 /dev/video1



But then I get no image from my cam.


What am I doing wrong?


I'm using Fedora 31 on a desktop.


COMPLETE LOG:


ffmpeg version 4.2.2 Copyright (c) 2000-2019 the FFmpeg developers

 built with gcc 9 (GCC)

 configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --docdir=/usr/share/doc/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fexceptions -fstack-protector-strong -grecord-gcc-switches -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection' --extra-ldflags='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld ' --extra-cflags=' ' --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-amrwbenc --enable-version3 --enable-bzlib --disable-crystalhd --enable-fontconfig --enable-frei0r --enable-gcrypt --enable-gnutls --enable-ladspa --enable-libaom --enable-libdav1d --enable-libass --enable-libbluray --enable-libcdio --enable-libdrm --enable-libjack --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-libmp3lame --enable-nvenc --enable-openal --enable-opencl --enable-opengl --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librsvg --enable-libsrt --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-libzimg --enable-libzvbi --enable-avfilter --enable-avresample --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-libmfx --enable-runtime-cpudetect

 libavutil 56. 31.100 / 56. 31.100

 libavcodec 58. 54.100 / 58. 54.100

 libavformat 58. 29.100 / 58. 29.100

 libavdevice 58. 8.100 / 58. 8.100

 libavfilter 7. 57.100 / 7. 57.100

 libavresample 4. 0. 0 / 4. 0. 0

 libswscale 5. 5.100 / 5. 5.100

 libswresample 3. 5.100 / 3. 5.100

 libpostproc 55. 5.100 / 55. 5.100

Input #0, video4linux2,v4l2, from '/dev/video0':

 Duration: N/A, start: 233168.222502, bitrate: 147456 kb/s

 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 147456 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc

Stream mapping:

 Stream #0:0 -> #0:0 (rawvideo (native) -> rawvideo (native))

Press [q] to stop, [?] for help

Output #0, video4linux2,v4l2, to '/dev/video2':

 Metadata:

 encoder : Lavf58.29.100

 Stream #0:0: Video: rawvideo (Y42B / 0x42323459), yuv422p, 640x480, q=2-31, 147456 kb/s, 30 fps, 30 tbn, 30 tbc

 Metadata:

 encoder : Lavc58.54.100 rawvideo

frame=   31 fps=0.0 q=-0.0 size=N/A time=00:00:01.03 bitrate=N/A dup=16 drop=0
frame=   46 fps= 46 q=-0.0 size=N/A time=00:00:01.53 bitrate=N/A dup=16 drop=0
frame=   61 fps= 40 q=-0.0 size=N/A time=00:00:02.03 bitrate=N/A .....
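One detail visible in the log: the webcam is captured as yuyv422, but the loopback output is written as yuv422p (Y42B), and some applications reading from v4l2loopback are picky about the pixel format they receive. A variant worth trying (a sketch only; whether the consuming application accepts this format is an assumption) forces the output pixel format back to the camera's native one:

$ ffmpeg -f v4l2 -i /dev/video0 -vf "hflip" -pix_fmt yuyv422 -f v4l2 /dev/video1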



-
Does PTS have to start at 0?
5 July 2018, by stevendesu
I've seen a number of questions regarding video PTS values not starting at zero, or asking how to make them start at zero. I'm aware that using ffmpeg I can do something like
ffmpeg -i <video> -vf="setpts=PTS-STARTPTS" <output>
to fix this kind of thing.
However, it's my understanding that PTS values don't have to start at zero. For instance, if you join a live stream then odds are it has been going on for an hour and the PTS is already somewhere around 3600000+, but your video player faithfully displays everything just fine. Therefore I would expect there to be no problem if I intentionally created a video with a PTS starting at, say, the current wall clock time.
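(As a quick sanity check, a file's or stream's starting timestamp can be inspected with ffprobe; video.flv here is just a placeholder name:
ffprobe -v error -show_entries format=start_time -of default=noprint_wrappers=1 video.flv
)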
I want to send a live stream using ffmpeg and embed the current time into the stream. This can be used both for latency calculation while the stream is live, and later to determine when the stream was originally aired. From my understanding of PTS, something as simple as this should probably work:
ffmpeg -i video.flv -vf="setpts=RTCTIME" rtmp://<output>
When I try this, however, ffmpeg outputs the following:
frame= 93 fps= 20 q=-1.0 Lsize= 9434kB time=535020:39:58.70 bitrate= 0.0kbits/s speed=1.35e+11x
Note the extremely large value for "time", the bitrate (0.0 kbits/s), and the speed (135000000000x!!!).
At first I thought the issue might be my timebase, so I tried the following:
ffmpeg -i video.flv -vf="settb=1/1K,setpts=RTCTIME/1K" rtmp://<output>
This puts everything in terms of milliseconds (1 PTS = 1 ms), but I had the same issue (massive time, zero bitrate, and massive speed).
Am I misunderstanding something about PTS? Is it not allowed to start at non-zero values? Or am I just doing something wrong?
Update
After reviewing @Gyan's answer, I formatted my command like so:
ffmpeg -re -i video.flv -vf="settb=1/1K, setpts=(RTCTIME-RTCSTART)/1K" -output_ts_offset $(date +%s.%N) rtmp://<output>
This way the PTS values would match up to "milliseconds since the stream started" and would be offset by the start time of the stream (theoretically making PTS = timestamp on the server).
This looked like it was encoding better:
frame= 590 fps=7.2 q=22.0 size= 25330kB time=00:01:21.71 bitrate=2539.5kbits/s dup=0 drop=1350 speed= 1x
The bitrate was now correct, the time was accurate, and the speed was not outrageous. The frame rate was still a bit off, though (the source video is 24 fps, but it's reporting 7.2 frames per second).
When I tried watching the stream from the other end, the video was out of sync with the audio and played at about double normal speed for a while; then the video froze and the audio continued without it.
Furthermore, when I dumped the stream to a file (ffmpeg -i rtmp://<output> dump.mp4) and looked at the PTS timestamps with ffprobe (ffprobe -show_entries packet=codec_type,pts dump.mp4 | grep "video" -B 1 -A 2), the timestamps didn't seem to show server time at all:
...
--
[PACKET]
codec_type=video
pts=131072
[/PACKET]
[PACKET]
codec_type=video
pts=130048
[/PACKET]
--
[PACKET]
codec_type=video
pts=129536
[/PACKET]
[PACKET]
codec_type=video
pts=130560
[/PACKET]
--
[PACKET]
codec_type=video
pts=131584
[/PACKET]
Is the problem just an incompatibility with RTMP?
Update 2
I've removed the video filter and I'm now encoding like so:
ffmpeg -re -i video.flv -output_ts_offset $(date +%s.%N) rtmp://<output>
This is encoding correctly:
frame= 910 fps= 23 q=25.0 size= 12027kB time=00:00:38.97 bitrate=2528.2kbits/s speed=0.981x
In order to verify that the PTS values are correct, I'm dumping the output to a file like so:
ffmpeg -i rtmp://<output> -copyts -write_tmcd 0 dump.mp4
I tried saving it as dump.flv (since it's RTMP), however this threw the error:
[flv @ 0x5600f24b4620] Audio codec mp3 not compatible with flv
This is a bit weird since the audio isn't mp3-encoded (it's speex) - but whatever.
While dumping this file, the following error pops up repeatedly:
frame= 1 fps=0.0 q=0.0 size= 0kB time=00:00:09.21 bitrate= 0.0kbits/s dup=0 dr
43090023 frame duplication too large, skipping
43090027 frame duplication too large, skipping
Last message repeated 3 times
43090031 frame duplication too large, skipping
Last message repeated 3 times
43090035 frame duplication too large, skipping
Playing the resulting video in VLC plays an audio stream but displays no video. I then attempted to probe the video PTS values with ffprobe:
ffprobe -show_entries packet=codec_type,pts dump.mp4 | grep "video" -B 1 -A 2
This returns only a single video frame, whose PTS is not as large as I would expect:
[PACKET]
codec_type=video
pts=1020
[/PACKET]
This has been a surprisingly difficult task.