
Media (1)
-
Richard Stallman and free software
19 October 2011
Updated: May 2013
Language: French
Type: Text
Other articles (64)
-
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
-
Keeping control of your media in your hands
13 April 2011
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible, and development is based on expanding the (...)
-
MediaSPIP Player: controls
26 May 2010
Mouse controls for the player
In addition to the actions triggered by clicking the visible buttons of the player interface, other actions can be performed with the mouse: Click: clicking on the video, or on the audio logo, starts or pauses playback depending on its current state; Scroll wheel: when the mouse is placed over the area used by the media (hover), the wheel no longer has its usual page-scrolling effect, but instead decreases or (...)
On other sites (9154)
-
Controlling end time in video player via AVPacket information / setting pts/dts properly
26 January 2017, by SyntheticGio
I’m currently working in C/C++ using the FFmpeg core (libavcodec, etc.). I’m capturing a stream and writing it in chunks to different files. So imagine the stream is 5 minutes in length and I’m writing five files of one minute in length each. I’m able to do this successfully.
Currently, each file after the first file has a start time equal to the time it would have been in the un-chunked stream. So the second video file starts at 1 minute, the third starts at 2 minutes, etc. This was inadvertent but as it turns out is beneficial in my particular use case.
VLC and the other video players I’ve tried report this start time "properly", but the end time shows as the duration (not start time + duration). My gut feeling is that the player simply assumes all videos start at 0 and shows the length as the "end time", but I don’t actually know this, so I’d like to know if there is any way to set the AVPacket information so that, for example, the player for the third video would show a start of 2 minutes and an end of 3 minutes (for a one-minute video)?
As an alternative, if I wanted to do this the traditional way (reset each chunk so it starts at time 0), I assume I’d normalize AVPacket.pts and AVPacket.dts by subtracting the values of the final packet in the previous chunk? It seems like this strategy would work for pts, but I’m less sure about it working for dts. I feel it would generally work for dts too, but there might be times when this fails, so I’d like to know if this is a safe method (or if there is a better method I should use in this case).
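A common way to handle the re-basing (sketched below; the helper name rebase_and_write and its parameters are hypothetical, not from the question) is to subtract a single per-stream offset, typically the dts of the first packet written to the new chunk, from both pts and dts. Using one offset for both fields keeps the spacing between a packet's pts and dts unchanged, which is what makes the dts side safe even with B-frame reordering. The sketch assumes the packet still carries the input stream's time base.

#include <libavformat/avformat.h>

/* Hypothetical helper: shift a packet so the current chunk starts at t = 0.
 * chunk_start is the dts of the first packet written to this chunk,
 * expressed in the input stream's time base (in_tb). */
static int rebase_and_write(AVFormatContext *out_ctx, AVPacket *pkt,
                            int64_t chunk_start,
                            AVRational in_tb, AVRational out_tb)
{
    /* One offset for both fields keeps the pts/dts spacing intact. */
    if (pkt->pts != AV_NOPTS_VALUE)
        pkt->pts -= chunk_start;
    if (pkt->dts != AV_NOPTS_VALUE)
        pkt->dts -= chunk_start;

    /* Convert pts, dts and duration into the output stream's time base. */
    av_packet_rescale_ts(pkt, in_tb, out_tb);

    /* The interleaved writer enforces monotonically increasing dts per stream. */
    return av_interleaved_write_frame(out_ctx, pkt);
}

If dts ever stopped being monotonic after the subtraction, av_interleaved_write_frame() would typically fail with an error, so a wrong offset shows up quickly rather than silently corrupting the file.
-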
Setting up time triggers on video player
28 July 2022, by gPats
Background:
I have some text that I want to display in a textbox whenever the video is playing within a certain interval, say (00:02:00-00:04:00 "Hello there").
The textbox should only be displayed while the video is playing within that particular interval. If I rewind or skip the video to a different time, the textbox should be created or destroyed depending on whether the current time falls within the interval or not.
I have searched the documentation of ffplay and VLC for ways to implement this functionality.
What I have tried:
ffplay: my strategy would be to keep polling the current time of the video, which ffplay dumps to STDERR, and check whether it falls within the interval.
LibVLC: LibVLC supports asynchronous events, but not the kind which would depend on the time at which the video is playing.
My questions:
1. Is it advisable to keep polling the current video time, or to set up callbacks/trigger functions?
2. Is it possible to get the current playing time of the video as a variable and create callback functions accordingly?
NOTE: I am aware that subtitles do the same thing; however, I want the text to be in a different window. The textbox should be decoupled from the video player I use.
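On question 2, one possibility (only a sketch, assuming the libVLC 3.x C API) is to attach a handler to the MediaPlayerTimeChanged event, which delivers the current playback position in milliseconds every time it advances, and to show or hide the textbox when that position enters or leaves the interval. The media path and the show_textbox/hide_textbox helpers below are placeholders, and error checking is omitted.

#include <stdbool.h>
#include <stdio.h>
#include <vlc/vlc.h>

#define START_MS 120000  /* 00:02:00 */
#define END_MS   240000  /* 00:04:00 */

static bool textbox_visible = false;

/* Placeholder hooks: connect these to whatever draws the separate textbox window. */
static void show_textbox(const char *text) { printf("SHOW: %s\n", text); }
static void hide_textbox(void)             { printf("HIDE\n"); }

/* Invoked by libVLC whenever the playback position changes. */
static void on_time_changed(const libvlc_event_t *ev, void *data)
{
    (void)data;
    libvlc_time_t now = ev->u.media_player_time_changed.new_time; /* milliseconds */
    bool in_interval = now >= START_MS && now <= END_MS;

    if (in_interval && !textbox_visible) {
        show_textbox("Hello there");
        textbox_visible = true;
    } else if (!in_interval && textbox_visible) {
        hide_textbox();
        textbox_visible = false;
    }
}

int main(void)
{
    libvlc_instance_t *vlc = libvlc_new(0, NULL);
    libvlc_media_t *media = libvlc_media_new_path(vlc, "/path/to/video.mp4");
    libvlc_media_player_t *player = libvlc_media_player_new_from_media(media);

    /* Subscribe to time updates instead of polling. */
    libvlc_event_attach(libvlc_media_player_event_manager(player),
                        libvlc_MediaPlayerTimeChanged, on_time_changed, NULL);

    libvlc_media_player_play(player);
    getchar(); /* keep the process alive while the video plays */

    libvlc_media_player_release(player);
    libvlc_media_release(media);
    libvlc_release(vlc);
    return 0;
}

Note that libVLC runs the callback on one of its own threads, so a GUI toolkit will usually need the show/hide calls marshalled back to its main thread; if that is inconvenient, polling libvlc_media_player_get_time() from your own loop (question 1) remains the simpler fallback.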


-
Piping stdout from child process directly to Google Cloud Storage. Node.js
2 October 2018, by glenn
Anyone here with expertise on piping to Google Cloud Storage?
I am using FFmpeg on a server and want to pipe its output to my storage bucket.
I can get it working when I save the FFmpeg output to a file, then create a readable stream from that file, and then pipe it to createWriteStream().
However, I don’t want to generate a static file and then go through that process every time.
Why can I not just pipe FFmpeg.stdout.pipe(file.createWriteStream()) ?
FYI: I am running FFmpeg as a child process:
const child_process = require('child_process');

spawnFFMPEG = () => {
  // Silent audio (anullsrc) plus the video arriving on stdin ('-i', '-'),
  // muxed as AVI and written to stdout ('pipe:1').
  this.ffmpeg = child_process.spawn('ffmpeg', [
    '-f', 'lavfi', '-i', 'anullsrc',
    '-thread_queue_size', '512',
    '-i', '-',
    '-shortest', '-vcodec', 'copy',
    '-f', 'avi',
    'pipe:1'
  ]);
  // myWritableStream is the upload target, e.g. file.createWriteStream()
  // from the Cloud Storage client mentioned above.
  this.ffmpeg.stdout.pipe(myWritableStream);
};