
Media (1)
-
MediaSPIP Simple: a future default graphic theme?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
Other articles (47)
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
From upload to the final video [standalone version]
31 January 2010, by
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First of all, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...) -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash fallback is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
On other sites (5299)
-
How to add an arbitrary, formatted timestamp to a video with FFMPEG? [duplicate]
23 April 2018, by Neil M.
This question already has an answer here:
-
Can ffmpeg burn in time code?
7 answers
I am calling ffmpeg from a custom tool to concatenate video files while overlaying a timestamp in the output. The output video needs to have a timestamp starting at an arbitrary time. The format must be a 12-hour clock with seconds and meridiem, e.g. 10:34:59 AM or 6:13:09 PM. Here's the full command I'm using right now:
ffmpeg\bin\ffmpeg.exe -y -i "concat:input.mod" -ss 00:00:00 -t 00:02:17
-an -vcodec libx264 -profile:v baseline -level 13 -b:v 2000k -vf
"drawtext=fontcolor=white:fontsize=16:fontfile="/Windows/Fonts/arial.ttf":
box=1:boxcolor=black@0.3:x=(w-text_w-10):y=(h-text_h-5):
timecode='02\:36\:17\;00':rate=30000/1001" output.mp4

This outputs a 2 minute, 17 second duration video beginning at the start of the input file. The output video has a timecode in the bottom-right corner beginning at 02:36:17 and ending at 02:38:34. What I want is exactly this, but instead of printing "02:36:17;00" on frame 0 and counting up from there, it should print "2:36:17 AM" on frame 0 and count up from there.
I have tried using the localtime function to output formatted time, but the time value it uses is the time at which the drawtext filter is called. It doesn't take a parameter for an arbitrary time. I have also looked at the pts function, which seems to allow an arbitrary offset but only supports two formatting options, neither of which is the clock format I need.
What is the proper way to add a timestamp with an arbitrary starting time and format using ffmpeg?
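
For what it's worth, newer FFmpeg builds extend the drawtext text expansion so that the pts function can take gmtime (or localtime) as its first argument, an offset in seconds as its second, and a strftime()-style format string as its third. Treat the following as a sketch rather than a tested command: 9377 is simply 02:36:17 expressed as seconds after midnight, so gmtime renders the desired clock time at frame 0, %I/%M/%S/%p give a 12-hour clock with meridiem, and the font path is the one from the question.

ffmpeg\bin\ffmpeg.exe -y -i "concat:input.mod" -ss 00:00:00 -t 00:02:17
-an -vcodec libx264 -profile:v baseline -level 13 -b:v 2000k -vf
"drawtext=fontcolor=white:fontsize=16:fontfile=/Windows/Fonts/arial.ttf:
box=1:boxcolor=black@0.3:x=(w-text_w-10):y=(h-text_h-5):
text='%{pts\:gmtime\:9377\:%I\\:%M\\:%S %p}'" output.mp4

The escaping of the colons inside the text expansion (and around the font path) depends on the shell or tool invoking ffmpeg, so it may need adjusting, and the hour will be zero-padded (02 rather than 2), since that is what %I produces on most platforms.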
-
ffprobe not showing language tag for ASS files in Ubuntu
9 July 2023, by Efraín
This is something that has me totally lost, because I'm not sure what I could have done, but from one week to the next, "language" stopped appearing in the tags of an ASS track in an MKV. I know this only just started happening, because just last week I finished working on a script to detect the language of tracks, and when I decided to test it again today I noticed it had started to fail. From my debugging, it's because ffprobe is now printing this:


{
    "index": 2,
    "codec_name": "ass",
    "codec_long_name": "ASS (Advanced SSA) subtitle",
    "codec_type": "subtitle",
    "codec_tag_string": "[0][0][0][0]",
    "codec_tag": "0x0000",
    "r_frame_rate": "0/0",
    "avg_frame_rate": "0/0",
    "time_base": "1/1000",
    "start_pts": 0,
    "start_time": "0.000000",
    "duration_ts": 1430816,
    "duration": "1430.816000",
    "disposition": {
        "default": 1,
        "dub": 0,
        "original": 0,
        "comment": 0,
        "lyrics": 0,
        "karaoke": 0,
        "forced": 0,
        "hearing_impaired": 0,
        "visual_impaired": 0,
        "clean_effects": 0,
        "attached_pic": 0,
        "timed_thumbnails": 0
    },
    "tags": {
        "BPS-eng": "85",
        "DURATION-eng": "00:23:34.830000000",
        "NUMBER_OF_FRAMES-eng": "262",
        "NUMBER_OF_BYTES-eng": "15128",
        "_STATISTICS_WRITING_APP-eng": "mkvmerge v40.0.0 ('Old Town Road + Pony') 32-bit",
        "_STATISTICS_WRITING_DATE_UTC-eng": "2023-07-02 11:29:54",
        "_STATISTICS_TAGS-eng": "BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES"
    }
}



There should be a field called "language" in "tags", like this (note that other track types, such as SRT subtitles or audio tracks, show it correctly):


"tags": {
 "language": "eng",
 "BPS-eng": "128000",
 "DURATION-eng": "00:23:50.813000000",
 "NUMBER_OF_FRAMES-eng": "61620",
 "NUMBER_OF_BYTES-eng": "22893018",
 "_STATISTICS_WRITING_APP-eng": "mkvmerge v40.0.0 ('Old Town Road + Pony') 32-bit",
 "_STATISTICS_WRITING_DATE_UTC-eng": "2023-07-02 11:29:54",
 "_STATISTICS_TAGS-eng": "BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES"
}



The weird part is that this is the same file I used for testing (and it worked), and I haven't edited it, so I'm not sure why this is happening. Does anyone know if I might have messed something up by installing a library for something else that could have caused this?


Just in case it's necessary, this is the command I use:
ffprobe -v quiet -print_format json -show_streams <file>
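
Not a fix, but to narrow down whether the tag has really disappeared from the container or is simply no longer being reported by this particular ffprobe build, it can help to ask ffprobe for only the per-stream language tag and to compare with mkvmerge's JSON identification (mkvmerge is part of MKVToolNix; both commands below assume the file is named input.mkv):

ffprobe -v quiet -print_format json -show_entries stream=index,codec_type:stream_tags=language input.mkv
mkvmerge -J input.mkv

If mkvmerge still reports language "eng" for the ASS track while ffprobe does not, the tag is intact in the file and the change is on the ffmpeg/ffprobe side (for example a different build pulled in by another package); if both tools omit it, the file itself was rewritten.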


-
OpenCV reading from live camera creates a short video that moves quickly
17 November 2022, by user19019404
I am reading in a live video stream from a CCTV camera. One camera is set to 5 fps, another to 25 fps and another to 30 fps. Irrespective of the FPS the camera is set to, I can record 5 minutes but end up with a 30-second recorded clip in which everyone is running around the scene.


My code is the 'typical' read-video-and-write-video code that you would find online, such as the following (simplified for readability):


import cv2

video = cv2.VideoCapture(live RTSP address of camera)

if (video.isOpened() == False):
    print("Error reading video file")
else:
    # VideoCapture.get() returns floats; VideoWriter expects integer frame dimensions
    frame_width = int(video.get(cv2.CAP_PROP_FRAME_WIDTH))
    frame_height = int(video.get(cv2.CAP_PROP_FRAME_HEIGHT))
    frame_fps = video.get(cv2.CAP_PROP_FPS)
    size = (frame_width, frame_height)
    result = cv2.VideoWriter('filename.avi', cv2.VideoWriter_fourcc(*'MJPG'), frame_fps, size)

    while True:
        ret, frame = video.read()
        if ret == True:
            result.write(frame)
            cv2.imshow('Frame', frame)
            if cv2.waitKey(1) & 0xFF == ord('s'):
                break
        else:
            break

    video.release()
    result.release()
    cv2.destroyAllWindows()

print("The video was successfully saved with new fps")



I have tried playing with the FPS by reading the FPS from the live camera and using the same FPS in the video writer, but the result is still a video that lasts a fraction of the real time, with people zooming around the scene. So a smooth 5-minute live view results in a 20-second recording with everyone zooming around.


Is this something that I need to fix when writing the video, or do I need a second pass with ffmpeg to readjust the video?


Much appreciated


Update: corrected the code above, and:
When printing the number of frames read and the number of frames written, the counts are the same, showing that every frame that is read is being written (so I am not losing frames along the way and thereby writing half the number of frames).
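
A common cause of this symptom is that the FPS value used to open the VideoWriter (taken from CAP_PROP_FPS or from the camera's configuration) is higher than the rate at which frames actually reach the read loop over RTSP, so five minutes' worth of frames get stamped as if they had arrived much faster, and playback looks sped up. The sketch below is one way to check and work around this, assuming frames arrive at a roughly steady rate; the rtsp_url value and output file name are placeholders, not taken from the question.

import time
import cv2

rtsp_url = "rtsp://..."        # placeholder for the camera's live RTSP address
CALIBRATION_FRAMES = 100       # frames used to estimate the real delivery rate

video = cv2.VideoCapture(rtsp_url)

# Time how long it takes for a batch of frames to arrive from the stream.
frames = []
start = time.monotonic()
while len(frames) < CALIBRATION_FRAMES:
    ret, frame = video.read()
    if not ret:
        break
    frames.append(frame)
elapsed = time.monotonic() - start

if not frames:
    raise SystemExit("No frames received from the stream")

measured_fps = len(frames) / elapsed
print(f"Reported FPS: {video.get(cv2.CAP_PROP_FPS):.2f}, measured FPS: {measured_fps:.2f}")

# Open the writer with the measured rate so playback speed matches wall-clock time.
height, width = frames[0].shape[:2]
writer = cv2.VideoWriter('calibrated.avi', cv2.VideoWriter_fourcc(*'MJPG'),
                         measured_fps, (width, height))

# Write the calibration frames, then keep recording as before.
for f in frames:
    writer.write(f)
while True:
    ret, frame = video.read()
    if not ret:
        break
    writer.write(frame)
    cv2.imshow('Frame', frame)
    if cv2.waitKey(1) & 0xFF == ord('s'):
        break

video.release()
writer.release()
cv2.destroyAllWindows()

If the delivery rate drifts over a long recording, the measured value will only be approximate; in that case it is more robust to note the wall-clock duration of the capture and re-time the finished file afterwards (for example with ffmpeg's setpts filter), but the calibration above is usually enough to confirm whether the reported FPS is the culprit.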