Media (91)
-
Collections - Quick creation form
19 February 2013
Updated: February 2013
Language: French
Type: Image
-
Les Miserables
4 June 2012
Updated: February 2013
Language: English
Type: Text
-
Not displaying certain information: home page
23 November 2011
Updated: November 2011
Language: French
Type: Image
-
The Great Big Beautiful Tomorrow
28 October 2011
Updated: October 2011
Language: English
Type: Text
-
Richard Stallman et la révolution du logiciel libre - Une biographie autorisée (epub version)
28 October 2011
Updated: October 2011
Language: English
Type: Text
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (34)
-
Libraries and binaries specific to video and audio processing
31 January 2010
The following software packages and libraries are used by SPIPmotion in one way or another.
Required binaries: FFmpeg (the main encoder, which can transcode almost every type of video and audio file into formats playable on the web; see this tutorial for its installation); Oggz-tools (inspection tools for Ogg files); MediaInfo (retrieves information from most video and audio formats).
Complementary, optional binaries: flvtool2 (...)
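As an aside, the kind of transcoding FFmpeg performs here can be sketched with a plain subprocess call that produces a web-playable H.264/AAC MP4. The paths and encoder settings below are assumptions made for the example, not SPIPmotion's actual configuration:

import subprocess

def transcode_to_web_mp4(src: str, dst: str) -> None:
    # Re-encode any FFmpeg-readable source into an MP4 that browsers can play
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,
            "-c:v", "libx264",          # H.264 video, widely supported on the web
            "-crf", "23",               # constant-quality rate control
            "-c:a", "aac",              # AAC audio
            "-movflags", "+faststart",  # allow playback to begin before the full download
            "-y", dst,
        ],
        check=True,
    )

-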
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used instead.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and audio both to conventional computers (...)
-
From upload to the final video [standalone version]
31 January 2010
The journey of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail by extracting a (...)
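For illustration only, the two extra actions described above (reading the streams' technical information and extracting a thumbnail) can be approximated with ffprobe and ffmpeg roughly as follows. This is a generic sketch, not SPIPmotion's actual implementation; the helper names, the JSON output handling and the one-second seek offset are assumptions made for the example.

import json
import subprocess

def probe_streams(video: str) -> list:
    # Ask ffprobe for a machine-readable description of every stream
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-print_format", "json", "-show_streams", video],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"]

def extract_thumbnail(video: str, thumb: str = "thumb.png") -> None:
    # Grab a single frame (here at t=1s) to use as a thumbnail
    subprocess.run(
        ["ffmpeg", "-ss", "1", "-i", video, "-frames:v", "1", "-y", thumb],
        check=True,
    )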
On other sites (2924)
-
TS video copied to MP4, missing 3 first frames when programmatically read (ffmpeg bug)
3 September 2023, by Vasilis Lemonidis
Running:


ffmpeg -i test.ts -fflags +genpts -c copy -y test.mp4


for this test.ts, which has 30 frames readable by OpenCV, I end up with 28 frames, of which only 27 are readable by OpenCV. More specifically:


ffprobe -v error -select_streams v:0 -count_packets -show_entries stream=nb_read_packets -of csv=p=0 tmp.ts 


returns 30.


ffprobe -v error -select_streams v:0 -count_packets -show_entries stream=nb_read_packets -of csv=p=0 tmp.mp4


returns 28.


Using OpenCV in this manner:


cap = cv2.VideoCapture(tmp_path)
readMat = []
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    readMat.append(frame)


I get 30 frames for the ts file, but only 27 for the mp4.


Could someone explain these discrepancies? I get no errors during the conversion from ts to mp4:


ffmpeg version N-111746-gd53acf452f Copyright (c) 2000-2023 the FFmpeg developers
 built with gcc 11.3.0 (GCC)
 configuration: --ld=g++ --bindir=/bin --extra-libs='-lpthread -lm' --pkg-config-flags=--static --enable-static --enable-gpl --enable-libaom --enable-libass --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libsvtav1 --enable-libdav1d --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree --enable-cuda-nvcc --enable-cuvid --enable-nvenc --enable-libnpp 
 libavutil 58. 16.101 / 58. 16.101
 libavcodec 60. 23.100 / 60. 23.100
 libavformat 60. 10.100 / 60. 10.100
 libavdevice 60. 2.101 / 60. 2.101
 libavfilter 9. 10.100 / 9. 10.100
 libswscale 7. 3.100 / 7. 3.100
 libswresample 4. 11.100 / 4. 11.100
 libpostproc 57. 2.100 / 57. 2.100
[mpegts @ 0x4237240] DTS discontinuity in stream 0: packet 5 with DTS 306003, packet 6 with DTS 396001
Input #0, mpegts, from 'tmp.ts':
 Duration: 00:00:21.33, start: 3.400000, bitrate: 15 kb/s
 Program 1 
 Metadata:
 service_name : Service01
 service_provider: FFmpeg
 Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(progressive), 300x300, 1 fps, 3 tbr, 90k tbn
Output #0, mp4, to 'test.mp4':
 Metadata:
 encoder : Lavf60.10.100
 Stream #0:0: Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 300x300, q=2-31, 1 fps, 3 tbr, 90k tbn
Stream mapping:
 Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
[out#0/mp4 @ 0x423e280] video:25kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 4.192123%
frame= 30 fps=0.0 q=-1.0 Lsize= 26kB time=00:00:21.00 bitrate= 10.3kbits/s speed=1e+04x 


Additional information


The video I am processing originates from a continuous operation that stitches still images into ts videos, produced by the update method of the class below:

import cv2
import logging
import os
import shutil
import subprocess
from tempfile import NamedTemporaryFile
from typing import Optional

import numpy as np

LOGGER = logging.getLogger(__name__)


class VideoUpdater:
    def __init__(
        self, video_path: str, framerate: int, timePerFrame: Optional[int] = None
    ):
        """
        Video updater takes in a video path and updates it using a supplied frame, based on a given framerate.
        Args:
            video_path: str: Specify the path to the video file
            framerate: int: Set the frame rate of the video
        """
        if not video_path.endswith(".mp4"):
            LOGGER.warning(
                f"File type {os.path.splitext(video_path)[1]} not supported for streaming, switching to ts"
            )
            video_path = os.path.splitext(video_path)[0] + ".mp4"

        self._ps = None
        self.env = {}
        self.ffmpeg = "/usr/bin/ffmpeg "

        self.video_path = video_path
        self.ts_path = video_path.replace(".mp4", ".ts")
        self.tfile = None
        self.framerate = framerate
        self._video = None
        self.last_frame = None
        self.curr_frame = None

    def update(self, frame: np.ndarray):
        # Normalize the incoming frame to 3-channel BGR before writing it
        if len(frame.shape) == 2:
            frame = cv2.cvtColor(frame, cv2.COLOR_GRAY2BGR)
        else:
            frame = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR)
        self.writeFrame(frame)

    def writeFrame(self, frame: np.ndarray):
        """
        The writeFrame function takes a frame and writes it to the video file.
        Args:
            frame: np.ndarray: Write the frame to a temporary file
        """
        tImLFrame = NamedTemporaryFile(suffix=".png")
        tVidLFrame = NamedTemporaryFile(suffix=".ts")

        # Encode the single frame into a short ts segment
        cv2.imwrite(tImLFrame.name, frame)
        ps = subprocess.Popen(
            self.ffmpeg
            + rf"-loop 1 -r {self.framerate} -i {tImLFrame.name} -t {self.framerate} -vcodec libx264 -pix_fmt yuv420p -y {tVidLFrame.name}",
            env=self.env,
            shell=True,
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
        )
        ps.communicate()
        if os.path.isfile(self.ts_path):
            # Append the new segment to the existing ts file.
            # This does not work to watch, as timestamps are not updated.
            ps = subprocess.Popen(
                self.ffmpeg
                + rf'-i "concat:{self.ts_path}|{tVidLFrame.name}" -c copy -y {self.ts_path.replace(".ts", ".bak.ts")}',
                env=self.env,
                shell=True,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
            )
            ps.communicate()
            shutil.move(self.ts_path.replace(".ts", ".bak.ts"), self.ts_path)
        else:
            shutil.copyfile(tVidLFrame.name, self.ts_path)
        # Fixing timestamps; we don't have to wait for this operation
        ps = subprocess.Popen(
            self.ffmpeg
            + rf"-i {self.ts_path} -fflags +genpts -c copy -y {self.video_path}",
            env=self.env,
            shell=True,
            # stdout=subprocess.PIPE,
            # stderr=subprocess.PIPE,
        )
        tImLFrame.close()
        tVidLFrame.close()
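For context, here is a hypothetical driver loop for this class, illustrating the continuous stitching described above. The synthetic frames and the output file name are made up for the example and are not part of the original code:

import numpy as np

# Hypothetical usage: feed one synthetic 300x300 frame per iteration,
# as the continuous stitching loop described above would do.
updater = VideoUpdater("stitched.mp4", framerate=1)
for _ in range(30):
    frame = np.random.randint(0, 255, (300, 300, 3), dtype=np.uint8)
    updater.update(frame)
# Each call encodes the frame into a short ts segment and appends it to
# stitched.ts via ffmpeg's concat protocol.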


-
FFmpeg - record from stream terminating unexpectedly using kokorin/Jaffree ffmpeg wrapper for Java
18 April 2024, by pyrmon
I am programming a Spring Boot application using Maven and Java 21. I am trying to record a stream from a URL and save it to an mkv file. I intend to do this with kokorin/Jaffree, version 2023.09.10. The recording seems to work, but longer recordings terminate unexpectedly: sometimes after 5 minutes, other times after an hour or even longer, sometimes with exit code 0 and sometimes with 1.


I have implemented the recording like this:


@Override
public void startRecording(RecordingSchedule recordingSchedule) {
    logger.info("Starting recording for schedule with filename {}", recordingSchedule.getFileName());

    String m3uUrl = recordingSchedule.getM3uUrl();
    LocalDateTime endTime = timeUtils.parseStringToLocalDateTime(recordingSchedule.getEndTime());
    LocalDateTime stopTime = endTime.plusSeconds(20);
    String timeToRecord = timeUtils.calculateTimeToRecord(stopTime);
    Path outputPath = Paths.get("/recordings/" + recordingSchedule.getFileName());

    try {
        FFmpeg.atPath()
                .addInput(UrlInput.fromUrl(m3uUrl))
                .addArgument("-xerror")
                .addArguments("-reconnect", "5")
                .addArguments("-reconnect_streamed", "5")
                .addArguments("-reconnect_delay_max", "20")
                .addArguments("-t", timeToRecord)
                .addArguments("-c", "copy")
                .addOutput(UrlOutput.toPath(outputPath))
                .setLogLevel(LogLevel.WARNING)
                .execute();
        logger.info("Recording complete. Output file: {}", outputPath.toAbsolutePath());
    } catch (Exception e) {
        logger.error("Error recording M3U stream {}: {}", recordingSchedule.getFileName(), e.getMessage());
    }
}


And I am calling the method like this:

executorConfig.executorService().submit(() -> ffmpegService.startRecording(recording));

Any ideas what I am doing wrong?
Here are the log lines from the beginning and end of the past two recording attempts:


2024-04-18T00:54:48.689+02:00 INFO 1 --- [pool-2-thread-1] m.s.r.service.impl.FfmpegServiceImpl : Starting recording for schedule with filename Example1.mkv
2024-04-18T00:54:48.697+02:00 WARN 1 --- [pool-2-thread-1] c.github.kokorin.jaffree.ffmpeg.FFmpeg : ProgressListener isn't set, progress won't be reported
2024-04-18T00:54:48.698+02:00 INFO 1 --- [pool-2-thread-1] c.g.k.jaffree.process.ProcessHandler : Command constructed:
ffmpeg -loglevel level+warning -i http://example.stream.url.com -n -xerror -reconnect 5 -reconnect_streamed 5 -reconnect_delay_max 20 -t 10771 -c copy /recordings/Example1.mkv
2024-04-18T00:54:48.698+02:00 INFO 1 --- [pool-2-thread-1] c.g.k.jaffree.process.ProcessHandler : Starting process: ffmpeg
2024-04-18T00:54:48.701+02:00 INFO 1 --- [pool-2-thread-1] c.g.k.jaffree.process.ProcessHandler : Waiting for process to finish
2024-04-18T01:31:02.633+02:00 WARN 1 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [h264 @ 0x559cd22dd940] [warning] Increasing reorder buffer to 2
2024-04-18T01:31:02.633+02:00 INFO 1 --- [pool-2-thread-1] c.g.k.jaffree.process.ProcessHandler : Process has finished with status: 0
2024-04-18T01:31:02.734+02:00 INFO 1 --- [pool-2-thread-1] m.s.r.service.impl.FfmpegServiceImpl : Recording complete. Output file: /recordings/Example1.mkv

2024-04-18T03:54:48.678+02:00 INFO 1 --- [pool-2-thread-2] m.s.r.service.impl.FfmpegServiceImpl : Starting recording for schedule with filename Example2.mkv
2024-04-18T03:54:48.678+02:00 WARN 1 --- [pool-2-thread-2] c.github.kokorin.jaffree.ffmpeg.FFmpeg : ProgressListener isn't set, progress won't be reported
2024-04-18T03:54:48.678+02:00 INFO 1 --- [pool-2-thread-2] c.g.k.jaffree.process.ProcessHandler : Command constructed:
ffmpeg -loglevel level+warning -i http://example.stream.url.com/ -n -xerror -reconnect 5 -reconnect_streamed 5 -reconnect_delay_max 20 -t 11431 -c copy /recordings/Example2.mkv
2024-04-18T03:54:48.678+02:00 INFO 1 --- [pool-2-thread-2] c.g.k.jaffree.process.ProcessHandler : Starting process: ffmpeg
2024-04-18T03:54:48.679+02:00 INFO 1 --- [pool-2-thread-2] c.g.k.jaffree.process.ProcessHandler : Waiting for process to finish
2024-04-18T04:57:22.256+02:00 WARN 1 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [h264 @ 0x55707ba988c0] [warning] Increasing reorder buffer to 3
2024-04-18T04:58:47.455+02:00 ERROR 1 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [NULL @ 0x55707ba988c0] [error] Picture timing SEI payload too large
2024-04-18T04:58:47.456+02:00 ERROR 1 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [NULL @ 0x55707ba988c0] [error] non-existing PPS 1 referenced
2024-04-18T04:58:47.456+02:00 WARN 1 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [matroska @ 0x55707ba9a380] [warning] Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly
2024-04-18T04:58:47.456+02:00 ERROR 1 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [matroska @ 0x55707ba9a380] [error] Can't write packet with unknown timestamp
2024-04-18T04:58:47.463+02:00 ERROR 1 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [error] av_interleaved_write_frame(): Invalid argument
2024-04-18T04:58:47.463+02:00 INFO 1 --- [pool-2-thread-2] c.g.k.jaffree.process.ProcessHandler : Process has finished with status: 1
2024-04-18T04:58:47.564+02:00 ERROR 1 --- [pool-2-thread-2] m.s.r.service.impl.FfmpegServiceImpl : Error recording M3U stream Example2.mkv: Process execution has ended with non-zero status: 1. Check logs for detailed error message.


One recording was supposed to run for nearly 3 hours and the other for over 3 hours, but as the timestamps show, neither ran anywhere near that long.
Thank you for your help!