
Media (33)
-
Stereo master soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
#7 Ambience
16 October 2011, by
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#3 The Safest Place
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#4 Emo Creates
15 October 2011, by
Updated: February 2013
Language: English
Type: Audio
Other articles (40)
-
XMP PHP
13 May 2011, by
According to Wikipedia, XMP means:
Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
Being based on XML, it handles a set of dynamic tags for use within the Semantic Web.
XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)
-
Contributing to its documentation
10 April 2011
Documentation is one of the most important and most demanding tasks when building a technical tool.
Any outside contribution on this subject is essential: reviewing what already exists; helping write articles aimed at users (MediaSPIP administrators or simply content producers) or at developers; creating explanatory screencasts; translating the documentation into a new language;
To do so, you can sign up on (...)
-
Encoding and conversion into formats readable on the Internet
10 April 2011
MediaSPIP converts and re-encodes uploaded documents in order to make them readable on the Internet and automatically usable without any intervention from the content creator.
Videos are automatically encoded into the formats supported by HTML5: MP4, Ogv and WebM. The "MP4" version is also used for the fallback Flash player required by older browsers.
Audio documents are also re-encoded into the two formats usable with HTML5: MP3 and Ogg. The "MP3" version (...)
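
For illustration only, and not MediaSPIP's actual pipeline, the kind of re-encoding described above can be sketched with ffmpeg, assuming a build that includes libx264, libvpx, libtheora, libvorbis and libmp3lame:

# Illustrative sketch of HTML5 re-encoding; file names and codec choices are examples.
ffmpeg -i source.mov -c:v libx264 -c:a aac video.mp4
ffmpeg -i source.mov -c:v libvpx -c:a libvorbis video.webm
ffmpeg -i source.mov -c:v libtheora -c:a libvorbis video.ogv
ffmpeg -i source.wav -c:a libmp3lame audio.mp3
ffmpeg -i source.wav -c:a libvorbis audio.ogg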
On other sites (6357)
-
ffmpeg: single frame from video is a highly overexposed image
9 April 2021, by mcgregor94086
I am using ffmpeg on a Raspberry Pi 400, attached to a camera array, to capture one image from each camera.


Most of the generated images are highly overexposed and washed out. I am trying to understand which command-line options I should set to prevent this overexposure.



Are there options I need to set so that the camera automatically picks the right exposure, or some way to set the length of an exposure rather than just grabbing a "frame"?


Also, each image takes about 3 to 4 seconds to capture. I just want to capture the first possible frame and that's it. Is there a set of options that would capture the image in less time?
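
A hedged guess, not a confirmed diagnosis: both symptoms are consistent with grabbing the very first frame before the sensor's auto-exposure has converged. One workaround is to let ffmpeg read a handful of frames and keep only the last one, using the image2 muxer's -update option to overwrite the same JPEG; the device, frame count and output path below are only examples:

# Sketch: read a few frames so auto-exposure can settle, and keep only the
# last one by continuously overwriting the same output file (-update 1).
/usr/bin/ffmpeg -y -hide_banner \
 -f video4linux2 -input_format yuyv422 -video_size 1920x1080 -i /dev/video0 \
 -frames:v 5 -update 1 -f image2 /tmp/SonaCam00.jpg

At the reported 5 fps this spends roughly an extra second per camera, so it trades speed for exposure; pinning the exposure manually (see the v4l2-ctl sketch after the control listing below) avoids that trade-off.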


/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video0 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam00.jpg
Input #0, video4linux2,v4l2, from '/dev/video0':
 Duration: N/A, start: 672949.710856, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x184adb0] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam00.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=1.0 q=3.7 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.39x 
video:37kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video0 image in: 3 seconds
linux_capture_photo_and_return_image_path( /dev/video0 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 1 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam00.jpg
1 /dev/video0: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam00.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video2 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam02.jpg
Input #0, video4linux2,v4l2, from '/dev/video2':
 Duration: N/A, start: 672958.327329, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x1d27db0] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam02.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=0.9 q=8.6 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.374x 
video:136kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video2 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video2 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 2 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam02.jpg
2 /dev/video2: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam02.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video4 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam04.jpg
Input #0, video4linux2,v4l2, from '/dev/video4':
 Duration: N/A, start: 672963.021864, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x10bedb0] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam04.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=1.0 q=3.7 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.389x 
video:42kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video4 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video4 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 3 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam04.jpg
3 /dev/video4: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam04.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video6 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam06.jpg
Input #0, video4linux2,v4l2, from '/dev/video6':
 Duration: N/A, start: 672967.663385, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x24e4db0] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam06.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=0.9 q=8.2 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.352x 
video:126kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video6 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video6 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 4 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam06.jpg
4 /dev/video6: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam06.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video8 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam08.jpg
Input #0, video4linux2,v4l2, from '/dev/video8':
 Duration: N/A, start: 672972.189025, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x13fadb0] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam08.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=0.9 q=9.1 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.352x 
video:154kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video8 image in: 3 seconds
linux_capture_photo_and_return_image_path( /dev/video8 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 5 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam08.jpg
5 /dev/video8: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam08.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video17 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam17.jpg
Input #0, video4linux2,v4l2, from '/dev/video17':
 Duration: N/A, start: 672976.730667, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0xae6e20] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam17.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=0.9 q=8.7 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.353x 
video:164kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video17 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video17 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 6 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam17.jpg
6 /dev/video17: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam17.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video19 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam19.jpg
Input #0, video4linux2,v4l2, from '/dev/video19':
 Duration: N/A, start: 672981.425451, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x15a7e20] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam19.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=1.0 q=3.2 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.391x 
video:40kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video19 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video19 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 7 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam19.jpg
7 /dev/video19: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam19.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video21 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam21.jpg
Input #0, video4linux2,v4l2, from '/dev/video21':
 Duration: N/A, start: 672986.050603, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x1722e20] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam21.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=1.0 q=7.8 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.388x 
video:119kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video21 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video21 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 8 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam21.jpg
8 /dev/video21: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam21.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video23 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam23.jpg
Input #0, video4linux2,v4l2, from '/dev/video23':
 Duration: N/A, start: 672990.712888, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x19f4e20] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam23.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=1.0 q=3.4 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.389x 
video:42kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video23 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video23 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 9 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam23.jpg
9 /dev/video23: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam23.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video25 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam25.jpg
Input #0, video4linux2,v4l2, from '/dev/video25':
 Duration: N/A, start: 672995.359539, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x9d7e20] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam25.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=0.9 q=8.4 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.369x 
video:146kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video25 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video25 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 10 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam25.jpg
10 /dev/video25: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam25.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video27 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam27.jpg
Input #0, video4linux2,v4l2, from '/dev/video27':
 Duration: N/A, start: 673000.069328, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x1f90e20] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam27.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=1.0 q=3.9 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.382x 
video:135kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video27 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video27 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 11 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam27.jpg
11 /dev/video27: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam27.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video29 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam29.jpg
Input #0, video4linux2,v4l2, from '/dev/video29':
 Duration: N/A, start: 673004.676618, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x22dde20] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam29.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=1.0 q=9.3 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.385x 
video:165kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video29 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video29 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 12 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam29.jpg
12 /dev/video29: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam29.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video31 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam31.jpg
Input #0, video4linux2,v4l2, from '/dev/video31':
 Duration: N/A, start: 673009.555417, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x22f5e20] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam31.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=0.8 q=8.6 Lsize=N/A time=00:00:00.40 bitrate=N/A speed=0.335x 
video:141kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video31 image in: 4 seconds
linux_capture_photo_and_return_image_path( /dev/video31 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 13 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam31.jpg
13 /dev/video31: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam31.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video33 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam33.jpg
Input #0, video4linux2,v4l2, from '/dev/video33':
 Duration: N/A, start: 673014.171570, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x6d4e20] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam33.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=0.9 q=9.7 Lsize=N/A time=00:00:00.20 bitrate=N/A speed=0.172x 
video:156kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video33 image in: 3 seconds
linux_capture_photo_and_return_image_path( /dev/video33 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 14 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam33.jpg
14 /dev/video33: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam33.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video35 -frames:v 1 -f image2 /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam35.jpg
Input #0, video4linux2,v4l2, from '/dev/video35':
 Duration: N/A, start: 673018.565769, bitrate: 165888 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 165888 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x1667e20] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam35.jpg':
 Metadata:
 encoder : Lavf58.20.100
 Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 5 fps, 5 tbn, 5 tbc
 Metadata:
 encoder : Lavc58.35.100 mjpeg
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed= 0x 
frame= 1 fps=0.8 q=9.8 Lsize=N/A time=00:00:00.20 bitrate=N/A speed=0.167x 
video:148kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown

Captured /dev/video35 image in: 3 seconds
linux_capture_photo_and_return_image_path( /dev/video35 , /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/ , 15 ) RETURNS /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam35.jpg
15 /dev/video35: /var/www/html/sonascan/data/scans/10000000e1f45394/20210405184504/SonaCam35.jpg
/usr/bin/ffmpeg -y -hide_banner -ss 0:0:0 -r 15 -s 1920x1080 -f video4linux2 -i /dev/video37 -frames:v 1 -f image2




$ v4l2-ctl --all
Driver Info:
 Driver name : uvcvideo
 Card type : FHD Camera: FHD Camera
 Bus info : usb-0000:01:00.0-1.2.1.1
 Driver version : 5.10.17
 Capabilities : 0x84a00001
 Video Capture
 Metadata Capture
 Streaming
 Extended Pix Format
 Device Capabilities
 Device Caps : 0x04200001
 Video Capture
 Streaming
 Extended Pix Format
Media Driver Info:
 Driver name : uvcvideo
 Model : FHD Camera: FHD Camera
 Serial : 
 Bus info : usb-0000:01:00.0-1.2.1.1
 Media version : 5.10.17
 Hardware revision: 0x00000001 (1)
 Driver version : 5.10.17
Interface Info:
 ID : 0x03000002
 Type : V4L Video
Entity Info:
 ID : 0x00000001 (1)
 Name : FHD Camera: FHD Camera
 Function : V4L2 I/O
 Flags : default
 Pad 0x01000007 : 0: Sink
 Link 0x02000013: from remote pad 0x100000a of entity 'Extension 4': Data, Enabled, Immutable
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
 Width/Height : 1920/1080
 Pixel Format : 'YUYV' (YUYV 4:2:2)
 Field : None
 Bytes per Line : 3840
 Size Image : 4147200
 Colorspace : sRGB
 Transfer Function : Rec. 709
 YCbCr/HSV Encoding: ITU-R 601
 Quantization : Default (maps to Limited Range)
 Flags : 
Crop Capability Video Capture:
 Bounds : Left 0, Top 0, Width 1920, Height 1080
 Default : Left 0, Top 0, Width 1920, Height 1080
 Pixel Aspect: 1/1
Selection: crop_default, Left 0, Top 0, Width 1920, Height 1080, Flags: 
Selection: crop_bounds, Left 0, Top 0, Width 1920, Height 1080, Flags: 
Streaming Parameters Video Capture:
 Capabilities : timeperframe
 Frames per second: 5.000 (5/1)
 Read buffers : 0
 brightness 0x00980900 (int) : min=0 max=255 step=1 default=128 value=128
 contrast 0x00980901 (int) : min=0 max=255 step=1 default=30 value=30
 saturation 0x00980902 (int) : min=0 max=100 step=1 default=64 value=64
 hue 0x00980903 (int) : min=-180 max=180 step=1 default=0 value=0
 white_balance_temperature_auto 0x0098090c (bool) : default=1 value=1
 gamma 0x00980910 (int) : min=90 max=150 step=1 default=120 value=120
 gain 0x00980913 (int) : min=4 max=8 step=1 default=5 value=5
 power_line_frequency 0x00980918 (menu) : min=0 max=2 default=1 value=1
 white_balance_temperature 0x0098091a (int) : min=2800 max=6500 step=1 default=4000 value=4000 flags=inactive
 sharpness 0x0098091b (int) : min=0 max=7 step=1 default=2 value=2
 backlight_compensation 0x0098091c (int) : min=0 max=2 step=1 default=2 value=2
 exposure_auto 0x009a0901 (menu) : min=0 max=3 default=3 value=3
 exposure_absolute 0x009a0902 (int) : min=9 max=2500 step=1 default=123 value=123 flags=inactive
error 22 getting ext_ctrl Exposure, Auto Priority
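
The control listing above exposes exposure_auto (a menu control, currently 3) and exposure_absolute (currently flagged inactive), plus white_balance_temperature_auto. A minimal sketch of pinning the exposure before the ffmpeg capture, assuming this UVC camera follows the usual convention that menu value 1 means manual mode; the value 300 is only a starting point inside the reported 9..2500 range:

# Switch the camera to manual exposure and set a fixed exposure time.
v4l2-ctl -d /dev/video0 --set-ctrl=exposure_auto=1
v4l2-ctl -d /dev/video0 --set-ctrl=exposure_absolute=300
# Optionally also fix the white balance reported above.
v4l2-ctl -d /dev/video0 --set-ctrl=white_balance_temperature_auto=0
# Verify what the driver actually accepted.
v4l2-ctl -d /dev/video0 --get-ctrl=exposure_auto,exposure_absolute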



-
trying to make OpenCV 3.2.0 work with virtualenv
24 July 2017, by lollercoaster
I'm on Ubuntu 16.04 with Python 2.7 and virtualenv & virtualenvwrapper.
By following this guide I managed to get the script below working with my system Python 2.7, which has cv2 globally installed. I used this script to install it:
######################################
# INSTALL OPENCV ON UBUNTU OR DEBIAN #
######################################
# | THIS SCRIPT IS TESTED CORRECTLY ON |
# |----------------------------------------------------|
# | OS | OpenCV | Test | Last test |
# |----------------|--------------|------|-------------|
# | Ubuntu 16.04.2 | OpenCV 3.2.0 | OK | 20 May 2017 |
# | Debian 8.8 | OpenCV 3.2.0 | OK | 20 May 2017 |
# | Debian 9.0 | OpenCV 3.2.0 | OK | 25 Jun 2017 |
# 1. KEEP UBUNTU OR DEBIAN UP TO DATE
sudo apt-get -y update
sudo apt-get -y upgrade
sudo apt-get -y dist-upgrade
sudo apt-get -y autoremove
# 2. INSTALL THE DEPENDENCIES
# Build tools:
sudo apt-get install -y build-essential cmake
# GUI (if you want to use GTK instead of Qt, replace 'qt5-default' with 'libgtkglext1-dev' and remove '-DWITH_QT=ON' option in CMake):
sudo apt-get install -y qt5-default libvtk6-dev
# Media I/O:
sudo apt-get install -y zlib1g-dev libjpeg-dev libwebp-dev libpng-dev libtiff5-dev libjasper-dev libopenexr-dev libgdal-dev
# Video I/O:
sudo apt-get install -y libdc1394-22-dev libavcodec-dev libavformat-dev libswscale-dev libtheora-dev libvorbis-dev libxvidcore-dev libx264-dev yasm libopencore-amrnb-dev libopencore-amrwb-dev libv4l-dev libxine2-dev
# Parallelism and linear algebra libraries:
sudo apt-get install -y libtbb-dev libeigen3-dev
# Python:
sudo apt-get install -y python-dev python-tk python-numpy python3-dev python3-tk python3-numpy
# Documentation:
sudo apt-get install -y doxygen
# UI stuff
sudo apt-get install libgtk-3-dev libatlas-base-dev gfortran
# 3. INSTALL THE LIBRARY (YOU CAN CHANGE '3.2.0' FOR THE LAST STABLE VERSION)
sudo apt-get install -y unzip wget
# opencv contrib
wget https://github.com/opencv/opencv_contrib/archive/3.2.0.zip -O opencv_contrib-3.2.0.zip
unzip opencv_contrib-3.2.0.zip
rm opencv_contrib-3.2.0.zip
# opencv
wget https://github.com/opencv/opencv/archive/3.2.0.zip
unzip 3.2.0.zip
rm 3.2.0.zip
mv opencv-3.2.0 OpenCV-3.2.0
cd OpenCV-3.2.0
mkdir build
cd build
cmake -D WITH_QT=ON \
-D WITH_OPENGL=ON \
-D FORCE_VTK=ON \
-D WITH_TBB=ON \
-D WITH_GDAL=ON \
-D WITH_XINE=ON \
-D BUILD_EXAMPLES=ON \
-D INSTALL_PYTHON_EXAMPLES=ON \
-D ENABLE_PRECOMPILED_HEADERS=OFF \
-D BUILD_NEW_PYTHON_SUPPORT=ON \
..
make -j4
sudo make install
sudo ldconfig
# 4. EXECUTE SOME OPENCV EXAMPLES AND COMPILE A DEMONSTRATION
# To complete this step, please visit 'http://milq.github.io/install-opencv-ubuntu-debian'.
The following script works great with that system-wide installation:
import cv2
img = cv2.imread('some_img.jpg')
Though this one doesn't - even the system Python can't read videos for some reason...
import cv2
video_capture = cv2.VideoCapture(0)
ret, frame = video_capture.read()
print ret  # always False
But I want it to work with my virtualenv. So I recompiled OpenCV with:
cmake -D WITH_QT=ON \
-D WITH_OPENGL=ON \
-D FORCE_VTK=ON \
-D WITH_TBB=ON \
-D WITH_GDAL=ON \
-D WITH_XINE=ON \
-D BUILD_EXAMPLES=ON \
-D INSTALL_PYTHON_EXAMPLES=ON \
-D ENABLE_PRECOMPILED_HEADERS=OFF \
-D BUILD_NEW_PYTHON_SUPPORT=ON \
-D OPENCV_EXTRA_MODULES_PATH=/home/me/code/myproject/opencv_contrib-3.2.0/modules \
-D PYTHON_EXECUTABLE=~/.envs/myenv/bin/python \
..
make -j4
sudo make install
sudo ldconfig
Here's the CMake log:
-- Found VTK ver. 6.2.0 (usefile: /usr/lib/cmake/vtk-6.2/UseVTK.cmake)
-- Caffe: NO
-- Protobuf: YES
-- Glog: NO
-- freetype2: YES
-- harfbuzz: YES
-- Module opencv_sfm disabled because the following dependencies are not found: Glog/Gflags
-- freetype2: YES
-- harfbuzz: YES
-- Checking for modules 'tesseract;lept'
-- No package 'tesseract' found
-- No package 'lept' found
-- Tesseract: NO
-- Check contents of vgg_generated_48.i ...
-- Check contents of vgg_generated_64.i ...
-- Check contents of vgg_generated_80.i ...
-- Check contents of vgg_generated_120.i ...
-- Check contents of boostdesc_bgm.i ...
-- Check contents of boostdesc_bgm_bi.i ...
-- Check contents of boostdesc_bgm_hd.i ...
-- Check contents of boostdesc_binboost_064.i ...
-- Check contents of boostdesc_binboost_128.i ...
-- Check contents of boostdesc_binboost_256.i ...
-- Check contents of boostdesc_lbgm.i ...
--
-- General configuration for OpenCV 3.2.0 =====================================
-- Version control: 817bd7b-dirty
--
-- Extra modules:
-- Location (extra): /home/me/code/myproject/opencv_contrib-3.2.0/modules
-- Version control (extra): 817bd7b-dirty
--
-- Platform:
-- Timestamp: 2017-07-20T18:25:26Z
-- Host: Linux 4.8.0-58-generic x86_64
-- CMake: 3.5.1
-- CMake generator: Unix Makefiles
-- CMake build tool: /usr/bin/make
-- Configuration: Release
--
-- C/C++:
-- Built as dynamic libs?: YES
-- C++ Compiler: /usr/bin/c++ (ver 5.4.0)
-- C++ flags (Release): -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wundef -Winit-self -Wpointer-arith -Wshadow -Wsign-promo -Wno-narrowing -Wno-delete-non-virtual-dtor -Wno-comment -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -msse -msse2 -mno-avx -msse3 -mno-ssse3 -mno-sse4.1 -mno-sse4.2 -ffunction-sections -fvisibility=hidden -fvisibility-inlines-hidden -O3 -DNDEBUG -DNDEBUG
-- C++ flags (Debug): -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wundef -Winit-self -Wpointer-arith -Wshadow -Wsign-promo -Wno-narrowing -Wno-delete-non-virtual-dtor -Wno-comment -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -msse -msse2 -mno-avx -msse3 -mno-ssse3 -mno-sse4.1 -mno-sse4.2 -ffunction-sections -fvisibility=hidden -fvisibility-inlines-hidden -g -O0 -DDEBUG -D_DEBUG
-- C Compiler: /usr/bin/cc
-- C flags (Release): -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wshadow -Wno-narrowing -Wno-comment -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -msse -msse2 -mno-avx -msse3 -mno-ssse3 -mno-sse4.1 -mno-sse4.2 -ffunction-sections -fvisibility=hidden -O3 -DNDEBUG -DNDEBUG
-- C flags (Debug): -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wshadow -Wno-narrowing -Wno-comment -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -msse -msse2 -mno-avx -msse3 -mno-ssse3 -mno-sse4.1 -mno-sse4.2 -ffunction-sections -fvisibility=hidden -g -O0 -DDEBUG -D_DEBUG
-- Linker flags (Release):
-- Linker flags (Debug):
-- ccache: NO
-- Precompiled headers: NO
-- Extra dependencies: Qt5::Test Qt5::Concurrent Qt5::OpenGL /usr/lib/x86_64-linux-gnu/libwebp.so /usr/lib/x86_64-linux-gnu/libjasper.so /usr/lib/x86_64-linux-gnu/libImath.so /usr/lib/x86_64-linux-gnu/libIlmImf.so /usr/lib/x86_64-linux-gnu/libIex.so /usr/lib/x86_64-linux-gnu/libHalf.so /usr/lib/x86_64-linux-gnu/libIlmThread.so /usr/lib/libgdal.so dc1394 xine avcodec-ffmpeg avformat-ffmpeg avutil-ffmpeg swscale-ffmpeg Qt5::Core Qt5::Gui Qt5::Widgets /usr/lib/x86_64-linux-gnu/hdf5/serial/lib/libhdf5.so /usr/lib/x86_64-linux-gnu/libpthread.so /usr/lib/x86_64-linux-gnu/libsz.so /usr/lib/x86_64-linux-gnu/libdl.so /usr/lib/x86_64-linux-gnu/libm.so vtkRenderingOpenGL vtkImagingHybrid vtkIOImage vtkCommonDataModel vtkCommonMath vtkCommonCore vtksys vtkCommonMisc vtkCommonSystem vtkCommonTransforms vtkCommonExecutionModel vtkDICOMParser vtkIOCore /usr/lib/x86_64-linux-gnu/libz.so vtkmetaio /usr/lib/x86_64-linux-gnu/libjpeg.so /usr/lib/x86_64-linux-gnu/libpng.so /usr/lib/x86_64-linux-gnu/libtiff.so vtkImagingCore vtkRenderingCore vtkCommonColor vtkFiltersExtraction vtkFiltersCore vtkFiltersGeneral vtkCommonComputationalGeometry vtkFiltersStatistics vtkImagingFourier vtkalglib vtkFiltersGeometry vtkFiltersSources vtkInteractionStyle vtkRenderingLOD vtkFiltersModeling vtkIOPLY vtkIOGeometry /usr/lib/x86_64-linux-gnu/libjsoncpp.so vtkFiltersTexture vtkRenderingFreeType /usr/lib/x86_64-linux-gnu/libfreetype.so vtkftgl vtkIOExport vtkRenderingAnnotation vtkImagingColor vtkRenderingContext2D vtkRenderingGL2PS vtkRenderingContextOpenGL /usr/lib/libgl2ps.so vtkRenderingLabel dl m pthread rt /usr/lib/x86_64-linux-gnu/libGLU.so /usr/lib/x86_64-linux-gnu/libGL.so tbb
-- 3rdparty dependencies: libprotobuf
--
-- OpenCV modules:
-- To be built: core flann hdf imgproc ml photo reg surface_matching video viz dnn freetype fuzzy imgcodecs shape videoio highgui objdetect plot superres ts xobjdetect xphoto bgsegm bioinspired dpm face features2d line_descriptor saliency text calib3d ccalib cvv datasets rgbd stereo tracking videostab xfeatures2d ximgproc aruco optflow phase_unwrapping stitching structured_light java python2 python3
-- Disabled: world contrib_world
-- Disabled by dependency: -
-- Unavailable: cudaarithm cudabgsegm cudacodec cudafeatures2d cudafilters cudaimgproc cudalegacy cudaobjdetect cudaoptflow cudastereo cudawarping cudev cnn_3dobj matlab sfm
--
-- GUI:
-- QT 5.x: YES (ver 5.5.1)
-- QT OpenGL support: YES (Qt5::OpenGL 5.5.1)
-- OpenGL support: YES (/usr/lib/x86_64-linux-gnu/libGLU.so /usr/lib/x86_64-linux-gnu/libGL.so)
-- VTK support: YES (ver 6.2.0)
--
-- Media I/O:
-- ZLib: /usr/lib/x86_64-linux-gnu/libz.so (ver 1.2.8)
-- JPEG: /usr/lib/x86_64-linux-gnu/libjpeg.so (ver )
-- WEBP: /usr/lib/x86_64-linux-gnu/libwebp.so (ver encoder: 0x0202)
-- PNG: /usr/lib/x86_64-linux-gnu/libpng.so (ver 1.2.54)
-- TIFF: /usr/lib/x86_64-linux-gnu/libtiff.so (ver 42 - 4.0.6)
-- JPEG 2000: /usr/lib/x86_64-linux-gnu/libjasper.so (ver 1.900.1)
-- OpenEXR: /usr/lib/x86_64-linux-gnu/libImath.so /usr/lib/x86_64-linux-gnu/libIlmImf.so /usr/lib/x86_64-linux-gnu/libIex.so /usr/lib/x86_64-linux-gnu/libHalf.so /usr/lib/x86_64-linux-gnu/libIlmThread.so (ver 2.2.0)
-- GDAL: /usr/lib/libgdal.so
-- GDCM: NO
--
-- Video I/O:
-- DC1394 1.x: NO
-- DC1394 2.x: YES (ver 2.2.4)
-- FFMPEG: YES
-- avcodec: YES (ver 56.60.100)
-- avformat: YES (ver 56.40.101)
-- avutil: YES (ver 54.31.100)
-- swscale: YES (ver 3.1.101)
-- avresample: NO
-- GStreamer: NO
-- OpenNI: NO
-- OpenNI PrimeSensor Modules: NO
-- OpenNI2: NO
-- PvAPI: NO
-- GigEVisionSDK: NO
-- Aravis SDK: NO
-- UniCap: NO
-- UniCap ucil: NO
-- V4L/V4L2: NO/YES
-- XIMEA: NO
-- Xine: YES (ver 1.2.6)
-- gPhoto2: NO
--
-- Parallel framework: TBB (ver 4.4 interface 9002)
--
-- Other third-party libraries:
-- Use IPP: 9.0.1 [9.0.1]
-- at: /home/me/code/myproject/OpenCV-3.2.0/build/3rdparty/ippicv/ippicv_lnx
-- Use IPP Async: NO
-- Use VA: NO
-- Use Intel VA-API/OpenCL: NO
-- Use Lapack: NO
-- Use Eigen: YES (ver 3.2.92)
-- Use Cuda: NO
-- Use OpenCL: YES
-- Use OpenVX: NO
-- Use custom HAL: NO
--
-- OpenCL: <Dynamic loading of OpenCL library>
-- Include path: /home/me/code/myproject/OpenCV-3.2.0/3rdparty/include/opencl/1.2
-- Use AMDFFT: NO
-- Use AMDBLAS: NO
--
-- Python 2:
-- Interpreter: /home/me/.envs/myenv/bin/python (ver 2.7.12)
-- Libraries: /usr/lib/x86_64-linux-gnu/libpython2.7.so (ver 2.7.12)
-- numpy: /home/me/.envs/myenv/local/lib/python2.7/site-packages/numpy/core/include (ver 1.13.1)
-- packages path: lib/python2.7/site-packages
--
-- Python 3:
-- Interpreter: /usr/bin/python3 (ver 3.5.2)
-- Libraries: /usr/lib/x86_64-linux-gnu/libpython3.5m.so (ver 3.5.2)
-- numpy: /usr/lib/python3/dist-packages/numpy/core/include (ver 1.11.0)
-- packages path: lib/python3.5/dist-packages
--
-- Python (for build): /home/me/.envs/myenv/bin/python
--
-- Java:
-- ant: /usr/bin/ant (ver 1.9.6)
-- JNI: /usr/lib/jvm/default-java/include /usr/lib/jvm/default-java/include/linux /usr/lib/jvm/default-java/include
-- Java wrappers: YES
-- Java tests: YES
--
-- Matlab: Matlab not found or implicitly disabled
--
-- Documentation:
-- Doxygen: /usr/bin/doxygen (ver 1.8.11)
--
-- Tests and samples:
-- Tests: YES
-- Performance tests: YES
-- C/C++ Examples: YES
--
-- Install path: /usr/local
--
-- cvconfig.h is in: /home/me/code/myproject/OpenCV-3.2.0/build
-- -----------------------------------------------------------------
--
Unfortunately, while this works and I can import cv2 in the shell, it cannot read video using the above script, probably due to incorrect compilation or linking of ffmpeg? The confusing part is that the system-wide installation of OpenCV works fine, even without ffmpeg installed! What am I doing wrong? How can I get OpenCV working with a virtualenv?
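
Not an authoritative answer, but two checks commonly suggested for this situation, with example paths that may differ on a given system (Ubuntu sometimes installs into dist-packages rather than site-packages):

# See which video backends the cv2 that this interpreter imports was built with.
python -c "import cv2; print(cv2.getBuildInformation())" | grep -iE 'ffmpeg|gstreamer|v4l'
# Locate the freshly built binding and expose it to the virtualenv via a symlink
# (adjust the source path to wherever 'make install' actually put cv2.so).
find /usr/local/lib -name 'cv2*.so'
ln -s /usr/local/lib/python2.7/site-packages/cv2.so ~/.envs/myenv/lib/python2.7/site-packages/cv2.so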
====
EDIT: Running the C++ video writing example has this result:
$ cd /home/me/code/myproject/OpenCV-3.2.0/build/bin
$ ./cpp-tutorial-video-write ../../samples/data/vtest.avi R Y
------------------------------------------------------------------------------
This program shows how to write video files.
You can extract the R or G or B color channel of the input video.
Usage:
./video-write [ R | G | B] [Y | N]
------------------------------------------------------------------------------
OpenCV: FFMPEG: tag 0xffffffff/'����' is not found (format 'avi / AVI (Audio Video Interleaved)')'
(cpp-tutorial-video-write:19523): GStreamer-CRITICAL **: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
OpenCV Error: Unsupported format or combination of formats (Gstreamer Opencv backend does not support this codec.) in CvVideoWriter_GStreamer::open, file /home/me/code/myproject/OpenCV-3.2.0/modules/videoio/src/cap_gstreamer.cpp, line 1388
VIDEOIO(cvCreateVideoWriter_GStreamer(filename, fourcc, fps, frameSize, is_color)): raised OpenCV exception:
/home/me/code/myproject/OpenCV-3.2.0/modules/videoio/src/cap_gstreamer.cpp:1388: error: (-210) Gstreamer Opencv backend does not support this codec. in function CvVideoWriter_GStreamer::open
Could not open the output video for write: ../../samples/data/vtest.avi
And the opencv_test_videoio unit test reports the following: https://pastebin.com/q4mf224Q
However, running the C++ video starter example DOES work, with the following command and output; I can see the webcam working and streaming video in the highgui interface:
$ ./cpp-example-videocapture_starter 0
VIDEOIO ERROR: V4L: device 0: Unable to query number of channels
(ERROR)icvOpenAVI_XINE(): Unable to initialize video driver.
GStreamer: Error opening bin: no element "0"
press space to save a picture. q or esc to quit
init done
opengl support available
-
Audio & Video not synchronized properly if I merge multiple videos in mp4parser
1 October 2013, by maniya
I have used mp4parser to merge video clips captured with dynamic pause/record, with a maximum recording length of 6 seconds. In preview it works fine when the video is recorded with few pause/record cycles, but if I try more than 3 pause/record cycles the last video file does not get merged properly with the audio. At the start of the video the sync is OK, but near the end the video hangs while the audio keeps playing for the remaining duration, about 1 second.
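
As a purely illustrative debugging step, assuming ffprobe is available and using a hypothetical clip_*.mp4 naming pattern, comparing the audio and video track durations of each recorded segment before merging can show which clip introduces the drift:

# Print codec type and duration for every stream of each recorded clip;
# a clip whose audio and video durations differ will desynchronize the merge.
for f in clip_*.mp4; do
 echo "== $f"
 ffprobe -v error -show_entries stream=codec_type,duration -of default=noprint_wrappers=1 "$f"
done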
My Recording manager
public class RecordingManager implements Camera.ErrorCallback, MediaRecorder.OnErrorListener, MediaRecorder.OnInfoListener {
private static final String TAG = RecordingManager.class.getSimpleName();
private static final int FOCUS_AREA_RADIUS = 32;
private static final int FOCUS_MAX_VALUE = 1000;
private static final int FOCUS_MIN_VALUE = -1000;
private static final long MINIMUM_RECORDING_TIME = 2000;
private static final int MAXIMUM_RECORDING_TIME = 70 * 1000;
private static final long LOW_STORAGE_THRESHOLD = 5 * 1024 * 1024;
private static final long RECORDING_FILE_LIMIT = 100 * 1024 * 1024;
private boolean paused = true;
private MediaRecorder mediaRecorder = null;
private boolean recording = false;
private FrameLayout previewFrame = null;
private boolean mPreviewing = false;
// private TextureView mTextureView = null;
// private SurfaceTexture mSurfaceTexture = null;
// private boolean mSurfaceTextureReady = false;
//
private SurfaceView surfaceView = null;
private SurfaceHolder surfaceHolder = null;
private boolean surfaceViewReady = false;
private Camera camera = null;
private Camera.Parameters cameraParameters = null;
private CamcorderProfile camcorderProfile = null;
private int mOrientation = -1;
private OrientationEventListener mOrientationEventListener = null;
private long mStartRecordingTime;
private int mVideoWidth;
private int mVideoHeight;
private long mStorageSpace;
private Handler mHandler = new Handler();
// private Runnable mUpdateRecordingTimeTask = new Runnable() {
// @Override
// public void run() {
// long recordingTime = System.currentTimeMillis() - mStartRecordingTime;
// Log.d(TAG, String.format("Recording time:%d", recordingTime));
// mHandler.postDelayed(this, CLIP_GRAPH_UPDATE_INTERVAL);
// }
// };
private Runnable mStopRecordingTask = new Runnable() {
@Override
public void run() {
stopRecording();
}
};
private static RecordingManager mInstance = null;
private Activity currentActivity = null;
private String destinationFilepath = "";
private String snapshotFilepath = "";
public static RecordingManager getInstance(Activity activity, FrameLayout previewFrame) {
if (mInstance == null || mInstance.currentActivity != activity) {
mInstance = new RecordingManager(activity, previewFrame);
}
return mInstance;
}
private RecordingManager(Activity activity, FrameLayout previewFrame) {
currentActivity = activity;
this.previewFrame = previewFrame;
}
public int getVideoWidth() {
return this.mVideoWidth;
}
public int getVideoHeight() {
return this.mVideoHeight;
}
public void setDestinationFilepath(String filepath) {
this.destinationFilepath = filepath;
}
public String getDestinationFilepath() {
return this.destinationFilepath;
}
public void setSnapshotFilepath(String filepath) {
this.snapshotFilepath = filepath;
}
public String getSnapshotFilepath() {
return this.snapshotFilepath;
}
public void init(String videoPath, String snapshotPath) {
Log.v(TAG, "init.");
setDestinationFilepath(videoPath);
setSnapshotFilepath(snapshotPath);
if (!Utils.isExternalStorageAvailable()) {
showStorageErrorAndFinish();
return;
}
openCamera();
if (camera == null) {
showCameraErrorAndFinish();
return;
}
} // end init()
public void onResume() {
Log.v(TAG, "onResume.");
paused = false;
// Open the camera
if (camera == null) {
openCamera();
if (camera == null) {
showCameraErrorAndFinish();
return;
}
}
// Initialize the surface texture or surface view
// if (useTexture() && mTextureView == null) {
// initTextureView();
// mTextureView.setVisibility(View.VISIBLE);
// } else if (!useTexture() && mSurfaceView == null) {
initSurfaceView();
surfaceView.setVisibility(View.VISIBLE);
// }
// Start the preview
if (!mPreviewing) {
startPreview();
}
}
private void openCamera() {
Log.v(TAG, "openCamera");
try {
camera = Camera.open();
camera.setErrorCallback(this);
camera.setDisplayOrientation(90); // Since we only support portrait mode
cameraParameters = camera.getParameters();
} catch (RuntimeException e) {
e.printStackTrace();
camera = null;
}
}
private void closeCamera() {
Log.v(TAG, "closeCamera");
if (camera == null) {
Log.d(TAG, "Already stopped.");
return;
}
camera.setErrorCallback(null);
if (mPreviewing) {
stopPreview();
}
camera.release();
camera = null;
}
private void initSurfaceView() {
surfaceView = new SurfaceView(currentActivity);
surfaceView.getHolder().addCallback(new SurfaceViewCallback());
surfaceView.setVisibility(View.GONE);
FrameLayout.LayoutParams params = new LayoutParams(
LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT, Gravity.CENTER);
surfaceView.setLayoutParams(params);
Log.d(TAG, "add surface view to preview frame");
previewFrame.addView(surfaceView);
}
private void releaseSurfaceView() {
if (surfaceView != null) {
previewFrame.removeAllViews();
surfaceView = null;
surfaceHolder = null;
surfaceViewReady = false;
}
}
private void startPreview() {
// if ((useTexture() && !mSurfaceTextureReady) || (!useTexture() && !mSurfaceViewReady)) {
// return;
// }
Log.v(TAG, "startPreview.");
if (mPreviewing) {
stopPreview();
}
setCameraParameters();
resizePreview();
try {
// if (useTexture()) {
// mCamera.setPreviewTexture(mSurfaceTexture);
// } else {
camera.setPreviewDisplay(surfaceHolder);
// }
camera.startPreview();
mPreviewing = true;
} catch (Exception e) {
closeCamera();
e.printStackTrace();
Log.e(TAG, "startPreview failed.");
}
}
private void stopPreview() {
Log.v(TAG, "stopPreview");
if (camera != null) {
camera.stopPreview();
mPreviewing = false;
}
}
public void onPause() {
paused = true;
if (recording) {
stopRecording();
}
closeCamera();
// if (useTexture()) {
// releaseSurfaceTexture();
// } else {
releaseSurfaceView();
// }
}
private void setCameraParameters() {
if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)) {
camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
} else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_480P)) {
camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_480P);
} else {
camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
}
mVideoWidth = camcorderProfile.videoFrameWidth;
mVideoHeight = camcorderProfile.videoFrameHeight;
camcorderProfile.fileFormat = MediaRecorder.OutputFormat.MPEG_4;
camcorderProfile.videoFrameRate = 30;
Log.v(TAG, "mVideoWidth=" + mVideoWidth + " mVideoHeight=" + mVideoHeight);
cameraParameters.setPreviewSize(mVideoWidth, mVideoHeight);
if (cameraParameters.getSupportedWhiteBalance().contains(Camera.Parameters.WHITE_BALANCE_AUTO)) {
cameraParameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_AUTO);
}
if (cameraParameters.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
cameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
}
cameraParameters.setRecordingHint(true);
cameraParameters.set("cam_mode", 1);
camera.setParameters(cameraParameters);
cameraParameters = camera.getParameters();
camera.setDisplayOrientation(90);
android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
Log.d(TAG, info.orientation + " degree");
}
private void resizePreview() {
Log.d(TAG, String.format("Video size:%d|%d", mVideoWidth, mVideoHeight));
Point optimizedSize = getOptimizedPreviewSize(mVideoWidth, mVideoHeight);
Log.d(TAG, String.format("Optimized size:%d|%d", optimizedSize.x, optimizedSize.y));
ViewGroup.LayoutParams params = (ViewGroup.LayoutParams) previewFrame.getLayoutParams();
params.width = optimizedSize.x;
params.height = optimizedSize.y;
previewFrame.setLayoutParams(params);
}
public void setOrientation(int ori) {
this.mOrientation = ori;
}
public void setOrientationEventListener(OrientationEventListener listener) {
this.mOrientationEventListener = listener;
}
public Camera getCamera() {
return camera;
}
@SuppressWarnings("serial")
public void setFocusArea(float x, float y) {
if (camera != null) {
int viewWidth = surfaceView.getWidth();
int viewHeight = surfaceView.getHeight();
int focusCenterX = FOCUS_MAX_VALUE - (int) (x / viewWidth * (FOCUS_MAX_VALUE - FOCUS_MIN_VALUE));
int focusCenterY = FOCUS_MIN_VALUE + (int) (y / viewHeight * (FOCUS_MAX_VALUE - FOCUS_MIN_VALUE));
final int left = focusCenterY - FOCUS_AREA_RADIUS < FOCUS_MIN_VALUE ? FOCUS_MIN_VALUE : focusCenterY - FOCUS_AREA_RADIUS;
final int top = focusCenterX - FOCUS_AREA_RADIUS < FOCUS_MIN_VALUE ? FOCUS_MIN_VALUE : focusCenterX - FOCUS_AREA_RADIUS;
final int right = focusCenterY + FOCUS_AREA_RADIUS > FOCUS_MAX_VALUE ? FOCUS_MAX_VALUE : focusCenterY + FOCUS_AREA_RADIUS;
final int bottom = focusCenterX + FOCUS_AREA_RADIUS > FOCUS_MAX_VALUE ? FOCUS_MAX_VALUE : focusCenterX + FOCUS_AREA_RADIUS;
Camera.Parameters params = camera.getParameters();
params.setFocusAreas(new ArrayList() {
{
add(new Camera.Area(new Rect(left, top, right, bottom), 1000));
}
});
camera.setParameters(params);
camera.autoFocus(new AutoFocusCallback() {
@Override
public void onAutoFocus(boolean success, Camera camera) {
Log.d(TAG, "onAutoFocus");
}
});
}
}
public void startRecording(String destinationFilepath) {
if (!recording) {
updateStorageSpace();
setDestinationFilepath(destinationFilepath);
if (mStorageSpace <= LOW_STORAGE_THRESHOLD) {
Log.v(TAG, "Storage issue, ignore the start request");
Toast.makeText(currentActivity, "Storage issue, ignore the recording request", Toast.LENGTH_LONG).show();
return;
}
if (!prepareMediaRecorder()) {
Toast.makeText(currentActivity, "prepareMediaRecorder failed.", Toast.LENGTH_LONG).show();
return;
}
Log.d(TAG, "Successfully prepare media recorder.");
try {
mediaRecorder.start();
} catch (RuntimeException e) {
Log.e(TAG, "MediaRecorder start failed.");
releaseMediaRecorder();
return;
}
mStartRecordingTime = System.currentTimeMillis();
if (mOrientationEventListener != null) {
mOrientationEventListener.disable();
}
recording = true;
}
}
public void stopRecording() {
if (recording) {
if (!paused) {
// Capture at least 1 second video
long currentTime = System.currentTimeMillis();
if (currentTime - mStartRecordingTime < MINIMUM_RECORDING_TIME) {
mHandler.postDelayed(mStopRecordingTask, MINIMUM_RECORDING_TIME - (currentTime - mStartRecordingTime));
return;
}
}
if (mOrientationEventListener != null) {
mOrientationEventListener.enable();
}
// mHandler.removeCallbacks(mUpdateRecordingTimeTask);
try {
mediaRecorder.setOnErrorListener(null);
mediaRecorder.setOnInfoListener(null);
mediaRecorder.stop(); // stop the recording
Toast.makeText(currentActivity, "Video file saved.", Toast.LENGTH_LONG).show();
long stopRecordingTime = System.currentTimeMillis();
Log.d(TAG, String.format("stopRecording. file:%s duration:%d", destinationFilepath, stopRecordingTime - mStartRecordingTime));
// Calculate the duration of video
MediaMetadataRetriever mmr = new MediaMetadataRetriever();
mmr.setDataSource(this.destinationFilepath);
String _length = mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
if (_length != null) {
Log.d(TAG, String.format("clip duration:%d", Long.parseLong(_length)));
}
// Taking the snapshot of video
Bitmap snapshot = ThumbnailUtils.createVideoThumbnail(this.destinationFilepath, Thumbnails.MICRO_KIND);
try {
FileOutputStream out = new FileOutputStream(this.snapshotFilepath);
snapshot.compress(Bitmap.CompressFormat.JPEG, 70, out);
out.close();
} catch (Exception e) {
e.printStackTrace();
}
// mActivity.showPlayButton();
} catch (RuntimeException e) {
e.printStackTrace();
Log.e(TAG, e.getMessage());
// if no valid audio/video data has been received when stop() is
// called
} finally {
//
releaseMediaRecorder(); // release the MediaRecorder object
if (!paused) {
cameraParameters = camera.getParameters();
}
recording = false;
}
}
}
public void setRecorderOrientation(int orientation) {
// For back camera only
if (orientation != -1) {
Log.d(TAG, "set orientationHint:" + (orientation + 135) % 360 / 90 * 90);
mediaRecorder.setOrientationHint((orientation + 135) % 360 / 90 * 90);
}else {
Log.d(TAG, "not set orientationHint to mediaRecorder");
}
}
private boolean prepareMediaRecorder() {
mediaRecorder = new MediaRecorder();
camera.unlock();
mediaRecorder.setCamera(camera);
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setProfile(camcorderProfile);
mediaRecorder.setMaxDuration(MAXIMUM_RECORDING_TIME);
mediaRecorder.setOutputFile(this.destinationFilepath);
try {
mediaRecorder.setMaxFileSize(Math.min(RECORDING_FILE_LIMIT, mStorageSpace - LOW_STORAGE_THRESHOLD));
} catch (RuntimeException exception) {
// setMaxFileSize may not be supported on some devices; ignore and record without a size cap.
}
setRecorderOrientation(mOrientation);
if (!useTexture()) {
mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
}
try {
mediaRecorder.prepare();
} catch (IllegalStateException e) {
releaseMediaRecorder();
return false;
} catch (IOException e) {
releaseMediaRecorder();
return false;
}
mediaRecorder.setOnErrorListener(this);
mediaRecorder.setOnInfoListener(this);
return true;
}
private void releaseMediaRecorder() {
if (mediaRecorder != null) {
mediaRecorder.reset(); // clear recorder configuration
mediaRecorder.release(); // release the recorder object
mediaRecorder = null;
camera.lock(); // lock camera for later use
}
}
private Point getOptimizedPreviewSize(int videoWidth, int videoHeight) {
Display display = currentActivity.getWindowManager().getDefaultDisplay();
Point size = new Point();
display.getSize(size);
Point optimizedSize = new Point();
optimizedSize.x = size.x;
optimizedSize.y = (int) ((float) videoWidth / (float) videoHeight * size.x);
return optimizedSize;
}
private void showCameraErrorAndFinish() {
DialogInterface.OnClickListener buttonListener = new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
currentActivity.finish();
}
};
new AlertDialog.Builder(currentActivity).setCancelable(false)
.setTitle("Camera error")
.setMessage("Cannot connect to the camera.")
.setNeutralButton("OK", buttonListener)
.show();
}
private void showStorageErrorAndFinish() {
DialogInterface.OnClickListener buttonListener = new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
currentActivity.finish();
}
};
new AlertDialog.Builder(currentActivity).setCancelable(false)
.setTitle("Storage error")
.setMessage("Cannot read external storage.")
.setNeutralButton("OK", buttonListener)
.show();
}
private void updateStorageSpace() {
mStorageSpace = getAvailableSpace();
Log.v(TAG, "updateStorageSpace mStorageSpace=" + mStorageSpace);
}
private long getAvailableSpace() {
String state = Environment.getExternalStorageState();
Log.d(TAG, "External storage state=" + state);
if (Environment.MEDIA_CHECKING.equals(state)) {
return -1;
}
if (!Environment.MEDIA_MOUNTED.equals(state)) {
return -1;
}
File directory = currentActivity.getExternalFilesDir("vine");
directory.mkdirs();
if (!directory.isDirectory() || !directory.canWrite()) {
return -1;
}
try {
StatFs stat = new StatFs(directory.getAbsolutePath());
return stat.getAvailableBlocks() * (long) stat.getBlockSize();
} catch (Exception e) {
Log.i(TAG, "Fail to access external storage", e);
}
return -1;
}
private boolean useTexture() {
return false;
// return Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1;
}
private class SurfaceViewCallback implements SurfaceHolder.Callback {
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
Log.v(TAG, "surfaceChanged. width=" + width + ". height=" + height);
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
Log.v(TAG, "surfaceCreated");
surfaceViewReady = true;
surfaceHolder = holder;
startPreview();
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
Log.d(TAG, "surfaceDestroyed");
surfaceViewReady = false;
}
}
@Override
public void onError(int error, Camera camera) {
Log.e(TAG, "Camera onError. what=" + error + ".");
if (error == Camera.CAMERA_ERROR_SERVER_DIED) {
// The media/camera server has died; the camera would need to be released and reopened.
} else if (error == Camera.CAMERA_ERROR_UNKNOWN) {
// Unspecified camera error; nothing is done here.
}
}
@Override
public void onInfo(MediaRecorder mr, int what, int extra) {
if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
stopRecording();
} else if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_FILESIZE_REACHED) {
stopRecording();
Toast.makeText(currentActivity, "Size limit reached", Toast.LENGTH_LONG).show();
}
}
@Override
public void onError(MediaRecorder mr, int what, int extra) {
Log.e(TAG, "MediaRecorder onError. what=" + what + ". extra=" + extra);
if (what == MediaRecorder.MEDIA_RECORDER_ERROR_UNKNOWN) {
stopRecording();
}
}
}

VideoUtils
public class VideoUtils {
private static final String TAG = VideoUtils.class.getSimpleName();
// Transformation matrix (a 90-degree rotation) applied to the merged video track.
static double[] matrix = new double[] { 0.0, 1.0, 0.0, -1.0, 0.0, 0.0, 0.0, 0.0, 1.0 };
public static boolean MergeFiles(String speratedDirPath,
String targetFileName) {
File videoSourceDirFile = new File(speratedDirPath);
String[] videoList = videoSourceDirFile.list();
List<Track> videoTracks = new LinkedList<Track>();
List<Track> audioTracks = new LinkedList<Track>();
for (String file : videoList) {
Log.d(TAG, "source files" + speratedDirPath
+ File.separator + file);
try {
FileChannel fc = new FileInputStream(speratedDirPath
+ File.separator + file).getChannel();
Movie movie = MovieCreator.build(fc);
for (Track t : movie.getTracks()) {
if (t.getHandler().equals("soun")) {
audioTracks.add(t);
}
if (t.getHandler().equals("vide")) {
videoTracks.add(t);
}
}
} catch (FileNotFoundException e) {
e.printStackTrace();
return false;
} catch (IOException e) {
e.printStackTrace();
return false;
}
}
Movie result = new Movie();
try {
if (audioTracks.size() > 0) {
result.addTrack(new AppendTrack(audioTracks
.toArray(new Track[audioTracks.size()])));
}
if (videoTracks.size() > 0) {
result.addTrack(new AppendTrack(videoTracks
.toArray(new Track[videoTracks.size()])));
}
IsoFile out = new DefaultMp4Builder().build(result);
FileChannel fc = new RandomAccessFile(targetFileName, "rw").getChannel();
Log.d(TAG, "target file:" + targetFileName);
// Overwrite the transformation matrix of the second TrackBox (assumed to be the video track)
// with the rotation matrix defined above.
TrackBox tb = out.getMovieBox().getBoxes(TrackBox.class).get(1);
TrackHeaderBox tkhd = tb.getTrackHeaderBox();
tkhd.setMatrix(matrix);
fc.position(0);
out.getBox(fc);
fc.close();
for (String file : videoList) {
File TBRFile = new File(speratedDirPath + File.separator + file);
TBRFile.delete();
}
boolean a = videoSourceDirFile.delete();
Log.d(TAG, "try to delete dir:" + a);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
return false;
}
return true;
}
public static boolean clearFiles(String speratedDirPath) {
File videoSourceDirFile = new File(speratedDirPath);
if (videoSourceDirFile != null
&& videoSourceDirFile.listFiles() != null) {
File[] videoList = videoSourceDirFile.listFiles();
for (File video : videoList) {
video.delete();
}
videoSourceDirFile.delete();
}
return true;
}
// Snapshot helpers are left as unimplemented stubs here; both simply return 0.
public static int createSnapshot(String videoFile, int kind, String snapshotFilepath) {
return 0;
}
public static int createSnapshot(String videoFile, int width, int height, String snapshotFilepath) {
return 0;
}
}
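For context, here is a minimal sketch of how these two helpers might be driven once the per-segment clips have been recorded. The ClipMerger class, its method name, and the File arguments are my own illustration and are not part of the referenced project.

import java.io.File;

public class ClipMerger {
    // Merge every clip found in clipsDir into a single MP4 at outputFile.
    // VideoUtils.MergeFiles already deletes the individual clips and clipsDir on success,
    // so clearFiles is only needed when the merge fails.
    public static boolean mergeRecordedClips(File clipsDir, File outputFile) {
        boolean merged = VideoUtils.MergeFiles(clipsDir.getAbsolutePath(),
                outputFile.getAbsolutePath());
        if (!merged) {
            VideoUtils.clearFiles(clipsDir.getAbsolutePath());
        }
        return merged;
    }
}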
My reference code project link is