
Media (1)
-
The Slip - Artworks
26 September 2011
Updated: September 2011
Language: English
Type: Text
Other articles (73)
-
Rights management for creating and editing objects
8 February 2011 — By default, many features are restricted to administrators, but the minimum status required to use them can be configured independently, notably: writing content on the site, adjustable in the form template management; adding notes to articles; adding captions and annotations to images;
-
Supporting all media types
13 April 2011 — Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
-
Uploading media and themes via FTP
31 May 2013 — MediaSPIP also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
From the start you will find the following folders in your FTP space: config/: the site's configuration folder; IMG/: media already processed and online on the site; local/: the website's cache directory; themes/: themes and custom stylesheets; tmp/: working folder (...)
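As an illustration only (the host, credentials and theme name below are placeholders, not values taken from the article), pushing a custom theme into the themes/ folder with lftp could look like this:

lftp -u user,password ftp.example.org -e "mirror -R ./my-theme themes/my-theme; bye"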
On other sites (7291)
-
Live video streaming not playing on dash.js player using ffmpeg, nginx-rtmp on Raspberry Pi
15 August 2016, by sparks — I am trying to stream live video from a Raspberry Pi and the Pi camera module to a web browser, but the video doesn't play. I am using nginx-rtmp and ffmpeg to process the video. On the client side I use the dash.js player to try to play the video, but no luck. I see these warnings in the terminal:
[flv @ 0x2c3b950] Failed to update header with correct duration.
[flv @ 0x2c3b950] Failed to update header with correct filesize.
I am not sure whether this could be the issue. Does anybody know what might be wrong? I don't expect a straight answer, but I am sure somebody can point me in the right direction. Thank you in advance. Here is my setup.
Nginx.conf
events {
worker_connections 1024;
}
http {
include mime.types;
default_type application/octet-stream;
sendfile on;
keepalive_timeout 65;
server {
listen 80;
server_name 192.168.1.114 localhost;
location / {
root /var/www;
index index.html index.htm;
}
location /dash {
root /var/www;
add_header Cache-Control no-cache;
}
location /dash.js{
root /var/www;
}
location /hls {
types {
application/vnd.apple.mpegurl m3u8;
video/mp2t ts;
}
root /var/www;
index index.html;
add_header Cache-Control no-cache;
}
location /rtmpcontrol{
rtmp_control all;
}
location /rtmpstat{
rtmp_stat all;
}
#error_page 404 /404.html;
error_page 500 502 503 504 /50x.html;
location = /50x.html {
root html;
}
}
rtmp {
server {
listen 1935;
chunk_size 4000;
application rtmp {
live on;
hls on;
dash on;
dash_path /var/www/dash;
hls_path /var/www/hls;
}
}
}
baseline.html
<script src='http://stackoverflow.com/feeds/tag/dash.all.js'></script>
<script>
function getUrlVars() {
var vars = {};
var parts = window.location.href.replace(/[?&]+([^=&]+)=([^&]*)/gi, function(m, key, value) {
vars[key] = value;
});
return vars;
}

function startVideo() {
var vars = getUrlVars(),
url = "http://192.168.1.114:80/dash/stream.mpd",
video,
context,
player;

if (vars && vars.hasOwnProperty("url")) {
url = vars.url;
}

video = document.querySelector(".dash-video-player video");
context = new Dash.di.DashContext();
player = new MediaPlayer(context);

player.startup();

player.attachView(video);
player.setAutoPlay(false);

player.attachSource(url);
}
</script>
<body onload="startVideo()">
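The markup excerpt stops right after the opening body tag, but startVideo() looks up its target with document.querySelector(".dash-video-player video"), so the page that is actually served needs a matching container and video element. A quick, hedged way to check what the browser really receives (assuming baseline.html sits in the /var/www root configured above):

curl -s http://192.168.1.114/baseline.html | grep -E 'dash-video-player|<video'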
How I grab and push the video stream:
raspivid -w 640 -h 480 -fps 25 -t 0 -b 1800000 -o - | ffmpeg -y -f h264 -i - -vcodec libx264 -f flv -rtmp_buffer 100 -rtmp_live live rtmp://localhost:1935/rtmp/stream
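Note that this command re-encodes with libx264's defaults (keyint=250, i.e. a 10-second GOP at 25 fps, as the log below shows), and nginx-rtmp can only cut DASH/HLS fragments on keyframes. A variant with an explicit, shorter GOP is sketched below; the added options are assumptions for illustration, not part of the original setup:

raspivid -w 640 -h 480 -fps 25 -t 0 -b 1800000 -o - | \
  ffmpeg -y -f h264 -i - -c:v libx264 -preset ultrafast -tune zerolatency \
         -g 50 -keyint_min 25 -an -f flv rtmp://localhost:1935/rtmp/stream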
When I run my command, I see the following on the terminal:
ffmpeg version N-81256-gd3426fb Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.9.2 (Raspbian 4.9.2-10)
configuration: --enable-gpl --enable-libx264 --enable-nonfree
libavutil 55. 28.100 / 55. 28.100
libavcodec 57. 51.100 / 57. 51.100
libavformat 57. 44.100 / 57. 44.100
libavdevice 57. 0.102 / 57. 0.102
libavfilter 6. 49.100 / 6. 49.100
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 1.100 / 2. 1.100
libpostproc 54. 0.100 / 54. 0.100
Input #0, h264, from 'pipe:':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: h264 (High), yuv420p, 640x480, 25 fps, 25 tbr, 1200k tbn, 50 tbc
[tcp @ 0x2c3c850] Connection to tcp://localhost:1935 failed (Connection refused), trying next address
[libx264 @ 0x2c50d80] using cpu capabilities: ARMv6 NEON
[libx264 @ 0x2c50d80] profile High, level 3.0
[libx264 @ 0x2c50d80] 264 - core 148 r2705 3f5ed56 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
[flv @ 0x2c3b950] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
Output #0, flv, to 'rtmp://localhost:1935/rtmp/stream':
Metadata:
encoder : Lavf57.44.100
Stream #0:0: Video: h264 (libx264) ([7][0][0][0] / 0x0007), yuv420p, 640x480, q=-1--1, 25 fps, 1k tbn, 25 tbc
Metadata:
encoder : Lavc57.51.100 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
^Cmmal: Aborting program.0 size= 930kB time=00:00:45.44 bitrate= 167.7kbits/s speed=1.03x
[flv @ 0x2c3b950] Failed to update header with correct duration.
[flv @ 0x2c3b950] Failed to update header with correct filesize.
frame= 1197 fps= 26 q=-1.0 Lsize= 976kB time=00:00:47.76 bitrate= 167.5kbits/s speed=1.04x
video:953kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.484349%
[libx264 @ 0x2c50d80] frame I:5 Avg QP:18.91 size: 3600
[libx264 @ 0x2c50d80] frame P:299 Avg QP:21.38 size: 1486
[libx264 @ 0x2c50d80] frame B:893 Avg QP:20.62 size: 574
[libx264 @ 0x2c50d80] consecutive B-frames: 0.4% 0.0% 1.0% 98.6%
[libx264 @ 0x2c50d80] mb I I16..4: 10.1% 87.1% 2.8%
[libx264 @ 0x2c50d80] mb P I16..4: 3.8% 7.4% 0.0% P16..4: 34.2% 2.1% 1.5% 0.0% 0.0% skip:51.0%
[libx264 @ 0x2c50d80] mb B I16..4: 0.2% 0.3% 0.0% B16..8: 20.1% 0.4% 0.0% direct: 3.8% skip:75.1% L0:48.6% L1:50.9% BI: 0.6%
[libx264 @ 0x2c50d80] 8x8 transform intra:68.1% inter:97.0%
[libx264 @ 0x2c50d80] coded y,uvDC,uvAC intra: 10.1% 27.5% 1.8% inter: 2.0% 12.9% 0.1%
[libx264 @ 0x2c50d80] i16 v,h,dc,p: 18% 20% 9% 53%
[libx264 @ 0x2c50d80] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 21% 11% 58% 2% 2% 1% 2% 1% 1%
[libx264 @ 0x2c50d80] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 22% 26% 30% 3% 4% 4% 6% 2% 2%
[libx264 @ 0x2c50d80] i8c dc,h,v,p: 66% 18% 15% 1%
[libx264 @ 0x2c50d80] Weighted P-Frames: Y:1.7% UV:1.3%
[libx264 @ 0x2c50d80] ref P L0: 58.2% 2.2% 25.5% 14.1% 0.1%
[libx264 @ 0x2c50d80] ref B L0: 81.7% 13.3% 5.0%
[libx264 @ 0x2c50d80] ref B L1: 93.6% 6.4%
[libx264 @ 0x2c50d80] kb/s:162.87
Exiting normally, received signal 2.
In my /var/www/dash directory I have:
stream-0.m4a stream-19200.m4a stream-29200.m4a
stream-9600.m4a stream-init.m4a stream.mpd stream-raw.m4v
stream-0.m4v stream-19200.m4v stream-29200.m4v stream-9600.m4v stream-init.m4v stream-raw.m4a
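Two sanity checks are worth doing with the output above. The single "Connection to tcp://localhost:1935 failed (Connection refused), trying next address" line usually just means one of the localhost addresses (IPv6 vs IPv4) was tried first, and the segment files listed here suggest the RTMP push itself worked; what remains to be confirmed is that the manifest is reachable over HTTP from the browser's side. A few hedged checks on the Pi, using only the paths and address from the config above:

ss -ltn | grep 1935                             # is the RTMP listener up?
ls -l /var/www/dash                             # are stream.mpd and the segments being refreshed?
curl -I http://192.168.1.114/dash/stream.mpd    # can a client fetch the manifest through nginx?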
-
How to get .mp4 videos from motion on a Raspberry Pi?
9 October 2016, by Maarti — I use motion on my laptop and it works perfectly with any format. But when I use it on my Raspberry Pi 3 (Raspbian Jessie) with the Raspberry Camera V2, the only formats that work are .avi and .swf. When I choose any other format, the output video is a "0 sec video" that is played and closed instantly.
I would like to have .mp4 or .ogg output so I can read it easily with HTML5. Here is the motion codec documentation.
Here is my config file:
############################################################
# Daemon
############################################################
# Start in daemon (background) mode and release terminal (default: off)
daemon on
# File to store the process ID, also called pid file. (default: not defined)
process_id_file /var/run/motion/motion.pid
############################################################
# Basic Setup Mode
############################################################
# Start in Setup-Mode, daemon disabled. (default: off)
setup_mode off
# Use a file to save logs messages, if not defined stderr and syslog is used. (default: not defined)
#logfile /mnt/camshare/Cam1/motion.log
logfile /tmp/motion.log
# Level of log messages [1..9] (EMR, ALR, CRT, ERR, WRN, NTC, INF, DBG, ALL). (default: 6 / NTC)
log_level 2
# Filter to log messages by type (COR, STR, ENC, NET, DBL, EVT, TRK, VID, ALL). (default: ALL)
log_type all
###########################################################
# Capture device options
############################################################
# Videodevice to be used for capturing (default /dev/video0)
# for FreeBSD default is /dev/bktr0
#videodevice /dev/video0
# v4l2_palette allows to choose preferable palette to be use by motion
# to capture from those supported by your videodevice. (default: 17)
# E.g. if your videodevice supports both V4L2_PIX_FMT_SBGGR8 and
# V4L2_PIX_FMT_MJPEG then motion will by default use V4L2_PIX_FMT_MJPEG.
# Setting v4l2_palette to 2 forces motion to use V4L2_PIX_FMT_SBGGR8
# instead.
#
# Values :
# V4L2_PIX_FMT_SN9C10X : 0 'S910'
# V4L2_PIX_FMT_SBGGR16 : 1 'BYR2'
# V4L2_PIX_FMT_SBGGR8 : 2 'BA81'
# V4L2_PIX_FMT_SPCA561 : 3 'S561'
# V4L2_PIX_FMT_SGBRG8 : 4 'GBRG'
# V4L2_PIX_FMT_SGRBG8 : 5 'GRBG'
# V4L2_PIX_FMT_PAC207 : 6 'P207'
# V4L2_PIX_FMT_PJPG : 7 'PJPG'
# V4L2_PIX_FMT_MJPEG : 8 'MJPEG'
# V4L2_PIX_FMT_JPEG : 9 'JPEG'
# V4L2_PIX_FMT_RGB24 : 10 'RGB3'
# V4L2_PIX_FMT_SPCA501 : 11 'S501'
# V4L2_PIX_FMT_SPCA505 : 12 'S505'
# V4L2_PIX_FMT_SPCA508 : 13 'S508'
# V4L2_PIX_FMT_UYVY : 14 'UYVY'
# V4L2_PIX_FMT_YUYV : 15 'YUYV'
# V4L2_PIX_FMT_YUV422P : 16 '422P'
# V4L2_PIX_FMT_YUV420 : 17 'YU12'
#
v4l2_palette 7
# Tuner device to be used for capturing using tuner as source (default /dev/tuner0)
# This is ONLY used for FreeBSD. Leave it commented out for Linux
; tunerdevice /dev/tuner0
# The video input to be used (default: -1)
# Should normally be set to 0 or 1 for video/TV cards, and -1 for USB cameras
input -1
# The video norm to use (only for video capture and TV tuner cards)
# Values: 0 (PAL), 1 (NTSC), 2 (SECAM), 3 (PAL NC no colour). Default: 0 (PAL)
norm 0
# The frequency to set the tuner to (kHz) (only for TV tuner cards) (default: 0)
frequency 0
# Rotate image this number of degrees. The rotation affects all saved images as
# well as movies. Valid values: 0 (default = no rotation), 90, 180 and 270.
rotate 0
# Image width (pixels). Valid range: Camera dependent, default: 352
#width 1024
width 640
# Image height (pixels). Valid range: Camera dependent, default: 288
#height 576
height 480
# Maximum number of frames to be captured per second.
# Valid range: 2-100. Default: 100 (almost no limit).
framerate 15
# Minimum time in seconds between capturing picture frames from the camera.
# Default: 0 = disabled - the capture rate is given by the camera framerate.
# This option is used when you want to capture images at a rate lower than 2 per second.
minimum_frame_time 0
# URL to use if you are using a network camera, size will be autodetected (incl http:// ftp:// mjpg:// or file:///)
# Must be a URL that returns single jpeg pictures or a raw mjpeg stream. Default: Not defined
;netcam_url http://127.0.0.1/cgi-bin/raspicam.sh
# Username and password for network camera (only if required). Default: not defined
# Syntax is user:password
; netcam_userpass value
# The setting for keep-alive of network socket, should improve performance on compatible net cameras.
# off: The historical implementation using HTTP/1.0, closing the socket after each http request.
# force: Use HTTP/1.0 requests with keep alive header to reuse the same connection.
# on: Use HTTP/1.1 requests that support keep alive as default.
# Default: off
netcam_keepalive off
# URL to use for a netcam proxy server, if required, e.g. "http://myproxy".
# If a port number other than 80 is needed, use "http://myproxy:1234".
# Default: not defined
; netcam_proxy value
# Set less strict jpeg checks for network cameras with a poor/buggy firmware.
# Default: off
netcam_tolerant_check off
# Let motion regulate the brightness of a video device (default: off).
# The auto_brightness feature uses the brightness option as its target value.
# If brightness is zero auto_brightness will adjust to average brightness value 128.
# Only recommended for cameras without auto brightness
auto_brightness off
# Set the initial brightness of a video device.
# If auto_brightness is enabled, this value defines the average brightness level
# which Motion will try and adjust to.
# Valid range 0-255, default 0 = disabled
brightness 0
# Set the contrast of a video device.
# Valid range 0-255, default 0 = disabled
contrast 0
# Set the saturation of a video device.
# Valid range 0-255, default 0 = disabled
saturation 0
# Set the hue of a video device (NTSC feature).
# Valid range 0-255, default 0 = disabled
hue 0
############################################################
# File "camera" support - read raw YUV data from a file
############################################################
#filecam_path /home/pi/test-cap/motion-mmal.capture
############################################################
# OpenMax/MMAL camera support for Raspberry Pi
############################################################
mmalcam_name vc.ril.camera
#mmalcam_control_params
#mmalcam_raw_capture_file /home/pi/motion-mmal.capture
# Switch this setting to "on" to use the still image mode of the Pi's camera
# instead of video. This gives a wider field of view, but requires
# a much slower frame-rate to achieve exposure stability
# (e.g. 0.25 fps or slower). You can use the minimum_frame_time
# parameter above to achieve this
mmalcam_use_still off
############################################################
# Round Robin (multiple inputs on same video device name)
############################################################
# Number of frames to capture in each roundrobin step (default: 1)
roundrobin_frames 1
# Number of frames to skip before each roundrobin step (default: 1)
roundrobin_skip 1
# Try to filter out noise generated by roundrobin (default: off)
switchfilter off
############################################################
# Motion Detection Settings:
############################################################
# Threshold for number of changed pixels in an image that
# triggers motion detection (default: 1500)
threshold 1500
# Automatically tune the threshold down if possible (default: off)
threshold_tune off
# Noise threshold for the motion detection (default: 32)
noise_level 32
# Automatically tune the noise threshold (default: on)
noise_tune on
# Despeckle motion image using (e)rode or (d)ilate or (l)abel (Default: not defined)
# Recommended value is EedDl. Any combination (and number of) of E, e, d, and D is valid.
# (l)abeling must only be used once and the 'l' must be the last letter.
# Comment out to disable
despeckle_filter EedDl
# Detect motion in predefined areas (1 - 9). Areas are numbered like that: 1 2 3
# A script (on_area_detected) is started immediately when motion is 4 5 6
# detected in one of the given areas, but only once during an event. 7 8 9
# One or more areas can be specified with this option. Take care: This option
# does NOT restrict detection to these areas! (Default: not defined)
; area_detect value
# PGM file to use as a sensitivity mask.
# Full path name to. (Default: not defined)
; mask_file value
# Dynamically create a mask file during operation (default: 0)
# Adjust speed of mask changes from 0 (off) to 10 (fast)
smart_mask_speed 0
# Ignore sudden massive light intensity changes given as a percentage of the picture
# area that changed intensity. Valid range: 0 - 100 , default: 0 = disabled
lightswitch 0
# Picture frames must contain motion at least the specified number of frames
# in a row before they are detected as true motion. At the default of 1, all
# motion is detected. Valid range: 1 to thousands, recommended 1-5
minimum_motion_frames 1
# Specifies the number of pre-captured (buffered) pictures from before motion
# was detected that will be output at motion detection.
# Recommended range: 0 to 5 (default: 0)
# Do not use large values! Large values will cause Motion to skip video frames and
# cause unsmooth movies. To smooth movies use larger values of post_capture instead.
pre_capture 2
# Number of frames to capture after motion is no longer detected (default: 0)
post_capture 2
# Event Gap is the seconds of no motion detection that triggers the end of an event.
# An event is defined as a series of motion images taken within a short timeframe.
# Recommended value is 60 seconds (Default). The value -1 is allowed and disables
# events causing all Motion to be written to one single movie file and no pre_capture.
# If set to 0, motion is running in gapless mode. Movies don't have gaps anymore. An
# event ends right after no more motion is detected and post_capture is over.
event_gap 60
# Maximum length in seconds of an mpeg movie
# When value is exceeded a new movie file is created. (Default: 0 = infinite)
# ATTENTION: when you're not using the motion build from the tutorial, it might fail with error 'Unknown config option "max_mpeg_time"'
# then use this line instead:
# max_movie_time 60
max_movie_time 60
# Always save images even if there was no motion (default: off)
emulate_motion off
############################################################
# Image File Output
############################################################
# Output 'normal' pictures when motion is detected (default: on)
# Valid values: on, off, first, best, center
# When set to 'first', only the first picture of an event is saved.
# Picture with most motion of an event is saved when set to 'best'.
# Picture with motion nearest center of picture is saved when set to 'center'.
# Can be used as preview shot for the corresponding movie.
output_pictures best
# Output pictures with only the pixels moving object (ghost images) (default: off)
output_debug_pictures off
# The quality (in percent) to be used by the jpeg compression (default: 75)
quality 75
# Type of output images
# Valid values: jpeg, ppm (default: jpeg)
picture_type jpeg
############################################################
# FFMPEG related options
# Film (movies) file output, and deinterlacing of the video input
# The options movie_filename and timelapse_filename are also used
# by the ffmpeg feature
############################################################
# Use ffmpeg to encode movies in realtime (default: off)
ffmpeg_output_movies on
# Use ffmpeg to make movies with only the pixels moving
# object (ghost images) (default: off)
ffmpeg_output_debug_movies off
# Use ffmpeg to encode a timelapse movie
# Default value 0 = off - else save frame every Nth second
ffmpeg_timelapse 0
# The file rollover mode of the timelapse video
# Valid values: hourly, daily (default), weekly-sunday, weekly-monday, monthly, manual
ffmpeg_timelapse_mode daily
# Bitrate to be used by the ffmpeg encoder (default: 400000)
# This option is ignored if ffmpeg_variable_bitrate is not 0 (disabled)
ffmpeg_bps 500000
# Enables and defines variable bitrate for the ffmpeg encoder.
# ffmpeg_bps is ignored if variable bitrate is enabled.
# Valid values: 0 (default) = fixed bitrate defined by ffmpeg_bps,
# or the range 2 - 31 where 2 means best quality and 31 is worst.
ffmpeg_variable_bitrate 5
# Codec to used by ffmpeg for the video compression.
# Timelapse mpegs are always made in mpeg1 format independent from this option.
# Supported formats are: mpeg1 (ffmpeg-0.4.8 only), mpeg4 (default), and msmpeg4.
# mpeg1 - gives you files with extension .mpg
# mpeg4 or msmpeg4 - gives you files with extension .avi
# msmpeg4 is recommended for use with Windows Media Player because
# it requires no installation of codec on the Windows client.
# swf - gives you a flash film with extension .swf
# flv - gives you a flash video with extension .flv
# ffv1 - FF video codec 1 for Lossless Encoding ( experimental )
# mov - QuickTime ( testing )
# ogg - Ogg/Theora ( testing )
#ffmpeg_video_codec msmpeg4
ffmpeg_video_codec mp4
# Use ffmpeg to deinterlace video. Necessary if you use an analog camera
# and see horizontal combing on moving objects in video or pictures.
# (default: off)
ffmpeg_deinterlace off
############################################################
# SDL Window
############################################################
# Number of motion thread to show in SDL Window (default: 0 = disabled)
#sdl_threadnr 0
############################################################
# External pipe to video encoder
# Replacement for FFMPEG builtin encoder for ffmpeg_output_movies only.
# The options movie_filename and timelapse_filename are also used
# by the ffmpeg feature
#############################################################
# Bool to enable or disable extpipe (default: off)
use_extpipe off
# External program (full path and opts) to pipe raw video to
# Generally, use '-' for STDIN...
;extpipe mencoder -demuxer rawvideo -rawvideo w=320:h=240:i420 -ovc x264 -x264encopts bframes=4:frameref=1:subq=1:scenecut=-1:nob_adapt:threads=1:keyint=1000:8x8dct:vbv_bufsize=4000:crf=24:partitions=i8x8,i4x4:vbv_maxrate=800:no-chroma-me -vf denoise3d=16:12:48:4,pp=lb -of avi -o %f.avi - -fps %fps
############################################################
# Snapshots (Traditional Periodic Webcam File Output)
############################################################
# Make automated snapshot every N seconds (default: 0 = disabled)
snapshot_interval 0
############################################################
# Text Display
# %Y = year, %m = month, %d = date,
# %H = hour, %M = minute, %S = second, %T = HH:MM:SS,
# %v = event, %q = frame number, %t = thread (camera) number,
# %D = changed pixels, %N = noise level, \n = new line,
# %i and %J = width and height of motion area,
# %K and %L = X and Y coordinates of motion center
# %C = value defined by text_event - do not use with text_event!
# You can put quotation marks around the text to allow
# leading spaces
############################################################
# Locate and draw a box around the moving object.
# Valid values: on, off, preview (default: off)
# Set to 'preview' will only draw a box in preview_shot pictures.
locate_motion_mode off
# Set the look and style of the locate box if enabled.
# Valid values: box, redbox, cross, redcross (default: box)
# Set to 'box' will draw the traditional box.
# Set to 'redbox' will draw a red box.
# Set to 'cross' will draw a little cross to mark center.
# Set to 'redcross' will draw a little red cross to mark center.
locate_motion_style box
# Draws the timestamp using same options as C function strftime(3)
# Default: %Y-%m-%d\n%T = date in ISO format and time in 24 hour clock
# Text is placed in lower right corner
text_right %d.%m.%Y\n%T
# Draw a user defined text on the images using same options as C function strftime(3)
# Default: Not defined = no text
# Text is placed in lower left corner
; text_left CAMERA %t
text_left HofCam
# Draw the number of changed pixels on the images (default: off)
# Will normally be set to off except when you setup and adjust the motion settings
# Text is placed in upper right corner
text_changes off
# This option defines the value of the special event conversion specifier %C
# You can use any conversion specifier in this option except %C. Date and time
# values are from the timestamp of the first image in the current event.
# Default: %Y%m%d%H%M%S
# The idea is that %C can be used filenames and text_left/right for creating
# a unique identifier for each event.
text_event %Y%m%d%H%M%S
# Draw characters at twice normal size on images. (default: off)
text_double on
# Text to include in a JPEG EXIF comment
# May be any text, including conversion specifiers.
# The EXIF timestamp is included independent of this text.
;exif_text %i%J/%K%L
############################################################
# Target Directories and filenames For Images And Films
# For the options snapshot_, picture_, movie_ and timelapse_filename
# you can use conversion specifiers
# %Y = year, %m = month, %d = date,
# %H = hour, %M = minute, %S = second,
# %v = event, %q = frame number, %t = thread (camera) number,
# %D = changed pixels, %N = noise level,
# %i and %J = width and height of motion area,
# %K and %L = X and Y coordinates of motion center
# %C = value defined by text_event
# Quotation marks round string are allowed.
############################################################
# Target base directory for pictures and films
# Recommended to use absolute path. (Default: current working directory)
target_dir /home/pi
# File path for snapshots (jpeg or ppm) relative to target_dir
# Default: %v-%Y%m%d%H%M%S-snapshot
# Default value is equivalent to legacy oldlayout option
# For Motion 3.0 compatible mode choose: %Y/%m/%d/%H/%M/%S-snapshot
# File extension .jpg or .ppm is automatically added so do not include this.
# Note: A symbolic link called lastsnap.jpg created in the target_dir will always
# point to the latest snapshot, unless snapshot_filename is exactly 'lastsnap'
snapshot_filename %v-%Y%m%d%H%M%S-snapshot
# File path for motion triggered images (jpeg or ppm) relative to target_dir
# Default: %v-%Y%m%d%H%M%S-%q
# Default value is equivalent to legacy oldlayout option
# For Motion 3.0 compatible mode choose: %Y/%m/%d/%H/%M/%S-%q
# File extension .jpg or .ppm is automatically added so do not include this
# Set to 'preview' together with best-preview feature enables special naming
# convention for preview shots. See motion guide for details
picture_filename %v-%Y%m%d%H%M%S-%q
# File path for motion triggered ffmpeg films (movies) relative to target_dir
# Default: %v-%Y%m%d%H%M%S
# Default value is equivalent to legacy oldlayout option
# For Motion 3.0 compatible mode choose: %Y/%m/%d/%H%M%S
# File extension .mpg or .avi is automatically added so do not include this
# This option was previously called ffmpeg_filename
movie_filename %v-%Y%m%d%H%M%S
# File path for timelapse movies relative to target_dir
# Default: %Y%m%d-timelapse
# Default value is near equivalent to legacy oldlayout option
# For Motion 3.0 compatible mode choose: %Y/%m/%d-timelapse
# File extension .mpg is automatically added so do not include this
timelapse_filename %Y%m%d-timelapse
############################################################
# Global Network Options
############################################################
# Enable or disable IPV6 for http control and stream (default: off )
ipv6_enabled off
############################################################
# Live Stream Server
############################################################
# The mini-http server listens to this port for requests (default: 0 = disabled)
stream_port 8080
# Quality of the jpeg (in percent) images produced (default: 50)
stream_quality 50
# Output frames at 1 fps when no motion is detected and increase to the
# rate given by stream_maxrate when motion is detected (default: off)
stream_motion on
# Maximum framerate for stream streams (default: 1)
stream_maxrate 4
# Restrict stream connections to localhost only (default: on)
stream_localhost off
# Limits the number of images per connection (default: 0 = unlimited)
# Number can be defined by multiplying actual stream rate by desired number of seconds
# Actual stream rate is the smallest of the numbers framerate and stream_maxrate
stream_limit 0
# Set the authentication method (default: 0)
# 0 = disabled
# 1 = Basic authentication
# 2 = MD5 digest (the safer authentication)
stream_auth_method 0
# Authentication for the stream. Syntax username:password
# Default: not defined (Disabled)
; stream_authentication username:password
############################################################
# HTTP Based Control
############################################################
# TCP/IP port for the http server to listen on (default: 0 = disabled)
webcontrol_port 8081
# Restrict control connections to localhost only (default: on)
webcontrol_localhost off
# Output for http server, select off to choose raw text plain (default: on)
webcontrol_html_output on
# Authentication for the http based control. Syntax username:password
# Default: not defined (Disabled)
; webcontrol_authentication username:password
############################################################
# Tracking (Pan/Tilt)
#############################################################
# Type of tracker (0=none (default), 1=stepper, 2=iomojo, 3=pwc, 4=generic, 5=uvcvideo, 6=servo)
# The generic type enables the definition of motion center and motion size to
# be used with the conversion specifiers for options like on_motion_detected
track_type 0
# Enable auto tracking (default: off)
track_auto off
# Serial port of motor (default: none)
;track_port /dev/ttyS0
# Motor number for x-axis (default: 0)
;track_motorx 0
# Set motorx reverse (default: 0)
;track_motorx_reverse 0
# Motor number for y-axis (default: 0)
;track_motory 1
# Set motory reverse (default: 0)
;track_motory_reverse 0
# Maximum value on x-axis (default: 0)
;track_maxx 200
# Minimum value on x-axis (default: 0)
;track_minx 50
# Maximum value on y-axis (default: 0)
;track_maxy 200
# Minimum value on y-axis (default: 0)
;track_miny 50
# Center value on x-axis (default: 0)
;track_homex 128
# Center value on y-axis (default: 0)
;track_homey 128
# ID of an iomojo camera if used (default: 0)
track_iomojo_id 0
# Angle in degrees the camera moves per step on the X-axis
# with auto-track (default: 10)
# Currently only used with pwc type cameras
track_step_angle_x 10
[...]
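Not an answer from the original thread, but a common workaround when a given ffmpeg_video_codec value misbehaves on the Pi is to keep the .avi output (which works here) and convert each finished recording with ffmpeg from motion's on_movie_end hook. A minimal sketch, assuming ffmpeg is installed and a helper script is placed at /usr/local/bin/to-mp4.sh (both are assumptions, not taken from the config above):

# added to motion.conf; %f is motion's conversion specifier for the finished movie's full path
on_movie_end /usr/local/bin/to-mp4.sh %f

# /usr/local/bin/to-mp4.sh
#!/bin/sh
ffmpeg -y -i "$1" -c:v libx264 -pix_fmt yuv420p -movflags +faststart "${1%.avi}.mp4"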
-
FFmpeg in Android Studio
7 December 2016, by Ahmed Abd-Elmeged — I'm trying to add the FFmpeg library to use it in a live-streaming app based on the YouTube API. When I add it and use the CMake build tool, the header files are only found from the native C class that I created, and I want them to also be found from the other header files that come with the library. How can I do that in the CMake script, or in another way? This is my CMake build script:
# Sets the minimum version of CMake required to build the native
# library. You should either keep the default value or only pass a
# value of 3.4.0 or lower.
cmake_minimum_required(VERSION 3.4.1)
# Creates and names a library, sets it as either STATIC
# or SHARED, and provides the relative paths to its source code.
# You can define multiple libraries, and CMake builds it for you.
# Gradle automatically packages shared libraries with your APK.
add_library( # Sets the name of the library.
ffmpeg-jni
# Sets the library as a shared library.
SHARED
# Provides a relative path to your source file(s).
# Associated headers in the same location as their source
# file are automatically included.
src/main/cpp/ffmpeg-jni.c )
# Searches for a specified prebuilt library and stores the path as a
# variable. Because system libraries are included in the search path by
# default, you only need to specify the name of the public NDK library
# you want to add. CMake verifies that the library exists before
# completing its build.
find_library( # Sets the name of the path variable.
log-lib
# Specifies the name of the NDK library that
# you want CMake to locate.
log )
# Specifies libraries CMake should link to your target library. You
# can link multiple libraries, such as libraries you define in the
# build script, prebuilt third-party libraries, or system libraries.
target_link_libraries( # Specifies the target library.
ffmpeg-jni
# Links the target library to the log library
# included in the NDK.
${log-lib} )
include_directories(src/main/cpp/include/ffmpeglib)
include_directories(src/main/cpp/include/libavcodec)
include_directories(src/main/cpp/include/libavformat)
include_directories(src/main/cpp/include/libavutil)
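One thing worth double-checking, as an assumption about the layout rather than something stated in the question: FFmpeg's headers include each other with paths like "libavutil/samplefmt.h" (see the avcodec.h excerpt at the end of the question), so the directory that has to be on the include path is the parent src/main/cpp/include, not only the per-library sub-directories. A quick way to confirm the layout matches that expectation:

# for include_directories(src/main/cpp/include) to resolve "libavutil/..." style includes,
# the headers would have to sit here:
ls src/main/cpp/include/libavcodec/avcodec.h
ls src/main/cpp/include/libavutil/avutil.h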
This is my native class, which I created as a native file in Android Studio; I also get an error that jintJNICALL is not found:
#include <android/log.h>
#include
#include
#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"
#include "libavutil/opt.h"
#ifdef __cplusplus
extern "C" {
#endif
JNIEXPORT jboolean JNICALL
Java_com_example_venrto_ffmpegtest_Ffmpeg_init(JNIEnv *env, jobject thiz,
jint width, jint height,
jint audio_sample_rate,
jstring rtmp_url);
JNIEXPORT void JNICALL
Java_com_example_venrto_ffmpegtest_Ffmpeg_shutdown(JNIEnv *env, jobject thiz);
JNIEXPORT jintJNICALL
Java_com_example_venrto_ffmpegtest_Ffmpeg_encodeVideoFrame(JNIEnv *env, jobject thiz, jbyteArray yuv_image);
JNIEXPORT jint JNICALL
Java_com_example_venrto_ffmpegtest_Ffmpeg_encodeAudioFrame(JNIEnv *env, jobject thiz, jshortArray audio_data, jint length);
#ifdef __cplusplus
}
#endif
#define LOGI(...) __android_log_print(ANDROID_LOG_INFO, "ffmpeg-jni", __VA_ARGS__)
#define URL_WRONLY 2
static AVFormatContext *fmt_context;
static AVStream *video_stream;
static AVStream *audio_stream;
static int pts = 0;
static int last_audio_pts = 0;
// Buffers for UV format conversion
static unsigned char *u_buf;
static unsigned char *v_buf;
static int enable_audio = 1;
static int64_t audio_samples_written = 0;
static int audio_sample_rate = 0;
// Stupid buffer for audio samples. Not even a proper ring buffer
#define AUDIO_MAX_BUF_SIZE 16384 // 2x what we get from Java
static short audio_buf[AUDIO_MAX_BUF_SIZE];
static int audio_buf_size = 0;
void AudioBuffer_Push(const short *audio, int num_samples) {
if (audio_buf_size >= AUDIO_MAX_BUF_SIZE - num_samples) {
LOGI("AUDIO BUFFER OVERFLOW: %i + %i > %i", audio_buf_size, num_samples,
AUDIO_MAX_BUF_SIZE);
return;
}
for (int i = 0; i < num_samples; i++) {
audio_buf[audio_buf_size++] = audio[i];
}
}
int AudioBuffer_Size() { return audio_buf_size; }
short *AudioBuffer_Get() { return audio_buf; }
void AudioBuffer_Pop(int num_samples) {
if (num_samples > audio_buf_size) {
LOGI("Audio buffer Pop WTF: %i vs %i", num_samples, audio_buf_size);
return;
}
memmove(audio_buf, audio_buf + num_samples, num_samples * sizeof(short));
audio_buf_size -= num_samples;
}
void AudioBuffer_Clear() {
memset(audio_buf, 0, sizeof(audio_buf));
audio_buf_size = 0;
}
static void log_callback(void *ptr, int level, const char *fmt, va_list vl) {
char x[2048];
vsnprintf(x, 2048, fmt, vl);
LOGI(x);
}
JNIEXPORT jboolean
JNICALL Java_com_example_venrto_ffmpegtest_Ffmpeg_init(JNIEnv *env, jobject thiz,
jint width, jint height,
jint audio_sample_rate_param,
jstring rtmp_url) {
avcodec_register_all();
av_register_all();
av_log_set_callback(log_callback);
fmt_context = avformat_alloc_context();
AVOutputFormat *ofmt = av_guess_format("flv", NULL, NULL);
if (ofmt) {
LOGI("av_guess_format returned %s", ofmt->long_name);
} else {
LOGI("av_guess_format fail");
return JNI_FALSE;
}
fmt_context->oformat = ofmt;
LOGI("creating video stream");
video_stream = av_new_stream(fmt_context, 0);
if (enable_audio) {
LOGI("creating audio stream");
audio_stream = av_new_stream(fmt_context, 1);
}
// Open Video Codec.
// ======================
AVCodec *video_codec = avcodec_find_encoder(AV_CODEC_ID_H264);
if (!video_codec) {
LOGI("Did not find the video codec");
return JNI_FALSE; // leak!
} else {
LOGI("Video codec found!");
}
AVCodecContext *video_codec_ctx = video_stream->codec;
video_codec_ctx->codec_id = video_codec->id;
video_codec_ctx->codec_type = AVMEDIA_TYPE_VIDEO;
video_codec_ctx->level = 31;
video_codec_ctx->width = width;
video_codec_ctx->height = height;
video_codec_ctx->pix_fmt = PIX_FMT_YUV420P;
video_codec_ctx->rc_max_rate = 0;
video_codec_ctx->rc_buffer_size = 0;
video_codec_ctx->gop_size = 12;
video_codec_ctx->max_b_frames = 0;
video_codec_ctx->slices = 8;
video_codec_ctx->b_frame_strategy = 1;
video_codec_ctx->coder_type = 0;
video_codec_ctx->me_cmp = 1;
video_codec_ctx->me_range = 16;
video_codec_ctx->qmin = 10;
video_codec_ctx->qmax = 51;
video_codec_ctx->keyint_min = 25;
video_codec_ctx->refs = 3;
video_codec_ctx->trellis = 0;
video_codec_ctx->scenechange_threshold = 40;
video_codec_ctx->flags |= CODEC_FLAG_LOOP_FILTER;
video_codec_ctx->me_method = ME_HEX;
video_codec_ctx->me_subpel_quality = 6;
video_codec_ctx->i_quant_factor = 0.71;
video_codec_ctx->qcompress = 0.6;
video_codec_ctx->max_qdiff = 4;
video_codec_ctx->time_base.den = 10;
video_codec_ctx->time_base.num = 1;
video_codec_ctx->bit_rate = 3200 * 1000;
video_codec_ctx->bit_rate_tolerance = 0;
video_codec_ctx->flags2 |= 0x00000100;
fmt_context->bit_rate = 4000 * 1000;
av_opt_set(video_codec_ctx, "partitions", "i8x8,i4x4,p8x8,b8x8", 0);
av_opt_set_int(video_codec_ctx, "direct-pred", 1, 0);
av_opt_set_int(video_codec_ctx, "rc-lookahead", 0, 0);
av_opt_set_int(video_codec_ctx, "fast-pskip", 1, 0);
av_opt_set_int(video_codec_ctx, "mixed-refs", 1, 0);
av_opt_set_int(video_codec_ctx, "8x8dct", 0, 0);
av_opt_set_int(video_codec_ctx, "weightb", 0, 0);
if (fmt_context->oformat->flags & AVFMT_GLOBALHEADER)
video_codec_ctx->flags |= CODEC_FLAG_GLOBAL_HEADER;
LOGI("Opening video codec");
AVDictionary *vopts = NULL;
av_dict_set(&vopts, "profile", "main", 0);
//av_dict_set(&vopts, "vprofile", "main", 0);
av_dict_set(&vopts, "rc-lookahead", 0, 0);
av_dict_set(&vopts, "tune", "film", 0);
av_dict_set(&vopts, "preset", "ultrafast", 0);
av_opt_set(video_codec_ctx->priv_data, "tune", "film", 0);
av_opt_set(video_codec_ctx->priv_data, "preset", "ultrafast", 0);
av_opt_set(video_codec_ctx->priv_data, "tune", "film", 0);
int open_res = avcodec_open2(video_codec_ctx, video_codec, &vopts);
if (open_res < 0) {
LOGI("Error opening video codec: %i", open_res);
return JNI_FALSE; // leak!
}
// Open Audio Codec.
// ======================
if (enable_audio) {
AudioBuffer_Clear();
audio_sample_rate = audio_sample_rate_param;
AVCodec *audio_codec = avcodec_find_encoder(AV_CODEC_ID_AAC);
if (!audio_codec) {
LOGI("Did not find the audio codec");
return JNI_FALSE; // leak!
} else {
LOGI("Audio codec found!");
}
AVCodecContext *audio_codec_ctx = audio_stream->codec;
audio_codec_ctx->codec_id = audio_codec->id;
audio_codec_ctx->codec_type = AVMEDIA_TYPE_AUDIO;
audio_codec_ctx->bit_rate = 128000;
audio_codec_ctx->bit_rate_tolerance = 16000;
audio_codec_ctx->channels = 1;
audio_codec_ctx->profile = FF_PROFILE_AAC_LOW;
audio_codec_ctx->sample_fmt = AV_SAMPLE_FMT_FLT;
audio_codec_ctx->sample_rate = 44100;
LOGI("Opening audio codec");
AVDictionary *opts = NULL;
av_dict_set(&opts, "strict", "experimental", 0);
open_res = avcodec_open2(audio_codec_ctx, audio_codec, &opts);
LOGI("audio frame size: %i", audio_codec_ctx->frame_size);
if (open_res < 0) {
LOGI("Error opening audio codec: %i", open_res);
return JNI_FALSE; // leak!
}
}
const jbyte *url = (*env)->GetStringUTFChars(env, rtmp_url, NULL);
// Point to an output file
if (!(ofmt->flags & AVFMT_NOFILE)) {
if (avio_open(&fmt_context->pb, url, URL_WRONLY) < 0) {
LOGI("ERROR: Could not open file %s", url);
return JNI_FALSE; // leak!
}
}
(*env)->ReleaseStringUTFChars(env, rtmp_url, url);
LOGI("Writing output header.");
// Write file header
if (avformat_write_header(fmt_context, NULL) != 0) {
LOGI("ERROR: av_write_header failed");
return JNI_FALSE;
}
pts = 0;
last_audio_pts = 0;
audio_samples_written = 0;
// Initialize buffers for UV format conversion
int frame_size = video_codec_ctx->width * video_codec_ctx->height;
u_buf = (unsigned char *) av_malloc(frame_size / 4);
v_buf = (unsigned char *) av_malloc(frame_size / 4);
LOGI("ffmpeg encoding init done");
return JNI_TRUE;
}
JNIEXPORT void JNICALL
Java_com_example_venrto_ffmpegtest_Ffmpeg_shutdown(JNIEnv *env, jobject thiz) {
av_write_trailer(fmt_context);
avio_close(fmt_context->pb);
avcodec_close(video_stream->codec);
if (enable_audio) {
avcodec_close(audio_stream->codec);
}
av_free(fmt_context);
av_free(u_buf);
av_free(v_buf);
fmt_context = NULL;
u_buf = NULL;
v_buf = NULL;
}
JNIEXPORT jintJNICALL
Java_com_example_venrto_ffmpegtest_Ffmpeg_encodeVideoFrame(JNIEnv *env, jobject thiz, jbyteArray yuv_image) {
int yuv_length = (*env)->GetArrayLength(env, yuv_image);
unsigned char *yuv_data = (*env)->GetByteArrayElements(env, yuv_image, 0);
AVCodecContext *video_codec_ctx = video_stream->codec;
//LOGI("Yuv size: %i w: %i h: %i", yuv_length, video_codec_ctx->width, video_codec_ctx->height);
int frame_size = video_codec_ctx->width * video_codec_ctx->height;
const unsigned char *uv = yuv_data + frame_size;
// Convert YUV from NV12 to I420. Y channel is the same so we don't touch it,
// we just have to deinterleave UV.
for (int i = 0; i < frame_size / 4; i++) {
v_buf[i] = uv[i * 2];
u_buf[i] = uv[i * 2 + 1];
}
AVFrame source;
memset(&source, 0, sizeof(AVFrame));
source.data[0] = yuv_data;
source.data[1] = u_buf;
source.data[2] = v_buf;
source.linesize[0] = video_codec_ctx->width;
source.linesize[1] = video_codec_ctx->width / 2;
source.linesize[2] = video_codec_ctx->width / 2;
// only for bitrate regulation. irrelevant for sync.
source.pts = pts;
pts++;
int out_length = frame_size + (frame_size / 2);
unsigned char *out = (unsigned char *) av_malloc(out_length);
int compressed_length = avcodec_encode_video(video_codec_ctx, out, out_length, &source);
(*env)->ReleaseByteArrayElements(env, yuv_image, yuv_data, 0);
// Write to file too
if (compressed_length > 0) {
AVPacket pkt;
av_init_packet(&pkt);
pkt.pts = last_audio_pts;
if (video_codec_ctx->coded_frame && video_codec_ctx->coded_frame->key_frame) {
pkt.flags |= 0x0001;
}
pkt.stream_index = video_stream->index;
pkt.data = out;
pkt.size = compressed_length;
if (av_interleaved_write_frame(fmt_context, &pkt) != 0) {
LOGI("Error writing video frame");
}
} else {
LOGI("??? compressed_length <= 0");
}
last_audio_pts++;
av_free(out);
return compressed_length;
}
JNIEXPORT jintJNICALL
Java_com_example_venrto_ffmpegtest_Ffmpeg_encodeAudioFrame(JNIEnv *env, jobject thiz, jshortArray audio_data, jint length) {
if (!enable_audio) {
return 0;
}
short *audio = (*env)->GetShortArrayElements(env, audio_data, 0);
//LOGI("java audio buffer size: %i", length);
AVCodecContext *audio_codec_ctx = audio_stream->codec;
unsigned char *out = av_malloc(128000);
AudioBuffer_Push(audio, length);
int total_compressed = 0;
while (AudioBuffer_Size() >= audio_codec_ctx->frame_size) {
AVPacket pkt;
av_init_packet(&pkt);
int compressed_length = avcodec_encode_audio(audio_codec_ctx, out, 128000, AudioBuffer_Get());
total_compressed += compressed_length;
audio_samples_written += audio_codec_ctx->frame_size;
int new_pts = (audio_samples_written * 1000) / audio_sample_rate;
if (compressed_length > 0) {
pkt.size = compressed_length;
pkt.pts = new_pts;
last_audio_pts = new_pts;
//LOGI("audio_samples_written: %i comp_length: %i pts: %i", (int)audio_samples_written, (int)compressed_length, (int)new_pts);
pkt.flags |= 0x0001;
pkt.stream_index = audio_stream->index;
pkt.data = out;
if (av_interleaved_write_frame(fmt_context, &pkt) != 0) {
LOGI("Error writing audio frame");
}
}
AudioBuffer_Pop(audio_codec_ctx->frame_size);
}
(*env)->ReleaseShortArrayElements(env, audio_data, audio, 0);
av_free(out);
return total_compressed;
}
This is an example of the header includes that I get errors for:
#include "libavutil/samplefmt.h"
#include "libavutil/avutil.h"
#include "libavutil/cpu.h"
#include "libavutil/dict.h"
#include "libavutil/log.h"
#include "libavutil/pixfmt.h"
#include "libavutil/rational.h"
#include "libavutil/version.h"in the libavcodec/avcodec.h
This code is based on this example:
https://github.com/youtube/yt-watchme