Advanced search

Media (33)

Keyword: - Tags -/creative commons

Other articles (76)

  • The SPIPmotion queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should automatically be attached; objet, the type of object to which (...)

  • Videos

    21 April 2011, by

    Like "audio" documents, MediaSPIP displays videos wherever possible using the HTML5 tag.
    One drawback of this tag is that it is not recognized correctly by some browsers (Internet Explorer, to name one) and that each browser natively supports only certain video formats.
    Its main advantage is that video playback is handled natively by the browser, removing the need for Flash and (...)

  • User profiles

    12 April 2011, by

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is automatically created when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
    The user can also edit their profile from their author page; a "Modifier votre profil" link in the navigation is (...)

On other sites (7360)

  • Revision a49d80bfc8: Squash commits from master to playground Moving RD-opt related code from vp9_en

    26 June 2014, by Yue Chen

    Changed Paths:
     Modify /build/make/gen_msvs_proj.sh
     Modify /build/make/gen_msvs_vcxproj.sh
     Modify /build/make/iosbuild.sh
     Modify /examples/vp9_spatial_svc_encoder.c
     Modify /test/decode_test_driver.cc
     Modify /test/decode_test_driver.h
     Add /test/invalid_file_test.cc
     Modify /test/svc_test.cc
     Modify /test/test-data.sha1
     Modify /test/test.mk
     Modify /test/test_vectors.cc
     Add /test/user_priv_test.cc
     Add /third_party/libmkv/EbmlIDs.h
     Add /third_party/libmkv/EbmlWriter.c
     Add /third_party/libmkv/EbmlWriter.h
     Modify /vp8/common/rtcd_defs.pl
     Modify /vp8/encoder/x86/quantize_sse2.c
     Delete /vp8/encoder/x86/quantize_sse4.asm
     Add /vp8/encoder/x86/quantize_sse4.c
     Modify /vp8/vp8cx.mk
     Modify /vp9/common/arm/neon/vp9_convolve_neon.c
     Modify /vp9/common/arm/neon/vp9_loopfilter_16_neon.c
     Modify /vp9/common/vp9_alloccommon.c
     Modify /vp9/common/vp9_alloccommon.h
     Modify /vp9/common/vp9_convolve.c
     Modify /vp9/common/vp9_mvref_common.c
     Modify /vp9/common/vp9_mvref_common.h
     Modify /vp9/common/vp9_quant_common.c
     Modify /vp9/common/vp9_quant_common.h
     Modify /vp9/common/vp9_scale.h
     Modify /vp9/decoder/vp9_decodeframe.c
     Modify /vp9/decoder/vp9_decoder.c
     Modify /vp9/decoder/vp9_dthread.h
     Modify /vp9/decoder/vp9_read_bit_buffer.c
     Modify /vp9/encoder/vp9_bitstream.c
     Modify /vp9/encoder/vp9_block.h
     Modify /vp9/encoder/vp9_denoiser.c
     Modify /vp9/encoder/vp9_denoiser.h
     Modify /vp9/encoder/vp9_encodeframe.c
     Modify /vp9/encoder/vp9_encoder.c
     Modify /vp9/encoder/vp9_encoder.h
     Modify /vp9/encoder/vp9_firstpass.c
     Modify /vp9/encoder/vp9_firstpass.h
     Modify /vp9/encoder/vp9_lookahead.c
     Modify /vp9/encoder/vp9_lookahead.h
     Modify /vp9/encoder/vp9_pickmode.c
     Modify /vp9/encoder/vp9_pickmode.h
     Modify /vp9/encoder/vp9_ratectrl.c
     Modify /vp9/encoder/vp9_ratectrl.h
     Modify /vp9/encoder/vp9_rdopt.c
     Modify /vp9/encoder/vp9_rdopt.h
     Modify /vp9/encoder/vp9_speed_features.c
     Modify /vp9/encoder/vp9_speed_features.h
     Modify /vp9/encoder/vp9_svc_layercontext.c
     Modify /vp9/encoder/vp9_svc_layercontext.h
     Modify /vp9/vp9_cx_iface.c
     Modify /vp9/vp9_dx_iface.c
     Modify /vp9/vp9cx.mk
     Modify /vpx/src/svc_encodeframe.c
     Modify /vpx/svc_context.h

    Squash commits from master to playground

    Moving RD-opt related code from vp9_encoder.h to vp9_rdopt.h.

    Squashed-Change-Id : I8fab776c8801e19d3f5027ed55a6aa69eee951de

    gen_msvs_proj: fix in-tree configure under cygwin

    strip trailing '/' from paths; this is later converted to '\' which
    causes execution errors for obj_int_extract/yasm. vs10+ wasn't affected
    by this issue, but make the same change for consistency.

    gen_msvs_proj:
    + add missing '"' to obj_int_extract call
    unlike gen_msvs_vcproj, the block is duplicated
    missed in: 1e3d9b9 build/msvs: fix builds in source dirs with spaces

    Squashed-Change-Id : I76208e6cdc66dc5a0a7ffa8aa1edbefe31e4b130

    Improve vp9_rb_bytes_read

    Squashed-Change-Id : I69eba120eb3d8ec43b5552451c8a9bd009390795

    Removing decode_one_iter() function.

    When superframe index is available we completely rely on it and use frame
    size values from the index.

    Squashed-Change-Id : I0011d08b223303a8b912c2bcc8a02b74d0426ee0

    iosbuild.sh : Add vpx_config.h and vpx_version.h to VPX.framework.

    - Rename build_targets to build_framework
    - Add functions for creating the vpx_config shim and obtaining
    preproc symbols.

    Squashed-Change-Id : Ieca6938b9779077eefa26bf4cfee64286d1840b0

    Implemented vp9_denoiser_alloc,free()

    Squashed-Change-Id : I79eba79f7c52eec19ef2356278597e06620d5e27

    Update running avg for VP9 denoiser

    Squashed-Change-Id : I9577d648542064052795bf5770428fbd5c276b7b

    Changed buf_2ds in vp9 denoiser to YV12 buffers

    Changed alloc, free, and running average code as necessary.

    Squashed-Change-Id : Ifc4d9ccca462164214019963b3768a457791b9c1

    sse4 regular quantize

    Squashed-Change-Id : Ibd95df0adf9cc9143006ee9032b4cb2ebfd5dd1b

    Modify non-rd intra mode checking

    Speed 6 uses small tx size, namely 8x8. max_intra_bsize needs to
    be modified accordingly to ensure valid intra mode checking.
    Borg test on RTC set showed an overall PSNR gain of 0.335% in speed
    - 6.

    This also changes speed -5 encoding by allowing DC_PRED checking
    for block32x32. Borg test on RTC set showed a slight PSNR gain of
    0.145%, and no noticeable speed change.

    Squashed-Change-Id : I1502978d8fbe265b3bb235db0f9c35ba0703cd45

    Implemented COPY_BLOCK case for vp9 denoiser

    Squashed-Change-Id : Ie89ad1e3aebbd474e1a0db69c1961b4d1ddcd33e

    Improved vp9 denoiser running avg update.

    Squashed-Change-Id : Ie0aa41fb7957755544321897b3bb2dd92f392027

    Separate rate-distortion modeling for DC and AC coefficients

    This is the first step to rework the rate-distortion modeling used
    in rtc coding mode. The overall goal is to make the modeling
    customized for the statistics encountered in the rtc coding.

    This commit makes encoder to perform rate-distortion modeling for
    DC and AC coefficients separately. No speed changes observed.
    The coding performance for pedestrian_area_1080p is largely
    improved :

    speed -5, from 79558 b/f, 37.871 dB -> 79598 b/f, 38.600 dB
    speed -6, from 79515 b/f, 37.822 dB -> 79544 b/f, 38.130 dB

    Overall performance for rtc set at speed -6 is improved by 0.67%.

    Squashed-Change-Id : I9153444567e5f75ccdcaac043c2365992c005c0c

    Add superframe support for frame parallel decoding.

    A superframe is a bunch of frames bundled as one frame. It is mostly
    used to combine one or more non-displayable frames and one displayable frame.

    For frame parallel decoding, the libvpx decoder will only support decoding one
    normal frame or a superframe with a superframe index.

    If an application passes a superframe without a superframe index, or a chunk
    of displayable frames without a superframe index, to the libvpx decoder, libvpx
    will not decode it in frame parallel mode, but the libvpx decoder can still
    decode it in serial mode.

    Squashed-Change-Id : I04c9f2c828373d64e880a8c7bcade5307015ce35
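
    The superframe mechanism this commit describes can be sketched as follows. This is an illustration assuming the documented VP9 superframe layout, not libvpx's actual code, and the function name parse_superframe_index is made up here: the last byte of a chunk is a marker of the form 0b110xxxxx encoding the frame count and the width of each frame-size field, and the same marker byte must reappear at the start of the index.

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch, not libvpx's actual code: locate and validate a
 * VP9 superframe index at the end of a chunk.  Returns 0 and fills
 * sizes[]/count on success, 1 if no index is present, -1 if a marker
 * byte is present but the index is malformed. */
static int parse_superframe_index(const uint8_t *data, size_t data_sz,
                                  uint32_t sizes[8], int *count) {
  uint8_t marker;
  *count = 0;
  if (data_sz == 0) return 1;
  marker = data[data_sz - 1];
  if ((marker & 0xe0) != 0xc0) return 1;  /* no superframe index */
  {
    const int frames = (marker & 0x7) + 1;      /* bits 0-2: frames - 1 */
    const int mag = ((marker >> 3) & 0x3) + 1;  /* bits 3-4: bytes per size - 1 */
    const size_t index_sz = 2 + (size_t)mag * frames;
    /* Validate: the chunk must be big enough, and the byte that opens
     * the index must equal the trailing marker byte. */
    if (data_sz < index_sz || data[data_sz - index_sz] != marker) return -1;
    {
      const uint8_t *x = &data[data_sz - index_sz + 1];
      int i, j;
      for (i = 0; i < frames; ++i) {
        uint32_t this_sz = 0;
        /* Each frame size is stored little-endian in mag bytes. */
        for (j = 0; j < mag; ++j) this_sz |= (uint32_t)(*x++) << (j * 8);
        sizes[i] = this_sz;
      }
      *count = frames;
    }
    return 0;
  }
}
```

    With a valid index, a frame-parallel decoder can slice the chunk into its individual frames up front; if the marker byte is present but the index does not validate, the chunk is rejected rather than guessed at.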

    Fixes in VP9 alloc, free, and COPY_FRAME case

    Squashed-Change-Id : I1216f17e2206ef521fe219b6d72d8e41d1ba1147

    Remove labels from quantize

    Use break instead of goto for early exit. Unbreaks Visual Studio
    builds.

    Squashed-Change-Id : I96dee43a3c82145d4abe0d6a99af6e6e1a3991b5

    Added CFLAG for outputting vp9 denoised signal

    Squashed-Change-Id : Iab9b4e11cad927f3282e486c203564e1a658f377

    Allow key frame more flexibility in mode search

    This commit allows the key frame to search through more prediction
    modes and more flexible block sizes. No speed change observed. The
    coding performance for rtc set is improved by 1.7% for speed -5 and
    3.0% for speed -6.

    Squashed-Change-Id : Ifd1bc28558017851b210b4004f2d80838938bcc5

    VP9 denoiser bugfixes

    s/stdint.h/vpx\/vpx_int.h

    Added missing 'break;'s

    Also included other minor changes, mostly cosmetic.

    Squashed-Change-Id : I852bba3e85e794f1d4af854c45c16a23a787e6a3

    Don’t return value for void functions

    Clears "warning : ’return’ with a value, in function returning void"

    Squashed-Change-Id : I93972610d67e243ec772a1021d2fdfcfc689c8c2

    Include type defines

    Clears error: unknown type name 'uint8_t'

    Squashed-Change-Id : I9b6eff66a5c69bc24aeaeb5ade29255a164ef0e2

    Validate error checking code in decoder.

    This patch adds a mechanism for ensuring error checking on invalid files
    by creating a unit test that runs the decoder and tests that the error
    code matches what's expected on each frame in the decoder.

    Disabled for now, as this unit test will segfault with existing code.

    Squashed-Change-Id : I896f9686d9ebcbf027426933adfbea7b8c5d956e

    Introduce FrameWorker for decoding.

    When decoding in serial mode, there will be only
    one FrameWorker doing decoding. When decoding in
    parallel mode, there will be several FrameWorkers
    doing decoding in parallel.

    Squashed-Change-Id : If53fc5c49c7a0bf5e773f1ce7008b8a62fdae257

    Add back libmkv ebml writer files.

    Another project in ChromeOS is using these files. To make libvpx
    rolls simpler, add these files back until the other project removes
    the dependency.

    crbug.com/387246 tracking bug to remove dependency.

    Squashed-Change-Id : If9c197081c845c4a4e5c5488d4e0190380bcb1e4

    Added test vector that tests more show-existing frames.

    Squashed-Change-Id : I0ddd7dd55313ee62d231ed4b9040e08c3761b3fe

    fix peek_si to enable 1-byte show-existing frames.

    The test for this is in the test vector code (show-existing frames will
    fail). I can't check it in disabled, as I'm changing the generic
    test code to do this:

    https://gerrit.chromium.org/gerrit/#/c/70569/

    Squashed-Change-Id : I5ab324f0cb7df06316a949af0f7fc089f4a3d466

    Fix bug in error handling that causes segfault

    See: https://code.google.com/p/chromium/issues/detail?id=362697

    The code properly catches an invalid stream but segfaults instead of
    returning an error, due to a buffer not having been initialized. This
    code fixes that.

    Squashed-Change-Id : I695595e742cb08807e1dfb2f00bc097b3eae3a9b

    Revert 3 patches from Hangyu to get Chrome to build:

    Avoids failures :
    MSE_ClearKey/EncryptedMediaTest.Playback_VP9Video_WebM/0
    MSE_ClearKey_Prefixed/EncryptedMediaTest.Playback_VP9Video_WebM/0
    MSE_ExternalClearKey_Prefixed/EncryptedMediaTest.Playback_VP9Video_WebM/0
    MSE_ExternalClearKey/EncryptedMediaTest.Playback_VP9Video_WebM/0
    MSE_ExternalClearKeyDecryptOnly/EncryptedMediaTest.Playback_VP9Video_WebM/0
    MSE_ExternalClearKeyDecryptOnly_Prefixed/EncryptedMediaTest.Playback_VP9Video_WebM/0
    SRC_ExternalClearKey/EncryptedMediaTest.Playback_VP9Video_WebM/0
    SRC_ExternalClearKey_Prefixed/EncryptedMediaTest.Playback_VP9Video_WebM/0
    SRC_ClearKey_Prefixed/EncryptedMediaTest.Playback_VP9Video_WebM/0

    Patches are
    This reverts commit 9bc040859b0ca6869d31bc0efa223e8684eef37a
    This reverts commit 6f5aba069a2c7ffb293ddce70219a9ab4a037441
    This reverts commit 9bc040859b0ca6869d31bc0efa223e8684eef37a

    I1f250441 Revert "Refactor the vp9_get_frame code for frame parallel."
    Ibfdddce5 Revert "Delay decreasing reference count in frame-parallel
    decoding."
    I00ce6771 Revert "Introduce FrameWorker for decoding."

    Need better testing in libvpx for these commits

    Squashed-Change-Id : Ifa1f279b0cabf4b47c051ec26018f9301c1e130e

    error check vp9 superframe parsing

    This patch ensures that the last byte of a chunk that contains a
    valid superframe marker byte actually has a proper superframe index.
    If not, it returns an error.

    As part of doing that, the file vp90-2-15-fuzz-flicker.webm now fails
    to decode properly and moves to the invalid file test from the test
    vector suite.

    Squashed-Change-Id : I5f1da7eb37282ec0c6394df5c73251a2df9c1744

    Remove unused vp9_init_quant_tables function

    This function is not effectively used, so it has been removed.

    Squashed-Change-Id : I2e8e48fa07c7518931690f3b04bae920cb360e49

    Actually skip blocks in skip segments in non-rd encoder.

    Copy split from macroblock to pick mode context so it doesn’t get lost.

    Squashed-Change-Id : Ie37aa12558dbe65c4f8076cf808250fffb7f27a8

    Add Check for Peek Stream validity to decoder test.

    Squashed-Change-Id : I9b745670a9f842582c47e6001dc77480b31fb6a1

    Allocate buffers based on correct chroma format

    The encoder currently allocates frame buffers before
    it establishes what the chroma sub-sampling factor is,
    always allocating based on the 4:4:4 format.

    This patch detects the chroma format as early as
    possible allowing the encoder to allocate buffers of
    the correct size.

    Future patches will change the encoder to allocate
    frame buffers on demand to further reduce the memory
    profile of the encoder and rationalize the buffer
    management in the encoder and decoder.

    Squashed-Change-Id : Ifd41dd96e67d0011719ba40fada0bae74f3a0d57

    Fork vp9_rd_pick_inter_mode_sb_seg_skip

    Squashed-Change-Id : I549868725b789f0f4f89828005a65972c20df888

    Switch active map implementation to segment based.

    Squashed-Change-Id : Ibb841a1fa4d08d164cf5461246ec290f582b1f80

    Experiment for mid group second arf.

    This patch implements a mechanism for inserting a second
    arf at the mid position of arf groups.

    It is currently disabled by default using the flag multi_arf_enabled.

    Results are currently down somewhat in initial testing if
    multi-arf is enabled. Most of the loss is attributable to the
    fact that code to preserve the previous golden frame
    (in the arf buffer) in cases where we are coding an overlay
    frame, is currently disabled in the multi-arf case.

    Squashed-Change-Id : I1d777318ca09f147db2e8c86d7315fe86168c865

    Clean out old CONFIG_MULTIPLE_ARF code.

    Remove the old experimental multi arf code that was under
    the flag CONFIG_MULTIPLE_ARF.

    Squashed-Change-Id : Ib24865abc11691d6ac8cb0434ada1da674368a61

    Fix some bugs in multi-arf

    Fix some bugs relating to the use of buffers
    in the overlay frames.

    Fix bug where a mid-sequence overlay was
    propagating large partition and transform sizes into
    the subsequent frame because of:
    sf->last_partitioning_redo_frequency > 1 and
    sf->tx_size_search_method == USE_LARGESTALL

    Squashed-Change-Id : Ibf9ef39a5a5150f8cbdd2c9275abb0316c67873a

    Further dual arf changes : multi_arf_allowed.

    Add multi_arf_allowed flag.
    Re-initialize buffer indices every kf.
    Add some const indicators.

    Squashed-Change-Id : If86c39153517c427182691d2d4d4b7e90594be71

    Fixed VP9 denoiser COPY_BLOCK case

    Now copies the src to the correct location in the running average buffer.

    Squashed-Change-Id : I9c83c96dc7a97f42c8df16ab4a9f18b733181f34

    Fix test on maximum downscaling limits

    There is a normative scaling range of (x1/2, x16)
    for VP9. This patch fixes the maximum downscaling
    tests that are applied in the convolve function.

    The code used a maximum downscaling limit of x1/5
    for historic reasons related to the scalable
    coding work. Since the downsampling in this
    application is non-normative it will revert to
    using a separate non-normative scaler.

    Squashed-Change-Id : Ide80ed712cee82fe5cb3c55076ac428295a6019f
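
    The (x1/2, x16) limit described above reduces to a small predicate. This is a hedged sketch, not necessarily the patch's exact code, and the helper name valid_ref_frame_size is assumed here: a reference frame is usable if it is no more than 2x the coded frame in either dimension (x1/2 downscaling) and the coded frame is no more than 16x the reference (x16 upscaling).

```c
/* Sketch of the normative VP9 scaling-range check described above:
 * reject references outside the (x1/2, x16) range in either dimension. */
static int valid_ref_frame_size(int ref_w, int ref_h,
                                int this_w, int this_h) {
  return 2 * this_w >= ref_w &&  /* downscaling by at most 1/2 */
         2 * this_h >= ref_h &&
         this_w <= 16 * ref_w && /* upscaling by at most 16 */
         this_h <= 16 * ref_h;
}
```

    A fixed x1/5 downscaling cap, as in the old code, would both reject streams the spec allows and accept none the spec forbids only by accident, which is why the check is tightened to the normative bounds.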

    Add unit test to test user_priv parameter.

    Squashed-Change-Id : I6ba6171e43e0a43331ee0a7b698590b143979c44

    vp9: check tile column count

    the max is 6. there are assumptions throughout the decoder regarding
    this; fixes a crash with a fuzzed bitstream

    $ zzuf -s 5861 -r 0.01:0.05 -b 6- \
    < vp90-2-00-quantizer-00.webm.ivf \
    | dd of=invalid-vp90-2-00-quantizer-00.webm.ivf.s5861_r01-05_b6-.ivf \
    bs=1 count=81883

    Squashed-Change-Id : I6af41bb34252e88bc156a4c27c80d505d45f5642

    Adjust arf Q limits with multi-arf.

    Adjust enforced minimum arf Q deltas for non primary arfs
    in the middle of an arf/gf group.

    Squashed-Change-Id : Ie8034ffb3ac00f887d74ae1586d4cac91d6cace2

    Dual ARF changes : Buffer index selection.

    Add indirection to the selection of buffer indices.
    This is to help simplify things in the future if we
    have other codec features that switch indices.

    Limit the max GF interval for static sections to fit
    the gf_group structures.

    Squashed-Change-Id : I38310daaf23fd906004c0e8ee3e99e15570f84cb

    Reuse inter prediction result in real-time speed 6

    In real-time speed 6, no partition search is done. The inter
    prediction results obtained from picking mode can be reused in the
    following encoding process. A speed feature reuse_inter_pred_sby
    is added to enable the reuse only in speed 6.

    This patch doesn’t change encoding result. RTC set tests showed
    that the encoding speed gain is 2% - 5%.

    Squashed-Change-Id : I3884780f64ef95dd8be10562926542528713b92c

    Add vp9_ prefix to mv_pred and setup_pred_block functions

    Make these two functions accessible by both RD and non-RD coding
    modes.

    Squashed-Change-Id : Iecb39dbf3d65436286ea3c7ffaa9920d0b3aff85

    Replace cpi->common with preset variable cm

    This commit replaces a few use cases of cpi->common with preset
    variable cm, to avoid unnecessary pointer fetch in the non-RD
    coding mode.

    Squashed-Change-Id : I4038f1c1a47373b8fd7bc5d69af61346103702f6

    [spatial svc]Implement lag in frames for spatial svc

    Squashed-Change-Id : I930dced169c9d53f8044d2754a04332138347409

    [spatial svc]Don’t skip motion search in first pass encoding

    Squashed-Change-Id : Ia6bcdaf5a5b80e68176f60d8d00e9b5cf3f9bfe3

    decode_test_driver : fix type size warning

    like vpx_codec_decode(), vpx_codec_peek_stream_info() takes an unsigned
    int, not size_t, parameter for buffer size

    Squashed-Change-Id : I4ce0e1fbbde461c2e1b8fcbaac3cd203ed707460

    decode_test_driver : check HasFailure() in RunLoop

    avoids unnecessary errors due to e.g., read (Next()) failures

    Squashed-Change-Id : I70b1d09766456f1c55367d98299b5abd7afff842

    Allow lossless breakout in non-rd mode decision.

    This is very helpful for large moving windows in screencasts.

    Squashed-Change-Id : I91b5f9acb133281ee85ccd8f843e6bae5cadefca

    Revert "Revert 3 patches from Hangyu to get Chrome to build :"

    This patch reverts the previous revert from Jim and also add a
    variable user_priv in the FrameWorker to save the user_priv
    passed from the application. In the decoder_get_frame function,
    the user_priv will be binded with the img. This change is needed
    or it will fail the unit test added here :
    https://gerrit.chromium.org/gerrit/#/c/70610/

    This reverts commit 9be46e4565f553460a1bbbf58d9f99067d3242ce.

    Squashed-Change-Id : I376d9a12ee196faffdf3c792b59e6137c56132c1

    test.mk : remove renamed file

    vp90-2-15-fuzz-flicker.webm was renamed in :
    c3db2d8 error check vp9 superframe parsing

    Squashed-Change-Id : I229dd6ca4c662802c457beea0f7b4128153a65dc

    vp9cx.mk : move avx c files outside of x86inc block

    same reasoning as:
    9f3a0db vp9_rtcd: correct avx2 references

    these are all intrinsics, so they don't depend on x86inc.asm

    Squashed-Change-Id : I915beaef318a28f64bfa5469e5efe90e4af5b827

    Dual arf : Name changes.

    Cosmetic patch only in response to comments on
    previous patches suggesting a couple of name changes
    for consistency and clarity.

    Squashed-Change-Id : Ida3a359b0d5755345660d304a7697a3a3686b2a3

    Make non-RD intra mode search txfm size dependent

    This commit fixes the potential issue in the non-RD mode decision
    flow that only checks part of the block to estimate the cost. It
    was due to the use of fixed transform size, in replacing the
    largest transform block size. This commit enables per transform
    block cost estimation of the intra prediction mode in the non-RD
    mode decision.

    Squashed-Change-Id : I14ff92065e193e3e731c2bbf7ec89db676f1e132

    Fix quality regression for multi arf off case.

    Bug introduced during multiple iterations on : I3831*

    gf_group->arf_update_idx[] cannot currently be used
    to select the arf buffer index if buffer flipping on overlays
    is enabled (still currently the case when multi arf OFF).

    Squashed-Change-Id : I4ce9ea08f1dd03ac3ad8b3e27375a91ee1d964dc

    Enable real-time version reference motion vector search

    This commit enables a fast reference motion vector search scheme.
    It checks the nearest top and left neighboring blocks to decide the
    most probable predicted motion vector. If it finds that the two have
    the same motion vector, it then skips searching the exterior range for
    the second most probable motion vector, and correspondingly skips
    the check for NEARMV.

    The runtime of speed -5 goes down
    pedestrian at 1080p 29377 ms -> 27783 ms
    vidyo at 720p 11830 ms -> 10990 ms
    i.e., 6%-8% speed-up.

    For rtc set, the compression performance
    goes down by about -1.3% for both speed -5 and -6.

    Squashed-Change-Id : I2a7794fa99734f739f8b30519ad4dfd511ab91a5

    Add const mark to const values in non-RD coding mode

    Squashed-Change-Id : I65209fd1e06fc06833f6647cb028b414391a7017

    Change-Id : Ic0be67ac9ef48f64a8878a0b8f1b336f136bceac

  • About FFmpeg stream overlaying an object

    1 February 2014, by user3261087

    Hello, I have a question about blocking or overlaying an object in a live
    stream which is running through FFmpeg.

    I have my own stream server installed, and so far it is running fine with
    FFmpeg, but I'm stuck on a problem.

    The problem is that when there is a soccer event on the channel, I get a
    serial number in my stream. I want to overlay or block this; can someone
    help me, because I know that it is possible.

    The serial number shows for 1 or 2 minutes, each time in a different place
    on the screen, so it is not always in one place.

  • Audio & video not synchronized properly if I merge multiple videos in mp4parser

    1 October 2013, by maniya

    I have used mp4parser for merging video, with dynamic pause and record, capturing at most 6 seconds of video. In preview it works fine when the video is recorded with few pauses/records. If I try with more than 3 pauses/records, the last video file does not get merged properly with the audio. At the start of the video the sync is OK, but at the end the video hangs and the audio keeps playing on screen for the remaining file duration, about 1 second.

    My Recording manager

    public class RecordingManager implements Camera.ErrorCallback, MediaRecorder.OnErrorListener, MediaRecorder.OnInfoListener {

       private static final String TAG = RecordingManager.class.getSimpleName();
       private static final int FOCUS_AREA_RADIUS = 32;
       private static final int FOCUS_MAX_VALUE = 1000;
       private static final int FOCUS_MIN_VALUE = -1000;
       private static final long MINIMUM_RECORDING_TIME = 2000;
       private static final int MAXIMUM_RECORDING_TIME = 70 * 1000;
       private static final long LOW_STORAGE_THRESHOLD = 5 * 1024 * 1024;
       private static final long RECORDING_FILE_LIMIT = 100 * 1024 * 1024;

       private boolean paused = true;

       private MediaRecorder mediaRecorder = null;
       private boolean recording = false;

       private FrameLayout previewFrame = null;

       private boolean mPreviewing = false;

    //    private TextureView mTextureView = null;
    //    private SurfaceTexture mSurfaceTexture = null;
    //    private boolean mSurfaceTextureReady = false;
    //
       private SurfaceView surfaceView = null;
       private SurfaceHolder surfaceHolder = null;
       private boolean surfaceViewReady = false;

       private Camera camera = null;
       private Camera.Parameters cameraParameters = null;
       private CamcorderProfile camcorderProfile = null;

       private int mOrientation = -1;
       private OrientationEventListener mOrientationEventListener = null;

       private long mStartRecordingTime;
       private int mVideoWidth;
       private int mVideoHeight;
       private long mStorageSpace;

       private Handler mHandler = new Handler();
    //    private Runnable mUpdateRecordingTimeTask = new Runnable() {
    //        @Override
    //        public void run() {
    //            long recordingTime = System.currentTimeMillis() - mStartRecordingTime;
    //            Log.d(TAG, String.format("Recording time:%d", recordingTime));
    //            mHandler.postDelayed(this, CLIP_GRAPH_UPDATE_INTERVAL);
    //        }
    //    };
       private Runnable mStopRecordingTask = new Runnable() {
           @Override
           public void run() {
               stopRecording();
           }
       };

       private static RecordingManager mInstance = null;
       private Activity currentActivity = null;
       private String destinationFilepath = "";
       private String snapshotFilepath = "";

       public static RecordingManager getInstance(Activity activity, FrameLayout previewFrame) {
           if (mInstance == null || mInstance.currentActivity != activity) {
               mInstance = new RecordingManager(activity, previewFrame);
           }
           return mInstance;
       }

       private RecordingManager(Activity activity, FrameLayout previewFrame) {
           currentActivity = activity;
           this.previewFrame = previewFrame;
       }

       public int getVideoWidth() {
           return this.mVideoWidth;
       }
       public int getVideoHeight() {
           return this.mVideoHeight;
       }
       public void setDestinationFilepath(String filepath) {
           this.destinationFilepath = filepath;
       }
       public String getDestinationFilepath() {
           return this.destinationFilepath;
       }
       public void setSnapshotFilepath(String filepath) {
           this.snapshotFilepath = filepath;
       }
       public String getSnapshotFilepath() {
           return this.snapshotFilepath;
       }
       public void init(String videoPath, String snapshotPath) {
           Log.v(TAG, "init.");
           setDestinationFilepath(videoPath);
           setSnapshotFilepath(snapshotPath);
           if (!Utils.isExternalStorageAvailable()) {
               showStorageErrorAndFinish();
               return;
           }

           openCamera();
           if (camera == null) {
               showCameraErrorAndFinish();
               return;
           }
       }


       public void onResume() {
           Log.v(TAG, "onResume.");
           paused = false;

           // Open the camera
           if (camera == null) {
               openCamera();
               if (camera == null) {
                   showCameraErrorAndFinish();
                   return;
               }
           }

           // Initialize the surface texture or surface view
    //        if (useTexture() && mTextureView == null) {
    //            initTextureView();
    //            mTextureView.setVisibility(View.VISIBLE);
    //        } else if (!useTexture() && mSurfaceView == null) {
               initSurfaceView();
               surfaceView.setVisibility(View.VISIBLE);
    //        }

           // Start the preview
           if (!mPreviewing) {
               startPreview();
           }
       }

       private void openCamera() {
           Log.v(TAG, "openCamera");
           try {
               camera = Camera.open();
               camera.setErrorCallback(this);
               camera.setDisplayOrientation(90); // Since we only support portrait mode
               cameraParameters = camera.getParameters();
           } catch (RuntimeException e) {
               e.printStackTrace();
               camera = null;
           }
       }

       private void closeCamera() {
           Log.v(TAG, "closeCamera");
           if (camera == null) {
               Log.d(TAG, "Already stopped.");
               return;
           }

           camera.setErrorCallback(null);
           if (mPreviewing) {
               stopPreview();
           }
           camera.release();
           camera = null;
       }

       private void initSurfaceView() {
           surfaceView = new SurfaceView(currentActivity);
           surfaceView.getHolder().addCallback(new SurfaceViewCallback());
           surfaceView.setVisibility(View.GONE);
           FrameLayout.LayoutParams params = new LayoutParams(
                   LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT, Gravity.CENTER);
           surfaceView.setLayoutParams(params);
           Log.d(TAG, "add surface view to preview frame");
           previewFrame.addView(surfaceView);
       }

       private void releaseSurfaceView() {
           if (surfaceView != null) {
               previewFrame.removeAllViews();
               surfaceView = null;
               surfaceHolder = null;
               surfaceViewReady = false;
           }
       }

       private void startPreview() {
    //        if ((useTexture() && !mSurfaceTextureReady) || (!useTexture() && !mSurfaceViewReady)) {
    //            return;
    //        }

           Log.v(TAG, "startPreview.");
           if (mPreviewing) {
               stopPreview();
           }

           setCameraParameters();
           resizePreview();

           try {
    //            if (useTexture()) {
    //                mCamera.setPreviewTexture(mSurfaceTexture);
    //            } else {
                   camera.setPreviewDisplay(surfaceHolder);
    //            }
               camera.startPreview();
               mPreviewing = true;
           } catch (Exception e) {
               closeCamera();
               e.printStackTrace();
               Log.e(TAG, "startPreview failed.");
           }

       }

       private void stopPreview() {
           Log.v(TAG, "stopPreview");
           if (camera != null) {
               camera.stopPreview();
               mPreviewing = false;
           }
       }

       public void onPause() {
           paused = true;

           if (recording) {
               stopRecording();
           }
           closeCamera();

    //        if (useTexture()) {
    //            releaseSurfaceTexture();
    //        } else {
               releaseSurfaceView();
    //        }
       }

       private void setCameraParameters() {
           if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)) {
               camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
           } else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_480P)) {
               camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_480P);
           } else {
               camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
           }
           mVideoWidth = camcorderProfile.videoFrameWidth;
           mVideoHeight = camcorderProfile.videoFrameHeight;
           camcorderProfile.fileFormat = MediaRecorder.OutputFormat.MPEG_4;
           camcorderProfile.videoFrameRate = 30;

           Log.v(TAG, "mVideoWidth=" + mVideoWidth + " mVideoHeight=" + mVideoHeight);
           cameraParameters.setPreviewSize(mVideoWidth, mVideoHeight);

            // getSupportedWhiteBalance() may return null when the setting is unsupported
            List<String> whiteBalanceModes = cameraParameters.getSupportedWhiteBalance();
            if (whiteBalanceModes != null && whiteBalanceModes.contains(Camera.Parameters.WHITE_BALANCE_AUTO)) {
                cameraParameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_AUTO);
            }

            List<String> focusModes = cameraParameters.getSupportedFocusModes();
            if (focusModes != null && focusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
                cameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
            }

            cameraParameters.setRecordingHint(true);
            cameraParameters.set("cam_mode", 1); // vendor-specific key required by some devices

           camera.setParameters(cameraParameters);
           cameraParameters = camera.getParameters();

           camera.setDisplayOrientation(90);
            android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
            android.hardware.Camera.getCameraInfo(0, info); // populate info; back camera is usually id 0
            Log.d(TAG, info.orientation + " degree");
       }

       private void resizePreview() {
           Log.d(TAG, String.format("Video size:%d|%d", mVideoWidth, mVideoHeight));

           Point optimizedSize = getOptimizedPreviewSize(mVideoWidth, mVideoHeight);
           Log.d(TAG, String.format("Optimized size:%d|%d", optimizedSize.x, optimizedSize.y));

            ViewGroup.LayoutParams params = previewFrame.getLayoutParams();
           params.width = optimizedSize.x;
           params.height = optimizedSize.y;
           previewFrame.setLayoutParams(params);
       }

       public void setOrientation(int ori) {
           this.mOrientation = ori;
       }

       public void setOrientationEventListener(OrientationEventListener listener) {
           this.mOrientationEventListener = listener;
       }

       public Camera getCamera() {
           return camera;
       }

       @SuppressWarnings("serial")
       public void setFocusArea(float x, float y) {
           if (camera != null) {
               int viewWidth = surfaceView.getWidth();
               int viewHeight = surfaceView.getHeight();

                int focusCenterX = FOCUS_MAX_VALUE - (int) (x / viewWidth * (FOCUS_MAX_VALUE - FOCUS_MIN_VALUE));
                int focusCenterY = FOCUS_MIN_VALUE + (int) (y / viewHeight * (FOCUS_MAX_VALUE - FOCUS_MIN_VALUE));
                final int left = Math.max(focusCenterY - FOCUS_AREA_RADIUS, FOCUS_MIN_VALUE);
                final int top = Math.max(focusCenterX - FOCUS_AREA_RADIUS, FOCUS_MIN_VALUE);
                final int right = Math.min(focusCenterY + FOCUS_AREA_RADIUS, FOCUS_MAX_VALUE);
                final int bottom = Math.min(focusCenterX + FOCUS_AREA_RADIUS, FOCUS_MAX_VALUE);

               Camera.Parameters params = camera.getParameters();
                params.setFocusAreas(new ArrayList<Camera.Area>() {
                   {
                       add(new Camera.Area(new Rect(left, top, right, bottom), 1000));
                   }
               });
               camera.setParameters(params);
               camera.autoFocus(new AutoFocusCallback() {
                   @Override
                   public void onAutoFocus(boolean success, Camera camera) {
                       Log.d(TAG, "onAutoFocus");
                   }
               });
           }
       }

       public void startRecording(String destinationFilepath) {
           if (!recording) {
               updateStorageSpace();
               setDestinationFilepath(destinationFilepath);
                if (mStorageSpace <= LOW_STORAGE_THRESHOLD) {
                   Log.v(TAG, "Storage issue, ignore the start request");
                   Toast.makeText(currentActivity, "Storage issue, ignore the recording request", Toast.LENGTH_LONG).show();
                   return;
               }

               if (!prepareMediaRecorder()) {
                   Toast.makeText(currentActivity, "prepareMediaRecorder failed.", Toast.LENGTH_LONG).show();
                   return;
               }

                Log.d(TAG, "Successfully prepared media recorder.");
               try {
                   mediaRecorder.start();
               } catch (RuntimeException e) {
                   Log.e(TAG, "MediaRecorder start failed.");
                   releaseMediaRecorder();
                   return;
               }

               mStartRecordingTime = System.currentTimeMillis();

               if (mOrientationEventListener != null) {
                   mOrientationEventListener.disable();
               }

               recording = true;
           }
       }

       public void stopRecording() {
           if (recording) {
               if (!paused) {
                   // Capture at least 1 second video
                   long currentTime = System.currentTimeMillis();
                    if (currentTime - mStartRecordingTime < MINIMUM_RECORDING_TIME) {
                       mHandler.postDelayed(mStopRecordingTask, MINIMUM_RECORDING_TIME - (currentTime - mStartRecordingTime));
                       return;
                   }
               }

               if (mOrientationEventListener != null) {
                   mOrientationEventListener.enable();
               }

                // mHandler.removeCallbacks(mUpdateRecordingTimeTask);

               try {
                   mediaRecorder.setOnErrorListener(null);
                   mediaRecorder.setOnInfoListener(null);
                   mediaRecorder.stop(); // stop the recording
                   Toast.makeText(currentActivity, "Video file saved.", Toast.LENGTH_LONG).show();

                   long stopRecordingTime = System.currentTimeMillis();
                   Log.d(TAG, String.format("stopRecording. file:%s duration:%d", destinationFilepath, stopRecordingTime - mStartRecordingTime));

                   // Calculate the duration of video
                   MediaMetadataRetriever mmr = new MediaMetadataRetriever();
                   mmr.setDataSource(this.destinationFilepath);
                   String _length = mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
                   if (_length != null) {
                       Log.d(TAG, String.format("clip duration:%d", Long.parseLong(_length)));
                   }

                    // Take a snapshot of the video; createVideoThumbnail() may return null
                    Bitmap snapshot = ThumbnailUtils.createVideoThumbnail(this.destinationFilepath, Thumbnails.MICRO_KIND);
                    if (snapshot != null) {
                        try {
                            FileOutputStream out = new FileOutputStream(this.snapshotFilepath);
                            snapshot.compress(Bitmap.CompressFormat.JPEG, 70, out);
                            out.close();
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }

    //                mActivity.showPlayButton();

               } catch (RuntimeException e) {
                   e.printStackTrace();
                   Log.e(TAG, e.getMessage());
                   // if no valid audio/video data has been received when stop() is
                   // called
               } finally {

                   releaseMediaRecorder(); // release the MediaRecorder object
                   if (!paused) {
                       cameraParameters = camera.getParameters();
                   }
                   recording = false;
               }

           }
       }

       public void setRecorderOrientation(int orientation) {
           // For back camera only
           if (orientation != -1) {
               Log.d(TAG, "set orientationHint:" + (orientation + 135) % 360 / 90 * 90);
               mediaRecorder.setOrientationHint((orientation + 135) % 360 / 90 * 90);
            } else {
                Log.d(TAG, "orientationHint not set on mediaRecorder");
            }
       }

       private boolean prepareMediaRecorder() {
           mediaRecorder = new MediaRecorder();

           camera.unlock();
           mediaRecorder.setCamera(camera);

           mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
           mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);

           mediaRecorder.setProfile(camcorderProfile);

           mediaRecorder.setMaxDuration(MAXIMUM_RECORDING_TIME);
           mediaRecorder.setOutputFile(this.destinationFilepath);

           try {
               mediaRecorder.setMaxFileSize(Math.min(RECORDING_FILE_LIMIT, mStorageSpace - LOW_STORAGE_THRESHOLD));
            } catch (RuntimeException exception) {
                // Ignored: setMaxFileSize() may be unsupported on some devices.
            }

           setRecorderOrientation(mOrientation);

           if (!useTexture()) {
               mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
           }

           try {
               mediaRecorder.prepare();
           } catch (IllegalStateException e) {
               releaseMediaRecorder();
               return false;
           } catch (IOException e) {
               releaseMediaRecorder();
               return false;
           }

           mediaRecorder.setOnErrorListener(this);
           mediaRecorder.setOnInfoListener(this);

           return true;

       }

       private void releaseMediaRecorder() {
           if (mediaRecorder != null) {
               mediaRecorder.reset(); // clear recorder configuration
               mediaRecorder.release(); // release the recorder object
               mediaRecorder = null;
               camera.lock(); // lock camera for later use
           }
       }

       private Point getOptimizedPreviewSize(int videoWidth, int videoHeight) {
           Display display = currentActivity.getWindowManager().getDefaultDisplay();
           Point size = new Point();
           display.getSize(size);

           Point optimizedSize = new Point();
           optimizedSize.x = size.x;
           optimizedSize.y = (int) ((float) videoWidth / (float) videoHeight * size.x);

           return optimizedSize;
       }

       private void showCameraErrorAndFinish() {
           DialogInterface.OnClickListener buttonListener = new DialogInterface.OnClickListener() {
               @Override
               public void onClick(DialogInterface dialog, int which) {
                   currentActivity.finish();
               }
           };
           new AlertDialog.Builder(currentActivity).setCancelable(false)
                   .setTitle("Camera error")
                   .setMessage("Cannot connect to the camera.")
                   .setNeutralButton("OK", buttonListener)
                   .show();
       }

       private void showStorageErrorAndFinish() {
           DialogInterface.OnClickListener buttonListener = new DialogInterface.OnClickListener() {
               @Override
               public void onClick(DialogInterface dialog, int which) {
                   currentActivity.finish();
               }
           };
           new AlertDialog.Builder(currentActivity).setCancelable(false)
                   .setTitle("Storage error")
                   .setMessage("Cannot read external storage.")
                   .setNeutralButton("OK", buttonListener)
                   .show();
       }

       private void updateStorageSpace() {
           mStorageSpace = getAvailableSpace();
           Log.v(TAG, "updateStorageSpace mStorageSpace=" + mStorageSpace);
       }

       private long getAvailableSpace() {
           String state = Environment.getExternalStorageState();
           Log.d(TAG, "External storage state=" + state);
           if (Environment.MEDIA_CHECKING.equals(state)) {
               return -1;
           }
           if (!Environment.MEDIA_MOUNTED.equals(state)) {
               return -1;
           }

           File directory = currentActivity.getExternalFilesDir("vine");
           directory.mkdirs();
           if (!directory.isDirectory() || !directory.canWrite()) {
               return -1;
           }

           try {
               StatFs stat = new StatFs(directory.getAbsolutePath());
               return stat.getAvailableBlocks() * (long) stat.getBlockSize();
           } catch (Exception e) {
               Log.i(TAG, "Fail to access external storage", e);
           }
           return -1;
       }

       private boolean useTexture() {
           return false;
    //        return Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1;
       }

       private class SurfaceViewCallback implements SurfaceHolder.Callback {

           @Override
           public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
               Log.v(TAG, "surfaceChanged. width=" + width + ". height=" + height);
           }

           @Override
           public void surfaceCreated(SurfaceHolder holder) {
               Log.v(TAG, "surfaceCreated");
               surfaceViewReady = true;
               surfaceHolder = holder;
               startPreview();
           }

           @Override
           public void surfaceDestroyed(SurfaceHolder holder) {
               Log.d(TAG, "surfaceDestroyed");
               surfaceViewReady = false;
           }

       }

       @Override
       public void onError(int error, Camera camera) {
           Log.e(TAG, "Camera onError. what=" + error + ".");
            if (error == Camera.CAMERA_ERROR_SERVER_DIED) {
                // The media server died; the camera must be released and reopened.
            } else if (error == Camera.CAMERA_ERROR_UNKNOWN) {
                // Unspecified camera error.
            }
       }

       @Override
       public void onInfo(MediaRecorder mr, int what, int extra) {
           if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
               stopRecording();
           } else if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_FILESIZE_REACHED) {
               stopRecording();
               Toast.makeText(currentActivity, "Size limit reached", Toast.LENGTH_LONG).show();
           }
       }

       @Override
       public void onError(MediaRecorder mr, int what, int extra) {
           Log.e(TAG, "MediaRecorder onError. what=" + what + ". extra=" + extra);
           if (what == MediaRecorder.MEDIA_RECORDER_ERROR_UNKNOWN) {
               stopRecording();
           }
       }

    }
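The `setRecorderOrientation()` expression `(orientation + 135) % 360 / 90 * 90` snaps the raw sensor angle to the nearest multiple of 90° and folds in the 90° back-camera offset. A standalone sketch of that arithmetic (the `OrientationHintDemo` class and method name are illustrative, not part of the project above):

```java
public class OrientationHintDemo {
    // Same expression as setRecorderOrientation(): round the device
    // orientation to the nearest 90-degree step, then add the 90-degree
    // back-camera offset (all mod 360).
    static int snapTo90(int orientation) {
        return (orientation + 135) % 360 / 90 * 90;
    }

    public static void main(String[] args) {
        for (int o : new int[] {0, 44, 46, 90, 180, 300, 359}) {
            System.out.println(o + " -> " + snapTo90(o));
        }
    }
}
```

For example, 46° rounds to 90° and the offset makes the hint 180°, which matches what the recorder receives in the code above.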

    VideoUtils

    public class VideoUtils {
       private static final String TAG = VideoUtils.class.getSimpleName();

        // MP4 track-header transform (row-major 3x3): a 90-degree rotation
        static double[] matrix = new double[] { 0.0, 1.0, 0.0, -1.0, 0.0, 0.0, 0.0,
                0.0, 1.0 };

       public static boolean MergeFiles(String speratedDirPath,
               String targetFileName) {
            File videoSourceDirFile = new File(speratedDirPath);
            String[] videoList = videoSourceDirFile.list();
            if (videoList == null) {
                return false;
            }
            List<Track> videoTracks = new LinkedList<Track>();
            List<Track> audioTracks = new LinkedList<Track>();
            for (String file : videoList) {
                Log.d(TAG, "source file: " + speratedDirPath
                        + File.separator + file);
               try {
                   FileChannel fc = new FileInputStream(speratedDirPath
                           + File.separator + file).getChannel();
                   Movie movie = MovieCreator.build(fc);
                   for (Track t : movie.getTracks()) {
                       if (t.getHandler().equals("soun")) {
                           audioTracks.add(t);
                       }
                    if (t.getHandler().equals("vide")) {
                        videoTracks.add(t);
                    }
                   }
               } catch (FileNotFoundException e) {
                   e.printStackTrace();
                   return false;
               } catch (IOException e) {
                   e.printStackTrace();
                   return false;
               }
           }

           Movie result = new Movie();

           try {
               if (audioTracks.size() > 0) {
                   result.addTrack(new AppendTrack(audioTracks
                           .toArray(new Track[audioTracks.size()])));
               }
               if (videoTracks.size() > 0) {
                   result.addTrack(new AppendTrack(videoTracks
                           .toArray(new Track[videoTracks.size()])));
               }
                IsoFile out = new DefaultMp4Builder().build(result);

                FileChannel fc = new RandomAccessFile(targetFileName, "rw").getChannel();

               Log.d(TAG, "target file:" + targetFileName);
                // Apply the 90-degree rotation matrix to the second track box's header
                TrackBox tb = out.getMovieBox().getBoxes(TrackBox.class).get(1);
                tb.getTrackHeaderBox().setMatrix(matrix);

               fc.position(0);
               out.getBox(fc);
               fc.close();
                for (String file : videoList) {
                    File segment = new File(speratedDirPath + File.separator + file);
                    segment.delete();
                }
                boolean deleted = videoSourceDirFile.delete();
                Log.d(TAG, "source dir deleted: " + deleted);
           } catch (IOException e) {
               e.printStackTrace();
               return false;
           }

           return true;
       }

        public static boolean clearFiles(String speratedDirPath) {
            File videoSourceDirFile = new File(speratedDirPath);
            File[] videoList = videoSourceDirFile.listFiles();
            if (videoList != null) {
                for (File video : videoList) {
                    video.delete();
                }
                videoSourceDirFile.delete();
            }
            return true;
        }

        public static int createSnapshot(String videoFile, int kind, String snapshotFilepath) {
            return 0;
        }

       public static int createSnapshot(String videoFile, int width, int height, String snapshotFilepath) {
           return 0;
       }
    }
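The static `matrix` above is written in the MP4 track-header row-major layout {a, b, u, c, d, v, tx, ty, w}; applied to a row vector [x y 1] it maps (x, y) to (-y, x), i.e. a 90° rotation. A small self-contained check of that claim (the `TrackMatrixDemo` class is illustrative, not part of the project):

```java
public class TrackMatrixDemo {
    // Apply an MP4 track-header matrix {a,b,u, c,d,v, tx,ty,w} to a point,
    // using the row-vector convention [x y 1] * M.
    static double[] apply(double[] m, double x, double y) {
        return new double[] {
            x * m[0] + y * m[3] + m[6], // x' = a*x + c*y + tx
            x * m[1] + y * m[4] + m[7]  // y' = b*x + d*y + ty
        };
    }

    public static void main(String[] args) {
        double[] rotate90 = {0.0, 1.0, 0.0, -1.0, 0.0, 0.0, 0.0, 0.0, 1.0};
        double[] p = apply(rotate90, 1.0, 0.0);
        System.out.println(p[0] + ", " + p[1]); // the unit x-axis maps onto the y-axis
    }
}
```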

     My reference code project is available at:

    https://github.com/jwfing/AndroidVideoKit