Hierarchical Occlusion Handling Methods for Accurate Human/Body Tracking
- Human and body part tracking is one of the important problems in human motion analysis. To achieve accurate human and body part tracking, we need to solve several problems: appearance change, shape variation, rapid motion, cluttered background, and extra-/self-occlusion. Among these, the extra-/self-occlusion problem is the most difficult to solve because invisibility greatly affects the accuracy of human and body part tracking. Specifically, extra-occlusion often occurs in 2D human tracking in crowded environments, while self-occlusion often occurs in 3D body part tracking in personal environments. This thesis proposes to solve the extra-/self-occlusion problem with hierarchical occlusion handling methods, as follows.
In the 2D human tracking, we define the 2D hierarchical human structure, a tree-like structure with hierarchical multi-features: full body -> body parts -> blocks -> points. We estimate the degree of occlusion by counting the overlapping pixels among the bounding boxes of humans, and select an appropriate feature level according to the degree of occlusion. We then apply one of four occlusion handling methods according to the selected feature level, as follows. At the full body (or body part) level, we take a shape-based occlusion handling method using full body (or body part) association, which finds the best match of position, size, and appearance among full bodies (or body parts) between two consecutive frames. At the block level, we take a spatial relation-based occlusion handling method using a dynamic Markov random field (DMRF) model, which dynamically updates the geometric structure of image blocks according to their visibility. At the point level, we take an appearance-based occlusion handling method using scale-invariant feature transform (SIFT) point matching, which finds the best match of SIFT features among points between two consecutive frames. This hierarchical extra-occlusion handling method guarantees human tracking irrespective of the degree of occlusion. Experimental results on challenging public databases, the CAVIAR set and ETH Mobile Scene (ETHMS), show that the proposed method outperforms other state-of-the-art tracking methods in terms of low tracking error, high tracking precision, and high tracking accuracy.
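The feature-level selection described above can be sketched as follows. This is a minimal illustration, not the thesis implementation: the overlap measure (overlapping pixels normalized by the smaller box) and the thresholds 0.3 and 0.6 are assumptions made for the example.

```python
# Hypothetical sketch: estimate the degree of occlusion from the overlap
# of two human bounding boxes, then pick a level of the 2D hierarchical
# human structure (full body -> body parts -> blocks -> points).
# Normalization and thresholds are illustrative, not from the thesis.

def overlap_area(a, b):
    """Number of overlapping pixels of two boxes (x1, y1, x2, y2)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0, w) * max(0, h)

def degree_of_occlusion(a, b):
    """Overlap normalized by the smaller box area, in [0, 1]."""
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return overlap_area(a, b) / min(area(a), area(b))

def select_feature_level(degree):
    """Map the degree of occlusion to a feature level of the hierarchy."""
    if degree == 0.0:
        return "full body"   # shape-based full body association
    if degree < 0.3:
        return "body parts"  # shape-based body part association
    if degree < 0.6:
        return "blocks"      # DMRF spatial relation-based handling
    return "points"          # SIFT point matching
```

For example, two 10x10 boxes offset by half their width overlap in 50 of 100 pixels, giving a degree of 0.5 and selecting the block level.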
Although human tracking with the proposed hierarchical occlusion handling method can solve the extra-occlusion problem considerably, it does not give satisfactory performance due to the limitation of 2D appearance information. To overcome this limitation, we take 3D images, which can easily resolve the extra-occlusion problem using depth information. However, we still need to solve another occlusion problem: some body parts are self-occluded by other body parts.
In the 3D body part tracking, we define the 3D hierarchical human body model, a tree-like structure with 10 body parts: torso -> head, upper arms, and upper legs -> lower arms and lower legs. We track each body part in a hierarchical manner using the iterative closest point (ICP) algorithm. We then determine whether a body part is self-occluded based on the ICP tracking error and the aspect ratio of the major and minor axis lengths of the 3D data: the body part is self-occluded when both the error and the ratio are large. If a body part is self-occluded, we estimate its plausible position and rotation from the tracking result of the hierarchically lower body part. In addition, we propose a fast ICP algorithm using two acceleration techniques: hierarchical model point selection and logarithmic data point search. Experimental results on the PHD07 database show that the proposed method provides good tracking performance in terms of low tracking error, high tracking robustness, high tracking convergence rate, and high operating frame rate.
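The self-occlusion test above can be sketched as follows. This is a hedged illustration under stated assumptions: the axis lengths are taken from a PCA of the part's 3D point cloud, and the thresholds `err_thresh` and `ratio_thresh` are hypothetical values, not those used in the thesis.

```python
# Hypothetical sketch of the self-occlusion test for one body part:
# a part is flagged self-occluded when BOTH its ICP residual error and
# the major/minor axis aspect ratio of its 3D points are large.
# PCA-based axis estimation and thresholds are assumptions.
import numpy as np

def axis_aspect_ratio(points):
    """Ratio of major to minor principal-axis length of an Nx3 cloud."""
    cov = np.cov(points.T)                 # np.cov centers the data
    eigvals = np.linalg.eigvalsh(cov)      # ascending order
    return float(np.sqrt(eigvals[-1] / max(eigvals[0], 1e-12)))

def is_self_occluded(icp_error, points, err_thresh=5.0, ratio_thresh=4.0):
    """Self-occluded only when both error and aspect ratio are large."""
    return icp_error > err_thresh and axis_aspect_ratio(points) > ratio_thresh
```

When a part is flagged this way, the tracker would fall back to estimating its position and rotation from the hierarchically lower body part, as the abstract describes.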