Development of a 3D Localization Method for an Unmanned Underwater Vehicle for Manipulation in Structured Underwater Environments
- An essential component of an Unmanned Underwater Vehicle (UUV) operating in partially known environments is a reliable localization system. The goal is typically to develop a UUV capable of performing various predefined tasks, and localization plays a central role in the vehicle's performance. A reliable localization system is also valuable in many other applications where a map of the environment is available: inspection of dams, harbors, and nuclear reactors are clear examples of missions performed in structured scenarios where a map usually exists. Localization is likewise essential for picking up and operating on objects, which motivates research on localization for manipulation in structured underwater environments. An Underwater Vehicle-Manipulator System (UVMS) has complex dynamics and is subject to significant uncertainty underwater, so localization based on the UVMS dynamics is time-consuming and inaccurate; it should instead rely on sensor information alone. Manipulation tasks require target recognition, stable estimation, positional accuracy, and map information. The problem divides broadly into two cases, one in which the target is detected and one in which it is not, and the two must be combined. Sensors are therefore chosen with this situation in mind: a vision system recognizes the target, a sonar sensor exploits the map information, and an IMU (Inertial Measurement Unit) compensates the vision and sonar systems. To achieve this, sensor models are developed for the mission and then fused with an extended Kalman filter. New hardware was also built for experiments in a structured underwater environment. This paper presents the sensor models, the fusion algorithm, and results from both off-line and on-line experiments.
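The fusion scheme described above can be sketched as a Kalman-filter loop in which the IMU drives the prediction step while vision (when the target is detected) or sonar-plus-map (when it is not) supplies position updates. The sketch below is a minimal linear Kalman-filter skeleton, the linear special case of the EKF the abstract mentions; the constant-velocity state model, the noise values, and the class name `SimpleEKF` are illustrative assumptions, not the thesis's actual models or parameters.

```python
import numpy as np

class SimpleEKF:
    """Minimal sketch: IMU-driven prediction, position updates from
    vision or sonar. All models and noise values are assumed for
    illustration, not taken from the thesis."""

    def __init__(self, dt=0.1):
        self.dt = dt
        # State: [x, y, z, vx, vy, vz]
        self.x = np.zeros(6)
        self.P = np.eye(6)
        # Constant-velocity transition, driven by IMU acceleration input
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)
        self.B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
        self.Q = 0.01 * np.eye(6)  # process noise (assumed)
        # Both vision and sonar are modeled here as 3-D position fixes
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])

    def predict(self, accel):
        """Propagate the state with one IMU acceleration sample."""
        self.x = self.F @ self.x + self.B @ accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, meas_var):
        """Fuse a 3-D position fix; meas_var encodes how much the
        current sensor (vision vs. sonar) is trusted."""
        R = meas_var * np.eye(3)
        y = z - self.H @ self.x                      # innovation
        S = self.H @ self.P @ self.H.T + R           # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
```

Combining the two cases then amounts to calling `update` with the vision fix and a small `meas_var` when the target is detected, and with the sonar/map fix and a larger `meas_var` otherwise, while `predict` runs at the IMU rate throughout.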