This item is licensed under the Korea Open Government License.
dc.contributor.author
김민석
dc.contributor.author
이재열
dc.contributor.author
김재성
dc.contributor.author
김명일
dc.contributor.author
서동우
dc.contributor.author
김호윤
dc.date.accessioned
2022-03-23T08:09:55Z
dc.date.available
2022-03-23T08:09:55Z
dc.date.issued
2019-05-08
dc.identifier.issn
2508-4003
dc.identifier.uri
https://repository.kisti.re.kr/handle/10580/16481
dc.description.abstract
This paper proposes a new method to effectively visualize modeling and simulation (M&S) results in a real environment using augmented reality (AR) and deep learning. The proposed approach makes it possible to dynamically generate an M&S analysis space of the real environment, to recognize real objects using a deep learning technique, and to place the analyzed M&S results onto them. To construct an M&S space dynamically, we perform area learning on the real space using a smart device equipped with an RGB-D camera. In addition, real objects are recognized through deep-learning-based object detection. Spatial mapping and user interaction are then conducted to match each recognized real object with its corresponding M&S model in the mobile AR environment. A proof-of-concept system was developed to demonstrate the advantages and feasibility of the proposed method. The proposed approach can thus be used to seamlessly integrate M&S models into various real spaces and to review M&S results more consistently and effectively.