Validation of Infrared Scanner by the Assistance of Geomatic Documentation of the Historical Building of Etemad al-Saltanah - Journal of Research on Archaeometry
---------------------------------------------------------------------------
Year 5, Issue 2 (2019) | JRA 2019, 5(2): 131-147



Taher Tolou Del M S, Kamali Tabrizi S. Validation of Infrared Scanner by the Assistance of Geomatic Documentation of the Historical Building of Etemad al-Saltanah. JRA 2019; 5(2): 131-147
URL: http://jra-tabriziau.ir/article-1-204-en.html
1- Shahid Rajaee Teacher Training University
2- Shahid Rajaee Teacher Training University, sina_kamali@yahoo.com
Abstract:
Historical buildings undergo many changes and suffer much damage over time, and therefore need to be documented. These changes may result from natural causes such as rainfall, wind, earthquakes, floods, and explosions, or from human action, whether deliberate or inadvertent. Efforts should therefore be directed toward 3D documentation of such buildings so that, in addition to precisely recording a building's current condition and the damage it has sustained, future trends of change and damage can be predicted and their progression prevented. A documentation system is selected according to the dimensions of the object, the required point-cloud density, and the required accuracy. Because current laser-based and photography-based (photogrammetric) methods of 3D reconstruction are expensive or complex, low-cost infrared sensors such as the Structure Sensor and the Kinect have been introduced as promising alternatives.

An infrared scanner is a portable depth-sensing device consisting of a color sensor and a depth sensor, capable of capturing color images and object depths within its visible and accessible range. Such sensors are commonly called RGB-D cameras because they output a standard RGB image with an additional depth channel per pixel (Fig. 2). The most recent development in infrared documentation systems is the portable Structure Sensor, produced by Occipital in collaboration with PrimeSense. This small, lightweight, wireless sensor directly collects and records point-cloud data and creates three-dimensional models of interiors. Because the Structure Sensor is a new technology in metric documentation, its capabilities for documenting cultural heritage have not yet been evaluated. According to the error figures published for the Structure Sensor, the scanner achieves a precision of more than 99% for objects between 0.4 and 3.5 meters, which would make it suitable for heritage documentation. The main purpose of this research is to verify that claim through experimental tests and thereby assess the suitability of this tool for cultural heritage documentation.

To test the Structure Sensor experimentally, the historical house of Etemad al-Saltanah was documented (Fig. 8) and processed, and the results were compared with the actual dimensions of the house (Table 4). The results show that this documentation system is not suitable for 3D capture and reconstruction of historical buildings and does not deliver the required, claimed level of precision (Table 5). The precision of the Structure Sensor for documenting museum objects was also assessed by scanning a scale model of the Imam Mosque of Isfahan, Iran (Fig. 11). The results (Table 6) indicate that the Structure Sensor is suitable only for historical objects with dimensions between 0.3 and 2 meters, for which it achieves a precision of more than 95%, acceptable under the cadastral spatial-information regulations. The number of points per capture varies between 10³ and 10⁶ (Fig. 12), and capture volumes of up to 5 m³ are achievable within the acceptable root-mean-square error; beyond that, the object exceeds the scanner's capability. A Pearson correlation test showed that scanner error increases with object size (Table 3).
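As background to the RGB-D description above, the following is a minimal sketch (in Python, not the authors' code) of how a per-pixel depth map is back-projected into a point cloud under a standard pinhole camera model. The frame size, intrinsic parameters, and depth values are hypothetical illustration inputs, not Structure Sensor specifications.

import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters, H x W) into an N x 3 point array."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # per-pixel column/row indices
    z = depth
    x = (u - cx) * z / fx   # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

# Hypothetical 640 x 480 frame, uniformly 1.5 m deep (inside the 0.4-3.5 m range).
depth = np.full((480, 640), 1.5)
cloud = depth_to_point_cloud(depth, fx=570.0, fy=570.0, cx=320.0, cy=240.0)
print(cloud.shape)  # about 3 x 10^5 points, within the 10^3-10^6 per-capture range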
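The dimensional validation reported above (per-object precision, root-mean-square error, and the Pearson correlation between object size and measurement error) amounts to a short computation. The sketch below, again in Python, uses hypothetical reference and scanned dimensions purely to illustrate the calculation; the values are not data from the paper.

import numpy as np

# Hypothetical tape-measured (reference) and scanner-derived dimensions, in meters.
actual  = np.array([0.35, 0.80, 1.20, 1.95, 3.40])
scanned = np.array([0.36, 0.82, 1.17, 1.88, 3.10])

abs_error = np.abs(scanned - actual)

# Per-object precision as a percentage, matching the paper's "precision of
# more than 95%" phrasing: 100 * (1 - relative error).
precision = 100.0 * (1.0 - abs_error / actual)

# Root-mean-square error over all measurements (here in meters).
rmse = np.sqrt(np.mean((scanned - actual) ** 2))

# Pearson correlation between object size and absolute error; a positive r
# corresponds to the reported trend of error growing with size (Table 3).
r = np.corrcoef(actual, abs_error)[0, 1]

print("precision per object (%):", np.round(precision, 1))
print(f"RMSE: {rmse:.3f} m | Pearson r(size, error): {r:.2f}")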
 
Full-Text [PDF 1873 kb]
Article Type: Original Research | Subject: Conservation Science
Received: 2019/09/04 | Accepted: 2019/11/24 | Published: 2019/12/30 | ePublished: 2019/12/30


Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
