Research Papers

Model-Based Coarse-Fine Virtual Calibration and Visual Servo for Augmented Reality-Assisted Peg-in-Hole Microassembly

Author and Article Information
Ren-Jung Chang

Mechanical Engineering Department,
National Cheng Kung University,
Tainan 701, Taiwan, China
e-mail: rjchang@mail.ncku.edu.tw

Jun-Fu Liu

Mechanical Engineering Department,
National Cheng Kung University,
Tainan 701, Taiwan, China

¹Corresponding author.

Contributed by the Manufacturing Engineering Division of ASME for publication in the JOURNAL OF MICRO- AND NANO-MANUFACTURING. Manuscript received May 25, 2018; final manuscript received September 16, 2018; published online October 16, 2018. Assoc. Editor: Marriner Merrill.

J. Micro Nano-Manuf. 6(4), 041002 (Oct 16, 2018) (12 pages). Paper No: JMNM-18-1016; doi: 10.1115/1.4041531. History: Received May 25, 2018; Revised September 16, 2018.

A three-dimensional (3D) virtual calibration and visual servo scheme is implemented for augmented reality (AR)-assisted peg-in-hole microassembly operations. By employing a 3D model and ray casting, the 3D coordinates on the virtual mating rod corresponding to the two-dimensional (2D) virtual image points are extracted. The detection and tracking of image feature points for calibration are carried out by the proposed algorithm of regional template matching (TM) and scanning with edge fitting (RTM-SEF). To achieve subpixel error between the feature points in the real and virtual images, a coarse-fine virtual calibration method is proposed. With regard to the images viewed by the real and virtual cameras, a calibrated virtual camera is utilized to track the mating rod. A visual servo control law with coarse and fine tuning is proposed to ensure subpixel error between the most important feature point in the real and virtual images. The AR technology is mainly employed in aligning the micropeg with the mating hole for inserting a micropeg of 80 μm diameter and 1–1.4 mm length into a mating rod with a 100 μm hole.
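As a rough illustration of the ray-casting step, the sketch below back-projects a 2D virtual image point through an assumed pinhole camera model and intersects the resulting ray with a cylinder standing in for the mating rod. The intrinsics, rod radius, camera placement, and function names are all hypothetical; the paper extracts these 3D coordinates by ray casting against the full 3D model in the virtual environment.

```python
import numpy as np

def pixel_to_ray(u, v, K):
    """Back-project image point (u, v) through intrinsic matrix K
    to a unit-length ray direction in camera coordinates."""
    p = np.linalg.solve(K, np.array([u, v, 1.0]))
    return p / np.linalg.norm(p)

def intersect_ray_cylinder(origin, d, radius):
    """Intersect the ray origin + t*d with an infinite cylinder of the
    given radius about the y-axis (x^2 + z^2 = r^2).
    Returns the nearest surface point, or None if the ray misses."""
    a = d[0]**2 + d[2]**2
    b = 2.0 * (origin[0]*d[0] + origin[2]*d[2])
    c = origin[0]**2 + origin[2]**2 - radius**2
    disc = b*b - 4.0*a*c
    if disc < 0.0:
        return None                        # ray misses the rod
    t = (-b - np.sqrt(disc)) / (2.0*a)     # nearer root: first surface crossed
    return origin + t*d if t > 0.0 else None

# Hypothetical numbers (units in mm): 2000 px focal length, 640 x 480 image,
# camera 5 mm in front of a rod of 0.25 mm outer radius lying along the y-axis.
K = np.array([[2000.0,    0.0, 320.0],
              [   0.0, 2000.0, 240.0],
              [   0.0,    0.0,   1.0]])
ray = pixel_to_ray(330.0, 245.0, K)
p3d = intersect_ray_cylinder(np.array([0.0, 0.0, -5.0]), ray, 0.25)
```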
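The RTM-SEF algorithm itself is described in the paper; the Python/OpenCV sketch below only approximates its two ingredients with generic stand-ins: template matching restricted to a region of interest for coarse localization, followed by a row-wise edge scan with a least-squares line fit for subpixel refinement. The ROI convention and helper names are assumptions, not the authors' implementation.

```python
import cv2
import numpy as np

def regional_template_match(image, template, roi):
    """Coarse step: match the template only inside roi = (x, y, w, h).
    Returns the best match's top-left corner in full-image coordinates."""
    x, y, w, h = roi
    result = cv2.matchTemplate(image[y:y+h, x:x+w], template,
                               cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    return (x + max_loc[0], y + max_loc[1])

def fit_edge_line(binary_roi):
    """Fine step: scan each row for the first foreground pixel and fit a
    straight line x = m*y + c to the edge points by least squares,
    giving subpixel edge parameters."""
    ys, xs = [], []
    for row in range(binary_roi.shape[0]):
        cols = np.flatnonzero(binary_roi[row])
        if cols.size:
            ys.append(row)
            xs.append(cols[0])
    if len(ys) < 2:
        return None                  # not enough edge points to fit a line
    m, c = np.polyfit(ys, xs, 1)
    return m, c
```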
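The coarse-fine servo can be pictured as a gain-scheduled proportional law: large corrective steps while the image-space error between real and virtual feature points is big, a smaller gain near the target to avoid overshoot, and termination once the error is subpixel. The gains and thresholds below are placeholder values, not the control law given in the paper.

```python
import numpy as np

# Placeholder parameters; the paper's actual gains and thresholds differ.
COARSE_GAIN = 0.8        # proportional gain while the error is large
FINE_GAIN = 0.2          # reduced gain for fine tuning near the target
SWITCH_THRESHOLD = 5.0   # pixels: below this, switch from coarse to fine
STOP_THRESHOLD = 0.5     # pixels: stop once the error is subpixel

def coarse_fine_servo_step(feature_px, target_px):
    """One iteration of a gain-scheduled proportional visual servo.
    Returns the commanded image-space correction, or None when converged."""
    error = np.asarray(target_px, float) - np.asarray(feature_px, float)
    err_norm = np.linalg.norm(error)
    if err_norm < STOP_THRESHOLD:
        return None                               # subpixel alignment reached
    gain = COARSE_GAIN if err_norm > SWITCH_THRESHOLD else FINE_GAIN
    return gain * error   # mapped to stage motion through the calibration
```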

Copyright © 2018 by ASME

Figures

Fig. 1: Ideal optomechanical installation with coordinate systems in the real system and virtual environment

Fig. 2: Virtual environment and its visible region

Fig. 3: Three-dimensional feature points on the mating rod and their raycasts, illustrated in perspective projection (the geometrical size of the mating rod is exaggerated)

Fig. 4: Detecting and tracking by RTM-SEF, illustrated with the CCD-2 image

Fig. 5: Real features for calibration (a binarization operation on the subROI is required)

Fig. 6: Illustration of a pair of virtual and real lines on the image plane

Fig. 7: Calibration process of CCD-1 and CCD-2 in the virtual environment

Fig. 8: Block diagram of the coarse-fine visual servo

Fig. 9: Initial and target positions of feature points (image size is 640 × 480)

Fig. 10: Error response under coarse and fine servo in the tracking operation

Fig. 11: Real image of the mating rod by CCD-1 replaced by the virtual one

Fig. 12: Completed assembly operation
