This article proposes an analytical method for computing the distorted laser sheet of a laser line scanner and compensating for it. A distortion model of the laser sheet generated by a laser line scanner with a large fan angle is derived from physical principles to achieve accurate depth perception even around the sheet's edges, where significant depth estimation errors occur in existing algorithms. For a laser beam incident obliquely on the contact surface of two cylindrical lenses, the curved laser sheet is expressed in terms of the nonzero incident angle using the laws of geometrical optics. From this mathematical model of the distorted laser sheet, the incident angle is estimated through an optimization technique, and the estimate is then used for depth computation. Simulations and experiments show that the proposed distortion model enables a proper compensation scheme, so that the distortion-compensated depth estimation errors are reduced over the entire range of interest, especially around both side edges of the laser sheet. Quantitatively, in comparison to the compensation-free method, the depth estimation error is reduced from 225 to 28 mm on average and from 1781 to 187 mm in worst-case scenarios near the edges of the laser sheet.
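The abstract describes the pipeline only at a high level: fit an unknown incident angle to a parametric model of the curved laser sheet, then subtract the modeled deviation before computing depth. The sketch below illustrates that estimate-then-compensate step under stated assumptions; the quadratic `sheet_deviation` model, the `focal` parameter, and all numbers are hypothetical stand-ins, not the paper's geometrical-optics derivation.

```python
import numpy as np
from scipy.optimize import least_squares

def sheet_deviation(x, theta, focal=50.0):
    """Assumed out-of-plane deviation (mm) of the laser sheet at lateral
    position x (mm) for incident angle theta (rad). A quadratic bow scaled
    by tan(theta) stands in for the paper's closed-form optics model."""
    return np.tan(theta) * (x ** 2) / (2.0 * focal)

rng = np.random.default_rng(0)

# Synthetic "observed" laser-line depths, distorted by an unknown angle.
true_theta = np.deg2rad(3.0)
x = np.linspace(-200.0, 200.0, 81)  # lateral positions across the fan (mm)
observed = sheet_deviation(x, true_theta) + rng.normal(0.0, 0.05, x.size)

# Stage 1: estimate the incident angle by least-squares fitting, as the
# abstract's "optimization technique" suggests.
res = least_squares(lambda th: sheet_deviation(x, th[0]) - observed, x0=[0.0])
theta_hat = res.x[0]

# Stage 2: compensate by subtracting the modeled deviation from the data.
compensated = observed - sheet_deviation(x, theta_hat)

print(f"estimated incident angle: {np.degrees(theta_hat):.3f} deg")
print(f"max residual after compensation: {np.max(np.abs(compensated)):.4f} mm")
```

Because the assumed deviation grows quadratically with lateral position, the largest uncorrected error sits at the sheet's edges, which is consistent with where the abstract reports the biggest improvement.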
               