The Transparent Robot

Wednesday, June 19, 2013

SLAM with imperfect sensors

SLAM from 5 positions joined into a best-belief map. Red shows the particle filter's position-likelihood estimate. Green lines are sonar readings and yellow are IR distance readings. The rectangles and line show the old and current poses. Circles are 10 cm apart.
I have been very sparse with my blogging, but I'm actually getting somewhere with my SLAM attempts. I can now combine sensor readings from multiple locations as the robot moves between poses.

The sensor readings come from IR and ultrasonic distance sensors mounted on a servo, sweeping in steps of 1 degree and producing 170 distance readings per pose. I have a compass, but I'm not using it right now. These sensor readings are used to estimate the new position with a particle filter.
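The particle-filter step above can be sketched as a standard predict/update loop. This is a minimal illustration, not the post's actual code: the odometry model, the Gaussian sensor likelihood, the noise parameters, and the `expected(pose, angle)` map lookup are all assumptions.

```python
import math
import random

def predict(particles, d_move, d_turn, move_noise=1.0, turn_noise=0.05):
    """Propagate each particle (x, y, heading) by the odometry estimate,
    adding Gaussian noise so the cloud spreads with motion uncertainty."""
    out = []
    for x, y, th in particles:
        th2 = th + d_turn + random.gauss(0, turn_noise)
        d = d_move + random.gauss(0, move_noise)
        out.append((x + d * math.cos(th2), y + d * math.sin(th2), th2))
    return out

def update(particles, readings, expected, sigma=5.0):
    """Reweight particles by how well the distance readings match what the
    map predicts from each pose, then resample proportionally to weight.

    readings: list of (angle, measured_distance) pairs from the servo sweep.
    expected: hypothetical map-lookup function expected(pose, angle) giving
              the distance the sensor *should* see from that pose.
    """
    weights = []
    for p in particles:
        w = 1.0
        for angle, dist in readings:
            err = dist - expected(p, angle)
            w *= math.exp(-(err ** 2) / (2 * sigma ** 2))
        weights.append(w)
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    return random.choices(particles, weights=weights, k=len(particles))
```

After each move, `predict` is applied with the odometry estimate and `update` with the new sweep; the surviving particle cloud (the red area in the figure) represents the position belief.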
The robot with its LiPo battery; the IR sensor and ultrasonic sensor sit on a servo at the front.
My biggest problem right now is that the sensors are not just noisy but also inconsistent: the readings vary with the material, angle, and size of the object. I have tried to mitigate this by combining the IR with the ultrasonic sensor.

IR has a narrow beam but is sensitive to reflective, transparent, and luminescent materials, and it behaves badly on striped surfaces. My IR sensor also only works reliably between 20 and 100 cm.

The ultrasonic sensor has a wide beam (15-20 degrees) and is quite precise; it works on most materials I know of but is sensitive to steep angles.
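One simple way to combine the two sensors, given the characteristics above, is a per-reading rule: trust IR only inside its 20-100 cm working range, average when both sensors agree, and fall back to sonar otherwise. This is an illustrative sketch of such a fusion rule, not the post's actual algorithm; the agreement threshold and confidence values are assumptions.

```python
def fuse(ir_cm, sonar_cm, agree_cm=10.0):
    """Fuse one IR and one sonar distance reading (in cm) into a single
    estimate plus a rough confidence in [0, 1]. Either input may be None
    if that sensor returned nothing for this angle."""
    ir_ok = ir_cm is not None and 20.0 <= ir_cm <= 100.0  # IR working range
    sonar_ok = sonar_cm is not None
    if ir_ok and sonar_ok:
        if abs(ir_cm - sonar_cm) <= agree_cm:
            # Both sensors agree: average them, high confidence.
            return (ir_cm + sonar_cm) / 2.0, 0.9
        # Disagreement: the surface may be fooling the IR (reflective,
        # transparent, striped), so prefer sonar at low confidence.
        return sonar_cm, 0.4
    if sonar_ok:
        return sonar_cm, 0.6
    if ir_ok:
        return ir_cm, 0.5
    return None, 0.0
```

Applied to each of the 170 angles in a sweep, this yields fused distances that can feed the particle filter, with the confidence usable as a per-reading weight.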
