As the technology surrounding automation advances, so too do the navigation methods of the humble robot vacuum, and dead reckoning is a key enabling technology. In navigation, dead reckoning is the process of calculating a position estimate from a known starting location and internal estimates of speed and heading over time, without any external references. Typical sensors used for dead reckoning in robotics include wheel encoders, which estimate velocity from wheel rotation; optical flow sensors (like the sensor in your computer mouse), which estimate velocity from observed patterns on the floor; and IMUs, which measure heading and acceleration. The dead reckoning estimate is calculated by combining the information from these sensors.
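The core idea can be shown in a few lines. This is a minimal sketch of one dead-reckoning update, with names and units of my own choosing; real firmware would run this loop at a much higher rate and feed it fused sensor estimates rather than raw values:

```python
import math

def dead_reckon_step(x, y, heading, speed, yaw_rate, dt):
    """Advance a 2D pose estimate one time step using only internal
    estimates of speed and heading rate -- no external references.
    Units: meters, radians, seconds."""
    heading += yaw_rate * dt             # integrate heading change
    x += speed * math.cos(heading) * dt  # project forward motion onto x
    y += speed * math.sin(heading) * dt  # ...and onto y
    return x, y, heading

# Drive straight along +x at 0.2 m/s for five 1-second steps
pose = (0.0, 0.0, 0.0)
for _ in range(5):
    pose = dead_reckon_step(*pose, speed=0.2, yaw_rate=0.0, dt=1.0)
# pose is now roughly (1.0, 0.0, 0.0): one meter travelled
```

Any error in the speed or yaw-rate inputs is integrated along with the motion, which is exactly why the estimate drifts over time.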
These dead reckoning algorithms are being implemented in robots that are slowly overtaking the well-known random walk (robots that move in seemingly random patterns across the floor). These robots use an intelligent walk – a more advanced cleaning method that relies on dead reckoning to follow a calculated pattern. Dead reckoning is their main source of navigation information, and it is what allows these robots to clean surfaces more effectively while saving battery life. For consumers, it saves time through efficient cleaning; for OEMs, it saves BOM cost by removing the need for LIDAR or an expensive camera (as in a VSLAM system).
Even in more advanced VSLAM and LIDAR systems, dead reckoning plays a key role. VSLAM-based solutions rely on a camera to calculate the robot’s position. The camera is often angled forward and up so it can find edges and objects against which to localize. However, if the robot moves into a low-light area (under a bed or sofa, for example), it loses those helpful visual cues and can get lost. The same happens in a featureless room (think bare white walls). LIDAR systems lose information when the robot crosses a threshold or uneven terrain, which tilts the sensor and distorts its view of the world. Dead reckoning fills in the positioning gaps during these crucial moments and keeps the robot on course.
The Purpose (and Challenges) of Each Sensor
Dead reckoning seems like it can solve a lot of problems in automated navigation, and it can. But it does not come without its own difficulties. Over time, the position calculated via dead reckoning will drift from the true location due to internal estimation errors (in speed and heading). Each sensor has its own situations that affect the accuracy of its output.
Wheel encoders track the wheels’ rotations (how far or how fast each wheel has moved around its axle), which translates directly into linear displacement. Well, it would in an ideal world. In the real world, wheels can slip or skid on soft surfaces and flooring transitions. During these slips and skids, the wheels move farther or less than the robot itself. Readings from the wheel encoder faithfully describe the wheels’ behavior, but not the robot’s displacement.
(Source: CEVA)
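The ideal-world conversion from encoder ticks to displacement is simple geometry. The resolution and wheel size below are hypothetical values for illustration:

```python
import math

TICKS_PER_REV = 360       # hypothetical encoder resolution
WHEEL_DIAMETER_M = 0.07   # hypothetical 7 cm vacuum wheel

def ticks_to_distance(ticks):
    """Encoder ticks -> linear distance, assuming the wheel rolls
    without slipping (the assumption that slips and skids break)."""
    revolutions = ticks / TICKS_PER_REV
    return revolutions * math.pi * WHEEL_DIAMETER_M

# One full revolution equals one wheel circumference (~0.22 m)
d = ticks_to_distance(TICKS_PER_REV)
```

When the wheel slips, this formula still reports a full circumference of travel even though the robot barely moved, which is the failure mode described above.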
Optical flow sensors are the same kind of sensor you find in your computer mouse. Using either an LED or a laser, an optical flow sensor looks for relative changes in its images: if a subset of pixels moves (or flows) together, that indicates how the sensor as a whole has moved. Your computer mouse is almost always used on a flat, smooth surface, which keeps its conditions and output consistent. A robot moving over different floors, however, produces different readings depending on the height between the floor and the sensor, and on the sensor type: more textured floors work better with an LED, flatter floors with a laser. To get the best information from an optical flow sensor, calibrating it to adapt to floor height and flooring type is essential.
(Source: CEVA)
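One way to picture the calibration dependency: the sensor reports raw flow counts, and a floor-dependent scale factor converts them into ground speed. The function and constant below are illustrative, not any particular sensor's API:

```python
def flow_to_velocity(flow_counts, dt, counts_per_meter):
    """Convert raw optical-flow counts over interval dt into ground
    speed. counts_per_meter is a calibration constant that depends on
    the sensor's height above the floor and on the flooring type, so
    it must be re-estimated whenever either changes."""
    return flow_counts / (counts_per_meter * dt)

# With a (hypothetical) calibration of 5000 counts/m:
v = flow_to_velocity(100, 0.1, 5000)   # 100 counts in 0.1 s -> 0.2 m/s
```

If the robot drives from tile onto carpet and `counts_per_meter` is not updated, every speed estimate from this sensor is scaled by the wrong factor.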
Inertial Measurement Units (IMUs) are sensors designed to live up to their namesake. They measure acceleration and angular velocity, which can be turned into heading and tilt information. As I mentioned in a previous post, heading accuracy is crucial for robots to figure out where they are going. Tilt information helps keep the robot from climbing too high up a wall or chair and getting stuck. IMUs have their own challenges in sensor consistency: they are affected by temperature and require calibration to use correctly.
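Heading comes from integrating the gyro's angular rate, minus an estimated bias. A minimal sketch, assuming a z-axis gyro and a bias term that must be recalibrated as temperature changes (the names here are my own):

```python
import math

def integrate_heading(heading, gyro_z, gyro_bias, dt):
    """Integrate a bias-corrected gyro rate into heading and wrap the
    result to [-pi, pi]. The bias drifts with temperature, which is
    why IMUs need calibration (e.g. re-zeroing while stationary)."""
    heading += (gyro_z - gyro_bias) * dt
    return math.atan2(math.sin(heading), math.cos(heading))

# A quarter turn: 90 deg/s for 1 s, with a small known bias removed
h = integrate_heading(0.0, math.radians(90) + 0.01, 0.01, 1.0)
```

Leave the bias uncorrected and the heading rotates on its own even while the robot sits still, dragging the whole dead-reckoned path with it.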
Putting it All Together
To create a precise dead reckoning algorithm, each sensor has to be calibrated to provide accurate information. This is a task in itself, and fusing these sensors adds a further layer of complexity. By comparing the wheel encoder readings to the IMU and optical flow sensors, slips and skids can be detected and the bad readings disregarded. Similarly, optical flow can be calibrated dynamically by comparing its flow to the wheel encoders’ expected linear motion; using the IMU to check for a consistently smooth surface increases confidence in that measurement. Together with constant monitoring of temperature relative to performance, this keeps the sensors in check. The value of sensor fusion is knowing which sensor’s information to trust most at any given moment, to produce the best possible estimate. As you can see, there are a lot of complexities to this dead reckoning process, beyond the ones I have mentioned here.
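The trust decision described above can be sketched as a simple gate. The threshold and names are illustrative, and a production system would typically blend estimates in a proper filter (e.g. a Kalman filter) rather than hard-switching between sensors:

```python
def fuse_speed(encoder_v, flow_v, surface_smooth, slip_threshold=0.05):
    """Choose which speed estimate to trust this cycle.
    If encoder and optical-flow speeds disagree by more than
    slip_threshold (m/s), assume the wheels slipped or skidded and
    fall back to the flow reading -- but only when the IMU reports a
    consistently smooth surface (surface_smooth), which is what gives
    confidence in the optical-flow measurement."""
    if surface_smooth and abs(encoder_v - flow_v) > slip_threshold:
        return flow_v      # wheels slipped: trust the floor-facing sensor
    return encoder_v       # otherwise the encoder is the cleaner signal
```

For example, `fuse_speed(0.30, 0.20, True)` flags a slip and returns the flow speed, while `fuse_speed(0.21, 0.20, True)` keeps the encoder reading.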
Dead reckoning is a powerful tool for any terrestrial robot, whether in lieu of, or in addition to, advanced VSLAM or LIDAR systems. It can reduce cleaning times with intelligent walk, and it adds robustness to complex SLAM algorithms in difficult situations. Thoughtful fusion of wheel encoders, optical flow sensors, and IMUs is what makes dead reckoning possible. And although it may be a surprise to learn that a seemingly simple pattern on the floor is the result of something so complex, the smooth result you see comes from a lot of hard work with sensors and sensor fusion. In this case, at least, the whole is greater than the sum of its parts.