Replacing HC-SR04 Ultrasonic Sensors with VL53L0X Arrays Part II

Posted 16 June 2020

In my previous post on this subject, I described my efforts to replace the ultrasonic ‘ping’ distance sensors with modules built around the STMicroelectronics VL53L0X ‘Time of Flight’ LIDAR distance sensor, to improve Wall_E2’s (my autonomous wall-following robot) ability to track a nearby wall at a specified standoff distance.

At the conclusion of my last post, I had determined that a three-element linear array of VL53L0X sensors controlled by a Teensy 3.5 was effective in achieving and tracking parallel orientation to a nearby wall. This should allow the robot to initially orient itself parallel to the target wall, and then capture and maintain the desired offset distance.

This post describes follow-on efforts to verify that the Arduino Mega 2560 robot controller can acquire distance and steering information from the Teensy 3.5 sensor controller via its I2C bus. Currently the robot system communicates with four devices via I2C – a DFRobot MPU6050 6DOF IMU, an Adafruit DS3231 RTC, an Adafruit FRAM, and the Teensy 3.2 controller for the IR Homing module for autonomous charging. The idea here is to have a Teensy 3.5 manage two three-element linear arrays of VL53L0X modules on its secondary I2C bus, while acting as a slave to the Mega system controller on the main I2C bus. The Mega would see the Teensy 3.5 as just a fifth slave device on its I2C bus, and the Teensy 3.5 would handle all the interactions with the VL53L0X sensors. As an added benefit, the Teensy 3.5 can calculate the steering value for tracking, as needed.

The first step in this process was to verify that the Mega could communicate with the Teensy 3.5 over the main I2C bus, while the Teensy 3.5 communicated with the sensor array(s) via its secondary I2C bus.  To do this I created two programs – an Arduino Mega program to simulate the main robot controller, and a Teensy 3.5 program to act as the sensor controller (the Teensy program will eventually be the final sensor controller program).

Here’s the Mega 2560 simulator program:
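The gist of the master side, as a minimal sketch (the 0x20 slave address and the 10-byte payload layout here are assumptions, not necessarily the actual values):

// Mega 2560 I2C master sketch – illustrative only.
// Assumes the Teensy responds at slave address 0x20 with three little-endian
// int16 distances (front/center/rear, mm) followed by a 4-byte float steering value.
#include <Wire.h>

const uint8_t TEENSY_ADDR = 0x20;  // assumed slave address

void setup()
{
  Serial.begin(115200);
  Wire.begin();  // join the main I2C bus as master
}

void loop()
{
  Wire.requestFrom(TEENSY_ADDR, (uint8_t)10);  // 3 x int16 + 1 x float
  if (Wire.available() >= 10)
  {
    byte buf[10];
    for (int i = 0; i < 10; i++) buf[i] = Wire.read();

    int16_t front, center, rear;
    float steer;
    memcpy(&front,  buf,     2);
    memcpy(&center, buf + 2, 2);
    memcpy(&rear,   buf + 4, 2);
    memcpy(&steer,  buf + 6, 4);

    Serial.print(F("F/C/R (mm): "));
    Serial.print(front);  Serial.print('/');
    Serial.print(center); Serial.print('/');
    Serial.print(rear);
    Serial.print(F("  Steer: ")); Serial.println(steer, 3);
  }
  delay(200);
}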

And here’s the Teensy 3.5 program:
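A minimal sketch of the slave side (the Adafruit_VL53L0X library is assumed here, along with sensor addresses 0x30-0x32; the XSHUT power-sequencing needed to assign those addresses at startup is omitted):

// Teensy 3.5 sensor-controller sketch – illustrative only.
// Primary bus (Wire): I2C slave to the Mega at the assumed address 0x20.
// Secondary bus (Wire1): I2C master to three VL53L0X sensors.
#include <Wire.h>
#include <Adafruit_VL53L0X.h>

const uint8_t SLAVE_ADDR = 0x20;   // assumed
Adafruit_VL53L0X frontLidar, centerLidar, rearLidar;

volatile int16_t dist[3];          // front, center, rear (mm)
volatile float steerVal;

int16_t readMM(Adafruit_VL53L0X &lox)
{
  VL53L0X_RangingMeasurementData_t m;
  lox.rangingTest(&m);
  return (m.RangeStatus != 4) ? m.RangeMilliMeter : 9999;  // 4 = out of range
}

void requestEvent()  // the Mega asked for data
{
  int16_t d[3] = { dist[0], dist[1], dist[2] };
  float s = steerVal;
  Wire.write((byte*)d, 6);   // three int16 distances
  Wire.write((byte*)&s, 4);  // float steering value
}

void setup()
{
  Wire.begin(SLAVE_ADDR);    // slave on the primary bus
  Wire.onRequest(requestEvent);
  Wire1.begin();             // master on the secondary bus
  frontLidar.begin(0x30, false, &Wire1);   // assumed sensor addresses
  centerLidar.begin(0x31, false, &Wire1);
  rearLidar.begin(0x32, false, &Wire1);
}

void loop()
{
  dist[0] = readMM(frontLidar);
  dist[1] = readMM(centerLidar);
  dist[2] = readMM(rearLidar);
  steerVal = (dist[0] - dist[2]) / (float)dist[1];  // (F-R)/C
  delay(50);
}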

Here’s the hardware layout for the first test:

Arduino Mega system controller in the foreground, Teensy 3.5 sensor array controller in the middle, and a single 3-element sensor array in the background

The next step was to mount everything on the robot’s second deck, and verify that the Teensy 3.5 sensor array controller can talk to the robot’s Mega controller via the robot’s I2C bus. Here’s the hardware layout:

Right-side three sensor array mounted on robot second deck. Teensy 3.5 sensor array controller shown at middle foreground.

After mounting the array and array controller on the robot’s second deck, I connected the Teensy to robot +5V, GND and the I2C bus, and loaded the VL53L0X_Master.ino sketch into the robot’s Mega system controller. With this setup I was able to demonstrate much improved parallel orientation detection. The new arrangement is much more precise and straightforward than the previous algorithm using the ‘ping’ sensors. Instead of having to make several turns toward and away from the near wall looking for a distance inflection point, I can now determine immediately from the sign of the steering value which way to turn, and can detect the parallel condition when the steering value changes sign.

I think I may even be able to use this new ‘super-power’ to simplify the initial ‘offset capture’ phase, as well as the offset tracking phase.  In earlier work I demonstrated that I can use a PID engine to drive a stepper motor to keep the sensor array parallel to a moving target ‘wall’, so I’m confident I can use the same technique to drive the robot’s motors to maintain a parallel condition with respect to a nearby wall, by driving the steering value toward zero.  In addition, I should be able to set the PID ‘setpoint’ to a non-zero value and cause the robot to assume a stable oblique orientation with respect to the wall, meaning I should be able to have the robot drive at a stable oblique angle toward or away from the wall to capture the desired offset.

19 June 2020 Update:

As a preliminary step to using a PID engine to drive the ‘RotateToParallelOrientation’ maneuver, I modified the robot code to report the front, center, and rear distances from the VL53L0X array, along with the calculated steering value computed as (F-R)/C.  Then I recorded the distance and steering value vs relative pointing angle for 20, 30, and 40 cm offsets from my target ‘wall’.
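To make the sign convention concrete, here’s a made-up example (the numbers are illustrative only): with Front = 210 mm, Rear = 190 mm, and Center = 200 mm, the steering value is (210 - 190)/200 = +0.10; swing the robot’s nose the other way past parallel and the front/rear difference, and therefore the sign, reverses.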

Here’s the physical setup for the experiment (only the setup for the 20 cm offset is shown – the others are identical except for the offset).

Experiment setup for 20 cm offset from target wall

Then I recorded the three distances and steering value for -40° to +40° relative pointing angles into an Excel workbook and created the plots shown below:

Array distances and steering value for 20 cm offset. Note steering value zero is very close to parallel orientation

Array distances and steering value for 30 cm offset. Note steering value zero is very close to parallel orientation

Array distances and steering value for 40 cm offset. Note steering value zero is very close to parallel orientation

The front, center, and rear distance and steering value plots all look very similar from 20-40 cm offset, as one would expect.  Here’s a combined distance plot of all three offset values.

Combined distance plots for all three offsets. Note that the ‘flyback’ lines are an artifact of the plotting arrangement

In the above chart, it is clear that the curves for each offset are very similar, so an algorithm that works at one offset should also work for all offsets between 20 and 40 cm.  The next chart shows the steering values for all three offsets on the same plot.

Steering values vs relative pointing angle for all three offset distances

As might be expected from the way in which the steering value is computed, i.e. (Front-Rear)/Center, the value range decreases significantly as the offset distance increases. At 20 cm the range is from -0.35 to +0.25, but at 40 cm it is only -0.18 to +0.15. So, whatever algorithm I come up with for achieving parallel orientation must work with this smallest range in order to be effective at all anticipated starting offsets.

To try out my new VL53L0X super-powers, I re-wrote the ‘RotateToParallelOrientation’ subroutine that attempts to rotate the robot so it is parallel to the nearest wall.  The modification uses the PID engine to drive the calculated steering value from the VL53L0X sensor array to zero.  After verifying that this works (quite well, actually!) I decided to modify it some more to see if I could also use the PID engine and the VL53L0X steering value to track the wall once parallel orientation was obtained.  So I set up a two-stage process; the parallel orientation stage uses a PID with Kp = 800 (Ki = Kd = 0), and the tracking stage uses the same PID but with Kp set to half the parallel search value.  This worked amazingly well – much better than anything I have been able to achieve to date.
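In outline, the two-stage scheme looks like this (a sketch built on the standard Arduino PID_v1 library, using the Kp = 800 and Kp = 400 gains from above; the sensor and motor helper functions are placeholders, not the robot’s actual code):

// Two-stage 'find parallel, then track' sketch using PID_v1.
#include <PID_v1.h>

double steerIn, pidOut, setpoint = 0;   // drive the steering value to zero
PID steerPID(&steerIn, &pidOut, &setpoint, 800, 0, 0, DIRECT);

// --- placeholders: substitute the robot's real functions here ---
double getSteeringValue()             { return 0; }  // (F-R)/C from the Teensy
void   rotateInPlace(double spd)      { /* +/- wheel speeds */ }
void   driveForwardWithTrim(double t) { /* base speed +/- correction */ }

void setup()
{
  steerPID.SetOutputLimits(-255, 255);  // motor speed range
  steerPID.SetMode(AUTOMATIC);

  // Stage 1: rotate in place (Kp = 800) until the steering value
  // passes through zero, i.e. the parallel condition
  steerPID.SetTunings(800, 0, 0);
  do
  {
    steerIn = getSteeringValue();
    steerPID.Compute();
    rotateInPlace(pidOut);
  } while (fabs(steerIn) > 0.01);       // 'near enough to parallel'

  // Stage 2 uses half the gain
  steerPID.SetTunings(400, 0, 0);
}

void loop()
{
  // Stage 2: drive forward, trimming heading to keep steerIn near zero
  steerIn = getSteeringValue();
  steerPID.Compute();
  driveForwardWithTrim(pidOut);
}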

The following short video is composed of two separate ‘takes’. The first shows a ‘parallel orientation’ find operation with Kp = 800. The second shows the effect of combining the ‘parallel find’ operation (Kp = 800) with a short segment of wall tracking (Kp = 400). As can be seen in the video, the tracking portion looks for all the world as if no corrections were being made at all – it’s that smooth! I actually had to look through the telemetry data to verify that the tracking PID was making corrections to keep the steering value near zero. Here’s the video:

and here’s the output from the two-step ‘find parallel and track’ portion of the video

From the telemetry it can be seen that the ‘find parallel’ stage leaves the robot oriented parallel to the wall at about 360 mm. Then, in the ‘track’ stage, the ‘Steer’ value is held to within a few hundredths (0.01, to 0.07 right at the end), and the offset distance stays practically constant at 361 to 369 mm – wow!

The results of the above experiments show without a doubt that the three-element VL53L0X linear array is quite accurate for parallel orientation and wall tracking operations – much better than anything I’ve been able to do with the ‘ping’ sensors.

The last step in this process has yet to be accomplished – showing that the setup can be used to capture a desired offset after the parallel-find operation but before the wall-tracking stage. Based on the work to date, I think this will be straightforward, but…

21 June 2020 Update:

I wasn’t happy with the small range of values resulting from the above steering value computation, especially the way the value range decreased for larger offsets.  So, I went back to my original data in Excel and re-plotted the data using (Front-Rear)/100 instead of (Front-Rear)/Center.  This produced a much nicer set of curves, as shown in the following plot

Steering values vs relative pointing angle using (Front-Rear)/100 instead of (Front-Rear)/Center
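In code the change is a one-liner (variable names assumed):

// old: (Front-Rear)/Center – value range shrinks as the offset (and Center) grows
steerVal = (front - rear) / (float)center;
// new: (Front-Rear)/100 – fixed divisor keeps the value range the same at all offsets
steerVal = (front - rear) / 100.0f;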

Using the above curve, I modified the program to demonstrate basic wall offset capture, as shown in the following short video

The video demonstrates a three-stage wall offset capture algorithm, with delays inserted between the stages for debugging purposes. In the first stage, a PID engine is used with a very high Kp value to rotate the robot until the steering value from the VL53L0X array is near zero, denoting the parallel condition. After a 5-second delay, the PID engine Kp value is reduced to about 1/4 the ‘parallel rotate’ value, and the PID setpoint is changed to maintain a steering value that produces an approximately 20° ‘cut’ toward the desired wall offset value. In the final stage, the robot is rotated back to parallel and stopped. In the above demonstration, the robot started out oriented about 30-40° toward the wall, about 60-70 cm away. After the initial parallel rotation, the robot is located about 50 cm from the wall. In the offset capture stage, the robot moves slowly until the center VL53L0X reports about 36 cm, and then the robot rotates to parallel again and stops the motors. At this point the robot is parallel to the wall at an offset of approximately 28 cm – not quite the desired 30 cm, but pretty close!
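Stripped of the debugging delays, the capture sequence amounts to a small three-stage state machine. Continuing the PID sketch from above (the Kp = 200 value follows from ‘about 1/4 the parallel rotate value’; the 20° ‘cut’ setpoint constant and the new helpers are illustrative assumptions):

// --- additional placeholders for this stage ---
double getCenterDistanceMM() { return 0; }   // center VL53L0X reading
void   stopMotors()          { }

const double STEER_VAL_FOR_20DEG_CUT = 0.2;  // illustrative value only

void captureWallOffset(double targetOffsetMM)
{
  // Stage 1: high-gain rotate until the steering value is near zero
  steerPID.SetTunings(800, 0, 0);
  setpoint = 0;
  do {
    steerIn = getSteeringValue();
    steerPID.Compute();
    rotateInPlace(pidOut);
  } while (fabs(steerIn) > 0.01);

  // Stage 2: ~1/4 the rotate gain; offset the setpoint so the robot holds
  // a steady ~20-degree cut toward the wall until the center sensor
  // reaches the target offset
  steerPID.SetTunings(200, 0, 0);
  setpoint = STEER_VAL_FOR_20DEG_CUT;
  while (getCenterDistanceMM() > targetOffsetMM)
  {
    steerIn = getSteeringValue();
    steerPID.Compute();
    driveForwardWithTrim(pidOut);
  }

  // Stage 3: rotate back to parallel, then stop
  steerPID.SetTunings(800, 0, 0);
  setpoint = 0;
  do {
    steerIn = getSteeringValue();
    steerPID.Compute();
    rotateInPlace(pidOut);
  } while (fabs(steerIn) > 0.01);
  stopMotors();
}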

The next step will be to eliminate the inter-stage pauses, and instead of stopping after the third (rotate to parallel) stage, the PID engine will be used to track the resulting offset by driving the VL53L0X array steering value to zero.

23 June 2020 Update:

After nailing down the initial parallel-find, offset-capture, and return-to-parallel steps, I had some difficulty getting the wall-tracking portion to work well. Initially I was trying to track the wall offset using the measured distance after the return-to-parallel step, and this was blowing up pretty much regardless of the PID Kp setting. After thinking about this for a while, I realized I had fallen back into the same trap I was trying to escape by going to an array of sensors rather than just one – when the robot rotates due to a tracking correction, a single distance measurement will always cause problems.

So, I went back to what I knew really worked – using all three sensor measurements to form a ‘steering value’ as follows:

SteeringValue = (Front - Rear) + Center

To complete the setup, the PID setpoint is made equal to the measured center sensor distance at the conclusion of the ‘return-to-parallel’ step. When the robot is parallel to the wall, Front - Rear = 0, so the steering value is just the center distance, as desired, and the PID output will drive the motors to maintain the offset setpoint. When the robot rotates, the front and rear sensors develop a difference which either adds to or subtracts from the center reading, in a way that forms a reliable ‘steering value’ for the PID.
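In terms of the earlier sketches, the tracking stage then becomes something like this (the front/rear distance helpers are further placeholders; the (5,0,0) tuning is the one that works best in the videos below):

// --- additional placeholders ---
double getFrontDistanceMM() { return 0; }
double getRearDistanceMM()  { return 0; }

void trackWallOffset()
{
  // Setpoint = center distance at the end of return-to-parallel, so the
  // PID simultaneously holds the offset and damps rotation.
  setpoint = getCenterDistanceMM();
  steerPID.SetTunings(5, 0, 0);
  while (true)
  {
    // combined input: rotation term (F-R) plus offset term (Center)
    steerIn = (getFrontDistanceMM() - getRearDistanceMM()) + getCenterDistanceMM();
    steerPID.Compute();
    driveForwardWithTrim(pidOut);
  }
}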

Here are a couple of short videos demonstrating all four steps: parallel-find, approach-and-capture, return-to-parallel, and wall-tracking. In the first video, the PID is set to (5,0,0) and it tracks pretty nicely. In the second video, I tried a PID set to (25,0,0), and this didn’t seem to work as well.

At this point I’m pretty satisfied with the way the 3-element VL53L0X ToF sensor array is working as a replacement for the single ultrasonic ‘ping’ sensor.  The robot now has the capability to capture and track a specific offset from a nearby wall – just what I started out to do lo these many months ago.

25 June 2020 Update:

Here are a couple of short videos in my ‘outdoor’ (AKA entry hallway) range. The first video shows the response using a PID of (25,0,0) while the second one shows the same thing but with a PID value of (5,0,0).

The following Excel plot shows the steering value (in this case, just Fdist – Rdist), the corresponding PID response, and the center sensor distance measurement

I Googled around a bit and found some information on PID tuning, including this blog post.  I tried the recommended heuristic method, and wound up with a PID tuning set of (10,0,1), resulting in the following Excel plot and video

Stay tuned!

Frank

