
Adafruit DS3231 Module vs generic ZS-042 Module

Posted 30 October 2020

Back in May of 2018, well over 2 years ago, I posted about adding an Adafruit DS3231 RTC module to Wall-E2, my autonomous wall-following robot project. This addition went swimmingly until about 6 months later, in September of 2018, when I posted to the Adafruit support forum saying that I was having trouble with the ‘lostPower()’ function return values; it seemed to be returning FALSE (no power loss) even though I had removed the battery and turned off power to the system. As described in the post, I eventually gave up on this in February of 2019, after discovering that I was getting radically different results when I used a different Arduino Mega and two different Adafruit DS3231 modules. Eventually I wound up in the situation where both DS3231 modules appeared to work correctly no matter what I did – strange!

Fast-forward to the present. In the process of adding a rear distance sensor to Wall-E2, I once again ran across the same anomalous behavior from the Adafruit DS3231 RTC module; the ‘lostPower()’ function stubbornly refused to declare a loss of power, even with the battery removed and the main power turned off. After a lot more investigation, including a dedicated test program and some more back-and-forth on the Adafruit forum, neither I nor the Adafruit support guys were able to resolve the issue.

In desperation, I fished a generic ‘ZS-042’ DS3231 RTC module out of my parts bin and started working with it, thinking maybe I could use it to get a clue why the Adafruit modules were failing. As it turned out, the ZS-042 module worked perfectly from the get-go with the Adafruit RTC library, and the ‘lostPower()’ function correctly returned TRUE when main power was lost with the battery removed, and FALSE when power was lost but the battery was in place.
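For reference, here is a minimal test sketch along the lines of the one I used, based on the Adafruit RTClib (the serial messages are just placeholders):

#include <Wire.h>
#include <RTClib.h> // Adafruit RTClib

RTC_DS3231 rtc;

void setup()
{
  Serial.begin(115200);
  while (!Serial) {} // wait for the serial monitor

  if (!rtc.begin())
  {
    Serial.println(F("Couldn't find a DS3231 on the I2C bus"));
    while (true) {} // halt
  }

  // lostPower() reads the DS3231 oscillator-stop flag (OSF). It should return
  // TRUE if both main power and the battery were lost, and FALSE if the
  // battery kept the oscillator running through the outage.
  if (rtc.lostPower())
  {
    Serial.println(F("RTC lost power - setting date/time to compile time"));
    rtc.adjust(DateTime(F(__DATE__), F(__TIME__))); // also clears the OSF flag
  }
  else
  {
    Serial.println(F("RTC kept time - no adjustment needed"));
  }
}

void loop() {}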

Here are some photos of the Adafruit and ZS-042 modules:

As can be readily seen, the ZS-042 module is considerably larger, due almost entirely to the decision to use the LIR2032 Li-ion rechargeable cell instead of the smaller non-rechargeable CR1220 type. Other differences:

  • The ZS-042 module includes a power LED. This LED illuminates when main power is available on the VCC pin, but not when the RTC module is running from the battery.
  • The Adafruit module exposes the RST (reset) line. If you need this, the ZS-042 won’t work for you.
  • When used with the supplied LIR2032, the battery is recharged and/or float-charged from VCC through a 1N4148 diode. This works fine if VCC is 5V, but doesn’t work at all if VCC is 3.3V.
  • The 32KHz output is open-drain on both modules, but only the ZS-042 module has an onboard pullup to VCC; the Adafruit module has none. What this means in practice is that you can’t easily monitor this output when operating off the battery, so it is hard to tell if the RTC module is still running. My solution was to attach a completely separate power supply to the 32KHz output via a 10K pullup resistor. The Adafruit module needs this external pullup to show the 32KHz output on both battery power and main power; the ZS-042 module only needs it on battery power.
Adafruit module with temporary 10K pullup resistor installed. Note clock scope trace in background
ZS-042 module with main power applied to USB connector. 32KHz output is present even without an external pullup
Same setup but with USB connector removed. Now need a 10K external pullup to an external supply to monitor 32KHz clock
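As an aside, recent versions of the Adafruit RTClib expose the 32KHz output control directly, which makes this kind of scope-probing easier to script. A minimal snippet, assuming the pullup arrangement described above is in place:

#include <Wire.h>
#include <RTClib.h> // Adafruit RTClib; enable32K() requires a recent version

RTC_DS3231 rtc;

void setup()
{
  Serial.begin(115200);
  rtc.begin();

  // The DS3231 32KHz pin is open-drain, so without a pullup (onboard on the
  // ZS-042 when main power is present, external 10K otherwise) there is
  // nothing to see on the scope
  rtc.enable32K();
  Serial.println(rtc.isEnabled32K() ? F("32KHz output enabled") : F("32KHz output disabled"));
}

void loop() {}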

So, there you have it. The Adafruit module is smaller, has an additional output (RST), and uses a smaller, non-rechargeable CR1220 button cell. However, in my testing and use over a two-year period, I came to distrust its ability to reliably detect and report complete power-loss situations that would require a forced date/time update.

The ZS-042 module is significantly larger due to its use of the rechargeable Li-ion LIR2032 button cell, and doesn’t have the RST output. It is also considerably cheaper and widely available. Lastly, it appears to more reliably report complete power loss occurrences, allowing proper date/time updates.

For my money, I have replaced the Adafruit DS3231 module in my system with the ZS-042 module. In practice, complete RTC power failure events are very rare, so in all probability there would be no appreciable difference between the two choices. However, for those applications (like mine) where you really do want to know if the RTC loses its sense of time, I don’t feel comfortable with the Adafruit module.

If anyone has a better understanding of the Adafruit module, please feel free to comment.

30 October 2020 Update

I replaced the Adafruit DS3231 RTC module on my Wall-E2 autonomous wall-following robot with the ZS-042 DS3231 RTC module. As shown in the following photos, I had to re-arrange the I2C FRAM and I2C MPU6050 IMU modules in order to make room for the significantly larger ZS-042 module.

Original layout. Adafruit RTC module on left, MPU6050 IMU in center, FRAM on right
Straight replacement not going to work – oops!
After re-arrangement

Stay tuned,

Frank

Adding a VL53L0X Rear Distance Sensor to Wall-E2

Posted 24 October 2020

After documenting left-side wall-tracking success with Wall-E2, my autonomous wall-tracking robot (see this post and this post), I started thinking about improving Wall-E2’s obstacle avoidance performance.

Wall-E2 can encounter several distinct obstacle situations during wall tracking operations. In the simplest case, Wall-E2 approaches an upcoming corner while tracking a wall, and needs to know how to transition from tracking the current wall to tracking the upcoming wall. A more difficult situation arises when Wall-E2 is ‘stuck’ – prevented from moving forward by an obstacle that isn’t detected by its front LIDAR distance sensor; a shoe, or the curved foot of a coat rack. A third situation arises when Wall-E2 encounters an obstacle that just wasn’t there a second ago; a cat or a human foot or a bag of groceries.

In the simple wall-to-wall transition case, all Wall-E2 has to do is make a right-angle turn away from the current wall and start following the next wall; this was successfully demonstrated several times in the previous posts. This maneuver utilizes a ‘spin-turn’ technique intended to minimize the backward movement of the robot while turning. This is done to prevent Wall-E2 from backing into the currently-tracked wall while attempting to turn toward and track the upcoming wall. Unfortunately, this maneuver is not always successful; when it fails, Wall-E2 tries to climb backwards up the current wall, often with disastrous results.

In the ‘stuck’ case, Wall-E2 has to first recognize that it is no longer moving forward (or in any other direction for that matter), and then figure out what to do about it. Detection is accomplished by looking at the variance of front distance measurements over time; the ‘stuck’ condition is declared when the front-distance variance falls below a pre-determined value. A typical ‘stuck’ recovery maneuver is to back up slightly, and then make a right-angle turn away from the wall currently being tracked. This maneuver, while usually successful, has the same problem as the simple wall-to-wall transition; it sometimes results in the same backward-up-the-wall climb, with similar results.
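For concreteness, here’s a sketch of how such a variance-based ‘stuck’ detector might look; the window size and threshold below are illustrative stand-ins, not Wall-E2’s actual values:

// Declare 'stuck' when the variance of the last N front-distance readings
// falls below a threshold. Window size and threshold are hypothetical.
const int STUCK_WINDOW_SIZE = 25;            // ~5 sec of readings at 5Hz
const float STUCK_VARIANCE_THRESHOLD = 4.0f; // cm^2

float frontDistBuf[STUCK_WINDOW_SIZE];
int bufIdx = 0;
int bufCount = 0;

bool IsStuck(float newFrontDistCm)
{
  frontDistBuf[bufIdx] = newFrontDistCm;
  bufIdx = (bufIdx + 1) % STUCK_WINDOW_SIZE;
  if (bufCount < STUCK_WINDOW_SIZE)
  {
    bufCount++; // don't declare 'stuck' until the window fills
    return false;
  }

  float mean = 0;
  for (int i = 0; i < STUCK_WINDOW_SIZE; i++) mean += frontDistBuf[i];
  mean /= STUCK_WINDOW_SIZE;

  float var = 0;
  for (int i = 0; i < STUCK_WINDOW_SIZE; i++)
  {
    float dev = frontDistBuf[i] - mean;
    var += dev * dev;
  }
  var /= STUCK_WINDOW_SIZE;

  return var < STUCK_VARIANCE_THRESHOLD; // low variance ==> not actually moving
}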

The ‘suddenly appearing obstacle’ case can be handled in a manner similar to ‘stuck’ detection, but bypassing the variance-measurement stage; the resulting avoidance maneuver is also similar to the ‘stuck’ case.

Wall-E2 currently handles all of the above cases fairly well, except when it backs into something while maneuvering to avoid the detected obstacle. So, my challenge was to find a way to avoid running into something while backing up from something else. The easy answer to this problem was to add a rear-distance sensor to Wall-E2, and then use that information to modify the obstacle-avoidance behavior as necessary.

During the changeover from ‘ping’ style distance sensors to left and right 3-element arrays of VL53L0X time-of-flight sensors I learned quite a bit about the care and feeding of the VL53L0X, and also wound up with quite a few spares. So, I took one of the spares and installed it on the rear ‘bumper’ plate on Wall-E2, as shown in the following photo:

GY-530 VL53L0X mounted on rear ‘bumper’

Since the 2nd-deck Teensy 3.5 was already handling both 3-element VL53L0X arrays, I simply added the rear sensor to the left-hand array ‘Wire2’ daisy-chain, and connected its XSHUT pin to Teensy pin 8. Then I modified the Teensy’s program to initialize and poll the rear sensor in the same manner as all the others, and tested it to make sure it was responding properly to rear-aspect obstacles.
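For anyone following along, the address-assignment dance goes roughly like this with the Pololu VL53L0X library; the new I2C address is an arbitrary choice of mine, and only the XSHUT pin number comes from the actual setup:

#include <Wire.h>
#include <VL53L0X.h> // Pololu VL53L0X library

VL53L0X rearSensor;
const int REAR_XSHUT_PIN = 8;       // rear sensor XSHUT on Teensy pin 8
const uint8_t REAR_I2C_ADDR = 0x2F; // arbitrary unused address on the bus

void setup()
{
  Wire2.begin(); // third I2C bus on the Teensy 3.5

  // Hold the rear sensor in reset, then release XSHUT and assign a unique
  // address so it can share the Wire2 daisy-chain with the left-side array
  pinMode(REAR_XSHUT_PIN, OUTPUT);
  digitalWrite(REAR_XSHUT_PIN, LOW);
  delay(10);
  pinMode(REAR_XSHUT_PIN, INPUT); // float the pin; the module pullup releases reset
  delay(10);

  rearSensor.setBus(&Wire2);
  rearSensor.init();
  rearSensor.setTimeout(500);
  rearSensor.setAddress(REAR_I2C_ADDR);
  rearSensor.startContinuous(); // poll it just like the array sensors
}

void loop()
{
  uint16_t rearDistMM = rearSensor.readRangeContinuousMillimeters();
  // ... report rearDistMM to the main MCU along with the array measurements ...
}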

The next step is to incorporate rear-aspect distance information into the various obstacle avoidance algorithms in the main program.

‘Stuck’ case:

The ‘stuck’ case by definition occurs when the mathematical variance of the last 3-5 seconds of forward distance measurements falls below a set value, indicating that the robot is no longer moving forward or backward. When this happens while wall tracking, the robot has to decide what to do. The current response is to back up for 1 second at half speed, execute a 90 deg ‘spin turn’ away from the nearest wall, and then go back to normal operations.

I think I would like to enhance this algorithm as follows:

  • If the measured front distance is less than MAX_FRONT_DISTANCE_CM (currently set at 400 cm) by at least STUCK_BACKUP_DISTANCE_CM (currently set at 25), then back up by STUCK_BACKUP_DISTANCE_CM, using front distance measurements as the primary means of terminating the backup maneuver. If the front distance measurement can’t be used, but the rear distance measurement is valid (less than MAX_REAR_DISTANCE_CM, currently set at 100), then back up using the rear sensor measurement. If neither measurement is available, then revert to a 1-second half-speed movement. In all cases, use the rear distance measurement to prevent a ‘reverse wall climb’ by stopping the motors if the robot gets too close to an obstacle while backing up (see the sketch following this list).
  • Execute a ‘spin turn’ away from the nearest wall – this is the same as the current algorithm.
  • Execute a ‘rolling turn’ back toward the original direction of travel. This should offset the robot further away from the nearest wall, and hopefully allow it to bypass the obstacle.
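Here’s a rough sketch of the backup-distance logic from the first bullet above; the motor-control and distance-fetch functions are hypothetical placeholders, and only the named constants and the decision structure come from the description:

// Hypothetical helpers, implemented elsewhere in the real program
int GetFrontDistCm();
int GetRearDistCm();
void MoveReverseHalfSpeed();
void StopMotors();

const int MAX_FRONT_DISTANCE_CM = 400;
const int STUCK_BACKUP_DISTANCE_CM = 25;
const int MAX_REAR_DISTANCE_CM = 100;
const int MIN_REAR_CLEARANCE_CM = 10; // hypothetical 'too close behind' limit

void BackupFromStuck()
{
  int frontDist = GetFrontDistCm();
  int rearDist = GetRearDistCm();

  MoveReverseHalfSpeed();

  if (frontDist <= MAX_FRONT_DISTANCE_CM - STUCK_BACKUP_DISTANCE_CM)
  {
    // Front measurement usable: back up until it grows by the backup distance,
    // watching the rear sensor to prevent a reverse wall climb
    int targetFrontDist = frontDist + STUCK_BACKUP_DISTANCE_CM;
    while (GetFrontDistCm() < targetFrontDist &&
           GetRearDistCm() > MIN_REAR_CLEARANCE_CM) {}
  }
  else if (rearDist < MAX_REAR_DISTANCE_CM)
  {
    // Front unusable but rear valid: back up until the rear distance shrinks
    // by the backup distance (or the rear clearance limit is reached)
    int targetRearDist = max(rearDist - STUCK_BACKUP_DISTANCE_CM, MIN_REAR_CLEARANCE_CM);
    while (GetRearDistCm() > targetRearDist) {}
  }
  else
  {
    // Neither measurement available: fall back to the timed 1-sec backup,
    // still watching the rear sensor
    unsigned long startMsec = millis();
    while (millis() - startMsec < 1000 &&
           GetRearDistCm() > MIN_REAR_CLEARANCE_CM) {}
  }

  StopMotors();
}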

Left Side Wall Tracking Success With VL53L0X Array, Part II

Posted 10 October 2020

After the left-wall tracking success described previously in this post, I made some more adjustments and also set up a ‘tracking sandbox’ in my lab to test Wall-E2’s ability to detect & respond to upcoming obstacles. Here’s a short video showing Wall-E2 in action:

Tracking run demonstrating obstacle avoidance maneuvers

Here’s the raw output from the run:

And here is an Excel plot of just the movement sections of the above, highlighting the avoidance maneuvers.

Left-side wall distances are shown in mm, while the front distance is shown in cm. Note the 1-2 sec gaps during turns

Comparing the Excel plot to the video, the front distance plot shows a monotonically decreasing value, followed by a large jump after each obstacle-avoidance turn. It appears that the robot acquires and tracks the 30 cm offset target successfully on the first wall, but doesn’t do as well on the second one. It was much more successful on the third wall. The plot for the last wall is only about 2 seconds long.

All in all, this looks like a pretty successful run for Wall-E2. It tracked three different walls (the fourth wall was too short to track) and successfully avoided obstacles three times – woo hoo!

12 October 2020 Update:

On the above ‘sandbox’ run, I noticed that at the end of the third leg, about 14 seconds into the run, the turn at the white foam-core wall wasn’t a ‘spin turn’, but a ‘backup and turn’, triggered by the front distance going below the front obstacle limit of 20 cm rather than the tracking obstacle clearance limit of 30 cm. Here are two output lines that illustrate the difference:

and

In the video, these events occur at about 7 and 14 seconds, respectively. From this I concluded that the front distance, at least, wasn’t being updated often enough to keep the robot from getting too close to the obstacle before it realized there was a problem. At the time, the update rate for the system was set at 5Hz, or one update every 200 mSec. If the robot is travelling at 50 cm/sec, that means it will travel 10 cm between distance updates – ouch!
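In sketch form, the difference between the 5Hz and 10Hz rates is just the update interval. The real code uses a hardware timer interrupt, but a millis()-based equivalent makes the arithmetic concrete:

// millis()-based equivalent of the timer-interrupt update cadence
const unsigned long UPDATE_INTERVAL_MSEC = 100; // 10Hz; was 200 mSec (5Hz)
unsigned long lastUpdateMsec = 0;

void loop()
{
  if (millis() - lastUpdateMsec >= UPDATE_INTERVAL_MSEC)
  {
    lastUpdateMsec += UPDATE_INTERVAL_MSEC;
    // fetch fresh distances and run the tracking/avoidance logic here;
    // at 50 cm/sec the robot now covers only ~5 cm between updates
  }
}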

So, I changed the timer interrupt timeout value for a 10Hz rate, and ran the ‘sandbox’ run again. This time when I looked at the output I could see that each leg terminated with something like

and it was clear that the updates were happening about every 100 mSec. Here’s the output:

and a short video:

And an Excel plot showing the left wall and forward distances progressing through the run.

Note that the front distance is shown in cm, while the left wall distances are shown in mm

At this point, I’m pretty happy with Wall-E2’s new-found wall tracking superpowers, at least for the left wall case. Now I need to port the V7 left-side-only code back into the main program and also port it to the right wall case.

Stay tuned!

Frank

Left Side Wall Tracking Success With VL53L0X Array

Posted 05 October 2020

This post describes the successful left-side wall-tracking performance of my re-motored, re-wheeled, and re-sensored robot. Back in January of this year I was able to demonstrate reasonable wall tracking performance with my two-wheel robot using the old HC-SR04 ‘Ping’ sensors. However, I still wasn’t able to consistently track and maintain a desired wall offset, the main goal in this project stage.

Since January, I have made the following changes to my larger four-wheel robot:

With all the changes, I had kind of lost track of the ultimate goal, which is to have the robot follow the nearest wall at a specified offset distance. All of the above updates were intended, in one way or another, to facilitate that goal, but I hadn’t yet got the robot to actually perform to expectations.

To help clear away some of the fog, I created a new version of the operating software that was pared down to just what was required to track the left wall, and nothing else. The idea was to work out all the bugs for offset capture and subsequent wall tracking with just the minimum required software, and then incorporate the modified code back into the mainstream software.

At first I was working with a 4-stage process:

  • find the parallel heading to the selected wall
  • drive at an angle toward the desired offset distance
  • when the offset distance is obtained, turn parallel to the wall again
  • track the wall at the desired offset

However, I found that when the robot started off outside the desired wall offset, the second ‘turn to parallel’ operation took up too much space, both in terms of wall offset distance and distance along the wall. By the time the second ‘find parallel’ operation was completed, the robot was usually much too close to the wall for effective offset tracking, meaning the entire 4-step process would have to be repeated. So, I eliminated step 3 in the process (the second ‘turn to parallel’ operation) entirely, and modified the wall-tracking algorithm to capture the desired wall offset and then track it. Instead of using the distance sensor measurements directly, I generate a ‘steering value’ proportional to the difference between the front and rear sensor measurements, and a target steering value proportional to the difference between the desired offset and the center sensor measurement, and use a PID controller to drive the measured steering value toward the target. The effect of this is that the robot tracks toward the offset at an angle, then turns parallel to the wall and continues to track, as shown in the video below:

Left-side offset capture and track demonstration

Here’s an Excel plot showing the wall offset distance versus time for the above demonstration run.

As can be seen in the above plot, the robot starts off at about 45 cm from the wall, tracks inward to capture the desired offset, and then continues to track the desired offset even when it goes around the 45-degree bend. The code that accomplished this is posted below:
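In outline, the steering-value idea boils down to the sketch below; the variable names and gains are illustrative placeholders rather than the actual V7 code:

// Illustrative steering-value wall tracking. Distances are in mm from the
// left-side 3-element VL53L0X array; names and gains are placeholders.
void AdjustMotorSpeeds(float steer); // hypothetical: positive steers away from wall

const float WALL_OFFSET_TGT_MM = 300.0f; // desired 30 cm wall offset
const float OFFSET_GAIN = 0.01f;         // hypothetical offset-error scale factor
const float KP = 50.0f;                  // hypothetical proportional gain

void TrackLeftWall(float frontDistMM, float ctrDistMM, float rearDistMM)
{
  // Measured steering value: proportional to the front/rear difference,
  // i.e. to the robot's angle with respect to the wall
  float measSteerVal = (frontDistMM - rearDistMM) / 100.0f;

  // Target steering value: proportional to the offset error, so the robot
  // angles in toward the wall when too far away, and out when too close
  float targetSteerVal = OFFSET_GAIN * (WALL_OFFSET_TGT_MM - ctrDistMM);

  // The real code feeds this error into a PID controller; a bare
  // proportional step is enough to show the idea
  AdjustMotorSpeeds(KP * (targetSteerVal - measSteerVal));
}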

Stay Tuned!

Frank

Replacing HC-SR04 Ultrasonic Sensors with VL53L0X Arrays Part V

I have been running some wall-tracking tests with Wall-E2 and the new VL53L0X sensor array arrangement, but have been getting poor results, especially with offset capture. After a bunch of test runs, I started to think that the distances weren’t updating fast enough to keep up with the robot’s forward speed, so it runs into the wall before it knows it has gotten too close.

Looking at the Teensy 3.5 I2C Slave code that manages the sensor array, I see the following loop() code:
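In outline, the loop polled each sensor in single-shot mode and then delayed; something like this (the array names are placeholders of mine, not the actual listing):

#include <VL53L0X.h> // Pololu VL53L0X library

VL53L0X sensorArray[6];  // the two 3-element arrays, initialized in setup()
uint16_t distArray[6];

void loop()
{
  for (int i = 0; i < 6; i++)
  {
    // each single-shot read blocks for the full ~30 mSec measurement
    distArray[i] = sensorArray[i].readRangeSingleMillimeters();
  }
  Serial.println(millis()); // timestamp used to measure the loop time
  delay(100);
}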

And I get the following output:

Looking at the timestamps, it appears that a measurement cycle takes about 200 mSec, taking into account the added 100 mSec from the delay(100); statement. This is consistent with the default 30 mSec measurement time for a single VL53L0X, but unfortunately it is much greater than the default 100 mSec PID controller update rate.

The VL53L0X can make measurements faster, but at the cost of lower accuracy. In my case, the increased accuracy from a 30 mSec measurement time is useless if it isn’t fast enough to keep up with the robot. Searching around the net, I found this post on the Pololu support forum, dealing with just this problem. So, I modified my Teensy 3.5 I2C Slave program to use ‘continuous’ measurements and the shorter (20 mSec vs 30 mSec) timing budget, as follows:
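The change amounts to the standard Pololu-library continuous-mode calls, applied to each of the six sensors; in outline:

// Continuous mode with a 20 mSec timing budget, per the Pololu forum post
void ConfigureSensorsContinuous()
{
  for (int i = 0; i < 6; i++)
  {
    sensorArray[i].setMeasurementTimingBudget(20000); // 20 mSec, in microseconds
    sensorArray[i].startContinuous(); // back-to-back ranging, no host trigger
  }
}

// loop() then calls readRangeContinuousMillimeters() on each sensor, which
// just returns the latest completed measurement instead of blocking for a new one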

with the following results:

From the above it is apparent that the new loop time is about 19 mSec for all six sensors. This is very interesting, as it implies that in ‘continuous’ mode, all six sensors run all the time, and all the readRangeContinuousMillimeters() function does is pull the latest measurement out of a buffer.

As a quick test, I rigged up a ‘fan blade’ (a piece of paper taped to an old robot wheel on a motor) as shown, and then ran the program again with the motor spinning the ‘blade’ in front of the left-side sensor array (at about 100 RPM, I think). The plot shows that the sensor response is certainly fast enough to ‘see’ the rise and fall times on the ‘fan blade’.

03 October 2020 Update

With the above results in mind, I decided to try speeding up the ‘fan blade’ setup to see if I could find out how fast the VL53L0X sensor could go. I thought I should be able to use the shaft encoder setup on the back of the motor to acquire an accurate RPM reading and convert that into ‘milliseconds/blade’ to tell how short of an interval the VL53L0X could detect. As things often happen, determining motor RPM from encoder signals turned out to be a LOT harder than I thought. After a loooonnnng side-trip into geared-motor hell, I wound up more or less disregarding the encoder feature and modified the Teensy 3.5 ‘loop()’ code to produce a direct tachometer reading, as follows:
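In outline, it’s just leading-edge detection on the distance readings; the detection threshold and output pin below are placeholders of mine:

// Obstruction-detection tachometer: time successive blade arrivals
VL53L0X leftFrontSensor;              // initialized/startContinuous() in setup()
const int TACH_OUT_PIN = 2;           // hypothetical pin monitored with the DSO
const uint16_t BLADE_DETECT_MM = 100; // 'blade present' if closer than this
bool bladeWasPresent = false;
unsigned long lastBladeMsec = 0;

void loop()
{
  bool bladePresent = leftFrontSensor.readRangeContinuousMillimeters() < BLADE_DETECT_MM;
  digitalWrite(TACH_OUT_PIN, bladePresent ? HIGH : LOW); // square wave for the DSO

  if (bladePresent && !bladeWasPresent) // leading edge of a blade
  {
    unsigned long nowMsec = millis();
    float bladeHz = 1000.0f / (float)(nowMsec - lastBladeMsec);
    lastBladeMsec = nowMsec;
    Serial.print(F("blade freq (Hz): "));
    Serial.println(bladeHz);
    // effective RPM = bladeHz * 60 / number of blades
  }
  bladeWasPresent = bladePresent;
}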

This allowed me to directly monitor the ‘effective’ RPM and obstruction frequency. So I set up the experiment using a ‘four-blade fan’ as shown below, and monitored the obstruction detector output with my Hanmatek DSO:

DSO Output from VL53L0X Obstruction Detection loop() code

As can be seen from the DSO screenshot, the obstruction detection pulse frequency is about 26Hz, with a period of a little over 38 mSec. So it is clear that the VL53L0X running in continuous mode with a timing budget of 20 mSec can easily produce readings every 30 mSec or so.

04 October 2020 Update:

The next step was to see if the ‘VL53L0X fast/continuous’ code running on the Teensy VL53L0X sensor array manager would allow the main robot MCU to fetch distance readings faster. To do this, I uncommented the

#define DISTANCES_ONLY //added 11/14/18 to just display distances in infinite loop

line in my program to eliminate all code except a short loop displaying distances. Then I took measurements with my 4-blade ‘fan’ running in front of the left-front sensor. I ran the motor voltage up to the point where the Teensy’s blade-sensor output was showing about a 20Hz blade rate, and got the following output from the main MCU ‘DISTANCES_ONLY’ loop.

From the above, it is clear that the main MCU code can ‘see’ sensor output changes occurring at 20 Hz (50 mSec period). This should be fast enough to keep up with the physical movement of the robot during offset capture and wall-tracking activities.

In theory, I won’t have to do anything to the main MCU code to enjoy the faster response.

Stay Tuned!

Frank

Working on B-Ball Techniques with Mark Clay

Posted 01 October 2020

I have been working with Mark Clay, a personal basketball coach I found through CoachUp – a site for connecting coaches and clients. We have been working on ball handling and shooting fundamentals, and it seems to be improving my game considerably (to the extent that a 71-year-old B-Ball Wannabe’s game can be improved). Here are some videos from my September 28, 2020 session (that’s Mark in the background of the first video). Mark has been super helpful and supportive, even though I’m absolutely certain that I’m not his (or anyone’s) idea of a hot NBA prospect ;-).

23 March 2021 Update:

I had to suspend training with Mark last Fall, as I underwent right shoulder surgery to address a long-running pain issue. After a successful surgery and rehab, I re-connected with Mark to use up the last three sessions of the original 10-session package. Last night was the final session, and the following video shows the results. I have to say that I never thought I would be draining threes off the dribble, but that’s exactly what happened last night (well, there were a lot of misses, too, but…).

And here’s a link to a much longer, hi-res record of the above session.