Yearly Archives: 2020

Working on B-Ball Techniques with Mark Clay

Posted 01 October 2020

I have been working with Mark Clay, a personal basketball coach I found through CoachUp – a site for connecting coaches and clients. We have been working on ball handling and shooting fundamentals, and it seems to be improving my game considerably (to the extent that a 71-year-old B-Ball Wannabe’s game can be improved). Here are some videos from my September 28, 2020 session (that’s Mark in the background of the first video). Mark has been super helpful and supportive, even though I’m absolutely certain that I’m not his (or anyone’s) idea of a hot NBA prospect ;-).

23 March 2021 Update:

I had to suspend training with Mark last Fall, as I underwent right shoulder surgery to address a long-running pain issue. After a successful surgery and rehab, I re-connected with Mark to use up the last three sessions of the original 10-session package. Last night was the final session, and the following video shows the results. I have to say that I never thought I would be draining threes off the dribble, but that’s exactly what happened last night (well, there were a lot of misses, too, but….)

And here’s a link to a much longer, hi-res record of the above session

Replacing Wall-E2’s L298N Motor Drivers with Adafruit DRV8871

Posted 07 September 2020

I’ve been having some ‘issues’ with driving Wall-E2’s Pololu 20D 125:1 12V metal geared motors with the old L298N motor drivers, so I thought it was time to replace them with the Adafruit DRV8871 models used in ‘re-motoring’ my two-wheel robot (see this post for some of the details).

The first step was to review the work I had done earlier replacing the L298N on my two-wheel robot with the same DRV8871 driver. The two-wheel robot uses a UNO controller, while Wall-E2 uses a Mega, but they are similar enough that porting the wiring and code should be straightforward. The two-wheel robot uses UNO pins 5, 6, 9 & 10 (all PWM lines) for direction and speed control, while Wall-E2 uses pins 8-13 and 36, 38, 40, 42, 44 & 46 for controlling the two L298N motor drivers. My plan is to try using just two motor drivers, one for both left motors, and another for both right motors. If this works, I’ll need just 4 lines, say 8-11 (if I later need to use four drivers vs two, I’ll use 12/13 for one of the ‘extras’ and 6/7 for the other one; currently 6 is unused and 7 drives the red laser diode, so moving it shouldn’t be a big problem).

So, I replaced the two L298N modules with two DRV8871 modules, and wired both left motors into one driver and both right motors into the other, as shown in the following photos

‘before’ – dual L298N motor drivers
‘after’ – Dual Adafruit DRV8871 motor drivers

Since I previously replaced an L298N with a DRV8871 in my two-wheel robot, I had already modified all the relevant motor driver code to use the DRV8871 module vs the L298N, so all I had to do was replace the low-level motor interface modules in my four-wheel robot project with the corresponding ones from my two-wheel robot project. The replaced modules were:

  • SetLeftMotorDirAndSpeed
  • SetRightMotorDirAndSpeed
  • StopBothMotors
  • MoveAhead
  • MoveReverse

Note that the four-wheel robot code uses separate SetLeft/RightMotorDir & SetLeft/RightMotorSpeed functions, so these needed to be modified or replaced.

SetLeft/RightMotorSpeed() is called from RunBothMotors() & SpinTurn(). I could modify SpinTurn() to call RunBothMotors() instead of calling SetLeft/RightMotorSpeed() directly.

RunBothMotors() is called from RunBothMotorsMsec(), Setup(), MoveReverse() and MoveForward(). Every call to RunBothMotors() is paired with calls to SetLeft/RightMotorDir(), so I could replace each set of calls with a single call to SetLeft/RightMotorDirAndSpeed(). The only issue with this is the RunBothMotorsMsec() calls in Setup() & ExecDisconManeuver().

I decided to modify the RunBothMotors() & RunBothMotorsMsec() functions to take a direction parameter, and have both functions simply call SetLeft/RightMotorDirAndSpeed().
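In sketch form, the modified functions look something like this (a sketch only; the actual pin names and signatures in the project code may differ):

    // DRV8871 control: PWM on one input, LOW on the other, so one call sets
    // both direction and speed (LEFT_IN1_PIN/LEFT_IN2_PIN are placeholders)
    void SetLeftMotorDirAndSpeed(bool bIsFwd, int speed)
    {
      analogWrite(bIsFwd ? LEFT_IN2_PIN : LEFT_IN1_PIN, 0);
      analogWrite(bIsFwd ? LEFT_IN1_PIN : LEFT_IN2_PIN, speed);
    }

    void SetRightMotorDirAndSpeed(bool bIsFwd, int speed); // mirror image of the above

    void RunBothMotors(bool bIsFwd, int leftspeednum, int rightspeednum)
    {
      SetLeftMotorDirAndSpeed(bIsFwd, leftspeednum);
      SetRightMotorDirAndSpeed(bIsFwd, rightspeednum);
    }

    void RunBothMotorsMsec(bool bIsFwd, int timeMsec, int leftspeednum, int rightspeednum)
    {
      RunBothMotors(bIsFwd, leftspeednum, rightspeednum);
      delay(timeMsec);
    }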

10 September 2020 Update:

I got lost in the details of the Wall-E2 code, so I decided to simplify things to check out the new DRV8871-based motor setup. I modified my original ‘Adafruit_DRV8871_Driver_Test’ project to run both motor sets forward and backward from the minimum PWM value (about 50) to max (255) while monitoring the total current for all four motors, as shown in the following Excel plot

Total motor current for motor speed commands from 50-255

As can be seen from the above plot, the total motor current for all four motors is right around 0.8A or about 0.4A per driver – well within the current limits for the DRV8871 module. As a side note, the driver ICs on the modules barely got warm, so this looks like a real winner.

Now that I have the technical issues sorted out with respect to the driver replacement project, I can get back to the main project of improving Wall-E2’s wall-following ability.

Stay tuned!

Frank

13 September 2020 Update:

Unfortunately, when I started running the full Wall-E2 code, the right set of motors would go backwards, but not forwards – awkward to say the least. After a lot of troubleshooting, it finally dawned on me that the problem was being caused by my introduction of a TIMER1-based interrupt a while ago to manage ‘stuck’ detection. TIMER1 controls PWM on pins 11 & 12, and one of those pins was being used by the right motor – bummer!

After a LOT of screwing around, I finally decided that the only way to really figure things out was to remove everything but the timer and motor driver code from the program to figure out what timer (if any) I can use for the ‘stuck’ detection interrupt and still have proper motor control.

To that end, I created a new Arduino program ‘TimerISRvsPWMTest.ino’ as shown below:
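In skeleton form, the test program looked like this (a sketch, not the original listing; the TIMER1 ISR setup is the standard CTC-mode arrangement, and the pins match the discussion below):

    // DRV8871 IN1/IN2 on Mega pins 10 & 11
    const int IN1_PIN = 10;
    const int IN2_PIN = 11;
    volatile long ISRCount = 0;

    void setup()
    {
      Serial.begin(115200);
      pinMode(IN1_PIN, OUTPUT);
      pinMode(IN2_PIN, OUTPUT);

      // TIMER1 ISR setup for 'stuck' detection - comment this block out to
      // restore normal PWM operation on the TIMER1 pins
      noInterrupts();
      TCCR1A = 0; // clearing this register kills PWM on TIMER1's output pins
      TCCR1B = 0;
      TCNT1 = 0;
      OCR1A = 3124;                        // 16 MHz / 1024 = 15625 ticks/sec; 3125 ticks = 200 mSec
      TCCR1B |= (1 << WGM12);              // CTC mode
      TCCR1B |= (1 << CS12) | (1 << CS10); // 1024 prescaler
      TIMSK1 |= (1 << OCIE1A);             // enable compare-match A interrupt
      interrupts();
    }

    ISR(TIMER1_COMPA_vect)
    {
      ISRCount++; // placeholder for the 'stuck' detection work
    }

    void loop()
    {
      // 'forward': PWM on IN1, LOW on IN2
      analogWrite(IN2_PIN, 0);
      analogWrite(IN1_PIN, 127);
      delay(2000);

      // 'reverse': PWM on IN2 - this is the direction that fails when the
      // TIMER1 setup above has taken over the timer behind pin 11
      analogWrite(IN1_PIN, 0);
      analogWrite(IN2_PIN, 127);
      delay(2000);
    }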

With the TIMER1 setup commented out, both motors rotate forwards and backwards no problem. However, as soon as the TIMER1 code was un-commented, the motor driver on pins 10/11 would turn one way but not the other. With an O’scope I could see the PWM waveform on pin 10 but not on pin 11. This is consistent with Timer1 controlling PWM on pins 11,12 & 13.

So, I moved the wire connected to DRV8871 IN2 from 11 to 3 and changed the code to use pins 10 & 3 vs 10 & 11. Now the motor connected to these pins rotates both forwards and backwards.

Then I went back to my WallE2_V6 project and tried the same trick – moving the IN2 pin from 11 to 3; nope – still doesn’t work. In fact, moving from 11 to any other PWM pin doesn’t work in the WallE2 project, but does in the TimerPWM test project – weird.

So, thinking that maybe there were some timer dependencies hidden in one or more of the libraries being used for the WallE2 project, I copied them all over to the TimerPWM project folder, added them to the project in VS2019, and added them to the project code. After I got everything to compile, I ran the TimerPWM test project successfully using 10 & 7, 10 & 3, 10 & 2, etc (but 10 & 11 still doesn’t work with TIMER1 ISR enabled).

So, it’s not the libraries. Next I changed the timer interrupt code to use TIMER5 instead of TIMER1, moving the pin dependency from 11-13 to 44-46. I confirmed this by changing ‘In1_Right‘ to 44, 45, 46 in turn and moving the physical In1_Right connection to the corresponding pin, noting that the motors don’t rotate properly when driven from any of the affected pins.

Next, I changed the timer interrupt code in WallE2_V6 from TIMER1 to TIMER5 to see if I could get pin 11 back as a PWM pin. Nope – it still doesn’t work in the WallE2 code, nor does pin 7.

Back to the test program. Copied all the ‘pre-setup’ code from WallE2 to the test program; no change – test program still works properly.

After a few back-and-forths of this nature, I eventually narrowed the problem down to the driver modules themselves. A physical inspection revealed that I had forgotten to solder the second pin on both 2-pin screw terminals on one of the drivers – oops! So, like many seemingly intractable technical problems, this one was caused by two independent issues; the use of TIMER1 caused pins 11-13 to be unavailable for motor drive PWM, and the intermittent connections to the left-side motors complicated the symptoms. This was a perfect example of why ‘cutting the problem in half’ (in my case, by eliminating all the Wall-E2 hardware from the problem) is so effective in troubleshooting.

Stay tuned!

Frank

New Wheels for Wall-E2

Posted 24 August 2020, 1402 days into the Covid-19 Lockdown

My autonomous wall-following robot Wall-E2 is now smart enough to reliably follow walls and connect to a charging station, at least in my office ‘sandbox’ testing area, as shown in the following video

However, as can be seen toward the end of the video, Wall-E2 had some trouble and almost got stuck making the third 90 degree turn.  Apparently the current thin 90mm wheels just don’t provide enough traction on carpet.

So, I decided to see what I could do about re-wheeling Wall-E2.  After some research I found there are now plenty of larger diameter wheels for robots out there, but I couldn’t seem to find a set that would fit Wall-E2 and still allow me to keep the current set of wheel guards.  I needed the same (or maybe slightly larger) diameter for ‘road’ clearance, but something less than about 20 mm thick to fit within the current wheel guard dimensions. Then it occurred to me while reading the specs for one of the wheels (ABS for the wheel, and TPU for the tire) that I already had two 3D printers standing around waiting for something to do, and I had a plentiful supply of ABS (or in my case, PETG) and TPU filaments – why not build my own?  After all, how hard could it be?  As you might guess, that question started what now feels like a 10-year slog through ‘3D printed wheel hell’.

I wanted to create a spoked wheel with a hub that would accept a 3mm flatted motor shaft, and I wanted to fit this wheel with a simple TPU treaded tire.  The wheel would have small ‘guard rail’ rims that would keep the tire from sliding off.

It started innocently enough with a search through Thingiverse, where I found several SCAD scripts for ‘parameterized’ wheels.  Great – just what the doctor ordered!  Well, except that the scripts, which may have worked fine for the authors, didn’t do what I wanted, and as soon as I tried to adjust them to fit my design specs, I discovered they were incomplete, buggy, or both.

I had wanted to learn a bit more about SCAD anyway and this seemed like a good project to do that with, so I persevered, and eventually came up with a SCAD design that I liked.

I started with bioconcave’s ‘Highly Modular Wheel_v1.0.scad’ file from Thingiverse, and (after what seemed like years trying to understand what was going on) was able to extract modular pieces into my own ‘FlatTireWheel’ scad script, as follows:

Here’s a screenshot of a completed 86mm wheel with elliptical spokes and a hub compatible with a 3mm flatted shaft.  When a TPU tire is added to the rim, the assembly should be about 90mm in diameter.

SCAD-generated wheel with elliptical spokes and a hub compatible with a 3mm flatted shaft. Note extensions to set screw hole through spoke and rim

One of the many issues I had with the original code is it assumed the hub would sit on top of the spokes, and therefore there was no need to worry about whether or not the setscrew arrangement would be blocked by the spokes and/or rim.  Since I wanted a wheel that was mostly tread, I wanted to ‘sink’ the hub into the spokes as shown above. In order to make this work, I needed to extend the setscrew access hole through the spoke assembly and through the rim.  In the finished design, the hub assembly can be moved freely up and down in the center of the wheel, and the hole extensions will follow.  If the hub setscrew hole isn’t blocked by the spokes and/or rim, then the extensions don’t do anything; otherwise they extend the setscrew hole as shown above.

Here’s a photo of separate wheel/tire pieces, and a completed wheel/tire combination on Wall-E2

Separate wheel & tire parts, plus a completed wheel on Wall-E2

2 September 2020 Update:

After running some sandbox tests with my new wheels, I discovered that the new tires didn’t have much more traction than the old ones. However, now that I ‘had the technology’, it was a fairly simple task to design and print new tires to fit onto the existing new rims. Rather than do the tire design in SCAD, I found it much, much easier to do this in TinkerCad. Here are a couple of screenshots showing the TCad design.

Original printed tire on the left, new one on the right. Note more aggressive tread on new tire
Exploded view showing construction technique used for new (and old) tire. Very easy to do in TinkerCad!

03 September 2020 Update:

The increased traction provided by the new tires has caused a new problem; on a hard surface the rotation during a ‘spin turn’ (one side’s motors going forward, the other going in reverse) is too fast, causing the robot to slide well past the target heading. This isn’t so much of a problem on carpet, but how would the robot know which surface is in play at the moment? After some thought, I decided to try modulating the turn rate, in deg/sec, as a proxy for the surface type. So, in ‘SpinTurn()’ I put in some code to monitor the turn rate and adjust the motor speeds upward or downward to try and keep the turn rate at a reasonable level.

Here’s a video of a recent run utilizing the new ‘SpinTurn’ rate modulation algorithm

And the data from the three ‘spin turn’ executions, one on hard surface, and two on carpet.

Spin Turn executions; hard surface, followed by two carpet turns

As can be seen in the above video and plots, the motor speeds used on the hard surface turn are much lower than the speeds used during the carpet turns, as would be expected. This is a much nicer result than the ‘fire and forget’ algorithm used before. Moreover, the carpet turns are much more positive now thanks to the more aggressive tread on the new tires – yay!

Offloading Distance Measurements from Wall-E2’s Main (Mega) MCU

Posted 08 August 2020

Now that I have the two 3-element VL53L0X proximity sensors integrated into Wall-E2, my autonomous wall following robot, I have been running some ‘field’ tests in my ‘sandbox’ test area, as shown in the photo  below.

‘Sandbox’ field test area. Very simple layout with no obstacles (other than my feet)

As can be seen from the above video, Wall-E2 ‘ran into’ a problem at the end of the second leg.  The problem is caused by the extended time required for finding the parallel orientation and capturing the desired wall offset. During this time, Wall-E2 isn’t checking the front distance for upcoming obstacles, and isn’t checking for the ‘IsStuck’ condition.  The function that homes in on the IR beam from a charging station has the following guard code, which checks for a ‘stuck’ condition or an upcoming obstacle:
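In sketch form (variable and function names here are assumptions, not the actual project code):

    // bail out of the homing loop if we are stuck or about to hit something
    if (bIsStuck || GetFrontDistCm() < MIN_FRONT_OBSTACLE_DIST_CM)
    {
      StopBothMotors();
      break; // exit the homing loop and let the main loop sort things out
    }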

I could also put this same guard code in the functions that handle parallel orientation and offset acquisition/tracking, but it occurred to me that I might want to try off-loading this responsibility to the newly-added Teensy 3.5 I2C slave that manages the dual 3-element VL53L0X proximity sensors.  This Teensy spends 99.9% of its time just waiting for the next distance check interval to appear on the horizon, so it would probably welcome something else to do.  I could route the front LIDAR-Lite data to it as well, which would give it all the distance sensors to play with.  It could then do the forward/left/right distance array updates, and calculate the forward distance variance that is the heart of the ‘IsStuck()’ function.  This is all great, but I’m not sure how all that gets integrated back into the main Mega MCU processing loop.

I currently have a ‘receiveEvent(int numBytes)’ implemented on the Teensy to receive a ‘request type’ value from the Mega.  This value determines what dataset (left, right, both, just centers, etc) gets sent back to the Mega at the next ‘requestEvent()’ event (triggered by a ‘Wire.requestFrom()’ call from the Mega).  So, I could simply add some more request types to ‘GetRequestedVL53l0xValues(VL53L0X_REQUEST which)’ or, better yet, add a new function to the Mega to specifically request things like front distance or front distance variance (or simply have the Mega request that the Teensy report the current value of the ‘bIsStuck’ boolean variable).
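The Teensy side of this pattern looks something like the following (a sketch; the slave address, request-type values, and array names are assumptions):

    #include <Wire.h>

    const int SLAVE_ADDR = 0x20; // Teensy's address on the Mega's I2C bus (assumed)
    enum VL53L0X_REQUEST { REQ_LEFT = 0, REQ_RIGHT, REQ_BOTH };
    volatile uint8_t RequestType = REQ_BOTH;
    uint16_t LeftDists[3], RightDists[3]; // updated by the measurement loop

    // Mega sends a one-byte request type; save it for the next requestEvent()
    void receiveEvent(int numBytes)
    {
      if (numBytes > 0) RequestType = Wire.read();
    }

    // triggered by Wire.requestFrom() on the Mega; reply with the requested dataset
    void requestEvent()
    {
      switch (RequestType)
      {
      case REQ_LEFT:  Wire.write((uint8_t*)LeftDists, sizeof(LeftDists)); break;
      case REQ_RIGHT: Wire.write((uint8_t*)RightDists, sizeof(RightDists)); break;
      default:
        Wire.write((uint8_t*)LeftDists, sizeof(LeftDists));
        Wire.write((uint8_t*)RightDists, sizeof(RightDists));
        break;
      }
    }

    void setup()
    {
      Wire.begin(SLAVE_ADDR); // join the Mega's bus as a slave
      Wire.onReceive(receiveEvent);
      Wire.onRequest(requestEvent);
    }

    void loop()
    {
      // VL53L0X measurement code updates LeftDists/RightDists here
    }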

I could also set up a Mega input as an interrupt pin with a CHANGE condition, and have the Teensy interrupt the Mega whenever the ‘stuck’ condition changed (from ‘not stuck’ to ‘stuck’, or vice versa).  The Mega’s ISR would simply set the ‘bIsStuck’ boolean variable to the state of the interrupt input (HIGH or LOW).
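Something like this on the Mega side (the pin choice is an assumption):

    // Teensy toggles this line whenever the 'stuck' state changes
    const int STUCK_INT_PIN = 2; // an external-interrupt-capable Mega pin
    volatile bool bIsStuck = false;

    void StuckISR()
    {
      bIsStuck = (digitalRead(STUCK_INT_PIN) == HIGH);
    }

    void setup()
    {
      pinMode(STUCK_INT_PIN, INPUT);
      attachInterrupt(digitalPinToInterrupt(STUCK_INT_PIN), StuckISR, CHANGE);
    }

    void loop()
    {
      // main robot loop just reads bIsStuck
    }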

Of course, this all presumes there is some real advantage to moving this functionality to the Teensy.  If the Mega isn’t processing-challenged, there is no reason to do all this.  As this post shows, the time cost for the incremental variance calculation on the Mega is only about 225 uSec, or less than 0.5% of the current 200 mSec loop time.

And, looking at the timestamps on the last actual ‘sandbox’ run, I see that the timestamps during the wall-tracking portion were pretty constant – between 197 and 202 mSec – not the kind of data one would expect from an MCU struggling to keep up.

10 August 2020 Update:

After realizing that the Mega really wasn’t having much problem keeping up with a 200 mSec loop cycle, I went ahead and added the ‘!bIsStuck’ guard to both the ‘RotateToParallelOrientation()’ and ‘CaptureWallOffset()’ functions, thinking this would solve the problem.  What actually happened is that as soon as Wall-E2 started running, it immediately sensed a ‘Stuck’ condition and started backing up – WTF!  Some head-scratching and some troubleshooting revealed that the while() loops in which I had put the new guard code were running much faster than the main loop (which runs at 200 mSec intervals via use of an ‘elapsedMillis’ variable).  What this meant was that it was calculating the forward distance variance hundreds of times per second rather than five, and the forward distance wasn’t changing nearly fast enough to prevent the variance from quickly winding down to zero – oops!  In order to fix this problem I had to install some more ‘elapsedMillis’ variables to make sure that CalcDistArrayVariance() was called at the same rate as in the main loop, namely every MIN_PING_INTERVAL_MSEC mSec.  This works, but now I not only had more ‘Stuck’ guard code scattered throughout my code, but also additional kludge-code needed to make the ‘Stuck’ kludge-code work – YUK!!

After thinking about this a bit more, I realized there was another option I hadn’t considered – a timer interrupt set at a convenient interval (like MIN_PING_INTERVAL_MSEC mSec as I am doing now) that does nothing but calculate front distance variance and set bIsStuck accordingly.  I haven’t used any timer interrupts up to this point in my Arduino journey, but this seems like a perfect application.

As I normally do, I grabbed a spare Arduino Mega from my parts box, Googled for some timer interrupt examples, and created a demo program to test my theory.  First I just did the normal ‘blink without delay’ demo, copying the ‘Timer1’ portion of the code from this post to a new project, and then modifying the ISR to call my ‘CalcDistArrayVariance()’ function with a constant number for the ‘frontdist’ parameter.  The CalcDistArrayVariance() function computes the variance of a 25-element array of distance values, and feeding a constant into the function simulates what would happen if the robot gets stuck at some point.

I set up the program to show how long it takes to calculate the variance each time, and how long it takes for the variance to fall to zero.  When I ran the program, I got this output:

As can be seen from the above, calls to CalcDistArrayVariance() occur at 200 mSec intervals and each call takes about 150 uSec (insignificant compared to the 200 mSec loop interval), and it takes about 5 seconds for a constant distance input to produce zero variance on the output.  This is pretty much perfect for my application.  Before implementing the timer idea, I had over 20 calls to IsStuck() scattered throughout my code, and now they can all be replaced by the boolean bIsStuck variable, which is managed by the timer1 ISR.

Here’s the entire timer interrupt demo code:
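What follows is a sketch of the demo rather than the original listing; the variance calculation shown is the straightforward version rather than the incremental one mentioned earlier, and the names/constants are assumptions:

    const int VAR_ARRAY_SIZE = 25;
    uint16_t aDists[VAR_ARRAY_SIZE];
    volatile float DistVar = 9999;
    volatile unsigned long CalcUsec = 0;
    volatile bool bDataReady = false;

    // push a new front distance into the array and return the variance
    float CalcDistArrayVariance(uint16_t frontdist)
    {
      for (int i = 0; i < VAR_ARRAY_SIZE - 1; i++) aDists[i] = aDists[i + 1];
      aDists[VAR_ARRAY_SIZE - 1] = frontdist;

      float mean = 0;
      for (int i = 0; i < VAR_ARRAY_SIZE; i++) mean += aDists[i];
      mean /= VAR_ARRAY_SIZE;

      float var = 0;
      for (int i = 0; i < VAR_ARRAY_SIZE; i++)
      {
        float d = aDists[i] - mean;
        var += d * d;
      }
      return var / VAR_ARRAY_SIZE;
    }

    ISR(TIMER1_COMPA_vect)
    {
      unsigned long usec = micros();
      DistVar = CalcDistArrayVariance(200); // constant input simulates a stuck robot
      CalcUsec = micros() - usec;
      bDataReady = true;
    }

    void setup()
    {
      Serial.begin(115200);

      // TIMER1 in CTC mode: 16 MHz / 1024 prescale = 15625 ticks/sec,
      // so 3125 ticks = 200 mSec (MIN_PING_INTERVAL_MSEC)
      noInterrupts();
      TCCR1A = 0;
      TCCR1B = 0;
      TCNT1 = 0;
      OCR1A = 3124;
      TCCR1B |= (1 << WGM12);              // CTC mode
      TCCR1B |= (1 << CS12) | (1 << CS10); // 1024 prescaler
      TIMSK1 |= (1 << OCIE1A);             // enable compare-match A interrupt
      interrupts();
    }

    void loop()
    {
      if (bDataReady) // print outside the ISR to keep the ISR short
      {
        bDataReady = false;
        Serial.print(millis()); Serial.print(": variance = "); Serial.print(DistVar);
        Serial.print(", calc time = "); Serial.print(CalcUsec); Serial.println(" uSec");
      }
    }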

Stay tuned,

Frank


Replacing HC-SR04 Ultrasonic Sensors with VL53L0X Arrays Part IV

Posted 18 July 2020

In my continuing effort to update Wall-E2’s superpowers, I have been trying to replace the HC-SR04 ‘ping’ sensors with ST Microelectronics VL53L0X Time-of-Flight (ToF)  sensors, as implemented by the popular GY530 modules available on eBay.

First, I got a 3-element array working and demonstrated effective parallel-heading determination and wall tracking, as described in this post.

Next, I added a second 3-element array on the other side of the robot, but I have been running into trouble getting both arrays to work properly at the same time.  Somehow there seems to be some interaction between the two arrays that I can’t seem to nail down.

  • I have determined that all six elements respond properly when operated individually or as a member of a 3-element array
  • Adding a 4th element to the array causes one or more of the first three elements to respond with an out-of-range measurement
  • Adding 2.2K pullups to the I2C bus makes the problem worse, not better.  After some investigation, I discovered that the GY530 module already has 10K pullups included, so three modules on the bus would reduce the pullups to 3.3K, and four would already reduce the value to 2.5K.  Adding a 2.2K in parallel with 3.3 or 2.5K would drive the value down to around 1.2-1.3K.  However, that did lead me to my next idea – using separate I2C busses for the left and right 3-element arrays.
  • Moved the left-hand 3-element array from the Wire1 I2C bus to the Wire2 I2C bus.  Now the 10K pullups shouldn’t be an issue, as I had already demonstrated proper operation of a 3-element array on the Wire1 I2C bus.  Unfortunately, this exhibited similar problems: when running all six elements, all three left-side elements measure properly, but only one right-side element produces reasonable values – the other two give nonsense readings.

Here’s a photo of the top deck of my autonomous wall-following robot, with the two 3-element arrays installed on the Wire1 and Wire2 I2C busses of a Teensy 3.5.

two 3-element VL53L0X arrays installed on a Teensy 3.5 Wire1 & Wire2 busses

And here is the schematic for the split-bus configuration:

A typical output sequence follows:  The first column is milliseconds since program start, and the following 6 columns are the front, center, and rear sensor measurements for the right & left arrays, respectively.

In the above, the data from the first two sensors on the right side is invalid, but all the rest show ‘real’ values.

If the left-side array is disconnected (unplugged) from the Wire2 bus and the program modified to not initialize/measure the left-side array, then the right-side array reads normally, as shown below

If the right-side array is disconnected (unplugged) from the Wire1 bus and the program modified to not initialize/measure the right-side array, then the left-side array reads normally, as shown below

Here’s the code being used to drive just the left-side VL53L0X array (right-side array code commented out and the right-side array physically disconnected from the Teensy 3.5):
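In outline, the left-side-only program looked like the following (a sketch using the Adafruit VL53L0X library; the XSHUT pins and addresses shown are placeholders, not the actual assignments):

    #include <Wire.h>
    #include <Adafruit_VL53L0X.h>

    const int XSHUT_PINS[3] = { 23, 22, 21 };      // placeholder pin numbers
    const uint8_t ADDRS[3] = { 0x2D, 0x2E, 0x2F }; // placeholder addresses
    Adafruit_VL53L0X Sensors[3];

    void setup()
    {
      Serial.begin(115200);
      Wire2.begin(); // left-side array lives on the Teensy 3.5 Wire2 bus

      // hold all three sensors in reset...
      for (int i = 0; i < 3; i++)
      {
        pinMode(XSHUT_PINS[i], OUTPUT);
        digitalWrite(XSHUT_PINS[i], LOW);
      }
      delay(10);

      // ...then release and re-address them one at a time
      for (int i = 0; i < 3; i++)
      {
        digitalWrite(XSHUT_PINS[i], HIGH);
        delay(10);
        if (!Sensors[i].begin(ADDRS[i], false, &Wire2))
        {
          Serial.print("Sensor "); Serial.print(i); Serial.println(" init failed!");
        }
      }
    }

    void loop()
    {
      VL53L0X_RangingMeasurementData_t measure;
      Serial.print(millis());
      for (int i = 0; i < 3; i++)
      {
        Sensors[i].rangingTest(&measure, false);
        Serial.print('\t');
        Serial.print(measure.RangeStatus != 4 ? measure.RangeMilliMeter : -1); // -1 = out of range
      }
      Serial.println();
      delay(200);
    }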

And the same program, with the left-side array commented out

So, it appears that there is some sort of interaction between the Wire1 & Wire2 I2C busses on the Teensy 3.5.

22 July 2020 Update:

Based on some feedback from the Teensy forum, I added some code to my program to verify that each VL53L0X sensor I2C address had been set properly in the setup code.  When I did this, I got results that were more than a little mystifying: initialization of the 3 sensors on the Wire1 bus all reported success, but the I2C scanning code reported a different story, as shown below:

The three sensors on the Wire1 bus were supposed to wind up at 2A, 2B & 2C, but the scanner showed them at 29 and 2C, with one of them missing entirely – wow!

So, I decided to go back to basics.  I modified my original triple VL53L0X demo program to include a I2C bus scan to verify the actual addresses of the sensors, as follows:
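The scan itself is the classic walk through all 127 I2C addresses, here aimed at the Wire1 bus:

    // report every address that ACKs on the Teensy Wire1 bus
    void ScanWire1()
    {
      for (uint8_t addr = 1; addr < 127; addr++)
      {
        Wire1.beginTransmission(addr);
        if (Wire1.endTransmission() == 0) // 0 = device ACKed at this address
        {
          Serial.print("I2C device found at 0x");
          Serial.println(addr, HEX);
        }
      }
    }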

And got the following output:

As shown above, the VL53L0X sensors got programmed correctly, and appear to operate correctly as well.

Then I created a new program identical to the above, except for using the Wire2 bus instead of the Wire1 bus, using different I2C addresses and different XSHUT pin assignments for programming the sensors.

When I ran this program, I got the following output:

Then I modified my original hex-sensor program to initialize one array at a time, with an I2C bus scan in between, as follows:

When I ran this program, I got the following output:

So it seems pretty clear that there is something going on with the Teensy 3.5 that doesn’t like it when I try to run both Wire1 & Wire2 buses at the same time.

As additional background data, the original impetus for splitting the six sensors between two I2C buses was my discovery that adding the 4th through 6th sensors on the Wire1 bus caused a similar problem to the one described here; clearly bad readings from the first & second sensors, while readings from later ones were fine.  I don’t know if these issues are related, but something is happening for sure.

23 July 2020 Update:

I’m frustrated at the lack of response from both the Teensy and ST Micro support forums on this issue.  The Teensy guys are trying to help, but nobody wants to look at the elephant in the room – the fact that all six VL53L0X units work fine when their respective I2C bus is the only one operating, but not when both buses are in operation.  The ST Micro guys just don’t answer at all.

I went back and modified my program to print out as many of the detailed measurement parameters as I could find for each sensor, in an effort to gain some understanding about what is happening, and got the following output:

This output style is much harder to read, but is also much more complete. Each line (distances, signal rate, SpadCount, and RangeStatus) has six entries – one for each of the six sensors.  The first three entries are sensors 1-3 on Teensy Wire1, and the remaining three are sensors 4-6 on Teensy Wire2.  As the data shows, sensors 1 & 2 always have bogus results, while sensors 3-6 have what appears to be valid data, although I’m not competent to say anything more than “the distance values for sensors 3-6 track reality, while the ones for sensors 1-2 do not”.

Then I modified the program yet again to just use sensors 1-3 on the Teensy Wire1 I2C bus, without changing any hardware (leaving sensors 4-6 still attached to Teensy Wire2, but not initializing or addressing them in any way), and got the following output:

Now the parameters for sensors 1-3 all look real, and of course all the parameters for sensors 4-6 are zeroed out.

Then I modified the program to just initialize and access sensors 4-6 on Teensy Wire2:

Now it is clear that the data for sensors 1-3 (Teensy Wire1) are all zeroes, and the data for sensors 4-6 (Teensy Wire2) are valid.  Again, this is with no hardware changes at all; all sensors are still powered and connected to their respective I2C buses.

29 July 2020 Update:

Still working the multiple VL53L0X issue.  After getting nowhere with the Teensy and ST Micro forums, I decided to try a different tack.  I decided to try controlling all six VL53L0X sensors using the single I2C bus on an Arduino Mega.  I reasoned that if I could get them all to play using a Mega, this would lend credence to my theory that something funny is going on with the Teensy 3.5 auxiliary I2C buses.

Unfortunately, I immediately ran into problems getting multiple sensors to work using the Arduino Mega.  At first I thought this was due to the fact that the Mega is a 5V controller and so I needed a level shifter setup on the I2C bus between the Mega and the VL53L0X sensors, but that didn’t change anything.  Then, after a more thorough look at the VL53L0X schematic and documentation I discovered that the real problem was that while the I2C bus lines have internal level shifters already implemented on the module, the XSHUT & GPIO lines do not.  This meant that I had been driving the XSHUT line of each of my attached sensors well over the do-not-exceed level — oops!

At this point I was starting to wonder if I had damaged the sensors’ XSHUT lines, thereby making any further diagnostic attempts with these sensors fruitless.  In addition, I was starting to wonder if I hadn’t also given myself problems by using ‘no-name’ modules – cheap, but maybe worth every penny?  I also had read some posts that indicated that the Adafruit VL53L0X driver library might have some problems, so maybe I had a trifecta going – cheap no-name modules, potentially damaged by my abuse of the XSHUT lines, being driven by a questionable library – yikes!

So, I started over; I acquired some Pololu VL53L0X modules, installed their Arduino driver library, and used their ‘Single’ code example to verify basic operation with a single VL53L0X sensor connected to an Arduino Mega controller.  Then I added in the multi-sensor initialization code, being careful to simply switch the XSHUT lines from output (for outputting a LOW signal) to input (for ‘outputting’ a HIGH signal by allowing the onboard 47K pullup to take over).
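The XSHUT trick boils down to toggling the pin mode rather than the pin level (a sketch):

    // never drive XSHUT HIGH from a 5V pin - switch the pin *mode* instead
    void ShutdownVL53L0X(int xshutPin)
    {
      pinMode(xshutPin, OUTPUT);
      digitalWrite(xshutPin, LOW); // actively hold the sensor in reset
    }

    void ReleaseVL53L0X(int xshutPin)
    {
      pinMode(xshutPin, INPUT); // high-Z; the onboard 47K pullup raises XSHUT safely
    }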

With the above setup I have been able to demonstrate a working 3-sensor setup using an Arduino Mega controller.  When my remaining Pololu VL53L0X modules arrive later this week, I hope to show that I can run (at least) six Pololu VL53L0X sensors on a single I2C bus.  If I can do that, then I’ll be in a position to make some more waves on the Teensy forum (I hope).

By the way, one of the side-effects of this effort was a reply from John Kvam mentioning that ST Micro makes an Arduino-compatible 3.3V 32-bit microcontroller called the STM32 (also known as the ‘blue pill’ for its PCB color).  This is a pretty capable device, with the single drawback that it doesn’t come with an Arduino bootloader installed, meaning it can’t be directly programmed via its USB connector.  Instead, one has to use an FTDI device (like the CKDevices FTDI Pro or the Sparkfun FTDI breakout board).  However, there are plenty of “How To’s” out there describing how a bootloader can be loaded into flash memory, after which you can program it just like any other Arduino device – pretty cool!  Anyway, I ordered a couple of these boards to play with the next time I need something Arduino-ish but not as fast (or expensive) as a Teensy.

31 July 2020 Update:

Success!  I finally got more than three VL53L0X sensors working on the same I2C bus using an Arduino controller!  However, I’m embarrassed to say that in the process, I discovered a hidden broken ground wire in one of my two I2C bus daisy chains, and this may have been causing the symptoms I was seeing with the Teensy 2-bus configuration – don’t know yet.

In any case, after repairing the wire break I got a set of six VL53L0X sensors working, consisting of the three Pololu modules I just got in, plus three of the older GY530 modules I was using on earlier Teensy-based experiments.  After that, I was able to demonstrate proper operation of the two 3-sensor arrays from my Wall-E2 robot, as shown in the following photo and Excel plot.

Sensor arrays dismounted from Wall-E2 and connected to Arduino Mega I2C bus

01 August 2020 Update:

Well, it is officially time to eat crow.  After all the whooping and hollering I’ve done, it turns out the entire problem was a hidden ground wire break in the I2C daisy-chain cable attached to the Teensy 3.5’s Wire2 I2C bus.  After repairing the break, I can now demonstrate operation of six VL53L0X sensors on two different I2C buses on the Teensy 3.5, as shown in the photo and Excel plot below.

Six VL53L0X sensors on Teensy 3.5 Wire1 & Wire2 I2C buses. Note ground wire repair on left rear connector (top right in photo)

Now that this saga has been thankfully resolved, I can get back to the original project of integrating these two sensor arrays onto Wall-E2, my autonomous wall-following robot.

08 August 2020 Update:

I believe I have finally completed the effort to integrate the VL53L0X sensor arrays onto Wall-E2, my autonomous wall-following robot.  Here’s the physical setup

Dual 3-element VL53L0X sensor arrays on top deck of Wall-E2.

Note that the USB cable to the Teensy 3.5 is temporary, just for testing.  To verify proper operation, I wrote a small program for the Mega 2560 main controller containing only the code  from the main FourWD_WallE2_V5.ino required to retrieve sensor values from the Teensy 3.5, and used this program to verify and debug the Teensy 3.5 program. As the two Excel plots below show, the main Mega 2560 controller can now retrieve distance data from all six VL53L0X sensors at once.

VL53L0X distances reported locally by the Teensy 3.5

VL53L0X distances as retrieved from the Teensy 3.5 by the Mega 2560

There are some very small differences in these two plots, which I attribute to the fact that the Teensy measurement timing and the Mega 2560 retrieval timing are asynchronous, so the Mega may be retrieving some new and some ‘old’ (in the sense that it might be 100 mSec older than the rest) measurement data.  This is insignificant operationally, and wouldn’t be evident unless this sort of simultaneous local/remote reporting was done.

A minor side note; I wound up using the GY530 ‘no name’ sensors rather than the Pololu ones because they were a) smaller, and b) already mounted on the two custom brackets I printed for them.  The Pololu sensors (along with a whole bunch of GY530’s that finally arrived from Ali Express) went into my ‘Sensors’ parts bin for the next project.  If anyone needs VL53L0X sensors, let me know! 😉

Stay tuned,

Frank

I2C Hangup bug cured! Miracle of Miracles! Film at 11!

Posted 06 July 2020

Miracle of miracles!  Arduino finally got off their collective asses and decided to do something about the well-known, well-documented, and long-ignored I2C hangup bug.  Thanks to Grey Christoforo of Oxford, England for submitting the pull request that started the ball rolling.  See this github issue thread for all the gory details.  However, in a bizarre outcome, the needed timeout feature isn’t enabled by default! You have to modify your code to add a call to a new function, like the following:
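In a minimal setup, the call looks like this:

    #include <Wire.h>

    void setup()
    {
      Wire.begin();
      // 3000 uSec timeout; 'true' ==> reset the I2C hardware when a timeout occurs
      Wire.setWireTimeout(3000, true);
    }

    void loop() {}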

Note that you have to explicitly add a timeout value (3000 in my example above) or the timeout feature will still not be enabled! The ‘true’ parameter tells the library to reset the I2C bus if a timeout is detected – surely something you will want to do.

I’m currently working on a ‘before/after’ post to demonstrate that the new timeout feature actually works with real hardware scenarios.  However, due to the intermittent nature of the I2C hangup bug, it takes a while (hours/days) to grind through enough iterations to excite the bug reliably, so it may be a while before I have a good demonstration.

One last thing; at some point the examples in C:\Program Files (x86)\Arduino\hardware\arduino\avr\libraries\Wire\examples (on my Win 10 machine) will probably be updated/expanded to show how to properly implement the new timeout feature, but this has not happened yet AFAICT.

The rest of this post describes my attempt to verify that the new timeout feature does, in fact, work as advertised.  The idea is to construct a “before-and-after” demonstration, where the ‘before’ configuration reliably hangs up using the Wire library without the timeout enabled, and an ‘after’ configuration that is identical to the ‘before’ setup except with the timeout enabled.

Before Configuration:

I actually started with a ‘before-before’ configuration using the SBWire library, as I have been working with I2C projects and the SBWire library ever since I gave up on the Arduino Wire library two years ago.  This configuration is patterned after Wall-E2, my current autonomous wall-following robot, which uses an Adafruit RTC, an Adafruit FRAM, a DFRobots MPU6050 IMU, and six VL53L0X time-of-flight proximity sensors (the ToF sensors are managed by a slave Teensy over the I2C bus).  For this test, I arranged all the I2C components on a plug board and connected to them using an Arduino Mega 2560 (the same controller I have on Wall-E2), as shown in the following photo.

From left to right; two VL53L0X ToF modules, FRAM module, DS3231 RTC module, MPU6050 IMU module

The software is a cut down version of the robot software, and in this first test all it does is print out time/date from the RTC and the relative heading value from the IMU.  After almost 13 hours, it was still running fine, as shown below:

So now I have a ‘known good’ (with SBWire) hardware configuration.  The next step is to change the software back from SBWire to Wire without the timeout implemented.  This should fail – the IMU readout should hang up within a few hours as it did before I originally switched to SBWire.

08 July 2020 Update:

After laboriously changing back from SBWire to Wire, I got the configuration shown in the following photo to work properly using the new Wire library without the new timeout feature enabled.

From left to right: MPU6050 IMU, DS3231 RTC, Adafruit I2C FRAM, and three VL53L0X ToF proximity sensors, all on the Mega 2560’s I2C bus

I programmed the Mega to access everything but the FRAM 10 times/second, and print out the results on the serial monitor, and then let it run overnight.  When I got up this morning I expected to see that it had hung up after a few hours, but discovered that it was still running fine after eight hours – bummer!  At 10 meas/sec that is 480 min * 60 sec/min * 10 = 288,000 I2C measurement cycles * 5 I2C transactions per cycle = 1,440,000 I2C transactions.  I was bummed out because it would be impossible to verify whether or not the timeout feature actually works if I couldn’t get a configuration that reliably hangs up. When I came back a few hours later, I saw that the printout to the serial monitor had stopped at around 700 minutes, but this turned out to be the monitor hanging up – not the I2C bus – double bummer.

So, I modified the program to only report results every second instead of 10/second so I wouldn’t run out of serial monitor again, and restarted the ‘before’ configuration.

10 July 2020 Update:

I added the Sunfounder 20 x 4 I2C LCD display to the setup so I could display the IMU heading and proximity sensor distances locally, as shown below

I2C Test setup with Sunfounder 20 x 4 I2C LCD added

After getting this setup running, I was trying to figure out how to definitively demonstrate I2C bus hangups without the Wire library timeout feature (the ‘before’ configuration) and then demonstrate continued operation with timeouts enabled (the ‘after’ configuration).  In an email conversation, Grey Christoforo pointed me to another poster who was doing the same thing, by using an external transistor to short one I2C line to ground under program control, thereby demonstrating that the timeout feature allowed continued operation.  This gave me the idea that manually shorting one of the I2C lines to ground should do the same thing, and would allow me to demonstrate the ‘before’ and ‘after’ configurations.

The following code snippet shows the code necessary to enable the Wire library timeout feature

Although not entirely necessary, this is how I instrumented my code to capture timeout events and display them on my serial monitor
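In essence (a sketch of the instrumentation; the reporting format here is illustrative):

    // called once per pass through loop()
    void CheckForI2CTimeout()
    {
      if (Wire.getWireTimeoutFlag()) // set by the library when a timeout occurred
      {
        Serial.print(millis());
        Serial.println(": I2C timeout detected - bus was reset");
        Wire.clearWireTimeoutFlag(); // re-arm for the next event
      }
    }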

All my other hardware setup code has been removed for clarity.  Notice, though, that I tried a number of different timeout values, starting from the default value of 25000 (25 mSec) down to 2000, and then back up to 3000.  At least in my particular configuration, the 1000 value was too small – it caused a timeout flag to be generated on every pass through the loop.  This was an unexpected result, as the SBWire library uses 100 uSec (i.e. a timeout value of 100) as its default timeout value, and this setting has always worked fine in all my I2C projects.

In any case, here’s a short video that demonstrates that the Wire library can now recover from an I2C bus traffic interruption via the use of the new timeout feature.


Stay tuned!

Lab Power Supply

Posted 30 June 2020

Almost exactly one year ago I ran across some posts regarding a very nice lab power supply regulator and display called the DPS500x power supply front-end.  My existing linear-style supply was getting a little long in the tooth, and ran out of gas pretty quickly above about 12V and 1A.  So, I decided it was time to upgrade, and wound up with a very nice, lightweight, well-performing unit.  Unfortunately, I forgot to document the project, and when I recently wanted to point someone else to this nifty product, I didn’t have anything comprehensive to point to!

So, this post is a belated documentation post for the project.

There are three ‘big’ pieces (big in terms of importance, but not in actual size or weight) used in this project: the controller head itself, an appropriate housing, and an AC/DC power supply that would fit inside the housing.

The DPS5005 50V 5A constant voltage/constant current digital controller head

This is a really neat little gadget that takes a DC power supply as input and steps it down to the desired output voltage using a highly efficient buck voltage converter technique, and applying some constant voltage/constant current magic to the output, all in a package that is maybe 40 x 60 x 30 mm.  I got mine from Banggood.com, but they are available everywhere.

DPS5005 from Banggood

A suitable housing

In this case, an aluminum two-piece housing and hardware kit custom designed to house the DPS500x series of power supply controller heads.  The upper and lower halves slide together via a tongue-and-groove arrangement, and it is VERY well done.  In addition, the housing has three sets of internal rails that make it easy to align/mount internal assemblies – NICE!

As the second photo above suggests, one suggested layout is to have an external 25-50V power supply connected to this piece, with basically nothing inside.  However, I did some research and convinced myself that I could fit a small open-frame AC/DC power supply inside the housing and wind up with a complete unit, just with lower wattage.

An AC/DC power supply

I wanted one that could fit inside the housing between the front and back plates to provide the ‘raw’ DC voltage to the controller head.  In the past I have used a number of Mean Well supplies and found them to be small, reliable, and cheap, so I started looking for a unit that could deliver 24V at 2A or greater (the most I thought I would need for a general-purpose bench-top power supply) while still fitting into the housing.  After a bit of research, I found that the Mean Well EPS-65-24 24V 2.7A open-frame power supply would do the trick nicely, and was available from Mouser for $13.80

Side view of power supply with model number shown

With all the ‘big’ parts identified and ordered, there were some smaller issues that needed to be addressed:

Custom 3D printed back panel:

Because I was planning to use an internal AC/DC power supply rather than an external one, I needed a panel-mounted AC plug instead of DC banana plugs, and I wanted an AC ON/OFF switch as well.  So using the basic dimensions and layout of the existing back panel, I designed a new one in TinkerCad to meet my needs.  I found some designs for 40mm fan grills on Thingiverse and used them to create a cutout directly into the back panel, and took the cutout dimensions for the power switch and AC plug from the manufacturer’s specs for the parts.  When I was finished, I had a nice 3D printed panel as shown below:

TinkerCad back panel design

Custom back panel installed on power supply housing

Power Supply Mounting Rails:

The housing sports three sets of internal rails that can be used for parts mounting, so I designed and printed some ‘runners’ that would attach to the bottom of the power supply and mate with the internal rail geometry, as shown below:

3D-printed runners to mate with housing internal rail structures

Tilt Stand:

The finished power supply worked great, but it was so small and flat that it was difficult to get my fingers on the controls, so I designed and fabricated a custom tilt stand, as shown below

Front view showing tilt stand. Ignore the missing screw 😉

Miscellaneous:

Mean Well AC/DC power supply AC input connector:

Mean Well AC/DC power supply DC output connector:

HP style AC cord panel-mount receptacle:

AC Power switch: KCD3 T85 16A 250VAC, 20A 125VAC.

Here’s a link to the 3D print (STL) files for the tilt stand and the custom rear panel.  If you don’t have access to a 3D printer, there will surely be someone in your local area who can print them for you.


Frank


Replacing HC-SR04 Ultrasonic Sensors with VL53L0X Arrays Part III

In the previous two posts on this subject, I described my efforts to replace the ultrasonic ‘ping’ sensors on my autonomous wall-following robot Wall-E2 with ST Microelectronics VL53L0X ‘Time-of-Flight’ LIDAR proximity sensors, with the aim of using an array of the ToF sensors to solve the problem of wall tracking at a specified offset. Part II demonstrated just such a capability, but only on the right side, and not in a way that was fully integrated with the rest of the robot operating system.

This post describes the work to fully integrate the new ‘parallel-find’ and ‘OffsetCapture/Track’ algorithms into the rest of the robot’s autonomous wall-following operating system. The robot operating system is a loop that runs every 200 mSec.  Each time through the loop the current operating mode is determined in GetOpMode(), as follows:

  • Dead Battery:  Battery died before robot could find and connect to a charging station
  • IR Homing: Robot has detected a charging station and is homing on its IR signal
  • Charging: Robot is currently charging on one of the available charging stations
  • Wall Follow:  Nothing else is going on, so continue tracking the nearest wall

In the above list, all but the last (Wall Follow) are ‘blocking’ in the sense that once they start, they run to completion without interruption.  The ‘Wall Follow’ mode processing path, however, typically makes a small adjustment to the left/right motor speeds depending on the situation and then exits back to the main loop, where GetOpMode() runs again.  So wall following involves multiple passes through GetOpMode(), so that higher-priority tasks (like dead battery recognition, IR homing, and battery charging) are executed in a timely manner.

The result of GetOpMode() is passed to a CASE statement where actions appropriate to the determined mode are executed as follows:

  • MODE_CHARGING:   This mode blocks until charging is complete
  • MODE_IRHOMING:  This mode blocks while homing to the charging station.  If the robot isn’t ‘hungry’, it attempts to go around the charging station rather than connecting to it.
  • MODE_DEADBATTERY:  This mode blocks forever
  • MODE_WALLFOLLOW: This is the default robot mode – whenever one of the above ‘special’ modes doesn’t apply. The MODE_WALLFOLLOW mode is further broken down into tracking cases as determined in GetTrackingDir(), which returns one of TRACKING_NONE, TRACKING_LEFT, TRACKING_RIGHT, or TRACKING_NEITHER.

With the replacement of the ‘ping’ sensors with the VL53L0X arrays, Wall-E2 can now accurately determine its orientation with respect to a nearby wall.  So I now believe I can make the offset capture operation a blocking call, significantly simplifying the MODE_WALLFOLLOW code.  Here is the code for the TRACKING_RIGHT case:
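In skeleton form, something like this (a sketch; the obstacle-check and tracking helpers shown are stand-in names, not the actual project functions):

    // inside the MODE_WALLFOLLOW case of the main CASE statement
    case TRACKING_RIGHT:
      if (IsObstacleAhead() || bIsStuck) // hypothetical obstacle/stuck checks
      {
        ExecDisconManeuver(); // back away / recover, then fall back to loop()
      }
      else if (!bRightOffsetCaptured) // first time through this case
      {
        CaptureWallOffset(TRACKING_RIGHT); // now a blocking call, per the above
        bRightOffsetCaptured = true;
      }
      else
      {
        TrackRightWallOffset(); // small left/right motor speed tweak, then exit
      }
      break;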

In the above code the robot first checks for obstacles, and either maneuvers to capture the desired wall offset (first time through this case) or continues to track the previously captured offset.

 
Replacing HC-SR04 Ultrasonic Sensors with VL53L0X Arrays Part II

Posted 16 June 2020

In my previous post on this subject, I described my efforts to replace the ultrasonic ‘ping’ distance sensors with modules built around the ST Microelectronics VL53L0X ‘Time of Flight’ LIDAR distance sensor to improve Wall_E2’s (my autonomous wall-following robot) ability to track a nearby wall at a specified standoff distance.

At the conclusion of my last post, I had determined that a three-element linear array of VL53L0X sensors controlled by a Teensy 3.5 was effective in achieving and tracking parallel orientation to a nearby wall. This should allow the robot to initially orient itself parallel to the target wall and then capture and maintain the desired offset distance.

This post describes follow-on efforts to verify that the Arduino Mega 2560 robot controller can acquire distance and steering information from the Teensy 3.5 sensor controller via its I2C bus.  Currently the robot system communicates with four devices via I2C – a DF Robots MPU6050 6DOF IMU, an Adafruit DS3231 RTC, an Adafruit FRAM, and the Teensy 3.2 controller for the IR Homing module for autonomous charging.  The idea here is to use two three-element linear arrays of VL53L0X modules managed by a Teensy 3.5 on its secondary I2C bus, with the Teensy itself controlled by the Mega system controller over the main I2C bus.  The Mega would see the Teensy 3.5 as just a fifth slave device on its I2C bus, and the Teensy 3.5 would handle all the interactions with the VL53L0X sensors.  As an added benefit, the Teensy 3.5 can calculate the steering value for tracking, as needed.

The first step in this process was to verify that the Mega could communicate with the Teensy 3.5 over the main I2C bus, while the Teensy 3.5 communicated with the sensor array(s) via its secondary I2C bus.  To do this I created two programs – an Arduino Mega program to simulate the main robot controller, and a Teensy 3.5 program to act as the sensor controller (the Teensy program will eventually be the final sensor controller program).

Here’s the Mega 2560 simulator program:
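In outline, it went something like this (a sketch; the slave address, request code, and data format are placeholders):

    #include <Wire.h>

    const int SLAVE_ADDR = 0x20; // Teensy 3.5's address on the main I2C bus
    const uint8_t REQ_BOTH = 2;  // 'send both 3-element datasets'

    void setup()
    {
      Serial.begin(115200);
      Wire.begin(); // Mega is the master on the main I2C bus
    }

    void loop()
    {
      // tell the Teensy which dataset we want...
      Wire.beginTransmission(SLAVE_ADDR);
      Wire.write(REQ_BOTH);
      Wire.endTransmission();

      // ...then read back six 16-bit distances (little-endian)
      uint16_t dists[6];
      if (Wire.requestFrom(SLAVE_ADDR, 12) == 12)
      {
        for (int i = 0; i < 6; i++)
        {
          dists[i] = Wire.read() | (Wire.read() << 8);
        }
        Serial.print(millis());
        for (int i = 0; i < 6; i++)
        {
          Serial.print('\t'); Serial.print(dists[i]);
        }
        Serial.println();
      }
      delay(200);
    }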

And the Teensy 3.5 program

Here’s the hardware layout for the first test:

Arduino Mega system controller in the foreground, followed by the Teensy 3.5 sensor array controller in the middle, followed by a single 3-element sensor array in the background

The next step was to mount everything  on the  robot’s second deck, and verify that the Teensy 3.5 sensor array controller can talk to the robot’s Mega controller via the robot’s I2C bus.   Here’s the hardware layout:

Right-side three sensor array mounted on robot second deck. Teensy 3.5 sensor array controller shown at middle foreground.

After mounting the array and array controller on the robot’s second deck, I connected the Teensy to robot +5V, GND and the I2C bus, and loaded the VL53L0X_Master.ino sketch into the robot’s Mega system controller.  With this setup I was able to demonstrate much improved parallel orientation detection.  The setup is much more precise and straightforward than the previous algorithm using the ‘ping’ sensors.  Instead of having to make several turns toward and away from the near wall looking for a distance inflection point, I can now determine immediately from the sign of the steering value which way to turn, and can detect the parallel condition when the steering value changes sign.

I think I may even be able to use this new ‘super-power’ to simplify the initial ‘offset capture’ phase, as well as the offset tracking phase.  In earlier work I demonstrated that I can use a PID engine to drive a stepper motor to keep the sensor array parallel to a moving target ‘wall’, so I’m confident I can use the same technique to drive the robot’s motors to maintain a parallel condition with respect to a nearby wall, by driving the steering value toward zero.  In addition, I should be able to set the PID ‘setpoint’ to a non-zero value and cause the robot to assume a stable oblique orientation with respect to the wall, meaning I should be able to have the robot drive at a stable oblique angle toward or away from the wall to capture the desired offset.

19 June 2020 Update:

As a preliminary step to using a PID engine to drive the ‘RotateToParallelOrientation’ maneuver, I modified the robot code to report the front, center, and rear distances from the VL53L0X array, along with the calculated steering value computed as (F-R)/C.  Then I recorded the distance and steering value vs relative pointing angle for 20, 30, and 40 cm offsets from my target ‘wall’.
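In code, the steering value is a one-liner (sketch):

    // steering value from the three-element array: ~0 when parallel to the wall,
    // with the sign indicating which way to turn
    float CalcSteeringVal(float front, float center, float rear)
    {
      return (front - rear) / center;
    }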

Here’s the physical setup for the experiment (only the setup for the 20 cm offset is shown – the others are identical except for the offset).

Experiment setup for 20 cm offset from target wall

Then I recorded the three distances and steering value for -40 to +40º relative pointing angles into an Excel workbook and created the plots shown below:

Array distances and steering value for 20 cm offset. Note steering value zero is very close to parallel orientation

Array distances and steering value for 30 cm offset. Note steering value zero is very close to parallel orientation

Array distances and steering value for 40 cm offset. Note steering value zero is very close to parallel orientation

The front, center, and rear distance and steering value plots all look very similar from 20-40 cm offset, as one would expect.  Here’s a combined distance plot of all three offset values.

Combined distance plots for all three offsets. Note that the ‘flyback’ lines are an artifact of the plotting arrangement

In the above chart, it is clear that the curves for each offset are very similar, so an algorithm that works at one offset should also work for all offsets between 20 and 40 cm.  The next chart shows the steering values for all three offsets on the same plot.

Steering values vs relative pointing angle for all three offset distances

As might be expected from the way in which the steering value is computed, i.e. (Front-Rear)/Center, the value range decreases significantly as the offset distance increases.  At 20 cm the range is from -0.35 to +0.25, but for 40 cm it is only -0.18 to +0.15.  So, whatever algorithm I come up with for achieving parallel orientation should be effective for the lowest range in order to be effective at all anticipated starting offsets.

To try out my new VL53L0X super-powers, I re-wrote the ‘RotateToParallelOrientation’ subroutine that attempts to rotate the robot so it is parallel to the nearest wall.  The modification uses the PID engine to drive the calculated steering value from the VL53L0X sensor array to zero.  After verifying that this works (quite well, actually!) I decided to modify it some more to see if I could also use the PID engine and the VL53L0X steering value to track the wall once parallel orientation was obtained.  So I set up a two-stage process; the parallel orientation stage uses a PID with Kp = 800 (Ki = Kd = 0), and the tracking stage uses the same PID but with Kp set to half the parallel search value.  This worked amazingly well – much better than anything I have been able to achieve to date.
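In sketch form (assuming the common Arduino PID_v1 library; the actual PID engine and tunings in the project code may differ):

    #include <PID_v1.h>

    double SteerVal, PIDOutput, Setpoint = 0; // drive the steering value to zero
    PID steerPID(&SteerVal, &PIDOutput, &Setpoint, 800, 0, 0, DIRECT); // stage 1 Kp

    double GetSteeringVal() { return 0; } // stub: (F-R)/C from the VL53L0X array

    void setup()
    {
      steerPID.SetSampleTime(200);       // match the 200 mSec robot loop
      steerPID.SetOutputLimits(-50, 50); // differential motor speed adjustment
      steerPID.SetMode(AUTOMATIC);

      // ...rotate in place until the steering value changes sign (parallel found),
      // then halve Kp for the tracking stage:
      steerPID.SetTunings(400, 0, 0);
    }

    void loop()
    {
      SteerVal = GetSteeringVal();
      if (steerPID.Compute())
      {
        // apply PIDOutput differentially, e.g. left = base + PIDOutput,
        // right = base - PIDOutput
      }
    }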

The following short video is composed of two separate ‘takes’; the first one shows a ‘parallel orientation’ find operation with Kp = 800.  The second one shows the effect of combining the ‘parallel find’ operation with Kp = 800 with a short segment of wall tracking with Kp = 400.  As can be seen in the video, the tracking portion looks for all the world as if there were no corrections being made at all – it’s that smooth!  I actually had to look through the telemetry data to verify that the tracking PID was actually making corrections to keep the steering value near zero.  Here’s the video

and here’s the output from the two-step ‘find parallel and track’ portion of the video

From the telemetry it can be seen that the ‘find parallel’ stage results in the robot oriented parallel to the wall at about 360 mm or so.  Then in the ‘track’ stage, the ‘Steer’ value is held to within a few hundredths (0.01 to 0.07 right at the end), and the offset distance stays practically constant at 361 to 369 mm – wow!

The results of the above experiments show without a doubt that the three-element VL53L0X linear array handles parallel orientation and wall tracking operations quite accurately – much better than anything I’ve been able to do with the ‘ping’ sensors.

The last step in this process has yet to be accomplished – showing that the setup can be used to capture a desired offset after the parallel find operation but before the wall tracking stage. Based on the work to date, I think this will be straightforward, but…..

21 June 2020 Update:

I wasn’t happy with the small range of values resulting from the above steering value computation, especially the way the value range decreased for larger offsets.  So, I went back to my original data in Excel and re-plotted the data using (Front-Rear)/100 instead of (Front-Rear)/Center.  This produced a much nicer set of curves, as shown in the following plot

Steering values vs relative pointing angle using (Front-Rear)/100 instead of (Front-Rear)/Center

Using the above curve, I modified the program to demonstrate basic wall offset capture, as shown in the following short video

The video demonstrates a three stage wall offset capture algorithm, with delays inserted between the stages for debugging purposes.  In the first stage, a PID engine is used with a very high Kp value to rotate the robot until the steering value from the VL53L0X array is near zero, denoting the parallel condition.  After a 5 second delay, the PID engine Kp value is reduced to about 1/4 the ‘parallel rotate’ value, and the PID setpoint is changed to maintain a steering value that produces an approximately 20º ‘cut’ toward the desired wall offset value.  In the final stage, the robot is rotated back to parallel, and the robot is stopped.   In the above demonstration, the robot started out oriented about 30-40º toward the wall, about 60-70 cm from the wall.  After the initial parallel rotation, the robot is located about 50 cm from the wall.  In the offset capture stage, the robot moves slowly until the center VL53L0X reports about 36 cm, and then the robot rotates to parallel again, and stops the motors.  At this point the robot is oriented parallel to the wall at an offset that is approximately 28 cm – not quite the desired 30 cm, but pretty close!

The next step will be to eliminate the inter-stage pauses, and instead of stopping after the third (rotate to parallel) stage, the PID engine will be used to track the resulting offset by driving the VL53L0X array steering value to zero.

23 June 2020 Update:

After nailing down the initial parallel-find, offset capture  and return-to-parallel steps, I had some difficulty getting the wall tracking portion to work well. Initially I was trying to track the wall offset using the measured distance after the return-to-parallel step, and this was blowing up pretty much regardless of the PID Kp setting.  After thinking about this for a while, I realized I had fallen back into the same trap I was trying to escape by going to an array of sensors rather than just one – when the robot rotates due to a tracking correction, a single distance measurement will always cause problems.

So, I went back to what I knew really worked – using all three sensor measurements to form a ‘steering value’ as follows:
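Reconstructed in code (a sketch; the exact scaling of the difference term is an assumption):

    // tracking-stage steering value: the front/rear difference adds to or
    // subtracts from the center distance reading
    float CalcTrackingSteerVal(float front, float center, float rear)
    {
      return center + (front - rear); // just the Center distance when parallel
    }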

To complete the setup, the PID setpoint is made equal to the measured center sensor distance at the conclusion of the ‘return-to-parallel’ step.  When the robot is parallel to the wall, Front – Rear = 0, so the Steering value is just the Center distance, as desired, and the PID output will drive the motors to maintain the offset setpoint.  When the robot rotates, the front and rear sensors develop a difference which either adds to or subtracts from the center reading in a way that forms a reliable ‘steering value’ for the PID.

Here are a couple of short videos demonstrating all four steps; parallel-find, approach-and-capture, return-to-parallel, and wall-tracking.  In the first video, the PID is set for (5,0,0) and it tracks pretty nicely.  In the second video, I tried a PID set for (25,0,0) and this didn’t seem to work as well.

At this point I’m pretty satisfied with the way the 3-element VL53L0X ToF sensor array is working as a replacement for the single ultrasonic ‘ping’ sensor.  The robot now has the capability to capture and track a specific offset from a nearby wall – just what I started out to do lo these many months ago.

25 June 2020 Update:

Here are a couple of short videos in my ‘outdoor’ (AKA entry hallway) range. The first video shows the response using a PID of (25,0,0) while the second one shows the same thing but with a PID value of (5,0,0).

The following Excel plot shows the steering value (in this case, just Fdist – Rdist), the corresponding PID response, and the center sensor distance measurement

I Googled around a bit and found some information on PID tuning, including this blog post.  I tried the recommended heuristic method, and wound up with a PID tuning set of (10,0,1), resulting in the following Excel plot and video


Stay tuned!

Frank