Tag Archives: Garmin LIDAR-Lite V4/LED

WallE3 Doesn’t Like Reflective Surfaces

Posted 08 December 2023

WallE3 went with us last month when we travelled to St. Louis for Thanksgiving with family, and I showed off his autonomous wall-following skills. WallE3 actually did great for quite a while – that is, until he found himself staring at the side of a floor-mounted wine cooler (wine ‘safe’?). As can be seen in the following short video, WallE3 fell in love with the cooler, and showed his love by repeatedly head-butting it – oops!

After looking at the telemetry data for the run, I saw that WallE3 was measuring much larger front distances – like several hundred centimeters – when it was only a few centimeters from the object. It appears he was backing up to a defined front distance in response to a ‘WALL_OFFSET_DISTANCE_AHEAD’ anomaly, but somehow convinced himself that instead of 20cm from the wall, he was actually more like 100cm away. Of course, since he wanted to be at 30cm (the desired wall offset distance), he drove forward to lessen the distance, thereby bonking into the wall. Then, when he hit the wall, the front LIDAR line-of-sight geometry changed enough to produce a true measurement of just a few centimeters, which then sent WallE3 running backwards to open up the distance. Lather, rinse, repeat. Here’s an Excel plot of a representative run (not the exact data – I somehow lost the actual telemetry for this run).

Representative reflective surface ‘headbutt’ telemetry

As can be seen from the above plot, the measured distance oscillates between the maximum measurable distance of 1000cm and nearly zero. Here’s a photo of the experimental setup that produced the above data.

WallE3 and a reflective surface

The ‘reflective surface’ is a piece of glossy black translucent plastic, oriented at an angle to reflect WallE3’s LIDAR beam upward to the ceiling and then the reflected signal from the ceiling back to WallE3.

I’ve been thinking about this issue ever since first seeing it in St. Louis, but hadn’t come up with any firm ideas about how to solve it. I tinkered with the idea of generating two running averages of the front distance when in the ‘MoveToDesiredFrontDistCm()’ function, with the two averages separated in time by some amount. When approaching a normal non-reflective surface, the two averages would closely track each other, but when approaching a reflective surface that produced the above dramatic distance shifts, then the two averages would be dramatically different around the transitions. This could then be detected, and something done to recover. Then last night while falling asleep, I wondered whether or not the STMicro VL53LXX infra-red LIDAR sensors would have the same problem – hmm, maybe not! If that were the case, then I could probably run one in parallel with the Garmin LIDAR unit, and use it instead of the Garmin for all ‘MoveToDesiredFront/RearDistCm()’ calls.
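
Just to capture the dual running-average idea before it evaporates, here’s a minimal sketch of the concept (names, window size, and delay are all illustrative – this was just an idea, not WallE3 code):

```
// Sketch only - two running averages of the front distance, separated in time.
// Near a reflective-surface transition the two averages diverge sharply.
#include <math.h>

const int N = 5;               // points in each running average (illustrative)
const int DELAY_SAMPLES = 10;  // separation between the two windows (illustrative)
const int BUF_SIZE = N + DELAY_SAMPLES;

float buf[BUF_SIZE];           // circular buffer of recent front distances
int head = 0;                  // index of the next write

void AddFrontDist(float distCm)
{
    buf[head] = distCm;
    head = (head + 1) % BUF_SIZE;
}

// average of the N samples ending 'ago' samples before the newest one
float AvgEndingAgo(int ago)
{
    float sum = 0;
    for (int i = 0; i < N; i++)
    {
        int idx = (head - 1 - ago - i + 2 * BUF_SIZE) % BUF_SIZE;
        sum += buf[idx];
    }
    return sum / N;
}

bool ReflectiveSurfaceSuspected(float thresholdCm)
{
    // normal surfaces: the two averages track closely; mirrors: big divergence
    return fabsf(AvgEndingAgo(0) - AvgEndingAgo(DELAY_SAMPLES)) > thresholdCm;
}
```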

I tried this experiment using the currently installed rear distance sensors by calling ‘MoveToDesiredRearDistCm()’ with the same reflective surface setup as before, as shown in the following photo:

‘MoveToDesiredRearDistCm()’ setup with reflective surface

Here’s an Excel plot of the rear distance run:

MoveToRearDistCm() with 30cm target and reflective surface

As can be seen in the above plot, the rear distance run was completely normal, so the VL53LXX infra-red LIDAR sensors don’t have the same problem – at least not with this translucent glossy plastic material.

10 December 2023 Update:

I got to thinking that maybe the reason the rear distance sensor worked so well with the shiny black material is that it could be IR transparent, meaning that while the front sensor (an LED LIDAR system) would see a ‘mirror’, the rear sensor would just see the toolbox. So, I jumped up on Amazon and got a cheap mirror square so I could answer that question. Here is a short video and Excel plot showing a ‘MoveToDesiredFrontDistCm()’ run with the new mirror square.

As can be seen from the video, WallE3 was perfectly happy to drive right through the mirror, but I had visions of mirror pieces all over the bench, the floor, and me, so I manually prevented that from happening. Here’s an Excel plot showing the same run:

MoveToDesiredFrontDistCm() approaching tilted mirror

As the Excel plot shows, the robot did OK for the first 20 measurements (about 1sec), but immediately thereafter started reporting a steady 200cm, and it stayed that way until I stopped it at about 2sec.

Then I tried the same experiment, but this time utilizing the STMicro VL53LXX IR LIDAR sensor on the rear of the robot, as shown in the following short video:

As the video shows, the IR LIDAR behaved pretty much the same as the LED LIDAR (no real surprise, as they are both LIDAR technology) – but still a bummer!

Here’s the Excel plot for this run:

As the Excel plot shows, the distance decreased monotonically for the first 30 points (about 1.5sec), but then shot up to 200cm due to the mirror. I believe the lower distances after about point 40 (2sec) were due to me interfering with the IR beam.

So, unfortunately my theory that the rear IR sensor did better with the shiny black plastic ‘mirror’ because the plastic is transparent at that wavelength seems to have been borne out, and I now think that using the VL53LXX sensor instead of the Garmin LIDAR LED sensor for ‘MoveToFrontDistCm()’ operations is NOT going to work. Back to the drawing board :(.

23 December 2023 Update:

I’ve been thinking about this problem for a while now, and have not come up with a good answer; WallE3’s perception of the world around it depends entirely on LIDAR-type distance sensors, so if those sensors produce ‘false’ distance reports due to mirror or mirror-ish surfaces, WallE3 has no way to know the ‘true’ distance – bummer!

So, what I decided to do is to simply detect the symptoms of the ‘mirrored surface’ situation, halt the robot and yell for help. The detection algorithm uses the knowledge that when the robot is moving forward toward a ‘normal’ object, the front distance should decrease monotonically with time. Similarly when the robot is moving backwards toward a ‘normal’ object, the rear distance should decrease monotonically with time. Conversely, when approaching a ‘mirror-like’ object, the measured distance tends to be unstable, with distance increasing instead of decreasing.

The detection algorithm calculates a three-point average each time through the action loop, and compares the result to the last time through the loop. If the new average is greater than the old average, a ‘mirrored surface detection’ is declared and the robot calls ‘YellForHelp()’ which stops the motors and emits an audible Morse code ‘SOS’.

Here’s the ‘mirrored surface’ check at the heart of the ‘DoOneMoveToFrontDistCm()’ function – the function that actually moves the robot and checks for the ‘mirrored surface’ error condition.
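
In sketch form (the motor/PID handling is elided; ‘GetFrontDistCm()’ and ‘YellForHelp()’ are real WallE3 functions, the rest is illustrative):

```
#include <Arduino.h>

// Sketch of the detection logic described above - not WallE3's actual listing.
float GetFrontDistCm();   // existing WallE3 function (compensated front LIDAR distance)
void  YellForHelp();      // existing WallE3 function - stops motors, emits Morse 'SOS'

bool DoOneMoveToFrontDistCm(uint16_t targetDistCm)
{
    static float distHist[3] = {0};   // last three front distance readings
    static int   histIdx = 0;
    static float lastAvg = 9999;      // init high so the first pass can't trigger

    distHist[histIdx] = GetFrontDistCm();
    histIdx = (histIdx + 1) % 3;
    float newAvg = (distHist[0] + distHist[1] + distHist[2]) / 3.0f;

    // closing on a 'normal' target, the average should decrease monotonically,
    // so an increase from one pass to the next is declared a mirrored surface
    if (newAvg > lastAvg)
    {
        YellForHelp();   // stop the motors and sound the audible Morse 'SOS'
        return false;
    }
    lastAvg = newAvg;

    // ... normal speed adjustment toward targetDistCm elided ...
    return true;
}
```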

02 January 2024 Update:

I wasn’t really happy with my previous attempt at ‘mirrored surface’ detection, as it seemed pretty likely to produce false positives. After thinking about the problem some more, I thought I might be able to use a distance variance calculation as a more robust detection method. The idea is that a normal monotonic increase or decrease in distance measurements would have a pretty low variance, while a distance reversal would generate a much larger value.

So, I ran some simulations in Excel using a 5-point running variance calculation, and the results were encouraging. Then, with the help of my lovely lab assistant, I set up an experiment in my office sandbox to see if I could capture a representative mirrored surface ‘screwup’, as shown below:

And here is an Excel plot showing the results (both the distance and 5-pt variance vertical scales have been truncated to show the smaller scale variations)

Front Distance and 5-pt Variance (truncated vertical scales for better visibility)

As can be seen in the video and the Excel plot, the robot undergoes a number of distinct front/back oscillations, but then eventually settles down. It is clear from the plot that the 5-pt variance calculation is a good indicator of a ‘mirrored surface’ condition. Just looking at the plot, it appears that a variance threshold of 40-60 should provide robust detection without much risk of false positives.

One other note about this experiment: to get multiple oscillations as shown in the video and plot, the mirrored surface had to be slanted slightly up toward the ceiling. If the surface was oriented vertically like a normal wall, the robot would often miss the desired stopping distance and hit the wall, but then would typically back off to the correct distance. I think this indicates that in the vertical configuration, there is enough backscatter from the floor and/or the robot itself to get a reasonable (if not entirely accurate) LIDAR distance measurement.

07 January 2024 Update:

After playing around some more with this issue, I think the above 5-pt variance calculation for ‘mirrored surface’ detection will work, so I revised my current well-tested ‘CalcBruteFrontDistArrayVariance()’ function to take an integer argument denoting the number of elements to use, starting at the end (most recent data) and working backwards.
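
A minimal sketch of the revised function, assuming the usual brute-force approach (mean first, then mean of squared deviations) over just the last N entries of the 100-point front-distance buffer (the buffer name is illustrative):

```
#include <Arduino.h>

const int FRONT_DIST_ARRAY_SIZE = 100;           // per the actual 100-point array
extern float aFrontDist[FRONT_DIST_ARRAY_SIZE];  // illustrative name; newest data last

float CalcBruteFrontDistArrayVariance(uint16_t N)
{
    int start = FRONT_DIST_ARRAY_SIZE - N;       // use only the most recent N points

    float sum = 0;
    for (int i = start; i < FRONT_DIST_ARRAY_SIZE; i++) sum += aFrontDist[i];
    float mean = sum / N;

    float sumSq = 0;
    for (int i = start; i < FRONT_DIST_ARRAY_SIZE; i++)
    {
        float d = aFrontDist[i] - mean;
        sumSq += d * d;
    }
    return sumSq / N;                            // population variance of last N points
}
```

With this in place, the 5-point value used below is just ‘CalcBruteFrontDistArrayVariance(5)’, compared against the detection threshold.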

Then I ran another test in my lab, but this time on a non-mirrored surface, to verify that the 5pt variance calc would still work properly and to settle on a good threshold for ‘mirror’ detection. The Excel plot below shows the results of a run where the robot moved backwards (still using the front LIDAR sensor) to 90cm from the wall.

Running 5-point variance of front distances

As can be seen, the variance starts out in the 10 to 20 range during the initial ‘coarse’ distance movement, but then drops into the 0 to 5 range during the second ‘fine tuning’ movement. Comparing this to the previous ‘mirrored surface’ plot leads me to believe that a threshold of 40 would almost certainly (eventually) detect a ‘mirrored surface’ condition.

24 January 2024 Update:

I added the ‘CalcFrontNPointVar(uint16_t N)’ function to the robot code so I could obtain the front variance using just the last N front distances instead of the entire 100-point array. This turned out to work very well for detecting the ‘Mirrored Surface’ anomaly condition. Then I added ‘ANOMALY_MIRRORED_SFC’ to the list of anomaly codes, and added a ‘case’ block in ‘HandleAnomalousConditions()’ to deal with this condition. The handler is shown below.
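
In sketch form (the anomaly-code names come from the actual code as described in these posts; the enum layout and print statement are illustrative):

```
#include <Arduino.h>

void RunToDaylightV2();   // existing WallE3 function - 360 deg 'best direction' search

enum AnomalyCode { ANOMALY_NONE, OPEN_DOORWAY,
                   ANOMALY_TRACKING_WRONG_WALL, ANOMALY_MIRRORED_SFC };

void HandleAnomalousConditions(AnomalyCode code)
{
    switch (code)
    {
    case ANOMALY_MIRRORED_SFC:
        Serial.println("Mirrored surface detected - running to daylight");
        RunToDaylightV2();   // search for the best direction in which to move next
        break;

    // ... the other anomaly cases are elided ...
    default:
        break;
    }
}
```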

At the moment, all it does is call the ‘RunToDaylightV2()’ function, which does a 360º search for the best direction in which to move next. Here’s the telemetry and a short video showing the action.

This experiment led to one of those “Well, DUH!!” moments, as the robot’s preferred ‘RunToDaylight()’ heading was right back at the mirrored surface!

After recovering from my ‘face-palm’ moment, I realized I needed to modify the ‘RunToDaylight()’ function to take heading-start & heading-end parameters to exclude the mirrored-surface sector from the ‘RunToDaylight()’ search for an appropriate recovery heading.

28 January 2024 Update:

I modified ‘RunToDaylightV2()’ to take two float parameters denoting the start and ending headings for the search. In the normal non-mirrored-surface case, ‘RunToDaylightV2()’ is called with no arguments. The no-argument overload simply calls ‘RunToDaylightV2(startHdg, endHdg)’ with startHdg = endHdg. In the mirrored-surface case, the two-parameter version of ‘RunToDaylightV2()’ is called from the ‘ANOMALY_MIRRORED_SFC’ case block of ‘HandleAnomalousConditions()’.
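
In sketch form (parameter names and the heading helper are illustrative; the overload relationship is as described above):

```
float GetIMUHeadingDeg();   // illustrative - returns WallE3's current heading

// two-parameter version: search only the sector from startHdgDeg to endHdgDeg.
// Called from the ANOMALY_MIRRORED_SFC case block with a sector that
// excludes the mirrored surface.
void RunToDaylightV2(float startHdgDeg, float endHdgDeg)
{
    // rotate from startHdgDeg to endHdgDeg recording front distance vs heading,
    // then turn to the heading with the most 'daylight' ... details elided
}

// no-argument overload for the normal (non-mirrored-surface) case:
// startHdg == endHdg, i.e. the search wraps the full 360 deg.
void RunToDaylightV2()
{
    float curHdg = GetIMUHeadingDeg();
    RunToDaylightV2(curHdg, curHdg);
}
```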

Here’s a short video and the telemetry from a test run toward a mirrored surface.

Stay tuned!

Frank

Garmin LIDAR-Lite V4/LED Distance Compensation

Posted 20 September 2023

Last February I changed out the Pulsed Light ‘Blue Label’ LIDAR system on WallE3 for the newer and better Garmin LIDAR-Lite V4 LED time-of-flight distance sensor (see this post and this post), and it has been working very well for me since then. However, I recently had occasion to re-visit my ‘MoveToDesiredForwardDistance()’ function, and found that WallE3 wasn’t doing a very good job of stopping where it should; it was often stopping 8-10cm farther away from the wall than desired.

Going back through the above two posts from last February, I found that this was something I had noticed at the time, but hadn’t worried about as precise front distances are generally not needed. However, precise front distances are required for the ‘MoveToDesiredForwardDistance()’ operation, so it is time to think about distance compensation, at least for distances less than 1m.

I started by redoing the distance measurement experiment shown in my initial study, as shown in the photo below:

Distance Compensation Measurement Setup

This produced the following plot:

Then I took the average from each of the above ‘stairsteps’ and plotted that against the actual distances, arriving at the following plot:

Avg LIDAR Distance, Meas Distance, and Calculated Compensated Distance

In the above chart, I used Excel’s ‘trendline’ tool to display the ‘best fit’ linear equation for both the plot of average values and the plot of actual distances. Of course the equation for the actual distances is exact, but the average values show a consistent too-high measurement.

Then I created a compensation function to transform the raw LIDAR values into actual distance values, and came up with y_comp = 0.987*(y_meas – 4.77), and plotted this line (shown in gray above). This compensation works very well for values down to about 40cm, but not so well below that. A separate compensation equation may be required for that region.

21 September 2023 Update:

I modified WallE3’s ‘GetFrontDistCm()’ function to include the above compensation expression and re-ran the test from yesterday, with the following result:
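
The change amounts to wrapping the raw Garmin reading with the trendline-derived correction – a minimal sketch, with the raw-read helper name assumed:

```
float GetRawFrontLIDARCm();   // illustrative - raw Garmin LIDAR-Lite V4 reading

// apply the linear compensation derived from the trendline fit above;
// valid down to about 40cm per the measurement plot
float GetFrontDistCm()
{
    float rawCm = GetRawFrontLIDARCm();
    return 0.987f * (rawCm - 4.77f);
}
```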

Compensated LIDAR Distance Measurements

As can be seen in the above plot, the compensated distance measurements track the actual distances quite nicely. This should result in the ‘MoveToDesiredFrontDistCm()’ function behaving much better than it did before front distance compensation.

Stay Tuned!

Frank

More ‘WallE3_Complete_V2’ Testing

Posted 13 March 2023,

Now that I have the new Garmin LIDAR installed, it’s time to return to ‘real world’ testing. Here’s a short video of a run starting in our entry hallway and proceeding past two open doorways into our dining/living area:

As can be seen in the video, WallE3 handles the oblique turn to the left and the two open doorways perfectly, but then loses its way when (I think) the right-hand wall disappears entirely. Not sure what happened there. Here’s the telemetry from the entire run:

Here’s the Excel plot of the run up to the point where the first open doorway is detected. Note this also covers the oblique turn. From the video, the oblique turn occurs at about six seconds from the start, which should put it somewhere in the 20,000 mSec area. However, comparing the video and the plot, it looks like this turn actually occurs at about 21,500 – 22,000 mSec, where the distance to the right-hand wall falls from the max 200 to about 60-70 cm. The turn itself goes very smoothly, and then the first open doorway condition occurs about four seconds later, at 26,200 mSec.

Segment from start to the first ‘open doorway’ detection

This next plot shows the period from the time of the first ‘open doorway’ detection to the end of the run, including the time where the robot passes the second open doorway (apparently without detecting it).

Segment from the first ‘open doorway’ detection to the end of the run

The left-hand wall distance starts out at 200 (max distance) due to the first open doorway. During this period the robot is tracking the right-hand wall (kitchen counter). When the left-hand distance returns to normal at about 27,500 mSec, the robot continues to track the right-hand wall. Apparently the very short section of wall between the first and second doorways wasn’t enough to trigger the ‘open doorway’ condition. However, when the left-hand wall distance comes back down again after the second open doorway (at approximately 30,400 mSec), the robot should have reverted to left-wall tracking, but it obviously didn’t. Soon thereafter both the left and right-hand distances started increasing to max values and the poor little robot lost its way – so sad!

After looking through the data and the code, I began to see that although the code could detect the ‘open doorway’ condition, it wasn’t smart enough to detect the end of the ‘open doorway’, so the robot continued to track the right-hand wall – “to infinity and beyond!”. After making some changes to fix the problem, I made a run in my test range (aka my office) to test the changes. Here’s a photo of the setup:

‘Tracking Wrong Wall’ bugfix test setup

The test walls are set up so the ‘wrong side’ wall starts before the open doorway and ends after it, and the distance between the two walls was set such that the measured distance to either side would be less than MAX_TRACKING_DISTANCE_CM (100 cm).

Here’s the telemetry from the run, with an additional column added to show the current ANOMALY CODE:

The robot starts the run tracking the left wall at the default offset (40cm), with an anomaly code of ANOMALY_NONE. This continues until the 19 sec mark (19,039 mSec), where the robot detects the ‘open doorway’ condition. This causes the code to re-assess the tracking condition, and it decides to track the right wall, starting at about 19,181 mSec.

During this segment, the AnomalyCode value is OPEN_DOORWAY. This continues until about 20,890 mSec, where the ‘Tracking Wrong Wall’ condition is detected. Note that the actual/physical ‘open doorway’ condition ended at about 20,389 mSec, when the left-side distance changed from 200 (max) to 67 cm, but it took another 0.5 sec for the algorithm to catch the change.

The ‘Tracking Wrong Wall’ detection caused the robot to once again re-assess the tracking configuration, whereupon it changed back to left-wall tracking at 21,027 mSec with the new anomaly code of ‘ANOMALY_TRACKING_WRONG_WALL’. Left side wall tracking continues until the run is terminated at 23,235 mSec. Note that the right-hand wall stops at about 22,235 mSec and the right side distance measurement goes to 200 (max) cm and stays there for the rest of the run.

Looking at the above, I believe the fixes I implemented were effective in addressing the ‘wandering robot syndrome’ I observed on the previous run. Next I will remove the debugging printout code, clean things up a bit, and then repeat the last ‘real-world’ run from before.

10 May 2023 Update:

I was definitely having problems with the ‘Open Doorway’ condition, so I wound up back in my ‘indoor range’ (AKA my office) to see if I could work through the issues. It turned out I was not detecting the onset or end of the ‘Open Doorway’ condition properly. I made some changes to the code and to the telemetry output to more thoroughly describe the action, and then ran the test again. The short movie and the telemetry output show the results:

230510 Open Doorway Run

Salient points in the video and telemetry printout:

  • WallE3 captures and then tracks the desired 30cm offset with pretty decent accuracy up until 3.8sec, where it encounters the ‘open doorway’ on the left. This results in an ‘OPEN_DOORWAY’ anomaly report, which causes the TrackLeftWallOffset() function to exit, which in turn causes the program to start over at the top of loop().
  • The ‘top of loop’ code reevaluates the tracking condition, and because the left distance is well over 100cm and the right distance is about 50cm, it decides to track the right wall instead (see the wall-selection sketch after this list).
  • The right wall is tracked from 4.0 to 6.2 sec (where the right wall ends), at which point WallE3 again detects an ‘Open Doorway’ condition, which forces the loop() function to restart. This time the right distance is about 143cm and the left distance is about 42cm, so the code chooses the left wall for tracking.
  • Left wall tracking continues from 6.4 to 8.7sec – the end of the run.
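
Here’s a minimal sketch of the wall-selection logic implied by the two re-evaluations above. Only MAX_TRACKING_DISTANCE_CM comes from the actual code; the function/enum names are illustrative, and the both-walls-in-range tie-break (pick the nearer wall) is an assumption on my part:

```
// Sketch of the 'top of loop' tracking re-assessment (illustrative names).
const float MAX_TRACKING_DISTANCE_CM = 100.0f;

enum TrackingSide { TRACK_LEFT, TRACK_RIGHT, TRACK_NEITHER };

TrackingSide ChooseTrackingSide(float leftDistCm, float rightDistCm)
{
    bool leftOK  = (leftDistCm  < MAX_TRACKING_DISTANCE_CM);
    bool rightOK = (rightDistCm < MAX_TRACKING_DISTANCE_CM);

    if (leftOK && !rightOK) return TRACK_LEFT;   // e.g. left=42cm, right=143cm
    if (rightOK && !leftOK) return TRACK_RIGHT;  // e.g. left>100cm, right~50cm
    if (leftOK && rightOK)                       // both in range: pick the nearer wall
        return (leftDistCm <= rightDistCm) ? TRACK_LEFT : TRACK_RIGHT;
    return TRACK_NEITHER;                        // both sides open - 'open doorway' case
}
```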

Stay tuned,

Frank

Integrating Garmin LIDAR-Lite V4/LED into WallE3

Posted 22 February 2023,

After thoroughly (I hope) investigating the performance of Garmin’s new LIDAR-Lite V4/LED, I’m now in the process of replacing my current Pulsed Light LIDAR with the Garmin unit. Integrating this into the current system is not a simple pull-and-replace operation. The Pulsed Light unit uses a digital control line to trigger a measurement, and the Arduino pulseIn() statement to compute a distance. The Garmin uses I2C, so I’ll need to find a free (or at least one that’s not too busy) I2C port for this purpose. Currently my system architecture looks like this (with I2C ports highlighted):

Main system schematic with I2C ports highlighted
Sensor array schematic with I2C ports highlighted.

As it stands, all three of the available I2C ports on the VL53L0X Array Teensy are in use, and two of the three are in use on the main system Teensy. However, I2C port 2 (pins 3 & 4) isn’t being used for I2C, and both lines are already routed to WallE3’s second deck. Even better, one of those lines (pin 3/purple) already routes to the Pulsed Light LIDAR, so it would become available when the Pulsed Light unit is removed. The other line (pin 4/orange) goes to the VL53L0X Teensy’s ‘Reset’ line, so a replacement for this would have to be found. There are still a number of free pins on the main system Teensy, and a number of free pins available on the inter-deck connector, so this should work. I had thought of replacing the VL53L0X arrays with single VL53L5CX units (see this post for the details), putting both VL53L5CXs on a single I2C port, and then using the now-free I2C port for the Garmin LIDAR, but I would much rather keep these two projects separate from each other if I can :).

One minor fly in the ointment is that I don’t want to use the Garmin LIDAR I got from Sparkfun, as this unit came with the ‘Qwiic’ breakout board already attached. While this is a great setup for testing, I don’t need the extra Sparkfun board hanging off the end of the LIDAR unit for the installation on the robot. So, I got a new Garmin without the added board directly from Garmin, and the first step in this project is to make sure this unit actually works.

OK, I can now check the box that says “new Garmin LIDAR-Lite V4/LED unit works in my test setup”. Interestingly though, the undocumented (at least as far as I can tell) feature of the ‘heartbeat’ indicator inside one of the lenses is a bit less visible on this new unit (03/02/23 update – in response to my emailed query, Garmin said that undocumented feature is indeed intended to be a ‘heartbeat’ indicator).

Now the next step will be to remove the Pulsed Light unit from WallE3, install the Garmin unit, and make the wiring changes noted above.

23 February 2023 Update:

After pulling the 2nd deck off the robot and physically inspecting the inter-deck connector wiring, I was able to determine that there were three free lines available to replace (or act as) the VL53L0X Teensy reset line.

02 March 2023 Update:

After making the above hardware changes, here is the new system schematic:

It was at this point that I discovered that the Garmin library for this device doesn’t support I2C ports other than ‘Wire’ (Wire0). After dicking around for a while, I looked again at the Sparkfun library and saw that it does have multiport support – yay! So, after dicking around some more, I got the Garmin device working again on I2C port 3 (Wire2) with the Sparkfun library, using my little Teensy 3.5 plugboard setup as shown below:

Note the Garmin LIDAR connected to pins 3/4 (SCL2/SDA2)
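
For reference, the plugboard test program boiled down to something like this (a sketch – the ‘begin(address, port)’ overload and ‘getDistance()’ are my reading of the Sparkfun library’s API, so check the library header for the exact signatures):

```
#include <Wire.h>
#include "LIDARLite_v4LED.h"   // Sparkfun library (same filename as the Garmin one!)

LIDARLite_v4LED myLIDAR;

void setup()
{
    Serial.begin(115200);
    Wire2.begin();   // Teensy 3.5 pins 3/4 = SCL2/SDA2

    // default 7-bit address 0x62, but on Wire2 instead of the default Wire
    if (!myLIDAR.begin(LIDARLITE_ADDR_DEFAULT, Wire2))
    {
        Serial.println("Garmin LIDAR-Lite V4 not found on Wire2 - check wiring");
        while (1);   // hang here
    }
}

void loop()
{
    Serial.println(myLIDAR.getDistance());   // distance in cm
    delay(100);
}
```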

However, somewhere in the change-over from the Garmin to the Sparkfun library, I got all screwed up and couldn’t make sense of the results. So I decided to ‘go back to baseline’ with the ‘Fast’ example from Garmin. Unfortunately this turned out to be less than simple. Both the Garmin and Sparkfun libraries use the same .h/.cpp filenames, so having both installed posed a problem, which I ‘solved’ by removing the Garmin libraries from the ‘libraries’ folder. So, when I decided to revert to the Garmin ‘Fast’ example (which references the Garmin library, of course), I got the Sparkfun libraries instead, and down the rabbit hole I went – again.

I eventually figured all this out, but it took my entire evening to do it. I wound up:

  • starting an entirely new VS/VM project called ‘Garmin Fast Example’
  • copy/pasted the Garmin ‘Fast’ example .ino file into the blank project file
  • copied the Garmin library .cpp/.h files into the local project folder, and changed their names to make sure they couldn’t be confused with the Sparkfun versions
  • ‘add’ed the .cpp/.h files to the project in Solution Explorer
  • Edited the ‘Fast’ example #include line to match the renamed .cpp/.h filenames
  • made the required edits to the .ino file to get a more informative readout
  • Fixed the inevitable screwups.

At this point I had a brand-new ‘Fast’ example file, properly instrumented, which produced the following Excel chart:

So, after blowing the entire evening, I’m finally confident that I once again have a working ‘Fast’ example using the Garmin library as a baseline, so now I can continue the adventure by moving to the Sparkfun library so I can move the LIDAR from Wire0 (the default I2C port) to Wire2 where I need it to integrate with WallE3.

Before switching over to the Sparkfun library, I decided to make a couple more runs for a better understanding of the ‘HIGH ACCURACY’ mode. In the above example ‘0’ is written to the HIGH ACCURACY register (0xEB) to turn it off completely. However, intermediate values from 1 to 20 can also be used. Here’s a run at 0xEB = 10 (0x0A):

‘HIGH ACCURACY’ register set to 10

And another run with 0xEB = 0x05:

‘HIGH ACCURACY’ register set to 5

And one more with the register set for 2:

‘HIGH ACCURACY’ register set to 2

Looking at the above, it seems that a value of ‘5’ should work. This produces a measurement time of about 150-170mSec for a 7m distant target, which is about the most I can reasonably expect to encounter, and this would work fine with a 200mSec front distance cycle time. Or, I could go with a value of ‘2’, suffer a bit less repeatability (not really an issue for the robot) and keep the measurement times well under 100mSec. This would be nice, as then I could use a single measurement interval for all distance measurements.
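
Here’s a register-level sketch of what this boils down to, assuming the register map from the Garmin V4 operation manual (0x00 = ACQ_COMMANDS, 0x01 = STATUS with bit 0 as the busy flag, 0x10/0x11 = distance low/high bytes, plus the 0xEB high-accuracy register discussed above):

```
#include <Wire.h>

const uint8_t LIDAR_ADDR = 0x62;   // Garmin LIDAR-Lite V4/LED factory default address

void writeReg(uint8_t reg, uint8_t val)
{
    Wire2.beginTransmission(LIDAR_ADDR);
    Wire2.write(reg);
    Wire2.write(val);
    Wire2.endTransmission();
}

uint8_t readReg(uint8_t reg)
{
    Wire2.beginTransmission(LIDAR_ADDR);
    Wire2.write(reg);
    Wire2.endTransmission(false);                 // repeated start
    Wire2.requestFrom(LIDAR_ADDR, (uint8_t)1);
    return Wire2.read();
}

uint16_t takeDistanceCm()
{
    writeReg(0x00, 0x04);                         // ACQ_COMMANDS: trigger a measurement
    while (readReg(0x01) & 0x01) { }              // STATUS: wait for busy bit to clear
    return (uint16_t)readReg(0x10) | ((uint16_t)readReg(0x11) << 8);  // low | high byte
}

void setup()
{
    Serial.begin(115200);
    Wire2.begin();                                // Teensy 3.5 pins 3/4 = SCL2/SDA2
    writeReg(0xEB, 0x05);                         // high-accuracy register = 5
}

void loop()
{
    Serial.println(takeDistanceCm());             // ~150-170mSec per reading at 7m
}
```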

Next I went back to the Sparkfun version of the library so I could run the Garmin LIDAR-Lite V4/LED unit from Teensy 3.5 Wire2 port (pins 3/4). This program is functionally identical to the Garmin ‘Fast’ example. Here’s the output using 0xEB = 0x02.

Garmin LIDAR on WallE3 I2C port2, 0xEB = 2

The above plot, taken with the Garmin-supplied LIDAR (without Qwiic breakout) connected to SCL2/SDA2 on WallE3’s main Teensy 3.5 processor is basically identical to the one taken on my separate Teensy 3.5 plugboard with the Sparkfun LIDAR (the one with Qwiic connectors).

Mounting the Garmin on WallE3 turned out to be non-trivial. The unit doesn’t have mounting flanges like the Pulsed Light unit, and my first attempt with double-sided tape lasted less than an hour – oops! So, I designed a mounting adaptor that picks up the threaded holes used by the Pulsed Light unit and provides a way to mount the Garmin unit with a plastic sta-strap, as shown in the following photos:

Now that I have the Garmin LIDAR working on WallE3, the next step is to integrate the hardware-specific code (the ‘drivers’) into WallE3’s current software.

05 March 2023 Update:

I ported the relevant setup() code from ‘Garmin_LIDARV4_Sparkfun_V1.ino’ to ‘WallE3_Complete_V2.ino’ and the actual driver code to ‘GetFrontDistCm()’. At this point ‘WallE3_Complete_V2.ino’ compiles properly, but I have not tested anything yet.

06 March 2023 Update:

After porting the code as above, I tried to run ‘WallE3_Complete_V2.ino’, but it hung up where it tried to connect to the Garmin LIDAR… and this is where my painstaking effort to take things in small steps paid off. I am certifiably obsessed with always having a backup and/or a way to retreat to a known-good baseline. With this project it involved the following:

  • Before even touching my robot, I got the Sparkfun-supplied Garmin LIDAR (with the Qwiic connector breakout board) working with the Sparkfun library on a Teensy 3.5 on a plugboard – no extra hardware at all, using the default I2C port.
  • Then I did the same thing with the Garmin-supplied LIDAR (without the Sparkfun breakout board) on the same plugboard with Garmin library.
  • Then I moved the LIDAR to Wire2 on the plugboard mounted Teensy, and verified that the LIDAR and the Sparkfun library (the Garmin library isn’t multi-port capable) operated OK.
  • Then I moved the Garmin-supplied LIDAR to my WallE3 robot and connected it to the main Teensy 3.5’s Wire2 port, with just the minimalist test program loaded into the robot’s main Teensy 3.5, and confirmed I could get the same performance as before with the plugboard.
  • Finally, I ported the necessary code into my ‘WallE3_Complete_V2.ino’ and tried to run it, at which point it hung up on the check for the LIDAR, as noted above.

At this point, I immediately backed up to my ‘last-good’ baseline, with the LIDAR test program loaded onto the Teensy 3.5, and confirmed that it still worked fine. Now I know that the hardware is OK, and the problem has to be something screwy with my robot program. Since the point at which the ‘complete’ program died was in setup(), I also know that the problem has to lie before the LIDAR connection check. I also know that since the connection check succeeded with my small test program but not with the ‘complete’ one, the problem has something to do with the way the I2C ports were set up or initialized. Looking backwards in setup() from the LIDAR connection check, I ran across the following lines:
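
The lines in question were the standard Teensy 3.x idiom for enabling the on-chip pullups on I2C pins; in sketch form (the PORT_PCR_MUX value has to match each pin’s I2C alternate function, so double-check those against the pin mux table):

```
// enable internal (~33KΩ) pullups: PE = pull enable, PS = pull-up select,
// ODE = open drain enable, MUX = the pin's I2C alternate function
CORE_PIN18_CONFIG = PORT_PCR_MUX(2) | PORT_PCR_ODE | PORT_PCR_PE | PORT_PCR_PS; // SDA0 (Wire)
CORE_PIN19_CONFIG = PORT_PCR_MUX(2) | PORT_PCR_ODE | PORT_PCR_PE | PORT_PCR_PS; // SCL0 (Wire)

CORE_PIN37_CONFIG = PORT_PCR_MUX(2) | PORT_PCR_ODE | PORT_PCR_PE | PORT_PCR_PS; // SCL1 (Wire1)
CORE_PIN38_CONFIG = PORT_PCR_MUX(2) | PORT_PCR_ODE | PORT_PCR_PE | PORT_PCR_PS; // SDA1 (Wire1)

// the Wire2 (pins 3/4) pair - the two lines that ended up commented out
// (MUX value for these pins per the K64 I2C2 alternate function):
CORE_PIN3_CONFIG = PORT_PCR_MUX(5) | PORT_PCR_ODE | PORT_PCR_PE | PORT_PCR_PS;  // SCL2 (Wire2)
CORE_PIN4_CONFIG = PORT_PCR_MUX(5) | PORT_PCR_ODE | PORT_PCR_PE | PORT_PCR_PS;  // SDA2 (Wire2)
```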

The first two sets of lines in the above snippet were necessary to enable the internal pullup resistors on the main Teensy 3.5’s primary and secondary I2C ports (see this post and this post for all the gory details). The third set of lines was intended to do the same thing (enable the internal ~33KΩ pullups) for Wire2 (pins 3/4), but shouldn’t be needed for the Garmin LIDAR because it provides internal 13KΩ pullups (see this doc, top of column 2 on page 2). I wasn’t sure why having both 13KΩ and 33KΩ pullups would be a problem, but I decided to comment those two lines out and see if it made a difference. Well, as it happened – it did, and now the Garmin LIDAR on I2C port 2 (Wire2) responded properly, and with very little additional effort was fully integrated with the rest of the ‘complete’ program.

The reason I took so much time and space describing this relatively minor hiccup in the integration effort is because I wanted to demonstrate how important it is to proceed on a complex task by changing just one thing at a time, and to always have a fallback to a ‘known-good’ baseline. Otherwise you are just asking for an unguided tour through the infinitely turning tunnels down the rabbit-hole.

11 March 2023 Update:

After getting the Garmin LIDAR integrated into WallE3, and cleaning up the normal amount of screwups, I finally got a good run on my office ‘test wall’, and was able to confirm that the Garmin LIDAR was performing well, as shown in the following Excel plot.

Excel plot of the Garmin LIDAR ‘test wall’ run

Stay Tuned!

Frank