
Using ‘Processing’ to Display Robot Wall-following Trials

Posted 27 March 2023

While waiting for my new VL53L1X time-of-flight distance sensors to arrive, I’m in a bit of a lull. So, I decided to make a run at using the ‘Processing’ language & IDE to try out some ideas on displaying the telemetry from recent autonomous wall following trials.

I currently use Excel to produce 2D plots of various telemetry parameters, and this does a pretty good job of giving me insight into the behavior of WallE3, my autonomous wall-following robot. However, in this last batch of trials, after giving WallE3 the ability to ignore open doorways if another wall is available for tracking, it became more difficult to tell what was going on using just the 2D plots. I also video the runs, but I have discovered it is very difficult to synchronize the video with the 2D Excel plots; when I look at the video I can see specific events, but I have real trouble finding those same events in the 2D plot, and vice versa. In one of my going-to-sleep brainstorming sessions I recalled some web research I had done regarding the ‘Processing’ language & IDE – and thought this might be an opportunity to try using Processing to display robot telemetry visually. Maybe ‘Processing’ would allow me to create the equivalent of a video directly from the telemetry, which hopefully would give me greater insight into the details of robot behavior, especially when dealing with ‘open doorway’ and ‘wrong wall’ events.

So, I downloaded and installed Processing. I really hate the name; it’s too vague and doesn’t work very well in the English language; it always sounds like the sentence is missing a noun or a verb. You wind up with sentences like “using Processing, I processed robot telemetry to create a visual representation of a wall-following run.” Oh, well – maybe all the other names were taken?

In any case, after fumbling around with some tutorials and trying (with mixed success) to get my head around Processing’s architecture, I was able to make a crude representation of some recent wall-following telemetry.

first try at displaying robot wall-following using ‘Processing’

For such a crude result, the above plot is actually pretty informative. I could easily see where the robot started off by capturing the desired offset, and then tracked the offset very accurately up to the point where the wall changed direction. The robot did OK, but the above display doesn't accurately show the wall direction change.

The Processing ‘RobotTelemetry.pde’ sketch itself is short: it loads the telemetry file, then steps through the records and draws the walls and the robot's track.
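A minimal version of that idea looks something like the following (the file name, column layout, and plotting scale here are placeholders rather than my actual telemetry format, so treat it as a sketch of the approach rather than the real program):

// RobotTelemetry.pde (simplified) - plot a wall-following run from a telemetry file.
// Assumes each line of data/telemetry.csv is: time_msec, leftDist_cm, rightDist_cm
String[] records;
float cmToPix = 2.0;      // plotting scale: pixels per centimeter

void setup() {
  size(1200, 400);
  records = loadStrings("telemetry.csv");   // file lives in the sketch's /data folder
  background(255);
  strokeWeight(3);
  noLoop();                                 // static picture, so draw() only runs once
}

void draw() {
  float x = 10;                             // robot advances left-to-right across the window
  for (int i = 0; i < records.length; i++) {
    if (trim(records[i]).length() == 0) continue;    // skip blank lines
    float[] rec = float(split(records[i], ','));     // [msec, left cm, right cm]

    // walls: plot each measured wall distance above/below the robot's track
    stroke(150);
    point(x, height/2 - rec[1] * cmToPix);  // left wall
    point(x, height/2 + rec[2] * cmToPix);  // right wall

    // robot: its own position is the reference, so it rides the centerline
    stroke(255, 0, 0);
    point(x, height/2);

    x += 2;                                 // advance a couple of pixels per record
  }
}

One side effect of this robot-relative plotting scheme is that a change in wall direction only shows up as a change in the measured distances, which is consistent with the display limitation noted above.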

First impressions of using ‘Processing’:

Pros:

  • Easy to get started – fast download, quick install, lots of tutorials
  • Java programming syntax close enough to C++ to be usable
  • Very easy to get a working program to display in viewing window
  • Lots of examples
  • Lots of extensions

Cons:

  • Not easy to expand beyond simple fixed-window-size displays
  • Not obvious how to handle user interactions with screens
  • Not obvious how to handle non-fixed window sizes, viewports with a different size than the display window, scrollbars, or other user controls (the few hooks that are built in are shown in the sketch after this list)
  • Can’t change default IDE window size – very frustrating.
  • Programming IDE kind of clunky; no ‘Intellisense’ capability, have to keep jumping to Processing ‘Reference’ website for function syntax, etc.
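
To be fair, Processing does give you the raw hooks (a resizable display window plus mouse and keyboard event functions), but anything like scrollbars, viewports, or other controls you have to build yourself on top of them. Here's a bare-bones illustration of what is built in (this is purely illustrative and not part of ‘RobotTelemetry.pde’):

// Resizable window with scroll-wheel zoom around the window center.
float zoom = 1.0;    // current zoom factor

void setup() {
  size(600, 400);
  surface.setResizable(true);     // let the user resize the display window
}

void draw() {
  background(255);
  translate(width/2, height/2);   // width/height track the current window size
  scale(zoom);
  // stand-in content: a simple grid where the telemetry drawing would go
  stroke(150);
  for (int x = -200; x <= 200; x += 20) line(x, -200, x, 200);
  for (int y = -200; y <= 200; y += 20) line(-200, y, 200, y);
}

void mouseWheel(MouseEvent event) {
  // getCount() is +/-1 per wheel detent; shrink or grow the view accordingly
  zoom *= (event.getCount() > 0) ? 0.9 : 1.1;
}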

After going through the ‘process’ of using ‘Processing’ (I hate that name!) to ‘process’ robot telemetry into a graphical view of the robot's behavior, I realized that although it was relatively easy to get to ‘first display’, in my particular case I would probably have been better off building this in C#/Visual Studio due to the wealth of support for GUI systems.

09 April 2023 Update:

In a little more than a week I was able to put together a pretty decent Windows Forms app using my normal Visual Studio 2022 Community Edition IDE and C#/.Net. Here's a screenshot displaying approximately the same section of wall as shown above using ‘Processing’ (I still hate that name!). For convenience, I also copied the ‘Processing’ image from above:

first try at displaying robot wall-following using ‘Processing’
C#/Windows Forms App showing same section of wall as above

The C#/.Net environment was so much easier to use – there is almost no comparison (although I may be a bit biased by the fact that I have been programming Windows graphical user interfaces for over 30 years).

Pros:

  • It is actually very easy to get started in the modern Windows Forms App genre. The Visual Studio Community Edition is free, and there are more tutorials than you can shake a stick at
  • Great community support. If you have a problem, the probability that someone else has already experienced the same problem and solved it is essentially 1. For instance, on this project I wanted to use CTRL-SCROLLWHEEL input to zoom in or out of the view, but the ‘pictureBox’ control I was using as a drawing surface didn't support this feature ‘out of the box’. After a few Google searches I easily found multiple posts about the issue, along with several different solutions, and now I have the feature enabled in my app – neat!
  • Great language support from Microsoft. The Visual Studio IDE allows the programmer to press F1 on any language element, and this links to the relevant reference page – also neat.
  • You get the benefit of a huge number of graphical entities and a very successful IDE. Even a novice can generate a complete, working Windows app in just a few minutes. The app won’t do much, but it will have a visible window that can be resized, moved around, minimized, maximized, and everything else you would expect all Windows apps to do. From there that same novice can easily add graphical elements like scrollbars, labels, data entry boxes and more.

Cons:

  • The Visual Studio IDE can be a bit daunting at first, as it is meant to be everything to everyone. However, an installation with a basic set of features for Windows Forms apps can be easily downloaded and installed for free from Microsoft, and the installation process is straightforward
  • If you haven’t done any graphical programming at all, then there will be a learning curve to get used to concepts like the drawing surface and drawing operations (but this would be true of ‘Processing’ as well).
  • In order to share a Windows Forms app, either the recipient has to have the same programming environment so the app can be run in DEBUG mode, or the app must be compiled into an executable that can then be run on the recipient’s machine in a stand-alone fashion. With a ‘Processing’ sketch, it is a bit easier (I think) to send someone a Processing ‘sketch’ which they can then run in their ‘Processing’ environment (there may also be a way to compile a ‘Processing’ sketch into an executable, but I didn’t investigate that).

In any case, as the man says, “Your mileage may vary”

Stay Tuned,

Frank

More ‘WallE3_Complete_V2’ Testing

Posted 13 March 2023

Now that I have the new Garmin LIDAR installed, it's time to return to ‘real world’ testing. Here's a short video of a run starting in our entry hallway and proceeding past two open doorways into our dining/living area:

As can be seen in the video, WallE3 handles the oblique turn to the left and the two open doorways perfectly, but then loses its way when (I think) the right-hand wall disappears entirely. Not sure what happened there. Here's the telemetry from the entire run:

Here's the Excel plot of the run up to the point where the first open doorway is detected. Note this also covers the oblique turn. From the video, the oblique turn occurs at about six seconds from the start, which should put it somewhere in the 20,000 mSec area. However, comparing the video and the plot, it looks like this turn actually occurs at about 21,500 – 22,000 mSec, where the distance to the right-hand wall falls from the 200 cm max to about 60-70 cm. The turn itself goes very smoothly, and then the first open doorway condition occurs about four seconds later, at 26,200 mSec.

Segment from start to the first ‘open doorway’ detection

This next plot shows the period from the time of the first ‘open doorway’ detection to the end of the run, including the time where the robot passes the second open doorway (apparently without detecting it).

Segment from the first ‘open doorway’ detection to the end of the run

The left-hand wall distance starts out at 200 cm (the max) due to the first open doorway. During this period the robot is tracking the right-hand wall (kitchen counter). When the left-hand distance returns to normal at about 27,500 mSec, the robot continues to track the right-hand wall. Apparently the very short section of wall between the first and second doorways wasn't enough to trigger the ‘open doorway’ condition. However, when the left-hand wall distance comes back down again after the second open doorway (at approximately 30,400 mSec), the robot should have reverted to left-wall tracking, but it obviously didn't. Soon thereafter both the left and right-hand distances started increasing to max values and the poor little robot lost its way – so sad!

After looking through the data and the code, I began to see that although the code could detect the ‘open doorway’ condition, it wasn't smart enough to detect the end of the ‘open doorway’, so the robot continued to track the right-hand wall – “to infinity and beyond!”. After making some changes to fix the problem, I made a run in my test range (aka my office) to test the changes. Here's a photo of the setup:

‘Tracking Wrong Wall’ bugfix test setup

The test walls were set up so the ‘wrong side’ wall starts before the open doorway and ends after it, and the distance between the two walls was set such that the measured distance to either side would be less than MAX_TRACKING_DISTANCE_CM (100 cm).

Here’s the telemetry from the run, with an additional column added to show the current ANOMALY CODE:

The robot starts the run tracking the left wall at the default offset (40cm), with an anomaly code of ANOMALY_NONE. This continues until the 19 sec mark (19,039 mSec), where the robot detects the ‘open doorway’ condition. This causes the code to re-assess the tracking condition, and it decides to track the right wall, starting at about 19,181 mSec.

During this segment, the AnomalyCode value is OPEN_DOORWAY. This continues until about 20,890 mSec, where the ‘Tracking Wrong Wall’ condition is detected. Note that the actual/physical ‘open doorway’ condition ended at about 20,389 mSec, when the left-side distance changed from 200 cm (max) to 67 cm, but it took another 0.5 sec for the algorithm to catch the change.

The ‘Tracking Wrong Wall’ detection caused the robot to once again re-assess the tracking configuration, whereupon it changed back to left-wall tracking at 21,027 mSec with the new anomaly code of ‘ANOMALY_TRACKING_WRONG_WALL’. Left-wall tracking continues until the run is terminated at 23,235 mSec. Note that the right-hand wall stops at about 22,235 mSec, and the right-side distance measurement goes to 200 cm (max) and stays there for the rest of the run.

Looking at the above, I believe the fixes I implemented were effective in addressing the ‘wandering robot syndrome’ I observed on the previous run. Next I will remove the debugging printout code, clean things up a bit, and then repeat the last ‘real-world’ run from before.

10 May 2023 Update:

I was definitely having problems with the ‘Open Doorway’ condition, so I wound up back in my ‘indoor range’ (AKA my office) to see if I could work through the issues. It turned out I was not detecting the onset or end of the ‘Open Doorway’ condition properly. I made some changes to the code and to the telemetry output to more thoroughly describe the action, and then ran the test again. The short movie and the telemetry output show the results:

230510 Open Doorway Run

Salient points in the video and telemetry printout:

  • WallE3 captures and then tracks the desired 30 cm offset with pretty decent accuracy up until 3.8 sec, where it encounters the ‘open doorway’ on the left. This results in an ‘OPEN_DOORWAY’ anomaly report, which in turn causes the TrackLeftWallOffset() function to exit, which causes the program to start over at the top of loop().
  • The ‘top of loop’ code reevaluates the tracking condition, and because the left distance is well over 100cm and the right distance is about 50cm, it decides to track the right wall instead.
  • The robot tracks the right wall from 4.0 to 6.2 sec (where the right wall ends) and then again detects an ‘Open Doorway’ condition, which forces the loop() function to restart. This time the right distance is about 143 cm and the left distance is about 42 cm, so the code chooses the left wall for tracking.
  • Left wall tracking continues from 6.4 to 8.7 sec – the end of the run.

Stay tuned,

Frank