We spent the first three days of last week taking part in the AAAI 2007 Spring Symposium. In particular, we joined the Robotics and Education track with about 50 others interested in using robotics to teach everything from introductory programming to vision, pathfinding/planning, decision making, and all the other tasty bits of artificial intelligence.
We saw some very cool projects and platforms. I really liked Roomba Pac-Man (where students program Roombas to wander hallways and vacuum up messes in a Pac-Man-like way), and the Scribbler (while not very aesthetically pleasing) is a cute re-creation of the LOGO turtle. (I've provided a side-by-side comparison of what these two robots look like; the original turtle was, I think, far more attractive.)
In terms of platforms, many people are doing "tethered" robotics, where the robot is physically or wirelessly tethered to a PC. This is, I think, unfortunate, as it takes something away from the process of programming the robot---it becomes a remote-control process rather than an autonomous one. That said, like many things, it depends on what your pedagogic goals are. I do look forward to a few things: the Qwerk (part of the TeRK project) is neat, and we hope to have a chance to work with it sometime in the future. It's a powerful computational platform with a lot of nice outputs for motor and servo control, as well as for managing a host of sensor inputs. Likewise, the Blackfin Handyboard is a very powerful platform, and will also be great to begin exploring, as it provides some very flexible programming options in the form of two Xilinx FPGAs. Fred and Andrew did a very nice job with the board design, and I expect many cool things will be done with this platform.
We also saw David Miller's XBC/Gameboy combination; I now have a Gameboy to try it out with (once I have a budget for the rest of the bits, which are actually the expensive part). However, the really fun new platform was the Surveyor Robotics SRV-1.
Howard wrote about our experiments with the SRV-1 on his weblog; I'll add a bit here, and follow up in a later post with more detail. We were lent an SRV-1 on Monday evening, and did very little with it at first, as we were missing critical software. On Tuesday, during coffee breaks and the like, Christian ported the Transterpreter to the SRV-1. In less than two days, we had gone from first seeing a new, ARM7-based robotics platform to having the Transterpreter running subsumption code on it. By day three (that is, the day after the conference), Christian had vision working.
I went ahead and stole a picture from Howard; here you can see the end result of our hacking: we won the AAAI Robotics and Education robotics programming challenge. We did this with the SRV-1 after seeing it for the first time on Monday, porting our runtime to it on Tuesday, and seeing the challenge on Wednesday morning. We managed a (not-quite-working) 3-layer subsumption network that tackled the challenge in about 20 minutes of furious hacking; I'll write in more detail about our code (which we've subsequently cleaned up and corrected) in a follow-up post.
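For readers unfamiliar with subsumption architectures, here is a minimal sketch of the general idea: behaviours are stacked in priority order, and a higher-priority layer "subsumes" (overrides) the layers beneath it whenever it has an opinion. This is only an illustration in Python---our actual challenge code was written in occam-pi and run on the Transterpreter, and the layer names and sensor format here are invented for the example.

```python
# Illustrative 3-layer subsumption arbiter. The layers and sensor keys
# are hypothetical; the real challenge code was occam-pi, not Python.

def wander(sensors):
    # Lowest layer: with nothing better to do, just drive forward.
    return "forward"

def avoid(sensors):
    # Middle layer: override wandering when an obstacle is close.
    if sensors.get("obstacle_near"):
        return "turn_left"
    return None  # no opinion; defer to lower layers

def seek_goal(sensors):
    # Highest layer: steer toward the goal whenever it is visible.
    if sensors.get("goal_visible"):
        return "toward_goal"
    return None

# Priority order: the first layer with an opinion subsumes the rest.
LAYERS = [seek_goal, avoid, wander]

def arbitrate(sensors):
    for layer in LAYERS:
        action = layer(sensors)
        if action is not None:
            return action

print(arbitrate({"goal_visible": True}))   # goal seeking wins
print(arbitrate({"obstacle_near": True}))  # avoidance kicks in
print(arbitrate({}))                       # falls through to wandering
```

In a real robot each layer would run concurrently and the arbiter would sit between the layers and the motors; the sequential loop above just captures the priority relationship.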
For now, we're still kicking around San Francisco and the Bay area, enjoying two days off before flying back to England on Wednesday.