Yesterday we scrimmaged the German Team in our lab. The setup was very similar to how we competed at the German Open in April: we sent the German Team a copy of our color table for the lab, along with some code that loaded the binary into memory, and Max converted it to the German Team's format.
Our play was comparable to the German Team's, and we finished the first half leading 1-0 after a lucky shot on goal. The second half ended prematurely after drained batteries and bugs on both sides made play deteriorate, but not before both teams scored again, leaving the final score 2-1. Our major problems came from a bug that kept the chaser in the grabbing position instead of kicking, which earned us a ball-holding penalty, as well as a short episode of ball fright induced by a DEBUG_CHASER switch left turned on at halftime. The German Team's only issue was an unidentified problem that caused a robot to stop playing soccer and simply sway back and forth in the middle of the second half.
Unfortunately we were only able to scrimmage 4 v. 4, since our robots are deteriorating quickly. Currently Sam and Pippin are in the shop, and Mike's leg is about to give out. Even though development on the Aibo is generally straightforward, it is clear that only a few teams (if any!) will be able to muster enough working robots to play again next year if the Aibos are brought back for another year.
So I have just returned from the celebration of the great German Team leader Matthias’ thirtieth birthday. Many members of the Humboldt Aibo team were there, along with Humanoid team members. I learned a number of things about the future of RoboCup at Humboldt and the inner-workings of the Humanoid team.
Matthias also promises to have his wonderful videos from RoboCup 2007 posted online sometime within the next week.
NUbots vs. GermanTeam [154 mb] — hosted by Aibo Team Humboldt.
Again, congrats to the GermanTeam! Enjoy your supremacy now, for you are going down in Atlanta. All 140 of you.
Some New Goal work and some Horizon work.
First, I tried to debug and integrate Joho’s work on recognizing the new goals. I’ve narrowed down the buggy code, but couldn’t fix it. So hopefully more luck with that when Joho can take a closer look at it.
Second, I actually used the Horizon line calculation for some good. The purpose of the horizon line is basically twofold: it gives you an idea of where to look for things and it gives you an idea of where not to look for things. This seems simple but the GermanTeam‘s report helped me clarify this a bit.
The most important objects tend to appear around the horizon: goals, posts, far-away balls, etc. Close objects include really close goals, close balls, other dogs, and field lines. The former are by nature farther away and therefore take up less room in the image. The horizon line gives you a good idea of where to scan intently. Beneath the horizon line you can scan sparsely. Above the horizon line (heavens to Betsy) you don't really have to scan at all.
This last bit was clear to me from the beginning: scan less above the horizon line and you cut down on false positives and speed up the vision system, simply because it has fewer pixels to process. Using the horizon to decide where to scan more fervently, however, now that's an idea that deserves some good coding.
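To make the idea concrete, here is a minimal sketch (not the actual NUbots or GermanTeam code) of horizon-based scan-line allocation: no scan lines above the horizon, a dense band of lines straddling it where small far-away objects appear, and sparse lines below it where nearby objects are large anyway. The function name and the spacing parameters are hypothetical.

```python
def scan_rows(horizon_row, image_height, band=20, dense_step=2, sparse_step=8):
    """Return the image rows to scan, given the horizon's row in the image."""
    rows = []
    # Never scan above the horizon; clamp the dense band to the image.
    band_top = max(horizon_row, 0)
    band_bottom = min(max(horizon_row + band, 0), image_height)
    # Dense band around the horizon: goals, posts, and distant balls live here.
    for r in range(band_top, band_bottom, dense_step):
        rows.append(r)
    # Sparse scan below the band: close objects span many pixels, so coarse is fine.
    for r in range(band_bottom, image_height, sparse_step):
        rows.append(r)
    return rows

# Example: a 160-row Aibo-style image with the horizon projecting to row 60.
rows = scan_rows(horizon_row=60, image_height=160)
```

With these (made-up) numbers, only 20 of 160 rows get scanned at all, and half of them sit in the 20-pixel band just below the horizon, which is exactly the "scan intently near the horizon, sparsely below, not at all above" split described above.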