Friday, November 17, 2006

"On the floor"-action

Today we have made some satisfying progress. The status so far on the different parts of our project is as follows.

Grid:
At the moment, we use three different colors on the floor, so that the NXT can differentiate between floor, line and intersection in our grid: a black line, yellow intersections and a gray floor. This is not too good, because the floor and the intersections are both brighter than the line. That causes problems when we want to distinguish between the two, because when the light sensor sits between the black line and a yellow intersection, it reads from both black and yellow and returns a mean that "looks like" gray. I.e., readings can be ambiguous.

The solution to this problem is to create a grid where the line has a brightness between that of the intersections and the floor: for example, let the floor be brighter than the line and the intersections darker. In that setup, a mixed reading always falls between the two surfaces the sensor straddles, so no ambiguous readings can occur.
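To make the idea concrete, here is a minimal sketch (in Python, not our actual NXT code) of how readings could be classified under the new scheme. The brightness values are made up for illustration, not measured on our grid:

```python
# Hypothetical raw light readings for the new grid layout (not measured):
FLOOR = 60          # brightest surface
LINE = 45           # intermediate brightness
INTERSECTION = 30   # darkest surface

def classify(reading):
    """Classify a raw light sensor reading with simple midpoint thresholds."""
    if reading >= (FLOOR + LINE) / 2:
        return "floor"
    if reading <= (LINE + INTERSECTION) / 2:
        return "intersection"
    return "line"

# A sensor straddling the line and an intersection averages their values;
# the mean still lands between them, so it can never "look like" floor:
print(classify((LINE + INTERSECTION) / 2 + 1))  # -> "line"
```

The point is that any mixture of two adjacent surfaces now classifies as one of those two surfaces, never as the third one.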

NXT program:
The program running on the NXT is basically a big switch-statement.
It runs forever and listens on a Bluetooth inbox. Depending on which message (from our .NET application) is delivered to this inbox, the NXT can turn left, turn right, follow the line to the next intersection, turn around, pick up a ball, release the ball or leave the ball. (I might be missing some actions...)
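The dispatch structure can be sketched roughly like this (in Python, not the actual NXT code; the message names and robot methods are illustrative, since we haven't described our exact protocol here):

```python
def handle(msg, robot):
    """Dispatch one Bluetooth inbox message to a robot action.
    Message names below are placeholders, not our real protocol."""
    if msg == "LEFT":
        robot.turn_left()
    elif msg == "RIGHT":
        robot.turn_right()
    elif msg == "FORWARD":
        robot.follow_line_to_next_intersection()
    elif msg == "AROUND":
        robot.turn_around()
    elif msg == "PICK":
        robot.pick_up_ball()
    elif msg == "RELEASE":
        robot.release_ball()
    else:
        robot.stop()  # unknown message: do nothing dangerous
```

On the NXT this sits inside an endless loop that blocks on the inbox, reads the next message and runs the matching branch.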

This program is almost complete for our final application; only minor corrections/additions are required.

.Net application:
By now, we have an application that can control one NXT in the grid. It takes as initial values the NXT's position and direction, for example x=1, y=1, direction="North".

We can then input a new position, and the program will calculate a shortest path to it from the current position. For this purpose we use an implementation of the heuristic search algorithm A*. And then (of course) the NXT will drive to the new position along the calculated path.
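For reference, here is a compact Python sketch of A* on a 4-connected grid with a Manhattan-distance heuristic. Our .NET implementation differs in the details (grid representation, obstacle handling), so treat this as an illustration of the algorithm, not our code:

```python
import heapq

def astar(start, goal, width, height, blocked=frozenset()):
    """Shortest path on a 4-connected width x height grid using A*.
    Positions are (x, y) tuples; 'blocked' cells are impassable."""
    def h(p):  # Manhattan distance, an admissible heuristic on a grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]  # (f, g, position, path)
    seen = set()
    while frontier:
        f, g, pos, path = heapq.heappop(frontier)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        x, y = pos
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < width and 0 <= ny < height and (nx, ny) not in blocked:
                step = path + [(nx, ny)]
                heapq.heappush(frontier, (g + 1 + h((nx, ny)), g + 1, (nx, ny), step))
    return None  # goal unreachable
```

With uniform step costs the Manhattan heuristic never overestimates, so A* is guaranteed to return a shortest path.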

Quite nice to finally have something working "on the floor".

Now the real fun begins. We will implement a complex agent system with multiple agents that will cooperate to solve a task in the grid.

Friday, November 10, 2006

Different light sensor values


After testing all the sensors, we found that two of the five returned different values. Instead of calibrating the sensors, we now reset them to their defaults, which can be done in NXT-G. We hope that the ten other light sensors we ordered will return the same values as the three consistent ones.

More hardware problems

Yesterday everything seemed to work for us. We made a grid out of black insulation tape with yellow dots at the intersections. Yellow seemed to give the best value when light was reflected off it.
Then we got the communication between the computer and the NXT up and running: the NXT stops and sends a message when it detects a yellow dot or when the bumper is pressed, and then waits for a command from the computer.

Today the light sensors all returned different values and had a hard time distinguishing between the gray floor and the yellow dots. This could be because the NXT is running low on battery, but it could also be the change in ambient light. We are now trying a solution where the line returns a value between the floor value and the intersection value.

Wednesday, November 01, 2006

New goals for the project

After realizing a number of major difficulties with creating a lifting mechanism, we've decided to fall back on Lego's Tribot design for our project.

This decision will perhaps set us back a bit, but we're convinced that it will pay off in the end.

The altered problem is now defined as follows:

Two teams, red and blue, of modified Tribot agents collaborate in removing a number of red and blue items from a closed world. Only red-team Tribots can remove red items, and blue items can only be removed by the blue team. If a Tribot encounters an item which it is not permitted to remove, it must communicate the finding to the other Tribots.

The world will start off as a simple grid made of insulation tape, on which the agents can navigate. When an agent is communicating, this will be indicated by an extra light sensor on the Tribot being switched on and off, flashing its red light.