Wednesday, December 06, 2006

2 NXTs collaborating to move 3 objects from a grid

We now have a working multi-agent system. The video below shows two NXTs removing their respective objects from the Lego grid. The video is low quality due to the fact that it is hosted at Google Video. A high-res version is available here (25.0 MB).



Our updated problem is now defined as follows:

Two teams, black and silver, of modified tribot agents are collaborating to move a number of black and silver items in a closed world back to their individual home bases (their initial positions). Only silver team tribots can remove silver items, and the black items can only be removed by the black team. If a tribot encounters an item which it is not able to remove, it must communicate the finding to the other tribots.

For all Danes who did not get to read the two-page special on our project in Ingeniøren (The Engineer), here is a link to the online version with pictures.

Friday, December 01, 2006

Single agent identifying and retrieving objects

We have made a lot of progress since the last post. We have a fully functional implementation of the A* path-planning algorithm in our C# agent. We have also made a new grid with genuine Lego plates, on which we have put black tape and pieces of reflective foil to form the actual grid, allowing the NXT to navigate.

We currently have a working solution that enables one NXT controlled by an agent on the .NET platform to identify and retrieve objects of a desired color. We shot a video of a run:



At the moment we are working on putting more players in the game. Hopefully we will have a working solution on Wednesday.

Friday, November 17, 2006

"On the floor"-action

Today we have made some satisfying progress. The status so far on the different parts of our project is as follows.

Grid:
At the moment we operate with three different colors on the floor, so that the NXT can differentiate between floor, line and intersection in our grid: a black line, yellow intersections and a gray floor. This is not ideal, because the floor and the intersections are both brighter than the line. That causes problems when we want to distinguish between the two: the light sensor can read a value similar to gray when it is situated between the black line and a yellow intersection, since it reads from both black and yellow and averages them to something that "looks like" gray. I.e., readings can be ambiguous.

The solution to this problem is to create a grid where the line has the intermediate brightness between intersection and floor. For example, let the floor be brighter than the line and the intersections darker. In this environment, no ambiguous readings can occur.
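As a sanity check of this idea, here is a small Python sketch (our actual implementation is in C#; the sensor values below are invented and would be calibrated on the real grid). With the line's brightness between intersection and floor, a blend of two adjacent colors always falls between those two values and can never be mistaken for the third:

```python
# Hypothetical raw light-sensor values (0-100); real readings depend on
# the sensor, battery level and ambient light.
FLOOR, LINE, INTERSECTION = 60, 40, 20  # floor brightest, intersection darkest

def classify(reading, lo=(INTERSECTION + LINE) // 2, hi=(LINE + FLOOR) // 2):
    """Map a raw reading onto floor / line / intersection with two thresholds."""
    if reading >= hi:
        return "floor"
    if reading <= lo:
        return "intersection"
    return "line"

# A sensor halfway between line and intersection averages their values,
# which still lands on the dark side of the scale, never at "floor":
mixed = (LINE + INTERSECTION) // 2
```

In the old scheme the dark line and bright intersections averaged to something gray-like; with this ordering, no blend of two neighboring colors reaches the third class.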

NXT program:
The program running on the NXT is basically a big switch-statement.
It runs forever and listens on a Bluetooth inbox. Depending on which message is delivered (from our .NET application) in this inbox, the NXT can turn left, turn right, follow the line to the next intersection, turn around, pick up a ball, release the ball or leave the ball. (I might be missing some actions...)

This program is almost complete for our final application; only minor corrections/additions are required.
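The switch-statement idea can be illustrated with a small Python sketch (the command names and robot methods here are made up for illustration; the real program runs on the NXT and reacts to Bluetooth mailbox messages):

```python
def make_dispatcher(robot):
    """Map incoming Bluetooth messages to robot actions.

    `robot` is any object exposing the action methods; the message
    strings are illustrative, not the real protocol.
    """
    actions = {
        "LEFT": robot.turn_left,
        "RIGHT": robot.turn_right,
        "FORWARD": robot.follow_line_to_next_intersection,
        "TURN_AROUND": robot.turn_around,
        "PICK_UP": robot.pick_up_ball,
        "RELEASE": robot.release_ball,
    }

    def dispatch(message):
        action = actions.get(message)
        if action is None:
            raise ValueError(f"unknown command: {message}")
        action()

    return dispatch
```

Each incoming message is simply looked up and the matching action is run, which is exactly the shape of a big switch-statement.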

.NET application:
By now, we have an application that can control one NXT in the grid. It takes initial values: the NXT's position and direction, for example x=1, y=1, direction="North".

We can then input a new position, and the program will calculate a shortest path to it from the current position. For this we use an implementation of the heuristic search algorithm A*. And then (of course) the NXT will travel to the new position following the calculated path.
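For the curious, here is a compact sketch of A* on a 4-connected grid with the Manhattan-distance heuristic, written in Python for brevity (our agent implements the same idea in C#; the grid size and coordinates are illustrative):

```python
import heapq

def a_star(start, goal, blocked=frozenset(), size=5):
    """Shortest 4-connected path on a size x size grid.

    Uses A* with the Manhattan-distance heuristic, which is admissible
    on a grid, so the first time we pop the goal the path is optimal.
    """
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    # Each frontier entry: (f = g + h, g = cost so far, position, path)
    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, g, pos, path = heapq.heappop(frontier)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        x, y = pos
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nxt[0] < size and 0 <= nxt[1] < size and nxt not in blocked:
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None  # goal unreachable
```

The returned path is the list of intersections the NXT should visit; each step translates into one "follow line to next intersection" command, possibly preceded by a turn.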

Quite nice to finally have something working "on the floor".

Now the real fun begins. We will implement a complex agent system with multiple agents that will cooperate to solve a task in the grid.

Friday, November 10, 2006

Different light sensor values


After testing all the sensors, we found out that two of the five showed different values. Instead of calibrating the sensors, we now set them to default, which can be done in NXT-G. We now hope that the ten other light sensors we have ordered will return the same values as the three good sensors.

More hardware problems

Yesterday everything seemed to work for us. We made a grid out of black insulation tape with yellow dots at the intersections. Yellow seemed to give the best readings when light was reflected off it.
Then we got the communication between computer and NXT up and running. That means the NXT will stop and send a message when it detects a yellow dot or when the bumper is pressed. It then waits for a command from the computer.

Today the light sensors all returned different values and had a hard time distinguishing between the gray floor and the yellow dots. This could be because the NXT is running low on battery, but it could also be a change in ambient light. We are now trying a solution where the line returns a value between the floor value and the intersection value.

Wednesday, November 01, 2006

New goals for the project

After realizing a number of major difficulties with creating a lifting mechanism, we've decided to fall back on Lego's Tribot design for our project.

This decision will perhaps set us back a bit, but we're convinced that it will pay off in the end.

The altered problem is now defined as follows:

Two teams, red and blue, of modified tribot agents collaborate in removing a number of blue and red items in a closed world. Only red team tribots can remove red items, and the blue items can only be removed by the blue team. If a tribot encounters an item which it is not permitted to remove, it must communicate the finding to the other tribots.

The world will start off as a simple grid made of insulation tape, in which the agents can navigate. When an agent is communicating, this will be indicated by an extra light sensor on the tribot being switched on and off, signaling with red light.

Wednesday, October 25, 2006

Designing the mover

We've been spending some hours creating the NXTMover, which will end up representing the agents in our multi-agent system. The first prototype is seen below. We've used parts from the NXT box, and most parts for the lift are from the Lego Technic Forklift kit.



The lift still needs some more stabilizing, but the prototype in the picture is probably not far from the final version.

Wednesday, October 11, 2006

The first multi-agent system

The BallFinder is ready. A BallFinder works as follows: 1) It starts searching for a ball. 2) When a ball is found, it notifies the other BallFinders, who meet up with it. 3) If another BallFinder finds a ball first, it stops searching and meets up with the others.
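The coordination protocol in steps 1-3 can be modeled with a toy Python sketch (class and method names are invented for illustration; the real BallFinders notify each other over Bluetooth):

```python
class BallFinder:
    """Toy model of the protocol: everyone searches until one finder
    reports a ball, then the rest converge on the reporter."""

    def __init__(self, name, team):
        self.name, self.team, self.state = name, team, "SEARCHING"
        team.append(self)  # register with the shared team roster

    def found_ball(self):
        """Step 2: stop at the ball and notify all teammates."""
        self.state = "WAITING_AT_BALL"
        for mate in self.team:
            if mate is not self:
                mate.meet_up(self)

    def meet_up(self, reporter):
        """Step 3: abandon the search and head for the reporter."""
        self.state = f"MOVING_TO:{reporter.name}"
```

The state strings stand in for the actual navigation behavior of the robots.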

A scenario with two BallFinders is shown below.



Now we are getting close to developing much more complex multi-agent systems.

Wednesday, October 04, 2006

Follow the black line, part 2

After having spent some hours on testing, we must conclude that it seems almost impossible to create a stable program that makes the NXT follow a line curving both left and right using Bluetooth direct commands. We use an approach that makes the NXT sweep first one way, then the other, then double the distance and sweep again until the line is found. The NXTs keep missing the line, and the number of light sensor readings delays any other Bluetooth commands to an extent that is not acceptable.
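The sweep sequence itself is simple; a Python sketch of the relative turns it generates (units are arbitrary, e.g. tacho degrees, and the real robot would issue these as motor commands):

```python
def sweep_offsets(initial=1, max_sweeps=6):
    """Yield relative turn angles for the sweep search: sweep one way,
    then back past the start the other way, doubling the arc each time
    until the line is (hopefully) found."""
    arc, direction = initial, 1
    for _ in range(max_sweeps):
        yield direction * arc
        direction = -direction  # alternate sides
        arc *= 2                # widen the search arc
```

Relative turns of +1, -2, +4, -8, ... place the robot at net offsets 1, -1, 3, -5, ..., so both sides of the original heading are covered at increasing width. The trouble we hit is not the math but the Bluetooth round-trips needed per sensor reading.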

Therefore, we have decided to implement the "Follow Line" program as a Lego Mindstorms NXT-G program and execute it on the VM on the NXT. The result has proven very stable, and a program running on the NXT still allows direct commands received over Bluetooth to be executed. This, for example, gives us the possibility of running the line-following program on the NXT and then turning the NXT 180 degrees while the line follower is still running.

We shot a short video of two NXT's using the line following program:

Tuesday, October 03, 2006

Follow the black line

We managed to implement a simple algorithm that enables the NXT to follow a black line forming a circle. It is simple because the NXT can only go one way around, and if the black line ends up on the wrong side of the NXT, it is not able to find its way back on track.
After this was accomplished, we tried to implement a sweep algorithm that would allow the NXT to go both ways.
We ran into some problems:
1. Not enough readings from the light sensors.
2. The tacho value cannot be relied on.
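For reference, the simple one-way follower is essentially a bang-bang controller on one edge of the line. A Python sketch (the threshold and motor power values are invented; on the robot this decision runs on every light sensor reading):

```python
def edge_follow_step(reading, threshold=45):
    """One bang-bang control step following the left edge of a dark line.

    On the line, veer right toward the brighter floor; off it, veer left
    back toward the line. Returns (left_motor, right_motor) power.
    Following a single edge like this only works one way around a circle,
    which is exactly the limitation described above.
    """
    if reading < threshold:   # dark reading: we are on the line
        return (75, 25)       # left faster than right -> veer right
    return (25, 75)           # bright reading: on the floor -> veer left
```

Zig-zagging along one edge is why the robot cannot recover if the line ends up on its other side.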

We will continue to work on the FollowLine class, so hopefully it will be done next week.

Wednesday, September 27, 2006

NXTMover

Having the library NXTRemoteControl, we can now build another library on top of it.

The idea behind separating these two levels is that you can make countless configurations of an NXT. The NXTRemoteControl should be, and is, able to control each one of them. All you need is to develop a library on top of it.

Our robot needs to be able to drive around and lift/move objects, hence the name NXTMover. The library (NXTMover) will be action based, with each action running in a separate thread. You will then be able to tell the NXTMover to "move the robot along a line", "pick up object" or "go to this place".
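A minimal Python sketch of the action-per-thread idea (the real library is C#, where each action wraps NXTRemoteControl motor commands; here the action body is just a placeholder that records its name):

```python
import threading

class NXTMover:
    """Sketch: each high-level action runs in its own thread so the
    caller is never blocked while the robot moves."""

    def __init__(self):
        self.log = []
        self._lock = threading.Lock()

    def _run(self, name):
        with self._lock:          # only one action drives the robot at a time
            self.log.append(name) # placeholder for the real motor commands

    def start_action(self, name):
        """Fire off an action and return its thread to the caller."""
        t = threading.Thread(target=self._run, args=(name,))
        t.start()
        return t                  # caller may join() or carry on
```

The caller can start "pick up object", keep doing other work, and join the thread only when it needs the action finished.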

On top of this, we will implement some intelligence and multi-agent abilities such that several NXTMovers can cooperate in moving objects. More about this later. The figure below illustrates these concepts.

NXTRemoteControl

So, what have we been up to lately?

Most of our time has been put into developing a .NET library that we have called NXTRemoteControl. This library uses the Direct Commands published by LEGO, which allow us to remotely control every component of the NXT. That is, all sensors, motors and the Bluetooth communication device can be controlled from a computer connected to the NXT via Bluetooth. This is very similar to what iCommand does.

It has been quite a challenge to decipher the Direct Commands and figure out exactly what the different parameters actually do, but our effort has been rewarded. We now have a lib supporting both the sonar and synchronized motors, among all the other stuff.

In short, so far our lib supports motors and sensors.

We have been programming in C#.

Public Relations (PR)

More and more people are becoming interested in our project. Different people from down the hall have asked what we have achieved and what the robots can do so far. Because most of our time has been spent producing the NXTRemoteControl library, what the actual robot can do is limited.
Today a photographer came by and took some pictures for a magazine advertising DTU and IMM. The magazine will come out at the beginning of January.
We have been hired by Lego to give a demonstration in Bilka (a shopping mall) of what the NXT robots can do. Parents and their children can get a short introduction to how the Lego NXT works and how easy it is to build and control. This will have nothing to do with our project, since the demonstrations will naturally be in NXT-G.

Our supervisors created a poster for a seminar called "IT overalt" (IT everywhere), describing the multi-agent problem and how we tackle the different challenges. This is the picture they used:

NXTRemoteControl

NXTRemoteControl 0.1 is ready. It will be available for download soon.

Friday, September 15, 2006

NXTRemoteControl

After studying the source code of iCommand, we have now made the structure for our NXTRemoteControl library. It is very similar to iCommand, with the biggest difference being that it is written in C#. The plan is to reuse most of the code from iCommand, but add a little more flexibility for the user and, of course, fix the bugs :)

We are hoping the library will be done next week, so we can start programming some NXTs.

Five working NXTs

We found out that it was in fact possible to control five NXTs via Bluetooth from one computer.



It is straightforward in the .NET framework. All you need to do is open a serial port (COM) to each NXT.

Once a serial port is opened, it is possible to send "Direct Commands" to the NXT. That is, you can remote-control everything on the NXT. LEGO has published the protocol used for doing this: you send a number of bytes to the NXT, and it can reply (if requested).
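As an example of what such a byte telegram looks like, here is a Python sketch that builds the PLAYTONE direct command as we understand the published protocol: over Bluetooth, a two-byte little-endian length header, then the telegram type, the opcode, and little-endian parameters (double-check against LEGO's own documentation before relying on the exact values):

```python
import struct

def play_tone_packet(frequency_hz, duration_ms, want_reply=False):
    """Build the Bluetooth telegram for the PLAYTONE direct command.

    Telegram type 0x00 requests a reply, 0x80 does not; opcode 0x03 is
    PLAYTONE, followed by frequency and duration as little-endian UWORDs.
    """
    telegram_type = 0x00 if want_reply else 0x80
    body = struct.pack("<BBHH", telegram_type, 0x03, frequency_hz, duration_ms)
    # Bluetooth framing: 2-byte little-endian length, then the body.
    return struct.pack("<H", len(body)) + body
```

Writing the returned bytes to the NXT's serial port should make the brick beep; in C# the same bytes go out through a SerialPort opened to the NXT.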

This is the approach iCommand uses.

Using this approach, some of the autonomous idea behind our project is undermined, since the NXTs will not run any code but just be "dumb" remote-controlled robots. However, given our situation, and with time as a factor as well, this looks like it could be our choice of approach.

Friday, September 08, 2006

.NET tutorials and stable Bluetooth software

Today we've gone through several of the tutorials related to Microsoft Robotics Studio. It seems to be the best way to get more acquainted with this .NET approach to controlling NXT robots. Our impression so far is that this just might be the framework we will choose for our project. The possibilities are countless, and other users have had positive experiences with it.

Finally, it seems like we've found some more stable software for using Bluetooth in Windows. It's called the Widcomm software stack, and we're currently using version 5.0.1.802, which can be found here (if you are patient... very slow).

Wednesday, September 06, 2006

Computer, Bluetooth and .NET

A new computer arrived today. Supposedly a beast. This is the computer we will use to program the NXTs.

As for Bluetooth, it is still killing us. Sometimes it works, sometimes it does not. The problem might just be that we do not in fact know what is going on under the hood. Which protocols, and how many protocol layers, are used? Communication between NXTs works seamlessly, but as soon as Windows enters the picture, something goes wrong.

Despite the Bluetooth problems, we had a breakthrough with .NET. Microsoft Robotics Studio together with Visual Studio might do the job for us. Hopefully it has a full API so that we can program and control every bit of the NXTs. We have not yet created a program ourselves that can be downloaded to or control the NXTs, but we have tested a .NET application that did.

Thursday, August 31, 2006

Environments

We have tested the RobotC environment. This required an update of the NXT firmware, and it seems to be a good alternative to the Mindstorms graphical environment. It is based on C, but not fully C compliant yet. Also, there is no Bluetooth functionality except uploading program files from computer to NXT, and Bluetooth is what we imagine the NXTs will use to communicate with each other. Drawbacks of RobotC are that it might become commercial, and that it is not in fact C, just based on C.

We could not get the Java-based environment iCommand to work. iCommand might fulfil our requirements for an environment, and it would be convenient for us to program in Java, so we will spend some more time getting this environment tested.

The most successful alternative to the Mindstorms development environment is the simple assembly-like language called NBC (Next Byte Codes), which can be compiled and executed on an NXT with the default firmware. A feature-rich editor for the NBC language is BricxCC.



The handling of the Bluetooth connection in Windows is rather annoying at the moment, but we hope to find a more stable solution.

Furthermore, we have performed small unstructured tests of some of the sensors, and they appear to be OK, but further structured testing will reveal how accurate the sensors in fact are.

Tuesday, August 29, 2006

The Claw

Day 2. We've added the claw to the first NXT robot. Take a look at the awesome design:


We've also made a simple program in Lego's commercial Mindstorms software. Here is a demo:


Also, the program was downloaded to The Claw via Bluetooth.

Still to be worked out is Bluetooth communication from The Claw to the computer, and between NXTs.

Monday, August 28, 2006

We are the robots...

One LEGO Mindstorms NXT arrived. Another four, and a powerful computer to program the animals on, are still to come.

Nice toy.

We created a small vehicle with the one NXT, made a little program on the PowerBook and ran it successfully on the NXT.

No Bluetooth yet, though.