Adventures with Lego Mindstorms: NXT

Monday, February 19, 2007

BricxCC - I think we have a winner!

Well, it looks like I may have missed my deadline here. Luckily I don't have masses of angry readers posting comments, although that would be welcome indeed! A lot has happened since my last post, and I figured I would take the time to give my rare and endangered readers a quick update and some details on my current project.

I want to quickly go over BricxCC and NXC before I get into the project details. The first thing you will notice about BricxCC, compared to RobotC, is the lack of professional polish. This is very much an open source project, and unlike RobotC, it is designed to work with a variety of languages and firmwares. The choice is astonishing: there are languages targeting MS .NET, Java, and more! I haven't really looked at anything more than the availability of these languages, as I am most comfortable programming in C/C++, and since NXC is available, I am unlikely to move to a completely different environment. The main choices for programming the NXT using BricxCC are NXC (Not eXactly C) and NBC (Next Byte Code). NBC has a structure very close to assembly language, so if you are more comfortable in that environment it is an excellent alternative.

Both NBC and NXC are compiled with the NBC compiler, which has to be downloaded separately from BricxCC. Installation is simply a matter of unzipping a file into the root directory of the BricxCC application. One note here: NBC code should compile with the current release version of BricxCC; however, in order to use NXC code you will have to download and unzip the "test" version of the interface. You will need the standard release as well, since the test version only contains the changes and isn't the full application.

So far I am loving NXC. The reason, plain and simple, is that there is a programmer's manual! I can't stress enough how necessary this is for any kind of programming language. The fact that RobotC doesn't have a manual readily available makes it almost completely useless. Hopefully someone is working on getting one available for download ASAP. Until that happens, I am almost certainly going to be sticking to NXC exclusively.

Da Code Boss! Da Code!

I want to give some background information here before I go on. I am not a developer; I am a systems administrator. As such, I have read thousands of lines of code, in various languages, in the course of my work. However, as a systems admin, it is very rare that I ever actually have to produce any from scratch. Back when I was still in school it was a totally different situation. Back then I was quite the coder. I loved the marathon sessions and can remember not eating, drinking, or sleeping... hell, it felt like we didn't even blink as we worked our way through esoteric problems, mostly just to see if we could. That was at least 16 years ago and pretty much the last time I could say that I coded anything from scratch. One of my biggest motivations in purchasing this set was to get down and dirty in the trenches and see if I still have any foo at all left in these old bones. Oddly enough, it was like riding a bicycle. A quick look through the programmer's reference was enough, and soon I was happily coding away almost like in the old days. I really have to say, it felt GREAT!

The Project

The end goal of this project is to create an autonomous learning robot using evolutionary methods. At the moment I have a great deal of theory floating around in my head, but am still working out the details of implementation. So I decided to start out simple. In order to achieve my goal, my robot will have to have an idea of where it is in the environment, what objects exist, and their properties and locations. As we don't want to have to program all of this in, the robot has to have a method of gathering this information from a starting point of knowing nothing. Also, as the instruments we are using to record this data are very inaccurate, and impossible to check, we have to build a certain amount of fuzziness into the system, and a way to resolve conflicts must be created. The end goal, and the mind-numbing part of the whole project, is this conflict resolution. I am hoping to implement a Social Dissidence algorithm to resolve conflicts and a Q-based learning algorithm using weights to modify behavior. This is the road that I am embarking on. It will be a long journey, and I fully admit that there are huge gaps in my knowledge and understanding that will have to be filled before I even know if it is possible, or if we are speedily riding to the brink of a precipice.

However, there is a long way to go before we get into the more complex mathematical aspects of this endeavor. The first step in all of this is to have a system of reference. The easiest to implement is a simple Cartesian grid. This part of the programming is fairly straightforward and won't change even after learning is implemented. In order to define our grid, we first need a system of measurement, then we need to define the following:

xMax, xMin, yMax, yMin

Lastly, the robot has to return to a fixed point and report a (correct) location in the format Location[x,y]. Our method of measurement is to use the GetOutput(OUT_B, RotationCount) function with the NXT rotation sensors to collect the data, using one rotation as our base unit of measure. Defining our grid is the easy part. The robot just moves forward from its start position (Location[0,0]) until the Touch Sensor (tSensor) is activated. The number of rotations from Location[0,0] becomes our xMax. Turning around, repeating the process, and subtracting xMax from the total rotations gives us xMin. The robot then returns to Location[0,0], turns 90 degrees, and repeats the process to return the values of yMax and yMin. Lastly, the robot returns to the centre, giving us our defined initial Location[0,0].
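To make this a little more concrete, here is a rough NXC sketch of just the xMax leg of that run. Treat it as a sketch only: I'm assuming the touch sensor is on port 1, the drive motors are on ports B and C, a guessed power level of 50, and I'm reading the rotation count through the MotorRotationCount macro (which reads the same rotation count mentioned above). All of the names are mine.

    // Sketch of the xMax leg of the initialization run (NXC).
    // Assumptions: touch sensor on port 1, drive motors on OUT_B and OUT_C,
    // and one full motor rotation (360 degrees) as the base unit of measure.

    #define DRIVE_MOTORS OUT_BC
    #define UNIT_DEGREES 360

    long xMax, xMin, yMax, yMin;

    // MotorRotationCount reports degrees, so divide by 360 to get rotations.
    long RotationsTravelled()
    {
       return MotorRotationCount(OUT_B) / UNIT_DEGREES;
    }

    task main()
    {
       SetSensorTouch(IN_1);
       ResetRotationCount(OUT_B);

       // Drive forward from Location[0,0] until the touch sensor is hit.
       OnFwd(DRIVE_MOTORS, 50);
       until (SENSOR_1 == 1);
       Off(DRIVE_MOTORS);

       // The distance covered, in rotations, becomes xMax.
       xMax = RotationsTravelled();

       // ...turn around, repeat for xMin, then do the same for the y axis.
    }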

Considerations

At this point we have to make some decisions as to how we will implement this logic. It is important to think about this now. While the initialization routine can be programmed in a linear fashion, our decisions at this point will affect the methodology of the entire project. Also, by ensuring that the code generated is written in such a way that it is both generic and portable, I can start creating my own library of functions which I can then snap together just like building a LEGO set.

So, how do we go about doing this? First off, I have decided to use a state-based programming methodology. Each of the various tasks the robot performs is assigned a state, with 0 set as NO STATE. As each function is executed, it changes the State flag and copies the old value into PreviousState. This way we can modify the behavior of the robot depending on what it was last doing. So far our states are as follows (a quick NXC sketch of these definitions comes right after the list):

  • 0 - No State
  • 1 - Collision
  • 2 - Initialization

This requires several different tasks with different functions. First, we need a State Monitor to watch the state variable and call functions based on that variable. We also need a Sensor Monitor to watch for a tSensor hit and to record values from our Light Sensor (lSensor), which watches to see whether the robot is on carpet or hard flooring so that it can modify its turning depending on what kind of surface it is traveling over. Environmental events, like a tSensor hit, cause the state to change but don't initiate any other action. Instead, the State Monitor takes over and initiates an appropriate response. We will also need a function to monitor the location of the robot, but that comes later and is unnecessary until the Initialization sequence is complete. Object handling and avoidance are also considerations that we can ignore for now.
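Here is roughly how I picture the Sensor Monitor as its own NXC task, building on the state definitions sketched above. The sensor ports (touch on 1, light on 3) and the carpet threshold are just placeholder assumptions.

    // Sketch of the Sensor Monitor task.
    // Assumes SetSensorTouch(IN_1) and SetSensorLight(IN_3) were called in main().

    #define CARPET_THRESHOLD 40   // made-up light reading separating carpet from hard floor

    bool onCarpet = false;

    task SensorMonitor()
    {
       while (true)
       {
          // A tSensor hit only flags a collision; the State Monitor
          // decides what to actually do about it.
          if (SENSOR_1 == 1 && State != STATE_COLLISION)
             ChangeState(STATE_COLLISION);

          // Track the surface so turning can be adjusted later on.
          onCarpet = (Sensor(IN_3) < CARPET_THRESHOLD);

          Wait(10);
       }
    }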

While this may sound complex, it is actually quite simple and drastically reduces having to code redundant functionality. Let's look at our Initialization sequence. When turned on, our robot will be in state 0. The State Monitor checks the previous state and sees that it is also 0. This will only happen before any action has been taken on the part of the robot. The State Monitor then changes the robot's state to 2, sets PreviousState to 0, and starts the Initialization sequence. As the robot moves forward to define xMax, we are using our tSensor to tell us when to stop. When it is hit, the Sensor Monitor copies the old State into PreviousState and sets the new State (in this case 2 and 1 respectively), and the State Monitor then pauses the initialization sequence and calls the code for tSensorHit to be executed. This simply backs the robot up enough that it can turn around without hitting anything, and is common to any tSensor event. State is then set back to PreviousState (back to 2, initialization) and PreviousState is set to 1 to show that the last action was a collision. The State Monitor then returns control to our initialization sequence, and away our robot goes, having determined xMax.
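The State Monitor side of that sequence might look something like the sketch below, again building on the earlier sketches. CollisionResponse and its timing stand in for the real tSensorHit code.

    // Sketch of the State Monitor loop.

    void CollisionResponse()
    {
       // Back the robot up enough that it can turn around without hitting anything.
       OnRev(OUT_BC, 50);
       Wait(1000);
       Off(OUT_BC);
    }

    task StateMonitor()
    {
       while (true)
       {
          if (State == STATE_NONE && PreviousState == STATE_NONE)
          {
             // Nothing has happened yet: kick off the Initialization sequence.
             ChangeState(STATE_INIT);
             // ...start the initialization run here (the xMax/xMin/yMax/yMin legs).
          }
          else if (State == STATE_COLLISION)
          {
             CollisionResponse();
             // Go back to what we were doing (initialization) and record
             // that the last event was a collision.
             State = PreviousState;
             PreviousState = STATE_COLLISION;
          }
          Wait(10);
       }
    }

    task main()
    {
       SetSensorTouch(IN_1);
       SetSensorLight(IN_3);
       // SensorMonitor comes from the previous sketch.
       Precedes(SensorMonitor, StateMonitor);
    }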

I like to use flow charts to visualize the logic before I write the code, and here (when I figure out how to post PDF files) is the rough draft of the logic for this initialization sequence. I haven't yet defined the logic for the State Monitor, but I am hoping that this will give you some idea of the logic we are trying to implement. Flow charts are an excellent tool to help visualize the logic, and they let me see how changes in one subsystem will affect the rest of the program without having to modify code. As I write the code itself and learn the functions available to me in NXC, some of the logic requires rethinking. You can see this process in the modifications I have made to the REVmotorLR and DefineXmax functions as compared to FWDmotorLR and DefineYmax.

Now for elbow grease!

So now that I have the logic worked out, it is a matter of rolling up my sleeves and writing the code. Hopefully that won't take long and will get faster as time goes on. I am still feeling my way around the language, and my printed copy of the programmer's manual is getting rather worn and dog-eared by now. One problem that I know I will run into, and have to solve before the robot can do anything, is the question of motor control. In NBC there is a function that takes over motor control for a thread expressly and then releases it once done, in a cascading manner. I am sure that there is a counterpart in NXC, but I haven't found it yet. Gaining and releasing control gracefully is going to be critical.
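One approach that should work in the meantime is NXC's mutex support: a task Acquires a mutex before it touches the motors and Releases it when it is done, so only one task drives them at a time. A rough sketch (the task names, power levels and timings are all placeholders):

    // Motor arbitration using an NXC mutex.
    mutex motorMutex;

    task DriveForward()
    {
       Acquire(motorMutex);      // wait until no other task owns the motors
       OnFwd(OUT_BC, 50);
       Wait(2000);
       Off(OUT_BC);
       Release(motorMutex);      // hand the motors back gracefully
    }

    task BackUp()
    {
       Acquire(motorMutex);
       OnRev(OUT_BC, 50);
       Wait(1000);
       Off(OUT_BC);
       Release(motorMutex);
    }

    task main()
    {
       // Both tasks start when main ends; the mutex decides who drives first.
       Precedes(DriveForward, BackUp);
    }

If there does turn out to be a direct equivalent of the NBC function, it should slot in here without changing the rest of the logic.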

So that is it for now! I have upgraded Mikey so that he has tracks now. I have also made some modifications to the chassis to make him sturdier. More pics and hopefully some building instructions coming up soon! Stay tuned!

3 comments:

AuKiiraa said...

hmm cool project. I'm new to BricxCC, and when I try to download something it says it has to be on a system path. I don't know what's wrong.

Matt E. said...

wow, someone read this! Thanks! =).

Not sure what is going on there. What are you trying to download? Most of the time it just means that whatever it is has to be in a folder that is in the PATH. You can find your path statement in Windows by looking in System Properties and then checking the environment variables. You can also just download whatever it is to the same folder as your BricxCC application, as the system will look in the run folder first and then go looking through the PATH for the specified file.

Hope that helps, sorry about the time it took to reply. I haven't been keeping up with the blog (obviously), and didn't know anyone was reading.

Anyhow, you can also ask the BricxCC developer on the BricxCC homepage. I happen to know that he will reply to your questions. At least he has for me.

TTYL

Lego Geek said...

Hello:

Do you have some code to share for your robot? I am particularly interested in knowing how you plan to use the data collected from the ultrasonic sensor. How do you tell where an opening is and move in that direction?

Mike