Week 1
The first week was spent
getting settled in for the most part. I arrived in Minneapolis on
Sunday, June 2nd, after a 14.5 hour drive. Four of those hours were
spent driving through thunderstorms, which wouldn't have been too bad,
except that one of my windshield wipers tried to escape. I found myself
standing by the interstate in the pouring rain fighting with it as
truckers drove by, making the whole situation even more pleasant.
Once I finally arrived at my apartment and brought my things inside, I
felt much better about the whole drive. It is amazing what finally
being out of the car (and not in the rain) can do to lighten one's mood.
My roommates, Daphne and Michelle, are very nice, but they like to keep
to themselves, which isn't always bad. Daphne has two cats,
Mouser
and Taz, that are friendly once they sniff you out. I got a call
from Devon and we made plans to meet for lunch the next day to talk things
over and then look for Dr. Gini's office. She was still in Korea, so
we couldn't meet her until Tuesday. On Monday, Devon and I sort of
stumbled into the Big 10 sports bar. I can imagine that the place
gets pretty packed with drunken sports fans during post-game. We
discussed what we thought the whole program would be
like. Devon and I found that we have a lot in common and that we had
similar impressions of the internship. On Tuesday, we met Dr. Gini
and discussed the two fields we could consider for our projects: Robotics
with the MinDART group or E-Commerce with the MAGNET group. Then we
went to lunch with Dr. Gini and her husband, and Devon and I had Vietnamese
food for the first time. I still can't eat with chopsticks. Oh
well. We also filled out some paperwork to get our computer
accounts set up, as well as our Ucards. Our accounts were functional
by Thursday so we could actually use the machines to do some
investigating. I had used Linux at Clarion for writing C++ code in
emacs, but I am learning a lot more about navigating in a Linux based
system here. On Wednesday, we met with the MAGNET group and listened
to Anne's presentation. I decided that MAGNET was almost
finished-they were talking about getting a license for the software-and
that, by the time I had learned Java enough to add anything to the
project, they would be long finished. The project sounded
interesting and I probably would have chosen it had it not been almost
done. Thursday we got our Ucards and talked to
Chris
about robots. Of
all the graduate students I have talked to, he was definitely the most
friendly and helpful. Don't get me wrong, everyone has been nice,
but he is just exceptionally willing to help in any way that he can. Although
the MinDART project was also very far along, he made it seem like we could
actually do something to contribute to the project. He joined the
group three months ago. He
showed us the
CMUcams
and
explained the robotics project to us. He also emailed us several
articles and useful information about the cameras and robotics.
Throughout all this time, both groups were preparing to move to the new
technology building, so a lot was being packed up for the move. This
made it difficult to do hands-on work. On Friday, we got to see the
new building and watch adults peaceably argue over desks. We also
met with the robotics group and got more information about where we might
be able to chip in. Devon is much more experienced with
Lego robots and Handyboards
(which are used to program the robots)
than I am (I had never seen one before this point). Interactive C is the
language used for the boards. I have never written in this language,
but I have used C++ quite a bit. I just think I need to look over
the basics and read the Handyboard manual. I am a bit familiar with
robotics, but have only used
Parallax
robots, which are significantly different from Lego robots. As
of now, I have a full plate to stuff my brain with: the Handyboard
manual, the Interactive C manual, and the CMUcam manual, as well as the
articles Dr. Gini gave me. Happy digesting. I also started working on this
website, but since I don't know HTML and I have a lot to learn already, I
am cheating by using FrontPage.
Week 2
On Monday, I read the
Handyboard and Interactive C manuals and I didn't think it looked too bad.
I planned on testing some code just to get used to it, but I did not have
the correct account setup, so I had to wait for Paul to give me some more
information. I wrote the code to play a scale on the speaker, but
could not test it until I had access to the Interactive C interface.
Then Devon and I looked at the CMUcam and looked at some of the images it
was reading in. For some reason (we aren't sure if it is hardware or
software related), the camera is showing white as yellow, and the blue
values are very low. We thought that this was interesting and are
looking at the CMUcamGUI (the java code that allows the user to interface
with the camera) to see if it is in the software. I need to read up
on Java so I can understand some of the code. We also plan to test
other cameras to see if the same problem occurs. Paul sent
us articles to read about artificial intelligence, visual
localization and homing, and robotics. Some of them are very
complicated. Tuesday was spent looking over code and reading
articles. Then Devon taught me how to solder. I made my own
touch sensor. I also got
Interactive C working and was able to test
my code
on the Handyboard.
It was pretty neat when it worked. I actually felt like I
accomplished something other than reading, even if it was something so low
level. Wednesday, I worked on my
web page quite a bit and went to a writing workshop with Devon.
Since the lab we were using is gone because of the move, we couldn't
really work with the cameras. The workshop lasted most of the day
anyways. I also read the CMUcam Manual and retained a decent
understanding of how it works. On Thursday, Devon found a little
glitch in the GUI that was setting red=0 when it should have been blue.
This didn't solve our initial problem, but it fixed another error.
Go Devon. We discovered that our original problem really had to do
with the hardware and not the software. We also began testing how
the cameras detect different colored lights. We want to use these
lights for implicit communication between the robots by assigning meaning
to color. For now, we just want to use one color for simplicity's sake.
We first had to make our own light circuit. We tore apart one of
those touch lights that they show in the infomercials to see how the
circuit worked. 1.5 hours, 4 burned-out lights, 4 AA batteries, 0.5
rolls of electrical tape, and a bunch of solder later, we had a 2 light,
switched circuit put together and functional. How's that for random
intuition? (PS: I know little to nothing about making a circuit so I am
pretty proud of what we did even if you don't think it is a large feat :P
) We then put the lights inside of a little Tupperware type
container and changed their color by using miscellaneous items, such as
$.99 candle holders from Target, spray painted clear plastic covers, a
torn-up air mattress, and some cellophane; you know, high-tech equipment.
If you want to see what this little contraption looked like, check out the
Photo Gallery.
We found that the camera best detects the color difference between lights
on and lights off using the color red. Even though we did this and
were proud of ourselves, we are wondering when we will get to do something
other than guinea pig work. We would like to see how the camera will
be integrated into the robot implementation, but aren't sure when that
will happen. On
Friday, we met with Dr. Gini and discussed what we might be able to do
next. Some ideas are looking at how the camera is coded to work for
the Handyboards (since we have been using the GUI to monitor the values
and test colors) and to fine tune the search for targets. We also
built our own CMUcams. It took over 3 hours to build the camera.
We did a lot of soldering. When they were built, we tested them and
were extremely happy when they both worked! You can see photos of us
working on the cameras
here.
Week 3
On Monday, Devon and I
began researching
Sharp IR detectors that give back a
distance range for object detection. This will be useful in picking
up targets and in backing away from obstacles before actually hitting
them. We have a lot of ideas about how to use these, but first want
to make sure that we can fit them onto our already full robot HandyBoards.
Once the cameras are incorporated into the robot implementation, we can
use them with the IRs for deciding whether the robot should move away from
an object (if it is an obstacle) or towards it (if it is a target).
On Tuesday, Paul gave us a demo of the robots in action. It was
helpful to see what they could do. We also began testing an IR range
detector on the robot by making the robot follow the "arena" wall, but it doesn't yet handle navigating sharp turns
with the sensor well. The accuracy range for distance with these
IRs is 10cm to 80cm. Outside of this range, the sensors' behavior is
very unpredictable. We just need to make some adjustments to
the single IR implementation and then we can start adding more IRs and
seeing what we can do with them to help improve the robots' navigation
skills. We also want to improve our light circuit that we made so that it
can be turned on and off through the Handyboard rather than via touch
sensors. This will allow the robots to "decide" when to turn their
lights on and off. On Wednesday, we experimented with the
positioning of the IR on our wall follower and found that if the IR is
turned vertically, it is much more accurate. We also moved the
sensor farther back onto the robot so that the "bad" close up range is
covered. This improved the wall follower significantly. On
Thursday, we took a break from the IRs and dealt with trying to make a
relay switch so that the Handyboard can control the lights turning on and
off for implicit communications. This was quite a task. We spent a
long time looking for the right kind of circuit. Once we found it,
we had to go shopping for parts (resistors, relay, breadboard, etc).
Then we made our relay switch. Our original plan was to control the
switch from the digital ports on the Handyboard; however, the relay that
we bought was for the wrong voltage and only worked from the motor ports.
At least they work. It will be more efficient to use the digital ports
because they won't drain the Handyboard's power as much. We plan to
try a different relay tomorrow. On Friday, we tried the other relay and,
although the voltage was correct for the digital ports, there was not
enough current coming from the ports to the relay to turn the light on.
Paul found some more relays over the weekend for us to try on Monday.
We also had another meeting on Friday and discussed some of the troubles
we were having. We continued to work with the IRs and are working on
getting a distance/voltage table created.
Week 4
On Monday, Devon and I
tested three IRs by taking readings from 10 cm to 80 cm in 2 cm intervals
and recording the scaled voltage (from 0 to 255) at each interval.
Then we made a spreadsheet with the values and used power regression to
try to find the best fit curve for the values. We did this for each
of the individual sensors and also with the average of all the sensor
readings. We want to use the most accurate quadratic approximation
of the data points so that we can use it in a lookup function to convert
voltage to distance. On Tuesday, we had hoped we could use Maple
to graph some of the functions; however, UMN does not use Maple in
their labs, even though both Devon and I have used it quite a bit in the past.
Instead, we worked on using Mathematica to solve for various
x-values to add to our spreadsheet and tried to find a different way to
graph our quadratics. We also figured out the error of each
approximation. The help systems for both Mathematica and
Matlab were somewhat dysfunctional, so we are still trying to figure
out how to graph our functions. Devon is going to see if she brought
her Maple CD with her so we can use that. On Wednesday, we got
Maple up and running and eventually got our graphs and data points to
display. We are trying to manipulate the regression so that it has
as little floating point in it as possible. HandyBoards supposedly
don't do well with floating point. I also worked on looking at
the code for the robots to see where we can fit behaviors for the
light/implicit communication and the IRs into the previously defined
states. I have never really worked with subsumption architecture and
finite state machines before, but from what I have learned so far, they
aren't all that difficult to understand. On Thursday, Devon did a
few finishing touches to the IR distance formulas while I built another
relay to try. We ended up making a relay without resistors and it worked.
Then we went to buy some more light bulbs, colored plastic wrap, and
containers to make beacons with. When we came back, our relay wasn't
working very well any more, so we need to figure out what is wrong with
it. We had a MinDART meeting and discussed what needs to be done.
Devon and I are in charge of working out behaviors for the beacons
(implicit communication). We are meeting with Paul tomorrow morning
to discuss some ideas. The group decided that using the IRs was last on
the priority list, so if we get the beacons working well, then we will go
back and work with the IRs some more. On Friday, we discussed the beacon
behaviors and looked in the finite state machines for other behaviors to
see where ours could fit in. We also tried the different types of
plastic wrap colors to see how the camera detected them. Then we
went to a luncheon.
Week 5
On
Tuesday, I worked with Paul to set up the 9 volt lights on the robot so we
could test different colors with the camera while the lights were on the
robot. I went to Target to get some more supplies (we had decided to
use a larger piece of Tupperware to put the four lights in, which, if I
remember, I will eventually take a picture of so you can see what I am
talking about). When I tested the camera again with the red plastic
wrap and lights, it wasn't seeing the red anymore because the lights were
stronger. This was not bad, because we wanted to stop using the 9 volt
lights with the motor ports anyways since they used the handyboard's
battery. We wanted to use 2 or 2.2 volt lights instead and control
them by using their own battery supply and turning them on and off by
setting the digital ports to be output ports. Then we had a MinDART
meeting and made some significant decisions. We decided that the
robots will turn a beacon on only if they see more than one target in the
same region. For example, if the robot is carrying a target back to
base and it sees another target, the original code has the robot stopping
to localize so it can go back to get the second target after it is
finished delivering the first target. Now, while it is localizing,
it will turn on its beacon and then turn it off when it is finished
localizing. If another robot sees the beacon, it will go in that
direction only if it is not doing something else. In the event that
the beacon should turn off while a robot is following it, the robot will
go straight for a time and if it does not see the beacon again, it will go
back to its normal routine. After the meeting, Harini, Esra, and I
worked for a long time on testing the cameras. We have three
possible lens choices. We also wanted to test different colors and
shapes for the landmarks used in localization. So we tested each
color and shape with each of the three lenses and chose the best lens and
colors. For the landmarks, the colors we will use are green, yellow,
and orange. The orange, we decided, would conflict with the red
beacon, so we wanted to test white light instead. The landmark shape
that is the best is a rectangle, but it is not the most convenient, so we
are still working on that. On Wednesday, Harini brought in some
different colored bulb coverings that look like Christmas tree bulbs.
We tried each of the colors in the beacon and the camera thought they were
all white. So, Harini was goofing around and put 4 different colors
in one beacon. This gave the beacon a tie dyed appearance that was
so neat. The best part was that, even though the beacon appeared
very colorful, it looked completely white to the camera, which had
extremely high confidence levels while tracking it. We have decided
to go with this colorful look to give the robots character. I think Dr.
Gini will love the new look when she comes back to see it. Another
advantage of this white light tracking is that the landmarks have very low
blue values (between 0 and 16) and high values in red and green through
the camera's eyes and the beacon, because it looks white, has very high
blue values as well as high red and green values. This allows the camera
to easily distinguish between a beacon and a landmark. Next, I made
a small light circuit with a 2 volt light, but because I only had one
light and no one was around to go to the shop (you have to have an access
card to buy things there) with me to get more, I couldn't really do a lot
of testing with it. We also got some new Jameco relays in to test.
On Friday, Chris took me to the shop and we got some more lights. I
put together a relay circuit on the breadboard using 4 lights, 4 relays,
and 8 AA batteries and connected it to the handyboard. These lights
were not even close to being bright enough. So I tried it with the
2.2 volts lights, which were better, but still not enough to give the
psychedelic yet white light effect. I make this sound like it was
easy to do, which mentally it was, but it took me FOREVER to solder
all those lights and battery packs. I even dripped hot solder on my
leg, which really sucked since molten solder is around 200 degrees Celsius.
Week 6
On
Monday it was nice having Devon back since I worked completely alone on
Friday. Robots are fun and all, but they are horrible
conversationalists. I explained to her what I had been working on
and we decided that, if we had to use 8 AA batteries anyways, we should
just try the 9 volt lights with the digital ports. And they worked
quite well, except for when we tried to put 4 lights in one port and we
think we killed that port. Two lights in one port with a 100 ohm resistor
works nicely. So we stocked up on 9 volt lights and sockets (it is
nice not to have to directly solder a light bulb) and are working on
making relay circuits with perf board for all of the robots. On Tuesday, I
worked on setting up the layout for the camera and lights on the robots
and came up with a good layout. I also showed Devon, or tried to
show Devon, how we came up with the confidence levels for the camera.
It was rather difficult since the computers were being feisty. None
of the serial ports wanted to work for us today. We also went to see a
presentation given by one of the EE students to get an idea of what we are
expected to present. Our presentation is on the 30th. After that, I
started working on the perf board layout. Tomorrow we will make the
boards for all the light/relay circuits. Wednesday we worked on
using the perf boards and had a very frustrating day. Not only were
they very hard to make, but the ones we made didn't work, and we can't figure
out why. We tested and tested and tested and can't debug the
problem. You can see pictures of the beacon and these little
circuits in the
Photogallery now. Devon and
I also found out that we may be able to go to the
AAAI conference where
Paul and Chris will be exhibiting the robots. I think it would be a
great experience to see all the different projects that go on. We
would also be able to attend a workshop. Hopefully we get approved
for funding. On Friday, Devon and I found out that we were approved
for funding to go to the AAAI conference so we made our flight
reservations. We will be going to Edmonton, Alberta for the conference
from July 27 - August 2. This means we won't be doing the EE PowerPoint
presentations anymore. I was at home for my best friend,
Jenny's, wedding so Devon and I were on the phone back and forth
throughout the weekend trying to organize some of the details. She emailed
me and told me that she got the perf board circuit figured out, so that is
good.
Week 7
I came
back on Monday and was exhausted because the wedding was on Sunday night
and we got up early to get to the airport on time. Devon and I
worked on getting our student housing and conference forms filled out.
We also looked over past CRA DMP participants' papers to see what form we
might write our project summaries in. On Tuesday we continued
to finish the circuits for the relays and tested out some resistors we
needed so we didn't fry the relays. We also worked on tidying up the
packaging for the whole circuitry including the handyboard connections and
lights. On Wednesday, we color coded our circuits and
hot-glued them to Lego pieces so that they could sit securely within the
robot framework. With a lot of Paul's help and guidance, we
basically finished the design of one robot to use as an example to convert
the other four to. On Thursday, we began reconstructing the rest of
the robots and it took a long time. Esra joined the fun as we
started acting silly from Lego overload. On Friday, we finished
those up and started lengthening some of the wiring that would be used to
connect to the handyboards.
Week 8
Monday
was a day for soldering. I continued to lengthen the wiring for the
lights, batteries, and handyboard (for the relay circuit), while Devon
lengthened the wires that were already on the robots. We actually ran
out of some key supplies, so we had to go to the shop for some header and
shrink wrap. We needed a bunch more stuff, but the shop didn't have
what we needed because we had bought all of what they had throughout the
summer. We found
out that Dr. Gini will be back on Thursday, so it will be nice to show her
everything we have gotten done since she left. On Tuesday, we
finished lengthening the existing handyboard cables for the robot and
tested the bumpers and IR sensors. Then we wired the servo boards so
that they could be added to the robots as well. We also worked on
our project abstract for our poster presentation that we will be giving on
our last two days of the internship. On Wednesday, we added the
servo boards and relays to the robots and built places for the lights onto
them. We added lego pieces to the tupperware for the beacon and
added cameras to the robots. A test for the lights was added to the
sensor testing code for the robots. For the next two days, we will
be testing like crazy. On Thursday, Devon and I took another practice GRE
in the morning and we both did better than we did on the first test.
Afterwards, we worked on the robots. We realized that we needed even more
resistance for the lights so that they wouldn't kill the relays, so we
added two more resistors to each set of lights. We will be doing a lot of
tweaking when we get to Edmonton to adjust the robots to the setting
there. Paul told us that we could work on setting up a presentation
with him which would help us in summarizing the project and putting the
information all together.