BMW K1600 Forum

1 - 11 of 11 Posts

Registered · 227 Posts · Discussion Starter #1
For those who remember, I am part of a FIRST Robotics team here in Goldsboro, NC (3737 Roto Raptors). This year we tackled an off-season project: swerve drive.
The idea is that each wheel is independently powered to both drive and steer through a full 360 deg at the same time (8 motors total).
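The drive math itself is straightforward: each module's wheel velocity is the commanded translation plus the rotation's contribution at that module's position on the frame. Here's a minimal sketch in Python — the dimensions, module layout, and names are illustrative only, not our actual chassis or code:

```python
import math

# Hypothetical half-length and half-width of the chassis (metres).
L, W = 0.3, 0.3

# Module positions relative to the robot centre: FL, FR, BL, BR.
MODULES = [(L, W), (L, -W), (-L, W), (-L, -W)]

def swerve_states(vx, vy, omega):
    """Convert a chassis command (m/s forward, m/s left, rad/s CCW)
    into a (wheel speed, steering angle in radians) pair per module."""
    states = []
    for (x, y) in MODULES:
        # Module velocity = translation + (omega x r) rotation term.
        wx = vx - omega * y
        wy = vy + omega * x
        states.append((math.hypot(wx, wy), math.atan2(wy, wx)))
    return states
```

A pure translation command sends every module the same speed and angle; a pure rotation command points each wheel tangent to the frame, which is what lets the bot spin in place.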

After 8 months of design, building, and coding, we ran our first test on Christmas Eve in an open carpeted area. We were blown away by the performance.
This 26-second video shows its capabilities. And note - the video is not sped up!!! It's driven by an Xbox controller over WiFi.


Onto this chassis would be built the mechanisms to handle the game pieces. We find out our challenge for this year on Jan 4th.

In 2019, we made it all the way to the FRC world championships in Houston, TX and got to the finals on our field (6 fields with 66 robots each).

This is one of our qualification matches - we are the blue robot in the middle right side (3737). The end climb we complete is 19" high! (and our alliance wins)


Cheers,

Derek

PS: Took a two-hour ride today with a bunch of local bikers - sun shining but a little chilly - glad of the heated grips and seat, electric windscreen and great body protection. Every time I ride this bike I am impressed!
 

Premium Member · 1,244 Posts
The drive seems to be performing well. How does this work? It appears that you have drivers operating, but are they working line of sight or through onboard cameras? Congrats on the performance.
 

Registered · 227 Posts · Discussion Starter #3
For the match, there are three drivers at each end of the field behind a Lexan wall (plus other drive-team members). Driving starts after the initial 15-second autonomous period.

Many do have onboard cameras (including ours) which display on their driver station (laptop screen). This helps with game piece placement when 'stuff' is in the way. Some even have vision systems which assist with placement (we don't).

Cheers.

Derek
 

Premium Member · 1,244 Posts
It's a perfect opportunity for ML. We did some experiments last summer with simple ML/feedback to drive an autonomous vehicle. At the time the ML was too heavy for the available onboard processor, so we used a cloud-based GPU. This allowed us to process video images (forward-facing camera only) and provide control feedback at 15 fps, which turned out to be sufficient for our task (course following with an autonomous vehicle).

I noticed the driver getting confused by orientation with your system. Since your system can drive in any direction, I would expect the conversion from joystick (or whatever control) to direction to be decoupled from the actual bot orientation. What I see is a natural tendency to realign the bot for 'forward'. I suspect this is normal and something that has to be trained out of the driver. Anyhow, great work.
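For what it's worth, that decoupling (usually called field-oriented drive) is just a rotation of the stick vector by the gyro heading before the drive math sees it. A minimal sketch, with names and conventions of my own choosing (x forward, y left, CCW-positive heading), not anyone's actual code:

```python
import math

def field_to_robot(vx_field, vy_field, heading_rad):
    """Rotate a field-relative stick command into the robot frame so
    stick-forward always means 'away from the driver', no matter
    which way the bot happens to be pointing."""
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    return (vx_field * cos_h + vy_field * sin_h,
            -vx_field * sin_h + vy_field * cos_h)
```

With the bot turned 90 degrees CCW, a stick-forward command becomes "drive to the robot's right", which is still forward on the field - exactly the behaviour that has to be trained *into* the driver's intuition.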
 

Registered · 227 Posts · Discussion Starter #5
It's a perfect opportunity for ML. We did some experiments last summer with simple ML/feedback to drive an autonomous vehicle. At the time the ML was too heavy for the available onboard processor, so we used a cloud-based GPU. This allowed us to process video images (forward-facing camera only) and provide control feedback at 15 fps, which turned out to be sufficient for our task (course following with an autonomous vehicle).

I noticed the driver getting confused by orientation with your system. Since your system can drive in any direction, I would expect the conversion from joystick (or whatever control) to direction to be decoupled from the actual bot orientation. What I see is a natural tendency to realign the bot for 'forward'. I suspect this is normal and something that has to be trained out of the driver. Anyhow, great work.
Nice to have someone who clearly understands some of the programming challenges.
Firstly, this was literally the first drive in an open area, with somewhat random driving. We noticed the rotational drift too, and it will be fixed in the PID loop in the next software update.
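The shape of that fix is a heading-hold loop: whenever the gyro drifts off the commanded heading, blend a corrective rotation rate back into the drive command. A minimal proportional-only sketch for illustration - the gain and names are made up, not our actual code:

```python
import math

class HeadingHold:
    """Holds a target heading by returning a corrective rotation rate
    whenever the gyro drifts away from it. Gain is illustrative only."""
    def __init__(self, kp=2.0):
        self.kp = kp
        self.target = 0.0  # desired heading, radians

    @staticmethod
    def wrap(error):
        # Fold the error into (-pi, pi] so we always turn the short way.
        return math.atan2(math.sin(error), math.cos(error))

    def correction(self, gyro_heading):
        # Rotation rate (rad/s) to add to the commanded omega.
        return self.kp * self.wrap(self.target - gyro_heading)
```

The real loop would typically add integral/derivative terms and only run when the driver isn't commanding rotation, but the proportional term alone already kills slow drift.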

Thanks for the feedback!

Cheers,

Derek
 

Registered · 2,692 Posts
So, scaling up, what G-forces would be put on the occupants?
 

Registered · 5,999 Posts
Very cool stuff.

I was part of a MicroMouse team in college, and we were one of two teams that found the center of the maze.

That was some 25 years ago, so it's great to see how far things have come.

I'm sure the proliferation of cheap memory, better sensors, and powerful, efficient processors has a lot to do with it.
 

Registered · 227 Posts · Discussion Starter #9
Very cool stuff.

I was part of a MicroMouse team in college, and we were one of two teams that found the center of the maze.

That was some 25 years ago, so it's great to see how far things have come.

I'm sure the proliferation of cheap memory, better sensors, and powerful, efficient processors has a lot to do with it.
So nice to be able to finish the challenge - and what a great sense of achievement for the team!

We actually use a computer from National Instruments in the same family as those used on the Mars rovers. Very powerful, with extensive I/O.

And yes, even in the 10 years we have been in FIRST, we have seen everything get smaller and more powerful. The motors on our swerve drive are brushless and more powerful than the previous CIMs, which were conventional brushed motors over twice the size.

Cheers
 

Registered · 5,999 Posts
So nice to be able to finish the challenge - and what a great sense of achievement for the team!
Somebody was videotaping that event, but we never did get a copy. I also wrote and published a paper on our little robot and gave an IEEE talk about it, but never got a copy of those publications either. :(

I did take the robot to my first "real" engineering job interview with a semiconductor equipment manufacturer, and gave a short talk on it to some very interested working engineers. That made the rest of the interviews go quite smoothly, and I did get the job...

And of course, we left the robot behind for next year's team to upgrade as they saw fit.

We actually use a computer from National Instruments in the same family as those used on the Mars rovers. Very powerful, with extensive I/O.
I think we were using an 8088 processor with an external motor controller chip, all programmed in Assembly. Much simpler than what's available today, but it worked well enough for our needs.

And yes, even in the 10 years we have been in FIRST, we have seen everything get smaller and more powerful. The motors on our swerve drive are brushless and more powerful than the previous CIMs, which were conventional brushed motors over twice the size.
Did I mention I'm working in semiconductors? We make the machines that process wafers into chips, so we sometimes get to see what's coming a few years ahead of release.

And yes, the focus is always smaller, faster, more powerful, more integrated, more efficient, and cheaper. But that's starting to get quite difficult as chip critical dimensions shrink. A couple of fabs are currently producing chips at the 5 nm node, with certain dielectric layers approaching 0.5 nm - which is really tough when a silicon atom is only ~0.2 nm across, so you start to see significant quantum tunneling effects where individual transistors bleed charge into one another.

Even so, Lawrence Berkeley National Laboratory is using carbon nanotubes barely 1 nm long to build even smaller transistors, although replicating that at scale is still quite difficult...

Exciting stuff indeed. :)
 

Premium Member · 1,244 Posts
Somebody was videotaping that event, but we never did get a copy. I also wrote and published a paper on our little robot and gave an IEEE talk about it, but never got a copy of those publications either. :(

I did take the robot to my first "real" engineering job interview with a semiconductor equipment manufacturer, and gave a short talk on it to some very interested working engineers. That made the rest of the interviews go quite smoothly, and I did get the job...

And of course, we left the robot behind for next year's team to upgrade as they saw fit.

I think we were using an 8088 processor with an external motor controller chip, all programmed in assembly. Much simpler than what's available today, but it worked well enough for our needs.

Did I mention I'm working in semiconductors? We make the machines that process wafers into chips, so we sometimes get to see what's coming a few years ahead of release.

And yes, the focus is always smaller, faster, more powerful, more integrated, more efficient, and cheaper. But that's starting to get quite difficult as chip critical dimensions shrink. A couple of fabs are currently producing chips at the 5 nm node, with certain dielectric layers approaching 0.5 nm - which is really tough when a silicon atom is only ~0.2 nm across, so you start to see significant quantum tunneling effects where individual transistors bleed charge into one another.

Even so, Lawrence Berkeley National Laboratory is using carbon nanotubes barely 1 nm long to build even smaller transistors, although replicating that at scale is still quite difficult...

Exciting stuff indeed. :)
While we software types truly appreciate you building our hardware, it's the rider that makes all the difference :cheers:

The most exciting thing happening for me is the race to zero-cost compute. The simultaneous drive to shrink everything and make more of it is creating an intersection point: viable computing horsepower that is ubiquitous, low-power, highly capable, and part of everything we do, own, or use. For a software guy, it's the motorcycle equivalent of people just leaving high-performance machines everywhere, free for anyone to ride >:)
 