-
Autonomous rules?
Is there any weight bonus (similar to what walkers get) for completely autonomous robots? There's some nice compact Lidar gear out there now and I'm toying with the idea of putting something together, but it wouldn't be worth it without some kind of weight bonus.
-
-
Nothing in there mentions a weight bonus for autonomous robots, so I assume there is none? Could something like that be added to the rules in the future? I think it would encourage a lot more interesting builds! With all the extra kit that autonomy needs (and the fact that you'll end up with a half-blind, dumb robot), I don't think it's worth pursuing without some kind of bonus.
-
Semi-autonomous cluster bots, maybe?
-
I could see semi-autonomous robots having quite an advantage in the future if the autonomy were used practically - e.g. weapon firing triggered by sensors and other factors, or auto self-righting with gyros. But as the control task isn't too complex at the moment, I can't see much of an advantage. Perhaps it's best to let evolution take its course rather than trying to force it with a bias?
-
I regularly compete in the US [long trip!] with autonomous 1 lb and 3 lb machines. They have a class for fully autonomous machines which compete only against other autonomous machines. It's great fun. For heavies, at present, I think that Will, above, is right: semi-autonomous has potential, but you still want to have a person driving the machine. And of course there is the safety aspect - you don't want to come anywhere near an autonomous machine unless its autonomous functions have been switched off and you know they are off.
-
There is no additional weight bonus for autonomous robots.
-
I saw that Chomp on BattleBots uses a sensor to fire its hammer when an opponent is in range and to keep facing towards its opponent.
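Something like this toy logic, I'd guess - the numbers and actions are made up for illustration, definitely not Chomp's actual code:
```python
# Toy range-triggered weapon logic like the Chomp description above.
# All names and numbers are guesses, not Chomp's real system.

FIRE_RANGE_MM = 400        # assumed trigger distance
AIM_TOLERANCE_DEG = 10     # assumed "roughly facing the opponent" window


def should_fire(target_range_mm: float, target_bearing_deg: float) -> bool:
    """Fire only when the target is close and nearly dead ahead."""
    return target_range_mm < FIRE_RANGE_MM and abs(target_bearing_deg) < AIM_TOLERANCE_DEG


def control_step(target_range_mm: float, target_bearing_deg: float) -> str:
    """Return the action the bot would take for one sensor reading."""
    if should_fire(target_range_mm, target_bearing_deg):
        return "fire hammer"
    # Otherwise keep turning so the weapon stays pointed at the opponent.
    return "turn left" if target_bearing_deg > 0 else "turn right"


if __name__ == "__main__":
    print(control_step(350, 3))    # fires: close and roughly head-on
    print(control_step(900, 45))   # too far away: keep turning towards it
```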
-
I'm more interested in building a fully autonomous bot. It would use similar tech to Chomp, but the only control I would have is the stop/start button. I wonder if it's OK for the bot to communicate with a laptop for calculations, or whether it would have to do everything on board...
-
For fully auto you could push a lot of comms through a 2.4GHz link, although I'm not sure of the specifically allowed comms channels for robot combat. You'd only need to directly activate functions (motors and weapons), so you could have the laptop in the booth - more like a computer driving the robot than a completely self-contained machine. Three reasons I can think of for doing it that way -
safety - you can directly intervene with the control of the robot should it be required;
simplicity - the robot could initially be identical to the human-controlled one, but with an AI moving the sticks;
protection - a laptop or embedded PC (or even a smartphone) likely wouldn't last one drop or impact, let alone several, and would be an expensive bit to replace.
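And a very rough sketch of what I mean by the laptop doing the driving - the address, port and packet format are all just made up:
```python
# Toy "laptop in the booth" controller: the PC does the thinking and streams
# stick-style commands to the robot over a wireless IP link. Address, port and
# packet format are made up for illustration.
import socket
import struct
import time

ROBOT_ADDR = ("192.168.4.1", 5005)   # hypothetical receiver on the robot's 2.4GHz link


def send_sticks(sock: socket.socket, throttle: float, steering: float, weapon: bool) -> None:
    """Pack and send one command frame: two floats plus a weapon flag."""
    sock.sendto(struct.pack("!ffB", throttle, steering, int(weapon)), ROBOT_ADDR)


def main() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        while True:
            # The AI would compute these from sensor data; here we just creep forward.
            send_sticks(sock, throttle=0.2, steering=0.0, weapon=False)
            time.sleep(0.02)          # ~50 Hz, similar to a normal RC update rate


if __name__ == "__main__":
    main()
```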
-
The lidar would need to gather data on the bot and stream it back to the laptop, so it would need to use something other than the standard radio link - most probably WiFi or Bluetooth. If that is allowed then it might be a viable project for me.
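Something like this on the robot side, maybe - laptop address, port and scan format are just placeholders, with a fake scan standing in for a real lidar driver:
```python
# Robot-side sketch of streaming lidar returns back to the laptop over WiFi.
# Laptop address, port and the (angle, range) format are guesses; fake_scan()
# stands in for a real lidar driver.
import random
import socket
import struct
import time

LAPTOP_ADDR = ("192.168.4.2", 6006)   # hypothetical laptop listening on the WiFi link


def fake_scan(points: int = 360) -> list[tuple[float, float]]:
    """Stand-in lidar read: (angle in degrees, range in metres) per point."""
    return [(float(a), random.uniform(0.2, 5.0)) for a in range(points)]


def main() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        while True:
            payload = b"".join(struct.pack("!ff", a, r) for a, r in fake_scan())
            sock.sendto(payload, LAPTOP_ADDR)   # one datagram per full scan (~2.9 kB)
            time.sleep(0.1)                     # roughly 10 scans per second


if __name__ == "__main__":
    main()
```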
-
Depending on the level of processing power required, you could do Lidar with a Raspberry Pi (one of the new ones) or, stepping up, an Intel Compute Stick, or I suppose if you need an actual computer in the robot then one of the Intel NUCs. I say this because I'd be worried about the reliability of a WiFi signal. Not sure why, but despite our TXs and WiFi both being 2.4GHz, the former seems far more reliable.
Cramming that kind of tech into a FW would be tricky. But I'd love to see it happen!
-
A cheap smartphone would probably be capable of doing all that, in theory at least - decent processing power, WiFi, low power consumption, very compact with its own power source, and lightweight with reasonable shock resistance. I wouldn't know where to start programming all that.
-
There was also a robot on Robot Wars Extreme that had automated side axes - but have you checked the new Robot Wars rules?
-
For computation's sake I would need something fairly fast. The Odroid-C2 looks pretty good: with a Mali-450 GPU I can use OpenCL to speed up a lot of calculations, and a GPU is very good for processing big data sets like a lidar point cloud. Its CPU is a 2GHz quad-core too, which is an improvement over the Raspberry Pi 3. I'll no doubt be able to find suitable hardware if I do go ahead with this - just wondering if it's worth the fuss :p
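For illustration, the sort of per-point maths I mean, sketched in NumPy with made-up thresholds - an OpenCL version would just run the same array work in a kernel on the Mali:
```python
# Convert a polar lidar scan to XY and pick the nearest return as the likely opponent.
# The max range and the "closest thing = opponent" assumption are illustrative only.
import numpy as np


def nearest_target(angles_deg: np.ndarray, ranges_m: np.ndarray, max_range: float = 4.0):
    """Return (x, y) of the closest return inside max_range, or None if nothing is seen."""
    valid = ranges_m < max_range                    # drop wall-distance / out-of-range hits
    if not valid.any():
        return None
    theta = np.radians(angles_deg[valid])
    xy = np.column_stack((ranges_m[valid] * np.cos(theta),
                          ranges_m[valid] * np.sin(theta)))
    return xy[np.argmin(np.hypot(xy[:, 0], xy[:, 1]))]


if __name__ == "__main__":
    angles = np.arange(360.0)
    ranges = np.full(360, 5.0)
    ranges[40:50] = 1.2                             # pretend robot 1.2 m away around 40-49 deg
    print(nearest_target(angles, ranges))           # ~[0.92 0.77], the 1.2 m return at 40 deg
```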
-
I was considering this for a UK ant so I'll put down my thoughts.
My first thought was that, being an ant, there was no way I could get sufficient processing and sensors on it to fight effectively. So I started going down the offboard control route and settled on a Kinect 2 camera. Offboard gives a much more stable data stream than a robot being thrown around the arena, and makes it much easier to work out things like the robot's position in the arena. The 3D imaging capability of the Kinect 2 simplifies the image processing needed, as you can reasonably assume a non-flat bit of floor is probably a robot, hazard or wall. That gets around bots that might otherwise be very hard for the system to visualise. It would also have allowed me to throw a much bigger chunk of processing power at the problem by using a desktop rather than a single-board PC.
The eventual plan was to minimise the information the robot needed to feed back, to the point where none is needed at all. Effectively this would allow the control system to drive any bot, not just one built for autonomy, provided it had some kind of tactic for each.
The only issue is that Kinect 2 cameras can have trouble seeing through polycarbonate, so the camera might have had to be placed inside the arena - not much of a risk with ants, but more so in higher weight classes. On the plus side, you can rig it to capture some nice battle footage too!
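Going back to the "non-flat bit of floor" assumption, a rough NumPy sketch of how I pictured it - the reference frame, threshold and blob cutoff are all assumed values, not a real calibration:
```python
# Subtract a reference depth frame of the empty arena and threshold: anything that
# sits noticeably above the floor is treated as a robot, hazard or wall.
import numpy as np


def object_mask(depth_mm: np.ndarray, empty_arena_mm: np.ndarray,
                height_threshold_mm: float = 30.0) -> np.ndarray:
    """Boolean mask of pixels that sit noticeably above the empty-arena floor."""
    # Pixels closer to the camera than the reference floor by more than the
    # threshold are treated as objects.
    return (empty_arena_mm - depth_mm) > height_threshold_mm


def blob_centre(mask: np.ndarray):
    """Very rough target position: centroid of all object pixels (single-blob case)."""
    ys, xs = np.nonzero(mask)
    if xs.size < 50:                         # ignore speckle noise
        return None
    return float(xs.mean()), float(ys.mean())


if __name__ == "__main__":
    floor = np.full((424, 512), 2000.0)      # Kinect 2-ish depth resolution, 2 m to the floor
    frame = floor.copy()
    frame[200:240, 100:160] = 1900.0         # a 100 mm tall pretend robot
    print(blob_centre(object_mask(frame, floor)))   # -> (129.5, 219.5)
```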
-
I am keeping an eye on this thread regarding building autonomous robots.
Would be so cool to program a game plan for fighting the other robot, with a different game plan programmed for each type of bot.
I can only work with Arduino though, so it would probably just be ultrasonic and infrared sensing.
But I love the idea of external processing with the Kinect camera. Unfortunately I couldn't figure something like that out unless I had 10 years, lol.