Author Topic: So about the AI War AI...  (Read 1780 times)

Offline Echo35

  • Master Member Mark II
  • *****
  • Posts: 1,703
  • More turrets! MORE TURRETS!
So about the AI War AI...
« on: December 14, 2011, 01:48:29 pm »
nVidia is making CUDA open source it seems. Can we have GPU processed AI now please? :D

http://developer.nvidia.com/content/cuda-platform-source-release

Offline TechSY730

  • Core Member Mark V
  • *****
  • Posts: 4,570
Re: So about the AI War AI...
« Reply #1 on: December 14, 2011, 04:44:48 pm »
GPU computing is awesome, and it's great to see them opening up their library so that others can refine it and/or build on it.

I'm pretty sure you were joking about the GPU AI, but I'll give a few good reasons why it's a bad idea anyway.

1. Graphics card dependence: Currently, CUDA assumes an NVIDIA GPU to work with. (Though I'm sure someone will create a fully "software rendering"/CPU-only implementation of the CUDA API at some point, even though its performance would be in the toilet.) Thus, using it would create a dependency on the graphics card vendor, locking out everyone not using an NVIDIA card.
2. GPU operations: IIRC, GPUs are great at floating point and matrix operations (though CPUs are getting better and better at floating point math). If you aren't using those heavily, you're better off skipping the overhead of shipping an operation to the GPU and getting the result back, and just doing it on the CPU (see the sketch below). I don't have the source code, so I can't be sure, but I'm pretty sure there aren't many floating point or matrix operations in the AI thread, nor do I think many of the algorithms used by the AI thread could be converted to a matrix representation without tons of overhead. So it is almost certainly not worth it.
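
To make that second point concrete, here is a rough sketch of what shipping even a trivial batch of work to the GPU through the CUDA runtime looks like. This is purely illustrative (the kernel, array size, and names are all made up, and it has nothing to do with AI War's actual code); the allocation and the two copies between host and device are the overhead I'm talking about:

Code: [Select]
// Hypothetical example: doubling a small array of ints on the GPU.
// The interesting part is the round trip (allocate, copy over, launch,
// copy back), not the kernel itself.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void doubleValues(int* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2;   // trivial work per element
}

int main() {
    const int n = 256;
    int host[n];
    for (int i = 0; i < n; ++i) host[i] = i;

    int* device = nullptr;
    cudaMalloc(&device, n * sizeof(int));                               // overhead
    cudaMemcpy(device, host, n * sizeof(int), cudaMemcpyHostToDevice);  // overhead: ship the data over
    doubleValues<<<(n + 127) / 128, 128>>>(device, n);                  // the actual work
    cudaMemcpy(host, device, n * sizeof(int), cudaMemcpyDeviceToHost);  // overhead: get the result back
    cudaFree(device);

    printf("host[10] = %d\n", host[10]);  // prints 20
    return 0;
}

For a few hundred integers like this, the copies and the launch latency dwarf the actual compute; offloading only pays off for big, uniform, math-heavy batches, which doesn't sound like what the AI thread is doing.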

So, really, the only place for GPU use in AI War is (appropriately enough) graphics rendering. Thankfully, Unity is smart enough to use a graphics card for graphics if there is one (I think).

Offline keith.lamothe

  • Arcen Games Staff
  • Arcen Staff
  • Zenith Council Member Mark III
  • *****
  • Posts: 19,505
Re: So about the AI War AI...
« Reply #2 on: December 14, 2011, 09:55:10 pm »
Yeah, we intentionally don't do any floating point math in the simulation, because two machines don't always give the same results for floating point math, and that would cause multiplayer desyncs.

Plenty of floating point math and such tomfoolery in the graphics, but, well, it's already happening over there :)
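
For the curious, the standard trick for this is to keep the simulation in fixed-point math, i.e. scaled integers, which gives bit-identical results on every machine. A minimal sketch of the general idea (purely illustrative, not our actual code, and in C++ rather than what the game uses):

Code: [Select]
// Illustrative only: a tiny fixed-point type using a 16.16 split,
// so all math stays in integers and is bit-identical across machines.
#include <cstdint>
#include <cstdio>

struct Fixed {
    int64_t raw;  // stored as (value * 65536)
    static Fixed fromInt(int v)     { return Fixed{ (int64_t)v << 16 }; }
    Fixed operator+(Fixed o) const  { return Fixed{ raw + o.raw }; }
    Fixed operator-(Fixed o) const  { return Fixed{ raw - o.raw }; }
    Fixed operator*(Fixed o) const  { return Fixed{ (raw * o.raw) >> 16 }; }   // re-scale after multiply
    Fixed operator/(Fixed o) const  { return Fixed{ (raw << 16) / o.raw }; }   // pre-scale before divide
    double toDouble() const         { return raw / 65536.0; }  // display only, never fed back into the sim
};

int main() {
    Fixed speed  = Fixed::fromInt(3);
    Fixed frames = Fixed::fromInt(7);
    Fixed dist   = speed * frames / Fixed::fromInt(2);
    printf("distance = %f\n", dist.toDouble());  // 10.5, same bits on every machine
    return 0;
}

As long as every operation stays in integers like this, every machine computes exactly the same bits; the conversion to a float only ever happens for display.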
Have ideas or bug reports for one of our games? Mantis for Suggestions and Bug Reports. Thanks for helping to make our games better!

Offline Mánagarmr

  • Core Member Mark V
  • *****
  • Posts: 4,272
  • if (isInRange(target)) { kill(target); }
Re: So about the AI War AI...
« Reply #3 on: January 17, 2012, 11:08:02 am »
When I'm coding, I use integers as much as I possibly can. Every time I'm forced to use floats (or longs), God kills a kitten. (Yes, I hate floats.)
Click here to get started with Mantis for Suggestions and Bug Reports.

Thank you for contributing to making the game better!