Author Topic: New Programming Challenge?  (Read 801 times)

Offline TechSY730

New Programming Challenge?
« on: April 05, 2012, 11:54:45 PM »
After the bust that was the energy efficiency challenge, I've been trying to think of a better, non-trivial one that won't degenerate into a trivial O(n) algorithm, yet won't suddenly become NP-hard when the problem is made generic.
I think I've got one.

Auto-targeting efficiency.
Given two sets of ships and ship types, each ship type having damage, firing rate, max health, armor, a hull type, and hull-type bonuses, can you come up with an algorithm that takes two such sets and optimizes DPS (or kills per second; I haven't decided yet)?
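To make the inputs concrete, here's a minimal sketch of what a ship type might look like; every field name here is my own guess at a reasonable model, not anything from AI War's actual code:

```python
from dataclasses import dataclass, field

@dataclass
class ShipType:
    name: str
    damage: float        # damage per shot
    firing_rate: float   # shots per second
    max_health: float
    armor: float         # flat per-shot damage reduction
    hull_type: str
    # multiplier applied when firing at a given hull type;
    # a 0.0 entry models an immunity
    hull_bonuses: dict = field(default_factory=dict)

    def dps_against(self, target: "ShipType") -> float:
        """Effective damage per second against one target type."""
        mult = self.hull_bonuses.get(target.hull_type, 1.0)
        per_shot = max(self.damage * mult - target.armor, 0.0)
        return per_shot * self.firing_rate
```

A 0.0 hull multiplier zeroes the per-shot damage, which is the only kind of immunity the simplified version below would need.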

Scores would be judged on highest DPS, but every algorithm would need to run in O(n^2 * m^2) or less (where n and m are the sizes of the two sets of ships). Also, the algorithm must be consistent; that is, given the same inputs, it must always give the same output. (An iterative algorithm with a non-deterministic path to the solution would be allowed, provided the eventual solution is consistent and it still meets the running-time requirement.)
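As a baseline that stays well inside that bound, a deterministic greedy pass works: precompute an effective-DPS value for every (attacker, target) pair, then give each attacker its best target, breaking ties by lowest index so the same input always yields the same output. A sketch (the dps matrix here is just my stand-in for whatever pairwise DPS computation an entry uses):

```python
def greedy_targets(dps):
    """dps[i][j] = effective DPS of attacker i against target j.
    Returns one target index per attacker: the highest-DPS target,
    ties broken by lowest index, so the result is deterministic."""
    assignment = []
    for row in dps:
        # (value, -index) makes max() prefer the lowest index on ties
        best = max(range(len(row)), key=lambda j: (row[j], -j))
        assignment.append(best)
    return assignment
```

That's O(n * m), so there's plenty of headroom in the budget for smarter passes, like redistributing attackers that would overkill a target.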

The problem is testing this. It would basically require a simulation of sorts.
If we ignore range, mobility, support effects, and immunities other than 0x hull multipliers, the simulation can be rather simple. To become truly useful, though, it would need to at least factor in range and mobility, requiring an almost full-fledged battle simulation.
That would make testing quite difficult.
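For what it's worth, the stripped-down version (no range, no mobility, no support effects) really is simple to score; here's a rough fixed-timestep simulator, again using a plain DPS matrix and a health list as stand-ins for the real data:

```python
def simulate(dps, target_health, assignment, dt=0.1, duration=10.0):
    """Apply each attacker's DPS to its assigned target in fixed
    time steps; dead targets absorb no further damage.
    Returns (total damage dealt, number of kills)."""
    health = list(target_health)
    dealt = 0.0
    t = 0.0
    while t < duration and any(h > 0 for h in health):
        for i, j in enumerate(assignment):
            if health[j] > 0:
                hit = min(dps[i][j] * dt, health[j])  # no overkill credit
                health[j] -= hit
                dealt += hit
        t += dt
    kills = sum(1 for h in health if h <= 0)
    return dealt, kills
```

Scoring entries on (damage dealt, kills) from something like this would at least make them comparable, even though it glosses over everything the full battle simulation would add.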

Comparing it to the algorithm AI War uses would also be difficult, since of course I don't have its source. That isn't strictly necessary for this, though; we can just see how well we can all do.
IIRC, the targeting algorithm AI War uses is O(m * n^2 + m^2 * n) in the absolute worst case, though that worst case almost never happens thanks to the optimizations it uses for common cases. That's why I chose the asymptotic running-time bound I gave above.

Yet another difficulty would be judging this. How would we handle solutions that stray a little bit from the "true optimal" for a big gain of speed?

What do you guys think? Is it too complicated to set up a framework that makes this kind of programming challenge possible?