Based on the GC errors, it still surprises me how badly Mono botched the GC at a macro level in this version. I mean, how hard is this control flow?
1. Allocate memory for object. (If success, done)
1a. If not enough memory, run a GC (preferably a short one, though depending on the GC scheduling policy, it may be time for a mid or even full collection).
1b. If usage is still at or near the current heap size, expand the heap if there is still room to grow. If the expansion succeeded or was not needed, retry step 1; otherwise go to step 2.
2. Retry the allocation. (If success, done)
2a. If still not enough memory, run a full GC.
2b. Retry the allocation. (If success, done)
2c. If still not enough memory, THEN throw an out-of-memory error.
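The steps above can be sketched as a toy allocator. To be clear, this is not Mono's actual implementation; all the names here (SimHeap, try_alloc, collect, and so on) are hypothetical, and the GC passes just free a made-up fraction of the heap so the control flow is runnable:

```python
class OutOfMemoryError(Exception):
    pass

class SimHeap:
    """Hypothetical simulated heap, just to exercise the retry policy."""
    def __init__(self, size, max_size):
        self.size = size          # current heap size
        self.max_size = max_size  # hard cap on growth
        self.used = 0             # bytes currently live

    def try_alloc(self, n):
        """Attempt the allocation; True on success."""
        if self.used + n <= self.size:
            self.used += n
            return True
        return False

    def collect(self, full=False):
        """Stand-in for a GC pass: pretend it frees some fraction of live data."""
        freed = self.used // (2 if full else 4)
        self.used -= freed

    def try_expand(self):
        """Grow the heap if there is still room under the cap."""
        if self.size < self.max_size:
            self.size = min(self.size * 2, self.max_size)
            return True
        return False

def allocate(heap, n):
    if heap.try_alloc(n):        # step 1: just try it
        return True
    heap.collect(full=False)     # step 1a: short collection first
    heap.try_expand()            # step 1b: expand if there is headroom
    if heap.try_alloc(n):        # step 2: retry after GC/expansion
        return True
    heap.collect(full=True)      # step 2a: full GC as a last resort
    if heap.try_alloc(n):        # step 2b: retry once more
        return True
    raise OutOfMemoryError(n)    # step 2c: only NOW give up
```

Note that the expensive full collection only runs after a cheap collection plus heap expansion have both failed, and the error is thrown only after every recovery option is exhausted, which is the whole point of the policy.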
Yes, this still does not account for when shrinking the heap may be desirable, nor does it quickly handle the case where a memory request exceeds the max heap size, but it is still FAR better than the policy in whatever Mono version Unity uses.
Any word about when Unity plans to upgrade to the new version of Mono? IIRC, the latest versions of Mono have a decent GC and memory allocation policy. It doesn't seem like it should be too hard, as long as they stuck to the documented API while avoiding deprecated calls and reliance on implementation-specific details.