This question was derived from my off-topic questioning here: Does Centralisation Decrease the Probability but Increase the Damage of Exploits?
As before, I suspect this may be overly abstract as a question, but the excellent responses to my previous question proved me wrong last time.
One of the tenets of security seems to be that if something is overcomplicated, the probability of it being exploited increases. For example, if an OS has many corner cases in its permissions system, it'll probably be easier to bypass. Likewise, code that juggles raw pointers and offsets is more likely to be exploitable via a buffer overflow than an equivalent Python script.
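To make the pointer-juggling point concrete, here's a minimal C sketch (the function names and the 16-byte buffer are invented for illustration) of the classic unchecked copy next to the one-line fix:

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical parser: copies a caller-supplied name into a fixed
     * buffer. The unchecked strcpy trusts the caller to respect the
     * 16-byte limit, so any longer input overflows the stack buffer. */
    void greet_unsafe(const char *name) {
        char buf[16];
        strcpy(buf, name);           /* no bounds check: overflows if name is >= 16 chars */
        printf("Hello, %s\n", buf);
    }

    /* The same logic with an explicit bound; one line of discipline
     * removes the entire bug class described above.                  */
    void greet_safe(const char *name) {
        char buf[16];
        strncpy(buf, name, sizeof buf - 1);
        buf[sizeof buf - 1] = '\0';  /* strncpy doesn't always NUL-terminate */
        printf("Hello, %s\n", buf);
    }

    int main(int argc, char **argv) {
        const char *name = (argc > 1) ? argv[1] : "world";
        greet_safe(name);            /* swap in greet_unsafe(name) and pass a long
                                        argument to watch the overflow (likely a crash) */
        return 0;
    }

The Python equivalent simply can't make this mistake: there are no raw buffers to overrun, which is the complexity-breeds-exploits point in miniature.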
Simplicity, however, implies homogeneity. After all, it's simpler to secure fewer OSes, maintain fewer antimalware solutions, and patch fewer applications.
Diversity on a site -- a mix of OSes, security packages, and applications -- is obviously harder to maintain, and thus more likely to fatigue the sysadmins and hide corner cases. However, diversity is also more likely to stop malware from spreading, thanks to environment incompatibilities, and if a system-dependent failure hits, it leaves the other systems unscathed.
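As a toy illustration of that trade-off (every number here is invented): suppose each platform independently falls to a wormable exploit with probability p per year. A monoculture loses everything or nothing; a fleet split across n platforms loses at most the slice running the affected one:

    #include <stdio.h>

    /* Toy model, all numbers invented: compare a monoculture against a
     * fleet split evenly across n platforms, where each platform
     * independently falls to a wormable exploit with probability p.   */
    int main(void) {
        double p = 0.10;  /* assumed per-platform exploit probability per year */
        int    n = 4;     /* assumed number of distinct platforms             */

        /* Monoculture: lose 100% of hosts with probability p,
         * so the expected loss is p and the worst case is total. */
        printf("monoculture:  expected loss %.0f%%, worst single exploit 100%%\n",
               p * 100.0);

        /* Diverse fleet: each platform hosts 1/n of the site, so each
         * exploit costs at most 1/n. Expected loss is n * p * (1/n) = p:
         * the same mean, but a much smaller blast radius per incident. */
        printf("diverse n=%d:  expected loss %.0f%%, worst single exploit %.0f%%\n",
               n, p * 100.0, 100.0 / n);
        return 0;
    }

Under these assumptions the expected loss is identical; what diversity buys is a cap on the damage of any single system-dependent failure, paid for with n times the patching and administration surface.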
TL;DR: It seems that simplicity is key to security, especially security via correctness, while diversity reduces the impact of system-dependent malware or failures. Which approach do those securing sites lean towards in the average case? (Whatever on earth an 'average case' would be in this situation.)