Since my longer post seems to be caught in moderator purgatory, I will condense its results into this much smaller message. I apologize for the spam.
I present a theorem whose thesis is obvious to many.
THESIS: Every hashrate h' > h generates a revenue per unit of hash v' > v (where v is the revenue per hash earned at hashrate h).
Let us assume, absurdly[1], that an optimal hashrate h exists at which the average revenue per hash in service is maximized. Call this maximum v; it results from perpetually mining blocks of size q. Under this assumption, all larger hashrates h' > h will generate an average revenue per hash v' < v (effectively the conclusion of my paper), due to the higher orphan risk carried from having to mine blocks of size q' > q. Following from Peter's model and my analysis, the origin of this balance lies in the fact that larger miners must somehow be forced to mine larger blocks, which in turn carry a larger orphan risk.
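To make the quantities above concrete, here is a minimal numerical sketch in Python. It is not Peter's model or the derivation from the paper; it simply assumes a toy exponential orphan-risk term and illustrative parameter values so that v, v', q and q' can be compared side by side.

```python
# A minimal numerical sketch of the quantities in the argument above.
# This is not Peter's model or my paper's derivation; it assumes a
# simple exponential orphan-risk term and illustrative parameter values,
# purely to make v, v', q and q' concrete.

import math

T = 600.0            # mean block interval, seconds
SUBSIDY = 25.0       # block subsidy, BTC (illustrative)
FEE_PER_MB = 0.05    # fee revenue per MB of block space, BTC (illustrative)
DELAY_PER_MB = 10.0  # extra propagation delay per MB, seconds (illustrative)

def revenue_per_hash(x, q):
    """Expected revenue per unit of hash, up to a constant factor,
    for a miner with hashrate share x mining blocks of size q MB.

    Assumption of this sketch: a block is orphaned with probability
    (1 - x) * (1 - exp(-tau/T)), i.e. a miner never orphans its own
    block, so orphan risk grows with block size q but shrinks with x.
    """
    tau = DELAY_PER_MB * q
    p_orphan = (1.0 - x) * (1.0 - math.exp(-tau / T))
    expected_block_revenue = (SUBSIDY + FEE_PER_MB * q) * (1.0 - p_orphan)
    # Blocks found per unit time scale with x, and x = h / H, so revenue
    # per hash is expected_block_revenue / (H * T); the constant 1/(H*T)
    # is dropped since only comparisons between miners matter here.
    return expected_block_revenue

q, q_prime = 1.0, 8.0   # the smaller block size q and a larger q' > q

v  = revenue_per_hash(0.01, q)        # miner h (1% share) at block size q
v1 = revenue_per_hash(0.30, q_prime)  # larger miner h' (30% share) at q' > q
v2 = revenue_per_hash(0.30, q)        # larger miner h' copying block size q

print(f"v  (h,  q ) = {v:.4f}")
print(f"v' (h', q') = {v1:.4f}")
print(f"v' (h', q ) = {v2:.4f}")
```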
What happens if a large miner h' chooses not to mine his optimal block size q' in favor of a seemingly "sub-optimal" block size q?