The thing causing the confusion is the ambiguity, seemingly deliberate, around this whole thing. He said they had major challenges trying to maintain a fixed 2 GHz on the GPU and some lower CPU frequency that I don't care enough to look up. They then say they're going to use variable clocks and that they "expect" to spend "most" of the time "at or near" those frequencies. Three fuzzy approximations in one sentence.

What we don't know is which workloads will cause the drops (since it's not based on % utilization or temperature, but on something deterministic derived from the actual instructions being executed), how frequently they occur, or what the typical and worst-case drops are. We know that if they're 10% over the power budget, a small percentage drop in GPU frequency covers it. But they're clocked over 10% higher on the GPU than the fixed clock they were having issues guaranteeing, so a workload that taxes both CPU and GPU at once might push way over budget.

I'm sure they've profiled extensively with their libraries and the games in development, but who knows where we'll be in three or four years. There's just not enough info to make intelligent assumptions, which is of course why nobody has anything to say and... oh.
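For what it's worth, the "small frequency drop sheds a lot of power" claim checks out under the textbook dynamic-power model, where power scales roughly with f·V², and voltage scales roughly linearly with frequency, giving P ∝ f³. This is a back-of-the-envelope sketch with illustrative numbers, not Sony's actual power curves:

```python
# Rough sketch: fractional frequency cut needed to shed a given
# fraction of power, assuming P ∝ f^exponent (exponent ≈ 3 for the
# common CMOS dynamic-power model where voltage tracks frequency).
# Purely illustrative; the real silicon's curve is unknown to us.

def freq_cut_for_power_cut(power_cut: float, exponent: float = 3.0) -> float:
    """Return the fractional frequency reduction that sheds `power_cut` of power."""
    return 1.0 - (1.0 - power_cut) ** (1.0 / exponent)

# Suppose a workload lands 10% over the power budget:
cut = freq_cut_for_power_cut(0.10)
print(f"~{cut * 100:.1f}% frequency drop sheds 10% power")  # prints ~3.5%
```

So under that cubic assumption, clawing back a 10% power overage only costs about 3.5% of clock speed, which is presumably why they're comfortable quoting "at or near" peak frequencies.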