That's not how the SGX model is said to work; that's the (admittedly possible) tin-foil-hat version. But Intel aren't going to sell processors for which only they hold the private key needed to run enclaved code, and more to the point, consumers won't buy them. What sort of a "feature" would that even be? Intel could back-door their processors in far more insidious ways.
Of course they're not going to sell processors that only run enclaved code signed by them. That would indeed be silly. I'm saying that code that runs within an enclave will be impossible to reverse engineer without the private keys.
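The intuition, very roughly, is that enclave pages are transparently encrypted with a key held inside the CPU, so an outside observer (even one with ring-0 privileges or a physical memory probe) sees only ciphertext. Here's a toy sketch of that idea; this is a deliberately simplified illustration using a throwaway hash-based stream cipher, not how SGX's actual memory-encryption engine works:

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    """Derive a pseudorandom keystream by hashing key||counter (toy only)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, ks: bytes) -> bytes:
    """XOR data against a keystream (encrypt/decrypt are symmetric)."""
    return bytes(a ^ b for a, b in zip(data, ks))

# The "CPU key": in real SGX this is fused into the die and is
# unreadable by any software, including the OS and hypervisor.
cpu_key = secrets.token_bytes(32)

enclave_code = b"secret_algorithm(input) -> output"

# What an external debugger dumping physical RAM would see:
ram_image = xor(enclave_code, keystream(cpu_key, len(enclave_code)))

assert ram_image != enclave_code  # ciphertext, not the code itself

# Only the CPU, holding cpu_key, can transparently decrypt on access:
assert xor(ram_image, keystream(cpu_key, len(ram_image))) == enclave_code
```

The point of the sketch is just the trust model: whoever holds the in-die key can read the enclave contents, and nobody else can, which is why reverse engineering from outside the enclave boundary is (by design) infeasible.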
Ok, well I think that in principle it's not such a bad thing. It's exactly as I described earlier: this is a very powerful mechanism, open to both use and abuse. If you have good evidence to trust your hardware manufacturer, your OS, and your other software, then it's actually highly resistant to malevolent state actors. This category of innovation therefore has the potential to safeguard your digital privacy in a way that's as close to absolute as there has ever been (given what we now know about the past). But perhaps SGX itself will prove conniving in the extreme; we'll find out in time. Intel would do themselves commercial harm by doing this too overtly, though. I also strongly suspect the barriers to entry in the processor design and manufacture market will keep falling through the 2020s. Imagine 3D printing your own processor design; it will likely happen at some point in our lifetimes.