The whole issue would be getting access to the GPU, or even publishing an app that makes full use of it; the Xbox One X is really, really locked down.
With the latest SDK version (Win10 Fall Creators Update), devs can make UWP apps that have access to 9GB of RAM (5GB on OG models), 6 exclusive Jaguar cores (probably not worth much compared to GPGPU), and full access to the iGPU via DX12/compute shaders.
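For anyone curious what "full access via DX12" actually involves, here's a minimal sketch; this is just my guess at the usual desktop-style setup with the Windows 10 SDK (linking d3d12.lib and dxgi.lib), not Xbox-specific code. It only goes as far as grabbing the hardware adapter and creating the compute queue that a GPGPU kernel would be dispatched on:

```cpp
// Minimal sketch, not Xbox-specific: enumerate adapters, create a DX12 device,
// and open a compute queue (the entry point for dispatching compute shaders).
// Assumes the Windows 10 SDK, linking d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters; we want the real iGPU

        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            continue;

        // A dedicated compute queue is where compute-shader work
        // (i.e. the GPGPU part of a miner) would be submitted.
        D3D12_COMMAND_QUEUE_DESC queueDesc = {};
        queueDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;

        ComPtr<ID3D12CommandQueue> computeQueue;
        if (SUCCEEDED(device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&computeQueue))))
        {
            wprintf(L"Compute-capable DX12 device: %s\n", desc.Description);
            return 0;
        }
    }
    return 1;
}
```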
Are there any UWP devs on this forum who can give us more insight?
There's huge untapped potential here...
I highly doubt they want people mining; after all, Microsoft will have to deal with any issues if the consoles break under warranty.
If that's an issue, they could offer an option to void the warranty and mine at your own risk (Sony didn't do this for the PS3/Folding@home, even though the OG PS3 was a ticking time bomb due to YLOD).
I can't imagine either Sony or MS condoning a genuinely useful crypto mining app, though it would actually be worthwhile on the Xbox One X since it has a higher-end GPU. Then again, they had Folding@home running on the PS3, so it's possible they might allow something like this eventually.
If MS allowed it, it would give them a serious advantage over the competition. They need to find a way to make the Xbox an enticing platform (considering the lack of exclusive games).
Imagine if MS issued their own coin and allowed you to buy stuff from the store with it? Just an idea. It would be revolutionary in the console space.
I can only imagine what would happen if half of the Xbox One and Xbox One X consoles in the country started mining with a nice hashing app. Suddenly you couldn't find any original Xboxes for under $100 anymore, lol. People would suddenly have warehouses full of OG Xbox Ones.
Old OG Xbones use dated 28nm lithography and a GCN 1.0 GPU with only 1.31 TF of compute power (125 watts).
The Xbox One X is a much better value proposition (175 watts, 16nm FinFET lithography, 6 teraflops).
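Going by those numbers alone, that's roughly 1.31 TF / 125 W ≈ 0.010 TF per watt for the OG versus 6 TF / 175 W ≈ 0.034 TF per watt for the X, so better than 3x the compute per watt.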
It would also be a nice alternative, considering that you cannot find cards like the RX 580, Vega 56, or 1070 Ti on the market anymore, while there is an abundance of XB1X consoles for less than $500/€500. Doesn't sound like a bad deal, since Vega GPUs can cost up to €1000 these days.
The iGPU in the Xbox is going to perform closer to an RX 560 than an RX 580.
It is going to have serious thermal limits that won't let it clock as high as a discrete GPU with the same number of cores under heavy load, so even though it has a similar core count to the 580, it's NOT going to be close in performance.
There are no thermal limits; it's designed to deliver 6 TF sustained with no issues thanks to its vapor chamber cooling:
http://www.eurogamer.net/articles/digitalfoundry-2017-project-scorpio-tech-revealed
http://www.eurogamer.net/articles/digitalfoundry-2017-microsoft-xbox-one-x-review_1
They may also be limited on how much memory they can access - the A10 series has a 1 GB limit on its iGPU to date, and even though some BIOSes claim they can work with 2 GB, they just CRASH if you try.
Don't compare it to PC APUs; console APUs are semi-custom and support technologies like HSA/hUMA (a unified address space for the CPU and GPU).
Games/apps have access to 5GB of DDR3 + 32MB of eSRAM (OG) or 9GB of GDDR5 (Scorpio).
For perspective - my A10-7860K and A10-7890K both have 512 cores running at about 800 MHz - the SAME spec as the old HD 7750 - but for almost EVERY algorithm I've tried on both, the HD 7750 blows those iGPUs out of the water on hashrate.
Part of that is that the A10s are using GDDR3, but it's not enough to account for ALL of the difference.
I think you meant DDR3; GDDR3 is an older graphics-oriented memory standard that preceded GDDR5.
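Either way, the bandwidth gap explains a lot of it: dual-channel DDR3-2133 tops out around 2 channels × 8 bytes × 2133 MT/s ≈ 34 GB/s and is shared with the CPU, while the GDDR5 HD 7750 gets roughly 72 GB/s on its 128-bit bus, and many mining algorithms are memory-bound.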