Bitcoin Forum
Author Topic: Will the new invention by Intel & Micron help PCs to replace ASIC again ?  (Read 2319 times)
hexafraction
Sr. Member
Offline

Activity: 378

Tips welcomed: 1CF4GhXX1RhCaGzWztgE1YZZUcSpoqTbsJ

August 02, 2015, 07:01:26 PM
 #21

The "RAM" will be persistent and the DRAM will be another cache layer above the persistent storage. To realize the full potential of the tech, the CPUs and OSes will need to be changed.

What's wrong with preserving the current CPU design, with onboard SRAM cache above DRAM, and DRAM above this "persistent RAM" made accessible by a high-speed interface such as wide PCIe, where the OS itself uses its paging/memory management mechanisms to move data to and from DRAM?
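The arrangement described here, where the OS's own paging machinery moves data between DRAM and a byte-addressable persistent region, can be sketched in software. A minimal Python sketch, assuming an ordinary file standing in for the persistent device (the path and page size are illustrative only):

```python
import mmap
import os
import tempfile

# The file below stands in for a byte-addressable persistent-memory region;
# mmap exposes it like ordinary memory, and the OS demand-pages it through
# DRAM -- the CPU keeps its normal SRAM-cache/DRAM design.
path = os.path.join(tempfile.gettempdir(), "pmem_region.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)          # one page of "persistent RAM"

with open(path, "r+b") as f:
    region = mmap.mmap(f.fileno(), 4096)  # mapped like ordinary memory
    region[0:5] = b"hello"                # plain stores, paged by the OS
    region.flush()                        # explicit write-back to the device
    data = bytes(region[0:5])
    region.close()

print(data)  # b'hello' -- and it survives in the backing file across runs
```

Real persistent-memory stacks (e.g. DAX mappings) refine this with cache-flush instructions, but the OS-paging idea is the same.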

I have recently become active again after a long period of inactivity. Cryptographic proof that my account has not been compromised is available.
Spondoolies-Tech
Donator
Legendary
Offline

Activity: 924

It was fun while it lasted

August 02, 2015, 07:13:41 PM
 #22

The "RAM" will be persistent and the DRAM will be another cache layer above the persistent storage. To realize the full potential of the tech, the CPUs and OSes will need to be changed.

What's wrong with preserving the current CPU design, with onboard SRAM cache above DRAM, and DRAM above this "persistent RAM" made accessible by a high-speed interface such as wide PCIe, where the OS itself uses its paging/memory management mechanisms to move data to and from DRAM?
I need to be careful here.
Public info:
https://www.google.com/patents/WO2012087471A2?cl=en

We released the following miners only when we were operational: SP10, SP20, SP30, SP31 and SP35.
If you see anything else offered online, it's a scam.
We designed but never completed the SP50.
AJRGale
Hero Member
Offline

Activity: 762

August 03, 2015, 06:40:40 AM
 #23

If anyone is interested in this "technology", The Memory Guy has a little spiel with his thoughts on it: http://thememoryguy.com/micronintel-3d-xpoint-raises-more-questions-than-answers/

On the other hand:

Intel and Micron unveil 3D XPoint™ technology and create the first new memory category in more than 25 years. [Citation needed]

Over the last 25 years many memory types have been developed (e.g. CBRAM, SONOS, RRAM, racetrack memory, NRAM, millipede memory and FJG RAM) by not only Intel but basically all the major memory companies: Hynix, SanDisk, Corsair, etc.

Now, I thought this might have been a processor development (some quantum quadrant-bit process) that could have thrown any conventional calculation process in the dirt. Thank goodness I was wrong! I think the only thing that could throw the ASICs for a six would be quantum processing.
VirosaGITS
Hero Member
Offline

Activity: 910

Want to save $? Use Coupon Code VOFF5 at GPUShack

August 03, 2015, 02:54:15 PM
 #24

If anyone is interested in this "technology", The Memory Guy has a little spiel with his thoughts on it: http://thememoryguy.com/micronintel-3d-xpoint-raises-more-questions-than-answers/

On the other hand:

Intel and Micron unveil 3D XPoint™ technology and create the first new memory category in more than 25 years. [Citation needed]

Over the last 25 years many memory types have been developed (e.g. CBRAM, SONOS, RRAM, racetrack memory, NRAM, millipede memory and FJG RAM) by not only Intel but basically all the major memory companies: Hynix, SanDisk, Corsair, etc.

Now, I thought this might have been a processor development (some quantum quadrant-bit process) that could have thrown any conventional calculation process in the dirt. Thank goodness I was wrong! I think the only thing that could throw the ASICs for a six would be quantum processing.

I'm not sure who did it first; I first heard of similar technology coming from AMD. A quick search found articles such as:
http://wccftech.com/amd-working-hynix-development-highbandwidth-3d-stacked-memory/

As for quantum computing, I suppose if we had a true quantum processor able to emulate conventional (binary) computer software, I believe that could indeed throw a wrench into current encryption of all kinds.

But such computers would be classified strictly as military grade for the long foreseeable future; hopefully the transition would happen gradually, as a sudden jump in technology always causes much chaos.
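For scale, the quantum threat to hashing (as opposed to Shor's algorithm, which breaks public-key schemes outright) is usually quantified via Grover's quadratic speedup. A sketch of the arithmetic only, not a claim about any real hardware:

```python
# Grover's algorithm searches an unstructured space of 2**n items in roughly
# 2**(n/2) quantum queries, so it only *halves* the effective security bits
# of a hash function; it does not break hashing outright.
def grover_queries(security_bits: int) -> float:
    """Approximate quantum queries for a brute-force search."""
    return 2.0 ** (security_bits / 2)

# A 256-bit preimage search drops from 2**256 classical tries to ~2**128
# quantum queries -- still far beyond any foreseeable machine.
print(grover_queries(256) == 2 ** 128)  # True
```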

Ethereum Mining Calculator - Simple, elegant, mobile-friendly Ethereum Mining Profitability Calculator
gpuShack mining hardware - Your one stop shop for all GPU mining related hardware! Use Coupon Code VOFF5 to save on any order.
ethOS - #ethosdistro on freenode - linux distro that mines Ethereum out-of-the-box.
Lauda
Legendary
Offline

Activity: 1694

GUNBOT Licenses -20% with ref. code 'GrumpyKitty'

August 03, 2015, 03:16:54 PM
 #25

I'm not sure who did it first; I first heard of similar technology coming from AMD. A quick search found articles such as:
http://wccftech.com/amd-working-hynix-development-highbandwidth-3d-stacked-memory/

As for quantum computing, I suppose if we had a true quantum processor able to emulate conventional (binary) computer software, I believe that could indeed throw a wrench into current encryption of all kinds.

But such computers would be classified strictly as military grade for the long foreseeable future; hopefully the transition would happen gradually, as a sudden jump in technology always causes much chaos.

HBM isn't comparable to 3D XPoint, because it just isn't as fundamentally different from what we have today. However, it is much better than GDDR5. To simplify: "High Bandwidth Memory (HBM) is a high-performance RAM interface for 3D-stacked DRAM memory from AMD and Hynix." HBM gen. 1 has already been released in commercial products (AMD Fury), and HBM gen. 2 is coming very soon.



The image clearly shows the benefits for GPUs. We are also supposed to see generation 2 products in 2016.



Update: Some corrections.
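The HBM-vs-GDDR5 gap comes down to bus width times per-pin data rate. A back-of-the-envelope check, using the commonly quoted first-generation figures as assumptions (not vendor-verified specs):

```python
# Peak memory bandwidth in GB/s: pins * per-pin rate (Gb/s) / 8 bits-per-byte.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    return bus_width_bits * data_rate_gbps_per_pin / 8

hbm1  = bandwidth_gbs(4096, 1.0)  # 4 stacks x 1024-bit, ~1 Gb/s per pin
gddr5 = bandwidth_gbs(384, 7.0)   # e.g. a 384-bit card at 7 Gb/s per pin

print(f"HBM1:  {hbm1:.0f} GB/s")   # 512 GB/s, the figure quoted for AMD Fury X
print(f"GDDR5: {gddr5:.0f} GB/s")  # 336 GB/s
```

HBM's win comes from the very wide interface at a modest per-pin rate, which is what 3D stacking on an interposer makes affordable.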


VirosaGITS
Hero Member
Offline

Activity: 910

Want to save $? Use Coupon Code VOFF5 at GPUShack

August 03, 2015, 03:33:33 PM
 #26

HBM isn't comparable to 3D XPoint, because it just isn't as fundamentally different from what we have today. However, it is much better than GDDR5. To simplify: "High Bandwidth Memory (HBM) is a high-performance RAM interface for 3D-stacked DRAM memory from AMD and Hynix." HBM gen. 1 has already been released in commercial products (AMD Fury), and HBM gen. 2 is coming very soon.



As the image clearly shows, the benefits for GPUs are there, and we are supposed to see generation 2 products in 2016.



Update: Some corrections.

I see. I'm not sure how "fundamentally different" it is; it actually sounds like just an evolutionary technology over AMD's HBM, applied to memory.
3D XPoint is still an in-development technology, true, but it also happens to be coming right after AMD's HBM.

It seems to me it's Intel's project to keep in the race.
The main article itself (link) mentions the technology is also basically stacking chips on top of one another. Though I suppose I'll need to wait for a proper comparison of the completed systems to comment further.

I very much doubt it will stop at RAM for AMD and non-volatile storage for Intel.

baristor
Sr. Member
Offline

Activity: 364

August 05, 2015, 11:21:59 PM
 #27

It won't, unless Intel sees good market potential in Bitcoin. Though one might argue it does, as ASRock already advertises some of their boards as working OK with Bitcoin.
hexafraction
Sr. Member
Offline

Activity: 378

Tips welcomed: 1CF4GhXX1RhCaGzWztgE1YZZUcSpoqTbsJ

August 05, 2015, 11:25:18 PM
 #28

It won't, unless Intel sees good market potential in Bitcoin. Though one might argue it does, as ASRock already advertises some of their boards as working OK with Bitcoin.

No, it won't work AT ALL for Bitcoin. It's NOT a mining technology, but rather a storage technology. It could possibly make sense for storing and reading the DAG for Ethereum, however.
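Whether the DAG fits in DRAM is a simple size question. A rough growth curve from the Ethash spec's dataset constants (the spec's prime-adjustment step is ignored here, so treat these figures as approximations):

```python
# Constants from the Ethash specification.
DATASET_BYTES_INIT = 2 ** 30    # ~1 GiB dataset at epoch 0
DATASET_BYTES_GROWTH = 2 ** 23  # ~8 MiB added per epoch
EPOCH_LENGTH = 30000            # blocks per epoch

def approx_dag_gib(block_number: int) -> float:
    """Approximate Ethash DAG size in GiB, ignoring prime adjustment."""
    epoch = block_number // EPOCH_LENGTH
    return (DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch) / 2 ** 30

print(f"epoch 0: {approx_dag_gib(0):.2f} GiB")                 # 1.00 GiB
print(f"~1 year in: {approx_dag_gib(2_100_000):.2f} GiB")      # ~1.55 GiB
```

At that growth rate the DAG stays within ordinary DRAM capacities for years, which is why a storage-class memory like XPoint adds little for Ethereum mining.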

VirosaGITS
Hero Member
Offline

Activity: 910

Want to save $? Use Coupon Code VOFF5 at GPUShack

August 06, 2015, 12:31:14 AM
 #29


No, it won't work AT ALL for Bitcoin. It's NOT a mining technology, but rather a storage technology. It could possibly make sense for storing and reading the DAG for Ethereum, however.

Heh, the high-bandwidth, high-cost version of storage isn't going to be useful or efficient there. Mining, the blockchain, and the Eth technology really aren't built around a need for RAM-like bandwidth. The need for memory is very small and the need for storage varies, but again the storage devices don't make use of RAM-like or better performance.

hexafraction
Sr. Member
Offline

Activity: 378

Tips welcomed: 1CF4GhXX1RhCaGzWztgE1YZZUcSpoqTbsJ

August 06, 2015, 12:33:16 AM
 #30


Heh, the high-bandwidth, high-cost version of storage isn't going to be useful or efficient there. Mining, the blockchain, and the Eth technology really aren't built around a need for RAM-like bandwidth. The need for memory is very small and the need for storage varies, but again the storage devices don't make use of RAM-like or better performance.

My bad; I'm not 100% familiar yet with the exact balance of storage vs. fast-memory vs. large-memory requirements of Eth. As long as the DAG fits into normal DRAM, XPoint doesn't really have a useful effect.

VirosaGITS
Hero Member
Offline

Activity: 910

Want to save $? Use Coupon Code VOFF5 at GPUShack

August 06, 2015, 12:45:12 AM
 #31


My bad; I'm not 100% familiar yet with the exact balance of storage vs. fast-memory vs. large-memory requirements of Eth. As long as the DAG fits into normal DRAM, XPoint doesn't really have a useful effect.

Indeed. There isn't much in terms of specs released for XPoint, so it's hard to say. Their technology is interesting for non-volatile storage, for faster SSDs, but high speed usually comes with reduced capacity and high prices. So in terms of cost-effectiveness, this won't be it, I suspect.

And if you look at AMD's 3D-stacked RAM, you quickly see there's not much interest there in the world of crypto, currently.

For gaming I would love that stuff, and it's very interesting for things like graphics cards, but then it's just moving the bottleneck somewhere else, like CPU response time, which the Mantle API looked to fix. Anyway, veering off topic.

johnyj
Legendary
Offline

Activity: 1834

Beyond Imagination

August 13, 2015, 07:20:19 PM
 #32

I think the more important thing is the production process in the semiconductor industry; it is going to be vastly different in the coming years.

KNC have claimed that they adopted a 16nm FinFET 3D process and achieved an order of magnitude better efficiency. So we can expect similar 3D processes to raise efficiency dramatically in the coming years, but then mining will be highly centralized around one or two semiconductor factories.

Dexter770221
Legendary
Offline

Activity: 1027

August 14, 2015, 05:15:10 PM
 #33

LUTs in FPGAs are basically small RAM blocks. That's why you can program any logical function into them. So this technology isn't designed for building SHA ASICs, but it should be possible to build one. I doubt it could have any advantage over a regular ASIC, though.

Under development Modular UPGRADEABLE Miner (MUM). Looking for investors.
Changing one PCB with screwdriver and you have brand new miner in hand... Plug&Play, scalable from one module to thousands.
hexafraction
Sr. Member
Offline

Activity: 378

Tips welcomed: 1CF4GhXX1RhCaGzWztgE1YZZUcSpoqTbsJ

August 14, 2015, 05:28:02 PM
 #34

LUTs in FPGAs are basically small RAM blocks. That's why you can program any logical function into them. So this technology isn't designed for building SHA ASICs, but it should be possible to build one. I doubt it could have any advantage over a regular ASIC, though.

First, I'm not sure that XPoint even works for SRAM to begin with, and I'm not sure about block RAM elements either.

There's not really much benefit to denser LUTs, since they're quite small. A LUT4 (as on older architectures such as Spartan-3E) contains 16 bits, meaning 128 bits per CLB. Even newer architectures, such as the Xilinx 7 series, have only around 64 bits in a LUT6. Making LUTs denser won't help, since there are tons of other components: dedicated gates, single registers, carry-chain multiplexers, and so on.

There are some optimizations in which a synchronously used function is implemented in a block RAM. But it's more important to make routing and CLBs faster, and to add more CLBs.
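The LUT-bit arithmetic here is just powers of two: an n-input LUT stores one configuration bit per input combination. A quick check (the 8-LUTs-per-CLB count for Spartan-3E is taken from the post, as an assumption):

```python
# An n-input LUT is a tiny RAM holding 2**n configuration bits,
# one output bit per possible input combination.
def lut_bits(inputs: int) -> int:
    return 2 ** inputs

lut4 = lut_bits(4)          # 16 bits per LUT4
lut6 = lut_bits(6)          # 64 bits per LUT6 (Xilinx 7 series)
clb_spartan3e = 8 * lut4    # 8 LUT4s per Spartan-3E CLB -> 128 bits

print(lut4, lut6, clb_spartan3e)  # 16 64 128
```

This is why denser memory cells barely move the needle: LUT storage is a tiny fraction of an FPGA's area compared to routing and the other CLB logic.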
