Bitcoin Forum
Author Topic: New 14nm miners???????  (Read 5932 times)
Mabsark (Legendary)
September 02, 2014, 11:34:13 AM
#61

Quote
There's no point in going towards 14nm when 20nm hasn't even been optimized. 28nm offerings are still sometimes more efficient than their 20nm brethren. Spending money on improving the current design before stepping down to an expensive process seems to be the better route.

Quote
What 20nm brethren? There's only KnC's half-arsed, rushed attempt so far. Also, making ASICs doesn't work the way you are describing it. You don't just take a highly optimised 28nm design, run it through the 20nm fab and end up with a highly optimised 20nm ASIC. The design usually has to be reworked because the actual fab process has changed, and once it is working it can then be optimised for that specific process.

Quote
Well, I meant what you said - it may not have come out right. Basically they're trying to take a leap too far ahead if they rush to 14nm. Since it would be prohibitively expensive to do so, they should just work on optimising the 20nm and 28nm stuff that they have right now.

TSMC 20nm is fully booked for at least the rest of the year and probably a good portion of next year too. You've basically got Nvidia, AMD and ARM manufacturers fighting for capacity for their next-gen products. The bitcoin ASIC manufacturers would be far better off focusing on 28nm.
DrG (Legendary)
September 02, 2014, 04:48:57 PM
#62

Quote from: Mabsark
TSMC 20nm is fully booked for at least the rest of the year and probably a good portion of next year too. You've basically got Nvidia, AMD and ARM manufacturers fighting for capacity for their next-gen products. The bitcoin ASIC manufacturers would be far better off focusing on 28nm.

Meh, I've been waiting on those new GPUs for ages now. The 7970s I have from 2012 are still more or less the same as the current crop of GPUs at the end of 2014. Moore's Law does appear to be reaching an end.

If TSMC is booked up, well, then 28nm it is.
dyask (Hero Member)
September 02, 2014, 10:51:24 PM
#63

Quote from: DrG
... Moore's Law does appear to be reaching an end. ...

Heard the same thing in the late 1980's. It is more likely that there will be a shift to other approaches, like stacking many layers or something else. Still, that doesn't mean it will be a few guys in a bar figuring out how to do it.
DrG (Legendary)
September 03, 2014, 05:38:51 AM
#64

Quote from: dyask
Heard the same thing in the late 1980's. It is more likely that there will be a shift to other approaches, like stacking many layers or something else. Still, that doesn't mean it will be a few guys in a bar figuring out how to do it.

I was on my 3rd PC in the late 80s and I never heard such a thing. I did hear that they would have problems with the clocks because of possible interference with other electronics - turned out that didn't happen. Cheesy

The basic transistor gate design has to change (i.e. 3D), but even that can only go so far. The concept of a switch or gate has to change - graphene or something like that which we haven't used yet.
dyask (Hero Member)
September 03, 2014, 06:39:29 AM
Last edit: September 03, 2014, 06:54:59 AM by dyask
#65

Quote from: DrG
I was on my 3rd PC in the late 80s and I never heard such a thing. I did hear that they would have problems with the clocks because of possible interference with other electronics - turned out that didn't happen. Cheesy

The basic transistor gate design has to change (i.e. 3D), but even that can only go so far. The concept of a switch or gate has to change - graphene or something like that which we haven't used yet.

3rd PC by the late 80's? You started later than me. I was on my third computer before 1980.

The talk about Moore's law in the late 80's was mostly around hard disk density and capacity at the time. I was working as a hardware/software engineer designing sub-systems used in oscilloscopes. You may have been too young to follow the trade journals at that time. There weren't many stories in the popular press, as computers weren't mainstream. Anyway, at the time a typical hard disk was measured in MB. A few-GB drive was a monster: very expensive and physically very large.

Currently there is a great deal of research on how to improve ICs, and we are a long way from the end of improvements; however, that might not be true for reducing the size of transistors. Also, much of the new stuff will take years to work its way out to smaller companies unless they are buying general CPUs etc. I know, for example, that stacking layers of transistors has been in the works for at least 5 years, probably more, but so far I haven't actually seen it.

EDIT: In the late 80's there was a general feeling that solid state storage (i.e. flash) would easily replace hard disks within 10 years. Still hasn't happened, although we are getting closer to the crossover. In 1995 I wrote a driver for storing information on flash - back then the Intel 28F010s, 1 Mbit each. Wow, that was really hard, and there was something like a 90 second cycle where if you lost power you lost your flash contents. Things have come a long way since then. I know a ton about pre-2000 technology, but after about 2000 I've mostly done pure software. Grin  My electrical skills are getting rusty and I don't read the journals much now. Kids take up too much time.
DrG (Legendary)
September 03, 2014, 06:54:24 AM
#66

Quote from: dyask
3rd PC by the late 80's? You started later than me. I was on my third computer before 1980.

The talk about Moore's law in the late 80's was mostly around hard disk density and capacity at the time. I was working as a hardware/software engineer designing sub-systems used in oscilloscopes. ...

Yeah, I'm a physician so all my computer knowledge is hobby work, but my dad was an EE, so my science project in 1st grade was parallel vs series lighting while other kids were doing things with plants and volcanoes. Cheesy

And yes, I remember MB HDs. After my IIc I bought a 386 SX16 (turbo 16 MHz) from Gateway, and my dad paid an extra $500 for the 40 MB WD HD. It was an IDE drive, not one of the RLL/MFM/ESDI ones. Yeah, he still has punch cards in the garage along with his oscilloscope. I do indeed remember the talk about hard drive limits at that time. I learned to program in Fortran, BASIC, C and COBOL, but that's all been erased from my mind and replaced with useless medical knowledge.

I still read through the IEEE magazines from time to time, and I have the 2011 issue talking about digital currency and Bitcoin.

Anyway, the circuit will be revolutionized - it just needs a Eureka moment. It's not my field of study, so I get my information from stuff like this and from other forum members:
https://www.youtube.com/watch?v=gjx5y9lrtwU
dyask (Hero Member)
September 03, 2014, 07:05:14 AM
#67

Quote from: DrG
Yeah, I'm a physician so all my computer knowledge is hobby work, but my dad was an EE, so my science project in 1st grade was parallel vs series lighting while other kids were doing things with plants and volcanoes. Cheesy
...

I once considered medical school, and even read up on the MCAT before deciding it wasn't for me. I respect how much a physician has to go through to get to practice. It seems medical fields are really starting to explode with innovation!
Nagle (Legendary)
September 03, 2014, 07:53:17 AM
#68

Quote from: DrG
If TSMC is booked up, well, then 28nm it is.

The hash rate is rising so fast that by the time 14nm ASIC fab capability becomes available, fabbing the parts will be uneconomic. We're headed for 1 exahash around the beginning of 2015, about 4x the current hash rate.

By Q1 2015, all miners will be losing money. Then what?

Difficulty will go to the moon!
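
A rough back-of-the-envelope sketch of that dilution (every figure here is an illustrative assumption, not a measurement: roughly 250 PH/s of network hash rate today versus the projected 1 EH/s, a 25 BTC block subsidy with fees ignored, and a hypothetical 1 TH/s machine):

Code:
# Sketch: how a 4x jump in network hash rate dilutes a fixed miner's revenue.
# All figures below are illustrative assumptions, not measured values.

BLOCKS_PER_DAY = 144        # ~one block every 10 minutes
BLOCK_REWARD_BTC = 25.0     # 2014-era subsidy, ignoring transaction fees

def daily_btc(miner_ths, network_phs):
    """Expected BTC per day for a miner_ths TH/s machine on a network_phs PH/s network."""
    share = miner_ths / (network_phs * 1000.0)   # 1 PH/s = 1000 TH/s
    return share * BLOCKS_PER_DAY * BLOCK_REWARD_BTC

miner = 1.0                       # hypothetical 1 TH/s machine
for network in (250.0, 1000.0):   # ~current rate vs. the projected exahash
    print(f"network {network:6.0f} PH/s -> {daily_btc(miner, network):.4f} BTC/day")

On those assumptions the same box earns about a quarter as many BTC per day once the network quadruples, which is what pushes the marginal miners under water.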
DrG (Legendary)
September 03, 2014, 08:28:19 PM
#69

Quote from: Nagle
The hash rate is rising so fast that by the time 14nm ASIC fab capability becomes available, fabbing the parts will be uneconomic. We're headed for 1 exahash around the beginning of 2015, about 4x the current hash rate.

By Q1 2015, all miners will be losing money. Then what?

Difficulty will go to the moon!

Well, at some point somebody will shut off their miners. First to go will be most European and Australian miners, since they have high energy rates. If that doesn't bring the difficulty down, then the people who thought they had cheap power at $0.08/kWh will get kicked out.

The same thing happened with GPUs - that's when FPGAs started becoming popular. ASICs killed FPGA development.

Sooner or later we'll reach equilibrium. At that point people with "free" or insanely cheap power can mine against large companies.
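
A minimal break-even sketch of that squeeze (all specs and prices are hypothetical assumptions: a 2 TH/s miner drawing 2000 W, i.e. about 1 J/GH, BTC at $500, a 25 BTC subsidy, fees ignored):

Code:
# Sketch: the electricity price at which a given miner stops covering its power bill.
# Every number below is a hypothetical assumption for illustration only.

BLOCKS_PER_DAY = 144
BLOCK_REWARD_BTC = 25.0

def breakeven_usd_per_kwh(miner_ghs, watts, network_phs, btc_price_usd):
    """Electricity price (USD/kWh) at which daily revenue equals daily power cost."""
    share = miner_ghs / (network_phs * 1e6)               # 1 PH/s = 1e6 GH/s
    revenue_usd = share * BLOCKS_PER_DAY * BLOCK_REWARD_BTC * btc_price_usd
    kwh_per_day = watts * 24.0 / 1000.0
    return revenue_usd / kwh_per_day

# Hypothetical 2 TH/s miner at 2000 W (about 1 J/GH) with BTC at $500:
for network in (250.0, 1000.0):   # network hash rate in PH/s
    rate = breakeven_usd_per_kwh(2000.0, 2000.0, network, 500.0)
    print(f"network {network:6.0f} PH/s -> break-even at ${rate:.3f}/kWh")

Under those assumed numbers, a machine that clears $0.08/kWh comfortably at today's hash rate is already below break-even once the network reaches an exahash, which is exactly the shake-out described above.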