Bitcoin Forum
November 21, 2017, 01:07:53 AM *
News: Latest stable version of Bitcoin Core: 0.15.1  [Torrent].
 
Author Topic: CPU and integrated GPU  (Read 905 times)
MajorGur
Member
**
Offline Offline

Activity: 66


View Profile
April 07, 2016, 10:00:56 AM
 #1

Hello guys,

Recently I've been watching a lot of PC content, including guides on how to game on a very cheap computer.
One claim that kept coming up: when you use an integrated GPU and bump up the amount of RAM, it will get you better performance.

Is this also the case when you are CPU mining?

~thanks in advance
salaman112
Member
**
Offline Offline

Activity: 105


View Profile
April 07, 2016, 10:01:41 AM
 #2

I am interested in this too!

Zitdadast
Sr. Member
****
Offline Offline

Activity: 317

LTC fan 4ever


View Profile
April 07, 2016, 02:21:32 PM
 #3

Most of us are using standalone graphics cards. These are more powerful and can handle many algorithms.
MaxDZ8
Hero Member
*****
Offline Offline

Activity: 673



View Profile
April 07, 2016, 05:51:03 PM
 #4

Quote from: MajorGur on April 07, 2016, 10:00:56 AM
"...when you use an integrated GPU and bump up the amount of RAM... it will get you better performance."

It is a fact in certain situations. Nowadays, if you bump up from 4 GiB to 8 GiB, your games will certainly run better.

Quote from: MajorGur on April 07, 2016, 10:00:56 AM
"Is this also happening when you are CPU mining?"

No. Mining consumes a negligible amount of memory (often less than a single texture). Some algorithms consume more, but they're still relatively rare.
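To put the "negligible memory" point in concrete terms, here is a small Python sketch (illustrative only - real miners run these hashes on GPU/ASIC hardware, not through `hashlib`). SHA-256, Bitcoin's proof-of-work hash, keeps its entire state in a few hundred bytes, while a deliberately memory-hard algorithm like scrypt (used by Litecoin) has a tunable working set of roughly 128 * n * r bytes - still tiny compared to what games want from system RAM:

```python
import hashlib

# SHA-256: the whole hashing state fits in a few hundred bytes,
# so adding system RAM does nothing for the hashrate.
digest = hashlib.sha256(b"example block header").hexdigest()

# scrypt: memory-hard by design; working set is about 128 * n * r bytes,
# so n=1024, r=1 needs only ~128 KiB per hash instance.
key = hashlib.scrypt(b"example block header", salt=b"nonce",
                     n=1024, r=1, p=1, dklen=32)

print(len(digest), len(key))  # 64 hex chars, 32 bytes
```

The input bytes here are placeholders, not a real block header; the point is only the memory footprint of each algorithm, which is why extra RAM helps integrated-GPU gaming but not CPU mining.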
QuintLeo
Hero Member
*****
Offline Offline

Activity: 882


View Profile
April 08, 2016, 08:31:36 AM
 #5

AMD A-series APUs, especially the high-end A10 units, have the equivalent of a low-end AMD graphics card on them - my A10-5700's integrated GPU has about 65% of the performance of the AMD 7750 (384 cores vs. 512, clocked a bit lower, but it seems to talk to the CPU a bit more efficiently).

 The current top-line A10s have 512 cores and clock them a bit faster.


 Nothing else is even close on integrated graphics - Intel "Haswell" is a sad, pathetic joke in comparison, just as Intel graphics have never been competitive with NVIDIA or AMD/ATI in general.


 Stand-alone cards are a LOT more effective and efficient to mine with.
MajorGur
Member
**
Offline Offline

Activity: 66


View Profile
April 08, 2016, 01:34:19 PM
 #6

They are also much better for gaming. I mean, buying a lot of RAM for integrated graphics will be more expensive than just buying a cheap GPU, or even a GPU from Free Geek, etc.
joblo
Legendary
*
Online Online

Activity: 938


View Profile
April 08, 2016, 08:53:11 PM
 #7

Quote from: QuintLeo on April 08, 2016, 08:31:36 AM

What miner do you use for the IGP?

Principal developer of cpuminer-opt, the optimized multi-algo CPU miner.
BTC donation address: 12tdvfF7KmAsihBXQXynT6E6th2c2pByTT
https://bitcointalk.org/index.php?topic=1326803.0
QuintLeo
Hero Member
*****
Offline Offline

Activity: 882


View Profile
April 09, 2016, 07:12:00 AM
 #8

Any miner that will work with a discrete AMD vid card will work with the video on an A10 - depending on the generation, it's either a 6xxx or 7xxx series AMD vid card for software purposes.
MajorGur
Member
**
Offline Offline

Activity: 66


View Profile
April 09, 2016, 08:51:59 AM
 #9

How about the Polaris architecture? Will it improve gaming and mining power?
MaxDZ8
Hero Member
*****
Offline Offline

Activity: 673



View Profile
April 09, 2016, 04:28:02 PM
 #10

Of course.
Note that GCN 1.2 (Tonga) improved performance as well (it has an instruction to accelerate AES byte swaps), but apparently not many people bought those, and it might require new kernels, so I don't know if the improvements materialized in the real world.
joblo
Legendary
*
Online Online

Activity: 938


View Profile
April 09, 2016, 04:52:09 PM
 #11

Quote from: QuintLeo on April 09, 2016, 07:12:00 AM


Thanks. My next rig might have an A10.


QuintLeo
Hero Member
*****
Offline Offline

Activity: 882


View Profile
April 10, 2016, 06:22:19 AM
 #12

My last 2 rigs were A10-5700 based, the new one I should have the parts for sometime next week is going to be ... interesting.

 It's intended as a prototype for some future rig builds when I have money to burn.

 AMD A10-7860K (best price/performance point right now)
 3x GTX 950 (budget limitations, but should work fine for proof of concept while providing acceptable performance)
 ASUS A88X-PRO (3 onboard PCI-E x16 slots, as I'm NOT interested in messing with risers on this machine, AND good heat management design)
 16 GB DDR3 (2133, I think - whatever the fastest is that the CPU will handle) - I did spend a bit extra here for lower-CAS G.Skill over the lowest-cost option

 3 of the 4 CPU cores will be running GIMPS, with the 4th probably dedicated to the "CPU to support the GPUs" requirement for Folding@home.
 The GTXs are intended for Folding@Home, but I might do some Ethereum mining for a while to help defray some of the cost of the machine.
 Yes, I WILL participate in CureCoin when I go Folding - it's not much, but even a tiny amount of income helps a little.
 The GPU on the A10 will crunch RC5-72.

 I'll probably set one of the GTX up on "run on idle only" mode, as I also intend this specific machine to be my new primary gaming machine.


 I thought about going Intel, but the cost of CPUs there is a LOT higher, and the onboard GPU has very poor performance (it will run RC5, but at VERY low keyrates compared to an A10 GPU), though the GIMPS performance would be a ton higher. Motherboard cost was going to be directly comparable for what I insist on in a motherboard (those ASUS "BTC" motherboards all depend on risers to run more than 1 or 2 cards - NOT acceptable in a gaming machine to me).


 No, this is NOT an "efficient miner" design - it's multipurpose with mining as an afterthought.
joblo
Legendary
*
Online Online

Activity: 938


View Profile
April 10, 2016, 06:57:38 AM
 #13

Quote from: QuintLeo on April 10, 2016, 06:22:19 AM


Looks like an interesting project, but I have yet to find a 3-slot MB with proper spacing for cooling. Any more than 2 for me and it's open case with risers.

eddie13
Hero Member
*****
Offline Offline

Activity: 700


View Profile
April 10, 2016, 07:04:07 AM
 #14

I have one of these A10s, and it has proven impressive for a laptop hashing SHA.

MajorGur
Member
**
Offline Offline

Activity: 66


View Profile
April 10, 2016, 07:35:47 PM
 #15

Quote from: eddie13 on April 10, 2016, 07:04:07 AM


The fact that you can overclock the laptop's AMD processor is also great.
QuintLeo
Hero Member
*****
Offline Offline

Activity: 882


View Profile
April 11, 2016, 07:09:49 AM
 #16

Getting all 3 cards to cool will be tricky - it would be nice if someone would put out a MB with a 16/1/1/16/1/1/16 or similar slot configuration to FIX that issue, but too many current motherboards want to put a PCI-E x1 slot closest to the CPU, thus forcing 2 of your GPUs to be right on top of each other.

 I am seriously considering water-cooling for at least one GPU when I have the money to start building serious numbers of machines.



 Overclocking ANY laptop is a really bad idea - they are NOT designed with good cooling to start with, so you're almost guaranteed to have a FAILURE well before the warranty runs out - and overclocking probably WILL void that warranty.
Zitdadast
Sr. Member
****
Offline Offline

Activity: 317

LTC fan 4ever


View Profile
April 14, 2016, 03:16:27 PM
 #17

Quote from: QuintLeo on April 11, 2016, 07:09:49 AM


I used to mine some coins with the NVIDIA card in a laptop. But I undervolted it, and the temperature was just 60 degrees.
QuintLeo
Hero Member
*****
Offline Offline

Activity: 882


View Profile
October 27, 2016, 09:18:40 AM
 #18

The final cure to the "can't get the right spacing" issue turned out to be simple.

 Put a SHORT or a LOW-PROFILE card in the 3rd slot, leaving quite a bit of open space for the next card over to draw air in.

 Didn't work on the 950-based machine as such, as all 3 of those cards were short, but I eventually got a pair of 960s and put one of the 950s with them; the other pair of 950s ended up in an older "only has 2 slots" motherboard-based system.

toptek
Legendary
*
Offline Offline

Activity: 1092


View Profile
October 27, 2016, 09:27:26 AM
 #19

This might be badass next year - after years of second place, maybe we'll see the CPU war days again.

http://www.pcgamer.com/amd-confirms-zen-processors-launching-early-2017/

AMD is finally fighting back.
QuintLeo
Hero Member
*****
Offline Offline

Activity: 882


View Profile
October 27, 2016, 09:39:51 PM
 #20

It goes in waves.

 Intel lost their crown to Zilog for a while, back in the 8080/8085 vs. Z80 days - the 8086 and 8088 got it back, especially after IBM adopted the 8088 for the original PC.

 Intel held that crown for quite a long time before AMD briefly challenged it in the K5 vs. Pentium days.

 Intel then LOST the crown outright for a while in the Athlon vs. Pentium III and early P4 days, but AMD hit scaling issues on the Athlon, and the P4 eventually passed it up.

 Intel next lost the crown when AMD introduced the Athlon 64 and Opteron series, but eventually the Core line won it back.

 Dunno what's going to happen with Zen - seems like a lot of the competition currently is more in the "integrated with GPU" lines, where AMD kills Intel on graphics performance but Intel wins on the CPU side.

 Says a lot that Intel's current line is ALL "GPU integrated", even though the GPUs involved have poor performance and make the overall cost of the CPUs quite a bit higher.
