#1 | April 10, 2016, 03:24 AM
Top Prospect
Join Date: Aug 2011
Posts: 69
Intel CPU Rant

Gaming and simulations see very little benefit from Intel's quest to just keep adding more cores and lowering the frequency. Desktop consumers want to see more than a 7% gain after 3+ years of CPU "progress" ... adding cores solves NOTHING for games and simulations.

The best stable OC for the 5960X is around 4.3 GHz, while the 3960X does 4.8 GHz (under common cooling solutions, not extreme ones) ... yet the performance difference is only about 7%. So 3+ years of CPU progress produces just a 7% gain? Most of that gain is probably down to the X99 chipset and not the CPU.
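
One way to sanity-check that 7% figure (the per-clock uplift below is an assumed number for illustration, not a measured benchmark): if the 5960X does roughly 20% more work per clock than the 3960X but tops out at 4.3 GHz instead of 4.8 GHz, the net single-threaded gain works out to only a few percent.

Code:
# Illustrative only: assumed ~20% per-clock (IPC) uplift for the 5960X over the 3960X.
ipc_uplift = 1.20
clock_5960x = 4.3   # GHz, typical stable OC on common cooling (figure from the post above)
clock_3960x = 4.8   # GHz, typical stable OC on common cooling (figure from the post above)

net_gain = ipc_uplift * clock_5960x / clock_3960x - 1
print(f"Net single-threaded gain: {net_gain:.1%}")   # roughly 7-8%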

Consumers want higher frequency and fewer cores; that's what works best for desktop computing, games, and simulations. The fact that this doesn't fit Intel's marketing strategy isn't a justification to NOT provide what consumers really want ... more die space per core, higher frequency.

I would much rather pay $1000 for a 4-core CPU running at 6 GHz than for an 8-core CPU running at 4.3 GHz.

What do you guys think?

CG

#2 | April 10, 2016, 05:10 AM
Sagath
Moderator, F@H
Join Date: Feb 2009
Location: Petawawa, ON
Posts: 3,020

It is not as simple as that. There are many technological reasons why high-frequency scaling is, and always will be, a problem: any silicon product can only switch so fast. Running faster requires more voltage, which means more heat. Dynamic power scales with switching frequency times voltage squared, and since voltage has to rise with frequency, power grows roughly with the cube of the clock speed.
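
A rough sketch of that scaling (a simplified model, assuming dynamic power P = C * V^2 * f and that voltage must rise roughly linearly with frequency; the base voltage and capacitance values are arbitrary placeholders):

Code:
# Simplified dynamic-power model: P = C * V^2 * f, with V assumed to scale linearly with f.
def dynamic_power(freq_ghz, base_freq=4.0, base_voltage=1.2, capacitance=1.0):
    voltage = base_voltage * (freq_ghz / base_freq)   # assumed linear V-f relationship
    return capacitance * voltage**2 * freq_ghz

ratio = dynamic_power(6.0) / dynamic_power(4.0)
print(f"4 GHz -> 6 GHz costs about {ratio:.1f}x the power")   # ~3.4x, i.e. (6/4)^3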

The focus needs to be on parallel processing for many reasons. Workloads that can only use a single thread are antiquated.

This article is fairly good at describing what I've said, but it's not the best technical description if that's what you're really after: https://www.comsol.com/blogs/havent-...ed-last-years/
__________________
My disclaimer to any advice or comment I make:
Quote:
Originally Posted by CroSsFiRe2009
I'm a self certified whizbang repair technician with 20 years of professional bullshit so I don't know what I'm talking about

#3 | April 10, 2016, 10:13 AM
Shadowmeph
Hall Of Fame, F@H
Join Date: Oct 2007
Posts: 3,617

My personal opinion is that for gaming, a 4-core CPU at, say, 3.9-4.3 GHz with a decent card works well. Hell, I am still using a 2500K and haven't had any CPU-related problems running games. The only problem I have is that I need a decent card; I am only using an AMD 7770, lol. It works fine for most games, but there are times I really notice the lack of GPU, hehe.

#4 | April 10, 2016, 11:18 AM
ern88
Allstar
Join Date: Jan 2011
Location: Halifax, Nova Scotia
Posts: 663

Quote:
Originally Posted by Shadowmeph
My personal opinion is that for gaming, a 4-core CPU at, say, 3.9-4.3 GHz with a decent card works well. Hell, I am still using a 2500K and haven't had any CPU-related problems running games. The only problem I have is that I need a decent card; I am only using an AMD 7770, lol. It works fine for most games, but there are times I really notice the lack of GPU, hehe.
Same here. I am also running an i5 2500K, at 4.6 GHz. I am going to wait for the new R9 490X to replace my HD 7950 Vapor-X. I see no big need to upgrade my CPU.

#5 | April 10, 2016, 11:29 AM
JD
Moderator, F@H
Join Date: Jul 2007
Location: Toronto, ON
Posts: 7,486

I'm inclined to say Intel doesn't feel motivated to push the boundaries like they did back in the P4/PD to C2D era. That was a time when AMD was beating them in performance and price, and they had to do something to take back market share. They kept inching ahead while AMD kept falling behind, and now it has reached the point where Intel sees no reason to invest tons of money in CPU R&D when they hold the performance crown.

#6 | April 10, 2016, 12:06 PM
Sagath
Moderator, F@H
Join Date: Feb 2009
Location: Petawawa, ON
Posts: 3,020

Quote:
Originally Posted by JD
I'm inclined to say Intel doesn't feel motivated to push the boundaries like they did back in the P4/PD to C2D era. That was a time when AMD was beating them in performance and price, and they had to do something to take back market share. They kept inching ahead while AMD kept falling behind, and now it has reached the point where Intel sees no reason to invest tons of money in CPU R&D when they hold the performance crown.
So true. We had 4 cores and 8 threads back in the X58 days. Here we are almost SEVEN years later, still running 4 cores and 8 threads for the masses. Sure, Xeons with 16 cores and 32 threads are out there, but the price isn't consumer-appropriate, and they aren't really needed for the masses either.

Poor AMD and their Athlon 64s, including the FX series. Such beasts. P4s were so bad relative to them on multiple levels.

#7 | April 10, 2016, 01:00 PM
Top Prospect
Join Date: Aug 2011
Posts: 69

Hello fellows,

I appreciate the feedback. The more I think about it, the more I realize that my i7-2600K and GTX 970 will be good enough for the next three to five years anyway. I do not see much advantage in upgrading to a new CPU at this time. Intel has not provided a product that is superior enough to justify a full upgrade!

Thanks again

CG

#8 | April 10, 2016, 02:19 PM
sswilson
Moderator, F@H
Join Date: Dec 2006
Location: Moncton, NB
Posts: 16,146

I'd argue that the incremental improvements could make a fairly hefty difference depending on your use.

Admittedly, a 25% improvement (just a guesstimate I pulled out of my rear end, but I don't think it's too far out) isn't going to make much difference in day-to-day use, or for any kind of gaming that is GPU- rather than CPU-limited. But if you do any kind of heavy multi-threaded crunching (say, encoding or ripping content), then the time saved could be substantial, especially if you're dealing with a large library.
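
To put a hypothetical number on that (both the 25% figure and the library size below are assumptions for illustration): a 25% throughput gain turns a 100-hour batch encode into an 80-hour one.

Code:
# Hypothetical example: time saved on a big encode/rip job from a 25% throughput gain.
library_hours_old = 100.0    # assumed total encode time for the library on the current CPU
throughput_gain = 0.25       # the rough 25% estimate above

library_hours_new = library_hours_old / (1 + throughput_gain)
print(f"New runtime: {library_hours_new:.0f} h, saving {library_hours_old - library_hours_new:.0f} h")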

We also can't dismiss power-consumption improvements as inconsequential. It doesn't make much difference to us folks at home with one or two PCs, but there are real savings to be had for entities running hundreds or thousands of them.
__________________
Gigabyte Z97N-WIFI / i7 4770K / 2X 8G Gskill 1866 Sniper / XFX XTR 750 / EVGA GTX 970 SSC ACX 2.0+
Corsair 380T / Corsair H100i GTX AIO/ Intel 730 Series 480GB SSD / Seagate Barracuda 1TB / Dell UltraSharp U2412M

Asrock AM1H-ITX / AM1 Athlon 5350 / 2X4G Gskill PC3-14900 / Intel 6235 Wi-Fi / 90W Targus Power Brick / Uncased 256GB Sandisk Z400S SSD / Mini-Box M350 / 24" Westinghouse L2410NM

#9 | April 10, 2016, 02:27 PM
MVP
Join Date: Jul 2014
Location: Metro-Vancouver
Posts: 321

Earlier this year I was doing some early spring cleaning and came across a box of old Maximum PC magazines. It was funny just how wrong they were about the road to 10 GHz single-core chips.

#10 | April 10, 2016, 03:25 PM
Sagath
Moderator, F@H
Join Date: Feb 2009
Location: Petawawa, ON
Posts: 3,020

Quote:
Originally Posted by sswilson
I'd argue that the incremental improvements could make a fairly hefty difference depending on your use.

Admittedly, a 25% improvement (just a guesstimate I pulled out of my rear end, but I don't think it's too far out) isn't going to make much difference in day-to-day use, or for any kind of gaming that is GPU- rather than CPU-limited. But if you do any kind of heavy multi-threaded crunching (say, encoding or ripping content), then the time saved could be substantial, especially if you're dealing with a large library.
It's not just that 32-36%; it's also the instruction sets they've added/changed/updated. Quick Sync v1 to v5 is nothing to sneeze at for transcoding H.264/H.265. All that time saved from faster x/y/z translates into wattage savings on your power bill as well. You also gain chipset differences out the wazoo, which isn't directly CPU-related but obviously comes with upgrading the motherboard.
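
As a rough illustration of the power-bill point (every number here is an assumption): energy is just average power draw times runtime, so a job that finishes a few hours sooner at a similar package power draw saves energy in proportion.

Code:
# Assumed figures only: energy saved = average power draw * runtime reduction.
power_watts = 90.0       # assumed average package power during a transcode
old_runtime_h = 10.0     # assumed runtime on the older CPU
new_runtime_h = 7.0      # assumed runtime after the upgrade

saved_kwh = power_watts * (old_runtime_h - new_runtime_h) / 1000
print(f"Energy saved per job: {saved_kwh:.2f} kWh")   # 0.27 kWh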

This isn't even taking into account overclocking of either CPU.