We can't explain the universe, just describe it; and we don't know whether our theories are true, we just know they're not wrong. -- Harald Lesch
Here's a quickie: I'd like some input on which of these four i5s I should buy for a pure gaming rig.
http://www.schottenland.de/produkt-vergleich/20598003-20598048-21626618-21673651
I should note that the price of the 3570 (no K) in the comparison is, for some reason, ~€20 too high. I could get it for €177.25.
Apparently I'd have to pay a premium of approx. €10-15 for Ivy Bridge and ~€20-30 for being able to OC the CPU.
1a) So where's the sweet spot?
1b) Is there even any advantage to Ivy Bridge on my Z68 board? (aside from lower power draw)
Keep in mind that I would only do a mild OC -- to around 4 GHz (= +16%) -- which translates into roughly 6% higher framerates. (At least that's what I conclude from this graph.)
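For what it's worth, the "+16% clock, only ~6% more fps" relationship can be back-solved with a simple Amdahl-style model. This is just a sketch under the assumption that frame time splits into a part that scales with CPU clock and a fixed (GPU/other) part; only the 16% and 6% figures come from the post, the model and function name are mine:

```python
# Hedged sketch: estimate what fraction of frame time would have to be
# CPU-bound for a +16% clock to yield only ~6% more fps (Amdahl's law).

def cpu_bound_fraction(clock_speedup, fps_speedup):
    """Solve fps_speedup = 1 / ((1 - p) + p / clock_speedup) for p."""
    return (1 - 1 / fps_speedup) / (1 - 1 / clock_speedup)

p = cpu_bound_fraction(clock_speedup=1.16, fps_speedup=1.06)
print(f"~{p:.0%} of frame time would be CPU-limited")  # roughly 41%
```

Under those assumptions, only about 41% of the frame time is CPU-limited in that benchmark, which is why the clock bump doesn't translate 1:1 into fps.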
Other info: Currently I own an XFX HD6870, which I assume would bottleneck any of those i5s.
2a) Correct?
2b) But would the 2500 be the right choice then?
2c) How would the picture change if I threw in a new GPU in a year or two?
Well, rockz talked me out of an Ivy Bridge with this graphic.
I don't really know enough about CPUs to give technical advice here, but my 2500K @ 4.0 GHz hasn't had anywhere near an issue with any of the games I've played on it (including SC2, which is supposed to be CPU-intensive, I believe). To be honest, I don't think the overclock has really impacted gaming performance at all.
Ivy Bridge is only better if you don't want to overclock (or overvolt, more accurately). But I can't think of a scenario where OC'ing is actually useful.
Well, rockz talked me out of an Ivy Bridge with this graphic.
Interesting: at 4 GHz the Sandy Bridge runs about 15% cooler while only being about 2% slower.
my 2500K @ 4.0 GHz hasn't had anywhere near an issue with any of the games I've played on it (including SC2, which is supposed to be CPU-intensive, I believe). To be honest, I don't think the overclock has really impacted gaming performance at all.
This is the very question I'm trying to answer with this thread. I guess what I'm asking is,
do I even need more than a regular 2500?

Ivy Bridge is only better if you don't want to overclock (or overvolt, more accurately). But I can't think of a scenario where OC'ing is actually useful.
Which means you'd recommend the 3570 over the 2500, even though the graph from the OP shows that it's only ~5% faster for €20 more, right? Why?
Or do you mean that I should go the more expensive OC route and pick the 2500K? Will I actually be able to use the CPU power in games?
Post has been edited 1 time(s), last time on Feb 9 2013, 7:50 pm by NudeRaider.
As I've been saying for a while now, there's no reason to overclock PCs any more, unless you just like big benchmark numbers that don't actually mean anything in practice.
OC helps video encode times and can be a huge timesaver for streamers/uploaders, believe it or not.
The processor isn't your bottleneck unless you're doing something CPU heavy, even then there's GPU acceleration.
As I've been saying for a while now, there's no reason to overclock PCs any more, unless you just like big benchmark numbers that don't actually mean anything in practice.
I don't care about benchmarks; I care about having fps ≥ 30. That being said, I'd be interested to know where your advice is coming from. Is it simply because Core i5s won't bottleneck even powerful GPUs in games?
OC helps video encode times and can be a huge timesaver for streamers/uploaders, believe it or not.
Thanks for your input, but obviously this isn't relevant for a pure gaming rig.
As I've been saying for a while now, there's no reason to overclock PCs any more, unless you just like big benchmark numbers that don't actually mean anything in practice.
I don't care about benchmarks; I care about having fps ≥ 30. That being said, I'd be interested to know where your advice is coming from. Is it simply because Core i5s won't bottleneck even powerful GPUs in games?
Yes. The main thing that affects game FPS is your screen resolution. If you're at 1920x1200 or 1920x1080 (bleh!) then a mid-range, or upper mid-range, GPU is all you need. Pair it up with a mid-range CPU and you'll be fine.
OC helps video encode times and can be a huge timesaver for streamers/uploaders, believe it or not.
Thanks for your input, but obviously this isn't relevant for a pure gaming rig.
Yes, that's true. If you have a specific workload that is computationally heavy, overclocking your CPU will be worth it, although all it means in practice is you're spending less money up front and taking a slight risk of damaging your CPU (and mobo) over time. But most people don't, so for most people overclocking your CPU and going out of your way to buy fancy coolers is a false economy.
As I've been saying for a while now, there's no reason to overclock PCs any more, unless you just like big benchmark numbers that don't actually mean anything in practice.
I don't care about benchmarks; I care about having fps ≥ 30. That being said, I'd be interested to know where your advice is coming from. Is it simply because Core i5s won't bottleneck even powerful GPUs in games?
Yes. The main thing that affects game FPS is your screen resolution. If you're at 1920x1200 or 1920x1080 (bleh!) then a mid-range, or upper mid-range, GPU is all you need. Pair it up with a mid-range CPU and you'll be fine.
I am at 1920x1080.
The graphs seem to indicate otherwise (avg. fps from Shogun 2*):
i5 2500K @ 3.3 GHz: 26 fps (stock)
i5 2500K @ 5 GHz: 34 fps (30% faster, 51% higher clock)
i5 3570K @ 3.4 GHz: 34 fps (stock)
i5 3570K @ 5 GHz: 43 fps (26% faster, 47% higher clock)
As nothing else changed, I'd assume there is a strong correlation between fps and CPU clock. So what do you base your statement on? Are these two games exceptions to the rule because they are inefficient / CPU-intensive?
* The numbers aren't as strong in the Arma II graph, which is probably because it's not as CPU-intensive.
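Plugging the Shogun 2 numbers above into the same simple Amdahl-style model (a sketch under the assumption that frame time splits into a CPU-clock-scaled part and a fixed part; only the fps and clock figures come from the posts) suggests that benchmark really is heavily CPU-bound:

```python
# Hedged sketch: estimate how CPU-bound the Shogun 2 benchmark is,
# using the quoted stock-vs-5GHz fps numbers and Amdahl's law.

def cpu_bound_fraction(clock_speedup, fps_speedup):
    # fps_speedup = 1 / ((1 - p) + p / clock_speedup), solved for p
    return (1 - 1 / fps_speedup) / (1 - 1 / clock_speedup)

# i5 2500K: 3.3 -> 5.0 GHz, 26 -> 34 fps
p_2500k = cpu_bound_fraction(5.0 / 3.3, 34 / 26)
# i5 3570K: 3.4 -> 5.0 GHz, 34 -> 43 fps
p_3570k = cpu_bound_fraction(5.0 / 3.4, 43 / 34)
print(f"2500K: ~{p_2500k:.0%} CPU-bound, 3570K: ~{p_3570k:.0%} CPU-bound")
```

Both data points land around 65-70% CPU-bound under this model, i.e. much higher than typical games, which would explain why the clock speed matters so much there.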
That game is quite CPU-heavy; in fact, the article mentions they picked the games specifically for their CPU-heavy attributes. In the Arma II graph they intentionally turned AA and AS off to widen the difference between the CPU speeds.
You might see a small performance gain with a better CPU, but a better GPU is far more effective at increasing frame rate.
As nothing else changed, I'd assume there is a strong correlation between fps and CPU clock. So what do you base your statement on? Are these two games exceptions to the rule because they are inefficient / CPU-intensive?
Nothing specific, just the general power of CPUs and GPUs today. And yes, those two games were specifically chosen because they are CPU intensive, rather than being representative of most games.
The other points you have to consider here: how much did the extra cooler cost to enable the 5 GHz overclock? There's a good chance that saving that money and putting it towards a better GPU instead would give you a bigger FPS gain than the overclock would (and without risking your HW or voiding warranties). They're also using a top-of-the-line graphics card there; if you get a less powerful / cheaper one, you may end up GPU-limiting the game to the extent that the faster CPU doesn't achieve much of anything.
Finally, you *can* always turn the graphical settings down, you know. Really, the difference in quality between 'ultra' and 'high' is very minor in most games, if not imperceptible. It's not like games up until 2005, where you really didn't want to play on the lower settings because they looked like poo. I played through Far Cry 3 on High settings and it was great. I put it up to Very High and Ultra to see what they looked like and couldn't really tell much of a difference (Medium and Low, on the other hand...). My GPU didn't struggle with those settings in the area I was in at the time either, so I probably could have just played on them, but I preferred to insulate myself from frame drops. This is on a Q6600 from 2007 running at its stock 2.4 GHz, with 4 GB of RAM and a GTX 460, playing at 1920x1200.
Makes me wonder if I should even upgrade my Intel G840 @ 2.8 GHz at all.
Possibly, to an i3, if you have the money.
Possibly, to an i3, if you have the money.
That's not worth the upgrade money.
Yeah, an i3 is out of the question. Price isn't so much of an issue here; it's more about what makes sense: how much CPU power current games actually need, and how much they will need in the foreseeable future.
how much they will need in the foreseeable future.
Which is an interesting point. Part of the reason for the stagnation in game requirements is the current generation of consoles: lots of games are ported from consoles to PC (or designed for 2-3 targets from the ground up) and so generally don't take advantage of the extra capabilities a PC has to offer. As Sony and Microsoft seem to be coming out with their new consoles this year, we should see games over the next 2-3 years take a step up in requirements compared to what we've been seeing for the last 3-4.
So I guess, find out how powerful those consoles are supposed to be compared to standard PC CPUs and use that as a factor in your purchase?
FYI, upgrading my Athlon processor to a 3570K made my computer's framerates much higher in StarCraft and Oblivion. I only have a 5770, and now I'm getting 60 fps with fairly high detail in both. The 2500 is worth saving the money on.
"Parliamentary inquiry, Mr. Chairman - do we have to call the Gentleman a gentleman if he's not one?"
I get 60 FPS in SC2 on ultra with a Pentium.
I suppose I should say that this value is not comparable with other computers, since I got 25 fps before. That's at 2560x1440 on NOTD and all that.