@Linguine I run the game maxed out at 1920x1080 / 120Hz on an i7-8700 / GTX 1070 Ti / 16GB DDR4 / 250GB M.2. The game runs fine. Now, if I plan on raiding, I'll drop the in-game preset to about 3 or 4 in the "Overall" settings.
You'll be fine.
I saw in the graphics panel that you can preset how far your graphics drop for raids and battlegrounds. You should check it out!
Have you seen the new Classic game engine? They actually cranked the graphics way up. Yeah, it's still using the old models and textures, but they added all the lighting, particle, water, sunshaft, and shadow effects from the modern engine. The graphics slider goes from 1 to 10, and it marks the 'classic' graphics setting at position 3.
You're not getting 165 FPS on a 165Hz 1440p display with integrated potato graphics. This isn't actually the 2004 game client; it's running on the modern engine.
You know what else is running on the modern engine? MODERN, which is where I was grinding Keepers of Time rep.
And nobody said anything about 165 FPS or 1440p. 1440p isn't even a real standard, and any difference between framerates above 30 is unnoticeable if you're actually playing the game and focusing on what you're doing, rather than standing over someone else's shoulder watching them play.
Your graphics aren't potato if they're from a newer AMD chip. Furthermore, running at the bare minimum framerate risks frame skips and is not ideal for a smooth experience, which is why people go for higher FPS even though the eye can't tell the difference above 60 or so.
In environments such as animated movies, Disney never had any problem with 24 FPS back in the day, but they could guarantee no frame skips.
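To put a number on the frame-skip point, here's a quick toy simulation (my own sketch with made-up numbers, not a benchmark of any real hardware). It assumes frame times wobble about ±20% around the average and counts how many miss a 60Hz refresh deadline:

```python
# Toy frame-pacing simulation (illustrative assumptions only):
# frame times vary uniformly +/-20% around the average, and any
# frame slower than the 60Hz budget (~16.7 ms) counts as a skip.
import random

def missed_frames(avg_fps, refresh_hz=60, frames=10_000, jitter=0.2, seed=1):
    random.seed(seed)
    budget = 1.0 / refresh_hz   # time allowed per refresh, in seconds
    mean = 1.0 / avg_fps        # average time to render one frame
    return sum(
        random.uniform(mean * (1 - jitter), mean * (1 + jitter)) > budget
        for _ in range(frames)
    )

for fps in (60, 90, 120):
    print(f"avg {fps} FPS -> {missed_frames(fps)} skipped frames out of 10,000")
```

At an average of exactly 60 FPS, roughly half the frames blow the budget; with headroom at 90 or 120, the jitter gets absorbed entirely. That's the real reason to chase framerates above what the eye can resolve.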
The point is, you can run Classic at decent settings on a computer that is barely above potato.
The newer AMD chips are twice as good as potato; see the Overwatch comparison below, which pits both chips against each other.
https://gizmodo.com/amds-newest-processors-are-so-good-you-can-skip-the-gra-1822920100
On the lowest graphics setting the AMD APUs managed an average 66fps, which is more than enough for most players and almost twice as fast as the 33fps of the Intel CPU. And on Ultra it still managed a perfectly acceptable 35 frames per second versus the Intel's unplayable 17fps. That means you could build an Overwatch-ready computer with something as cheap as the $100 Ryzen 3 2200G.
If someone wants to run Classic on low settings with Intel integrated graphics, it's certainly doable, at least in the starter zones. For raids they may need to turn the graphics down to the bare minimum.
However, to take advantage of and enjoy the higher graphics quality, Intel integrated graphics do not cut it. Maxing out the graphics got me ~15 FPS in starter zones with no one around.
Linguine is asking about max graphics settings at a high frame rate, not decent graphics settings. He's already mentioned that he wants 60 FPS or better, based on his original post where he was turning the graphics preset down to 3 to achieve that framerate. I'm not sure how pointing out that you can crank the preset up to maximum while getting 30 FPS answers his question of how to achieve 60+ FPS with the graphics maxed.
Okay, try these settings when retail launches. MSAA, AA, post-process AA, and texture filtering tax the GPU because they make the image and the edges of everything look sharper.
MSAA: Multisample anti-aliasing (MSAA) is a type of spatial anti-aliasing, a technique used in computer graphics to improve image quality.
AA: Anti-Aliasing (in computer graphics) a technique used to add greater realism to a digital image by smoothing jagged edges on curved lines and diagonals.
Post-Process AA (a.k.a. SMAA, not to be confused with MSAA): an AA mode based on the post-AA blur filter of MLAA (and FXAA). The aliasing "detection" is upgraded and is closer to the detection used in MSAA than the detection used in MLAA and FXAA. The result is that SMAA remains very cheap, still smooths alpha textures, and still greatly reduces the visible "jaggies", but doesn't blur the image as much.
**These are not the settings I use myself, just settings I recommend**
Here is an example of edges with Anti-Aliasing ON
Here is an example of edges with Anti-Aliasing OFF
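You can reproduce the same effect in a toy rasterizer (my own sketch, nothing to do with the actual game renderer). It shades a diagonal edge once with one sample per pixel and once with four samples per pixel, and the extra samples are what turn the staircase into a gradient:

```python
# Toy rasterizer (illustration only): shade each pixel by what
# fraction of its sample points fall below the diagonal edge y = x.
def coverage(px, py, grid):
    hits = 0
    for sy in range(grid):
        for sx in range(grid):
            x = px + (sx + 0.5) / grid   # sample position inside the pixel
            y = py + (sy + 0.5) / grid
            hits += y < x                # is this sample below the edge?
    return hits / (grid * grid)

SHADES = " .:#"                          # coverage from 0% up to 100%
for label, grid in (("1 sample/pixel (aliased)", 1), ("4 samples/pixel (AA)", 2)):
    print(label)
    for row in range(8):
        print("".join(
            SHADES[min(3, int(coverage(col, row, grid) * 4))] * 2
            for col in range(8)
        ))
```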
15 is still playable.
I'm not recommending Intel integrated graphics. I'm just saying that, if my AMD Advanced Potato Unit 7860k can produce playable frame rates at maximum settings, then you don't need to spend money on a high-end or even medium GPU. Even a GT 1030 might be overkill.
He's already mentioned that he wants 60FPS or better
That language did not appear in his post.
I believe he is referring to this part of my original post, and he is correct in his extrapolation:
I hover around 3-4 on the graphics settings to get 45-60 FPS
As for 15 FPS: I do not enjoy faux lag spikes that are really just frame skips caused by having no performance buffer, especially when they could hurt my performance while no one else is affected (in 2005, everyone else would have been hindered by the hardware of the time too).
Perhaps others do not PvP or expect to be anywhere near a raid or Alterac Valley, but I do, and I'd like my Classic WoW to be beautiful with all the benefits 2019 has to offer.
I would add that MSAA, by sampling the signal at a higher rate to begin with and then downsampling with a box filter, is the only "true" AA from a mathematical point of view, and yields "more correct" images overall regardless of where the edges are. All other methods are based on sampling the signal at an already-low rate, identifying jump discontinuities, and filtering the signal around those discontinuities. This is a computationally efficient way to get rid of jaggies, but it really has nothing to do with aliasing, and should be more accurately referred to as "anti-jagging" than anti-aliasing.
Also, if you're using multisampling, then you do NOT need any anti-jagging on top of that. That's just a waste of CPU cycles and will make your image look worse instead of better, if it does anything at all.
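Here's a rough 1-D illustration of that distinction (my own toy, with an arbitrary edge position, not anyone's actual renderer). One version supersamples a step edge at 4x and box-filters it down; the other samples at the output rate, finds the jump, and blurs around it:

```python
# 1-D toy (illustrative only): a hard step edge at x = 0.43, chosen
# so it falls inside a pixel rather than on a pixel boundary.
def signal(x):
    return 1.0 if x >= 0.43 else 0.0

N = 8  # output pixels

# "True" AA: 4 samples per pixel, averaged down with a box filter.
msaa = [sum(signal((i + (k + 0.5) / 4) / N) for k in range(4)) / 4
        for i in range(N)]

# Post-process style: one sample per pixel, then blur around any jump.
raw = [signal((i + 0.5) / N) for i in range(N)]
post = raw[:]
for i in range(1, N - 1):
    if raw[i - 1] != raw[i + 1]:   # discontinuity detected near pixel i
        post[i] = (raw[i - 1] + raw[i] + raw[i + 1]) / 3

print("box-filtered supersampling:", msaa)
print("detect-and-blur:           ", post)
```

The supersampled result lands near each pixel's true coverage of the edge; the detect-and-blur result just redistributes values that were already sampled too coarsely, which is why I'd call it anti-jagging rather than anti-aliasing.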
I apologize for this spontaneous lesson in signal-processing.
Then it is a good thing that I did not recommend hardware that would yield 15 fps.
You can run it at max settings on integrated graphics.
This thread was never about running Classic on a potato. The entire premise was that my computer, integrated graphics included, was not enough for me personally, that I was willing to spend to achieve higher, and that I did not want to overspend or over-upgrade to get the graphics quality I desired.
Did you think that my comment was referring to your exact integrated graphics solution, and that I was recommending running the game on the bare minimum hardware capable of running it? If that was your assumption, then it would explain the confusion.