
StarCraft 2 and anti-aliasing: a tempest in a teapot

To much fanfare, Blizzard finally released StarCraft 2: Wings of Liberty on Tuesday, ending twelve years of waiting. The release was largely free of major problems—although some users had difficulty connecting to Battle.net initially, and there is a fairly sizable list of known issues. The one we’ve been hearing about the most, however, isn’t on the list: anti-aliasing, and the fact that StarCraft 2 simply doesn’t support it. It’s not a hidden setting, and it isn’t limited to a particular setup. No—Blizzard decided not to officially support anti-aliasing in the game at release.

The Tempest

Despite Blizzard’s lack of official support for it, some users are playing StarCraft 2 with anti-aliasing. NVIDIA elected to modify their drivers, and “enabled” anti-aliasing in StarCraft 2 with a “brute-force” approach. Users and websites have noted that, unsurprisingly, this can have a significant negative impact on performance—especially at higher resolutions.

ATI did not modify their drivers. Instead, ATI issued official statements saying, in effect, that they stood by Blizzard’s decision for now, and that they were “committed to making AA perform at an acceptable level before we release it to our customers.” That means only that ATI wasn’t comfortable with the performance level they could provide at release; nothing more, nothing less.

ATI has addressed the issue twice in as many days, as well. Terry Makedon (@CatalystMaker) said on Wednesday that ATI was already testing a hotfix for the Catalyst drivers that would enable AA—presumably at performance levels ATI was comfortable with. And ATI will presumably continue to work actively with Blizzard to improve graphics quality in StarCraft 2—and so will NVIDIA.

The Teapot

ATI’s statement also included something very telling about the situation: “Blizzard indicated that they would not initially include options to set levels of in-game anti-aliasing.” There’s no reason not to take that statement at face value, as literally as it is presented.

In other words, Blizzard is working with ATI to enable anti-aliasing in StarCraft 2, but decided not to include it at release. Instead, support will likely be added in a later patch. There is little question that Blizzard has a love affair with patching their games; the original StarCraft was still being patched on a fairly regular basis years after release. So one can reasonably conclude that StarCraft 2 will see patches, and that those patches will add things like anti-aliasing.

There Are More Important Things

AMD’s official statement says what I have to say pretty well, so I’m going to quote it again. “Blizzard’s focus on incredible game play for all, means that gamers using ATI Radeon products can enjoy smooth HD gameplay and industry-leading image quality with our current generation of ATI Radeon products as well as many of our past generation cards.”

Emphasis mine, of course. Aren’t there more important things to do than start up another GPU-vendor holy war over anti-aliasing? StarCraft never offered things like anti-aliasing, or even 3D graphics, and it remains one of the most popular games of all time. To this very day, thousands of people play it online daily. See, StarCraft isn’t about pretty graphics or fancy GPU features or neat acceleration tricks. StarCraft is about gameplay.

Is the average player really going to freak out because they have “the jaggies,” as one person put it? Most likely not. They’re going to set the graphics to something that runs comfortably on their system, and they’re not going to worry about details they only notice when they’re heavily zoomed in, or even about the more pronounced ones. They’re going to focus on building up their forces quickly and defeating whatever menaces them in that mission, or kicking that other guy’s butt in multiplayer.

Ultimately, the gameplay is what determines just how important graphics really are. How many people aren’t playing StarCraft 2 because they don’t have anti-aliasing? How many people ran out to buy a new graphics card because they don’t have 4xAA today? Probably zero on both counts. If the gameplay is good, you’ll be hard pressed to find users who aren’t more than willing to overlook rough edges in graphics—they’re just not make or break when the play is good. A quick search of the StarCraft 2 forums only turned up a handful of topics on it, and many responses can be summarized as “Anti-aliasing would be nice, but I really don’t care that much. I’m more interested in just playing the game.”

So really, how much does anti-aliasing matter in the end? Yes, the game doesn’t look quite as good as it could; but instead of starting GPU holy wars, take it up with Blizzard, since it was their decision. And before you do that, step back and consider: does it really matter all that much to your gaming experience, or are you still having fun anyway?

Comments

  1. Sledgehammer70
    Sledgehammer70 I am pretty sure Nvidia worked with Blizzard.
  2. EX Doesn't really matter. I have a three-year-old laptop and the game looks & runs great even on the lowest setting.

    I feel like today's gamers care about the stupidest crap.
  3. primesuspect
    primesuspect I think, in this case, gamers don't care, though.
  4. ardichoke
    ardichoke Bah to anti-aliasing I say. BAH! I've never found it made graphics look all that much better, especially for the performance penalty one pays for using it.
  5. Thrax
    Thrax I love anti-aliasing. I think it's one of the best features any GPU can offer. I can see jaggies from across the room (yes, really); I have a hard time UNseeing them.

    With all that in mind, I never noticed that AA wasn't functioning during my time in the SC2 beta. It was a total non-issue because Blizzard's engine looks fantastic without it.

    Talk about manufactured fuss.
  6. Tim
    Tim Here's one more for the list of issues - you can't custom bind keys yet. I don't like using the arrow keys to move the screen around, I want to use WASD like in WoW. It works so much better like that.
  7. shwaip
    shwaip you should be moving the screen either:
    1) By double tapping your control group key
    2) with your mouse, either by scrolling or clicking the minimap.
  8. pigflipper
    pigflipper Shwaip: you are wasting your figurative breath.

    As for the whole AA thing, don't care, game looks great on my backup system @ low/med settings.
  9. shwaip
    shwaip For those of you running low/med settings, you should see it at high/ultra :)
  10. Koreish
    Koreish I hate wifi. I can play the game on ultra just fine in single player, but get me playing multiplayer and I'm struggling on low.
  11. Cliff_Forster
    Cliff_Forster Does AA matter? Sure, it does to a degree, but not like it used to.

    At one time when you were playing Counterstrike at 800X600, 4X AA could do some remarkable things to clean up the image that you saw on screen. When you combined lower resolutions with lower poly counts the jagged lines could be a massive distraction.

    On modern games playing at 1920X1080 and beyond with massive poly counts, the edge detail is so fine that jagged lines are far less noticeable without AA. When it comes down to it, you always want to play at your monitor's native resolution when you can, with all the eye candy you can turn on while maintaining a constant 60 frames per second. If you're doing that, and the one sacrifice you make is AA in order to keep performance up, it's not like the game experience is ruined anymore. The game just looks too good as it is, and to some degree too much AA can actually soften some games. Take Unreal Tournament III, a game where the designers ultimately decided not to build AA into the engine, figuring that users could force it from the card's driver if they really felt it necessary. Truth is, that game looks a little sharper and more defined with it off. Might just be my personal preference, but it's a case where AA actually deadens the image a little bit.

    Point being, AA is a nice option to have, but it's no longer a deal breaker for a great visual experience.
  12. mirage
    mirage Yeah, I agree with the sentiment here. AA does not matter just like Tessellation. Radeon is the best as always. Go AMD!
  13. Sledgehammer70
    Sledgehammer70 In the case of an RTS, AA helps with direct shadows in the game engine; most of the other things are using other means to smooth out edges. If you play with AA and then turn it off, you will see the difference.

    But as noted above, Blizzard has built the game to run on a slew of system types, so if you have not seen it then you're probably not going to miss it.
  14. Sledgehammer70
    Sledgehammer70
    mirage wrote:
    Yeah, I agree with the sentiment here. AA does not matter just like Tessellation. Radeon is the best as always. Go AMD!

    Tessellation is a new technology that can have a huge impact in games. Not many games are using this technology currently, but many new titles are building their games with it. Regardless of whether it's AMD or Nvidia, both companies support the tech in their current product offerings.
  15. AlexDeGruven
    AlexDeGruven
    primesuspect wrote:
    I think, in this case, gamers don't care, though.

    Precisely.

    Fanboys care.
  16. Thrax
    Thrax I'd be surprised if we ever see a game that strongly leverages tessellation. As much as it sucks to hear this, it's not a feature developers intend to push.
  17. mirage
    mirage
    Thrax wrote:
    I'd be surprised if we ever see a game that strongly leverages tessellation. As much as it sucks to hear this, it's not a feature developers intend to push.

    No it does not suck since I don't agree.
  18. mirage
    mirage
    Precisely.

    Fanboys don't care.

    Corrected for you :buck:
  19. Sledgehammer70
    Sledgehammer70 I like to buy products that put the technology they are built with to use. Game looks great without AA. But looks even better with it.
  20. Thrax
    Thrax
    mirage wrote:
    No it does not suck since I don't agree.

    It really is a shame, because tessellation could do away with the trend of designing games for the lowest common denominator, or the "average" installed hardware base. Rather than using LOD levels or a pedestrian poly count, a heavily-tessellated engine could dynamically scale the detail appropriately for the hardware of the player. In other words, the player would always see the most detailed experience their PC can deliver in the given scene.

    That's why it sucks that game devs won't be using tessellation for less gimmicky things than water.
  21. mondi
    mondi
    Thrax wrote:
    It really is a shame, because tessellation could do away with the trend of designing games for the lowest common denominator, or the "average" installed hardware base. Rather than using LOD levels or a pedestrian poly count, a heavily-tessellated engine could dynamically scale the detail appropriately for the hardware of the player. In other words, the player would always see the most detailed experience their PC can deliver in the given scene.

    That's why it sucks that game devs won't be using tessellation for less gimmicky things than water.

    It means that the player would always see whatever the graphics driver deemed appropriate. Water works because it's abstract enough that you can rely on the underlying implementation to produce fairly similar results regardless of the details. Anything that cannot rely on runtime geometry wouldn't work.

    Remember, a "highly tessellated" cube is a sphere.
  22. mirage
    mirage Tessellation can be implemented in an adaptive way depending on the capability/power of hardware. For example, when the game detects an HD5870, it can scale tessellation back and do it more extensively with a GTX480. Of course, just like they did with AA, ATI can still decide to disable tessellation to avoid losing in the benchmarks. ;)
  23. mirage
    mirage
    mondi wrote:
    Remember, a "highly tessellated" cube is a sphere.

    A cube is always a cube. But a "low tessellated" sphere might look like a cube.
  24. Sledgehammer70
    Sledgehammer70
    Thrax wrote:
    It really is a shame, because tessellation could do away with the trend of designing games for the lowest common denominator, or the "average" installed hardware base. Rather than using LOD levels or a pedestrian poly count, a heavily-tessellated engine could dynamically scale the detail appropriately for the hardware of the player. In other words, the player would always see the most detailed experience their PC can deliver in the given scene.

    That's why it sucks that game devs won't be using tessellation for less gimmicky things than water.

    I agree with Thrax 100%. But I do know some devs are starting to get creative with it.

    It's also a shame to see people say "AA" is not important, when just a few years ago AA and AF were the top things. "OMG!!!! 32x AA"
  25. Thrax
    Thrax
    mirage wrote:
    Tessellation can be implemented in an adaptive way depending on the capability/power of hardware. For example, when the game detects an HD5870, it can scale tessellation back and do it more extensively with a GTX480. Of course, just like they did with AA, ATI can still decide to disable tessellation to avoid losing in the benchmarks. ;)

    Or do it like NVIDIA and rely exclusively on tessellation benchmarks to fool the public into believing that the 7 months spent catassing around on a deeply flawed architecture was worth more than an 8% lead. ;)

    DIRECTX 11 DONE RITE, GUISZ.

    //edit: Now with 100% more citation.
  26. mondi
    mondi
    mirage wrote:
    A cube is always a cube. But a "low tessellated" sphere might look like a cube.

    Same three vertices:

    1 - Cubic
    2 - Straight
    3 - Bezier

    Without any extra information as to what the tangents / vertex tension should be, we end up with 3 very different curves. Which one is correct?

    [attached image: the three curves produced by the same three vertices]

    If you add tension info per vertex, then you're effectively doubling (possibly tripling depending on the implementation) the data sent to the graphics card _before_ you calculate the per vertex information. If you're going to do that, why not LOD the model and send the appropriate vertices directly, bypassing a large amount of bus transfers, and a whole bunch of calculations per vertex?
  27. fatcat
    fatcat so...when is Starcraft 2: Episode 1 (or will it be episode 2, and yes I know they are called Heart of the Swarm and Legacy of the Void) coming out? 12 years from now? Want more single player campaign (namely Protoss)
  28. mirage
    mirage
    Thrax wrote:
    Or do it like NVIDIA and rely exclusively ...

    Sure, I would be only too happy to see ATI have such good performance with tessellation and AA. After all, this is state of the art that improves 3D realism. That flawed architecture, as you call it, is the "best architecture" I have seen to date. They went beyond designing traditional DX11 co-processors to push 100+ fps with two-year-old games.
  29. Frylock
    Frylock Well. I just beat the game and can't wait for the others. I loved it!
  30. Obsidian
    Obsidian I was a bit let down by the short ending. After all the lengthy cutscenes leading up to it I expected more. At least the game itself was pretty damn fun.
  31. GnomeWizardd
    GnomeWizardd Installed the new ATI drivers and turned AA on and can't tell a diff in SC2. It looked amazing before.
  32. Ian Just installed the beta "hot fix" and WOW, visually it makes a huge difference, though even with 8GB RAM and a 5870 it slows down a lot at 1600x1200, so it is a bodge job alright.
  33. Cliff_Forster
    Cliff_Forster
    Ian wrote:
    Just installed the beta "hot fix" and WOW, visually it makes a huge difference, though even with 8GB RAM and a 5870 it slows down a lot at 1600x1200, so it is a bodge job alright.

    Your specs are very similar to mine. The only difference is I run 1920X1080. I'm curious: with the eye candy and AA enabled, what kind of frame rate are you getting? With AA off, how do you fare?
  34. Sledgehammer70
    Sledgehammer70 I cranked Nvidia's CP option to 32x SLI AA. The only thing bad about it is I am only running 138 FPS :( with or without AA the game is playing amazingly. But I can see the difference with AA switched on.
  35. Koreish
    Koreish
    fatcat wrote:
    so...when is Starcraft 2: Episode 1 (or will it be episode 2, and yes I know they are called Heart of the Swarm and Legacy of the Void) coming out? 12 years from now? Want more single player campaign (namely Protoss)

    Heart of the Swarm's story is being worked on now. There are no plans to start making the game proper until 2011. Production shouldn't take that long, as most of the game is already made, but it is Blizzard. So Zerg will probably be out in late 2012 and Protoss in 2014.
  36. Max Just got a GTX 460, and at 1920x1200 and Ultra Quality in SC2 I can force Nvidia to do 8x AA and anisotropic filtering and run without a noticeable performance decrease.
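For what it's worth, here is a minimal, illustrative sketch of the adaptive tessellation idea Thrax and mirage describe in comments 20 and 22. It is not from any shipping engine or driver: the patch positions, camera position, and per-GPU factor caps are made-up numbers, and a real engine would compute factors on the GPU (typically in a hull shader) rather than on the CPU.

```python
# Illustrative sketch only: distance-based tessellation factors with a per-GPU cap.
# All values are invented for demonstration.
import math

def tessellation_factor(patch_center, camera_pos, max_factor, base_distance=10.0):
    """Give nearby patches more subdivision and distant patches less,
    clamped to whatever the hardware budget (max_factor) allows."""
    distance = math.dist(patch_center, camera_pos)
    # Roughly halve the factor each time the patch moves twice as far away.
    factor = max_factor / max(1.0, distance / base_distance)
    return max(1.0, min(max_factor, factor))

# A faster GPU simply raises the cap; the art assets stay the same.
for gpu, cap in [("mid-range", 16.0), ("high-end", 64.0)]:
    near = tessellation_factor((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), cap)
    far = tessellation_factor((0.0, 0.0, 200.0), (0.0, 0.0, 0.0), cap)
    print(f"{gpu}: near patch x{near:.0f}, far patch x{far:.0f}")
```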
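And a similarly rough sketch of mondi's point in comment 26: the same three vertices yield different curves depending on the interpolation scheme, so without extra per-vertex data (tangents or tension) the hardware cannot know which curve was intended. The points and the two schemes shown (straight segments and a quadratic Bezier) are arbitrary stand-ins for the variants mondi lists, not a reconstruction of the attached image.

```python
# Illustrative sketch only: the same three vertices, two different curves.
p0, p1, p2 = (0.0, 0.0), (1.0, 2.0), (2.0, 0.0)

def lerp(a, b, t):
    return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)

def straight(t):
    # Two straight segments: the curve passes through the middle vertex.
    return lerp(p0, p1, t * 2) if t < 0.5 else lerp(p1, p2, (t - 0.5) * 2)

def quadratic_bezier(t):
    # Bezier: the middle vertex is a control point the curve never reaches.
    return lerp(lerp(p0, p1, t), lerp(p1, p2, t), t)

for name, curve in [("straight", straight), ("bezier", quadratic_bezier)]:
    x, y = curve(0.5)
    print(f"{name}: midpoint = ({x:.2f}, {y:.2f})")
# The straight version passes through (1.00, 2.00); the Bezier only reaches (1.00, 1.00).
```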
