I really hope Microsoft employees (from the Visual Studio team) or the management read this post.
We all talk about it internally, but nobody says it out loud. I don't like to pretend I speak for everyone else, but here it goes.
Here's a shocker: the latest revisions of Visual Studio are failing to satisfy the video game industry. Which, last time I checked, plays an important role in the business of the Microsoft ecosystem.
The trigger was these two tweets; the more I read, the more I thought "YES, I FEEL THE SAME!!!". In fact, Casey Muratori seems to be thinking of moving as soon as Clang becomes usable on Win32; which is exactly what I'm also considering doing.
Let's set aside the horrible and controversial VS2012 UI redesign (to which VS2013 fortunately brings back colours) or the CAPS LOCK thingy. Those topics unfortunately carry a lot of subjectivity when one wants to argue. Let's keep our talk to the objective failures of Visual Studio.
Also, let's clear something up: I will focus only on C++. And since VS 2013 is still very new, I will talk mostly about my experience with VS 2012. At a glance, 2013 improves things a little, but not much.
Now that I've cleared that up, here's a brief list of the problems in VS, which I'll go into in detail, one by one:
- Horrible compilation performance
- Excruciatingly slow IntelliSense
- Unusually high RAM consumption
- No native 64-bit version
- High input latency (the UI becomes unresponsive when I'm typing very fast!)
Horrible Compilation Performance
For medium-sized to large projects, this is a real PITA. The following timings are for compiling Ogre 2.0 (OgreMain only). I forced the MSVC 2008 IDE (yes, it can be done) in both cases to maximize available RAM and thus avoid HDD bottlenecks (MSVC 2012 and its new build tools consume a ridiculous amount of memory).
Btw, the project already uses precompiled headers, since the usual response I see on the web is to turn them on:
Intel Core 2 Extreme QX9650 (quad core) @ 3GHz, 4GB RAM. OgreMain only. Params: /O2, /Ob, /arch:SSE2, /MP (no LTCG).
- Visual C++ 2008: 3 mins 1 second.
- Visual Studio 2012: 7 mins 29 seconds.
- Visual Studio 2013: 5 mins 58 seconds.
- GCC: 2 mins 11 seconds. (not using precompiled headers, Linux)
- Clang: 1 min 31 seconds. (not using precompiled headers, Linux)
So, more than double the compile time between 2008 and 2012, and almost exactly double between 2008 and 2013. That is a major productivity hit, not to mention it gets on my nerves every time I hit F7. I'm not being fully fair here, since VS 2012 does produce better code than 2008; however, GCC and Clang produce code of comparable quality, yet they take considerably less time.
The "buy a faster PC" argument is pointless: Clang, GCC and VC 2008 will always be comparatively faster. Not to mention VS2012 uses more RAM, which automatically means it can't scale as well as the others (it fights Moore's Law).
The good news from the Ogre team: we support unity builds, which bring every compiler down to around a minute (VC 2008 being the fastest at 49 seconds and VS 2012 the slowest at 1 min 29; GCC and Clang are nearly tied in the middle at 1 min 12 seconds and 1 min 20 seconds respectively).
But unity builds are a sub-optimal solution, since they suck when one is working directly on the code: recompiling one cpp file means recompiling many.
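For those unfamiliar with the technique: a unity (or "jumbo") build funnels many .cpp files into a single translation unit, so the common headers are parsed once per group instead of once per source file. A minimal sketch of what such a file looks like (the file names here are hypothetical, not Ogre's actual layout):

```cpp
// OgreUnityBuild0.cpp -- hypothetical unity translation unit.
// The build system compiles only this file; the .cpp files below are
// excluded from individual compilation.
#include "OgreRoot.cpp"          // hypothetical source files,
#include "OgreSceneManager.cpp"  // grouped into one compile job
#include "OgreMesh.cpp"
// ... more sources of the same module ...

// The downside noted above: touch any one of these files and this whole
// group gets recompiled, which hurts iteration on a single .cpp.
```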
Excruciatingly Slow IntelliSense
In the C++ world, "Go to Definition" is a powerful tool. Autocomplete and highlighting are too, but a C++ programmer would trade those two for Go to Definition almost any day. It is most useful when examining other people's code and when refactoring, since it lets us quickly navigate the source files in the same flow the code follows.
Something I really liked about VS 2008 is that it would take some time before updating its IntelliSense database, and would work with outdated information in the meantime. Why would anyone want this horrible behavior, you say? Work with outdated information? Heresy!? Not really. When I'm refactoring and have to change a function (be it its name or its arguments), I need to change both the function definition and its forward declaration.
In 2008: I change the definition, hit Ctrl+Alt+F12, change the forward declaration; then I can go back and forth with Ctrl+Tab (or vice versa, i.e. change the declaration first).
In 2012: I change either of the two, hit Ctrl+Alt+F12, wait while staring at the "Please wait while IntelliSense and browsing information are updated…" dialog, and then see I didn't go anywhere, because there's no forward declaration that matches the modified definition. Now I have to look for the file myself (either through Find in Files, and God only knows how many hits I'll get, or by hunting for the right file manually); and by the time I've reached the forward declaration, I've totally forgotten the code that was on my mind. This is really frustrating.
The next problem with IntelliSense is this f***ing dialog:
I would love to see the VS dev team use their own tools on real projects at some point, and not just a few Hello Worlds.
IntelliSense's "Go to Definition" is too slow. It often takes noticeable time (probably between 750 ms and 2 seconds), while VC 2008's was nearly instantaneous (except in a few cases).
As with the compilation times, I'm not being fully fair: VS 2012's IntelliSense is much more accurate, while VC 2008's has always been criticized for being inaccurate or unable to parse complex C++ syntax (in simple words, "just broken").
But the truth is that 2012's is so slow and sluggish that I'd prefer 2008's inaccuracy and speed over 2012's accuracy and slowness.
A quick review of VS2013 indicates that it has gotten faster at read-only queries (I can still see the annoying dialog after writing a bit of code) and, most importantly, it does put me at the forward declaration after modifying the definition (but not the other way around). It's something, I guess. Credit where credit is due.
And yet again, Visual Studio fails when compared to the competition: Qt Creator's Go to Definition works as fast as VC 2008's and is as accurate as VS 2012's. It also solves the refactoring problem by drawing a light bulb on the function's line when the definition and declaration don't match, so that you can make them match automatically. Worse performance and fewer features than a competitor.
It’s a double fail for Visual Studio team. Ouch.
Unusually high RAM consumption
The IDE uses 3 times more RAM than Visual C++ 2008, and its compiler uses 2 to 3 times as much. Running multiple instances of Visual C++ is actually quite common (normally 2, but sometimes up to 4; why so many? Sometimes it's required, sometimes the projects are not entirely related, sometimes it's due to modularity, and sometimes it's due to the lack of a 64-bit version; see the next problem). Whereas everything runs ultra-smooth and responsive with VC 2008 on just 4GB, VC 2012/2013 requires at least 12GB (16GB to get a good experience).
We're talking about the same projects, different IDE here. No excuses.
No native 64-bit version
How is this a problem? Running out of memory (hitting the infamous 2GB mark for user-space applications) is much more common in VS 2012/2013 than in VC 2008. Projects that ran just fine crash the IDE after the solution upgrade.
Still not convinced? Let's talk about PIX: VS 2012 is now the replacement for PIX. But for anything other than small demos and small projects, running out of memory is very, very easy. I've hit 3GB of usage with PIX; very often because I was leaking something big. But heck, I fired up PIX because I wanted to find the cause of a major problem, not to be nice to it!
High input latency
The situation has improved since Visual Studio 2010, which was infuriatingly slow; still, occasionally I can type and watch the keystrokes start appearing shortly after. I can also hear the HDD cranking up like crazy when that happens. I upgraded to 8GB while writing this post, and the problem persists; and taskmgr shows plenty of available RAM for caching files. So, Visual Studio… what the hell are you doing!?
If it’s so bad, why don’t you move?
Oh, I AM trying to move. And I cannot wait for Clang's MSVC-compatible frontend to be finished. That's the whole point of this article: if the VS team doesn't fix these serious pitfalls, in the long run more and more developers will walk away.
But there are three things to keep in mind:
- Visual Studio IS the default and standard compiler for the Windows platform.
- MSVC 2008 is probably one of the best IDEs ever made (including the compiler). I still use it on a daily basis. However, as time goes on, fewer libraries ship precompiled for 2008, and MS is dropping support for it on its latest platforms (i.e. Win 8 and Co.).
- Truth be told, MSVC has one of the best, if not the best, debuggers out there. It's also true that MSVC's debugging performance has gotten slower (evaluating an expression and single-stepping keep taking longer with every iteration), so watch out for that too. But the debugger is its strongest selling point. If Qt Creator or another IDE had the powerful debugging UI that MSVC has, without all the performance pitfalls, VS' days would be numbered.
Overall, VS 2013 is a big step forward from the horrible VS2012 and VS2010, so there's still hope; but they still have a long way to go to recover the competitiveness they once had. Mainly in the areas of compiler performance (which is still horrible, while the game industry demands fast iteration and very low compile times), providing a native 64-bit version, and better refactoring facilities (i.e. how smartly IntelliSense adapts to code changes, even if that includes consulting older data).