"Apple had about 2.06 percent of the US desktop market in 2003. By 2010, OS X had about 10.9% of the market," writes GitHub developer Zach Holman. "There's a slew of reasons for this growth, but I think a large part of it is the migration of software developers from Windows to OS X starting in the early 2000s. Attracted by the reasonable UNIX toolchain and the straightforward usability approach, more and more geeks adopted OS X as their primary machines."

But there has always been a blight on developing for OS X in anything outside Apple's own Objective-C/Cocoa stack, and that's compiler support. In order to get gcc, developers have had to download Xcode. According to Holman, this wasn't a big deal back when Xcode was less than 500MB. But now Xcode costs $5 from the Mac App Store, and it's a 4.5GB download that takes up 15GB once installed.

Holman writes:

If I want to release a great new Ruby gem that uses a C extension or library, I need to ask prospective users of that gem to:
  • Spend $4.99 in the App Store
  • Download a large 4.5GB file
  • Spend a decent amount of time installing Xcode
  • Sacrifice 15GB of disk space to an app they likely won't use
  • Install my gem
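
To see why a gem with a C extension drags the whole compiler requirement along with it, consider a minimal sketch (the gem and file names here are hypothetical, not from Holman's post). Such a gem ships an `extconf.rb` build script, and at install time RubyGems shells out to the C compiler that Ruby itself was configured with:

```ruby
# Sketch: why installing a gem with a C extension needs gcc.
#
# A gem with a C extension ships a build script like this
# (names are hypothetical):
#
#   # ext/my_gem/extconf.rb
#   require 'mkmf'
#   create_makefile('my_gem/my_gem')
#
# At `gem install` time, mkmf generates a Makefile that invokes
# the C compiler Ruby was built against. That compiler is recorded
# in Ruby's own configuration:
require 'rbconfig'

cc = RbConfig::CONFIG['CC']
puts cc  # on OS X without Xcode installed, this compiler is missing
```

If that compiler isn't on the system, `mkmf` aborts with an error along the lines of "C compiler cannot create executables," which is exactly the point where a user who never installed Xcode gets stuck.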

What do you think? Is it really a cumbersome process? After all, as pointed out by several commenters on Hacker News, it's a process that only needs to be completed once for each machine you work on (unless you reload the OS). And $5 doesn't seem like much compared to the overall cost of the machine (and Xcode 3 is still free). But Holman isn't asking for much: just a stand-alone gcc package, from either Apple or a third party. If you want gcc for Windows, you can download MinGW for free, and it's only 576.1MB. It seems to come down to the principle of the thing more than the actual inconvenience.

Is OS X becoming less developer friendly?