Recently a contemporary (50ish) friend shared his thoughts on finding a job as a software developer today:
I was thinking about this, and here are my thoughts...
A couple of huge things have happened in software, the cloud / web thing, and mobile. Most new development being undertaken today involves one or both.
All that said ... the big new thing is mobile. There are two platforms that matter, iOS / Objective-C and Android / Java. If I were trying to get a job as a software engineer today - building new stuff, not forensic debugging of 10-year-old still-working systems - I would be an app developer. And on the Internet, nobody knows you're a dog, or a 50+-year-old engineer. I think anyone with a mobile skillset would be in demand.
So, how do you climb that learning curve? Well, the first thing is you have to get a Mac and learn OS X, enough to be a user. Hardly anyone develops *for* OS X, but just about everyone develops *on* OS X. I have installed OS X in a VM on my PC laptop, but I'm weird. Everyone just gets a Mac laptop.
Next, I would recommend learning iOS / Objective-C first. Android / Java is similar enough to be analogous, but it is a bit clunkier and has more variations. The development environment is Xcode, from Apple; it's a free download, and a $99/year developer program membership lets you run your apps on real devices and ship them through the App Store. The first step in the learning curve is learning Objective-C. (Apple now has a new language called Swift, but it hasn't gained traction yet and ... I would not start there.) Objective-C is a mashup of C and Smalltalk. To learn it, I suggest reading the Big Nerd Ranch guide to Objective-C. Yeah, that's what it's called, and it's a great hands-on learn-as-you-go guide. For an experienced engineer, I don't think this is going to be a huge curve. And it's a valuable skillset; Objective-C is used for iOS *and* OS X. It's also a step toward learning C# or Java, since they are quite similar.
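To give a taste of that C-plus-Smalltalk mashup, here's a tiny sketch (the variable names are mine, not from any book): plain C flows along as usual, while objects receive Smalltalk-style messages in square brackets.

```objectivec
// Minimal Objective-C sketch: C underneath, Smalltalk-style messaging on top.
// Builds on OS X with: clang -framework Foundation hello.m
#import <Foundation/Foundation.h>

int main(int argc, char *argv[]) {
    @autoreleasepool {
        // Plain C is still there...
        int n = 3;
        // ...but objects receive messages in [receiver selector:argument] form.
        NSString *greeting = [NSString stringWithFormat:@"Hello %d times", n];
        NSMutableArray *list = [NSMutableArray array];
        [list addObject:greeting];
        NSLog(@"%@", [list firstObject]);
    }
    return 0;
}
```

If you squint past the square brackets, an experienced C programmer can read this on day one; the `@`-prefixed literals and the message syntax are most of the new surface area.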
After that, to learn developing for iOS you have to learn the intricacies of Cocoa (on iOS, strictly speaking, Cocoa Touch). This is Apple's application framework. It does a lot of the work for you, but it also hides a lot of the detail, so it's a bit tough to get your arms around. The Xcode environment is highly integrated with Cocoa, and the seam between your code and what the framework does for you is blurry. (Think of it like [early, pre-.NET] VB on Windows.) To climb the curve, I suggest reading the Big Nerd Ranch guide to iOS. It helped me get through the initial "what the heck is going on here" to creating "Hello, World" apps for iOS. There is a lot beyond that, but it sure is satisfying to code apps that actually run on your phone.
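For a flavor of what "the framework does the work for you" means, here's a minimal hand-rolled "Hello, World" sketch, with no storyboard or Interface Builder. Everything except `AppDelegate` (a name I chose) comes from Apple's UIKit; your code mostly fills in delegate callbacks that the framework invokes.

```objectivec
// Minimal sketch of an iOS "Hello, World" app, all in code.
// UIApplicationMain runs the event loop and calls back into AppDelegate.
#import <UIKit/UIKit.h>

@interface AppDelegate : UIResponder <UIApplicationDelegate>
@property (strong, nonatomic) UIWindow *window;
@end

@implementation AppDelegate
- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // The framework calls this once at launch; we build the UI here.
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    UILabel *label = [[UILabel alloc] initWithFrame:self.window.bounds];
    label.text = @"Hello, World";
    label.textAlignment = NSTextAlignmentCenter;
    UIViewController *vc = [[UIViewController alloc] init];
    [vc.view addSubview:label];
    self.window.rootViewController = vc;
    [self.window makeKeyAndVisible];
    return YES;
}
@end

int main(int argc, char *argv[]) {
    @autoreleasepool {
        return UIApplicationMain(argc, argv, nil,
                                 NSStringFromClass([AppDelegate class]));
    }
}
```

Notice that there's no visible main loop of your own; that inversion of control (Cocoa calls you, not the other way around) is exactly the "seam" that takes some getting used to.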
Building crap for phones is all very exciting, and a lot of cool apps are client-side-only, but many real applications need a server component. It turns out the same kinds of interfaces you build for web-client-to-server apps are also used for mobile-client-to-server apps. On the server, you create simple stateless APIs (the cool kids call this REST) which do all the real work for mobile clients (like database access). In some applications you even have both web and mobile clients, using the same APIs. Once you've learned coding mobile apps, you're probably going to want to learn more about the tech on REST servers, too. That's a subject for another post.
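As a sketch of what the client side of such an API looks like in Cocoa (the endpoint URL and JSON shape here are invented for illustration), `NSURLSession` is the standard way to talk to a stateless HTTP API:

```objectivec
// Hedged sketch: an iOS client fetching JSON from a hypothetical REST endpoint.
// NSURLSession and NSJSONSerialization are standard Cocoa APIs.
#import <Foundation/Foundation.h>

void fetchWidgets(void) {
    NSURL *url = [NSURL URLWithString:@"https://api.example.com/v1/widgets"]; // made-up endpoint
    NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithURL:url
        completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            if (error) {
                NSLog(@"request failed: %@", error);
                return;
            }
            // Assuming the server returns a JSON array of widget objects.
            NSArray *widgets = [NSJSONSerialization JSONObjectWithData:data
                                                               options:0
                                                                 error:nil];
            NSLog(@"got %lu widgets", (unsigned long)widgets.count);
        }];
    [task resume]; // the request runs asynchronously; the block fires when it finishes
}
```

The same `GET /v1/widgets` endpoint could serve a web front end via JavaScript, which is the "both web and mobile clients, same APIs" point above.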
If I were an engineer looking for work in 2015, I would start teaching myself to build iOS apps. I'd get a Mac, get comfortable using OS X, learn to use Xcode, learn Objective-C, and learn Cocoa. That's the biggest world in software development at the moment, and it isn't going away.
So what do *you* think? Please let me know if you have comments or suggestions!