"Too many Java developers?"
That's an interesting question, especially for those of us here in the UK, where there has been a lot of debate recently about the "IT skills shortage". I guess it depends on where you are in the world and in the industry, but here's how it looks from my small corner.
Recruiters say they can't find people with the IT skills they need, but 1 in 6 IT graduates here can't find work, and in September 2011 there were officially more than 37,000 unemployed IT workers in the UK, not counting all those who used to work in IT but are now forced to take other work. Employers have been complaining about the "skills shortage" ever since I joined the industry 25 years ago (hmm, maybe it's my fault?), but in the last 10 years they have virtually stopped investing in maintaining/developing the skills of their existing staff and stopped hiring graduate trainees, while firing tens of thousands of experienced staff and moving the jobs offshore (or bringing in tens of thousands of cheaper but inexperienced staff from places like India). So if there is a "skills shortage", it is to some extent self-inflicted.
So, in the UK at least, thousands of IT workers are out of work and many more are being fired every month, yet everybody says there is a shortage of IT skills, including Java. What's going on?
I think it's partly a result of the success of the Java industry itself that there can be a surfeit of inexperienced Java programmers alongside a shortage of skilled/experienced Java developers. It's not just that Java is easy to learn. After all, you can also download .NET and Visual Studio for free, along with most enterprise databases, and use pretty much any language you like - from LISP and Haskell to Python and C - with free compilers/IDEs on free Linux systems. So it's not just the easy availability/accessibility of the languages or tools that counts.
But through a combination of its own merits and a massive culture of hype, Java has become pretty much the COBOL of the early 21st century. It's the default choice for most business systems, and when you factor in all the frameworks, libraries, cross-platform support and so on, it's a pretty powerful general-purpose platform. Indeed, many people clearly think it is the only thing you ever need to know for building enterprise IT systems. So if you want to get started in the mainstream IT industry, you're going to need some Java, and the colleges have responded by churning out IT graduates with basic Java skills (but often not much else to offer) for many years now. Also, constant talk of the "IT skills shortage" means people still think it's a quick route into well-paid work. Hence, thousands of inexperienced Java developers with low-to-intermediate skills.
In theory this should lead nicely to similar numbers of more highly skilled Java developers as the newbies gain experience and start to specialise in particular areas. The rise of Agile methods ought to encourage this process, through close collaboration with colleagues and business users on small multi-skilled teams: if you're working closely with people who have different skills from your own, you will learn more from them. And the "software craftsman" movement also encourages individuals to develop deeper/broader skills.
Unfortunately, at the same time as these positive developments, we have also seen massive efforts to turn software development into a production-line process, with armies of inexperienced and relatively unskilled - but cheap - programmers churning out code based on rigidly defined design specifications provided by a smaller number of more highly skilled - and thus more expensive - individuals. As Java is the de facto standard for so much commercial IT, many of the big outsourcers focus on Java technologies as well - it's easy to find the basic skills and easy to sell those skills to lots of different clients. This commodification of programming tries to apply a Walmart-style "stack 'em high and sell 'em cheap" approach to development skills, which obviously works against the idea of the "software craftsman". In this model, development skills are devalued, and the role of the developer is deskilled to that of a dumb coding monkey.
For example, the last organisation I worked for had lots of Java developers, and was trying to hire more, but complained they couldn't find the people they wanted. Yet their existing team - many of them skilled and experienced developers - were often bored out of their minds because they were treated as dumb coding monkeys and spent much of their time "coding" XML configuration files or the trivial 4-line methods demanded by some designer's or architect's UML diagram. The result was that they were probably the least productive development team I have ever encountered, not because they were lazy or incompetent (they were not), but because nobody was making proper use of their skills and experience. This approach actually helps to create the myth of the IT skills shortage, as managers respond to poor productivity/quality by throwing more people at the problem, when what is really in short supply is intelligent management.
Not surprisingly, many people in these grunt roles try to escape them as soon as possible, rather than putting in their "10,000 hours" to become skilled craftspeople, so you often find people trying to get into analysis/design roles with very little experience of development. Meanwhile, the bulk outsourcing providers find it cheaper to hire a new batch of offshore graduate trainees every year than to retain and develop most of their existing staff, so it's often a case of either move up or move on. From what I've seen, there is a growing band of people in analysis/design roles who lack the hands-on development experience needed to perform those functions properly, and the problem seems to be spreading as those people move on up into "architecture" (another largely Java-specific boom industry) and so on. Here in the UK, some senior figures in the industry are beginning to realise that by ruthlessly outsourcing/offshoring so much development work and forcing experienced staff out of the industry altogether, they have effectively eliminated their own supply of skilled workers for the future.
Meanwhile, there is a lot of churn in the job market as people try to move up or across into more interesting development jobs. That means lots of competition, so applicants inflate their skills and experience, which just makes it harder for recruiters to identify the good candidates. Recruiters often respond by ignoring skills acquired outside work (which can be hard to verify) and by adding lots of extra requirements to the role description, in the hope of filtering out some of the unsuitable candidates - but they often overshoot and can't find anybody who has, say, 10 years' experience of Java 7. Then they complain about the "skills shortage".
You end up with the curious situation (here in the UK at least) where thousands of experienced IT staff are out of work or searching fruitlessly for jobs, while a small number of people with the magic combination of skills hop between well-paid jobs at will, and employers complain endlessly about the "skills shortage" while firing staff and failing to invest in developing or retaining the skills they claim to need.
Anyway, that's just one perspective, of course. And I really can't tell whether there is a genuine shortage or a surplus of Java skills, or of IT skills in general, because there seems to be so much contradictory evidence on both sides.
But a good question for anybody complaining about the "skills shortage" in Java or any other area would be: "What has your organisation done to increase the skills of your existing staff or to enhance the general pool of skills in the industry? Where do you think the skilled workers are supposed to come from?"