The future of Programming (and Programmers :)

 
Ranch Hand
Posts: 56
I am not sure how many developers have thought about this, so I thought it might be a good idea to get some discussion going.
I personally feel that in the very near future more and more programmers will be out of work. This is not going to happen because of recession; it will happen because of our own efficiency. With every passing day we produce more and more frameworks and designs that are highly reusable. Software technology and development have matured very rapidly in the last few years. Databases have matured to the extent that Oracle and IBM are fighting over the same customer base. Application servers have evolved so rapidly that things developers used to build themselves are now available off the shelf. The same is true for a lot of other application software.
My question is, what is the future direction of the software industry? Is it going to be more like factories, where working-class folks (average programmers) put components together standing on the line in Taiwan (read Bangalore, Manila) and only a handful of brilliant software coders stay back in the R & D wing of the factory and try to come up with new components to be added? On the other hand, the argument could be that there will always be something new to do with information. The industry has moved from mainframes to PCs to databases to the web to ..... Is there more to come, or is it the end of the road for programming?
I am looking for opinions from industry gurus who have been around for a couple of decades.
 
Sheriff
Posts: 7001
6
Eclipse IDE Python C++ Debian Java Linux
The one thing that being in the computer industry for a few decades teaches you is that this discussion happens all the time. And the end hasn't happened yet!
  • When they invented soft programming by flipping bits and punching cards, the people working the handful of hardware-configured computers thought it was the end. Instead it allowed many more computers to be built and used.
  • When they invented assembly language, and then FORTRAN and COBOL in which "anyone" could write programs, did that mean all programming would now be done by secretaries and clerks?
  • The invention of timesharing systems, where more than one user and program can run at once, allowed people to sit at a computer rather than writing programs on paper or cards and submitting them to run overnight. That must have meant the end of the need to think in programming, right? With a terminal in front of them, people would just type stuff and see what happens.
  • 4GLs ("fourth-generation languages"), especially one provocatively titled "The Last One", were supposed to eliminate the need for programmers by allowing anyone to just type in a few details of a business process and have the computer do the rest. No more programming. Whoopee.
  • "Home" computers. Now every schoolkid has the tools to write any program he wants. Why would anyone pay an experienced software developer when you can get a teenager to do it for a few bucks?
  • The internet, portable Java, open-source software. Now you don't even need to write any programs. A google search can find you anything on the planet, and someone is bound to have done it already, hmm?


    What do these all have in common? On the surface it seems that each one might have heralded the end of commercial programming. In practice, almost all of them have actually increased rather than decreased the amount of software development going on. The technologies you describe are in the same boat.
    New computer technologies often empower whole new categories of tasks which would never have been considered when doing things "the old way".
    Now we are at the gateway to an astonishing, interconnected world. Massive-bandwidth services (virtual presence, video on demand ...), dynamically aggregating and self-healing networks (read up on Jini and JavaSpaces some time), palmtops with more power than 1960s mainframes, and lots of things that are "below the radar" now but will surprise everyone.
    Who would have thought that just the allocation of a few spare bytes in a cellphone packet protocol would lead to the huge social phenomenon that is SMS "text" messaging?
    The only "trick" you need to know is to stay at the front of the wave of technology and keep your skills sharp. The move from hardware interconnects to programmable computers allowed everything we see around us to happen, but it meant the end of a lucrative and fascinating business for the people making the old systems. Don't be the one left behind muttering that punching cards is the only "real" way to write software, and that if you can't write it in COBOL, it's not worth writing.
    Embrace Change.
     
    author
    Posts: 14112
    Another aspect that didn't change: most software projects still fail!
    The hard part of software development is not the actual manufacturing of the product. You could even argue that this part is so plainly simple that it is done totally automatically by compilers and the like. After all, the source code is not the product our customers care about, but just the "blueprint" for the executable they will pay us for. But even creating this "blueprint" is not the hardest part.
    The hardest part is understanding the customer's needs, helping them understand those needs and how they can be satisfied - and satisfying them in a timely manner. That is not something that is ever likely to be accomplished in a factory-like environment, IMO...
     
    Ranch Hand
    Posts: 1874

    Originally posted by HipHop Singh:
    My question is, what is the future direction of the software industry? Is it going to be more like factories, where working-class folks (average programmers) put components together standing on the line in Taiwan (read Bangalore, Manila) and only a handful of brilliant software coders stay back in the R & D wing of the factory and try to come up with new components to be added? On the other hand, the argument could be that there will always be something new to do with information. The industry has moved from mainframes to PCs to databases to the web to ..... Is there more to come, or is it the end of the road for programming?


    Regarding component technology and frameworks, what you are saying is broadly correct: the future may well move in that direction. As the industry matures, more and more components will be at our disposal, and dragging and dropping those components to build tailor-made software for an organization is quite possible. But let the industry mature the way other engineering industries have; standardization will happen. Software engineering is becoming an important discipline now.
    That said, the industries you mention, such as manufacturing, still need innovation, and so does the software industry.
    Plus, you never know when a disruptive technology will change the course of things, so it is difficult to predict the future and the direction of the industry. We have to stay firmly grounded in the belief that whatever comes, we will take it head on. For a person in the software industry, the golden rule is lifelong learning, so no technology or language should pose a problem to anybody.
    i hope i am interpreting your question correctly.
    shailesh.
     
    Ranch Hand
    Posts: 919
    We're all doomed
     
    High Plains Drifter
    Posts: 7289
    Netbeans IDE VI Editor
    I'm more doomed than you are, so nyeah.
     
    Harpreet Singh
    Ranch Hand
    Posts: 56
    Frank,
    Thanks for your response. I agree that technology keeps changing and we need to embrace change. If we don't, we become obsolete along with the technology.
    But I do think my assumption is correct to some degree: the maturing of software has accelerated in the last few years (e.g. database servers, application servers). This is not to say nothing is happening in the industry in general.
    Let me raise another question.
    Given the rapid maturing of technologies and shorter software life cycles, does it mean that software engineers in particular, and IT professionals in general, especially those in non-technology companies, need to either (a) keep switching jobs to stay on top of technology or (b) become business analysts and shift their focus away from technology?
    Can you shed some light on how successful IT people have dealt with major technological changes in the past (e.g. from "punch cards" to compilers)?
     
    Ranch Hand
    Posts: 4716
    9
    Scala Java
    Just my opinion, but I think there might be a little less demand in the future than before. For instance, some will choose to use one of the new web publishing programs (or have their secretary do it) instead of hiring someone to code it. But many will still hire someone. I certainly don't think it's the end.
    down with OO! join the top-down procedural revolution!
     
    Frank Carver
    Sheriff
    Posts: 7001
    6
    Eclipse IDE Python C++ Debian Java Linux
    I guess the thrust of my argument is that all of these changes, instead of reducing the amount of work for computer-people, have actually increased the amount of work. For every task which is simplified or eliminated, a whole bundle more is created which weren't possible before.
    Make basic web page creation and hosting simple enough for "anyone" and hundreds or thousands of organizations will consider getting an internet presence. Even if only 10% of these need the services of software developers, analysts, programmers or whatever, that's still a huge expansion in the job market.
    Ten years ago, I used to spend a lot of time writing code for basic data structures (associative arrays, linked lists, stacks, etc.) in C and C++ for each customer's applications. These days I use Java, which has a relatively rich Collections API. Has my workload decreased? Of course not!
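    Just to make that concrete, here is a minimal sketch of the kind of thing I mean (the word-counting and stack snippets are invented purely for illustration, and written in current Java syntax): the Collections API hands you the associative arrays, stacks and lists that used to be hand-rolled in C or C++.

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.HashMap;
    import java.util.Map;

    public class CollectionsSketch {
        public static void main(String[] args) {
            // Associative array: no hand-written hash table required
            Map<String, Integer> wordCounts = new HashMap<>();
            for (String word : "the quick brown fox jumps over the lazy dog the".split(" ")) {
                wordCounts.merge(word, 1, Integer::sum);
            }
            System.out.println(wordCounts.get("the")); // prints 3

            // Stack: ArrayDeque replaces a hand-rolled node structure
            Deque<String> stack = new ArrayDeque<>();
            stack.push("first");
            stack.push("second");
            System.out.println(stack.pop()); // prints "second"
        }
    }

    None of that is special; the point is simply that this plumbing now ships with the platform, which frees up time for the more interesting work.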
    Over the last ten years or so I've spent a lot of time writing application frameworks, page template systems, object-relational persistence engines, XML parsers and so on. These days I'm spoilt for choice. I can pick and choose software in all these categories. Has my workload decreased? Of course not!
    I have far more ideas in my "when I get around to it" book now than I have ever had, and all of them are made possible by advances in generally-available software. The less low-level (however that is defined) work I have to do, the more possibilities open up.
     
    Frank Carver
    Sheriff
    Posts: 7001
    6
    Eclipse IDE Python C++ Debian Java Linux
    As for how to cope with change, my trick is simple: never stop learning. And I don't just mean not turning down a training course if it's offered by management. Here are a few ideas to get you started...
  • Participate here (good, you are doing that already) and in other flexible, interesting sites ( c2.com , slashdot.org , java.sun.com , etc. etc. )
  • Read anything and everything - not just "computer" books, but history, philosophy, politics, language etc.
  • Challenge yourself by taking certifications and assessed classes in unfamiliar areas
  • Continually try and guess what will be important in the future, write down your guesses and why you made them. Then check back frequently. Sure you'll be wrong most of the time (we all are), but you'll get better. Develop a "gut feeling" for up and coming technologies and make time for studying them.
  • Play with stuff. If you think something sounds cool or useful, others might too. And practical experience (even just a few hours in your spare time) outranks book-learning every time.


    I hope you get the picture.
     
    Greenhorn
    Posts: 12
    Frank Carver Said:

    I guess the thrust of my argument is that all of these changes, instead of reducing the amount of work for computer-people, have actually increased the amount of work. For every task which is simplified or eliminated, a whole bundle more is created which weren't possible before.


    I agree, but not totally. There must be a SATURATION POINT somewhere along the line, right?
     
    Harpreet Singh
    Ranch Hand
    Posts: 56
    Frank,
    Appreciate your feedback.
    I do read history, philosophy and religion. Actually, reading non-computer books keeps my reading interesting, because software books have become rather repetitive for me.
    I also try to predict industry trends.
    While I am at it, let me throw this one out and see what others think. For a variety of reasons, I feel that IBM is all set to take over the software market within the next decade. The only serious contender would be Microsoft.
    IBM has everything it needs to do so .... the weight, the developers, the intent, the lingering bitterness from the PC days, the product line (databases, IDEs, app servers, OSes, you-name-it-IBM-has-it), global consulting to push products, established mainframe shops .... the list goes on....
    Microsoft may gain ground with newer businesses that are looking for cheaper solutions and don't have the kind of money IBM asks for. But then again, once those businesses start scaling up ....
    Where are Sun and Oracle in this picture? Well, I see Sun getting sold off, either as a whole or piece by piece, and going back to just selling boxes over the next few years. Oracle will have to scale back drastically and be content with being the second-biggest database vendor.
    To sum it all up, I believe the consolidation in software has just started. There is also going to be a lot of emphasis on SEI and ISO standards in IT companies and in end-users' IT shops.
    Opinions?
     
    Ranch Hand
    Posts: 1055
    Doomed... we're all doomed...
     
    Ranch Hand
    Posts: 356
    If we don't have anything to do ... we can spend more time here ... at JavaRanch.
     
    Frank Carver
    Sheriff
    Posts: 7001
    6
    Eclipse IDE Python C++ Debian Java Linux
    In the spirit of https://coderanch.com/wiki/660215/Dont-Wake-Zombies (but really do) I thought it would be fun to add a "twenty years later" viewpoint to this discussion.

    Your experience may be different, but from where I am sitting the demand for software developers has never been stronger. Sure, the technologies have moved on over the last two decades. Java is still the language of big corporate projects, but relative newcomer Python currently tops the TIOBE charts. Twenty years ago application servers and databases were the big thing, but these days it's all microservices, Docker, Kubernetes, mobile apps and the Internet of Things. Even JavaScript, once dismissed as only fit for flipping images on web pages, now powers some pretty big business systems.

    If you want to predict doom, the current bogeyman is machine learning and artificial intelligence. Neural networks are everywhere, and even some of the tiniest chips have hardware support for AI acceleration. Twenty years ago face recognition, voice interfaces, and the idea of self-driving vehicles were just for research labs and science fiction, dismissed as impossible, or at least impractical, by ordinary people. But guess what: these smart systems just push the need for more software developers.

    The key thing to note here is that we are living in the future. Back in 2002 when this thread was started there were no smartphones, a powerful development PC might have had 256 megabytes (a quarter of a GB) of RAM, only a small proportion of users had an internet connection faster than a 56K modem, Java was still at version 1.4, Google had not yet had its IPO, and the state of the art in version control was Subversion and SourceForge.

    I don't reckon software development is disappearing anytime soon. What do you think?
     
    Marshal
    Posts: 77529
    372
    Welcome back, Frank
     
    Sheriff
    Posts: 17489
    300
    Mac Android IntelliJ IDE Eclipse IDE Spring Debian Java Ubuntu Linux

    Frank Carver wrote:I don't reckon software development is disappearing anytime soon. What do you think?


    I agree, we're not disappearing, not just yet anyway. I also agree that ML and AI are probably what's going to lead to some kind of change that will make most of what we do today, well, different.

    I saw this article today: https://scitechdaily.com/artificial-intelligence-discovers-alternative-physics/

    The thing that stood out to me about it was that the researchers said "we don't yet understand the mathematical language it is speaking." I find that quite chilling. Making an AI whose reasoning we can't ourselves reason about seems like a recipe for disaster.
     
    Junilu Lacar
    Sheriff
    Posts: 17489
    300
    Mac Android IntelliJ IDE Eclipse IDE Spring Debian Java Ubuntu Linux

    Frank Carver wrote:As for how to cope with change, my trick is simple: never stop learning.  And I don't just mean not turning down a training course if it's offered by management.  Here are a few ideas to get you started...

  • Participate here (good, you are doing that already) and in other flexible, interesting sites ( c2.com , slashdot.org , java.sun.com , etc. etc. )
  • Read anything and everything - not just "computer" books, but history, philosophy, politics, language etc.
  • Challenge yourself by taking certifications and assessed classes in unfamiliar areas
  • Continually try and guess what will be important in the future, write down your guesses and why you made them.  Then check back frequently.  Sure you'll be wrong most of the time (we all are), but you'll get better.  Develop a "gut feeling" for up and coming technologies and make time for studying them.
  • Play with stuff.  If you think something sounds cool or useful, others might too.  And practical experience (even just a few hours in your spare time) outranks book-learning every time.


    I hope you get the picture.


    This aged quite well, with the lone exception maybe of the reference to java.sun.com.
     
    Saloon Keeper
    Posts: 26728
    190
    Android Eclipse IDE Tomcat Server Redhat Java Linux

    Junilu Lacar wrote:I also agree that ML and AI are probably what's going to lead to some kind of change that will make most of what we do today, well, different



    Quite different. You don't program AI systems, you train them. Take a basic AI "brain" off the shelf and feed it data and associations until it becomes reliable. The data feed job can be done by unskilled labor, eliminating the need for expensive programmer types. And Management went wild with joy!
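    (To make that "train, don't program" point concrete, here is a deliberately toy sketch in plain Java; no real AI framework is involved and the whole example is invented for illustration. A single perceptron learns the AND rule purely from labelled examples instead of having the rule written into the code.)

    import java.util.Random;

    public class TrainedNotProgrammed {
        public static void main(String[] args) {
            // The desired behaviour is supplied as data, not as an if-statement
            double[][] inputs = { {0, 0}, {0, 1}, {1, 0}, {1, 1} };
            int[] labels = { 0, 0, 0, 1 }; // logical AND

            Random random = new Random(42);
            double[] weights = { random.nextDouble(), random.nextDouble() };
            double bias = 0.0;
            double learningRate = 0.1;

            // "Feed it data and associations until it becomes reliable"
            for (int epoch = 0; epoch < 100; epoch++) {
                for (int i = 0; i < inputs.length; i++) {
                    double sum = bias + weights[0] * inputs[i][0] + weights[1] * inputs[i][1];
                    int predicted = sum > 0 ? 1 : 0;
                    int error = labels[i] - predicted;
                    weights[0] += learningRate * error * inputs[i][0];
                    weights[1] += learningRate * error * inputs[i][1];
                    bias += learningRate * error;
                }
            }

            // The learned weights, not a hand-written rule, now decide the output
            for (double[] input : inputs) {
                double sum = bias + weights[0] * input[0] + weights[1] * input[1];
                System.out.printf("%.0f AND %.0f -> %d%n", input[0], input[1], sum > 0 ? 1 : 0);
            }
        }
    }

    Whatever the training data rewards is what the system learns, whether or not that is what you meant, which is exactly where the trouble below comes from.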

    Well, almost. In reality it seems like AIs are subject to many of the same vices as humans as we've seen incidents of racial and gender discrimination in AI systems.

    As well as unintended consequences. A great story I read recently was about the accuracy of an AI system that could successfully diagnose from X-Rays. Turns out that the key decision input came from an unexpected area.  Each X-Ray was signed by the doctor who'd previously examined it. The AI noticed that one particular doctor had a high success rate and keyed in on whether or not that doctor's signature was on the X-Ray.

    This implies the likely rise of AI troubleshooting as a profession. Rather like the protagonists in Isaac Asimov's "I, Robot" series of stories.

    I still like the idea of replacing top Management with AIs, though. Wouldn't require obscene financial incentives, wouldn't make decisions based on personal concerns. Would probably treat employees more humanely — at least if you pre-loaded them with the 3 Laws of Robotics.

    All that said, AI is a discipline that excels where a certain fuzziness is permitted. There are areas where more precise computation is needed and they're not going to go away either. I still don't see the day that an AI will be suitable for developing that sort of software.
     
    Junilu Lacar
    Sheriff
    Posts: 17489
    300
    Mac Android IntelliJ IDE Eclipse IDE Spring Debian Java Ubuntu Linux

    Tim Holloway wrote:
    I still like the idea of replacing top Management with AIs, though. Wouldn't require obscene financial incentives, wouldn't make decisions based on personal concerns. Would probably treat employees more humanely — at least if you pre-loaded them with the 3 Laws of Robotics.


    Shades of the TV series "Raised by Wolves" (on HBO?) where the AI makes all high-level decisions for humankind. Humans seem to be getting really good at plotting their own demise through fiction that later is imitated by real life. Sometimes I wonder whether we aren't truly in a computer simulation and the shows we're seeing on TV and other media are just ways to inure/prepare us to the impending doom we're going to have to suffer through in the near future. <shrug>
     
    Tim Holloway
    Saloon Keeper
    Posts: 26728
    190
    Android Eclipse IDE Tomcat Server Redhat Java Linux

    Junilu Lacar wrote:

    Tim Holloway wrote:
    I still like the idea of replacing top Management with AIs, though. Wouldn't require obscene financial incentives, wouldn't make decisions based on personal concerns. Would probably treat employees more humanely — at least if you pre-loaded them with the 3 Laws of Robotics.


    Shades of the TV series "Raised by Wolves" (on HBO?) where the AI makes all high-level decisions for humankind. Humans seem to be getting really good at plotting their own demise through fiction that later is imitated by real life. Sometimes I wonder whether we aren't truly in a computer simulation and the shows we're seeing on TV and other media are just ways to inure/prepare us to the impending doom we're going to have to suffer through in the near future. <shrug>


    Idiocracy. The Time Machine (Morlocks versus Eloi). Recent statistics on breeding populations broken down by various socio-economic and intellectual factors. Cynical observations on intelligent human behaviour by various leaders throughout history (e.g., Winston Churchill).

    Or, as they say, no amount of Artificial Intelligence can replace Natural Stupidity.

    In fact, a certain glee in malicious and wilful stupidity has been a continual thread throughout at least the last century. Isaac Asimov commented on it as an American characteristic, although I wouldn't be too sure about other countries and other times, either.

    I didn't consider it too serious when I thought that, as in so many other populations, only 10-20% of the population made up the extremes, but alas, recent years have shown that it's depressingly close to 40%.

    As I said, recent statistics indicate that poorer, less-educated people do, in fact, breed disproportionately (also noted by Abraham Lincoln). The question is, which is cause and which is effect? If we had a Universal Basic Income and no one was truly poor or had to deal with "wealth traps" (where you lose benefits faster through self-improvement than you gain from it), would they fall into the same patterns as those of more fortunate means? It would be interesting to know. Are we putting a damper on the limits of human intelligence, and can we remove it? Would AI help or hinder that?
     
    Rancher
    Posts: 1071
    27
    Netbeans IDE Oracle MySQL Database Tomcat Server C++ Java
    Harpreet,

    I've been hearing that lame and tired discussion now for over 35 years, and guess what? There are more programmers than ever before!!! And automated tools are doing more now than ever before. I've also heard that there will not be a need for "programmers", only for implementers, code monkeys proficient in specific languages but not expert in general programming. Guess what: the need is alive and well there too.

    Les
     