
On Architecture

 
A Putra
Greenhorn
Posts: 9
hi ranchers,
an interview question: "in the project lifecycle, when would you choose the technologies for implementation? just before implementation or during the architecture design phase?".
your views please, with the reason(s)?
thanks in advance,
-arjunaputra
 
Jeroen Wenting
Ranch Hand
Posts: 5093
In my experience the technologies are usually chosen during the requirements phase, if they're chosen deliberately at all.
"Let's make a billing application using EJB running on Websphere" is something you will hear a lot in the wild. In one instance we were faced with a blanket requirement coming from the board of directors (this was a major bank, not a software company) that ALL future projects HAD to use XML, for example.
IMO this is wrong.
The architecture should tell what tech is available for use, the technical design should then decide which of those technologies is appropriate for the problem at hand.
If it turns out that a better choice the architecture had overlooked is available (or becomes available during development), the project should be flexible enough to take full advantage of it, all after feedback up the pipe of course. Obviously switching from Sun to HP hardware, or to a completely different appserver and database, would be taking this too far, but substituting JDO for entity beans, for example, should be possible.
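To make that kind of substitution practical, the calling code has to depend on an interface rather than on the persistence technology itself. Here's a minimal sketch of that idea; the names InvoiceDao and InMemoryInvoiceDao are made up for illustration, and a JDO- or entity-bean-backed class would simply implement the same interface:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// The contract the rest of the app codes against.
interface InvoiceDao {
    void save(String id, double amount);
    Optional<Double> find(String id);
}

// One possible implementation; swapping in JDO or entity beans
// means writing another class like this, not touching callers.
class InMemoryInvoiceDao implements InvoiceDao {
    private final Map<String, Double> store = new HashMap<>();
    public void save(String id, double amount) { store.put(id, amount); }
    public Optional<Double> find(String id) { return Optional.ofNullable(store.get(id)); }
}

class BillingDemo {
    public static void main(String[] args) {
        InvoiceDao dao = new InMemoryInvoiceDao(); // the single swap point
        dao.save("INV-1", 250.0);
        System.out.println(dao.find("INV-1").orElse(0.0));
    }
}
```

The point is that the technology decision is isolated behind one seam, so revisiting it later doesn't ripple through the codebase.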
Some things may of course be set in stone by the limitations of the deployment environment. Usually this is the hardware on which the app will run and the database and possibly appserver to be used, simply because of massive prior investment in those and the high support and acquisition cost of adding dissimilar systems into the mix.
 
Stan James
(instanceof Sidekick)
Ranch Hand
Posts: 8791
Some core architecture usually becomes evident as you go through early requirements. Something like "Will be used by the general public without requiring them to install proprietary software" pretty well means it's going to be a web app. "Must support a million concurrent users, 24x7, 99.99% uptime" tells you to think about clustered redundant servers. You have no clues about Java vs .Net yet, so there are obviously some judgement calls or places for corporate standards.
The app I make my living on is an integration hub ... it interacts with dozens (and dozens more) applications on CICS, web services, Windows APIs, etc. Those requirements drive a lot of architecture, too.
Most software development methods have an early architectural "spike". A few key developers prove that server X can talk to database Y and middleware Z, and maybe build up some frameworks to make it easier to do before you bring on the coding hordes and start mass construction. The "agile" methods allow finer points of architecture to evolve over time. I think that can work if you have mature designers on board.
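A spike like that often boils down to a handful of go/no-go connectivity probes. The sketch below is a hypothetical harness (the probe stubs stand in for real JDBC/JMS checks, which you'd point at your actual drivers and URLs):

```java
import java.util.function.Supplier;

class SpikeCheck {
    // Runs a connectivity probe and reports pass/fail without throwing,
    // so one failing check doesn't abort the whole spike report.
    static boolean probe(Supplier<Boolean> check) {
        try {
            return check.get();
        } catch (RuntimeException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Stubbed probes for illustration; a real spike would do something
        // like () -> DriverManager.getConnection(url) != null instead.
        boolean dbOk = probe(() -> true);
        boolean mqOk = probe(() -> { throw new RuntimeException("broker down"); });
        System.out.println("database:   " + (dbOk ? "PASS" : "FAIL"));
        System.out.println("middleware: " + (mqOk ? "PASS" : "FAIL"));
    }
}
```

The value is less in the code than in the exercise: by the time the probes all pass, the team has proven the risky integrations before mass construction starts.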
Jump down the page to the Process UML/XP etc forum for lively discussions of this kinda thing!
 
A Putra
Greenhorn
Posts: 9
dear jeroen and stan,
thanks a ton, guys! very insightful. this is all evident in the daily stuff i do, but goddammit, i was not able to put it across like you have.