
Basic info on JARs and app servers....

 
Janus Tetsuo
Greenhorn
Posts: 11
I'm a little confused about exactly what goes on with respect to the SDK, the JRE, and an application server.
I develop my apps and compile them with the classpath pointing to the SDK, including the JARs my program needs to run.
When I deploy the program to my app server, does it still use all the JARs it needed at compile time?
I know I need some of the JARs, since I've been told to include them when I build the WAR file, but does the program need every JAR it was compiled against when it is running?
Does it use the JARs in the app server? If so, could a problem occur if the JARs in the app server are not identical to the JARs in the SDK?
I've fallen into the habit of putting JARs everywhere just to make sure my stuff works. I know that's bad, so I'd appreciate a little clarity on how all of this works at a macro level. Thanks for any help.
 
Jeanne Boyarsky
author & internet detective
Marshal
Posts: 35719
Janus,
The program needs access to the jars (or equivalent jars containing code with the same API) at runtime too. So if the server has its own copy of a jar, it is fine to use that instead of your local copy; this often happens with jars the server itself needs in order to run. Unless it is some common jar like xerces.jar or j2ee.jar, though, you are better off including the jar in the WAR so you know it is the same version you compiled against.
The reason you need all the jars at runtime is that the compiler does not copy their contents into your .class files. It only records a symbolic reference, which tells the JVM to look up a class with that name on the classpath when the program runs.
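For example, here is a rough sketch of what that looks like inside a WAR (the name myapp.war and the library jar are just placeholders for illustration):

    myapp.war
      index.jsp
      WEB-INF/
        web.xml
        classes/                 (your compiled .class files)
        lib/
          commons-lang.jar       (placeholder for jars you bundle yourself)

Jars in WEB-INF/lib are loaded by the web app's own class loader, so you control exactly which version is used; container-wide jars like j2ee.jar are supplied by the server's classpath instead and stay out of the WAR.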
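To make that concrete, here is a tiny sketch; the class Greeter, its package, and the jar name some-lib.jar are invented purely for illustration:

    // Hello.java
    import com.example.somelib.Greeter;  // hypothetical class that lives in some-lib.jar, not in your code

    public class Hello {
        public static void main(String[] args) {
            // Compiling this does NOT copy Greeter's bytecode into Hello.class.
            // Hello.class only stores the name "com.example.somelib.Greeter",
            // which the JVM resolves from the classpath when the program runs.
            Greeter greeter = new Greeter();
            System.out.println(greeter.greet("world"));
        }
    }

You need some-lib.jar on the classpath to compile (javac -cp some-lib.jar Hello.java) and again to run (java -cp .:some-lib.jar Hello, using ; instead of : on Windows). Leave the jar off at runtime and the JVM throws NoClassDefFoundError for com/example/somelib/Greeter, even though the code compiled fine.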
 