
Java Packages and Impact on compilation

 
Kunal Aher
Greenhorn
Hi,

I am currently working with autogenerated code consisting of more than 50,000 Java classes, all of which have the same package name. The code is compiled using automated scripts. Since repackaging is a major activity that would destabilize the system, it would be nice to know whether reorganizing the code into separate packages would have any positive impact on building the code base.

Thanks & Regards
 
William Brogden
Author and all-around good cowpoke
Rancher
I would bet $5 that it would have zero impact.

The compiler is still going to have to do the same work.

Bill
 
Jeanne Boyarsky
author & internet detective
Marshal
Kunal,
How long does the build take now? Figuring out how much time you hope to save is an important criterion in deciding whether to do anything about it.

I agree with William that rebuilding all 50K classes is likely to take the same amount of time. But do you need to rebuild them all each time? If there are infrequently updated classes, you can build those once, jar them up and then use that jar in the main build. This would save time because you wouldn't be doing a full compile each time.
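
As a rough sketch of what I mean (the paths, file names, and the stable/active split here are all hypothetical), you could drive the compiler in two stages so the stable jar is rebuilt only when those sources actually change:

import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

// Hypothetical two-stage build: compile the rarely-changing generated
// sources once into their own output directory, then compile only the
// active sources against the prebuilt jar on every build.
public class TwoStageBuild {
    public static void main(String[] args) {
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();

        // Stage 1 (run only when the stable sources change). Afterwards,
        // package the output, e.g.: jar cf lib/stable.jar -C build/stable .
        javac.run(null, null, null,
                "-d", "build/stable", "src/stable/GeneratedFoo.java");

        // Stage 2 (every build): compile the active sources against the
        // prebuilt jar instead of recompiling all 50K classes.
        javac.run(null, null, null,
                "-cp", "lib/stable.jar",
                "-d", "build/active", "src/active/Main.java");
    }
}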
 
Paul Sturrock
Bartender
If you have 50,000 files in one directory, some operating systems will begin to show a degradation in performance (NT, I'm looking at you). Compilation itself may not be sped up, but file I/O may be if these were refactored.
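
If you want to check whether the directory itself is the problem, a quick timing of a bare listing will tell you (a throwaway sketch; the default path is made up):

import java.io.File;

// Crude timing of a directory listing, to see whether raw file I/O on
// the huge generated-source directory is actually a bottleneck.
public class ListTiming {
    public static void main(String[] args) {
        File dir = new File(args.length > 0 ? args[0] : "src/com/mycorp/generated");
        long start = System.nanoTime();
        String[] names = dir.list();
        long elapsedMs = (System.nanoTime() - start) / 1000000;
        System.out.println((names == null ? 0 : names.length)
                + " entries listed in " + elapsedMs + " ms");
    }
}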

(50,000 classes in one package - how do you find anything?!)
 
Tim Holloway
Bartender
Actually, it's not the OS that is the determining factor with large directories; it's the filesystem. A classic example was the Commodore Amiga. The original filesystem was optimized for minimal file-open time using hashed directories, and as a result it was dog-slow at updating the desktop display, where directories were read sequentially. So the OS team wrote a new fast filesystem that sped up that aspect while retaining support for the original filesystem.

On the other hand, Windows isn't a place where people routinely tune their filesystem choices (at least beyond FAT, FAT32 and NTFS), so I'll let that dart stick.

I always had the impression that the Sun Java compiler only recompiled source in which it detected a change (though I never knew what the change-detection mechanism was). If that's true, it could alter the timings as well.
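
If the detection lives in the build script rather than the compiler, it is most likely a plain timestamp comparison between each .java file and its .class file, something like this (a hypothetical sketch; the paths are invented):

import java.io.File;

// Sketch of the staleness check build tools such as Ant perform before
// passing a source file to javac: recompile when no class file exists
// or the source has been modified more recently than the class file.
public class UpToDateCheck {
    static boolean needsRecompile(File source, File classFile) {
        return !classFile.exists()
                || source.lastModified() > classFile.lastModified();
    }

    public static void main(String[] args) {
        File src = new File("src/com/example/Foo.java");
        File cls = new File("build/com/example/Foo.class");
        System.out.println("recompile? " + needsRecompile(src, cls));
    }
}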

A better reason for splitting out packages is that it separates the program into functional areas, which should make it easier for people to find things and to deduce what they're for. From a safety point of view, it reduces the visibility (and hence the vulnerability) of package-scope resources, and consequently keeps down the amount of bookkeeping needed to track those items. And, of course, expect a massive hit if you do an "import com.mycorp.bigpackage.*;", which some IDEs will do for you if you exceed their single-class import threshold.
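
To illustrate the package-scope point (all the names here are invented): once the code is split up, a package-private class is reachable only from its own package, so both the compiler and the reader have a much smaller set of candidates to consider.

// File com/mycorp/billing/RateTable.java
package com.mycorp.billing;

// Package-private: visible only inside com.mycorp.billing.
class RateTable {
    static double lookup(String code) {
        return 1.0; // placeholder rate
    }
}

// File com/mycorp/shipping/Quote.java
package com.mycorp.shipping;

public class Quote {
    public double price(String code) {
        // Would not compile: RateTable is package-private in another
        // package, so it is invisible here.
        // return com.mycorp.billing.RateTable.lookup(code) * 2;
        return 0.0;
    }
}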