
Performance Analysis

 
Michael Drexler
Greenhorn
Posts: 2
Hello,
I have two questions about measuring the performance of Java applications/methods.

As a student of computer science I am currently attending a course/seminar dealing with IT security. As an assignment I must implement a Java application to measure the performance of various encryption/signature algorithms. To my mind, the easiest way to do so is to implement a method encrypt(...) that returns its own execution time, determined by System.nanoTime(). If you call this method in a loop, you can easily determine average/max/min execution times. Are there any other ways to measure the performance of specific methods (average/min/max execution times)?
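Roughly what I have in mind (a minimal sketch, assuming an AES cipher from the standard javax.crypto API; the class and method names are just for illustration):

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public class EncryptBenchmark {

    /** Encrypts the data once and returns the elapsed wall-clock time in nanoseconds. */
    static long timeEncrypt(Cipher cipher, SecretKey key, byte[] data) throws Exception {
        long start = System.nanoTime();
        cipher.init(Cipher.ENCRYPT_MODE, key);
        cipher.doFinal(data);
        return System.nanoTime() - start;
    }

    public static void main(String[] args) throws Exception {
        KeyGenerator gen = KeyGenerator.getInstance("AES");
        gen.init(128);
        SecretKey key = gen.generateKey();
        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        byte[] data = new byte[1024];

        // Call the timed method in a loop and track average/min/max.
        int runs = 1000;
        long min = Long.MAX_VALUE, max = Long.MIN_VALUE, total = 0;
        for (int i = 0; i < runs; i++) {
            long t = timeEncrypt(cipher, key, data);
            min = Math.min(min, t);
            max = Math.max(max, t);
            total += t;
        }
        System.out.printf("avg=%dns min=%dns max=%dns%n", total / runs, min, max);
    }
}
```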

Another problem when it comes to measuring performance is that garbage collection could kick in at any time, slowing down the encryption/signing process and skewing the results. Is it possible to avoid this, for instance by temporarily turning off garbage collection?

Thanks in advance,
Mike
 
Paul Clapham
Sheriff
Posts: 21576
It isn't obvious to me why the existence of garbage collection falsifies your results. For example, if you are comparing algorithm A to algorithm B, and algorithm B produces many more temporary objects than algorithm A, then having to garbage collect those temporary objects is a necessary consequence of the algorithm. In this case I believe that ignoring the effects of garbage collection would result in falsifying the results.

I think your problem here is that garbage collection is non-deterministic. So measuring the time to execute an algorithm once has an unpredictable component. But that doesn't mean you should ignore the unpredictable component. It just means you should use statistical techniques that account for it properly. You should be using those statistical techniques in your benchmarking anyway, to account for all the other non-deterministic effects like JVM startup time and JIT compiling.
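For example, one common pattern is to run the code untimed a few hundred times first (so JIT compilation has already happened), then collect many timed samples and report a mean together with a standard deviation, which reports the run-to-run noise instead of hiding it. A rough sketch (all names here are made up for illustration):

```java
import java.util.Arrays;

public class StatsBenchmark {

    /** Runs the task untimed for the warm-up phase, then times it n times. */
    static long[] sample(Runnable task, int warmup, int n) {
        for (int i = 0; i < warmup; i++) task.run();
        long[] samples = new long[n];
        for (int i = 0; i < n; i++) {
            long start = System.nanoTime();
            task.run();
            samples[i] = System.nanoTime() - start;
        }
        return samples;
    }

    static double mean(long[] xs) {
        double sum = 0;
        for (long x : xs) sum += x;
        return sum / xs.length;
    }

    /** Sample standard deviation; GC pauses and other noise show up here. */
    static double stdDev(long[] xs) {
        double m = mean(xs);
        double ss = 0;
        for (long x : xs) ss += (x - m) * (x - m);
        return Math.sqrt(ss / (xs.length - 1));
    }

    public static void main(String[] args) {
        long[] samples = sample(
                () -> Arrays.sort(new java.util.Random(42).ints(10_000).toArray()),
                200, 1000);
        System.out.printf("mean=%.0fns stddev=%.0fns%n", mean(samples), stdDev(samples));
    }
}
```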
 
Manhar Puri
Ranch Hand
Posts: 41
Just curious: isn't this like asking what happens if some important OS daemon process kicks in while the program for algorithm A is running and takes up all the CPU?

What I would suggest (in case you want to ignore the fact that algorithm B creates more temporary objects than algorithm A and is hence less efficient in the first place) is to run your programs several times so that you average out these unpredictable events. Also, increase the memory allocation to your JVM.

Again just a thought.

-Manhar.
 
steve souza
Ranch Hand
Posts: 862
Check out jamonapi.com. Within the next few weeks there will be a new, powerful release that will allow you to monitor ANY interface with one line of code. This will show its power most when working with JDBC interfaces: simply wrap the Connection interface and all methods of all JDBC interfaces will automatically be monitored. In addition there will be a rolling buffer of any exceptions thrown, and the most recent SQL executed. JAMon will track aggregate stats for all SQL (i.e. select * from table where key=?) and methods invoked. Some examples of how easy this will be to do follow:

 
Michael Drexler
Greenhorn
Posts: 2
Thanks a lot for your help! I think I will have a closer look at JAMon...

Bye,
Michael
 