How much time does it take to execute a loop?
You can try something like this:
long startTime = System.currentTimeMillis();
for (int i = 0; i < 1000000; i++) {
    // Something
}
long endTime = System.currentTimeMillis();
long timeNeededSeconds = (endTime - startTime) / 1000; // subtract start from end, then convert ms to seconds
One way to time an operation is to average many runs using nanoTime(). You may want to adjust the number of iterations; averaging gives you less variation. nanoTime() is better than currentTimeMillis() because it has higher resolution and is monotonic (it won't go backwards while the application is running).
int runs = 1000 * 1000;
long start = System.nanoTime(); // start the clock after setup so only the loop is timed
for (int i = 0; i < runs; i++) {
    // do test
}
long time = System.nanoTime() - start;
System.out.printf("The average time taken was %.1f ns%n", (double) time / runs);
Using printf allows you to format the result. Since the average is in nanoseconds, you can divide by 1,000 to get microseconds or by 1,000,000 to get milliseconds.
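For example, a minimal sketch of the same report in other units, assuming the time and runs variables from the snippet above:

// Average is in ns; scale to microseconds and milliseconds for readability.
System.out.printf("The average time taken was %.3f us%n", (double) time / runs / 1000.0);
System.out.printf("The average time taken was %.6f ms%n", (double) time / runs / 1000000.0);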
You need to be very careful when writing micro-benchmarks in Java:
If the JIT compiler can figure out that the loop body doesn't affect the results of the code, it can optimize it away. For instance:
for (int i = 0; i < 1000000; i++) { int j = i + 1; }
is likely to "run" very fast.
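One common defence (a sketch, not part of the original answer) is to accumulate a result in the loop and use it afterwards, so the computation stays observable to the JIT:

long sum = 0;
for (int i = 0; i < 1000000; i++) {
    sum += i + 1; // the result feeds into sum instead of being thrown away
}
System.out.println(sum); // using the value makes it much harder for the JIT to eliminate the loop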
Code runs a lot faster after it has been JIT compiled.
Code can appear to run a lot slower while it is being JIT compiled.
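A common mitigation is to run the code untimed first so it has been JIT compiled before you start measuring. Here is a sketch, assuming the runs variable from earlier and a hypothetical doTest() method holding the code under test:

// Warm-up phase: run the code enough times that the JIT compiles it.
for (int i = 0; i < 10000; i++) {
    doTest(); // doTest() is a hypothetical method containing the code under test
}
// Timed phase: measure only after warm-up.
long start = System.nanoTime();
for (int i = 0; i < runs; i++) {
    doTest();
}
long time = System.nanoTime() - start;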
If the code allocates objects, then you need to take account of potential variability of measured performance due to the GC running, the initial or maximum heap size being too small and so on.
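One partial mitigation (a sketch, not from the original answer) is to hint a collection and note heap usage just before the timed run; you can also fix the heap size with launch options such as -Xms and -Xmx so it doesn't resize mid-run:

// Reduce (but not eliminate) GC noise: System.gc() is only a hint to the JVM.
System.gc();
Runtime rt = Runtime.getRuntime();
System.out.printf("heap used before run: %d bytes (max %d bytes)%n",
        rt.totalMemory() - rt.freeMemory(), rt.maxMemory());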
And of course, performance will depend on your hardware, your operating system, the version and patch level of your JVM, and your JVM launch options.