Last evening (Tuesday) was a great evening. I hosted the first-ever panel on Java performance at the Chicago Java Users Group. CJUG's PR, Janet Traub, made it happen by getting the support of Bank of America, so we used the great conference room at the bank's location on LaSalle Street, adjacent to the Chicago Board of Trade.
Since this was the first time we were holding such an exercise, we invited panelists from the Chicagoland region only. Why go to the trouble of inviting panelists from out of town, only to disappoint them if the event fell short of their expectations?
OK, let's get to the story now.
When it comes to Java and the topic is performance, the first thing that tends to elude people is the JVM itself. Very few Java-knowledgeable people know about JVM tuning and settings; they take the JVM for granted. Hence CJUG was very fortunate to have Brian Doherty from Sun Microsystems on the panel. Brian is a Chicago native, and he works in the performance group dealing with the Sun JVM, so when it comes to benchmarking, Brian is the guy. We needed a representative from BEA to talk about JRockit and the BEA suite of products, and Siva Krishnan from BEA was present on the panel. There were two representatives from Oracle: Phil Bergman (Principal Solutions Architect, J2EE Application Server) and Phil McGlaughlin, formerly of Actional and now at Oracle.
Jon Hansen, formerly of Sun Professional Services and now an independent consultant, was on the panel. Jon is an expert in real-world deployment of SOA, XML, and Web Services for many big customers.
Tim O'Brien, an independent consultant and author of a book on Maven and Jakarta Commons, was also on the panel. Rob Lambert, an independent consultant with experience in the Spring Framework, was on the panel as well, though he did not get much of a window to participate in the discussion. Dave Orme, Eclipse Visual Editor lead and currently working for db4objects, was on the panel too. Brian Pontarelli, CJUG president and Senior Engineer at Orbitz, was there as well.
Questions from attendees:
Do vendors like Sun, BEA, JBoss, Oracle, etc. run performance benchmarks before they release a product?
The answer from the vendors was that they do run a lot of industry-standard benchmarks before releasing their products. The general consensus was that benchmarks are often run to clinch sales deals, since customers ask for benchmark numbers. Benchmarks do not always cover the real performance issues that may exist, but they do provide a lot of insight into the product for the engineers running them.
What steps can one take to achieve good performance?
Brian Doherty cited Donald Knuth's famous line: "Premature optimization is the root of all evil." There was some debate over whether it is better to implement an algorithm first and optimize it later, or to write a simple, maintainable implementation from the start.
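To make that debate concrete, here is a small sketch (my own illustration, not code from the panel) of the "write it simply first" position: a clear, maintainable method, with measurement deferred until profiling says it matters. The class and data are hypothetical.

```java
public class SumExample {
    // Straightforward, maintainable version. Per the Knuth principle,
    // this stays as-is unless profiling shows it is actually a hot spot.
    static long sum(int[] values) {
        long total = 0;
        for (int v : values) {
            total += v;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] data = new int[1000];
        for (int i = 0; i < data.length; i++) {
            data[i] = i;
        }
        // Measure before optimizing: a crude timing around the simple version.
        long start = System.nanoTime();
        long result = sum(data);
        long elapsed = System.nanoTime() - start;
        System.out.println("sum=" + result + " in " + elapsed + " ns");
    }
}
```

The point is not the loop itself but the order of operations: implement clearly, measure, and only then decide whether optimization is worth the maintainability cost.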
General notes from the panel:
- Java Performance is the one-stop location for Java performance information.
- VisualGC and JConsole are great tools to attach to a JVM in production, where heavier profiling tools like JProbe are not recommended.
- Brian Pontarelli discussed how Orbitz was revolutionizing its computing via Jini, and the various benefits of Jini over traditional RMI.
- Some panelists agreed that the root cause of performance issues, or of a project being in a mess, is often not technology at all but personnel and business expectations.
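On the monitoring-tools note above: JConsole reads the JVM's platform MXBeans over JMX, and the same figures can be queried in-process via `java.lang.management`. A minimal sketch (my own, not from the panel) of pulling the heap numbers that JConsole's Memory tab displays:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapSnapshot {
    public static void main(String[] args) {
        // The platform MemoryMXBean is the same source JConsole
        // reads remotely over JMX; here we query it in-process.
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memory.getHeapMemoryUsage();
        System.out.println("heap used:      " + heap.getUsed() + " bytes");
        System.out.println("heap committed: " + heap.getCommitted() + " bytes");
        System.out.println("heap max:       " + heap.getMax() + " bytes");
    }
}
```

Because this is just a read of existing management beans, the overhead is negligible, which is what makes JMX-based tools like JConsole safer to attach in production than a full profiler.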