Java performance tools and teams survey: Who is doing what?

Esther Shein, Freelance writer

What's the best way to tune Java performance? That's a complicated question, and RebelLabs has a ton of answers in its just-released sixth annual Developer Productivity Report.

RebelLabs surveyed 1,562 of its community members and found:

  • Most performance issues are found by users, pointing to a need for better testing.
  • Most respondents saw 50 percent or greater speedups from testing—yet a quarter of respondents didn't do profiling at all.
  • 20 percent of teams write in-house tooling to run performance tests.
  • In-house tools and JProfiler find more performance issues than any other tools.
  • Nearly 50 percent of teams use multiple tools while testing.
  • Using more tools increases your chances of finding performance issues.
  • Dedicated performance teams are more likely to find issues—over twice as likely as the operations team.
  • Dedicated performance teams spend twice as long as software developers fixing performance issues.
  • Performance issues are almost always fixed by developers, regardless of who finds them.

Java profiling tools

Most performance issues are discovered through user feedback and system crashes. "In short, we as an industry are failing to test properly," the RebelLabs report points out. In fact, over a quarter of respondents didn't spend any time on profiling and performance monitoring.

Figure 1.14

Teams that do profile get the best results with in-house tools. This shows that "applications can be different enough that a third-party tool isn't enough," observes Simon Maple, Java champion and developer advocate at ZeroTurnaround, which sponsored the survey.

Figure 3.1

However, third-party tools are still important. Nearly half of respondents use VisualVM. Also popular are JProfiler, Java Mission Control, and YourKit.
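Java Mission Control, for instance, analyzes Java Flight Recorder data. As a minimal sketch (not from the report), assuming a JDK 11+ runtime where Flight Recorder ships with the JDK, a recording can even be captured programmatically with the `jdk.jfr` API and then opened in Mission Control:

```java
import jdk.jfr.Recording;
import java.nio.file.Files;
import java.nio.file.Path;

public class FlightRecorderDemo {
    public static void main(String[] args) throws Exception {
        // Start an in-process Flight Recorder session (JDK 11+).
        try (Recording recording = new Recording()) {
            recording.start();

            // Illustrative workload to profile: allocate and sort an array.
            int[] data = new java.util.Random(42).ints(1_000_000).toArray();
            java.util.Arrays.sort(data);

            recording.stop();
            Path out = Path.of("profile.jfr");
            recording.dump(out); // open this file in Java Mission Control
            System.out.println("wrote " + Files.size(out) + " bytes");
        }
    }
}
```

The workload and output path here are invented for the demo; in practice, recordings are more often started with the `-XX:StartFlightRecording` JVM flag.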

Figure 1.12

Almost half the development teams use multiple tools, and report that using a variety of tools helps them find issues. Those in QA, a dedicated performance team, or operations are 40 percent more likely to use custom, in-house tools.

The role of teams

The structure of the team plays a major role in performance optimization. Dedicated performance teams spend almost 50 percent longer diagnosing, fixing, and testing performance issues than other teams. Not surprisingly, dedicated performance teams are also more likely to find performance-related bugs—twice as likely as operations teams.

Figure 2.4

Regardless of who finds performance issues, developers almost always fix them.

The report finds that the teams with the happiest end users tend to be smaller, with less complex applications. Such teams are more proactive (40 percent of those respondents profile on a daily or weekly basis) and less reactive (they are 20 percent less likely to test only when issues crop up).

Early performance testing is key

Early testing is also key to happy users: teams with happy users test 36 percent more often while they code.

"Performance testing is often considered a post-development exercise by the masses, yet numbers show it can be more effective when run earlier," Maple says. While there are obviously some tests that can only be run later under real or simulated production load, "if it's possible to find, diagnose, and fix performance bugs while they're being written, it doesn't make sense putting it off till later."

Performance testing is similar to functional testing in that if a developer tests and fixes earlier, it's cheaper and provides better results, Maple maintains. "It's a team mindset that Java teams need to adopt."
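As a toy illustration of catching a performance bug while the code is still being written, the sketch below (the workload and sizes are invented for the demo) compares a quadratic String-concatenation loop with a linear StringBuilder, the kind of check a developer can run in seconds before the code ever reaches QA:

```java
public class EarlyPerfCheck {
    public static void main(String[] args) {
        int n = 20_000;

        // Naive version: repeated String concatenation copies the whole
        // string on every iteration, so the loop is quadratic in n.
        long t0 = System.nanoTime();
        String s = "";
        for (int i = 0; i < n; i++) s += "x";
        long concatMs = (System.nanoTime() - t0) / 1_000_000;

        // Fixed version: StringBuilder appends in amortized constant time.
        t0 = System.nanoTime();
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) sb.append("x");
        String s2 = sb.toString();
        long builderMs = (System.nanoTime() - t0) / 1_000_000;

        System.out.println("same result: " + s.equals(s2));
        System.out.println("concat ms: " + concatMs + ", builder ms: " + builderMs);
    }
}
```

The absolute timings will vary by machine and JIT warmup; the point is that the comparison is cheap enough to run during development.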

Figure 1.7

Performance issues and root causes

According to the survey, the most commonly reported performance issue is user requests failing to complete. Slow requests and application and server outages were also common issues.

Respondents zeroed in on database issues when asked about root causes. Slow database queries, too many database queries, and inefficient code all stood out as major problems.
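The "too many database queries" root cause often takes the form of the classic N+1 pattern: one query to fetch a list of rows, then one more query per row. A minimal sketch of the difference, using an in-memory map and a query counter as stand-ins for a real database (no actual JDBC here):

```java
import java.util.*;
import java.util.stream.Collectors;

public class NPlusOneDemo {
    // Toy stand-in for a database table; each lookup method counts as one query.
    static final Map<Integer, String> AUTHORS = Map.of(1, "Ada", 2, "Linus", 3, "Grace");
    static int queryCount = 0;

    // Like: SELECT name FROM authors WHERE id = ?
    static String queryAuthor(int id) {
        queryCount++;
        return AUTHORS.get(id);
    }

    // Like: SELECT id, name FROM authors WHERE id IN (...)
    static Map<Integer, String> queryAuthors(Collection<Integer> ids) {
        queryCount++;
        return ids.stream().distinct()
                  .collect(Collectors.toMap(id -> id, AUTHORS::get));
    }

    public static void main(String[] args) {
        List<Integer> postAuthorIds = List.of(1, 2, 1, 3, 2, 1);

        // N+1 pattern: one round trip per row.
        queryCount = 0;
        for (int id : postAuthorIds) queryAuthor(id);
        System.out.println("per-row queries: " + queryCount);

        // Batched pattern: a single IN query covers every row.
        queryCount = 0;
        queryAuthors(postAuthorIds);
        System.out.println("batched queries: " + queryCount);
    }
}
```

With six rows, the per-row version issues six queries where the batched version issues one; against a real database, each extra query adds a network round trip.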

Figure 1.16

The report notes that disk I/O and network I/O—often considered bottlenecks—didn't seem that important relative to other issues.

Does profiling matter?

When asked about the impact of profiling and performance testing, a quarter of respondents said it only made their application "marginally faster." But most respondents reported gains of at least 50 percent.

Figure 1.20

Just as important, a quarter of respondents didn't know how to measure the impact or didn't do testing in the first place. So it's likely that the benefits of performance testing are being significantly underestimated.

So is testing worth it? Perhaps the best summary comes from the report: "We have wonderful tools available, like JMeter and Gatling, which provide us with benchmarking for our applications and should really be used to test for performance fixes."
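JMeter and Gatling drive real load against a deployed application. As a self-contained sketch of the same idea (the endpoint, request count, and percentile choice are all invented for the demo, not taken from either tool), the code below spins up a throwaway in-process HTTP endpoint and measures per-request latency with `java.net.http.HttpClient` (JDK 11+):

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Arrays;

public class MiniLoadTest {
    public static void main(String[] args) throws Exception {
        // Tiny in-process endpoint standing in for the application under test.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/ping", ex -> {
            byte[] body = "pong".getBytes();
            ex.sendResponseHeaders(200, body.length);
            ex.getResponseBody().write(body);
            ex.close();
        });
        server.start();
        URI uri = URI.create("http://localhost:" + server.getAddress().getPort() + "/ping");

        // Fire a fixed number of requests and record each latency.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(uri).build();
        int n = 50;
        long[] latencyMicros = new long[n];
        int ok = 0;
        for (int i = 0; i < n; i++) {
            long t0 = System.nanoTime();
            HttpResponse<String> resp =
                client.send(request, HttpResponse.BodyHandlers.ofString());
            latencyMicros[i] = (System.nanoTime() - t0) / 1_000;
            if (resp.statusCode() == 200) ok++;
        }
        server.stop(0);

        // Report success rate and the 95th-percentile latency.
        Arrays.sort(latencyMicros);
        System.out.println("ok: " + ok + "/" + n);
        System.out.println("p95 latency (us): " + latencyMicros[(int) Math.ceil(n * 0.95) - 1]);
    }
}
```

Real load tools add concurrency, ramp-up schedules, and richer reporting; the sketch only shows the core loop of sending requests and summarizing latencies.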
