GitClear at scale

Results when applied to a 500+ developer healthcare company

Results at a Glance

5.5% increase

in overall organizational code output (as measured by Diff Delta)

37.5% decrease

in PR time under review

49.2% increase

in development effort toward test coverage


One of GitClear's clients is a leading provider of innovative healthcare technology and data solutions for medical practices, aiming to improve patient outcomes and provider efficiency. Their customized solutions empower patients and clinicians, while their award-winning technology drives high-performing practices toward whole person health and value-based care. They employ over 3,000 people, including 700 developers, and help care for more than 65 million patients in the US.

Customer Story

Two years ago, the company's CTO defined a set of KPIs that would be the focus of the engineering team in the quarters to follow. Articulating his goal to his team and to GitClear, he explained[1]:

"I want to be very prescriptive on these KPIs. I want to focus on how can we improve Productivity, Efficiency and Quality."

But to measure progress against these KPIs, the first step was to establish the existing baselines. Finding reliable, consistent development measurements is no easy feat when engineering work is spread across 100 remote development teams.

In short, they were facing the same hurdles that almost every enterprise company faces when considering which development metrics to adopt:

No known metric that can objectively measure progress over highly disparate operating parameters

Data-gathering is resource-intensive and can be inconsistent

Building a custom tool is costly, time-consuming, and likely to lack validation

Visibility across 100 teams entails trade-offs between cost and consistency

Data centralization and orchestration lives in a spreadsheet

"Ease of producing" a metric is inversely related to its applicability: the most "available" metrics are also the noisiest

Security must be priority #1

Since building a tool from scratch was not an option, our client began to search for a vendor that met their requirements. They needed a tool that could provide:

An easy and reliable set of metrics to track and improve KPIs

Fast and clear visibility that encompasses the whole organization

A clear and concise report for CTOs and individual managers

Easy setup and automation

Tangible, objective, consistent metrics

Benchmarks that allow comparing performance against industry norms

Analysis that can be undertaken without creating extra work for teams

Metrics that suggest follow-on actions

A secure system that can operate within their premises

Challenges and Objectives

Track and improve productivity, efficiency and quality

Visibility of high-level progress for over 100 remote teams distributed across multiple countries

Integrate charts and data into existing note-taking system

Imbue the entire development process with data that fortifies decision-making

Managers must feel comfortable communicating these goals to teams

Rollout and Measurement Process

"We wanted managers to take control of the dev process and put together action plans."

Their CTO outlined an interest in four key metrics:

The overall Diff Delta produced across the organization, ideally prorated based on head count

The under review time (i.e., time from open to close) of a typical pull request

The percentage of overall code volume (Diff Delta) dedicated to tests

The amount of follow-up work done on pull request code after it was merged to the repository's main branch (a number to be minimized)

At the start of each month, each manager received a snapshot of their team's performance on the four KPIs tracked by GitClear. These metrics were reviewed in detail, and managers would take appropriate action to correct any KPI on which their team was underperforming.
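
The monthly snapshot-and-review loop described above can be sketched as a comparison against the baseline month. This is an illustrative sketch, not GitClear's implementation; the `KpiSnapshot` fields and `compare_to_baseline` helper are hypothetical names chosen for this example.

```python
from dataclasses import dataclass

@dataclass
class KpiSnapshot:
    """One team's monthly reading of the four tracked KPIs (hypothetical shape)."""
    diff_delta: float        # overall Diff Delta produced during the month
    review_time_days: float  # typical PR time under review, in weekdays
    test_share: float        # fraction of Diff Delta landing in test code
    post_merge_share: float  # fraction of PR work done after merge

def percent_diff(baseline: float, current: float) -> float:
    """Signed percent change relative to the baseline month."""
    return (current - baseline) / baseline * 100.0

def compare_to_baseline(baseline: KpiSnapshot, current: KpiSnapshot) -> dict[str, float]:
    """Percent change of each KPI vs. the month before adoption."""
    return {
        "diff_delta": percent_diff(baseline.diff_delta, current.diff_delta),
        "review_time_days": percent_diff(baseline.review_time_days, current.review_time_days),
        "test_share": percent_diff(baseline.test_share, current.test_share),
        "post_merge_share": percent_diff(baseline.post_merge_share, current.post_merge_share),
    }
```

A review time falling from 1.6 to 1.0 weekdays, for example, shows up as a −37.5% change, matching the kind of percent-diff figures reported in the tables below.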

We'll take a deep dive into how the actual measurements worked out over the course of GitClear's rollout in the following sections.

Metric: Total Diff Delta

The first, and simplest, metric they analyzed was overall Diff Delta: the metric that quantifies, using empirically validated methods, the rate of codebase change across the organization's repositories. The rollout team was keen to observe how much was getting done overall, and how closely that correlated with team size. Hiring, layoffs, and organizational changes can deeply affect the resources allocated to a project, which in turn affects its overall rate of development. We tracked these changes over time using longer-term measurements, snapshotting the month before GitClear's introduction to the test team, then 3 months and 6 months after.

The chief concern was less a raw improvement in Diff Delta than keeping these numbers steady, given that three other metrics, each of which can affect the speed of development, would be introduced at the same time. They wanted to make sure the teams being surveyed could still deliver consistent results.

The results didn't just meet expectations, they exceeded them: teams improved their overall output even while controlling for quality- and reliability-related factors (see below).

Data: Before and after

Time period | Overall Diff Delta (△) | Percent diff
Month before GitClear adoption | |
3 months after | |
6 months after | |



Metric: Pull Request under review time

The first action target that managers identified was pull request under review time. When a PR's time under review stretched beyond one working day, it became a pain point, producing undesirable turnaround times.

Managers of a specific rollout team worked with their development teams to implement plans that reduced pull request size. Their efforts led to a measured drop in turnaround time for PRs. Over the course of 3 months, their agile teams drastically reduced time under review, meeting the target goal.
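
Measuring review time in "weekdays," as the table below does, means counting only Monday through Friday between a PR's open and close timestamps. Here is a minimal sketch of that calculation; it is an assumption about how such a figure could be computed, not GitClear's actual algorithm, and it ignores holidays and time zones.

```python
from datetime import datetime, timedelta

def weekdays_between(opened: datetime, closed: datetime) -> float:
    """Approximate duration in weekdays (Mon-Fri), counting fractional
    days and skipping weekend time entirely."""
    if closed <= opened:
        return 0.0
    total = 0.0
    cursor = opened
    while cursor < closed:
        # End of the current calendar day, or the close time if it comes first
        next_midnight = (cursor + timedelta(days=1)).replace(
            hour=0, minute=0, second=0, microsecond=0)
        segment_end = min(next_midnight, closed)
        if cursor.weekday() < 5:  # Monday=0 .. Friday=4
            total += (segment_end - cursor).total_seconds() / 86400
        cursor = segment_end
    return total
```

A PR opened Friday at noon and closed the following Monday at noon counts as 1.0 weekday under this rule: the weekend hours in between contribute nothing.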

Data: Before and after

Time period | Under Review Time | Percent diff
Month before GitClear adoption | 1.6 weekdays | (baseline)
1 month after | 1.2 weekdays | −25%
3 months after | 1.0 weekday | −37.5%



These results speak to the performance gains that objective measurement can unlock. Improvement follows measurement.

Metric: Test code

Managers, wanting to ensure their teams' code was well tested, set a goal to significantly increase the volume of test code written. To do this, they used our Diff Delta metric, which can be segmented into the various code domains (including test code). The percentage of code directed specifically into tests gives a reliable signal for how much of overall codebase development was meaningfully focused on test code.

Because this metric represents a slower, longer-term shift in development style, we measured it over slightly longer time periods. There was even an initial drop in the 3 months following GitClear's adoption, before the share rebounded and increased by more than enough to make up for it. Combining this metric with their code coverage metrics, managers could speak more confidently to their teams being sufficiently test-focused.

Data: Before and after

Time period | Test Code (share of △) | Percent diff
Month before GitClear adoption | |
3 months after | |
6 months after | |



Metric: Pull request review changes

Our client, like any company operating in a competitive and complex space, had a key interest in reducing friction in their PR review process, and in particular its chief manifestation: changes made during the PR review process.

Since GitClear can break down where a PR's changes take place (before the PR was submitted, during its review, or after it's merged), they could analyze what percentage of a typical PR's work happened while under review. Ideally, this number is minimized, indicating a smooth, low-friction PR review process. As a control against simply merging PRs quickly with minimal review, it was compared against both the relative percentage of work on bugs and the percentage of work done after PRs were merged, though neither of those metrics is covered in the scope of this case study.
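
The three-phase breakdown can be illustrated by bucketing each commit's timestamp against the PR's open and merge times. This is a simplified sketch under that assumption; the function names are hypothetical and GitClear's attribution logic may differ.

```python
from datetime import datetime

def classify_phase(committed_at: datetime,
                   opened_at: datetime,
                   merged_at: datetime) -> str:
    """Assign a commit to a PR phase by timestamp (illustrative rule)."""
    if committed_at < opened_at:
        return "pre_review"    # work done before the PR was submitted
    if committed_at <= merged_at:
        return "under_review"  # changes made while the PR was open
    return "post_merge"        # follow-up work after the merge

def under_review_share(commits: list[tuple[datetime, float]],
                       opened_at: datetime,
                       merged_at: datetime) -> float:
    """Fraction of a PR's Diff Delta produced during its review."""
    total = sum(delta for _, delta in commits)
    reviewed = sum(delta for ts, delta in commits
                   if classify_phase(ts, opened_at, merged_at) == "under_review")
    return reviewed / total if total else 0.0
```

Averaging `under_review_share` across a month's merged PRs would yield the "% of △ under PR review" figure tracked in the table below.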

Overall, they observed a marked drop in under review changes for pull requests across this 6-month sampling period. Combined with the drop in under review time, this suggests a considerably smoother PR process and faster delivery of key agenda items for the team under test.

Data: Before and after

Time period | % of △ under PR review | Percent diff
Month before GitClear adoption | |
3 months after | |
6 months after | |




How did the move toward developer-friendly data analytics translate to measurable results for our healthcare client?

Each individual team was empowered to take control of its processes and collaboratively deploy GitClear to address perceived inefficiencies. While the KPIs were set, managers' approaches to improving them were free-form and flexible, depending on each team's constraints and needs. Over the course of several office-hours sessions working alongside GitClear staff, gains were measured across each of the three KPIs the CTO had designated as targets. GitClear was able to suggest both how best to focus on the KPIs and how to communicate that focus to individual team members.

Efficiency Gains

Clear action plans and consistent execution across teams can make a big impact.

Thanks to the action plans their managers set in motion, 80% of the company's development teams reported tangible results: shorter pull request review times, and less code needing to change during the review and post-merge phases of a typical PR. GitClear's pull request phase graph documented their progress.

GitClear's Pull Request Stats include the oft-discussed conventional metrics like "Under Review Time" and "Cycle Time." But, beyond the ordinary metrics, GitClear paints a fuller picture, letting teams see what happens immediately after the pull request is merged. This is labeled "Post-Merge Work" in GitClear's pull request stats, and the company proved especially adept at minimizing the follow-on work that occurs when the PR review process gets shortchanged.

Bottom line: by using GitClear metrics to measure organizational performance, they decreased their pull request "Under Review Time" by over a third within 3 months.


Reliability Improvements

Code quality is an easy mark to miss. It's tempting to be persuaded by metrics demonstrating increasing productivity, but that productivity may be concentrated in a way that emphasizes raw volume of "stuff done" over actual positive impact on the codebase.

Our client aimed to ensure that at least 10% of the development effort would be invested into expanding and maintaining test coverage. Beyond the traditional estimations of "test coverage," GitClear allowed the team to see visual evidence that a long-term code quality focus was successfully propagated throughout the organization's software engineers.

Managers can see a breakdown of the code written across all the code domains (areas of interest) their teams develop in. They can see the percentage of test-oriented code alongside "library," "view," and other code types. This can also help developers compare their efficiency on front-end tasks against their back-end work.

Easy access to this metric enabled them to reach their target and continue monitoring it, keeping it roughly at the 15% sweet spot.

Productivity Improvements

It should come as no surprise that more coding time leads to more getting done, and GitClear lets executives see exactly how much more their managers are pushing across the finish line.

Having a reliable metric that unveils the amount of work happening inside your repositories is key to reproducible success. Our client's managers began tracking their development teams' coding effort with the help of GitClear's Historic Diff Delta Stats.

Using empirically validated data from the commits within the teams' git repositories, Historic Diff Delta Stats provide an objective review of how much coding work is happening inside the organization.

Going one step further, Diff Delta Per-Contributor Stats break down how much change is happening relative to the count of active contributors.
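
Prorating by contributor count is straightforward once "active" is defined. In this minimal sketch, an active contributor is assumed to be any author with at least one commit in the period; that rule is an assumption for illustration, not GitClear's definition.

```python
def diff_delta_per_contributor(total_diff_delta: float,
                               commits_by_author: dict[str, int]) -> float:
    """Prorate a period's Diff Delta across active contributors.

    Assumption: 'active' means at least one commit in the period.
    """
    active = [author for author, n in commits_by_author.items() if n > 0]
    return total_diff_delta / len(active) if active else 0.0
```

This lets per-head output stay comparable across months even as headcount fluctuates, which is exactly the proration the CTO asked for in the KPI list above.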

Our client was able to put their newly freed resources to use, improving efficiency and reliability while improving their overall Diff Delta (by 5.2%) over that time. This suggests that, even amid fluctuating headcount and shifting processes, they could track and confirm that their overall output was not just holding steady, but improving.

[1] GitClear executive kickoff meeting, October 2022.