How to Involve Developers in the Quality and Performance Process (Without Losing Them)

Published on 10 June 2025

Involving developers in software performance initiatives can make or break your quality process. At WedoLow, we’ve seen both sides: pushback when performance audits feel imposed, and deep engagement when teams are brought into the process as co-owners. Here’s how to build that second scenario, and why it matters.

In this article, we’ll break down the challenges we face when working with a team of developers to carry out a performance audit of their software applications. We’ll then dive into how to work closely with the devs to “embed” them in the performance analysis and optimization process.

This step is a valuable addition to your development workflow if your goal is to systematically assess the performance of your embedded software.   

Why Do Developers Resist Performance Audits?

Do your developers cheer when you set up a code quality or performance audit? 

We didn’t think so. We’ve heard reactions like: “Another audit? We’re going to waste a week fixing things that were working just fine…”

We’ve even seen a project where we demonstrated over 70% performance gains on a battery-powered embedded application, yet we still struggled to get the developers on board to implement the fixes alongside us.

I hear you: “the fixes must have been difficult to implement, or they must have hurt readability and maintainability.” And guess what? That wasn’t even true in this case! But now, with a little more hindsight than when WedoLow started out, we can understand why.

When we talk about software quality, code review, or auditing, developers’ first reaction is often distrust. Not because they’re against the idea of doing things right, but because they have too often seen these approaches imposed from above: without explanation, without being heard, and without business context. An audit can also be heard as “they’re going to point out what I’ve done wrong.”

This is the best way to nip the initiative in the bud. If performance efforts feel imposed without logic or benefit, you’re asking for resistance.

Imposed quality becomes a chore. Co-constructed quality becomes a culture.

3 Levers to Build Developer Engagement in Performance Optimization

1. Make developers actors, not executors

  • Rather than imposing rules, involve devs in decisions about their code, its quality, and its performance.

Workshops, open discussions, voting on code standards: everyone needs to have their say. 

It’s also essential to point out the difference between “developing software for a given functionality” and “optimizing its performance”. 

At WedoLow, we’re not experts in image processing, signal processing, ADAS… and that’s okay. Our business and our expertise are clearly different from those of the users of our product. And that’s why it works! 

The aim of implementing a software performance analysis and optimization tool is to facilitate the day-to-day work of developers, not to replace them.

 

2. Explain why, not just how

  • Announcing “we’re adding a performance testing phase to our CI/CD pipeline” isn’t enough (we sketch what such a check might look like after the list below).

You need to explain why:

  • to guarantee performance as early as possible in the development process, 
  • to meet product constraints (execution time, memory, autonomy…) more easily, 
  • to facilitate the addition of features or updates in the future,
  • to reassure the customer,
  • to speed up delivery by avoiding a last-minute performance optimization phase that hurts everyone and blows up delivery timelines. 

When the meaning is clear, the effort is accepted.
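
To make the CI/CD point concrete, here is a minimal sketch of the kind of check such a phase could run. It is an illustration, not a prescription: the benchmark command, the 150 ms budget, and the file names are hypothetical placeholders, not part of any specific tool.

```python
#!/usr/bin/env python3
"""Minimal CI performance gate (sketch): run a benchmark binary a few
times, take the median execution time, and fail the pipeline if it
exceeds an agreed budget. Command and budget are placeholders."""

import statistics
import subprocess
import sys
import time

BENCHMARK_CMD = ["./build/benchmark_app", "--dataset", "test_vectors.bin"]  # hypothetical
TIME_BUDGET_S = 0.150   # budget agreed with the team, e.g. 150 ms
RUNS = 5                # several runs to smooth out measurement noise


def measure_once() -> float:
    """Return the wall-clock duration of one benchmark run, in seconds."""
    start = time.perf_counter()
    subprocess.run(BENCHMARK_CMD, check=True, capture_output=True)
    return time.perf_counter() - start


def main() -> int:
    durations = [measure_once() for _ in range(RUNS)]
    median = statistics.median(durations)
    print(f"median execution time: {median * 1000:.1f} ms "
          f"(budget: {TIME_BUDGET_S * 1000:.1f} ms)")
    if median > TIME_BUDGET_S:
        print("performance budget exceeded: failing the pipeline")
        return 1
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Wired into the pipeline, a check like this makes the “why” tangible: the performance constraint is visible on every merge request instead of being discovered at the end of the project.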

 

3. Value quality and performance efforts in team rituals 

  • If performance improvements aren’t visible, they won’t be valued.

Build a culture that rewards and reinforces quality over time:

  • point out “good examples”, 
  • celebrate performance analyses that bring performance debt back to zero, or show improvement over time, 
  • encourage smart refactoring.

It’s also essential to set aside the time to actually implement these improvements.

If you’re a fan of code quality analysis with Sonar, for example, you know that configuring everything takes a bit of dedication. But you only do it once, and then it runs across all your projects. You need to invest that time so that the checks are automated and systematic. The same goes for performance debt analysis.
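
In the same spirit, and purely as an illustration (the file name, fields, and command-line usage below are hypothetical), here is a minimal sketch of how measurements could be recorded over time so that improvements can actually be shown in team rituals rather than remaining anecdotal:

```python
#!/usr/bin/env python3
"""Sketch: append each performance measurement to a small history file
and report the trend since the first recorded run. File name and fields
are placeholders; adapt them to whatever you actually measure."""

import csv
import datetime
import pathlib
import sys

HISTORY_FILE = pathlib.Path("perf_history.csv")  # hypothetical location


def record(commit: str, duration_ms: float) -> None:
    """Append one measurement (commit hash + execution time) to the history."""
    new_file = not HISTORY_FILE.exists()
    with HISTORY_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "commit", "duration_ms"])
        writer.writerow([datetime.date.today().isoformat(), commit, f"{duration_ms:.1f}"])


def report() -> None:
    """Print how the latest measurement compares with the first recorded one."""
    with HISTORY_FILE.open() as f:
        rows = list(csv.DictReader(f))
    if len(rows) < 2:
        print("not enough history yet")
        return
    first = float(rows[0]["duration_ms"])
    last = float(rows[-1]["duration_ms"])
    print(f"{first:.1f} ms -> {last:.1f} ms ({last - first:+.1f} ms since first recorded run)")


if __name__ == "__main__":
    # usage (hypothetical): python perf_history.py <commit> <duration_ms>
    record(sys.argv[1], float(sys.argv[2]))
    report()
```

A simple history like this is enough to show, sprint after sprint, that the effort pays off, which is exactly what the team rituals above need.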

Technology is never the obstacle. The real stumbling block is mindset.

When developers understand that code auditing or quality rules are not there to pin them down, but to protect them (and value their work), everything changes.

 They become the driving force.

That’s when a culture of quality and performance becomes a reflex, not a rule.

TL;DR

  • Give them a voice
  • Always explain why
  • Celebrate quality over time

How do you involve your dev teams in quality and performance processes?

What about you? Have you encountered internal resistance? Or, on the contrary, has something clicked that changed everything? 

We’d be happy to learn more about your experience, and to help you implement the performance analysis reflex in your development process.