Measure Developer Experience Using Surveys: Step-by-Step Guide

Developer experience (DevEx) has been directly linked to the performance and productivity outcomes of developers, teams, and organizations. In addition, we have been able to show a clear relationship between a higher developer experience and a higher degree of innovation, creativity, and learning. A better developer experience also means less tech debt and better code quality. All in all, it’s understandable that teams and organizations want to improve the developer experience. But to do so, we first have to measure it.

For that, surveys are an excellent tool. A good, solid survey provides you with reliable and insightful data.

But, how exactly do you measure developer experience using a survey?

After my talk at GOTO Amsterdam about improving developer experience, many developers approached me, wanting to learn how to design such a developer experience survey and which questions they should ask.

So, let me give you a step-by-step guide on designing and running a developer experience survey:

Developer Experience Survey: Step-by-Step Guide

1. Define the Survey Goal/Focus

First, you have to settle on the aspects or factors you want to measure. Our grounded theory study on DevEx identified 26 factors that influence developer experience. While we identified a broad range of factors, you do not necessarily have to measure all of them, and certainly not all of them at once. You can make a selection based on your current understanding of your developer experience. For example, if you know code reviews and tech debt are a big headache, you might focus on factors that cover those areas, such as communication, knowledge sharing, deadlines, or tooling-related factors.

2. Question Design

In the next step, you formulate questions or statements based on the factors you selected. You also have to decide which scale to use. When designing the survey, keep simplicity and clarity front of mind. You have to use closed-ended questions, such as Likert-scale questions, if you want to derive indexes or track changes in developer experience over time. Open-ended questions are a wonderful and rich instrument for gaining a deeper understanding of what is going on. Analyzing and understanding open-ended responses takes more effort, though.
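
If you go with closed-ended Likert items, it helps to decide up front how answer labels map to numbers, because that mapping is what later lets you compute indexes and compare results over time. Below is a minimal sketch in Python, assuming a 5-point agreement scale; the item ids and factor names are illustrative placeholders, with item texts borrowed from the example survey later in this post.

```python
# Minimal sketch: representing closed-ended Likert items so that responses
# can later be aggregated into an index. Item ids, factor names, and the
# 5-point scale are assumptions for illustration, not a prescribed setup.

LIKERT_5 = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

SURVEY_ITEMS = [
    {"id": "deep_work", "factor": "flow_state",
     "text": "I have a significant amount of time for deep work in my work days."},
    {"id": "tools_intuitive", "factor": "cognitive_load",
     "text": "In general, the developer tools I have are intuitive and easy to use."},
]

def encode_response(answer: str) -> int:
    """Map a Likert answer label to its numeric value for later scoring."""
    return LIKERT_5[answer]
```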

3. Piloting Your Survey

Before running your survey within the whole department or company, you have to conduct a pilot test with a small, representative sample of your audience. From this pilot, you want to get as much feedback as possible: did participants interpret the questions as you intended? Were any questions ambiguous or unclear? How about the length of the survey?

4. Running the Survey

Once you have updated the survey, you can roll it out to all the developers you want to participate. For that, you have to find a tool that allows for smooth data collection and, ideally, analysis as well. Common choices are tools such as Google Forms, SurveyMonkey, Culture Amp, Lattice, or Officevibe.

5. Data Analysis and Interpretation

As a first step, you have to prepare and clean the data. This includes handling missing data, outliers, and inconsistencies. For open-ended questions, you need to code the responses. Then, you can use descriptive analysis, such as summary statistics or frequency distributions, to get a first picture of the outcomes and results. If you have a skilled data analyst at your side, you can also dive into inferential analysis, such as hypothesis testing, correlations, or regressions. Finally, you have to make sense of all the results and interpret them based on the context and circumstances at your company.
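
To make this more concrete, here is a rough sketch of such a first pass in Python with pandas. The CSV file name and column names are hypothetical and need to be adapted to whatever your survey tool exports.

```python
# Sketch of a first-pass analysis of Likert-scale responses with pandas.
# The CSV file and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("devex_survey_responses.csv")

likert_cols = ["deep_work", "review_turnaround", "tools_intuitive"]

# Data cleaning: drop respondents who answered none of the items and
# blank out values that fall outside the 1-5 Likert range.
df = df.dropna(subset=likert_cols, how="all")
in_range = (df[likert_cols] >= 1) & (df[likert_cols] <= 5)
df[likert_cols] = df[likert_cols].where(in_range)

# Descriptive analysis: summary statistics and frequency distributions.
print(df[likert_cols].describe())
for col in likert_cols:
    print(df[col].value_counts(normalize=True).sort_index())

# A simple inferential step: rank correlations between items.
print(df[likert_cols].corr(method="spearman"))
```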

6. Continuous Improvement

Many companies, such as Google, LinkedIn, or Shopify, run developer satisfaction, happiness, or experience surveys. One thing all of those companies have in common is the need to continuously improve and adapt their surveys to make them more reliable and valid, as well as to keep up with changes in technology, engineering practices, and processes.


Concrete Surveys to Measure Developer Experience

Let’s look at some concrete examples of developer experience surveys. I’ll show you two types of developer experience surveys to illustrate the diversity of questions you can ask and to inspire your own developer experience survey.

Developer Experience Survey Based on Three DevEx Dimensions

For our study of the impact of developer experience on outcomes such as innovation or productivity, we measured three dimensions of developer experience that have proven to be very relevant in practice:

  1. Flow state,
  2. Feedback loops,
  3. Cognitive load.

Here are the questions that we developed and used:

Flow state:

  • I have a significant amount of time for deep work in my work days.
  • In a typical week, how often are you interrupted to work on something else that was unplanned or suddenly requested?
  • Generally speaking, the coding tasks I work on are more engaging than boring.

Feedback loops:

  • How often does it take more than 10 minutes to obtain an answer to an internal technical question (i.e., about code, a system, or the domain you are using)?
  • Approximately what percentage of the code reviews you request are completed within four business hours?

Cognitive load:

  • For the primary team you work on, how would you rate the ease of deploying changes?
  • How often can you easily understand the code you work with?
  • In general, the processes I need to follow to do my work are easy for me to figure out.
  • In general, the developer tools I have are intuitive and easy to use.

The questions we used to measure developer experience were either previously validated in the literature or developed and refined over time with expert input. You can look up the origin of each question in the paper.

Validated developer experience survey items (DevEx in Action, ACM Queue 2023)
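
If you want to turn the item responses into one score per dimension, one simple option, which is my illustration and not the scoring approach prescribed by the paper, is to bring all items onto a common scale and then average them within each dimension. A sketch with hypothetical item identifiers:

```python
# Sketch: aggregating item responses into one score per DevEx dimension.
# Item identifiers are hypothetical; items answered on different scales
# (Likert, frequency, percentage) must first be normalized to a common
# range, e.g. 0-1, before averaging makes sense.
import pandas as pd

DIMENSIONS = {
    "flow_state": ["deep_work", "interruptions", "engaging_tasks"],
    "feedback_loops": ["question_answer_time", "review_turnaround"],
    "cognitive_load": ["deploy_ease", "code_understandability",
                       "process_clarity", "tool_usability"],
}

def dimension_scores(normalized_responses: pd.DataFrame) -> pd.Series:
    """Average per-respondent item scores within each dimension."""
    return pd.Series({
        dim: normalized_responses[items].mean(axis=1).mean()
        for dim, items in DIMENSIONS.items()
    })
```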


Atlassian’s Developer Experience Survey

Atlassian takes a different angle on measuring DevEx. They use eight vital signs to measure developer experience:

  1. Sustainable speed to ship
  2. Waiting time
  3. Execution independence
  4. Ways of working
  5. External standards
  6. Maintenance
  7. Onboarding
  8. Developer satisfaction

For each vital sign, they recommend asking developers about their satisfaction with the sign and how important they perceive it to be.

This is the example survey from their playbook, with questions for each vital sign:

  1. Sustainable speed to ship
    • How important is shipping high-quality code sustainably for your team?
    • How satisfied are you with your team’s ability to ship high-quality code sustainably?
  2. Waiting time
    • How important is minimizing waiting time to your productivity?
    • How satisfied are you with the amount of developer waiting time on your team?
  3. Execution independence
    • How important do you consider your team’s ability to deliver independently of other teams?
    • How satisfied are you with your team’s delivery independence?
  4. Ways of working
    • How important is it for your team to discover and onboard new ways of working, including tools, processes, and practices?
    • How satisfied are you with your team’s ability to discover and onboard a new way of working, including tools, processes, and practices?
  5. External standards
    • How important to your productivity is the amount of maintenance or platform work it takes to meet the externally generated company standards your team owns?
    • How satisfied are you with the amount of maintenance or platform work it takes to meet the externally generated company standards your team owns?
  6. Maintenance
    • How important to your productivity is the amount of effort required of you to maintain your team’s standards with regard to code, tools, and pipelines?
    • How satisfied are you with the amount of effort required for code, tools, and pipeline maintenance?
  7. Onboarding
    • How important to your productivity is the amount of time it takes new hires or internal transfers to become effective on your team?
    • How satisfied are you with the amount of time it takes new hires or internal transfers to become effective on your team?
  8. Developer satisfaction
    • How important is your satisfaction to your productivity?
    • How satisfied are you with your team’s developer productivity?
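
One common way to analyze such paired importance/satisfaction questions, to be clear not Atlassian’s official scoring, is a simple gap analysis: vital signs that developers rate as highly important but are not satisfied with are the most promising candidates for improvement. Here is a sketch with hypothetical column names:

```python
# Sketch: importance-vs-satisfaction gap analysis for paired questions.
# Column naming ('<sign>_importance', '<sign>_satisfaction' with 1-5
# ratings) is a hypothetical convention, not a prescribed format.
import pandas as pd

def gap_analysis(df: pd.DataFrame, signs: list[str]) -> pd.DataFrame:
    """For each vital sign, compare mean importance and mean satisfaction."""
    rows = []
    for sign in signs:
        importance = df[f"{sign}_importance"].mean()
        satisfaction = df[f"{sign}_satisfaction"].mean()
        rows.append({"sign": sign,
                     "importance": importance,
                     "satisfaction": satisfaction,
                     "gap": importance - satisfaction})
    # The largest positive gaps point to the most promising areas to improve.
    return pd.DataFrame(rows).sort_values("gap", ascending=False)
```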

Survey Design Pitfalls

As you can see from the two examples, your developer experience survey can contain different questions and cover different factors. Yet, not every question is a good question to ask. Some survey instruments are more reliable than others. Common mistakes people make when designing their own survey include using leading questions, using subjective or ambiguous terminology, or making the survey too long and/or boring.

It’s also important to always test your survey with a small set of people first and incorporate their feedback (i.e., piloting the survey). Thoroughly reading about survey design or working with a skilled survey designer or survey researcher significantly increases your chances of getting reliable and valid data that provides accurate and actionable insights.

Using a Ready-Made DX Survey Tool

Instead of designing your own questionnaire, you can use a tool like DX, which already provides you with a solid set of questions. As DX’s main value proposition is to help you measure and improve developer experience, they have a complete team dedicated to ensuring their survey instruments lead to reliable measurements and actionable improvements. Using DX is a great way to get metrics and measurements, even industry benchmarks, out of the box and to quantify your improvements over time.

In-depth Investigations

Yet measuring developer experience is not enough to improve it. Once you identify areas of concern, I recommend regularly conducting deep dives into those areas. Here, designing your own survey, running a developer experience workshop, or shadowing and interviewing developers can give you rich insights. System data, such as error logs, test logs, commit data, or PR data, can also be an insightful source of information, especially when used in combination with insights coming directly from the developers. Only when you deeply understand the friction points, problems, and challenges developers face can you start to create an improvement plan and take actions that lead to a better developer experience.
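
As one example of combining the two sources: if survey answers point to slow feedback loops, you could check the actual pull request review turnaround in your repositories. The sketch below uses the GitHub REST API; the repository name, token, and the choice of metric are all placeholder assumptions.

```python
# Sketch: complementing survey answers about feedback loops with system
# data, here the time from opening a PR to its first submitted review.
# "your-org/your-repo" and the token are placeholders.
from datetime import datetime
import statistics
import requests

OWNER_REPO = "your-org/your-repo"                        # placeholder
HEADERS = {"Authorization": "Bearer YOUR_GITHUB_TOKEN"}  # placeholder token
API = "https://api.github.com"

def hours_to_first_review(limit: int = 30) -> float:
    """Median hours between PR creation and its first review."""
    prs = requests.get(f"{API}/repos/{OWNER_REPO}/pulls",
                       params={"state": "closed", "per_page": limit},
                       headers=HEADERS).json()
    durations = []
    for pr in prs:
        reviews = requests.get(
            f"{API}/repos/{OWNER_REPO}/pulls/{pr['number']}/reviews",
            headers=HEADERS).json()
        if reviews:
            # Assumes reviews are returned oldest first.
            opened = datetime.fromisoformat(pr["created_at"].replace("Z", "+00:00"))
            first = datetime.fromisoformat(reviews[0]["submitted_at"].replace("Z", "+00:00"))
            durations.append((first - opened).total_seconds() / 3600)
    return statistics.median(durations) if durations else float("nan")
```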

So, now tell me: do you consider developer experience at your company? And if so, how do you measure and/or improve it?

Dr. Michaela Greiler

I make code reviews your superpower.
