AI Impact Feature

Learn about our approach to measuring the impact of AI

Use this feature to identify patterns in AI adoption across different user segments and understand how AI coding assistants affect your team's development metrics.

Get more from your AI tooling – the research-backed way

This feature is based on our original research into the impact of AI on engineering teams, where we followed 500+ developers for months. We combined telemetry from real development workflows, AI tooling usage data, comprehensive surveys, and in-depth interviews with engineering leaders and team members. This research helps us understand not just whether AI tools are being adopted, but how they affect productivity, code quality, collaboration patterns, and developer wellbeing.

Our findings show that the actions you take as a leader have the biggest impact on the success of your AI rollout – not your tooling. With the right initiatives, you can help your team get more benefit from AI, with fewer of the costs (to codebase quality, learning, and more). But to do that, we need to be able to:

  • See which AI interventions are working and which aren’t

  • Support our team members to learn – whether they’re engaged or skeptical

  • Measure the full impact of AI – we don’t want to focus so much on productivity that we lose sight of code quality or developer experience

  • Get early warning signs of AI slop

This feature will help you do just that. Follow the links below to learn more.

If you're curious about our original research, check out multitudes.com/research.

Getting started

To get started, set up integrations with your AI tooling. The installation docs are linked below:

AI Adoption

AI Impact
