TestCon Europe 2020
Vilnius and Online
Book Author, Agile Testing Consultant & International Speaker
Gil Zilberfeld has been in software since childhood, writing BASIC programs on his trusty Sinclair ZX81. With more than twenty years of experience developing commercial software, he has vast experience in software methodology and practices.
Gil has been applying agile principles to product development for more than a decade. From automated testing to exploratory testing, design practices to team collaboration, Scrum to Kanban, traditional product management to lean startup – he's done it all. He is still learning from his successes and failures.
Gil speaks frequently at international conferences about unit testing, TDD, agile practices and product management. He is the author of "Everyday Unit Testing", blogs at http://www.gilzilberfeld.com, and is a co-organizer of the Agile Practitioners conference. In his spare time he shoots zombies, for fun.
The Quality Dashboard
You've got thousands of automated tests running, multiple test and coverage reports and logs – but you can't see the forest for the trees. The problem is that you can't answer the one question that matters: is it safe to release? With refined, specific metrics, you can define reports (or a dashboard) that tell you the real quality of the product. You can then decide what to do about it.
This is a case study of building a quality dashboard with metrics that matter, for an application with hundreds of APIs and multiple front-ends. Some features were better covered than others, but what that coverage meant was vague. The dashboard was built by collecting information from multiple sources – test reports and coverage reports from Jenkins, custom logs that were mined for information, SonarQube and more. We then added some "brains" to show the analyzed metrics, in terms of covered and uncovered test cases, test quality and more. Finally, we presented a confidence level calculated from the metrics. The effort involved developers, quality advisors, DevOps engineers and others. This session is about that project.
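The "brains" step – turning raw metrics into a single confidence level – could be sketched roughly like this. This is a minimal illustration only; the metric names, weights, and formula are assumptions for the sake of the example, not the actual code from the project described in the talk:

```python
# Hypothetical confidence-level calculation: a weighted average of
# normalized metrics (each in the 0..1 range) gathered from sources
# like Jenkins test reports, coverage reports, and mined logs.

def confidence(metrics, weights):
    """Return a weighted average of metrics; higher means safer to release."""
    total = sum(weights.values())
    return sum(metrics[name] * w for name, w in weights.items()) / total

# Illustrative inputs (all names and numbers are made up):
metrics = {"pass_rate": 0.98, "coverage": 0.72, "log_health": 0.85}
weights = {"pass_rate": 3, "coverage": 2, "log_health": 1}

score = confidence(metrics, weights)  # roughly 0.87 for these inputs
```

In practice, a dashboard like the one described would feed such a score per feature, so managers can see at a glance which features are release-ready and which have gaps.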
The dashboard helped managers see which features were ready and where the gaps were, and gave developers feedback on how well their tests were working for them. This session may inspire you to build a quality dashboard that tells you how well your team is doing.