Definition of Done — What It Is, Why It Matters, and How to Use It to Create Engineering Marvels
If you have ever been on an Agile project, you have surely heard someone ask, “What is your Definition of Done?”
What this question really asks is: how do you determine whether the work you delivered is complete? Or, in the figure above, when do you move items to the Done column?
The Definition of Done (DoD) is an agreed-upon set of criteria that must be met before a user story can be considered complete. It is applied consistently and serves as the official gatekeeper separating “in progress” stories from “done” ones.
A sample DoD for a product might look like this:
- Unit test added
- Zero critical security vulnerabilities
- Code is peer-reviewed
- Acceptance criteria met
- Design document or ADR created/updated for any design changes
- Code/feature passes regression testing
- Code/feature passes smoke testing
- Documentation added (if required)
- Product Owner accepts the story is good to go to production
In Agile Scrum, all work to be delivered takes the form of User Stories. Check out my ebook if you haven’t heard of Agile Scrum or User Stories.
Each User Story worked on by a developer or tester needs to be delivered with complete quality. In Agile Scrum projects, quality does not just mean that everything works as intended; it also means that the User Story has met the Definition of Done. So it is the developer’s and tester’s responsibility to ensure that the user story they worked on has met the Definition of Done. To elaborate, if a user story is expected to follow the sample Definition of Done above, the developer working on it must ensure that, in addition to writing the code, they also write unit tests, have the code peer-reviewed, collaborate with the test team to ensure automated tests cover their work, and so on.
By the way, what’s the difference between Acceptance Criteria and the Definition of Done? Aren’t they the same?
Nope, they are different. Acceptance Criteria are one component of the “Definition of Done”; they typically define the expected functionality.
Acceptance Criteria: Conditions that must be satisfied by the product or software for it to be accepted by stakeholders or customers. They are unique to each user story and define the feature’s behaviour from the end-user’s perspective. They are typically expressed in the Given-When-Then-And format.
Acceptance Criteria for a User Story — “As a user, I want to be able to recover the password for my account, so that I can access my account if I forget the password”
“Given the user has navigated to the login page and selected the forgot-password option,
When the user enters a valid email address to receive a password-recovery link,
Then the system should send an email with a link to recover the user’s password,
And when the user clicks the link in the email, they should be able to set a new password.”
Attaining the “Definition of Done” for a User Story goes above and beyond meeting the Acceptance Criteria. Acceptance Criteria apply to a single story, whereas the Definition of Done applies to the entire product.
If all the user stories delivered in a sprint have met the “Definition of Done”, I would call that the ultimate nirvana, and the team can be recognised as the best self-organizing team ever.
In reality, the team establishes a Definition of Done, prints it on pamphlets, t-shirts, and mugs, but then makes exceptions during the sprint. The rule of thumb for delivering a quality product is that if any story does not meet the Definition of Done, it shouldn’t go to production. Unfortunately, teams tend to skip steps and rush things to production even though a story fails to meet the Definition of Done. A common pattern is to check only whether the story meets the Acceptance Criteria; with no unit tests, no automated tests, and no documentation in place, such stories slip through the cracks and make their way to production.
Though unit tests are important, some teams treat them as nice-to-have. And, to be fair, things still work without unit tests; that’s the harsh reality. Things still work even when functional tests are not automated. People ramp up and understand projects even without documentation, just by reading the code. Things still work even when the code is not peer-reviewed.
So why do senior devs, coaches, managers, and directors fret about the Definition of Done and ask teams to adhere to it? It is certainly not without purpose.
My 2 cents —
“If your story just meets the acceptance criteria and skips the rest of the facets of the Definition of Done, then you are just programming, not doing Software Engineering”
There is a glaring difference between programming and Software engineering.
Imagine the situations below:
- You need to make a change in legacy code or in a function that is reused in multiple places. Without proper unit tests, how do you ensure, or feel safe, that your new code or bug fix is not breaking other features you aren’t even aware of?
- How does a junior or newcomer on your team learn the flow of your code or the purpose of different modules without proper unit tests, documentation, or design docs?
- How do you speed up getting things into production if your tests are not automated? Isn’t it tedious for the test team to manually execute every test for every feature? Imagine a production application with 50 flows: your new piece of code is reusable and touches most of them, and you expect the test team to test all those flows by hand?
- How do you learn from your peers or level up your coding experience if your code is not reviewed with a new set of eyes?
All of the above are real-world scenarios that must be addressed; if not, product increments will be delayed, KTs (knowledge transfers) will be painful, and adding new code will become a daily, fearful expedition into uncharted territory. Eventually the team, and the product, will burn out.
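To make the first scenario concrete, here is a tiny illustrative sketch (all names hypothetical): a helper reused by several features, pinned down by unit tests so that a careless “fix” fails fast in CI instead of silently breaking callers in production.

```python
# Hypothetical reused helper: several features format prices through this one
# function, so a careless change here can silently break all of them.
def format_price(amount_cents):
    """Render a non-negative price in cents as a dollar string, e.g. 1999 -> "$19.99"."""
    dollars, cents = divmod(amount_cents, 100)
    return f"${dollars}.{cents:02d}"


# Unit tests that pin down the behaviour other features rely on. If a change
# to format_price violates any caller's expectation, these fail in CI long
# before the code reaches production.
def test_format_price_basic():
    assert format_price(1999) == "$19.99"

def test_format_price_pads_single_digit_cents():
    assert format_price(1005) == "$10.05"

def test_format_price_zero():
    assert format_price(0) == "$0.00"
```

With tests like these in place, anyone touching `format_price` gets immediate, automated feedback about which behaviours the rest of the product depends on.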
As a result, having teams respect and adhere to the Definition of Done is critical. Ultimately, it is the team’s own definition of a good product, and so it represents them.
Now it’s time to work on optimisation. How can we ensure that the team does not break its pledge to itself? We promise ourselves to stay healthy and fit, yet we still miss the gym and break those promises. The key to sticking to a process is to incorporate it into your daily routine.
That’s where I recommend DoDChecker, a new agile scrum ritual — a fifteen-minute meeting per Sprint.
DoDChecker needs to be:
- Facilitated by the Scrum Master, just like Standup
- Attended by the entire development team (including Dev/QA/UX, etc.) and the Product Owner
- Held on the last day of the Sprint
- A walkthrough in which the team goes through the current sprint’s stories and very intentionally checks whether each story being delivered has attained the “Definition of Done”
- Blameless: if any story is missing the “Definition of Done”, carry it forward to the next sprint to attain it
This 15-minute meeting ensures that the team is fully aware of the Definition of Done and is constantly reminded that every production-ready story must have achieved that nirvana: the Definition of Done.
Also, to augment adoption of the “Definition of Done”, the team can automate some of the agreements in it. Let me explain with the sample DoD above:
- Unit test added — For every new PR, coverage for the new code can be generated via a CI tool like Jenkins and checked to be at least 60%. If the new-code coverage for the PR is below 60%, inform the developer via PR comments and block the PR from merging to the main branch via GitHub checks.
- Zero critical security vulnerabilities — After the recent massive Log4j threat, it is extremely important to ensure that new code is free of critical security vulnerabilities. Integrating a static application security testing tool like SonarQube into a CI tool like Jenkins helps automate this step. If the PR has any critical security vulnerabilities, block it from merging via GitHub checks. This informs the developer about this aspect of the DoD, prompts them to fix the vulnerability, and augments adoption.
- Code is peer-reviewed — Using the GitHub branch-protection setting that requires a number of approving reviewers before a PR can merge helps adherence to this aspect of the DoD.
- Acceptance criteria met — This aspect has to be confirmed manually; the DoDChecker meeting helps check it.
- Design document or ADR created/updated for any design changes — The DoDChecker meeting helps check this aspect.
- Code/feature passes regression testing — When a PR is created, it can be deployed to an ephemeral environment and a regression test run against it to check for errors. Integrating a “zero regression test failures” check into the CI process before merging PRs augments the adoption of this aspect.
- Code/feature passes smoke testing — Same as above(Code/feature passes regression testing)
- Documentation added (if required) — This aspect has to be confirmed manually; the DoDChecker meeting helps check it.
- Product Owner accepts the story is good to go to production — This aspect is confirmed manually; the DoDChecker meeting helps check it.
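To make the new-code coverage gate concrete, here is a minimal sketch of a script a CI job might run after generating a Cobertura-style coverage.xml (the format produced by coverage.py’s `coverage xml` for Python projects). The file name, the 60% threshold, and the `check_coverage` helper are all illustrative assumptions, not part of any particular pipeline.

```python
import sys
import xml.etree.ElementTree as ET

THRESHOLD = 60.0  # agreed DoD minimum coverage (%) -- an assumption, tune per team


def coverage_percent(report_path):
    """Read overall line coverage from a Cobertura-style coverage.xml report."""
    root = ET.parse(report_path).getroot()
    # Cobertura reports expose line coverage as a 0..1 "line-rate" attribute.
    return float(root.get("line-rate", "0")) * 100


def check_coverage(percent, threshold=THRESHOLD):
    """Return (passed, message) so CI can both gate the merge and comment on the PR."""
    if percent < threshold:
        return False, f"Coverage {percent:.1f}% is below the {threshold:.0f}% DoD threshold"
    return True, f"Coverage {percent:.1f}% meets the {threshold:.0f}% DoD threshold"


if __name__ == "__main__" and len(sys.argv) > 1:
    passed, message = check_coverage(coverage_percent(sys.argv[1]))
    print(message)                 # surfaced to the developer, e.g. as a PR comment
    sys.exit(0 if passed else 1)   # non-zero exit fails the CI check and blocks the merge
```

A Jenkins stage or similar CI step would run this against the generated report and report a failed status check on a non-zero exit, which GitHub branch protection can then use to block the PR from merging.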
The team’s adherence to and adoption of the Definition of Done will be long-lasting with the automated checks described above and the DoDChecker meeting in place. And since the process provides guidance at every level, newcomers to the team will find the Definition of Done easy to adopt and hard to skip.
The Definition of Done is a beautiful construct for creating quality products: it helps all parties speak the same language and helps the team understand the expectations.
Don’t stop at just writing the DoD; true success lies in living it for the long term. Making the definition a reality is a journey, and all the effort put in to reach that nirvana state is worthwhile.
Vinay Kanamarlapudi on LinkedIn