Definition of Done: The Complete Guide with Examples & Checklist
The Definition of Done (DoD) is a formal commitment in Scrum and Agile that establishes the quality criteria every Product Backlog item and Increment must meet before being considered complete. It's not just a checklist—it's a shared understanding among the entire Scrum Team about what "done" means, creating transparency and ensuring potentially shippable Increments.
Key characteristics: The Definition of Done applies universally to ALL work (unlike Acceptance Criteria which varies per story), evolves as team capabilities mature, and is non-negotiable—work not meeting DoD cannot be released. It typically includes code review, automated testing with specific coverage thresholds, documentation updates, security scans, and deployment verification.
Critical distinction: Definition of Done defines QUALITY standards (e.g., "code reviewed," "tests >80% coverage"), while Acceptance Criteria defines FUNCTIONAL requirements (e.g., "user can log in with email"). Both must be met for work to be truly Done.
Quick Answer: Definition of Done at a Glance
| Aspect | Details |
|---|---|
| Definition | Formal description of quality standards an Increment must meet |
| Purpose | Creates transparency about what "done" means for the entire team |
| Scope | Applies to every Product Backlog item and Sprint Increment |
| Who Creates It | Scrum Team collaboratively (with organizational standards as baseline) |
| Examples | Code reviewed, tests >80% coverage, security scan passed, deployed to staging |
| Key Principle | Work not meeting DoD is NOT Done—it cannot be released or demonstrated |
| Evolution | Teams strengthen DoD over time as capabilities improve (basic → intermediate → advanced) |
| Common Mistake | Confusing DoD with Acceptance Criteria (DoD = quality, AC = functionality) |
What You'll Learn in This Guide
In this comprehensive guide, you'll discover:
- The Foundation: What Definition of Done truly means in Scrum and why it's a commitment, not just a checklist
- DoD vs Acceptance Criteria: Clear distinctions with side-by-side comparisons and working examples showing how both must be met
- Industry-Specific Checklists: Ready-to-use DoD examples for 6 industries (SaaS, Healthcare, Finance, E-commerce, Mobile, DevOps)
- The Three-Level Hierarchy: How to structure DoD at Feature-level, Sprint-level, and Release-level with decision frameworks
- Maturity Journey: Progressive strengthening from Basic DoD (new teams) through Intermediate to Advanced (high-performing teams)
- Common Mistakes: 8 critical anti-patterns teams make with DoD and exactly how to fix them
- Evolution Strategy: Step-by-step guidance for incrementally improving DoD without overwhelming the team
- Practical Implementation: Collaborative creation process, visibility strategies, and enforcement techniques
Why Definition of Done Matters Today
The Definition of Done isn't just another Scrum formality—it's the quality gatekeeper that prevents technical debt accumulation and ensures every Sprint delivers potentially shippable Increments. This critical commitment allows teams to:
- Eliminate quality ambiguity through explicit, measurable standards everyone understands
- Prevent scope creep by clearly defining when work is complete vs. when it's still in progress
- Enable predictable releases because "done" means genuinely releasable, not "mostly done"
- Support distributed teams with automated verification reducing synchronous communication needs
- Scale consistently across multiple teams working on the same product with shared standards
Whether you're establishing DoD for a new Scrum team, strengthening DoD for a maturing team, or ensuring compliance in regulated industries (healthcare, finance, government), this guide provides proven frameworks for success.
Key Insight: The Definition of Done is non-negotiable. When pressure builds to meet Sprint Goals, the correct response is reducing scope (deliver fewer fully-Done items) rather than compromising quality (deliver more incomplete items). Teams that compromise DoD accumulate crippling technical debt and undermine Scrum's empirical foundation.
Let's explore how to create, implement, and evolve Definition of Done that transforms your team's quality and delivery capabilities.
Table of Contents
- What is Doneness?
- Definition of Done
- The Scrum Guide Deciphered
- The Goals of Definition of Done
- Benefits of the Definition of Done
- Where to Begin the Defining Process
- Creating an Effective Definition of Done
- Three Critical Questions
- Definition of Done vs Acceptance Criteria: Key Differences
- Definition of Done Examples by Industry
- The Three Levels of Definition of Done
- Common Definition of Done Mistakes (And How to Fix Them)
- How Definition of Done Evolves: The Maturity Journey
- Conclusion
- Continue Reading
- Frequently Asked Questions
What is Doneness?
In Scrum, "doneness" refers to the state of completion of a Product Backlog Item (PBI) or an Increment.
It indicates that the work has been finished to a level of quality and completeness that is acceptable to the Scrum Team and the stakeholders.
To ensure a shared understanding of when a PBI or Increment is considered complete, Scrum Teams use the Definition of Done.
Definition of Done
The Definition of Done (DoD) is a shared understanding among the Scrum Team members of the criteria that must be met for a PBI or Increment to be considered complete.
It establishes a clear and consistent set of expectations, ensuring that everyone on the team knows what is required to finish a task successfully.
The Definition of Done is essentially an agreed-upon checklist of tasks and criteria that must be fulfilled before a project or user story can be considered complete.
While the specifics may vary from one organization to another, a typical DoD encompasses key items such as:
- Code is peer-reviewed: Ensuring that code undergoes scrutiny by peers for quality and accuracy.
- Code is checked in: Committing the code to version control for team access.
- Code is deployed to a test environment: Making the code available for testing.
- Code/feature passes regression testing: Verifying that changes don't break existing functionality.
- Code/feature passes smoke testing: Conducting basic tests to ensure the feature works as intended.
- Code is documented: Creating comprehensive documentation to aid future understanding and maintenance.
- Help documentation is updated: Ensuring user-facing documentation is accurate and complete.
- Feature is OK’d by stakeholders: Gaining approval from relevant stakeholders for the feature's readiness.
These checkpoints collectively serve as a gatekeeper, distinguishing between work that's "in progress" and that which is genuinely "done."
They act as a safety net to maintain quality and completeness throughout the development process.
The Scrum Guide Deciphered
According to the Scrum Guide, the Definition of Done is a formal description of the state of the Increment when it meets the quality measures required for the product.
Once the Definition of Done criteria are met, the Increment earns the coveted status of "Done" and is ready for delivery.
The Goals of Definition of Done
Building Consensus on Quality
One of the primary goals of DoD is to foster a common understanding within the team about quality and completeness.
When everyone agrees on what 'done' means, it eliminates confusion and streamlines the development process.
A Reliable Checklist
DoD acts as a reliable checklist that User Stories or Product Backlog Items are checked against.
This ensures that nothing falls through the cracks, and every aspect of a task is meticulously examined.
Ensuring High-Quality Increments
Ultimately, the overarching goal of DoD is to ensure that the increment shipped at the end of the Sprint is of high quality.
This quality should be clearly understood by all team members, stakeholders, and anyone involved in the project.
Benefits of the Definition of Done
Having a well-defined Definition of Done is essential in Agile development for several reasons:
- It provides clarity and consistency: Team members and stakeholders have a clear understanding of what is expected for each completed task.
- It improves quality: By defining quality standards and testing requirements, it helps deliver a higher-quality product.
- It reduces misunderstandings: It minimizes the risk of miscommunication and ensures everyone is on the same page regarding project completion.
- It aids in decision-making: It helps the team decide when a task or feature is ready for release, streamlining the development process.
- It enhances transparency: Stakeholders can track progress more effectively, knowing that "done" means all necessary work is completed.
Where to Begin the Defining Process
Defining the DoD should be a collaborative effort involving stakeholders and those responsible for the actual work.
Whether the process starts with open brainstorming or with a framework proposed by the technical team, there must be ample opportunity for feedback, and the final DoD must have the team's unanimous support.
Assigning ownership to each criterion helps resolve disputes and maintains consistency.
In adherence to the Agile principle of simplicity, the DoD should be concise. Its purpose is to ensure consistent quality, not to create bureaucratic obstacles.
Overcomplicating the DoD can impede progress rather than facilitate it.
Creating an Effective Definition of Done
To create an effective Definition of Done for your Scrum Team, follow these steps:
1. Collaborate: Engage the entire Scrum Team in the creation of the DoD, ensuring that everyone's perspective is considered and a shared understanding is established.
2. Define Criteria: Identify the criteria that must be met for a PBI or Increment to be considered complete. Include aspects such as functionality, quality, performance, documentation, and compliance.
3. Keep it Visible: Make the DoD easily accessible and visible to the entire Scrum Team, ensuring that team members are always aware of the expectations.
4. Review and Update: Regularly review and update the DoD based on the Scrum Team's learnings, experiences, and changing requirements.
Three Critical Questions
To determine where an activity belongs in the DoD hierarchy, three questions should guide your decision-making process:
- Can we do this activity for each feature?
- If not, can we do this activity for each sprint?
- If not, we must perform this activity before each release.
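This decision flow is simple enough to write down directly. A minimal Python sketch (a toy encoding of the three questions; the feasibility answers are judgments the team supplies for each activity):

```python
def dod_level(activity: str, per_feature: bool, per_sprint: bool) -> str:
    """Place a quality activity at the lowest DoD level where it is feasible."""
    if per_feature:
        return f"{activity}: Feature-level DoD (run for every backlog item)"
    if per_sprint:
        return f"{activity}: Sprint-level DoD (run once per Sprint)"
    return f"{activity}: Release-level DoD (run before each release)"

# Example placements a team might arrive at:
print(dod_level("Unit tests", per_feature=True, per_sprint=True))
print(dod_level("Regression suite", per_feature=False, per_sprint=True))
print(dod_level("External penetration test", per_feature=False, per_sprint=False))
```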
Definition of Done vs Acceptance Criteria: Key Differences
Many teams confuse Definition of Done with Acceptance Criteria—but they serve different purposes and must work together.
Definition of Done (DoD)
Scope: Applies to ALL work across ALL Product Backlog items
Purpose: Defines quality standards and technical practices
Who Defines: Scrum Team collaboratively (often inherited from organizational standards)
Example DoD Items:
- Code reviewed by at least one peer
- Unit tests written and passing (>80% coverage; see the coverage-gate sketch after this list)
- Integration tests passing
- No critical or high-priority bugs
- Documentation updated
- Security scan completed
- Performance benchmarks met
Key Point: DoD is consistent across all features—every item must meet the same quality bar.
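Items like the coverage threshold are good candidates for automation, so the DoD is enforced by the pipeline rather than by memory. A minimal sketch, assuming coverage.py has produced a Cobertura-style `coverage.xml` (via `coverage xml`) and that 80% is the agreed threshold:

```python
import sys
import xml.etree.ElementTree as ET

THRESHOLD = 0.80  # matches the ">80% coverage" DoD item above

def overall_line_rate(report_path: str = "coverage.xml") -> float:
    # coverage.py's Cobertura-style report stores the overall line rate
    # as a 0..1 attribute on the root element.
    root = ET.parse(report_path).getroot()
    return float(root.get("line-rate"))

if __name__ == "__main__":
    rate = overall_line_rate()
    print(f"Line coverage: {rate:.1%} (threshold {THRESHOLD:.0%})")
    if rate < THRESHOLD:
        sys.exit("DoD gate failed: coverage below threshold")  # non-zero exit fails CI
```

Wired into CI, the non-zero exit blocks the merge, making this DoD item self-verifying.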
Acceptance Criteria (AC)
Scope: Unique to each specific Product Backlog item or user story
Purpose: Defines functional requirements and business logic
Who Defines: Product Owner (with Developer input on feasibility)
Example AC for "User Login" Story:
- User can log in with email and password
- Invalid credentials show error message
- Successful login redirects to dashboard
- "Forgot password" link sends reset email
- Login persists for 30 days with "Remember me"
Key Point: AC varies by story—each feature has unique functional requirements (a test sketch for the login criteria follows).
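Because Acceptance Criteria are concrete and testable, they translate naturally into story-specific automated tests. A minimal pytest sketch for two of the login criteria; `authenticate` is a hypothetical stand-in for the real login endpoint:

```python
# Stand-in for the real login logic; in practice these tests would
# exercise the application's actual authentication endpoint.
USERS = {"ada@example.com": "s3cret"}

def authenticate(email: str, password: str) -> dict:
    if USERS.get(email) == password:
        return {"ok": True, "redirect": "/dashboard"}
    return {"ok": False, "error": "Invalid email or password"}

def test_valid_credentials_redirect_to_dashboard():
    # AC: "Successful login redirects to dashboard"
    result = authenticate("ada@example.com", "s3cret")
    assert result["ok"] and result["redirect"] == "/dashboard"

def test_invalid_credentials_show_error_message():
    # AC: "Invalid credentials show error message"
    result = authenticate("ada@example.com", "wrong")
    assert not result["ok"] and "Invalid" in result["error"]
```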
Working Together
| Aspect | Definition of Done | Acceptance Criteria |
|---|---|---|
| Applies to | All work equally | Each story uniquely |
| Answers | "Is it quality work?" | "Does it work as intended?" |
| Defines | Technical standards | Functional behavior |
| Example | "Code is reviewed" | "User can reset password" |
| Created by | Scrum Team | Product Owner |
| Changes | Rarely (evolves gradually) | Every story (unique each time) |
Both must be met: Work is only "Done" when it meets both the Definition of Done (quality) and Acceptance Criteria (functionality).
Definition of Done Examples by Industry
Different industries require different DoD items based on compliance, security, and domain-specific requirements.
Software-as-a-Service (SaaS)
✓ Code reviewed and approved
✓ Unit tests passing (minimum 80% coverage)
✓ Integration tests passing
✓ API documentation updated
✓ Security vulnerability scan passed
✓ Performance testing completed (< 2s response time)
✓ Feature flag configured
✓ Monitoring and alerts configured
✓ Deployment runbook updated
✓ Product Owner approved
Healthcare / HIPAA-Compliant Software
✓ Code reviewed by senior developer
✓ All tests passing (unit, integration, E2E)
✓ HIPAA compliance checklist completed
✓ PHI data encryption verified
✓ Audit logging implemented
✓ Access controls tested
✓ Security review passed
✓ Privacy impact assessment completed
✓ Documentation updated (technical + user)
✓ Compliance officer approved
Financial Services / Banking
✓ Code reviewed (dual review for critical paths)
✓ Automated tests passing (>90% coverage)
✓ PCI-DSS compliance verified
✓ Penetration testing completed
✓ Transaction integrity tests passed
✓ Rollback procedure documented
✓ Fraud detection rules updated
✓ Regulatory reporting capability verified
✓ Audit trail implemented
✓ Change Advisory Board approved
E-Commerce Platform
✓ Code peer-reviewed
✓ Unit + integration tests passing
✓ Cross-browser testing completed (Chrome, Safari, Firefox, Edge)
✓ Mobile responsive design verified
✓ Page load time < 3 seconds
✓ Analytics tracking implemented
✓ SEO meta tags added
✓ Cart and checkout flow tested
✓ Payment gateway integration tested
✓ User acceptance testing passed
Mobile App Development
✓ Code reviewed
✓ Unit tests passing (>75% coverage)
✓ UI/UX matches design specs
✓ Tested on iOS (latest + 2 previous versions)
✓ Tested on Android (latest + 3 previous versions)
✓ Accessibility standards met (WCAG 2.1 AA)
✓ App store screenshots and descriptions ready
✓ Crash reporting configured
✓ Analytics events tracking correctly
✓ Privacy policy and permissions updated
DevOps / Infrastructure
✓ Infrastructure as Code (IaC) peer-reviewed
✓ Automated tests passing
✓ Security groups and IAM policies reviewed
✓ Cost estimation completed
✓ Monitoring dashboards created
✓ Alerts and runbooks documented
✓ Disaster recovery tested
✓ Change management ticket approved
✓ Deployment verified in staging
✓ Rollback procedure documented and tested
The Three Levels of Definition of Done
According to Scrum Alliance and Agile experts, DoD can exist at multiple levels—teams should consciously decide where each quality activity belongs.
Level 1: Feature-Level DoD
Activities performed for EVERY Product Backlog item
✓ Code written following coding standards
✓ Unit tests written and passing
✓ Code peer-reviewed
✓ Acceptance Criteria met
✓ Local testing completed
Question: Can we realistically do this for every single feature?
Level 2: Sprint-Level DoD
Activities performed once per Sprint for the entire Increment
✓ Integration testing across all Sprint features
✓ Regression testing suite executed
✓ Performance testing completed
✓ Security scan run
✓ Sprint documentation consolidated
✓ Increment deployed to staging environment
Question: If not feasible per feature, can we do this once per Sprint?
Level 3: Release-Level DoD
Activities performed before releasing to production
✓ User acceptance testing (UAT) with stakeholders
✓ Load testing under production-like conditions
✓ Penetration testing and security audit
✓ Compliance review and sign-off
✓ Production deployment runbook executed
✓ Marketing materials and release notes prepared
✓ Customer support trained on new features
Question: If not feasible per Sprint, must we do this before release?
Why Multiple Levels Matter
Reality: Not every quality activity can be done for every feature. Examples:
- Full penetration testing can't happen for every story—it's a Release-level activity
- Integration testing across all features happens once per Sprint, not per feature
- Load testing with 10,000 concurrent users is prohibitively expensive per feature
Solution: Be explicit about which level each DoD activity belongs to. This creates:
- Transparency: Everyone knows when activities happen
- Realistic expectations: Teams don't promise impossible per-feature activities
- Quality gates: Critical activities aren't skipped—they're scheduled appropriately
Common Definition of Done Mistakes (And How to Fix Them)
Mistake 1: Confusing DoD with Acceptance Criteria
Problem: Team treats DoD as functional requirements instead of quality standards
Example: DoD includes "User can log in with email" (this is Acceptance Criteria)
Fix: DoD should be technical/quality standards (e.g., "All user-facing features have >80% test coverage"), not functional specifications
Mistake 2: Creating an Impossible DoD
Problem: DoD includes activities the team can't actually complete every Sprint
Example: "Full security audit by external firm" (costs $50K, takes 3 weeks)
Fix: Move this to Release-level DoD. Sprint DoD might be "Automated security scan passed"
Mistake 3: DoD is Too Vague
Problem: Generic statements that don't drive action
Example: "Code should be high quality" or "Testing complete"
Fix: Be specific: "Code reviewed by senior developer" and "Unit tests >80% coverage, integration tests passing, smoke tests passed"
Mistake 4: Never Evolving the DoD
Problem: Team uses same DoD for years despite gaining capabilities
Example: Team learns automated testing but DoD still says "Manual testing only"
Fix: Review DoD quarterly in Retrospectives. As team improves, strengthen DoD (e.g., raise test coverage from 70% → 80% → 90%)
Mistake 5: Skipping DoD When "Under Pressure"
Problem: Team compromises DoD to meet Sprint Goal, accumulating technical debt
Example: "We'll skip code review this time—we're behind schedule"
Fix: DoD is non-negotiable. If you can't meet DoD, the work is NOT Done. Reduce scope instead of compromising quality
Mistake 6: One Person Owns DoD
Problem: Only Scrum Master or tech lead defines DoD without team input
Example: Tech lead creates 20-item DoD checklist without consulting Developers
Fix: Entire Scrum Team collaborates on DoD. Everyone must understand and commit to it
Mistake 7: Not Displaying DoD Visibly
Problem: DoD lives in a wiki nobody checks
Example: "Where's our DoD?" "Uh... somewhere in Confluence?"
Fix: Make DoD visible on your board, in Sprint Planning, and during Sprint Review
Mistake 8: DoD Without Ownership
Problem: DoD items exist but nobody verifies compliance
Example: DoD says "Documentation updated" but nobody checks if it happened
Fix: Assign clear ownership for each DoD item verification, or make it part of code review checklist
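Some DoD items can verify themselves in the pipeline, which sidesteps the ownership question entirely. A minimal sketch for the "Documentation updated" example, assuming a repository laid out with `src/` and `docs/` directories (an assumption; adjust the paths to your project):

```python
import subprocess
import sys

def changed_files(base: str = "origin/main") -> list[str]:
    # List files changed between the base branch and the current commit.
    out = subprocess.run(
        ["git", "diff", "--name-only", base, "HEAD"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.splitlines()

if __name__ == "__main__":
    changes = changed_files()
    code_changed = any(f.startswith("src/") for f in changes)
    docs_changed = any(f.startswith("docs/") for f in changes)
    if code_changed and not docs_changed:
        sys.exit("DoD check failed: src/ changed but docs/ was not updated")
    print("Documentation DoD item satisfied")
```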
How Definition of Done Evolves: The Maturity Journey
Great teams don't start with perfect DoD—they strengthen it over time as capabilities improve.
Stage 1: Basic DoD (New Scrum Teams)
Typical DoD:
✓ Code written
✓ Code committed to version control
✓ Basic manual testing completed
✓ Acceptance Criteria met
Characteristics:
- Minimal automated testing
- Heavy manual testing reliance
- Basic quality checks
- Focus on getting work "functional"
Improvement Focus: Introduce automated unit testing and peer review
Stage 2: Intermediate DoD (Maturing Teams)
Typical DoD:
✓ Code reviewed by peer
✓ Unit tests written (>60% coverage)
✓ Integration tests passing
✓ Acceptance Criteria met
✓ Deployed to test environment
✓ Basic documentation updated
Characteristics:
- Automated testing emerging
- Code review process established
- CI/CD pipeline beginning
- Quality becoming systematic
Improvement Focus: Increase test coverage, add automated deployment
Stage 3: Advanced DoD (High-Performing Teams)
Typical DoD:
✓ Code reviewed and approved
✓ Unit tests passing (>85% coverage)
✓ Integration tests passing
✓ E2E tests passing
✓ Security scan passed
✓ Performance benchmarks met (<2s response)
✓ Accessibility standards verified (WCAG 2.1 AA)
✓ Documentation updated (code + user)
✓ Feature flag configured
✓ Monitoring/alerts configured
✓ Deployed to staging automatically
✓ Product Owner approved in staging
Characteristics:
- Comprehensive automated testing
- Full CI/CD pipeline
- Proactive quality measures
- "Potentially shippable" truly means shippable
Improvement Focus: Continuous improvement through metrics and feedback
How to Strengthen Your DoD
Step 1: Identify Current Gaps
During Sprint Retrospective, ask:
- What quality issues slipped through despite meeting DoD?
- What technical debt are we accumulating?
- What causes rework or bugs in production?
Step 2: Add One Improvement at a Time
Don't overwhelm the team. Strengthen DoD incrementally:
- Sprint 1-3: Add peer code review requirement
- Sprint 4-6: Require unit tests for new code (>50% coverage)
- Sprint 7-9: Increase to >70% coverage
- Sprint 10+: Add integration testing requirement
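A coverage "ratchet" is one way to implement this incremental climb without sprint-by-sprint renegotiation: record the best coverage achieved so far and fail any build that falls below it. A minimal sketch, assuming the current coverage figure is passed on the command line and the high-water mark lives in a small JSON file (both assumptions):

```python
import json
import sys
from pathlib import Path

RATCHET_FILE = Path("coverage-ratchet.json")  # assumed location for the high-water mark

def check_ratchet(current: float) -> None:
    best = json.loads(RATCHET_FILE.read_text())["best"] if RATCHET_FILE.exists() else 0.0
    if current < best:
        sys.exit(f"Coverage ratchet failed: {current:.1%} is below best-so-far {best:.1%}")
    # Coverage held or improved: move the ratchet up so it never slides back.
    RATCHET_FILE.write_text(json.dumps({"best": max(best, current)}))
    print(f"Coverage {current:.1%} OK (ratchet now {max(best, current):.1%})")

if __name__ == "__main__":
    check_ratchet(float(sys.argv[1]))  # e.g. python ratchet.py 0.72
```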
Step 3: Invest in Capability Building
DoD can only improve if team capabilities improve:
- Need higher test coverage? Provide TDD training
- Need better security? Bring in security expert for workshops
- Need automated deployment? Invest in CI/CD infrastructure
Step 4: Measure and Adapt
Track metrics to validate DoD improvements:
- Defect escape rate (bugs found in production)
- Time to fix production issues
- Deployment frequency
- Mean time to recovery (MTTR)
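None of these metrics require special tooling to start with. A minimal sketch computing defect escape rate and MTTR from illustrative records (the field names are assumptions; map them to your issue tracker's export format):

```python
from datetime import datetime, timedelta

# Illustrative records; in practice, export these from your issue tracker.
defects = [
    {"id": "BUG-1", "found_in": "production"},
    {"id": "BUG-2", "found_in": "staging"},
    {"id": "BUG-3", "found_in": "production"},
    {"id": "BUG-4", "found_in": "code review"},
]
incidents = [
    {"start": datetime(2024, 5, 1, 9, 0), "resolved": datetime(2024, 5, 1, 10, 30)},
    {"start": datetime(2024, 5, 8, 14, 0), "resolved": datetime(2024, 5, 8, 14, 45)},
]

# Share of defects that escaped to production, and mean time to recovery.
escape_rate = sum(d["found_in"] == "production" for d in defects) / len(defects)
mttr = sum((i["resolved"] - i["start"] for i in incidents), timedelta()) / len(incidents)

print(f"Defect escape rate: {escape_rate:.0%}")  # 50%
print(f"Mean time to recovery: {mttr}")          # 1:07:30
```

Tracking these over several Sprints shows whether each DoD strengthening actually moved the needle.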
Conclusion
The Definition of Done is crucial for maintaining transparency, aligning team members' expectations, and delivering a potentially releasable increment at the end of each sprint.
It goes beyond functionality to assert the quality of a feature. Grounded in what the team can realistically achieve, it operates at multiple levels, providing clarity and fostering communication within the team and with stakeholders.
It helps prevent incomplete or low-quality work from being considered "done" and provides a clear guideline for when the development work is truly complete.
Understanding the DoD and applying it effectively is a journey towards delivering not just software but excellence in every line of code.
As you embark on this journey, remember that the Definition of Done is a dynamic tool, always ready to evolve and guide your team towards greater heights.
Continue Reading
- Product Increment in Scrum. Learn how Product Increments must meet the Definition of Done to be potentially shippable and ready for release.
- Sprint Planning: Your Guide to Effective Scrum Execution. Master Sprint Planning and learn how teams use Definition of Done to forecast what can be completed in a Sprint.
- Sprint Review - A Powerful Scrum Event. Discover how Sprint Review demonstrates only work that meets the Definition of Done for stakeholder inspection.
- Sprint Retrospective: Boost Team Performance. Learn how Sprint Retrospectives provide opportunities to inspect and evolve your Definition of Done over time.
- Scrum Product Backlog: Master Essential Agile Artifact. Understand how Product Backlog Items must meet Definition of Done criteria to be considered complete.
- Sprint 0: Complete Guide to Objectives & Benefits. Explore how Sprint Zero helps teams establish their initial Definition of Done and quality standards.
- Continuous Integration - Boost Scrum Development. Discover how CI/CD practices automate Definition of Done verification and ensure consistent quality.
- Scrum Emphasis on Testing: Agile Testing. Master testing practices that form the foundation of strong Definition of Done quality criteria.
Frequently Asked Questions (FAQs)
Is definition of done a Scrum artifact?
Is definition of done the same as acceptance criteria?
What is definition of done in testing?
Can definition of done be changed?
When is the definition of done created?
Who creates the definition of done in Agile?
Definition of done vs Sprint Goal - what's the difference?
Is the Definition of Done the same for every Agile team or organization?
What happens if work doesn't meet the Definition of Done by Sprint end?
How do you create an effective Definition of Done for a new team?
Can the Product Owner override the Definition of Done?
What are the three levels of Definition of Done?
How does Definition of Done help reduce technical debt?
How does Definition of Done work with distributed or remote teams?
Should Definition of Done include non-functional requirements like performance and security?