
Lean-agile acceptance test-driven development : better software through collaboration / Ken Pugh.

By: Pugh, Ken
Material type: Text
Series: Net objectives lean-agile series
Publication details: Upper Saddle River, NJ : Addison-Wesley, c2011.
Description: 1 online resource (xxii, 345 p.) : ill.
ISBN:
  • 9780321714084 (paperback : alk. paper)
  • 0321714083 (paperback : alk. paper)
Additional physical formats: Print version
Holdings
Item type Current library Call number Copy number Status Date due Barcode
Standard Loan Thurles Library Main Collection 005.14 PUG 1 Available R19182AKRC
Standard Loan Thurles Library Main Collection 005.14 PUG 1 Available R19181YKRC
Standard Loan Thurles Library Main Collection 005.14 PUG 1 Available R19180XKRC
Standard Loan Thurles Library Main Collection 005.14 PUG 1 Available R19177MKRC

Enhanced descriptions from Syndetics:

Praise for Lean-Agile Acceptance Test-Driven Development

"Lean-Agile Acceptance Test-Driven Development tells a tale about three fictional project stakeholders as they use agile techniques to plan and execute their project. The format works well for the book; it is easy to read, easy to understand, and easy to apply."

--Johannes Brodwall, Chief Scientist, Steria Norway

"Agile development, some say, is all about pairing, and, yes, I'm a believer in the power of pairing. After reading this book, however, I became a fan of the 'triad'--the customer or business analyst + the developer + the tester, who work collaboratively on acceptance tests to drive software development. I've written some patterns for customer interaction and some patterns for testing and I like what Ken Pugh has chosen to share with his readers in this down-to-earth, easy-to-read book. It's a book full of stories, real case studies, and his own good experience. Wisdom worth reading!"

--Linda Rising, Coauthor of Fearless Change: Patterns for Introducing New Ideas

"The Agile Manifesto, Extreme Programming, User Stories, and Test-Driven Development have enabled tremendous gains in software development; however, they're not enough. The question now becomes 'How can I ensure clear requirements, correct implementation, complete test coverage, and more importantly, customer satisfaction and acceptance?' The missing link is acceptance as defined by the customer in their own domain language. Lean-Agile Acceptance Test-Driven Development is the answer."

--Bob Bogetti, Lead Systems Designer, Baxter Healthcare

"Ken Pugh's Lean-Agile Acceptance Test-Driven Development shows you how to integrate essential requirements thinking, user acceptance tests, and sound lean-agile practices, so you can deliver product requirements correctly and efficiently. Ken's book shows you how table-driven specification, intertwined with requirements modeling, drives out acceptance criteria. Lean-Agile Acceptance Test-Driven Development is an essential guide for lean-agile team members to define clear, unambiguous requirements while also validating needs with acceptance tests."

--Ellen Gottesdiener, EBG Consulting, www.ebgconsulting.com, Author of Requirements by Collaboration and The Software Requirements Memory Jogger

"If you are serious about giving Agile Testing a chance and only have time to read one book, read this one."

--David Vydra, http://testdriven.com

"This book provides clear, straightforward guidance on how to use business-facing tests to drive software development. I'm excited about the excellent information in this book. It's a great combination of the author's experiences, references to other experts and research, and an example project that covers many angles of ATDD. A wide range of readers will learn a lot that they can put to use, whether they work on projects that call themselves lean or agile or simply want to deliver the best possible software product."

--Lisa Crispin, Agile Tester, ePlan Services, Inc., Author of Agile Testing

Within the framework of Acceptance Test-Driven Development (ATDD), customers, developers, and testers collaborate to create acceptance tests that thoroughly describe how software should work from the customer's viewpoint. By tightening the links between customers and agile teams, ATDD can significantly improve both software quality and developer productivity.
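The collaboration the description refers to centers on customer-readable, table-driven examples that double as executable tests. As a minimal sketch of that idea (the rental-fee rule, its categories, and the prices below are hypothetical illustrations, not taken from the book):

```python
# Minimal sketch of a table-driven acceptance test in the ATDD style.
# The rental-fee rule and all values here are hypothetical examples.

def rental_fee(category: str, days: int) -> float:
    """Compute a CD rental fee from its category (hypothetical business rule)."""
    daily_rate = {"Regular": 2.00, "Golden Oldie": 1.00, "New Release": 3.00}
    return daily_rate[category] * days

# Each row is one example a customer could read and confirm:
# (category, days rented, expected fee).
examples = [
    ("Regular", 2, 4.00),
    ("Golden Oldie", 3, 3.00),
    ("New Release", 1, 3.00),
]

# Running the table verifies the implementation against every agreed example.
for category, days, expected in examples:
    actual = rental_fee(category, days)
    assert actual == expected, f"{category}/{days}d: expected {expected}, got {actual}"
```

In practice, frameworks such as Fit, Slim, or Cucumber (all covered in the book's appendix) let the triad keep the example table in a document the customer owns, rather than in code.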

This is the first start-to-finish, real-world guide to ATDD for every agile project participant. Leading agile consultant Ken Pugh begins with a dialogue among a customer, developer, and tester, explaining the "what, why, where, when, and how" of ATDD and illuminating the experience of participating in it.

Next, Pugh presents a practical, complete reference to each facet of ATDD, from creating simple tests to evaluating their results. He concludes with five diverse case studies, each identifying a realistic set of problems and challenges with proven solutions.

Coverage includes

  • How to develop software with fully testable requirements
  • How to simplify and componentize tests and use them to identify missing logic
  • How to test user interfaces, service implementations, and other tricky elements of a software system
  • How to identify requirements that are best handled outside software
  • How to present test results, evaluate them, and use them to assess a project's overall progress
  • How to build acceptance tests that are mutually beneficial for development organizations and customers
  • How to scale ATDD to large projects

Includes bibliographical references (p. 315-322) and index.

Description based on print version record.

Table of contents provided by Syndetics

  • Introduction (p. 1)
  • Part I The Tale
  • Chapter 1 Prologue (p. 9)
  • Ways to Develop Software (p. 9)
  • One Way (p. 9)
  • Another Way (p. 9)
  • The Difference (p. 10)
  • The Importance of Acceptance Tests (p. 10)
  • System and Team Introduction (p. 12)
  • The System (p. 12)
  • The People (p. 13)
  • Summary (p. 14)
  • Chapter 2 Lean and Agile (p. 15)
  • The Triad and Its Units (p. 15)
  • Post-Implementation Tests (p. 17)
  • Quick Feedback Better Than Slow Feedback (p. 18)
  • Preimplementation Tests (p. 19)
  • Lean and Agile Principles (p. 20)
  • Summary (p. 21)
  • Chapter 3 Testing Strategy (p. 23)
  • Types of Tests (p. 23)
  • Where Tests Run (p. 25)
  • Test Facets (p. 26)
  • Control and Observation Points (p. 27)
  • New Test Is a New Requirement (p. 27)
  • Summary (p. 28)
  • Chapter 4 An Introductory Acceptance Test (p. 29)
  • A Sample Business Rule (p. 29)
  • Implementing the Acceptance Tests (p. 31)
  • Test Script (p. 32)
  • Test User Interface (p. 33)
  • xUnit Test (p. 34)
  • Automated Acceptance Test (p. 35)
  • An Overall Test (p. 36)
  • Testing Process (p. 37)
  • Summary (p. 37)
  • Chapter 5 The Example Project (p. 39)
  • The Charter (p. 39)
  • Objectives (p. 40)
  • Project Acceptance Tests (p. 41)
  • High-Level Requirements (p. 43)
  • Features (p. 43)
  • Feature Acceptance Criteria (p. 45)
  • Summary (p. 46)
  • Chapter 6 The User Story Technique (p. 47)
  • Stories (p. 47)
  • Features into Stories (p. 48)
  • Roles (p. 49)
  • Role Attributes (p. 49)
  • Persona (p. 50)
  • Stories for Roles (p. 51)
  • Story Acceptance Criteria (p. 52)
  • Acceptance Tests Determine Size (p. 53)
  • Customer Terms (p. 54)
  • INVEST Criteria (p. 55)
  • Summary (p. 56)
  • Chapter 7 Collaborating on Scenarios (p. 57)
  • Use Cases from User Stories (p. 57)
  • Simple Use Case (p. 59)
  • Exceptions and Alternatives (p. 60)
  • Acceptance Tests (p. 63)
  • Documentation (p. 63)
  • Story Map (p. 63)
  • Conceptual Flow (p. 65)
  • Communication (p. 66)
  • Summary (p. 68)
  • Chapter 8 Test Anatomy (p. 69)
  • Triad Creates Tests (p. 69)
  • Test Context (p. 70)
  • Test Structure (p. 71)
  • Calculation Table (p. 73)
  • Data Table (p. 74)
  • Action Table (p. 75)
  • Tests with Example Values (p. 76)
  • Requirements Revised (p. 77)
  • Acceptance Test Revised (p. 78)
  • Test with Values in Text (p. 79)
  • When and Where Tests Are Run (p. 80)
  • Summary (p. 81)
  • Chapter 9 Scenario Tests (p. 83)
  • Tests for Exception Scenarios (p. 83)
  • Tests for Business Rules (p. 87)
  • Cross-Story Issues (p. 88)
  • Don't Automate Everything (p. 89)
  • Multi-Level Tests (p. 90)
  • User Interface Tests (p. 93)
  • Check the Objectives (p. 93)
  • Summary (p. 94)
  • Chapter 10 User Story Breakup (p. 95)
  • Acceptance Tests Help Break Up Stories (p. 95)
  • Business Rule Tests (p. 96)
  • A Story with a Business Rule (p. 100)
  • Summary (p. 101)
  • Chapter 11 System Boundary (p. 103)
  • External Interfaces (p. 103)
  • More Details (p. 107)
  • External Interface Tests (p. 108)
  • Component Tests (p. 108)
  • Test Doubles and Mocks (p. 111)
  • What Is Real? (p. 112)
  • Story Map of Activities (p. 113)
  • Summary (p. 114)
  • Chapter 12 Development Review (p. 115)
  • The Rest of the Story (p. 115)
  • Usability Testing (p. 116)
  • Separating State from Display (p. 116)
  • Quality Attribute Tests (p. 118)
  • Workflow Tests (p. 119)
  • Deployment Plans (p. 120)
  • From Charter to Deliverable (p. 120)
  • Summary (p. 121)
  • Part II Details
  • Chapter 13 Simplification by Separation (p. 125)
  • Complex Business Rules (p. 125)
  • Simplify by Separating (p. 126)
  • The Simplified Rule (p. 128)
  • Rental History (p. 128)
  • Summary (p. 130)
  • Chapter 14 Separate View from Model (p. 131)
  • Decouple the User Interface (p. 131)
  • Decoupling Simplifies Testing (p. 136)
  • Summary (p. 136)
  • Chapter 15 Events, Responses, and States (p. 137)
  • Events and an Event Table (p. 137)
  • States and State Transitions (p. 139)
  • Internal State or External Response (p. 142)
  • Transient or Persistent States (p. 144)
  • A Zen Question (p. 144)
  • Summary (p. 144)
  • Chapter 16 Developer Acceptance Tests (p. 145)
  • Component Acceptance Tests (p. 145)
  • Field Display Tests (p. 145)
  • Tabular Display Tests (p. 147)
  • Summary (p. 151)
  • Chapter 17 Decouple with Interfaces (p. 153)
  • Tests for a Service Provider (p. 153)
  • The Interface (p. 153)
  • Quality Attribute Tests (p. 155)
  • Comparing Implementations (p. 155)
  • Separating User Interface from Service (p. 157)
  • Separation of Concerns (p. 158)
  • Reusable Business Rules (p. 158)
  • Summary (p. 159)
  • Chapter 18 Entities and Relationships (p. 161)
  • Relationships (p. 161)
  • Entities and Relationships (p. 161)
  • Multiple Relationships (p. 163)
  • Alternative Representations (p. 166)
  • Summary (p. 166)
  • Chapter 19 Triads for Large Systems (p. 167)
  • Large Systems (p. 167)
  • When a Customer Test May Not Be Required (p. 169)
  • Data Conversion (p. 170)
  • Database Conversions (p. 170)
  • What If There Are No Tests? (p. 170)
  • Legacy Systems (p. 172)
  • Summary (p. 173)
  • Part III General Issues
  • Chapter 20 Business Capabilities, Rules, and Value (p. 177)
  • Business Capabilities (p. 177)
  • Scenario Handling (p. 178)
  • Business Rules Exposed (p. 179)
  • A Different Business Value (p. 179)
  • Summary (p. 181)
  • Chapter 21 Test Presentation (p. 183)
  • Customer Understood Tables (p. 183)
  • Table Versus Text (p. 185)
  • Specifying Multiple Actions (p. 185)
  • Complex Data (p. 187)
  • Custom Table Forms (p. 188)
  • Summary (p. 189)
  • Chapter 22 Test Evaluation (p. 191)
  • Test Facets (p. 191)
  • Understandable to Customers (p. 191)
  • Spell Checked (p. 192)
  • Idempotent (p. 192)
  • Not Fragile (p. 192)
  • Test Sequence (p. 193)
  • Workflow Tests (p. 193)
  • Test Conditions (p. 194)
  • Separation of Concerns (p. 194)
  • Test Failure (p. 195)
  • Test Redundancy (p. 196)
  • No Implementation Issues (p. 197)
  • Points to Remember (p. 197)
  • Summary (p. 198)
  • Chapter 23 Using Tests for Other Things (p. 199)
  • Uses of Acceptance Tests (p. 199)
  • Degree of Doneness (p. 199)
  • Estimation Aid (p. 200)
  • Breaking Down Stories (p. 200)
  • Developer Stories (p. 200)
  • Tests as a Bug Report (p. 201)
  • Root Cause Analysis (p. 201)
  • Production Bugs (p. 202)
  • Regression Testing (p. 202)
  • Summary (p. 202)
  • Chapter 24 Context and Domain Language (p. 205)
  • Ubiquitous Language (p. 205)
  • Two Domains (p. 207)
  • Summary (p. 208)
  • Chapter 25 Retrospective and Perspective (p. 209)
  • Recap (p. 209)
  • The Process (p. 210)
  • Testing Layers (p. 210)
  • The Tests (p. 211)
  • Communication (p. 212)
  • What's the Block? (p. 212)
  • Monad (p. 212)
  • Unavailable Customer (p. 213)
  • Change (p. 213)
  • Risks (p. 214)
  • Benefits (p. 214)
  • Summary (p. 215)
  • Part IV Case Studies
  • Chapter 26 Case Study: Retirement Contributions (p. 219)
  • Context (p. 219)
  • The Main Course Test (p. 220)
  • Setup (p. 220)
  • Event (p. 221)
  • Expected (p. 221)
  • Implementation Issues (p. 222)
  • Separation of Concerns (p. 222)
  • Business Value Tracking (p. 223)
  • One Exception (p. 223)
  • Event (p. 223)
  • Expected (p. 224)
  • Another Exception (p. 225)
  • Event (p. 225)
  • Expected (p. 225)
  • Two Simultaneous Exceptions (p. 226)
  • Event (p. 226)
  • Expected (p. 227)
  • The Big Picture (p. 227)
  • Event Table (p. 228)
  • State Transition Table (p. 228)
  • Summary (p. 230)
  • Chapter 27 Case Study: Signal Processing (p. 231)
  • It's Too Loud (p. 231)
  • Sound Levels (p. 231)
  • Developer Tests (p. 233)
  • Summary (p. 233)
  • Chapter 28 Case Study: A Library Print Server (p. 235)
  • The Context (p. 235)
  • A Workflow Test (p. 236)
  • Summary (p. 241)
  • Chapter 29 Case Study: Highly Available Platform (p. 243)
  • Context for Switching Servers (p. 243)
  • Test for Switching Servers (p. 244)
  • Test for Technical Rule (p. 246)
  • Summary (p. 248)
  • Part V Technical Topics
  • Chapter 30 How Does What You Do Fit with ATDD? (p. 251)
  • Test Platforms (p. 251)
  • Internal Design from Tests (p. 252)
  • Device Testing (p. 254)
  • Starting with User Interfaces (p. 255)
  • Black Box Testing (p. 255)
  • Unit Testing (p. 256)
  • Summary (p. 256)
  • Chapter 31 Test Setup (p. 257)
  • A Common Setup (p. 257)
  • Some Amelioration (p. 259)
  • Test Order (p. 260)
  • Persistent Storage Issues (p. 260)
  • Summary (p. 261)
  • Chapter 32 Case Study: E-Mail Addresses (p. 263)
  • Context (p. 263)
  • Breaking Down Tests (p. 264)
  • Local-Part Validation (p. 265)
  • Domain Tests (p. 266)
  • Disallowed Domain Tests (p. 268)
  • Test to Ensure Connection (p. 269)
  • Verification Test (p. 269)
  • Summary (p. 270)
  • Part VI Appendices
  • Appendix A Other Issues (p. 273)
  • Context (p. 273)
  • Customer Examples (p. 274)
  • Fuzzy Acceptance Tests (p. 274)
  • Acceptance Test Detail (p. 275)
  • Requirements and Acceptance Tests (p. 275)
  • Documenting Requirements and Tests (p. 276)
  • Decoupling Requirements (p. 276)
  • Separation of Issues (p. 276)
  • Testing Systems with Random Events (p. 277)
  • The Power of Three (p. 277)
  • Summary (p. 278)
  • Appendix B Estimating Business Value (p. 279)
  • Business Value (p. 279)
  • Developer Stories (p. 281)
  • Summary (p. 282)
  • Appendix C Test Framework Examples (p. 283)
  • The Examples (p. 283)
  • Fit Implementation (p. 284)
  • Setup (p. 284)
  • Check-Out CD (p. 284)
  • Check-In (p. 286)
  • Category-Based Rental Fees (p. 287)
  • Slim-Table Style (p. 288)
  • Header (p. 288)
  • Setup (p. 288)
  • Check-Out CD (p. 288)
  • Check-In (p. 290)
  • Category-Based Rental Fees (p. 291)
  • Slim-Cucumber Style (p. 291)
  • Tables Everywhere (p. 299)
  • Money with ATDD (p. 305)
  • Exercises (p. 311)

Author notes provided by Syndetics

Kenneth Pugh has over two-fifths of a century of software experience. Previously a principal at Pugh-Killeen Associates, he is now a fellow consultant for Net Objectives. He has developed software applications ranging from radar tracking to financial analysis. Responsibilities have included everything from gathering requirements to testing. After the start of the new millennium, he has worked with teams to create software more effectively with lean and agile processes. He has spoken at numerous national conferences; consulted and taught all over the world; and testified on technology topics. This is his seventh book. In 2006, his book Prefactoring won the Jolt Award [DrDobbs01]. In his spare time, he snowboards, windsurfs, and backpacks. Between 1997 and 2003, he completed the Appalachian Trail. The cover photograph of Mount Katahdin, the northern end of the trail, was taken by the author from Abol Bridge in Maine.
