Sample Quality Plan

1. Purpose and scope

This plan identifies the set of standards, practices, reviews, checkpoints and other quality improvement methods to be implemented for the SampleProject Phase 1 project. The project adheres to UW and Kuali Student QA practices, and UW's Lean-Kanban development process.

This plan applies to the following:

  • Software developed and maintained by the University of Washington SampleProject technical team
  • Software and systems relied upon to complete the implementation (SDB, Rice, SWS)
  • Software project management and development practices
  • Necessary education and communication documentation

2. Quality roles and responsibilities

The project team is responsible for the level of quality of the project deliverables and will participate in quality reviews. Specific responsibilities include:

Each role is listed below with its members, followed by that role's quality assurance responsibilities.

Executive project sponsors: Virjean Edwards, University Registrar - Seattle; Andrea Coker-Anderson, University Registrar - Tacoma; Pam Lundquist, University Registrar - Bothell; Darcy Van Patten, Director, Student Systems, UW IM

  • Assure availability of essential project resources for identified quality activities.
  • Ensure resolution of quality issues escalated by the project manager.

Scrum Master: Ben Clark

  • Review and approve the quality plan.
  • Include project quality activities in the project plan and work assignments.
  • Facilitate resolution of quality issues, escalating as needed.

Business Readiness Team: Matt Winslow, Acting Associate Registrar; Jennifer Payne, Curriculum Management; Tina Miller, DARS Subject Matter Expert, Academic Records; Pam Lundquist, University Registrar - Bothell; Andrea Coker-Anderson, University Registrar - Tacoma; Bob Jansson, Associate Registrar - Seattle

  • Help determine the alpha test, beta test and production readiness acceptance criteria for this project.
  • Provide feedback on the deliverables and the requirements and testing processes.

Technical team members (including Application Integration Services): Kamal Muthuswamy; Craig Nomaguchi; Virginia Balsley; David McClellan; Hrishikesh Tidke "Rishi"

  • Provide feedback on the quality plan; help determine the quality reviews, metrics and acceptance criteria for this project.
  • Take part in the quality reviews (including testing) and provide feedback on the deliverables and the overall process.

Quality Manager: Kerry Lamb

  • Prepare and update the quality plan and coordinate quality reviews and checkpoints.
  • Provide test and test management support, including monthly status of testing and bugs.
  • Monitor the quality activities for completion and improvements.
  • Review project processes and artifacts.
  • Provide the project manager with project quality status for existing communications as described in the project charter (e.g., updates to sponsors).

Product Owner: Darcy Van Patten

  • Provide feedback on the quality plan; help determine the quality reviews, metrics and acceptance criteria for this project.
  • Take part in the quality reviews (including testing) and provide feedback on the deliverables and the overall process.
  • Represent the Product Owner Advisory Group to the project team, making sure that decisions are vetted by that group.

SIS Technical Architect: Kamal Muthuswamy

  • Review architectural and other technical decisions and provide technical guidance.
  • Provide integration and infrastructure guidance for architecture and technical decisions.

Data Decision Team: Craig Nomaguchi; Virginia Balsley; David McClellan; Hugh Parker; Carol Bershad; Bob Jansson; Matt Wilson

  • Review data management decisions and provide data management guidance.
  • Escalate decisions that cannot be made at this level.
  • Perform system analysis.

3. Quality checkpoints / deliverable reviews

This project follows the Information Management Software Development Life Cycle and its associated checkpoints and reviews:

  • Architecture checkpoint (may be combined with the design checkpoint)
  • Design checkpoint
  • Production readiness checkpoint

The project follows an iterative approach, with multiple sprints making up a milestone. For example, the first milestone, Milestone 1, is completed in five sprints. Each milestone may include an architecture and design checkpoint. Since deliverables may not be released into production at each milestone, a production readiness checkpoint is held only when a release to production is planned. The team decides when these checkpoints occur during the milestone.

The outcomes of major checkpoints are documented as JIRA Decisions, using the SDLC's Checkpoint Findings Template as a guideline.

Any decision that has a significant impact on the project is documented in JIRA and approved by the "Student Program Technical Coordinating Team."

The technical lead will track Technical Decisions.


3.1 Requirements review

The requirements for the SampleProject are developed and reviewed using Agile techniques: the team creates, reviews and prioritizes user stories with high-level acceptance criteria (the validation strategy) at the beginning of each sprint. The acceptance criteria are reviewed before or during the Specify state by the project manager, the quality manager, and/or the test lead; they serve as the basis for acceptance tests and more detailed tests by the test lead, and for unit tests by the developer. Any wireframes developed by the UX designer are also considered requirements and are peer-reviewed during the Specify state.
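
For illustration only (the user story, service class and course data below are invented, and JUnit is assumed as the test framework for this Java-based stack), a high-level acceptance criterion might translate into an acceptance test skeleton such as:

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;

    import org.junit.Test;
    import static org.junit.Assert.assertFalse;

    public class CourseSearchAcceptanceTest {

        // Hypothetical stand-in for the real service under test; stubbed
        // here so the example is self-contained.
        static class CourseCatalogService {
            List<String> searchBySubject(String subjectCode) {
                return "CHEM".equals(subjectCode)
                        ? Arrays.asList("CHEM 142", "CHEM 152")
                        : Collections.<String>emptyList();
            }
        }

        // User story (hypothetical): "As a student, I can search the course
        // catalog by subject code." Acceptance criterion: searching a valid
        // subject code returns at least one matching course.
        @Test
        public void searchByValidSubjectCodeReturnsMatches() {
            List<String> results = new CourseCatalogService().searchBySubject("CHEM");
            assertFalse("Expected at least one course for subject CHEM",
                    results.isEmpty());
        }
    }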

3.2 Architecture/Design checkpoint

The team reviews the architecture and design (including systems interface and user experience design) informally, with decisions from those reviews tracked using JIRA (as "decision" issues). Architectural and design issues are discussed with the Student Program Technical Coordinating Team, with additional formal review conducted as agreed between the SampleProject team and the Coordinating Team. Note: the SampleProject architecture is based on the Kuali Student architecture, and both Kuali Student and Kuali Rice are project partners.

The reviews are guided by the architecture and design standards identified in Standards, Practices and Guidelines. This checkpoint determines if security risks have been addressed, if the best solution has been chosen from the alternatives, if all the requirements are addressed in the design and test plan, and if support has been considered.

As development progresses for each milestone, the architecture and designs are peer-reviewed (by at least one other person). Major designs are presented to the technical team, the Coordinating team and to the Kuali Student team for review.

An example of a minor design review moving across the Kanban board (coding and testing activities omitted):

  1. User story for the functionality is sized and prioritized during the planning session, and loaded onto the board in the backlog
  2. A developer moves the user story to 'Specify' when work begins on the design
  3. Another developer reviews the design, which is reworked if necessary
  4. Key decisions made during the design are captured in JIRA and mentioned during the standup
  5. Once the design is acceptable to a second developer, the user story moves to 'Execute' and coding/unit testing begins.

The SDLC's Key Quality Questions for Architect and Design are referenced at the end of each milestone (for example: Does everyone understand the design? Have solutions to security risks been discussed? Have major decisions been captured?).

3.3 Security reviews

While security and authentication/authorization are discussed during deliverable reviews, the project team also conducts internal reviews of high-risk deliverables against the OWASP Secure Coding Principles for web applications and the OWASP Top Ten most critical web application security flaws (see Standards, Practices and Guidelines).
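
As a minimal sketch of one of those OWASP principles, parameterized queries rather than string concatenation (the table and column names below are invented for illustration):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class StudentLookup {

        // Unsafe alternative (vulnerable to SQL injection):
        //   "SELECT 1 FROM student WHERE student_id = '" + studentId + "'"
        // The parameterized version below binds the input so it is treated
        // as data, never as SQL.
        public static boolean studentExists(Connection conn, String studentId)
                throws SQLException {
            PreparedStatement stmt =
                    conn.prepareStatement("SELECT 1 FROM student WHERE student_id = ?");
            try {
                stmt.setString(1, studentId);
                ResultSet rs = stmt.executeQuery();
                return rs.next();
            } finally {
                stmt.close();
            }
        }
    }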

Authorization and access is tested during integration testing, and revalidated during user acceptance testing.

The quality manager will assist the project team with the UW-IT Application Security Review Process:

  1. Completing the Web Application Security Self-Assessment Worksheet
  2. Developing a Security diagram, showing data flows and locations
  3. Depending on the results of the assessment, conducting a formal security review session with the Business Owners and others.

3.4 Code walkthroughs

Project team members hold informal 'desk check' code reviews for most of the code, including any automated tests, with more formal code walkthroughs for high-risk code. High-risk code involves new methods, new technology, new design, or restricted/confidential data. High-risk deliverables should be identified as early as possible in the milestone.

Code is reviewed against the coding standards identified in Standards, Practices and Guidelines.
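
A hypothetical illustration of the kinds of findings a desk check against those standards might record (the class, method and review notes below are invented):

    public class TermUtils {

        // Hypothetical desk-check findings recorded against this method:
        // 1. The original name 'doCalc' did not describe the method's
        //    purpose; renamed per the naming standards.
        // 2. An invalid range previously produced a silent negative result;
        //    an explicit argument check was added.
        // 3. The magic number 4 was replaced with a named constant.
        private static final int QUARTERS_PER_YEAR = 4;

        public static int quartersBetween(int startYear, int endYear) {
            if (endYear < startYear) {
                throw new IllegalArgumentException("endYear precedes startYear");
            }
            return (endYear - startYear) * QUARTERS_PER_YEAR;
        }
    }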

The code review tool 'Crucible' is being explored for use by the team.

Kuali Student publishes its own guidelines for coding standards and code reviews:

https://wiki.kuali.org/display/STUDENTDOC/4.4+Coding+Standards+and+Code+Review

3.5 Project management/documentation reviews

The product owner, scrum master, and the quality manager will periodically review the project's progress by evaluating:

  • progress relative to the overall project plan, scope, deliverables, project budget, and project schedule
  • project risks and risk management
  • requirements coverage (user story tracking to project deliverables and testing)
  • quality issues raised during the quality activities.

Evaluations of the project's development processes will be held during sprint retrospectives and at the end of the project to determine if the processes provide value and how they can be improved.

Other project documents may be reviewed as needed (e.g., support plan, production support documentation). 

The project reports to the UW-IT Project Review Board every two weeks.

Issues from these reviews are raised to the stakeholders as needed.

3.6 Testing

A test plan (or set of plans) will include the overall test strategy for the verification and validation of the software deliverables, the defect tracking processes, test environments, and test roles and responsibilities, using the Test Planning Checklist on the UW Project Management Portal as a guide.

Tests to be run are identified, with detailed test cases prepared for high-risk or high-priority tests.

Project team members hold informal 'desk check' test case reviews for most of the tests, with more rigorous reviews for high-risk areas.

At a minimum, the following types of verification and validation are performed:

  • Build, Deploy and Unit testing on a nightly automated basis
  • Functional/integration testing (automated and manual interface, end-to-end, security, regression tests)
  • Data migration testing (see the sketch after this list)
  • Performance and load testing
  • User acceptance testing
  • Documentation/help verification
  • Test & production deployment / implementation verification (technical environment validation)
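
As one hedged sketch of the data migration level above (connection setup, table names and the count-only comparison are all assumptions; real checks would also compare content), a migration test might verify record counts between the legacy source and the new system:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class MigrationRowCountCheck {

        // Counts rows in one table. Table names come from a fixed internal
        // list maintained by the team, not from user input.
        static long countRows(Connection conn, String table) throws SQLException {
            Statement stmt = conn.createStatement();
            try {
                ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table);
                rs.next();
                return rs.getLong(1);
            } finally {
                stmt.close();
            }
        }

        // Fails loudly if the migrated table does not match the legacy source.
        public static void check(Connection legacy, Connection target, String table)
                throws SQLException {
            long expected = countRows(legacy, table);
            long actual = countRows(target, table);
            if (expected != actual) {
                throw new AssertionError(table + ": expected " + expected
                        + " rows after migration, found " + actual);
            }
        }
    }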

3.7 Production readiness checkpoint

Team members (including appropriate stakeholders and non-project members of UW Information Technology) will review the following project artifacts for completeness and correctness, to determine if the project deliverables are ready to deploy to production.

User acceptance test results (including any outstanding issues from functional/integration, data migration, and user acceptance testing) are measured against production readiness criteria, which will be determined by the project team, the Business Readiness Team, the Product Owner Advisory Group, and the project sponsors. Initial criteria:

  • All integration, data migration, and user acceptance tests have been run and all high priority tests have passed
  • Load testing results meet the minimum performance thresholds identified in the requirements
  • All "Blocker" bugs (SampleProject and Kuali) have been resolved and associated tests rerun successfully
  • All unresolved bugs have been documented and include workarounds if needed
  • Production environment is available, stable, populated and validated (Kuali and SampleProject components)
  • The SampleProject production readiness recommendation has been completed and stakeholders agree 

4. Standards, practices and guidelines

The Information Management standards, practices and guidelines, along with the architecture and design standards, naming and coding standards, and source control and build practices identified in the Kuali Student Developer Guide will be followed by this project where appropriate. The project team and the quality manager determine the appropriateness of the guidelines.  

Additional standards, guidelines and practices adopted by the project, by area:

4.1 System configuration management practices

  • UW source control and build practices
  • Student Team build practices for the production environment
  • Software that is 'production ready' but not yet deployed will be considered the production baseline until deployed.

4.2 Problem reporting and corrective action practices

  • The project uses JIRA for problem tracking. Major project issues, risks and decisions are captured, and defects found in the requirements, design, code or tests are documented in JIRA after discussion by the team. Bugs found during coding and unit testing are not entered into the system unless they remain unresolved once the code has been deployed onto a team server.
  • Resolutions for high-priority issues are documented with the issue. 
  • Resolutions for fixed bugs are retested, and regression tests are rerun.
  • Issues with external components are captured in both the SampleProject JIRA project and the external problem reporting system (e.g., KS CM 2.0 issues are logged in the KS JIRA.)

4.3 Development Methodology

  • The project uses Lean-Kanban processes. 
  • The criteria for moving user stories from state to state (from Specify to Execute to Verify to Done) are defined and refined by the team.
  • At the end of each sprint, team processes are reviewed and suggestions for improvement made.

5. Metrics

Quality metrics to measure project deliverable quality will be further defined by the team (a sketch of how the defect-time metrics might be tallied follows this list):

  • The ratio of positive to negative results from a user satisfaction survey
  • The amount of time spent fixing defects after a feature is considered complete
  • The amount of time spent fixing defects after implementation
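
A minimal sketch, assuming the team can export time-tracking data for bug fixes from JIRA, of how the defect-time metrics might be tallied (the record fields and sample values are invented):

    import java.util.Arrays;
    import java.util.List;

    public class DefectMetrics {

        // Hypothetical bug record: hours spent on the fix and whether the
        // defect surfaced after the feature was declared complete.
        static class BugFix {
            final double hoursToFix;
            final boolean foundAfterFeatureComplete;

            BugFix(double hoursToFix, boolean foundAfterFeatureComplete) {
                this.hoursToFix = hoursToFix;
                this.foundAfterFeatureComplete = foundAfterFeatureComplete;
            }
        }

        // Sums the time spent fixing defects found after feature completion.
        static double hoursFixingAfterComplete(List<BugFix> fixes) {
            double total = 0;
            for (BugFix fix : fixes) {
                if (fix.foundAfterFeatureComplete) {
                    total += fix.hoursToFix;
                }
            }
            return total;
        }

        public static void main(String[] args) {
            List<BugFix> fixes = Arrays.asList(
                    new BugFix(2.5, true), new BugFix(4.0, false), new BugFix(1.5, true));
            System.out.println("Hours on post-completion defects: "
                    + hoursFixingAfterComplete(fixes)); // prints 4.0
        }
    }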