Manage and Mitigate Delivery Risk

Skills Covered

This guide teaches you how to take a more nuanced approach to mitigate and manage delivery risk at critical junctures.

Skills Checklist:

  • Provide a brief summary of how the Appian Delivery Governance Model mitigates risk.
  • Identify the three most important roles needed in establishing a Delivery Governance Model.
  • Name the types of application reviews conducted as part of the governance model.
  • Identify the Appian Delivery phase in which application reviews are conducted.

Dynamic Risk Mitigation

Some degree of risk is inevitable when managing cross-platform development teams or a large-scale deployment. This is especially true as business partners seek to innovate and develop new digital differentiators in a dynamic marketplace. As organizations increase velocity to keep up with new demands, teams traditionally charged with mitigating risk struggle to keep pace. As a result, many are now taking a significantly more nuanced approach to mitigating and managing delivery risk at critical junctures. This new approach can effectively match the risk appetite (or risk tolerance) of a given organization to an appropriate level of resource investment without slowing development teams.

Application delivery requires different types of governance attention across the application development lifecycle. Appian has condensed this lifecycle into four phases (the Appian Delivery Methodology) and mapped a series of mandatory, lightweight readiness checks and application reviews to each phase. These readiness checks and application reviews ensure that best practices and program standards are implemented across the critical Build and Release phases.

While this model can be implemented with slight variations based on organizational requirements and risk appetite, Appian strongly encourages the involvement of the following three roles in order to implement effective governance:

  • Program Owner 
  • Program Architect 
  • Team Lead(s)

Start with Peer Reviews

Implementing a peer review process ahead of more formal Application Reviews or other internal governance checkpoints is a useful way to ensure your developers are following intended practices. Whether it’s for code or database development, Appian customers using this practice report that it is a ‘great way to scale valuable governance resources’ and that ‘informal peer-to-peer reviews have accelerated the development of younger team members’. Peer reviews also work exceptionally well in federated governance models or in organizations that cannot dedicate full-time resources to a CoE.

Delivery Governance Model

The Application Delivery Governance Model provides customers with a lightweight, effective way to govern their Appian delivery teams. By implementing this model, Appian program owners can ensure that applications are 1) built according to best practices and 2) compliant with program-specific requirements and standards. Designed by Appian’s Customer Success Professionals, this model provides delivery teams with the standards, measurements, and process alignment needed to develop quality low-code applications quickly and efficiently.

Roles Involved

While this model can be implemented with slight variations based on organizational requirements and risk appetite, Appian strongly encourages the involvement of at minimum the following three roles in order to implement effective governance.

Program Owner

The Program Owner wears a few different hats, but is ultimately the individual responsible for making sure the governance model is followed. They are the ultimate approval authority for governance standards, ensure that project managers enforce those standards, and make the “go forward” decisions at readiness checks. They also provide delivery teams with contextual information and awareness of the program’s direction.

Program Architect

The Program Architect’s primary concern is executing the governance program through a series of lightweight reviews and readiness checks. They are responsible for ensuring the high quality of your applications.

To get the most value out of your Program Architect, Appian encourages this role to be a part of the planning efforts early in the project lifecycle. This visibility will not only help prepare them for readiness checks throughout, but their domain expertise will also help navigate unforeseen obstacles in planning sessions.

Team Lead(s)

Team Leads are the day-to-day coordinators for application development ceremonies and serve as the main point of contact for delivery teams. They ensure their teams follow governance guidelines and standards set by the Program Owner and facilitate discussions between teams where needed. They also attend all readiness checks and reviews for their teams.

Application Reviews

During each phase of the application development lifecycle, delivery teams engage in select reviews where an expert checks their progress against outlined best practices and program standards. These reviews are designed to be lightweight and favor discussion over documentation, although some reviews do require artifacts for facilitation.

Application reviews result in a prioritized list of recommended next steps categorized by risk profile (High, Medium, Low). If there are findings in the “High” category, the team should address the recommendations and perform the review again. Teams should still address “Medium” and “Low” findings, but they can do so in parallel as development continues.
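The triage rule above can be sketched as a small helper. This is a hypothetical illustration only: the data structures and field names are assumptions for the sketch, not part of any Appian tool or API; only the rule itself (High findings trigger a repeat review, Medium and Low proceed in parallel) comes from the governance model.

```python
# Hypothetical sketch of the review-findings triage described above.
# The risk levels and the "repeat review on High findings" rule come
# from the governance model; the data shapes are illustrative only.

RISK_LEVELS = ("High", "Medium", "Low")

def triage(findings):
    """Group findings by risk and decide whether the review must be repeated."""
    by_risk = {level: [] for level in RISK_LEVELS}
    for finding in findings:
        by_risk[finding["risk"]].append(finding["recommendation"])
    return {
        "repeat_review": bool(by_risk["High"]),        # High findings block advancement
        "address_now": by_risk["High"],                # fix before re-review
        "address_in_parallel": by_risk["Medium"] + by_risk["Low"],
    }

result = triage([
    {"risk": "High", "recommendation": "Remove query inside a looping function"},
    {"risk": "Low", "recommendation": "Rename objects to match naming standard"},
])
```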

Solution Architecture Review

Goal: Mitigate architectural risk early in the process by modeling and reviewing foundational design decisions.

What to bring: Completed Appian Architecture Document

When to review: End of Initiate Phase

Attendees: Team Lead, Project Manager, Program Architect

  • Non-functional requirements have been captured including: total users, peak concurrent users, and volume estimates for all key entities.
  • Platform Team has been notified of any significant changes in usage patterns that may require a resizing of the infrastructure.
  • Application design does not contain anti-patterns.
  • User provisioning design has been established.
  • Application security requirements have been collected, and the design accounts for role-based access to relevant data.
  • All external system integrations have been identified and key design decisions determined including: transport mechanism, authentication and logging.
  • Reporting requirements have been collected.
  • Application design adheres to organizational and platform-specific guidelines.

Development Review

Goal: Ensure the developed features are designed according to standards.

What to bring: Demonstration of completed features

When to review: As needed; recommended at least a few times per iteration

Attendees: Team Lead, Key Developers, Program Architect

  • Application objects adhere to Appian Development Standards and any platform-specific guidelines.
  • Key interfaces are performant: most interactions should be under 1 second.
  • Role-based security has been configured for the developed features.
  • Reusable component catalog is being used according to the platform-specific guidelines.
  • Team has performed database volume testing.
  • Indexes have been created for key entities according to the Indexing Guidance.

UX Review

Goal: Ensure interfaces are intuitive, adhere to best practices, and are consistent across the platform.

What to bring: Mockups or demonstration of developed UX

When to review: During Initiate Phase or during sprint 1 and again before the end of the Build Phase

Attendees: Team Lead, Key Developers, Program Architect

  • Interfaces are succinct and contain just what’s needed. Any redundant or obvious labels should be removed.
  • Key interfaces are performant: common interactions should be imperceptibly fast and most other interactions should be around 1 second.
  • User experience is intuitive and responses make sense. The outcome of an action should match the expected result.
  • User experience is consistent across the application and complies with the Appian UX Guide, organizational and any platform-specific UX guidelines.

Readiness Checks

When a team is ready to advance to the next phase of development, it engages in a readiness check. Here, program management reviews the current status of all findings and determines whether the team is ready to advance to the next lifecycle phase. If a team is not deemed ready to proceed, the Program Owner provides it with the steps needed to advance. Delivery teams continue this process until the application has been released.
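The gating behavior described above can be sketched as follows. This is a minimal illustration, not an Appian tool: the finding records and their fields are assumptions for the sketch; only the rule that unresolved high-risk findings block phase advancement comes from the governance model.

```python
# Hypothetical sketch of the readiness-check gate described above.
# Unresolved high-risk findings block advancement to the next phase;
# the data shapes and finding identifiers are illustrative only.

def readiness_check(open_findings):
    """Return whether the team may advance, plus any blocking findings."""
    blockers = [f["id"] for f in open_findings
                if f["risk"] == "High" and not f["resolved"]]
    return {"advance": not blockers, "blocking_findings": blockers}

decision = readiness_check([
    {"id": "ARCH-1", "risk": "High", "resolved": True},
    {"id": "UX-4", "risk": "Medium", "resolved": False},
])
```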

Health Check

Goal: Ensure developed features adhere to development standards via the Health Check tool.

What to bring: Health Check results from the development environment

When to review: Every iteration

Attendees: Team Lead, Program Architect

  • Health Check tool has been executed on the development environment for the current iteration.
  • All high risk findings are correctly prioritized.

Technical Readiness Check

Goal: Ensure the application is technically ready to release. Though the delivery team will have created a releasable application throughout the Build phase, this is an opportunity for a final check.

What to bring: Deployment Plan, Defect Log

When to review: During the Release Phase

Attendees: Team Lead, Key Developers

  • End users have performed hands-on testing of the application.
  • Critical feedback from user acceptance testing has been addressed, and any outstanding application and system defects have been reviewed by the Program Owner.
  • Testing included all relevant user groups (personas, geographic regions, etc.).
  • A comprehensive deployment plan has been established that includes timing, owners, tasks, and dependencies, and all necessary supporting parties are aware of their responsibilities (networking, DBAs, third-party systems integrations, etc.).
  • A complete application package has been created for the release, including database scripts.
  • Users have been provisioned in the appropriate authentication groups.
  • Performance testing has been conducted with production-like data and data volumes, based on the concurrent user activity expected, and the results have been verified.
  • Load tests approximate the anticipated load on the entire environment/site across all major applications and account for peak usage days/weeks/months.
  • Hardware sizing has been adjusted based on real and anticipated production data and usage observed during load testing.
