
State of Maine Office of Information Technology

 

 

 

Application Deployment Certification Handbook

(Guidelines and Procedures)


Date of Publication: March 14, 2011


 

REVISION HISTORY

 

Version: 1.0

Date: 3/14/2011

Author(s): Workgroup

Revision Notes: Initial Version

TABLE OF CONTENTS

 

1.    Preface

2.    Test Guidelines Per Deployment Type

3.    Roles and Responsibilities

4.    Approvals Per Deployment Types - Guidelines

5.    Process Flow

6.    Glossary

7.    Application SDLC Considerations

8.    Links to Relevant Policies

       Accessibility Policies & Standards

       Security Policies & Standards


1.     Preface

 

The purpose of the State of Maine Application Deployment Certification policy is stated as:

‘As applications have become more complex, more interconnected, and more exposed to the external world, it has become even more important to thoroughly vet them before they are deployed into production. This policy establishes a uniform and objective battery of tests that enables the CIO to evaluate the suitability of an application to be deployed into production.’

 

This document includes guidelines to assist in facilitating the testing required to meet the Application Deployment Certification policy.  Along with the Application Deployment Policy, the Project Managers / Product Managers responsible for application deployment certification tasks should use these as general guidelines for:

·         Determining, for each certification test type, what level of testing is appropriate for their particular application, dependent on deployment type; i.e. “right sizing” to adapt to circumstances while meeting the intent of the policy.

·         Roles and responsibilities: identifying who is involved in deployment certification activities.

·         Guidelines are provided for all types of applications (new applications, enhanced applications, COTS, and vendor hosted), and for four deployment types: New Development, Enhancement/Utility Release, Emergency, and Hosting Environment Change.  These expanded deployment type definitions are in the Glossary (Section 6).

·         Web service changes are to be tested following the same guidelines as any other application change.

·         Hosting Environment Changes, new applications, major enhancements, legacy systems undergoing certification for the first time, and minor upgrades may require variations to the testing approach. The Application Project / Product Managers may use discretion in making exceptions based on the guidelines and should document any variations in the Application Certification Test Summary found in the OIT Application Deployment Certification Signoff document.

·         Touch points throughout the SDLC for application development and enhancements.

 

The objective of these guidelines is to provide a path for a “generic” application. Testing should add value, reduce risk to the State, and generally apply common sense. Due diligence should apply to testing for any level of change, to ensure data integrity and security, application stability, and application availability. In all cases, the decisions and assumptions made in determining what testing applies should be documented.  The application owners are the decision makers and are ultimately accountable for ensuring the completeness and appropriateness of testing.


2.     Test Guidelines Per Deployment Type

 

As described in the Application Deployment Policy, the following tests must be conducted and passed for all deployments.  Rationale for any exceptions must be documented in the Application Deployment Test Summary document.  Refer to the policy for expanded definitions of each test type.

 

Exit criteria for all tests, required for successful or conditional completion:

1.    The business team is satisfied that the application meets expectations.

2.    Business management has approved any known/remaining issues.

3.    Test results are documented and stored in the agreed-upon repository.

 

Other Testing Guidance

1.    The test and production environments must be close enough in structure for the tests to be relevant.

2.    Testing should begin early in development, rather than only after the application is promoted to a test environment; if code changes are surfaced by the tests or test tools, the earlier they are found, the better.

3.    Security and performance tests must be performed after a demonstrated code freeze.

 

 


Legend for Tests per Deployment Type:

R = Required; E = Exceptions, i.e. variations (less stringent) for some requirements at the discretion of the Project Manager; N = Test execution not required.

For each test type below, the tests, exceptions/considerations, process, and entry/exit criteria are listed, along with the level of testing required for each of the four deployment types: New, Utility/Enhancement, Emergency, and Hosting Environment Change.

Use Case

Required per deployment type: New: R; Utility/Enhancement: R; Emergency: E; Hosting Environment Change: E

Tests:
1. Use cases and test scenarios targeting all new and/or changed functions (the vendor may provide these for a vendor application).

Exceptions/Considerations:
1. Emergency deployments: needed only if new functionality is introduced, and then only a simple list of tests/results is needed.
2. Hosting environment change: only a selected subset of use cases is needed.

Process (an illustrative sketch of recording use case test results follows this entry):
1. Identify and prepare test cases, scenarios, and tools.
2. Obtain business team input.
3. Define a test repository for cases and results.
4. Execute tests during the Acceptance Test Phase.

Entry Criteria: IT tests executed and passed.
Exit Criteria: See the standard criteria in this section’s introduction.
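
For illustration only (the handbook does not prescribe tooling for recording results), the following is a minimal Python sketch of logging use case test outcomes to a shared results file; the scenario IDs, descriptions, file name, and tester name are hypothetical:

    import csv
    from datetime import date

    # Hypothetical use case scenarios targeting new/changed functions.
    scenarios = [
        {"id": "UC-01", "description": "Submit new license application",
         "expected": "Confirmation number is displayed"},
        {"id": "UC-02", "description": "Renew existing license",
         "expected": "Renewal fee is calculated correctly"},
    ]

    def record_result(scenario, passed, tester, path="use_case_test_results.csv"):
        """Append one test result row to the agreed-on results repository."""
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow([date.today().isoformat(), scenario["id"],
                                    scenario["description"], scenario["expected"],
                                    "PASS" if passed else "FAIL", tester])

    # Example: the business tester records a result during the Acceptance Test Phase.
    record_result(scenarios[0], passed=True, tester="Business Tester")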

Accessibility

Required per deployment type: New: R; Utility/Enhancement: E; Emergency: E; Hosting Environment Change: N

Tests:
1. Test cases/scenarios targeting the areas on the Accessibility Test checklist.
2. Code test tools: AccVerify and Compliance Sheriff, automated tools that can verify most accessibility coding issues (an illustrative sketch of this kind of automated coding check follows this entry).
3. Flow and usability test tools: the screen reader tools JAWS and/or Window Eyes (a demo version of JAWS can be downloaded so the application developer can perform a preliminary test), with the final test performed by an OIT/SIG resource.

Exceptions/Considerations:
1. Utility/Enhancement: a simple list of tests/results is acceptable.
2. Emergency and Hosting Environment Change: testing is discretionary.
3. For releases other than new, Accessibility Tests may be operationalized, allowing for conditional approvals.  Operationalized means system-wide testing on a periodic basis, such as annually.
4. Considerations: execute tests as soon as the application is in a test environment, with a second round during the Acceptance Test Phase.

Process:
1. Identify and prepare test cases, scenarios, and tools.
2. Obtain business team input.
3. Define a test repository for cases and results.
4. Schedule with the State’s OIT and business testers two weeks before needed.
5. Execute tests during the Acceptance Test Phase.

Entry Criteria: Developer has executed and passed IT tests.
Exit Criteria:
1. See the standard criteria in this section’s introduction.
2. A waiver request is approved for any issues.
3. For any tests not passed, include a remediation plan with the certification request.
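
AccVerify, Compliance Sheriff, and the screen readers above are the tools named by the policy; purely as an illustration of the kind of coding check such tools automate, the following minimal Python sketch (standard library only; the sample HTML is hypothetical) flags images that lack alternative text:

    from html.parser import HTMLParser

    class AltTextChecker(HTMLParser):
        """Flags <img> tags without an alt attribute, one of the many
        coding issues automated accessibility tools can detect."""
        def __init__(self):
            super().__init__()
            self.issues = []

        def handle_starttag(self, tag, attrs):
            if tag == "img" and "alt" not in dict(attrs):
                self.issues.append("img tag missing alt attribute")

    checker = AltTextChecker()
    checker.feed('<html><body><img src="seal.png">'
                 '<img src="logo.png" alt="State seal"></body></html>')
    print(checker.issues)   # reports the first image, which has no alt text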

Regression

Required per deployment type: New: N; Utility/Enhancement: R; Emergency: E; Hosting Environment Change: E

Tests:
1. Existing-function use cases and test scenarios targeting a core suite of essential functions, generally selected using a risk-based approach (an illustrative sketch of such a core suite follows this entry).

Exceptions/Considerations:
1. Applies only to existing applications; new applications have nothing to regression test.
2. Emergency: the extent of tests can be scaled back at the Product Manager’s discretion.
3. Considerations: manual testing may be appropriate.
4. Care should be exercised here; it is precisely during rushed emergency deployments that testing should be reinforced and exercised. If not, significant harm could result.

Process:
1. Identify and prepare test cases, scenarios, and tools.
2. Obtain business team input.
3. Define a test repository for cases and results.
4. Execute tests during the Acceptance Test Phase.

Entry Criteria: Developer has executed and passed IT tests.
Exit Criteria: See the standard criteria in this section’s introduction.
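
As an illustration of a risk-selected core suite (not a State standard; the functions and test names below are hypothetical stand-ins for real application calls), a minimal Python sketch using the standard unittest module:

    import unittest

    # Stand-ins for real application functions; in practice these calls would
    # exercise the deployed application.
    def authenticate(user, password):
        return bool(user and password)

    def submit_payment(amount):
        return {"status": "accepted" if amount > 0 else "rejected"}

    # Core suite selected on a risk basis: only essential functions are re-tested.
    CRITICAL_TESTS = ["test_login", "test_submit_payment"]

    class CoreFunctions(unittest.TestCase):
        def test_login(self):
            self.assertTrue(authenticate("tester", "secret"))

        def test_submit_payment(self):
            self.assertEqual(submit_payment(100.00)["status"], "accepted")

    def core_suite():
        suite = unittest.TestSuite()
        for name in CRITICAL_TESTS:
            suite.addTest(CoreFunctions(name))
        return suite

    if __name__ == "__main__":
        unittest.TextTestRunner().run(core_suite())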

Data Conversion

Required per deployment type: New: E; Utility/Enhancement: R; Emergency: N; Hosting Environment Change: N

Tests:
1. Test scenarios for verifying accurate mapping of all identified business-critical data and functions.

Exceptions/Considerations:
1. Needed only when there are data conversion requirements.
2. Not needed for version upgrades, emergency production fixes, new applications where no legacy application exists, etc.
3. Not required for retro-certification of an existing production system.

Process (an illustrative sketch of comparing ‘pre’ and ‘post’ conversion extracts follows this entry):
1. Identify and prepare test cases, scenarios, and tools.
2. Prepare ‘Pre-Test’ data reports and screen shots.
3. Obtain business team input.
4. Define a test repository for cases and results.
5. Execute tests during the Acceptance Test Phase.
6. Prepare ‘Post-Test’ data reports and screen shots.
7. Validate that the ‘post’ results data is accurate.

Entry Criteria: IT tests executed and passed.
Exit Criteria: See the standard criteria in this section’s introduction.
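
As one hedged illustration of validating ‘pre’ and ‘post’ conversion results (the file names, key field, and critical fields are hypothetical; real conversions may be validated with reports and screen shots instead), a minimal Python sketch comparing two extracts:

    import csv

    def load_keyed_rows(path, key_field):
        """Load a 'pre' or 'post' conversion extract keyed by record identifier."""
        with open(path, newline="") as f:
            return {row[key_field]: row for row in csv.DictReader(f)}

    def compare_extracts(pre_path, post_path, key_field, critical_fields):
        """Report records dropped by the conversion and critical fields that changed."""
        pre = load_keyed_rows(pre_path, key_field)
        post = load_keyed_rows(post_path, key_field)
        missing = sorted(set(pre) - set(post))
        changed = [(key, field, pre[key][field], post[key][field])
                   for key in set(pre) & set(post)
                   for field in critical_fields
                   if pre[key][field] != post[key][field]]
        return missing, changed

    # Example usage (hypothetical files and fields):
    # missing, changed = compare_extracts("pre_test_extract.csv", "post_test_extract.csv",
    #                                     key_field="license_id",
    #                                     critical_fields=["name", "expiration_date"])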

Interfaces

Required per deployment type: New: R; Utility/Enhancement: R; Emergency: E; Hosting Environment Change: R

Tests:
1. Test cases/scenarios for verifying accurate import/export of business-critical interfacing data and functions.

Exceptions/Considerations:
1. Only acceptance testing is required for COTS, vendor-developed, and vendor-hosted applications.
2. Not required for retro-certification of an existing production system.

Process (an illustrative sketch of validating an interface extract follows this entry):
1. Identify and prepare test cases, scenarios, and tools.
2. Prepare ‘Pre-Test’ data reports and screen shots.
3. Obtain business team input.
4. Define a test repository for cases and results.
5. Execute tests during the Acceptance Test Phase.
6. Prepare ‘Post-Test’ data reports and screen shots.
7. Validate that the ‘post’ results data is accurate.

Entry Criteria: Developer has executed and passed IT tests.
Exit Criteria: See the standard criteria in this section’s introduction.
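
As a hedged illustration of checking a business-critical interface extract before sign-off (the column names and expected row count are hypothetical; the actual layout is defined by the interfacing systems), a minimal Python sketch:

    import csv

    REQUIRED_FIELDS = ["claim_id", "amount", "submitted_date"]   # hypothetical layout

    def validate_interface_file(path, expected_rows=None):
        """Check an export file for required columns, non-empty values, and
        (optionally) the row count agreed with the interfacing system."""
        problems = []
        with open(path, newline="") as f:
            reader = csv.DictReader(f)
            missing = [c for c in REQUIRED_FIELDS if c not in (reader.fieldnames or [])]
            if missing:
                return ["missing columns: " + ", ".join(missing)]
            rows = list(reader)
        for number, row in enumerate(rows, start=1):
            for field in REQUIRED_FIELDS:
                if not row[field].strip():
                    problems.append(f"row {number}: empty {field}")
        if expected_rows is not None and len(rows) != expected_rows:
            problems.append(f"row count {len(rows)} does not match expected {expected_rows}")
        return problems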

Performance

Required per deployment type: New: R; Utility/Enhancement: E; Emergency: E; Hosting Environment Change: E

Tests:
1. Test cases/scenarios for three tiers:
Tier 1: Individual system component response/turnaround times (screens, batch jobs, etc.).
Tier 2: System load tests appropriate for the typical user population.
Tier 3: Network profile efficiency.
2. Test cases/scenarios for stated business performance requirements.
3. Legacy Operating Platform Tests: differences between the pre-production and production environments, validating the high-availability, failover, and load balancing configurations.

Exceptions/Considerations:
1. Small and medium in-house developed: execute test cases for typical user behavior; exclude ‘simulated load and network profile’ tests.
2. Small vendor / COTS: execute test cases with typical user behavior; exclude ‘simulated load and network profile’ tests.
3. Emergency: test post-implementation, using the Small criteria above.
4. For releases other than new, Performance Tests may be operationalized, allowing for conditional approvals.  Operationalized means system-wide testing on a periodic basis, such as annually.
5. Considerations: use automated tools as much as possible for applicable release types and sizes.
- SoM can test up to 1,000 virtual users with the automated tool.
- The automated tool does not work on all technologies.
- Schedule manual monitoring for network components, databases, and servers.
6. The Performance Test must accommodate the Legacy Operating Platform Test: differences between the pre-production and production environments and, more specifically, validation of the high-availability, failover, and load balancing configurations, etc.

Process (an illustrative sketch of a simple timed-response/load check follows this entry):
1. Identify and prepare test cases, scenarios, and tools.
2. Obtain business team performance requirements as input.
3. Define test repositories for cases and results.
4. Execute tests in the test environment and/or prior to code freeze.
5. Execute platform tests again in the pre-production and production environments after code freeze for final results.

Entry Criteria: Platform tests are executed in the pre-production and production environments after code freeze for final results.
Exit Criteria: See the standard criteria in this section’s introduction.
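
The State’s automated load tool is the expected vehicle for Tier 2 and Tier 3 testing; purely as an illustration of a Tier 1 style timed-response check with a very small simulated load (standard library only; the URL is a placeholder and the user counts are far below what the automated tool can drive), a minimal Python sketch:

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor
    from statistics import mean

    def timed_request(url):
        """Response time for a single component/screen (Tier 1 style check)."""
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as response:
            response.read()
        return time.perf_counter() - start

    def small_load_test(url, virtual_users=10, requests_per_user=5):
        """A small simulated load; not a substitute for the full load test."""
        with ThreadPoolExecutor(max_workers=virtual_users) as pool:
            times = list(pool.map(timed_request,
                                  [url] * (virtual_users * requests_per_user)))
        return {"requests": len(times), "avg_sec": mean(times), "max_sec": max(times)}

    # Example usage against a test environment (placeholder URL):
    # print(small_load_test("https://test.example.maine.gov/app/home"))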

Security

Required per deployment type: New: R; Utility/Enhancement: E; Emergency: E; Hosting Environment Change: E

Tests:
1. Test scenarios for the areas in the State security policies (see the link in Section 8); generally: login/authentication procedures, database security, and the passing of parameters and data through the layers of the application.

Exceptions/Considerations:
1. Emergency: ?
2. Note: SoM tools work only on web applications and web services.
3. Use automated test tools for all tests where possible.
4. For releases other than new, Security Tests may be operationalized, allowing for conditional approvals.  Operationalized means system-wide testing on a periodic basis, such as annually.

Process (an illustrative sketch of a basic authentication check follows this entry):
1. Obtain the URL and credentials for a test environment.
2. Identify and prepare test cases, scenarios, and tools.
3. Obtain application/SIG team input.
4. For in-house applications, start tests when the logical architecture and authentication method are adopted.
5. Execute tests in the test environment and/or prior to code freeze.

Entry Criteria: Platform tests are executed in the pre-production and production environments after code freeze for final results.
Exit Criteria:
1. See the standard criteria in this section’s introduction.
2. Waiver approved by the CIO for any issues.
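
The State’s security test tools cover web applications and web services as noted above; as one hedged illustration of a basic authentication check a team might script in addition (standard library only; the protected URL is a placeholder), a minimal Python sketch verifying that a protected page cannot be fetched without credentials:

    import urllib.error
    import urllib.request

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        """Do not follow redirects, so a redirect-to-login can be observed."""
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None

    def requires_login(protected_url):
        """A protected page fetched without credentials should not return 200;
        it should redirect to a login page or return 401/403."""
        opener = urllib.request.build_opener(NoRedirect)
        try:
            response = opener.open(protected_url, timeout=30)
            return False, f"unauthenticated request returned HTTP {response.getcode()}"
        except urllib.error.HTTPError as error:
            if error.code in (301, 302, 303, 307, 308, 401, 403):
                return True, f"blocked or redirected with HTTP {error.code}"
            return False, f"unexpected HTTP {error.code}"

    # Example usage (placeholder URL):
    # print(requires_login("https://test.example.maine.gov/app/admin"))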

Restoration

Required per deployment type: New: R; Utility/Enhancement: E; Emergency: E; Hosting Environment Change: R

Tests:
1. Use cases and test scenarios: select the full suite or the business-critical suite.
2. For new applications, this test is performed on the production infrastructure but prior to opening it up to end users (Go-Live).

Exceptions/Considerations:
1. Utility/Enhancement and Emergency: not needed if a version upgrade only.

Process (an illustrative sketch of verifying a restore follows this entry):
1. Identify and prepare test cases, scenarios, and tools.
2. Obtain business team input.
3. Define a test repository for cases and results.
4. Execute tests during the IT and Acceptance Test Phases.

Entry Criteria: Developer has executed and passed IT tests.
Exit Criteria: See the standard criteria in this section’s introduction.
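
As a hedged illustration of confirming that business-critical data survives a backup/restore cycle intact (the export file paths are placeholders; the actual restoration test exercises the full or business-critical suite above), a minimal Python sketch comparing checksums of a data export taken before backup and after restore:

    import hashlib

    def file_checksum(path, algorithm="sha256"):
        """Checksum of an exported data file, computed in chunks."""
        digest = hashlib.new(algorithm)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def restore_verified(pre_backup_export, post_restore_export):
        """True when the export matches before backup and after restore."""
        return file_checksum(pre_backup_export) == file_checksum(post_restore_export)

    # Example usage (placeholder paths):
    # print(restore_verified("critical_data_before.csv", "critical_data_after.csv"))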

 


3.     Roles and Responsibilities

 

Notes:

1.   The Project Manager will coordinate all deployment certification activities within the project plan and manage the team to meet the project plan/schedule.

2.   In the absence of a Product Manager, the Application IT Manager will assume this role.

3.   One person may be assigned multiple roles.  In fact, often on smaller systems, one person is the Product Manager, Application Technical Lead, and Project Manager.

4.   The role(s) described is responsible for the actions; however, the responsibility may be delegated.  It is recognized that application support teams may delegate this function to a single individual for multiple application releases.

5.   In the listing below, the role(s) responsible for completing each activity are shown after “Responsible”, and the test types the activity must be performed for are shown after “Applicable tests”.


 

Deployment Certification Activities

Roles: Product Manager (Prod. Mgr), Project Manager (Proj. Mgr), Business Tester (Bus. Tester), OIT Tester, Application Development Team (App Dev Team), Business Manager (Bus. Mgr), CIO / Associate CIO / Application Director, SIG, Application Hosting (App Hosting).

Application tests: Use Case, Accessibility, Regression, Data Conversions, Interfaces, Security, Performance, Restoration.

Lead, Coordination, Oversight - Activities (Responsible: Proj. Mgr):

·         Review the Application Certification Guidelines and define, for the specific project, the scope of tests to be done (which tests, and to what extent) and when the tests need to be done. Responsible: Prod. Mgr.

·         Determine who will be designated the ultimate approver(s) of the deployment certification (see Section 4 for guidance). Responsible: CIO / Associate CIO / Application Director.

·         Identify a key end user to assist with testing (business tester). Responsible: Prod. Mgr. Applicable tests: Use Case, Accessibility, Regression, Data Conversions, Interfaces, Restoration.

·         Obtain a key vendor representative to assist the business tester for COTS / vendor-hosted products. Responsible: Prod. Mgr. Applicable tests: Accessibility, Security, Performance, Restoration.

·         Schedule test timelines with testers (allow 1-2 weeks minimum for scheduling the accessibility tester and security tester). Responsible: Proj. Mgr. Applicable tests: all.

·         Provide, or approve if one exists, a tracking tool/repository for test results. Responsible: Prod. Mgr, Bus. Mgr. Applicable tests: all.

·         Facilitate remediation and request re-tests as needed. Responsible: Proj. Mgr, Bus. Mgr. Applicable tests: all.

·         Request confirmation that all issues have been remediated and/or are acceptable to the business. Responsible: Proj. Mgr. Applicable tests: all.

·         Communicate requirements to interfacing Product/Project Managers and to App Hosting. Responsible: Proj. Mgr. Applicable tests: Restoration.

·         Store the email that the application has passed in the Application Deployment Certification document, OR initiate a waiver request with the results. Responsible: Proj. Mgr. Applicable tests: Accessibility.

Test Preparation Activities (Responsible: Bus. Tester, OIT Tester; Applicable tests: all):

·         Produce, or select from an existing suite, use cases or business work flows that exercise major parts of the application (request vendor input as needed). Responsible: Bus. Tester, App Dev Team. Applicable tests: Use Case, Accessibility, Regression, Performance.

·         Produce, or select from an existing test suite, appropriate manual or automated test cases. Responsible: App Dev Team, Bus. Mgr, SIG. Applicable tests: Security, Performance.

·         Produce data comparisons and work flow to identify critical data. Responsible: App Dev Team. Applicable tests: Data Conversions.

·         Prepare for conversion results validation: identify and create ‘pre & post’ test reports and/or screen shots. Responsible: Bus. Tester, App Dev Team, Bus. Mgr. Applicable tests: Data Conversions.

·         Define, set up, and coordinate test preparation with interfacing development resources and with App Hosting. Responsible: App Dev Team. Applicable tests: Interfaces, Restoration.

Test Execution Activities:

·         Set up tests; coordinate test execution with interfacing development resources and with App Hosting. Responsible: App Dev Team. Applicable tests: Interfaces, Performance, Restoration.

·         Execute recovery tests (e.g. execute backups/restores). Responsible: App Hosting. Applicable tests: Restoration.

·         Execute use case tests. Responsible: Bus. Tester. Applicable tests: Use Case, Regression.

·         Execute tests in application test environments. Responsible: App Dev Team, SIG. Applicable tests: Security.

·         Execute tests in the pre-production environment. Responsible: App Dev Team. Applicable tests: Performance.

·         Apply and monitor performance monitoring tools in the staging/acceptance environment and provide their output. Responsible: App Hosting. Applicable tests: Performance.

·         Run test conversions. Responsible: Bus. Tester, App Dev Team. Applicable tests: Data Conversions.

·         Navigate use cases through the application; run the screen reader (estimate 2 hours per application). Responsible: Bus. Tester, OIT Tester. Applicable tests: Accessibility.

Document Test Results Activities:

·         Review and document conversion results data (reference pre & post reports or screens). Responsible: Bus. Tester. Applicable tests: Data Conversions.

·         Observe tests; document results; scribe issues identified by the OIT tester. Responsible: App Dev Team. Applicable tests: Accessibility.

·         Obtain confirmation of issues from the OIT tester; get agreement on must-have versus nice-to-have. Responsible: OIT Tester, App Dev Team. Applicable tests: Accessibility.

·         Obtain documented use case test results. Responsible: App Dev Team. Applicable tests: Use Case, Regression.

·         Review results in the application for accuracy. Responsible: Bus. Tester. Applicable tests: Interfaces, Restoration.

·         Obtain documented results. Responsible: App Dev Team, SIG. Applicable tests: Security, Performance.

Approvals: ready for application certification signoff (OK test status including issues / or waivers) (Responsible: Proj. Mgr):

·         Sign off to indicate acceptance of any remaining issues. Responsible: Prod. Mgr, Proj. Mgr, Bus. Mgr. Applicable tests: all.

·         Support that deployment certification testing was successfully completed; submit for formal application certification approval. Responsible: Proj. Mgr. Applicable tests: all.
 


 

4.     Approvals Per Deployment Types - Guidelines

 

Notes:

1.   CIO/Associate CIO Applications approval is required across all deployment types where the deployment: a. affects the entire enterprise

b. introduces new technologies

c. presents high security risks

d. requires major re-engineering

e. has high public visibility

The approval will be made based upon advice from the PMO Director, the Application Director of record, the CTO and other subject matter experts.

2.   Where the deployment does not meet the criteria above requiring the CIO/Associate CIO approval, the deployment certification approval will be delegated to the Application Director or in some cases to the IT Manager.

3.   The CIO/Associate CIO Applications may request retaining signature approval for any deployment at their discretion.

 

 

5.     Process Flow

New Systems and Enhancements/Utility Releases:

1.            Development effort commences.

2.            The Product Manager and Project Manager meet to decide the applicable tests.  The Product Manager makes the call; the IT Manager / App Director are consulted and may direct decisions on tests.

3.            The software release approver is decided by the App Director, or directed by the CIO / Associate CIO Applications.

4.            The Project Manager ensures tests are included in the project plan, including multiple iterations.

5.            Tests are performed by the assigned parties, with multiple iterations as necessary.

6.            Test results and any recommendations are fed back to the Project Manager for compilation in the sign-off document.

7.            The Project Manager, Product Manager, and Business Sponsor provide their recommendation on the final approval.

a.    When CIO or Associate CIO Applications approval is necessary:

                                          i.    The Application Director must approve prior to reaching the CIO or Associate CIO.

                                         ii.    The CIO and Associate CIO Applications will seek advice from the PMO Director, the Application Director of record, the CTO, and other subject matter experts.

8.            Approval happens – this is the Go/No Go decision for the release.  In the event tests have conditional approvals, remediation plans will be required.

a.    Security – needs a waiver as well as a remediation plan, reviewed by the Security Officer and Application Director and approved by the CIO.

b.    Accessibility – needs a waiver as well as a remediation plan, reviewed by the Accessibility Coordinator and Application Director and approved by the CIO.

9.            An RFC is put into place to document the change. The CAB ensures there are no conflicts and that the back-out plan is adequate.

10.          The change happens.

11.          The Application Inventory is updated as appropriate.

 

Emergency Changes:

1.            Effort commences.

2.            The Product Manager decides the applicable tests; the IT Manager / App Director are consulted and may direct decisions on tests.

3.            The approver is decided by the App Director, or directed by the CIO / Associate CIO Applications.

4.            Test results and any recommendations are fed back to the Product Manager for compilation in the sign-off document.

5.            The Product Manager provides a recommendation on the final approval.

6.            Approval happens – this is the Go/No Go decision for the release.  In the event tests have conditional approvals, remediation plans will be required.

7.            An RFC is put into place to document the change. The CAB ensures there are no conflicts and that the back-out plan is adequate.

8.            The change happens.

9.            The Application Inventory is updated as appropriate.

 

Hosting Environment Changes: 

1.            Effort commences.

2.            The Product Manager decides the applicable tests; the IT Manager / App Director are consulted and may direct decisions on tests.

3.            The approver is decided by the App Director, or directed by the CIO / Associate CIO Applications.

4.            Test results and any recommendations are fed back to the Product Manager for compilation in the sign-off document.

5.            The Product Manager provides a recommendation on the final approval.

6.            Approval happens – this is the Go/No Go decision for the release.  In the event tests have conditional approvals, remediation plans will be required.

7.            An RFC is put into place (by the Application Hosting Team) to document the change. The CAB ensures there are no conflicts and that the back-out plan is adequate.

8.            The change happens.

9.            The Application Inventory is updated as appropriate.

 

 

6.     Glossary

 

1.   Deployment Type Criteria:

 

New: All new development.

Utility Release / Enhancements: All change/enhancement releases of one or more, related or unrelated, changes/enhancements of any size.

Emergency: Urgent, rapid development and deployment.

Hosting Environment Change: A change in the underlying platform, such as server and OS changes, but not hosting location changes.

 

2.   Test Type Terminology:

 

Tests: The types of test cases / scenarios to be performed.

Exceptions/Considerations: Variations in test requirements, by deployment type, that may be discretionary.

Entry Criteria: What must be done or in place before the testing process steps can begin.

Exit Criteria: Description of what comprises test completion.

 

3.    Other Definitions:

 

SDLC: System Development Life Cycle

PM: Project Manager

PdM: Product Manager

RFC: Request for Change

Code Freeze: (from Wikipedia) ‘… point in time in the development process after which the rules for making changes to the source code or related resources become more strict, or the period during which those rules are applied.’ The specific rules may vary by project and by the testing/release approach. It is suggested that the PM define the rules for the project, with the team, at project initiation.

 

 

 

7.      Application SDLC Considerations

 

The items in this section are meant as guidelines (for those performing project management) for inclusion in the project plan for the development effort, whether it is new software, an enhancement, or a utility release.

 

Considerations include:

1. Touch points with application hosting.

2. RFC

3. Application Inventory Update

 

Tasks typically part of Application Deployment Certification are ‘bolded’ in the table below.

 

Type: Define Project Code Freeze Specifics
Format: Collaborative, informal or formal meeting
Who: PM; Application Project Team
When: At project initiation
Intent: Establish a consistent definition and rules for ‘code freeze’ for this project, e.g. at what phase the freeze occurs and the nature of the freeze (allowing bug fixes, or strict with no changes [may apply when releasing in iterations]), etc.
Lead / turnaround time for response from others: NA

Type: Design Input
Format: Collaborative, informal or formal meeting
Who: App Technical Lead; App Hosting Leads or technical staff as deemed appropriate by the App Hosting leads
When: As close to project inception as possible; at least once before coding begins; ongoing as needed
Intent: Review the application’s logical architecture design with infrastructure resources; seek advice if the physical infrastructure presents a perceived barrier or a “fork in the design road” is encountered
Lead / turnaround time for response from others: TBD

Type: Design Finalization
Format: Formal, “quasi-sign-off” event
Who: App Technical Lead; App Hosting Leads
When: After the design phase, when the desired architecture is defined and documented and server needs are identified
Intent: A collective review (not just App Hosting reviewing the application team’s proposal) based on the collaborative interactions during design development; final definition of scope, which frames the certification requirements; defines anticipated timelines and the level of effort from the hosting team
Lead / turnaround time for response from others: TBD

Type: Project Meetings
Format: Electronic via Outlook
Who: Initiated by the App Technical Lead or Project Manager
When: Regularly scheduled occurrences during the project/release timeframe
Intent: Keep all parties (application and hosting teams) on the same page of the project; ensure checkpoints occur to evaluate new information as it arises
Lead / turnaround time for response from others: NA

Type: Scheduled Test Requests
Format: Electronic via Footprints
Who: Initiated by the App Lead; the specific App Hosting or App Dev teams involved in the execution of the required tests
When: Throughout the testing phase; possibly during the development phase if there is a need to validate a new technical methodology
Intent: Engage the technical staff that administer tests
Lead / turnaround time for response from others: TBD

Type: Testing Feedback
Format: Electronic
Who: Documented test results delivered to the App Lead by the teams involved in testing
When: Upon completion of specific tests
Intent: Documentation of test results
Lead / turnaround time for response from others: TBD

Type: RFC
Format: Electronically submitted via Footprints
Who: Initiated by the App Lead; reviewed by the CAB
When: Prior to the CAB meeting in advance of the desired deployment date
Intent: Ensure impacted teams are aware of deployment activities, downtime, backup plan, and risks; schedule all teams with a hands-on role in the deployment
Lead / turnaround time for response from others: TBD

Type: Update application inventory as applicable
Who: App Lead

Notes: These guidelines apply regardless of the Application Hosting Team, whether Core Technologies or remote hosted.  The Application Hosting Team covers the infrastructure, including servers, instances/databases, and network.

 

8.     Links to Relevant Policies

 

Application Deployment Policy

Infrastructure Deployment Certification Policy

Remote Hosting Policy

Waiver Policy

Accessibility Policies & Standards:

OIT - IT Policies, Standards, and Procedures

 

Security Policies & Standards:

OIT - IT Policies, Standards, and Procedures