4.1 Practice Summary, Improvement Evaluation and Result Communication

Current and Desired State Summary

Users of the guidebook will establish the current and desired state of practice for each assessed Area, Section, and Element of the framework.  This will provide a clear picture of where there are gaps in current practice, exposing opportunities for potential improvement.

While the element-level response templates provided in the print guidebook can be used to complete a pen-and-paper assessment, the TAM Data Assistant will greatly simplify summarizing and communicating the assessment results.

Visual summary and presentation of current and desired practice benchmarking is the most effective means of communicating assessment outcomes.  “Spider web” or “radar” charts are best suited to this communication (an example is provided in Figure 4-1). Because of the number of individual Elements, a separate summary chart should be developed for each assessed Area within the guidance framework.

These charts provide a compelling, visual representation of where current performance is high or low, as well as where there are gaps between current and desired performance.  They clearly identify priorities for advancement and support improvement evaluation and prioritization.
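The TAM Data Assistant generates these charts automatically; for agencies that want to reproduce them (for example, from the Excel export described under Detailed Analysis below), a minimal sketch using Python’s matplotlib follows. The element names and benchmark levels shown are hypothetical placeholders, not framework content.

    # Minimal "radar" chart sketch for current vs. desired practice levels.
    # Element names and levels below are hypothetical placeholders.
    import numpy as np
    import matplotlib.pyplot as plt

    elements = ["Governance", "Metadata", "Treatment Data", "Work Data"]
    current = [1, 1, 3, 3]   # benchmark levels 0-4
    desired = [3, 3, 4, 4]

    # One angle per element; repeat the first point to close each polygon.
    angles = np.linspace(0, 2 * np.pi, len(elements), endpoint=False).tolist()
    angles += angles[:1]
    cur = current + current[:1]
    des = desired + desired[:1]

    ax = plt.subplot(polar=True)
    ax.plot(angles, cur, color="blue", label="Current")
    ax.fill(angles, cur, color="blue", alpha=0.25)
    ax.plot(angles, des, color="green", label="Desired")
    ax.fill(angles, des, color="green", alpha=0.15)
    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(elements)
    ax.set_ylim(0, 4)  # benchmark levels 0 through 4
    ax.legend(loc="upper right")
    plt.show()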

TAM Data Assistant

The TAM Data Assistant simplifies the summary of assessment outcomes by automatically generating these charts from the detailed assessment data.

Additional Recommendations

Summary and review of assessment results can generate new insights from the assessment team and allow for broader engagement beyond those involved in the initial assessment process. 

Use the assessment summary materials to iteratively refine the assessment details and generate more meaningful assessment results and improvement priorities.

Figure 4-1: TAM Data Assistant Assessment Summary Example.

Explanation of Recommended Summary Charting

The figure above exemplifies the recommended approach to visualizing the current and desired state captured through the assessment process.  Highlighted are four key elements of this visualization:

  1. The “spider web” or “radar” chart itself, including each assessed Element within the Area, organized by Section, and representing each possible level of performance (from benchmark level 0 to benchmark level 4).
  2. The current performance, highlighted in blue. This is provided for each assessed Element within the targeted Area.
  3. The desired performance, highlighted in green. This is provided for each assessed Element within the targeted Area.
  4. The element identifier and name for each assessment Element represented in the summary chart.

Use of Recommended Summary Charting

Identification of Low- and High-Performing Sections and Elements: In the example above, Governance and Metadata practices are easily identified as low performing, whereas Treatment and Work Data Standards are relatively high performing.

Low-performing areas may become obstacles to ongoing advancement and may need to be prioritized for improvement, even if these capabilities are not specifically an area of focus for the agency.  In the example above, without advancing Governance and Metadata capabilities, the ability to efficiently and effectively collect, integrate, or analyze TAM data may be compromised as business needs and practices change, due to a lack of understanding of and compliance with data standards.

Gaps in Current and Desired Performance: In the example above, all assessed Elements had a gap between current and desired performance; however, certain Elements had larger gaps than others.  Governance Elements were typically two levels below the desired state and will require significant investment, potentially facing substantial institutional hurdles and organizational challenges to implement.

Based on this summary, a long-term governance implementation initiative should be considered. Communication to decision-makers should highlight the significant gap between current practices and the desired state, as well as the value and benefits of investment in advancing governance practice.

Detailed Analysis

Detailed assessment data can be exported from the TAM Data Assistant to an Excel spreadsheet.

The export file can be used to readily list, filter, sort, and apply calculations that may be helpful in communicating current practice, the desired state, or practice gaps.  The user can also readily create a “radar” or “spider web” chart from the export file (though the tool does this automatically for each framework Area).
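As a minimal sketch, the export can be explored with Python’s pandas library. The file name, sheet name, and column names (“Area”, “Section”, “Current Level”) are assumptions; match them to the actual export layout.

    # Load the assessment export and review one Area; file, sheet, and
    # column names are assumptions about the export layout.
    import pandas as pd

    assessment = pd.read_excel("tam_export.xlsx", sheet_name="Assessment")

    # List the elements of a single Area, lowest current practice first.
    one_area = assessment[assessment["Area"] == "Data Management"]
    print(one_area.sort_values("Current Level"))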

The assessment information can also be combined with detailed improvement evaluation outcomes (also included within the export file) to relate current and desired practice to individual improvement opportunities (as is discussed further in the Improvement Evaluation section of this Chapter). 

TAM Data Assistant Quick Reference Guide

For more detailed information on the tool’s functions and use, see the TAM Data Assistant Quick Reference Guide.

Improvement Evaluation

After candidate improvements are identified, the next step is to evaluate them, understand effort versus likely payoff, and anticipate implementation challenges. This evaluation step is important for setting priorities and developing a comprehensive improvement strategy.

The TAM Data Assistant allows you to sort, filter, and review a list of the improvements identified during the assessment process. Through this interface, you can track evaluation results based on the criteria described below.

Each improvement should be evaluated in the context of the other selected improvements, allowing the relative impact, effort, and priority of each improvement to be established (as High, Medium, or Low) with respect to the other identified options. Improvement-specific challenges can also be identified for consideration during strategy development.

Impact is characterized by the extent to which new or improved practices will transform TAM-related business practices.

Effort is characterized by the level of resources and staff time required and the extent to which those can be incorporated into the responsibilities and budgets of existing business units.

Priority is established on the basis of when that improvement would be targeted for implementation, ranging from immediate action to being recognized for future, unplanned action.

Challenges can be categorized as Time, Resource, Expertise, Coordination, Change, or Other.
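For teams scripting their own analysis, the evaluation record can be represented as a simple data structure. The following sketch is illustrative only; the class and field names are hypothetical and are not part of the TAM Data Assistant.

    # Hypothetical representation of one improvement evaluation record.
    from dataclasses import dataclass, field
    from enum import Enum

    class Rating(Enum):
        LOW = "Low"
        MEDIUM = "Medium"
        HIGH = "High"

    # Challenge categories named in the guidebook.
    CHALLENGE_CATEGORIES = {"Time", "Resource", "Expertise",
                            "Coordination", "Change", "Other"}

    @dataclass
    class ImprovementEvaluation:
        element_id: str
        impact: Rating
        effort: Rating
        priority: Rating
        challenges: set = field(default_factory=set)  # subset of CHALLENGE_CATEGORIES
        notes: str = ""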

Conceptual examples illustrating the application of these evaluation factors are provided on the following pages.

TAM Data Assistant Uses

The TAM Data Assistant provides functionality for recording ratings of impact, effort, priority, and challenges for each selected candidate improvement.

Additional Recommendations

An iterative approach to improvement evaluation is recommended.  To the extent practical, this process should also involve external stakeholders and external planning processes.  For example, the goals and objectives stated in agency strategic plans should be incorporated into prioritization of improvement action. 

The availability, workload, and resources of impacted business units should also be considered, as well as the engagement and enthusiasm for change found in potential project sponsors and business leads.  Without stakeholder engagement, it is unlikely that a data or information system improvement will be successfully and sustainably implemented within routine business.

Improvement Evaluation Tools

The figure below demonstrates the TAM Data Assistant functionality supporting improvement evaluation.  Highlighted are five key aspects of this interface:

  1. Sort and Display Functionality – organize the improvements identified during the self-assessment process.
  2. Filter Functionality – apply criteria to filter the improvements based on Area, Challenge, Priority, Effort, Impact, and other factors.
  3. Individual Improvement Details – see details for each selected improvement.
  4. Evaluation Criteria – establish the improvement’s impact vs. effort, priority, and associated challenges.
  5. Assessment Information – review the current and desired state of the associated Element, with a link to quickly return to, and adjust, the associated assessment information.

Figure 4-2: TAM Data Assistant Use to Evaluate Selected Improvements.

Detailed Analysis

The detailed improvement evaluation results are also included in the TAM Data Assistant export file, with the following fields provided for each improvement:

  • Priority: The Low, Medium, or High priority value assigned to the improvement.
  • Impact: The Low, Medium, or High impact value assigned to the improvement.
  • Effort: The Low, Medium, or High effort value assigned to the improvement.
  • Time Challenge: An indicator of whether a time challenge was identified for the improvement (0 = no challenge was identified, 1 = a challenge was identified).
  • Resource Challenge: An indicator of whether a resource challenge was identified for the improvement (0 = no challenge was identified, 1 = a challenge was identified).
  • Expertise Challenge: An indicator of whether an expertise challenge was identified for the improvement (0 = no challenge was identified, 1 = a challenge was identified).
  • Coordination Challenge: An indicator of whether a coordination challenge was identified for the improvement (0 = no challenge was identified, 1 = a challenge was identified).
  • Change Challenge: An indicator of whether a change challenge was identified for the improvement (0 = no challenge was identified, 1 = a challenge was identified).
  • Other Challenge: An indicator of whether another type of challenge was identified for the improvement (0 = no challenge was identified, 1 = a challenge was identified).
  • Status: An indicator of whether or not the improvement was selected for implementation.
  • Evaluation Notes: Improvement notes captured during the self-assessment activity.

The export file can be used to readily list, filter, sort, and apply calculations that may be helpful in communicating the priorities for improvement.  By joining these results with the detailed assessment information, the user can further refine the priorities for improvement.
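For example, the 0/1 challenge indicator fields lend themselves to simple calculations, such as totaling the challenges identified for each improvement. A sketch, again assuming file, sheet, and column names that match the field list above:

    # Total the 0/1 challenge indicators to flag heavily constrained improvements.
    import pandas as pd

    improvements = pd.read_excel("tam_export.xlsx", sheet_name="Improvements")
    challenge_cols = ["Time Challenge", "Resource Challenge", "Expertise Challenge",
                      "Coordination Challenge", "Change Challenge", "Other Challenge"]
    improvements["Challenge Count"] = improvements[challenge_cols].sum(axis=1)
    print(improvements.sort_values("Challenge Count", ascending=False).head())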

TAM Data Assistant Quick Reference Guide

For more detailed information on the TAM Data Assistant’s functions and use, see the TAM Data Assistant Quick Reference Guide.

Conceptual Examples
Detailed Result Evaluation
High Impact, Low Effort Improvements

Filter for High Impact, Low Effort improvements.  Consider improvement opportunities that deliver significant value without substantial effort.  Where practical for immediate investment, communicate these “low-hanging fruit” to decision-makers as easy wins.
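A sketch of this filter in pandas, with file, sheet, and column names assumed as in the earlier sketches (“Element ID” is assumed to be present in the export):

    # Filter for "low-hanging fruit": High Impact, Low Effort improvements.
    import pandas as pd

    improvements = pd.read_excel("tam_export.xlsx", sheet_name="Improvements")
    quick_wins = improvements[(improvements["Impact"] == "High") &
                              (improvements["Effort"] == "Low")]
    print(quick_wins[["Element ID", "Priority", "Evaluation Notes"]])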

Combine Assessment and Improvement Information

Combine assessment and improvement information using the Element ID field.  Use this to improve communication of improvement priorities by relating them to current and desired performance.
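A sketch of the join in pandas, assuming the assessment and improvement details are exported as separate sheets that share an “Element ID” field:

    # Join improvement evaluations to the assessment detail on Element ID.
    import pandas as pd

    assessment = pd.read_excel("tam_export.xlsx", sheet_name="Assessment")
    improvements = pd.read_excel("tam_export.xlsx", sheet_name="Improvements")
    combined = improvements.merge(assessment, on="Element ID", how="left")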

Improvement of Low Performing Elements

Use the combined assessment and improvement information to sort the improvement list for low-performing elements (“increasing” by current assessment level).  Identify improvements to the lowest-performing elements.

A low-performing element may not always stand on its own as a priority of the organization, but also consider the interrelated nature of performance within the framework.  Lagging performance in one aspect can impact the ability to be successful in other areas.
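Continuing the combined table from the join sketch above, this sort might look like:

    # Surface improvements tied to the lowest-performing elements first.
    lowest_first = combined.sort_values("Current Level")
    print(lowest_first[["Element ID", "Current Level", "Desired Level", "Priority"]].head(10))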

Improvement of Elements with Large Performance Gaps

Use the combined assessment and improvement information to sort the improvement list based on performance gaps (“decreasing” by the difference between desired and current level).  Identify improvements to the elements with the largest performance gaps.

Consider whether initial improvements in these areas should be prioritized, given that multiple improvements over an extended period of time will likely need to be implemented.
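Continuing the same combined table, the gap calculation and sort might look like:

    # Rank improvements by performance gap (desired minus current), largest first.
    combined["Gap"] = combined["Desired Level"] - combined["Current Level"]
    print(combined.sort_values("Gap", ascending=False).head(10))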

Conceptual Examples
Challenge Categorization
Time

Recommended when the time available is limited relative to the extent of the effort.

Resources

Recommended when the level of resources or staff time needed would require executive approval.

Expertise

Recommended when the expertise required is not available to the DOT without specialized support.

Coordination

Recommended when engagement and agreement is required across many different areas of business within the DOT, particularly when many of the impacted business units do not typically work together as part of the routine business of the agency.

Change

Recommended when the improvement will significantly transform current business across multiple business units and processes, requiring extensive process reengineering and/or training to those impacted.

Conceptual Examples
Impact Evaluation
High Impact

Transforms current business in a way that addresses major process pain points, is likely to extend to multiple business units, and adds value to multiple business processes.

Medium Impact

Makes existing business processes significantly more efficient and effective, but may be limited to a specific area of business (e.g., a specific business function or process area).

Low Impact

Contributes a minor adjustment to an existing business process, but will not significantly change the business. In general, these improvements may already be informally in place and are simply being formalized or made clearer in the context of the program at large.

Effort Evaluation
High Effort

Requires a major commitment of resources and staff time, typically across multiple business units. Examples would include a major IT application, a statewide technology deployment, etc.

Medium Effort

May be incorporated within typical budgets and resources but would require planning and coordination, typically limited to a specific business function or process area.

Low Effort

Can be included within the routine responsibilities of a business unit or working group, and can typically be completed within a short timeframe.

Priority Evaluation
High Priority

Targeted for immediate action.

Medium Priority

Desired to begin within the next several investment or planning cycles (e.g., 1–2 years).

Low Priority

Recognized, but not anticipated for action within the near future and unlikely to be incorporated into near term planning activities.

Executive Communication

Clear, concise communication of current practices, the desired state, key performance gaps, and priority improvements is essential to securing support for implementation.

The assessment facilitator, project sponsor, and other key team members should be involved in development of executive communication materials.

“Radar” charts, individual improvement evaluation entries, and summary improvement “impact vs. effort” charting can be used directly in decision-maker communication.

These pre-developed communication materials should be used selectively within separately developed executive briefing and summary materials designed to speak to the specific needs and interests of the targeted decision-makers.

Detailed export output should be used as the basis for any non-standard communication materials.  This will ensure that these materials are easily maintained or updated in the event that the assessment results are revisited at a future date.

Recommendations for effective executive communication include:

  1. Present the assessment focus and context, emphasizing the motivation, the desired value in selecting the focus, and the cross-functional nature of the assessment team.
  2. Communicate the current and desired state quickly, demonstrating where performance is low, where it is high, and where improvement is most necessary.  Provide practical examples of the impacts that low performance is having on current TAM business.
  3. Acknowledge challenges that will be faced and outline organizational practices and real-world case studies that will support successful implementation.