2.2 Key Roles and Responsibilities

Key Roles

A diverse set of perspectives is needed to examine current and desired capabilities and identify targeted improvements.

A cross-functional team should be formed and led by a knowledgeable, trusted, and respected facilitator. Participants should be selected for their background, their ability to contribute constructively to the focused discussion, and their position to advance the anticipated outcomes of the process.

Recommended participants and their respective responsibilities are shared below.

Project Sponsor

It is strongly recommended that a “project sponsor” be identified for any formal application of this guidance.

The project sponsor should have decision-making authority, be willing to be engaged throughout the process, and share enthusiasm for improving within the focus area.

The project sponsor should:

  1. Provide Leadership: Provide executive or management level endorsement and support for the assessment and recommended improvements.
  2. Select a Facilitator: Appoint an assessment facilitator to organize, communicate and manage the process and detailed activities.
  3. Be a Champion: Engage with leadership and management to ensure enthusiasm and cross-functional participation by targeted business, information technology, and support units.

Assessment Facilitator

It is essential that an assessment facilitator lead and organize the self-assessment, improvement identification, and improvement evaluation activities. A good candidate for this role is organized, empathetic to the diverse perspectives of the participants, and able to command the attention and respect of the group.

Ideally, this individual should be knowledgeable about the DOT asset management program and its supporting data and information systems. The facilitator should also not have a particular agenda or bias with respect to the outcomes or conclusions of the group; that is, their role or perspective should not be seen as inherently favoring certain assets or data areas.

The ideal candidate for this role is a program or project manager from the enterprise asset management, business process improvement, or a similar program. If qualified agency staff cannot dedicate the time necessary to prepare, facilitate, document, and summarize the results of the process, use a qualified external consultant.

Key responsibilities of the assessment facilitator are:

  1. Assessment Scoping:
    • Establish the assessment focus with the Project Sponsor.
  2. Participant Selection:
    • Identify and engage targeted participants in the process.
  3. Participant Preparation:
    • Share context and direction throughout the process.
    • Ensure expectations are clear and individuals are adequately prepared to constructively participate.
  4. Group Facilitation:
    • Organize meeting attendance and provide direction to meeting activities.
    • Ensure productive discussion and full participation.
    • Document key meeting outcomes.
    • Provide summary materials for group review and preparation in advance of future meetings or activities.
    • Utilize the TAM Data Assistant to capture group consensus during assessment, improvement identification, and improvement evaluation activities.
  5. Assessment Leadership:
    • Capture group consensus on current and desired state and selected improvements.
    • Document supporting contexts and takeaways from the assessment meetings.
    • Delegate action items (e.g. gaps in understanding that need to be closed by targeted participants).
  6. Improvement Evaluation Leadership:
    • Review practice gaps and assessment notes, and consider organizational needs, challenges, and context.
    • Ask questions that support informed discussion of agency improvement priorities.
    • Prepare supporting materials (such as “radar” charts).
    • Capture group consensus on improvement challenges, impact and effort, and priority.
    • Consider when “reassessment” is needed to refine the assessed current or desired state, or to identify additional or remove previously selected improvements.
  7. Results Summary:
    • Summarize outcomes for implementation action.
    • Present improvement priorities for executive endorsement and action.
  8. Implementation Support:
    • Work with the project sponsor and other participants to advocate for implementation.
    • Seek funding opportunities.
    • Lead efforts to incorporate recommendations into the agency technology, business, and/or process improvement plans, initiatives, and actions.

Asset Program Leads

Program leads from within the selected TAM focus area, or leads who rely upon the data and information systems within the identified data life-cycle area, are critical participants in the process.

These are typically central office program management, project managers, analysts, or engineers who understand asset management decision-making needs from a statewide and policy perspective. These individuals should also be able to discuss organizational challenges posed by substantial data, information system, or business process change.

A typical team includes:

  • Several such individuals, spanning key asset and/or program areas.
  • At least one Program Lead who is able to share executive management perspectives.

Field Asset Management Leads

District asset managers, engineers, or maintenance supervisors who are involved in day-to-day field asset management decision-making and execution. These staff must share the practical realities, challenges, priorities, and constraints of field asset management staff.

A typical team should include several of these individuals with differing perspectives. A district management perspective is necessary, as well as project-level decision-making and boots-on-the-ground field perspectives.

Information Technology (IT) Management and Staff

Key IT staff, particularly those who have an understanding of existing technologies, applications, and priorities within the targeted area. This may include IT relationship managers (those engaged with or integrated with key business units or applications), system administrators, project managers, or business or technical analysts.

IT staff should be prepared to share data, technology, or application related context and perspective as business needs or capabilities are discussed. These individuals should also identify technology solutions from other agency business functions that may be useful to the TAM program.

During improvement evaluation, IT staff should share the technical process, challenges, and constraints anticipated when delivering IT solutions.

Data Life-Cycle Area Subject Matter Experts

As appropriate to the asset program, or when focusing on specific data life-cycle areas, other key perspectives should be represented. For example:

  • Specify and Standardize Data: Computer aided design and drafting (CADD) and location referencing system (LRS) managers and technical experts, metadata and governance leadership or staff.
  • Collect Data: Statewide data collection (e.g., LiDAR or image-based vehicle collection), geographic information system (GIS) program, and/or mobile data collection program managers.
  • Store, Integrate, and Access Data: Data warehouse and GIS program managers and technical experts, business, data and/or enterprise architecture staff.
  • Analyze Data: Business intelligence, data analysis/science program managers or staff.
  • Act Informed by Data: Performance management or performance dashboard staff, capital, operations, and maintenance program budgeting, and/or field project and construction managers.

Recommended Preparation

This section outlines the recommended process for using the guidebook and identifies keys to success. Detailed instructions are provided in Appendix H.

Process Overview

Full, formal use of this guidance includes the following activities:

  1. Initial Scoping
  2. Participant Engagement
  3. Process Kickoff Meeting
  4. Self-Assessment and Improvement Identification Meetings
  5. Improvement Evaluation Meetings
  6. Outcome Summary and Communication
  7. Implementation Support

These activities are led by the assessment facilitator, though initial scoping should also involve the leadership of a project sponsor.

Keys to Successful Use

Facilitator preparation, participant engagement, and use of the TAM Data Assistant are strongly recommended.

  • Facilitator Preparation: an active, prepared assessment facilitator is essential. Appendix H provides a detailed walk-through of each activity in the process, sharing anticipated outcomes, detailed facilitator instructions, digital tool uses, and supporting materials (such as sample meeting agendas or participant engagement materials).
  • Participant Engagement: a small, cross-functional group of knowledgeable and engaged individuals is needed to share perspectives on existing TAM processes, related data and information systems, and potential improvements.
  • TAM Data Assistant Use: an online, supporting digital tool is available. This tool provides a streamlined workflow to create assessments; benchmark performance; select, evaluate, and prioritize improvements; and summarize and communicate outcomes. Appendix I provides a detailed user quick reference guide.

TAM Data Assistant

The companion digital tool is available online, through the AASHTO TAM Portal, at: www.dataassessment.tam-portal.com.

The tool can be used to:

  • Create and customize assessments of your TAM programs.
  • Benchmark current practices and desired state for 51 individual elements.
  • Select from candidate improvements to address identified practice gaps.
  • Prioritize selected improvements based on implementation impact, effort, and challenges.
  • Export summary communication materials directly from the tool, and use them to communicate with executives and advocate for implementation priorities.