      If so, what knowledge can be leveraged from these previous efforts?

      <--- Score

      12. What is the definition of DRIVE Technology excellence?

      <--- Score

      13. Is the scope of DRIVE Technology defined?

      <--- Score

      14. How do you keep key subject matter experts in the loop?

      <--- Score

      15. If substitutes have been appointed, have they been briefed on the DRIVE Technology goals and received regular communications as to the progress to date?

      <--- Score

      16. Are different versions of process maps needed to account for the different types of inputs?

      <--- Score

      17. Are there different segments of customers?

      <--- Score

      18. How do you manage changes in DRIVE Technology requirements?

      <--- Score

      19. What are the rough order-of-magnitude estimates of the cost savings/opportunities that DRIVE Technology brings?

      <--- Score

      20. What was the context?

      <--- Score

      21. Do you all define DRIVE Technology in the same way?

      <--- Score

      22. Is the team sponsored by a champion or stakeholder leader?

      <--- Score

      23. What constraints exist that might impact the team?

      <--- Score

      24. How do you gather the stories?

      <--- Score

      25. Are approval levels defined for contracts and supplements to contracts?

      <--- Score

      26. What information do you gather?

      <--- Score

      27. How do you gather DRIVE Technology requirements?

      <--- Score

      28. Are the DRIVE Technology requirements testable?

      <--- Score

      29. What DRIVE Technology requirements should be gathered?

      <--- Score

      30. What are the (control) requirements for DRIVE Technology information?

      <--- Score

      31. What customer feedback methods were used to solicit their input?

      <--- Score

      32. Who is gathering DRIVE Technology information?

      <--- Score

      33. When are meeting minutes sent out? Who is on the distribution list?

      <--- Score

      34. How was the ‘as is’ process map developed, reviewed, verified and validated?

      <--- Score

      35. What critical content must be communicated – who, what, when, where, and how?

      <--- Score

      36. Are accountability and ownership for DRIVE Technology clearly defined?

      <--- Score

      37. How often are the team meetings?

      <--- Score

      38. Will a DRIVE Technology production readiness review be required?

      <--- Score

      39. What are the DRIVE Technology use cases?

      <--- Score

      40. What gets examined?

      <--- Score

      41. Has the direction changed at all during the course of DRIVE Technology? If so, when did it change and why?

      <--- Score

      42. How does the DRIVE Technology manager ensure against scope creep?

      <--- Score

      43. Is the current ‘as is’ process being followed? If not, what are the discrepancies?

      <--- Score

      44. Why are consistent DRIVE Technology definitions important?

      <--- Score

      45. How can the value of DRIVE Technology be defined?

      <--- Score

      46. Who is gathering information?

      <--- Score

      47. Does the team have regular meetings?

      <--- Score

      48. Are the DRIVE Technology requirements complete?

      <--- Score

      49. What are the requirements for audit information?

      <--- Score

      50. Has a team charter been developed and communicated?

      <--- Score

      51. How do you think the partners involved in DRIVE Technology would have defined success?

      <--- Score

      52. What scope should be assessed?

      <--- Score

      53. What is the context?

      <--- Score

      54. What is out of scope?

      <--- Score

      55. Has the improvement team collected the ‘voice of the customer’ (obtained feedback – qualitative and quantitative)?

      <--- Score

      56. Where can you gather more information?

      <--- Score

      57. What is the scope of DRIVE Technology?

      <--- Score

      58. What is out-of-scope initially?

      <--- Score

      59. Do the problem and goal statements meet the SMART criteria (specific, measurable, attainable, relevant, and time-bound)?

      <--- Score

      60. Are improvement team members fully trained on DRIVE Technology?

      <--- Score

      61. Is special DRIVE Technology user knowledge required?

      <--- Score

      62. Will team members regularly document their DRIVE Technology work?

      <--- Score

      63. Are audit criteria, scope, frequency, and methods defined?

      <--- Score

      64. What is in scope?

      <--- Score

      65. Are all requirements met?

      <--- Score

      66. Has a DRIVE Technology requirement not been met?

      <--- Score

      67. Has a project plan, Gantt chart, or similar been developed/completed?

      <--- Score

      68. What are the record-keeping requirements of DRIVE Technology activities?

      <--- Score

      69. Who are the DRIVE Technology improvement team members, including Management Leads and Coaches?

      <--- Score
