Posted: February 26th, 2023
Applied Improvement Project Model Reading
Dr. Dooley: Hello, I am Dr. Dooley. As a faculty member facilitating the EdD advanced doctoral courses, my goal is to describe the purpose and benefits of the Applied Improvement Project (AIP) Model, which will be instrumental to your success in the doctoral program.
The AIP Model is intended to serve as a resource during your EdD journey. The Applied Improvement Project Model media piece provides a big picture understanding of the process in which you will engage during the advanced doctoral courses.
The cycle is a logical series of steps, starting with problem identification and analysis. You will then take action to address the problem and determine if the action made a difference or resulted in an improvement.
The AIP Model is referenced frequently in your courses. This resource is not intended to replace other classroom resources related to action research or other improvement cycles, but instead to complement them.
How are these steps of the Applied Improvement Project Model incorporated into your advanced doctoral courses? Listen carefully.
The 10-step cycle of inquiry includes three main phases: planning, implementation, and evaluation. I'd like you to click on each icon to see a detailed description of the three phases, as well as the 10 steps within those phases. You will see how each step connects to your coursework and your end goal, the AIP Monograph.
As you read through the descriptions of each step, note that we provided guided questions along the way for you. These are important questions for you to ponder as you navigate through the model and your advanced doctoral courses. If you work through the questions, you will be well on your way to completing your project.
The consistent use of the AIP Model in your courses will form the foundation for completing the AIP Monograph, which is a primary requirement for earning your degree.
Keep in mind the AIP Model can be applied to planning, implementation, and evaluation of any improvement inquiry or initiative unrelated to the requirements of the EdD program.
We hope you will refer to this AIP media piece often as you progress through your coursework within your doctoral program. Please know that we are here to help you with every step. It is essential to share questions early in your program so that we can guide your understanding of the AIP Model and ensure that your learning experience is progressing.
Good luck with your program and please reach out to your advisor, faculty, or program leader with any questions on how to implement this Applied Improvement Project Model. Bye!
Planning Phase
AIP Monograph
Section 1: Planning
· Problem of Practice
· Organizational Context
· Purpose Statement
· Review of the Literature
· Applied Improvement Project Methods
· Credibility, Dependability, and Transferability
· Ethical and Regulatory Issues
Step 1: Diagnose and Define the Problem of Practice
The inquiry cycle begins with a diagnostic phase: What is the problem of practice? What are the factors that are causing the problem? How can the problem best be defined? In this phase, the focus is on the problem, not on a solution.
The diagnostic phase is completed in EDD9951/EDD-FPX9951 through the Problem of Practice Needs Assessment signature assignment. In that assignment, at least two potential approaches to addressing the problem of practice are aligned with one or more of the root causes of the underlying problem. Possible approaches to addressing the problem of practice are identified, with the preferred approach forming the basis of the doctoral project. During this phase, stakeholders provide information to inform a theory of action (also called a theory of change) that is constructed to address one or more root causes of the problem.
Performance Gaps and Causal Analysis
Diagnosis is accomplished by defining performance gaps and conducting a causal analysis. This work is completed in EDD8050/EDD-FPX8050 and can be redefined in EDD9951/EDD-FPX9951. Performance gaps describe the difference between the current state (i.e., what is actually happening) and the desired or expected state (i.e., what should be happening). A causal analysis, conducted with the help of a tool such as a fishbone diagram, helps develop a comprehensive understanding of the various facets of the problem and its causes. A causal analysis also highlights potential points of leverage for an intervention.
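The causal analysis described above can be sketched as a simple data structure. This is a minimal illustration only: the problem statement, categories, and causes below are hypothetical examples, not part of the AIP Model itself.

```python
# A fishbone (cause-and-effect) diagram represented as a dictionary.
# Categories and causes are hypothetical examples for illustration.
fishbone = {
    "problem": "First-year students are not completing orientation modules",
    "causes": {
        "People": ["advisors unaware of modules", "students lack reminders"],
        "Process": ["no deadline enforcement", "enrollment steps unclear"],
        "Technology": ["LMS login issues"],
        "Environment": ["competing start-of-term demands"],
    },
}

def leverage_points(diagram, max_per_category=1):
    """List candidate causes per category as starting points for an intervention."""
    points = []
    for category, causes in diagram["causes"].items():
        points.extend(f"{category}: {c}" for c in causes[:max_per_category])
    return points

for point in leverage_points(fishbone):
    print(point)
```

Structuring the diagram this way makes it easy to revisit and refine the cause list as additional data are collected during the diagnostic phase.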
Diagnosis involves an examination and evaluation of existing data. Additional data collection is often necessary in order to accurately define performance gaps and analyze causes. Relevant literature and research about key aspects of the issue may also contribute to a better understanding of the problem of practice (and provide ideas for interventions or solutions later on). Note that data or evidence used to define the problem and its performance gaps may serve as benchmark or comparison data for data collected to determine the effects of the intervention or solution implementation.
Because issues need to be understood within an organizational systems context, environmental factors such as the political climate, the assumptions that are made about the issue or widely held in the organization, and differing viewpoints about the issue all need to be considered. For this reason, collaboration with colleagues and others in the organization is essential to fully define and understand a problem of practice and its causes. Without the participation of colleagues and support of organizational leaders, along with the insights they bring to an improvement process, meaningful sustainable change in all but the most trivial of problems is virtually impossible. An organizational analysis is completed in EDD9951/EDD-FPX9951.
When a problem of practice or issue has been fully explored and diagnosed, the understanding of the problem (or opportunity) and its causes should be clearly defined, described, and framed within the organizational context in which it occurs. Different institutions may have particular problems of practice in common. However, each institution has its own history and culture and the problem of practice must be understood in detail in its own specific and local context. Reporting on both the history and culture of the institution is an important part of the organizational analysis completed in EDD9951/EDD-FPX9951.
The diagnostic phase can be complex and may take more time than anticipated. What first appears to be the problem may turn out to be a symptom of a larger issue that must be addressed in order to improve the situation. Further investigation may be necessary. Even when it is not possible to acquire complete information about a complex problem of practice, what is known and not known about the problem should be clearly articulated and evaluated, based on available data. The importance of the diagnostic phase cannot be overstated because, as in medical practice, the steps or treatment that follow rely on a comprehensive understanding of the issue and its causes, that is, the accuracy of the diagnosis. Once it is diagnosed, a problem should be understood so precisely that it can be stated in one sentence that begins: “The problem is that…” If the problem cannot be stated this way, it probably needs further study. This language is used in the Needs Assessment that is formulated in EDD9951/EDD-FPX9951.
Questions to Consider:
To analyze the problem of practice and its context:
· Who needs to be involved in conducting an analysis of the problem of practice?
· What are other stakeholders’ points of view or perspectives on the problem of practice?
· Who is accountable for improving outcomes related to the problem of practice?
Data and Evidence
· What questions do I have about the problem?
· What information, evidence, or data do I need to understand the issue?
· What information, evidence, and data already exist at the organization to analyze, understand, and define the problem?
· What additional information, evidence, and data need to be collected?
· What specific performance gaps about the problem of practice can be defined?
· What are the root causes as well as internal/external factors that contribute to the performance gaps? (Use a tool such as a fishbone or tree diagram.)
· What has been done to address the issue in the past? What is currently being done to improve the situation? What was or is the outcome of these efforts?
· What is the organizational context of the problem, that is, the political, economic, cultural, structural, ethical, or other considerations that have a bearing on the problem?
· What is the systems context? (Consider systems archetypes and action science perspectives on espoused theories vs. theories-in-use, and single- and double-loop learning.)
· What assumptions are being made or am I making about the situation and the organizational context? What things are "taken for granted" in the organization?
· What effect does the problem have on the organization (e.g., on people, resources, and/or organizational outcomes)?
To evaluate existing research, literature, and improvement initiatives:
· Which key concepts and topics are relevant to the problem of practice?
· Which research studies and literature will add to my knowledge and understanding of the problem of practice and its organizational and systems context?
· How do I summarize, compare, synthesize, or evaluate existing research and literature?
· Which other organizations have similar problems of practice and how are they addressing them?
· What can I conclude about the issue, its causes, and potential solutions from what I have learned in the literature and from other professionals or institutions?
To establish key questions about the problem of practice and how to address it:
· What key question or questions emerge from my analysis of the problem and study of the literature about the problem of practice and its causes?
· What is still unknown?
To develop a one-sentence problem statement:
· How can I articulate the problem concisely and precisely in one sentence that begins, “The problem is…” or “The problem is that…”?
Step 2: Generate Alternatives
When the problem of practice has been studied and a causal analysis conducted, the next step is to consider how the problem of practice can be addressed such that improved outcomes result. In EDD9952/EDD-FPX9952, the second signature assignment, the Literature Review, is developed using the problem of practice and corresponding needs. Also in EDD9952/EDD-FPX9952, a preliminary draft of the entire Applied Improvement Project Action Plan is developed. Along with the findings from current scholarly research literature related to your problem of practice, this step also involves collaboration with stakeholders. Alternatives to the current state are collaboratively generated and evaluated, and decisions about a course of action are made.
The process of establishing alternatives or interventions designed to achieve better outcomes implies that a desired future state in relation to the problem of practice has been defined. It is important to have a clear vision of the desired outcome when changes to address the problem or issue are in the planning phase. An analysis of the organizational context completed in EDD9951/EDD-FPX9951 and the literature review completed in EDD9952/EDD-FPX9952 will be vital in collaboratively envisioning a desired future state and developing potential interventions, solutions, or actions that are feasible, realistic, and expected to lead to the desired future.
Questions to Consider:
· With whom do I need to collaborate in developing a vision of a desired future in relation to the problem of practice?
· What does the desired future look like?
· How would I describe the outcomes associated with the desired future that the organization seeks to achieve?
Interventions or Solutions
· With whom do I need to collaborate to brainstorm interventions, solutions, or evaluate alternative courses of action?
· How can I best work together with other stakeholders to generate alternatives and implement a chosen change initiative?
· What are several alternative courses of action to address the issue and bring the organization closer to its desired future?
· How do the actions address specific causes of the problem?
· What are the known or likely risks of each? Unintended consequences? Trade-offs?
· What is my theory of change? For example, do I believe that if action “x” is undertaken, then “y” will result? If so, why?
· For complex problems of practice, how can I prioritize multiple potential interventions or solutions?
· Do some parts of the problem need to be resolved or mitigated before others can be addressed?
· Are the change actions feasible? Are they sustainable?
· Do I have the ability within my organization to implement the change action or intervention and would I be able to gain permission to do so?
· Which of the feasible and sustainable actions are most likely to address key causes of the problem and lead to improved outcomes?
· What is the best course of action?
· What are the measurable outcomes that will determine if and how the course of action, during and after implementation, made a difference?
Step 3: Design the Action Plan
The last step in the planning process is to design an action plan for the applied improvement project, taking into consideration the organizational context, the current understanding of the problem of practice, and the chosen intervention. As in prior steps, action planning involves collaboration with those who will implement or be affected by the change. The action plan includes guiding questions and data collection methods.
In the first several weeks of the EDD9953/EDD-FPX9953 course, the Applied Improvement Project Action Plan, the third signature assignment, is completed. This work is not done in isolation but rather draws on the previous two signature assignments, along with the comprehensive feedback received in EDD9952/EDD-FPX9952 on the preliminary draft of the Applied Improvement Project Action Plan. Ongoing feedback is again provided in EDD9953/EDD-FPX9953 as the Action Plan is further refined, up until the time the AIP Action Plan is formally submitted.
Guiding questions reflect what you and your organization wish to learn from both the process and the effects of the implementation. Guiding questions drive the implementation process and data methods. Broad questions can be combined with more specific questions. For example, broad questions might be:
· How can the implementation of “x” lead to improved result “y” (where “y” is a better outcome)?
· What happens when “x” is implemented?
These questions allow for any and all aspects of the implementation to be studied. To answer the questions, process data are collected and a process analysis is conducted.
More specific questions could be:
· What are the perceptions of [name the group] about the implementation of “x”?
· To what extent did “x” lead to improved outcomes “y”?
· How are outcomes “y” different after implementation of “x”?
These questions represent outcomes of the implementation: improved perception outcomes for one or more stakeholder groups and improved outcomes (“y”) for the problem of practice. Perception data can also be an important source of feedback about an intervention.
When the questions of interest have been established, a data plan can be developed to ensure data are collected to answer them. The chosen course of action, the implementation plan, data collection, and data analysis must align with each other and with the problem, its causes, and guiding question(s).
An implementation plan is part of every applied improvement project Action Plan. The implementation plan is a step-by-step description of each action, a breakdown of each action into specific tasks, the person(s) responsible for completing each task, the resources needed, the duration and deadline for each task, and the evidence or data that indicates completion of each task. An implementation plan allows you to manage the project on a day-to-day basis, monitor whether actions and tasks are on schedule or completed, and make adjustments to the plan as circumstances dictate. An implementation plan is created in EDD9953/EDD-FPX9953 and then expanded into a document called “the Work Plan” in EDD9954/EDD-FPX9954.
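The implementation-plan elements listed above (action, task, owner, resources, deadline, evidence of completion) can be sketched as a small record type with a helper for day-to-day monitoring. The field names, sample tasks, and dates below are hypothetical, assumed only for illustration.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Task:
    """One row of a hypothetical implementation plan."""
    action: str       # the broader action this task belongs to
    task: str         # the specific task
    owner: str        # person(s) responsible
    resources: str    # resources needed
    deadline: date    # duration/deadline
    evidence: str     # evidence or data indicating completion
    completed: bool = False

def behind_schedule(tasks, today):
    """Return tasks past their deadline that are not yet complete."""
    return [t for t in tasks if not t.completed and t.deadline < today]

# Hypothetical example plan for day-to-day monitoring.
plan = [
    Task("Pilot workshop", "Reserve room", "J. Smith", "Facilities form",
         date(2023, 3, 1), "Confirmation email", completed=True),
    Task("Pilot workshop", "Recruit participants", "A. Lee", "Email list",
         date(2023, 3, 15), "Sign-up sheet"),
]
overdue = behind_schedule(plan, today=date(2023, 4, 1))
```

Tracking tasks this way supports the monitoring and adjustment the text describes: the plan can be queried at any point to see what is on schedule, completed, or overdue.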
Process data: It is important to study and document the implementation of the change as it unfolds by collecting qualitative data: notes about conversations with stakeholders, observations of the implementation process, modifications to the plan, and your own insights and reflections about what is happening. These data are process data because they allow you to understand what happens when the change is in progress. When the cycle of inquiry is completed, process data allow you to tell the story of the change and provide an explanation of how the implementation led to the outcomes that were achieved (or not).
Outcomes data: The action plan design step also defines the data that will be collected to document the outcomes or effects of the implementation. Outcomes data allow you to answer questions such as, To what extent did “x” lead to improved outcomes for “y”? To compare outcomes before the intervention with outcomes after you have implemented the change, you will need baseline data. For example, the data you analyzed to determine and define performance gaps between the current and desired outcomes in the diagnostic phase is baseline data. The baseline data included in your action plan should be evidence of the specific outcomes you have targeted for improvement.
A comparison of baseline data (pre-intervention) and the same data collected post-intervention is one way to determine whether the intervention or change actions reflect an improvement over the status quo, no change, or a negative effect on the situation. If baseline data are not available, expected levels of improvement called success criteria can be identified. Success criteria define the performance level the intervention is expected to achieve on a given outcome. Even when baseline data are available, it is a good idea to define success criteria before implementation to clarify the extent to which stakeholders expect the intervention to make a difference or to achieve a specific result. If actual results depart too much from the established success criteria, stakeholders can then ask themselves why their theory of change was inaccurate and seek alternate solutions.
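The baseline-versus-post comparison against a success criterion described above can be sketched as a small function. This is an illustrative sketch only; the metric, values, and threshold are hypothetical.

```python
def evaluate_outcome(baseline, post, success_criterion, tolerance=0.0):
    """Classify a post-intervention result against baseline data and a
    predefined success criterion."""
    change = post - baseline
    criterion_met = post >= success_criterion
    if change > tolerance:
        direction = "improvement"
    elif change < -tolerance:
        direction = "negative effect"
    else:
        direction = "no change"
    return {"change": change, "direction": direction, "criterion_met": criterion_met}

# Hypothetical example: a completion rate rises from 62% to 71%,
# against a success criterion of 75%.
result = evaluate_outcome(baseline=0.62, post=0.71, success_criterion=0.75)
```

In this example the intervention shows improvement over the status quo but falls short of the success criterion, exactly the kind of gap that prompts stakeholders to revisit their theory of change.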
In addition, you may want to collect perception data from participants during and/or after the intervention, survey data, and other data specific to your problem of practice and intervention. The interview process is first covered in EDD8030/EDD-FPX8030. Multiple types of data, including your own AIP Implementation Journal entries, should be collected to increase the credibility and validity of the inferences or conclusions you are able to draw from your data. Systematic collection of a variety of evidence or data is essential to a scholarly investigation of practical problems and interventions to solve them.
Collaboration
In a cycle of inquiry, collaboration with participants and/or others in the organization, both to develop questions, criteria, and metrics for interventions and to identify the sources and types of appropriate data to collect, can prove very helpful. The consent, cooperation, and/or approval of those who will participate, who are in positions of authority, or who will be affected by the implementation of an intervention may be required by the IRB, depending on the information sought. Other resources needed should be identified and their availability determined.
Questions to Consider:
· What are the questions (e.g., action research or other applied project) that will guide the project and its plan for data collection?
· What are the specific actions and step-by-step tasks that will be implemented?
· What is the tentative timeline for implementation?
· Who is accountable for completing the tasks and what is the evidence of completion?
· How will the implementation of the intervention or solution be continuously monitored and documented?
· In addition to the AIP Implementation Journal which is completed throughout the advanced doctoral courses, what types of data or evidence should be collected to accurately monitor the implementation?
· When will I collect these data?
· Is the process data plan robust enough to allow me to tell the story of how the intervention was implemented and explain how the intervention led to the results?
· What is the intended outcome(s) or goal(s) of the intervention?
· What are the measurable criteria for success?
· What are the types and sources of data I need to collect to measure the effects of the intervention?
· Will I need access to existing organizational data?
· Will I need to develop data collection instruments?
· When will I collect each type of data?
· What data accurately describe the status quo and serve as baseline data?
· What data will I collect during and/or after the implementation?
· How long will it take for the intervention or actions to demonstrate an effect? Or, how long will data collection take?
· Have I identified multiple sources of data to collect so that triangulation can take place?
· How does the plan address issues of validity and reliability, or credibility, transferability, dependability, and confirmability?
People
· Is the action planning collaborative? Is it inclusive? Are multiple perspectives considered?
· What support from organizational leadership is needed to implement the plan? Can support and permission for implementation be obtained?
· How will the plan be communicated?
· Does the plan adhere to the highest ethical and scholarly standards?
· Have I addressed any legal concerns and social responsibilities in designing the action plan?
Implementation Phase
AIP Monograph
Section 2: Implementation
· Process Analysis
· Data Analysis
Step 4: Implement the Action Plan
In Phase 2 of the Inquiry Cycle, the action plan developed in Phase 1 of the Inquiry Cycle is implemented. A draft of the Action Plan is developed in EDD9952/EDD-FPX9952 and completed in EDD9953/EDD-FPX9953. Successful implementation of the plan depends in large measure on the quality of the planning process. Although unanticipated events and surprises may occur in any applied improvement project, careful analysis of the issue and attention to the steps outlined in the planning phase provide a sound foundation for making any adjustments needed during the improvement project.
A natural desire during implementation is for an intervention or action to succeed or improve the situation and demonstrate positive results. The desire to succeed can lead to difficulty in seeing the situation as it really is. Confirmation bias is another way our minds can derail our ability to think critically. Awareness of the assumptions being made about the intervention and the ability to view the situation from different points of view are important habits of mind during the investigation of any problem or situation. Throughout the implementation, respect for participants and adherence to high ethical and scholarly standards are essential. Ethical standards are covered in EDD9953/EDD-FPX9953.
Questions to Consider:
· How is the change process unfolding? What are the surprises? Unanticipated difficulties?
· How is the implementation being monitored and documented?
· Are adjustments or modifications needed to the intervention plan, based on observations and feedback from participants?
· What assumptions am I making about the implementation and the outcomes?
· From which points of view am I examining the issue? Am I keeping an open mind?
· Is the implementation of the intervention, including communications and data collection, being carried out in a way that is respectful of participants?
· Is the plan being led and implemented according to the highest ethical and scholarly standards?
Step 5: Collect and Analyze Data
Two types of data collection are essential to an applied improvement project: process data and outcomes data. Collection of these data and analysis begin in EDD9954/EDD-FPX9954 and continue with data analysis in EDD9955/EDD-FPX9955.
Process Data
In EDD9954/EDD-FPX9954, process or monitoring data can be collected in the AIP Implementation Journal and analyzed in real time during implementation to document the intervention’s implementation. Documentation of how the implementation process unfolds on a day-to-day basis is essential to understanding and clearly communicating to others "what happened" during the project and explaining “how” the intervention made a difference.
The documentation may include logs, observations, reflections, informal conversations, and virtually any evidence that informs or describes what happens during the intervention.
The AIP Implementation Journal, which first appears in EDD9953/EDD-FPX9953, is also an appropriate place to record your thoughts, feelings, and ideas as they occur during the implementation. Process data documentation should reflect any departures from the original action plan, including justifications for the changes to the plan. A thorough and accurate description of the project adds to its credibility, dependability, and potential transferability. Assignments associated with the AIP Implementation Journal begin in EDD9954/EDD-FPX9954, and the journal is analyzed in EDD9955/EDD-FPX9955.
As noted, process data are collected and analyzed during the implementation in real time. One reason for the ongoing analysis of data during implementation has to do with the nature of an applied improvement project. An applied improvement project seeks to improve a situation through action or change that may result in unanticipated consequences, either positive or negative. Circumstances may also change during an implementation. Based on the effects of the changes as they occur, the action plan may need to be modified or adjusted in real time in order to better address the problem situation. Modifications of an action plan are often most effective when they are the result of ongoing dialogue and consultation with participants directly engaged in the project.
Outcomes Data
Outcomes data, on the other hand, are typically collected during and/or after the implementation and are analyzed after all data have been collected. Outcomes data measure effects of the intervention on the problem of practice. Comparisons can be made with pre-intervention outcomes if baseline data are available to determine whether the implementation led to improvement of the outcomes that were defined as part of the applied improvement project action plan in Phase 1. If baseline data are not available for comparison, targets for success can be established when designing the action plan and the outcomes can be measured against the success criteria. Both quantitative and qualitative data may serve as outcomes data, depending on the guiding questions posed in the action plan. For example, in addition to quantitative measures, interview data could answer questions about the perceived benefits or efficacy of the intervention.
Taken together, these data will enable the applied researcher to reconstruct and tell the story of the implementation of an intervention and analyze the effects of the implementation on outcomes, including an explanation of how the intervention led to the results that were obtained.
Questions to Consider:
· How are data (including process implementation data) being continuously collected, recorded, and analyzed during implementation?
· Am I collecting and analyzing both outcomes and process data according to the action plan?
· Am I documenting the project’s progress and monitoring the implementation?
· Am I documenting any departures from the original data collection plan?
· Am I maintaining my journal or creating memos of observations, conversations, and other process data?
· How do I interpret the information or data? What can I infer?
· Am I adhering to the highest ethical and scholarly standards?
Step 6: Dialogue about Process and Findings
An applied improvement project is undertaken to address a problem of practice and effect improvement. The results of the Applied Improvement Project are developed in EDD9954/EDD-FPX9954, and the findings of the results are revealed in EDD9955/EDD-FPX9955. By definition, an improvement project involves change. The people involved in the change may have been involved in the planning of the intervention, may be directly participating in the change or intervention and directly affected by it, may be members of an organization who are indirectly affected by the project, or may be leaders in the organization who have an interest in the outcome of the project. Although the degree of participation may vary depending on the setting and the problem, an applied improvement project is planned and implemented with participants. That is, applied improvement projects are done with people, not “to” them.
There are several benefits of ongoing dialogue and information sharing with participants and other stakeholders during the implementation of the intervention. Participants can be more effective in implementing an intervention, for example, when they receive feedback about the effects of their actions. People, in general, perform better and are more energized when they are empowered with information about “what’s going on.” Participants can also be more engaged with the change if their ideas and insights about potential modifications are respected and, as appropriate, incorporated into the project. Perhaps most importantly, collaborative dialogue about the implementation process enables stakeholders to increase their understanding of both the problem and the efficacy of the intervention.
Guiding Questions:
· Am I keeping participants and members of the organization informed of progress and interim results? Of the successes and challenges of implementation?
· How is dialogue about the process and the findings occurring?
· Are appropriate forums for discussion, concerns, and suggestions available?
· Are participants' perspectives incorporated into the implementation where appropriate?
· What conclusions do those involved in the implementation draw from the data thus far?
· Have plans to ensure the validity, reliability, credibility, and dependability of results been implemented?
Section 3: Evaluation
· Reflections and Critique
· Implications for Professional Practice
Step 7: Evaluate Outcomes
The first inquiry cycle is complete when the timeline established in the action plan comes to an end. At this point, all collected data not already analyzed are analyzed and evaluated as described in the action plan. This analysis is completed in EDD9955/EDD-FPX9955 during the development of the Monograph – another signature assignment. The baseline data about the problem or issue, specific criteria against which to assess the success of the intervention, and metrics that help evaluate whether the criteria were met were established in the action plan prior to implementation.
Making Valid Inferences
The project’s outcomes data analyses are compared with the baseline data and the criteria for success. If the appropriate data were collected, it should be possible to offer credible evidence of the extent to which the implementation of the change did or did not make a difference. However, any pre-post quantitative comparison must follow statistical guidelines, such as appropriate sampling and sufficient sample size, and must meet the assumptions of the chosen statistical test. Because local projects are often limited in terms of sample size, comparisons of pre- and post-intervention data may only be possible using descriptive statistics, with limited ability to draw inferences about significant change. Yet promising preliminary local results, even if inconclusive, can set the stage for the next cycle of inquiry in order to continue the study of promising strategies.
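A descriptive pre/post comparison of the kind described above, appropriate when the sample is too small for inferential tests, can be sketched with Python's standard library. The scores below are hypothetical.

```python
import statistics

# Hypothetical pre- and post-intervention scores for a small local sample
# (too small for a reliable inferential test).
pre = [68, 72, 75, 70, 66, 74]
post = [74, 78, 77, 73, 70, 80]

summary = {
    "pre_mean": statistics.mean(pre),
    "post_mean": statistics.mean(post),
    "mean_change": statistics.mean(post) - statistics.mean(pre),
    "pre_sd": statistics.stdev(pre),
    "post_sd": statistics.stdev(post),
}
```

Descriptive summaries like this can document an apparent improvement without overstating its significance; as the text notes, such preliminary results are best treated as grounds for the next cycle of inquiry rather than as conclusive evidence.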
This is also a good reason to collect qualitative data in addition to quantitative measures so you have additional evidence that may support or provide an explanation for your positive, “no change,” or negative quantitative measures. Qualitative outcomes data can also help document change in attitudes or perceptions. As you analyze and evaluate your data, consider how and to what extent you can determine whether any changes to the status of the problem of practice were the results of the intervention and whether any of the changes could be attributable to a different cause.
The process data (the documentation of the implementation process) are also evaluated for the extent to which the original plan was followed and whether modifications were necessary. Unanticipated events, unexpected consequences, participant reactions, and other aspects of the implementation process are compiled in narrative form. The qualitative process data are a rich source of information and data about daily events, ideas, and insights compiled during the course of the study. The evaluation of process data results in a compelling narrative of the implementation and how it worked that helps explain how the intervention produced the outcomes it did.
Member checks, audit trails, and other methods that strengthen the trustworthiness of the data are completed to increase the credibility, dependability, confirmability and, ultimately, the transferability of the project. The precision, accuracy, and completeness of the applied improvement project analysis help determine the extent to which the project can contribute to the next cycle of continuous improvement and, potentially, to a larger body of knowledge about the problem of practice.
Questions to Consider:
· Are data being analyzed and evaluated according to the action plan? How do I know when analysis is complete?
· Have I referenced my journal for information and data that confirm or disconfirm other data I collected, or fill in gaps that have not been documented elsewhere?
· Have I evaluated the implementation process and constructed a cogent, compelling narrative of the project?
· What are the project’s findings? Have the results of the data analysis been evaluated against criteria for success?
· What can I conclude from the information or data analysis? Are my conclusions based solely on the evidence?
· Have I accounted for confirmation bias, optimism bias, and other forms of bias?
· Do the findings indicate a need to revise the theory of change developed in Phase 1?
Step 8: Reflect or Dialogue on Results
Reflection on the results of the project is a critical step in all applied improvement projects. This work is completed in the Monograph, which is developed in EDD9955/EDD-FPX9955. The results of an applied improvement project suggest further questions or areas of improvement that are reported as recommendations for further study. The results may suggest a need for further (or a different type of) action, or they may indicate that no further action is needed. Reflection on the results of an applied improvement project is essential because the results have important implications for a specific real-world issue that is ongoing and immediate.
Reflection often takes the form of dialogue with participants and stakeholders, through which agreement can occur about the conclusions, implications and consequences, and further inquiry cycles.
Guiding Questions:
· What are the consequences of our conclusions, positive and negative?
· What did we learn from the findings?
· How might what we learned lead to a modification of our theory of change?
· What literature might be consulted to help explain the findings?
· What unexpected elements or outcomes were identified?
· Would a different project have better addressed the performance gap or root cause(s) associated with the chosen problem of practice?
· Referring to the performance gap and causal analyses in the diagnostic phase, in what way(s) was progress or improvement made?
· What modifications to the project could or should have been made?
· What might be done differently if the project were to be repeated?
· What ideas have emerged from a collaborative discussion about the findings?
Step 9: Recommend or Decide on Next Steps
The reflection process, whether individual, collaborative, or both, leads to recommendations or a decision about what should happen next. The reflection process is completed in EDD9955/EDD-FPX9955. Did the change, action, or intervention result in enough improvement, or show enough promise, that it should be continued for a specified period of time? Should the intervention be continued with modifications based on the results of the first inquiry cycle? Should the intervention or action be discarded in favor of a different approach to the problem of practice? In other words, once the results, conclusions, and implications have been identified and reflected upon, a recommendation or decision is made about what to do next to continue the improvement process, because the problem of practice is a real-world issue that is likely not “solved” by one inquiry cycle.
Questions to Consider:
· Based on the reflection cycle (individually and collaboratively) what are the implications for next steps?
· What can be concluded about what needs to happen next?
· What literature might be consulted to guide next steps?
· How feasible are the proposed next steps?
· How do next steps align with the performance gap and causal analyses completed in the diagnostic phase? Do the gap and/or causal analyses need updating as a result of the applied improvement project?
· How does the next action reflect the previous learning?
Step 10: Communicate Results
The results of the project are communicated in the Monograph, completed in EDD9955/EDD-FPX9955. During EDD9956/EDD-FPX9956, applied improvement project results are shared with stakeholders at the site, and sometimes beyond, for the same reasons any other project results are shared. Among those reasons are to enable learning to occur for all stakeholders; to ensure everyone shares a common understanding of the changes and how they affected the problem of practice; to contribute to the body of knowledge and theory about a problem of practice; to contribute to the profession; and to allow others who experience similar problems of practice to benefit from the project. The manner in which the results are communicated depends on the interests and needs of the audience.
Participants and others in the organization know that the applied improvement project took place and may have participated in the implementation. They should be fully informed about the project’s results if they have not already engaged in the collaborative reflection process in step 8. Others who may have an interest include those at a higher level in the organization, or people in similar institutions or organizations who may have similar issues to resolve.
Beyond the Organization
If the applied improvement project has followed a rigorous planning, implementation, and evaluation process, its results, especially when they show improvement, may also be presented at conferences, published as articles, reported to meet formal requirements of funders, or reported in a professional portfolio to showcase your work. Each presentation will be structured differently with a different level of detail and formality because the purpose of the presentation varies for each audience.
However, all reports have some basic commonalities. The report, whether formal or informal, answers these questions: What was the problem of practice, and how did you know? What were or are its causes? What did you do about it? What difference did the action or intervention make? How do you know the actions caused the difference? What are the implications of the results? What are the next steps? Developing a written report or oral presentation serves as a valuable opportunity to critically re-examine each component of the project to confirm the accuracy of your thinking and the validity of the inferences you made based on data analysis.
Most reports will include a description of the problem of practice and its causes and context, the action or intervention chosen to address it (and why), and the applied project design, including data collection and analysis methods and reasons for their selection. A report includes a description of the implementation and its progress, as well as a description and interpretation of the data that have been collected and analyzed. Conclusions must be validated and supported by the evidence in a way that is convincing to the audience. Finally, next steps that are identified in the report should lead logically from the conclusions and implications of the project's results.
Questions to Consider:
· How will I help others understand the results?
· What type of report is appropriate for the purpose and audience?
· Have I provided clear context and description of the problem and implementation?
· Have I described the data collected, how they were analyzed, and the results of the analysis?
· Have I supported my conclusions with evidence? Have I made valid inferences based on my data?
· Have I given credit that is due to others?
· Have I explained the implications of my conclusions?
· Have I communicated the significance of the project?