
2.2 Areas for Improvement Cited in the Action Report from the Previous Accreditation Review

Summarize activities, processes, and outcomes in addressing each of the AFIs cited for the initial
and/or advanced program levels under this standard.

The following citations were given for Standard 2 as Areas for Improvement:

1. The unit does not have an adequate assessment system to delineate initial and advanced programs.
a. The EPP has made extensive revisions and improvements to delineate distinct assessments for the
initial and advanced programs. These have been described above.

2. The unit does not demonstrate sufficient evidence that it regularly and comprehensively gathers,
aggregates, summarizes, and analyzes assessment and evaluation information on its operations, its
programs, and candidates.
a. The EPP follows a cycle of assessment administration, followed by data analysis, discussion, and planning
for programmatic improvements. Data have been discussed during program meetings and annual retreats;
from 2016-2017 onward, they will be systematically discussed at three pre-planned EPP-wide retreats.
b. Operations: The assessment cycle of the UAS is managed by EPP-wide operations that are
coordinated by the ad hoc assessment committee, which is tasked with piloting all initial administrations
of CORE portfolios, and by the EPP assessment committee. In fall 2015, the ad hoc assessment committee
applied a backward design approach to the development of the CORE I portfolio, beginning with a pilot
of the Looking Backwards Looking Forward (LBLF) assignment. The draft assignment was
implemented throughout the EPP, and the ad hoc committee, composed of representatives from all programs,
analyzed the results. Representative candidate work samples from the LBLF were collected
via systematic sampling for a total of five samples, followed by random sampling, resulting in three
samples from each program. The analysis served two purposes: (1) to assess the
assignment for content validity and (2) to ascertain patterns among candidates' responses. Each work
sample was read individually by each committee member. This process lent itself to internal
quality control, which was valued by the members. In addition, the process proved to be a professional
development endeavor for faculty through collaborative dialogue. The results were used to refine the
assignment and to make recommendations to the EPP for continuous improvement. The process serves as a
model for future CORE portfolio analyses.
c. The assessment cycle at the advanced level is similar to that at the initial level and is managed by unit-wide
operations coordinated by the ad hoc assessment committee, which is tasked with piloting
all initial administrations of CORE portfolios, and by the EPP assessment committee. In fall 2015, the
ad hoc assessment committee applied a backward design approach to the development of the CORE I
portfolio, beginning with a pilot of the Looking Backwards Looking Forward assignment at the initial
level. The efficacy of the pilot directly informed the development of the advanced-level LBLF (piloted fall
2016).
d. Similarly, the pilot of each new key assessment will be managed by the ad hoc assessment committee
as outlined above.

3. The unit does not regularly and consistently share assessment results for unit and program
improvement.
a. Beginning fall 2016, assessment data will be shared consistently with University stakeholders and
external stakeholders via two primary vehicles: the University's revised annual report and
the EPP's webpage. In addition, the Dean meets with regional superintendents annually to discuss
programming updates, means for partnerships to improve the profession, and program
outcomes. (Minutes from recent meeting)
