
Institutional Report Offsite Review

List of Attachments (Including Response to the Offsite Review file)

Date: October 31, 2016
Institution/Agency/LEA:  Eastern Connecticut State University
Developmental/Initial/Continuing On-site/Electronic Review: November 13-15, 2016

Offsite Findings | Areas of Concern to Meet Standards | Eastern's Response
Standard 1

Candidate Knowledge, Skills, and Professional Dispositions
Offsite Findings: This standard was found to be unmet in the 2014 review.
Area of Concern 1: Not all programs are recognized by their respective SPA.
Eastern's Response: The only programs that have not been recognized by their SPAs are Elementary Education and Secondary Math Education.
The Elementary Education Undergraduate SPA report (Recognized with Conditions) has been revised, paying particular attention to data requirements, the commingling of standards, and clarifying the reporting of grades per NCATE criteria. It was resubmitted for approval in September 2016. Please see Appendix C1 for the submitted report.
The Elementary Education Graduate SPA report has been revised and was resubmitted for approval in September 2016. The concerns raised by the reviewers, such as addressing all standards, aligning assessments to the ACEI standards, presenting adequate data, and reporting grades per NCATE criteria, were carefully addressed. Data are discussed clearly, with an explicit narrative of how the results were used for program improvements. Please see Appendix C2 for the submitted report.
The recognition status of Secondary Math Education (Undergraduate and Graduate) has expired. Steady, deliberate progress has been made toward resubmission.
Assessments have been re-aligned to the NCTM 2012 standards (see the scoring rubric and assessment guidelines for the Student Teaching Impact Portfolio in Appendix C3).
The EPP has started to collect data from these revised assessments as they are offered. One cycle of data will be available for review during the visit.
The Secondary Math Education faculty collaborate closely with the Math department and continue to revise the Advanced Math Portfolio (completed in MAT 372, Advanced Math High School Teaching) and to develop a new course on the History of Math; both will be completed in Academic Year 2016-2017. These revisions will enable the program to provide two content assessments (in addition to Praxis II).
Please see a detailed report on the status of these revisions and projected plans for implementation (in Appendix C4).
Area of Concern 2: Baccalaureate and master's candidates' data are not disaggregated.
Eastern's Response: Data disaggregated by program and by undergraduate/graduate status for initial candidates have been provided for the CARE admission data (Appendix E1) and Student Teaching (Appendices E2-E8). Other data, disaggregated similarly, will be provided at the onsite visit.
Area of Concern 3: It is not clear how GPAs are used as evidence of candidates' content knowledge.
Eastern's Response: Only programs whose SPAs permit the use of GPA as an indicator of content knowledge use GPA to measure content and/or pedagogical knowledge. These include Elementary, Secondary Social Studies, Secondary Math (currently being revised), and Secondary English. All documentation adheres to the NCATE criteria for grades. See the recently submitted Elementary Education program reports, graduate and undergraduate (Appendices C2 and C1, respectively), for documentation of GPA reporting per NCATE criteria.
Area of Concern 4: The IR has not provided convincing evidence that there is a concerted effort within the unit to ensure comprehensive and consistent performance measurement of candidates' content knowledge, pedagogical content knowledge and skills, professional and pedagogical knowledge and skills, ability to help all students learn, and professional dispositions.
Eastern's Response: The EPP has established a comprehensive performance assessment system to measure candidates' content knowledge, pedagogical knowledge, and professional knowledge and skills for helping all P-12 students learn. The system is designed as a progression across the clinical cores, using multiple data points. Each transition point (or CORE) represents developmentally appropriate practice for candidates.
Student teaching data (see disaggregated data from four semesters in Appendices E2-E8) document content knowledge, pedagogical content knowledge and skills, professional and pedagogical knowledge and skills, the ability to help all students learn, and professional dispositions. This is additional evidence not previously submitted, and it exemplifies data across all programs. The data were discussed by the EPP at the May and August retreats for programmatic improvements (see minutes in Appendices F1 and F2).
Additionally, please refer to the chart (Appendix A3) in the appendices that aligns the student teaching evaluation instrument (Appendix A2) to the various standards, including distinct alignments to the sub-components of the NCATE standards. The chart helps explain how the EPP uses the student teaching evaluation as evidence for various sub-components of Standard 1.
The EPP has launched a series of CORE portfolio assessments across all programs. The CORE I portfolio was piloted in spring 2016 and is being implemented in fall 2016 across all programs (see the CORE I Portfolio assignments previously submitted with the IR exhibits). Faculty provide candidates with feedback; data are being collected in TK20 and will be analyzed at the end of the term collaboratively by EPP faculty and P-12 partners. Following the process used to pilot the Looking Backwards, Looking Forward assignment from CORE I in spring 2016, evaluators will assess construct validity, content validity, and bias. Evaluators will include faculty and professional community members (e.g., CORE I host teachers, alumni, university supervisors), thereby demonstrating the community's involvement in the development of the unit's assessment system. The results will be further discussed by the EPP at the January 2017 retreat. This is a working model, designed over two academic semesters, intended to be duplicated as each new CORE portfolio is piloted in full.
The Core II portfolio has been drafted by the Ad Hoc Assessment Committee and will be piloted in spring 2017. See the appendices for a draft of the Core II portfolio (Appendix I1).
The CORE III portfolio will be designed in spring 2017 for pilot implementation fall 2017.
For CORE IV, the EPP continues to pilot edTPA in advance of state adoption. Three programs piloted edTPA in spring 2016: Secondary Education (with the exception of mathematics), Physical Education, and Elementary Education. In fall 2016, Elementary Education continues the pilot, and in spring 2017 the EPP will extend the pilot across all programs.
In addition to each of the CORE portfolios of performance-based learning and edTPA, the EPP has extended observational data collection during clinical experiences to include CORE I and CORE II. For each of these COREs, eight (8) competencies aligned with the Connecticut Common Core of Teaching standards are assessed beginning in fall 2016. See pages 4-6 of the Clinical Handbook submitted with the IR.
Data from both the performance-based portfolios and the observations conducted during clinical experiences will be used to corroborate candidate outcomes.
Master's level data comprise two groups. The first group consists of master's level, initial certification candidates, who do not hold Connecticut state licensure prior to admission. The second group consists of advanced students, who are not seeking state licensure via program completion; they often already hold professional credentials in Connecticut or another state. Current advanced-level assessments were reported with the IR.
The EPP, however, has approved the closure and teach-out of the existing advanced programs (Early Childhood, Elementary, Secondary, and Reading and Language Arts) once the new advanced master's program is approved by the State (see Appendix D1 for minutes from the EPP meeting). The new program will consist of assessments for clinical practice, content knowledge, and data/research literacy (see the advanced master's proposal, including the assessment schema, in Appendix D2). Additionally, the EPP has revised the previously approved Candidate Learning Outcomes to show alignment to the Advanced CAEP standards (Appendix A1).
Impact on P-12 Student Learning is addressed through current, emergent, and future assessments within the EPP. Candidates' impact on P-12 student learning continues to be evaluated through the student teaching evaluation instrument. See the disaggregated data (Appendices E2-E8) submitted in the appendices, and refer to the chart (Appendix A3) for the specific items on the instrument that address P-12 student learning.
Additionally, the Response to Intervention (RTI/SRBI) assignment for the Core II portfolio will provide information on our initial candidates’ abilities to use data to plan and evaluate their impact on student learning.
Data from the spring 2016 edTPA pilot also exemplified candidates' impact on P-12 student learning and/or development.
Advanced Programs:
Currently, the clinical assessment in EDU 518, English Language Learners, provides data on our advanced candidates' impact on student learning. Data were previously provided.
For the revised Advanced program, please refer to Appendix D2 outlining our proposed assessment scheme for evaluating the impact on P-12 students.
Other evidence requested by the BOE Team for validation will be provided during the onsite visit.
Offsite Findings | Areas of Concern to Meet Standards | Eastern's Response
Standard 2

Assessment System and Unit Evaluation
Offsite Findings: This standard was found to be unmet in the 2014 review.
Areas of Concern: None.
Eastern's Response: Data for the EPP are collected, analyzed, and used for decision making to ensure sound procedures and content for the advancement of candidates' knowledge, skills, and professional dispositions, as well as program operations supported by community input (e.g., host clinical teachers, university supervisors, Arts and Sciences faculty, school leaders, and alumni). Data are collected at three critical transition points: program admission, mid-point (which includes early clinical experiences and entry into student teaching), and exit (leading up to licensure). The EPP currently assesses program alumni via graduate and employer surveys (see the document on UAS Transition Points previously submitted).
Each unit assessment transition uses multiple decision points and multiple indicators to assess candidates' mastery of content knowledge, skills, and professional dispositions.
The EPP will soon evaluate alumni impact on P-12 student learning and development via additional methods and measures, to be determined in concert with school professionals. In fact, the conversation of how to do so has already begun during the Dean’s annual meetings with area Superintendents (See 2015 minutes in Appendix F9; minutes of the 2016 Superintendents meetings will be provided during visit). This is yet another example of community involvement in the EPP’s assessment system.
The EPP has revised its data collection plan to accommodate the expanded UAS, which now includes multiple data points during each of the developmentally designed clinical placements. The plan (see the Unit Assessment Data Collection Plan previously submitted) outlines the schema of responsibility for the systematic gathering, summarizing, and evaluating of fully developed assessments. For newly designed assessment instruments in prototype form for piloting, the Ad Hoc Assessment Committee is charged with instrument development (see Appendix F5 for Ad Hoc Assessment Committee minutes). This committee uses existing EPP data and evolving professional demands to develop new assessments. For example, the CORE II portfolio SRBI assignment was developed in response to prior student teaching data and unit retreat discussions among faculty (see retreat minutes for May 2016 and August 2016 in Appendices F1 and F2). The faculty determined that candidates continue to struggle with differentiation of instruction and data use, which informed the tenets of the portfolio assignment.
Other evidence requested by the BOE Team for validation will be provided during the onsite visit.
Offsite Findings | Areas of Concern to Meet Standards | Eastern's Response
Standard 3

Field Experiences and Clinical Practice
Offsite Findings: This standard was found to be unmet in the 2014 review.
Area of Concern 1: The unit does not ensure that all initial and advanced candidates have experience working with students from diverse ethnic/racial backgrounds, students with exceptionalities, students from different socioeconomic groups, and English language learners. (ITP, ADV)
Eastern's Response: All initial candidates at the undergraduate and graduate levels experience a diversity of clinical placements according to community indicators (e.g., suburban, rural, and urban) and socio-economic status (e.g., Title II). Please refer to Appendices G2 and G3 for descriptions of Connecticut's district demographics. In addition, P-12 student factors of ethnic/racial background, gender, ability diversity, and home language diversity are considered. At the EPP, the intersectionality of these factors is strong. For example, one of our key school districts is largely minority-serving, with a 68% Hispanic student population and 43% of students from non-English-speaking homes; the district is also classified as a culturally rich urban district.
Candidates' placements are tracked by the EPP according to these diversity indicators. A completed spreadsheet outlining the diversity of clinical placements will be available at the onsite visit. In Appendices G4, G5, and G6, we provide documentation of current student teaching, pre-student teaching, and practicum placements. In addition to district- and school-level diversity factors, candidates are required to report on the student populations of their classroom placements and to reflect on the development of their content knowledge, pedagogical knowledge, skills, and professional dispositions across diverse contexts (see, for example, the previously submitted CORE I Portfolio Looking Backwards, Looking Forward assignment).
The newly developed year-long, residency-based internship program for master's level students allows for full immersion in a learning community, whereby candidates are able to demonstrate the professional roles they will assume in their schools. For example, one of our current interns participated in instructional rounds observations with teachers and school leaders, as well as in the discussion that followed (see Appendix J5 for the Instructional Rounds schedule at the internship program).
For all advanced candidates, the newly proposed advanced master's program is set to replace the existing programs. The existing programs include a single clinical experience, while the new advanced master's program will consist of a minimum of two dynamic placements. For example, candidates in the new advanced program will be required to demonstrate their content knowledge, pedagogical skills, and research acumen through the construction and delivery of a professional development workshop aimed at advancing the knowledge base of their peer educators (see the advanced master's assessment schema in Appendix D2).
For all clinical curricula, candidates independently evaluate their own teaching performance, alongside their university supervisor, at two critical transition points (mid-point and final). To further candidates' input in evaluating clinical placements, the revised end-of-program/exit survey will, beginning in fall 2016, permit candidates to evaluate clinical supervisors and host teachers alike (drafts of the revised end-of-program/exit survey will be provided at the onsite visit). These data will help triangulate the current practices (e.g., school leader recommendation and/or state TEAM training) for ensuring clinical faculty quality.
Additionally, in response to feedback from candidates about the timely allocation of clinical placements, the Office of Educational and Clinical Experiences (OECE) has drafted procedures to follow for future placement planning and operations (Appendix G1).
Other evidence requested by the BOE Team for validation will be provided during the onsite visit.
Offsite Findings | Areas of Concern to Meet Standards | Eastern's Response
Standard 4

Diversity
Offsite Findings: This standard was found to be unmet in the 2014 review.
Areas of Concern: None.
Eastern's Response: The EPP currently outpaces the state rate for minority teachers in the workforce. The latest Integrated Postsecondary Education Data System (IPEDS) report for the State of Connecticut (2013-2014) reveals that 3,558 candidates were enrolled across the state's 18 teacher education providers. An analysis of teacher education enrollment indicates that only 7% of matriculating candidates were racial/ethnic minorities. This rate is similar to the rate of in-service minority teachers across the state (which hovers around 7% annually), indicating a persistent disproportionality.
Data for Eastern Connecticut State University show that Eastern increased its overall percentage of minority teacher candidates from less than 5% in 2006 to 13% in 2011, with an overall average of 10% over the past three years (2014-2016). (See Appendix J3 on Minority Teacher Enrollment and Supply).
The EPP, however, seeks to grow the number of minority teachers prepared by Eastern to meet market demands. In fall 2016, in accordance with this goal, the EPP awarded two assistantships for the new Holmes Master's program and an additional Dean's Scholar award. Eastern is the first EPP in the state to host the program. Each of these candidates participates in initiatives to advance diversity within the profession as part of his/her assistantship. For example, one is working with a local high school to examine the root causes of the minority teacher shortage; the principal has approved the project (see email correspondence in Appendix J4 on Minority Teacher Recruitment).
In an effort to maximize cultural competence among Eastern's teacher candidates, the EPP has fostered a growing relationship with Spelman College, rated the nation's top Historically Black College and University by U.S. News and World Report. This relationship is significant in that it adds to the programming of diversity, which is imperative for preparing future educators who are empathetic to the cultural assets, needs, and perspectives of all learners. Because Spelman offers fewer teacher education programs than Eastern, the initial collaboration will begin between the Early Childhood programs of the two EPPs. The purpose of the collaboration, mutually developed, is to address cultural competence and leadership development (see Spelman and Eastern email correspondence in Appendix J6).
The EPP's programs also exemplify a commitment to diversity in ways that extend beyond the curriculum of teacher preparation. In essence, the footprint of programming reaches deep into the Windham region. For example, pre-education and other Eastern students participate in community-based activities and service learning that contribute to the well-being of the area. One activity is the tutoring of P-12 students in local schools, coordinated by the Center for Community Engagement. The Center for Early Childhood Education engages in research-based services to benefit educators and families (see the Report on CECE and the Staff Report in Appendices J1 and J2). Additionally, the Child and Family Development Resources Center (CFDRC) serves as a hub for equity: not only do Eastern students from different majors intern with the center, but families from diverse communities also benefit from its educational services.
Other evidence requested by the BOE Team for validation will be provided during the onsite visit.
Offsite Findings | Areas of Concern to Meet Standards | Eastern's Response
Standard 5

Faculty Qualifications, Performance and Development
Offsite Findings: This standard was found to be met in the 2014 review.
Areas of Concern: None.
Eastern's Response: Eastern's full-time EPP faculty hold doctorates in their cognate areas and have demonstrated professional experience in school settings that aligns with their certification areas and the levels they supervise. Please refer to the exhibit on faculty qualifications submitted with the IR. Additionally, faculty are meaningfully engaged in scholarship and creative activities that involve active research and professional development; please refer to the exhibits submitted with the IR. Faculty will be available to discuss their work during the visit. Some of our faculty's research and involvement with candidates is documented on the Center for Early Childhood Education website (http://www.easternct.edu/cece). All clinical faculty (both university supervisors and cooperating teachers) are licensed in the fields they teach and/or supervise and are recognized as master teachers by their school districts and by the state.
Our faculty stay current in their fields and demonstrate best practices in their own teaching, as evidenced by the sample syllabi submitted in the appendices. Faculty have been recognized by the university for their excellence in teaching and have received other distinguished awards. All courses in the certification programs are carefully aligned with the learning outcomes and proficiencies outlined in professional, state, and institutional standards (see sample syllabi in Appendices B1 and B2), and they include assessments and scoring rubrics that clearly outline expectations for candidates. Diversity is integrated in all programs, specifically through select courses (see Appendix B3 for a list of these courses), and candidates are expected to demonstrate their competency in the key assessments, including the student teaching evaluation (see the disaggregated data provided in Appendices E2-E8). Technology is integrated in all programs through coursework. EPP faculty use course evaluations and their own ongoing professional development as metrics to measure, enhance, and improve their teaching effectiveness.
All EPP faculty engage in scholarship within their field of study. Many collaborate with colleagues to engage in active inquiry related to their own teaching and cognate area. Please see exhibit submitted with IR related to faculty scholarship.
EPP faculty are actively engaged in service to their department, university and the larger community. Please see exhibit submitted with the IR. Several faculty are engaged with P-12 faculty offering professional development and other education-related services. More examples will be shared during the visit.
EPP faculty are continually evaluated, as per the union contract, in four categories: teaching, creative activity, service, and professional activity.
EPP faculty, especially new faculty, are supported to pursue scholarship through funding for conference presentations and attendance, reassigned research time and periodic sabbatical leave.
Other evidence requested by the BOE Team for validation will be provided during the onsite visit.
Offsite Findings | Areas of Concern to Meet Standards | Eastern's Response
Standard 6

Unit Governance and Resources
Offsite Findings: This standard was found to be unmet in the 2014 review.
Area of Concern 1: The unit's governance structure does not allow the unit to manage and coordinate the secondary education programs with the content fields and their faculties.
Eastern's Response: The EPP's operations are constructed largely on the merits of collaboration among faculty, staff, and community. The AAUP contract lays out rules for governance at the departmental level, including faculty credit-load allocations for the Chair and the construction of by-laws (see previously submitted departmental by-laws). However, the EPP consists of core faculty members across two departments: Education, and Kinesiology and Physical Education.
EPP faculty participate on various committees that support the aforementioned UAS and other operations of the unit. Each committee has a unique role through which assessment operations are implemented seamlessly (see documentation on committee responsibilities in Appendix F8). Starting in Academic Year 2016-2017, the EPP has determined that Assessment meetings will be held monthly on the second Thursday (see minutes and agenda in Appendices F3 and F7, with further minutes provided at the onsite visit) and Accreditation meetings monthly on the fourth Thursday (see minutes and agenda in Appendices F4 and F6, with further minutes provided at the onsite visit). These monthly Assessment and Accreditation meetings have helped streamline the UAS, provide timely feedback, and drive programmatic improvements.
Data for the EPP are collected, analyzed, and used for decision making to ensure sound procedures and content for the advancement of candidates' knowledge, skills, and professional dispositions, as well as program operations supported by community input. Data are collected at three critical transition points: program admission, mid-point (which includes early clinical experiences and entry into student teaching), and exit (leading up to licensure). The EPP currently assesses program alumni via graduate and employer surveys (see the previously submitted UAS Transition Points).
The EPP hosts retreats, whereby data from the UAS are used to make decisions regarding global changes (See example in appendices F1 and F2 of EPP retreat minutes). These retreats represent the connector between the end of one data cycle and the beginning of the next.
Area of Concern 2: The unit does not effectively engage P-12 teachers and other practicing educators in the design, implementation, and evaluation of the unit and its programs.
Eastern's Response: The EPP collaborates with P-12 educators on several fronts to assist in program design, implementation, and the evaluation of the unit and its programs. First, area superintendents meet with the dean at least once a year to provide input on the unit's updates and to discuss the benefits of programming for their respective districts. In particular, school administrators have indicated the need for greater clarity about the expectations of host teachers, resulting in a clear articulation of the roles and responsibilities of clinical faculty (see previously submitted Clinical Handbook).
In an effort to provide clarity, the staff of the Child and Family Development Resources Center (CFDRC), in collaboration with EPP faculty, created a system to better support practicum students at the CFDRC. The recommendations were based on observations of areas of need, improvement, and additional experience for candidates (see Appendix J7 for Collaboration with CFDRC).
Data for the EPP are collected, analyzed, and used for decision making to ensure sound procedures and content for the advancement of candidates' knowledge, skills, and professional dispositions, as well as program operations supported by community input. EPP faculty collaborate with content area faculty to share data, discuss program changes, and identify ways to support candidates across various disciplines (see minutes of the recent Collaborative Content Area Faculty meeting in Appendix H1).
Area of Concern 3: The unit has endured budget cuts for the past 6.5 years, which may affect its ability to provide high quality programs for candidates.
Eastern's Response: The EPP, like the rest of the university, has experienced reduced resources over the years. Yet through innovation and a commitment to providing high-quality programming to students, candidate services have not diminished.
Several steps have been taken to ensure continued efficiency in the face of declining resources. For all advanced candidates, the newly proposed advanced master's program is set to replace the existing programs. This is a strategy to ensure that resources are economized for maximum effect (see Appendix D1 for the advanced master's program rationale).
The program has reassigned faculty lines to more efficiently meet the needs of the EPP where applicable. For example, upon the retirement of an Early Childhood Education faculty member in Special Education, the EPP rewrote the faculty line to accommodate the needs of both the Early Childhood and Elementary Education programs (see the job description at http://www.easternct.edu/humanresources/jobs/faculty-positions/#SpecialEd). Additionally, upon the retirement of the Social Studies and Educational Technology faculty members, the EPP combined the two cognate areas to seek a candidate with a strong background in both social studies and educational technology (see the job description at http://www.easternct.edu/humanresources/jobs/faculty-positions/#SSED).
Nevertheless, there have been fiscal commitments, such as the acquisition of TK20, the electronic assessment data system, and the hiring of University Assistants to support assessment, clinical practice, certification, and accreditation. These hires have relieved EPP faculty of certain managerial responsibilities so they can focus on strong curriculum to support the development of candidates' knowledge, skills, and dispositions.
A comparison of the EPP budget to that of Social Work will be provided during the full visit.
Other evidence requested by the BOE Team for validation will be provided during the onsite visit.
