
reform strategies continue to proliferate, leaving behind a trail of broken promises, incomplete
and uneven implementation, and students who still cannot read and cipher (Calkins et al.,
2007). Efforts to augment inadequate budgets, including the imposition of whole school
reform models on low SES districts, have resulted in little meaningful or sustained change in
the low-performing poor districts throughout the state (Walker & Gutmore, 2000).
Further complicating this dilemma is the fact that administrators and teachers do not see eye to eye on the nature of the problems facing urban schools (Rand, 2007). On the one hand,
superintendents critique the absence of adequate funding for reform efforts. On the other,
teachers see the failure to align curriculum and state standards as one explanation for low
scores. Citing insufficient time for teaching and planning, teachers express their frustration
with unrealistic expectations embedded in the NCLB legislation (Le Floch et al., 2007).
Hoping to motivate educators to increase student achievement, states attempt to provide
rewards, assistance to low-performing schools, or sanctions (USDOE, 2000). Yet Goertz and
Duffy (2003) reported that “educators in the CPRE study generally faced few formal
consequences for not meeting school, district, and/or state performance goals beyond those
imposed by the state” (p. 5). Noting that states rarely apply sanctions to low-performing schools because they lack the capacity, both fiscal and human, to support effective change, the researchers indicated the need for clear goals, incentives, teacher motivation (including capacity-building efforts), and teacher knowledge and mastery of effective instructional techniques.


METHOD


Data Collection


Data for the district monitoring were collected from the River City School District and
were analyzed by the five QSAC team members. The monitoring team observed that the
document checklist had a number of planned redundancies; by working back and forth between the DPRs, they were able to compare key indicators and acquire a fairly clear picture of
district performance. The second stage of monitoring, extended interviews, was conducted
using QSAC protocols. The third and final stage required the monitoring team to complete
comprehensive reviews of one-third of the district’s schools.
During the second phase of this study, secondary data from across all pilot districts, including monitoring reports from the remaining districts, were downloaded from the NJDOE’s public website and analyzed. The site included Benchmark Assessment Reports, Curriculum
Audits, and data from all five QSAC DPRs. Data points within each of the five indicators
provided triangulation for key proxies. Specifically, the Instruction and Program DPR and the
Learner-Centered Instruction domain of CAPA helped fill in some of the missing data
pertaining to the teaching and learning process throughout the district (Doolittle et al., 2007).
For example, the use of assessment to improve instruction appears across all five DPRs. Likewise, professional development, collaboration, and efficient management appear in the DPRs.


Data Analysis


Using the constant comparative method devised by Glaser and Strauss (1967), data were
coded, organized into categories, and then collapsed into themes. Our analysis focused on
effective school improvement derived from the corpus of research literature associated with
