The design of the program is informed by an adult learning framework that postulates three key pedagogical elements of 21st-century classrooms: (1) collaborative work (i.e., groups or teams); (2) problem- or project-based learning; and (3) a practical, real-life (authentic) focus. This framework is also referred to as “relate – create – donate” (Kearsley & Shneiderman, 1998). The implementation of online threaded discussions, digital portfolios, and communities of practice follows this theory of “relate – create – donate.”
The purpose of the innovative technologies is to allow the third space of critical thinking, self-awareness, and praxis to occur for principal candidates. Program outcomes confirm that this “third space,” which is critical to the development of effective school leaders, transcends delivery mode and is attainable through design and pedagogical techniques, whether delivered through traditional or distance mechanisms. Furthermore, the third space becomes truly transformational when students attain not only critical thinking about leadership but also a self-awareness of what is informing, shaping, or possibly biasing their beliefs as school leaders. This higher-order cognition can be understood as a metacognitive process; that is, principal candidates are prepared to think about how and why they are thinking what they are thinking.
For both learning outcomes and assessment of student learning, individual portfolios are used; the University portfolio system is introduced in the program’s introductory course and used throughout for evaluation and representation of student learning outcomes, culminating in a capstone. The online program takes our usual program evaluation data a step further by enabling more frequent data collection and, we believe, better continued connections with students after graduation. We assess student learning via benchmark activities from key projects that are reviewed across students for program evaluation purposes; quarterly course evaluations by students; quarterly student satisfaction surveys; feedback forms from Cohort Instructors about the type and quality of student work and interactions with the Internship Supervisors; capstone portfolios; and exit interviews with graduating students. These data are reviewed quarterly where possible by program leadership to catch potential issues early. The full array of assessment, program evaluation, and student satisfaction data is collected and analyzed annually at the end of each cohort’s program and reviewed by the full group of instructional personnel to identify and implement the changes in content, instructional processes, assessments, and program support services needed to improve the program for the next cohort. Similar data are collected for the university-district program so that comparisons can and will be made to ensure that similar quality is present across all of our delivery models.