At the Bok Center we view evaluation as an integral piece of instruction, curriculum development, and university strategic planning.
It is our mission to:
- Communicate to our faculty and the university community the value and purpose of evaluation in a way that is meaningful to them.
- Partner with faculty, academic, and co-curricular programs to help them build and integrate systematic processes of self-reflection into their teaching and programming.
- Enable continuous learning and development by organizing and promoting assessment and evaluation materials, workshops, and internship opportunities for both faculty and graduate students.
Collaborating Across the Disciplines
Universities have generally arranged themselves into units that reflect academic disciplines, their attendant cultures, and their specialized ways of knowing. While a great deal of research has focused on learning and instruction, only a limited attempt has been made to understand how disciplinary context affects how students learn, how and what we teach, and what and how we measure.
Bringing Research to Practice
Here at the Bok Center, we offer a new educational space where research, evaluation and practice intersect. In expanding our role to incorporate research and evaluation, the Center is uniquely positioned to serve as a model of how to bridge the gap between research and practice and to disseminate knowledge.
A Highly Centralized Approach
Evaluation works best in an office that is independent, impartial, and insulated from the influence of the programs being evaluated. This approach avoids conflicts of interest. The Bok Center reports to the Dean of FAS, which allows for more objective evaluation of programs in the College.
Evaluation also works best when it is highly centralized, avoiding the liabilities of a decentralized approach, including: 1) cost escalation, as few economies of scale can be realized and each lesson learned must be relearned in every case; 2) the inability to ensure uniform and comprehensive implementation across the FAS; 3) an absence of coordination with reporting requirements; and 4) the inability to focus the best expertise on evaluation issues. Finally, successful implementation of evaluation in the academic sector should be tied to the scholarship of teaching and learning; this cannot occur effectively under the reporting structure of a line administrator or administrative office. If line administrators become too directly involved in setting academic policy, many groups of faculty will quickly become disengaged from evaluation. At the Bok Center, our focus is on teaching and learning: our Center is overseen by a faculty director, and our evaluation team works closely with pedagogy experts in the disciplines and at the Graduate School of Education.
Building Knowledge Networks
In many universities, graduate students constitute the largest teaching resource. Incorporating integrative evaluation training into their professional development can help address local challenges in cultivating more self-reflective instructional practices within Harvard departments and programs.
How We Conduct Our Work
Impartiality is the absence of bias in the methodology and approach of the evaluation, and in the consideration and presentation of a program's achievements and challenges. Independence is freedom from conflicts of interest and the ability to be transparent in reporting results. Independence and impartiality are required at all stages of the evaluation process, from designing evaluation instruments and protocols to presenting findings.
However, evaluation does not occur in a vacuum. A good evaluation requires cooperation between program staff and evaluators in order to generate the most valid data and interpretations to answer the questions set forth by the program. As part of this cooperation, each party brings different strengths to the evaluation process. The program staff brings real questions and inquiries about their activities, resources and needs, content expertise about the program, and the context for the evaluation. The Derek Bok Research and Evaluation team offers objectivity, methodological expertise, confidentiality, and constructive and relevant recommendations.
Program staff will work with the Bok team, which will provide guidance in articulating program goals and framing evaluation questions. Program staff will be given the opportunity to review draft evaluation plans, instruments, and reports in order to correct any factual errors and to comment on the findings and recommendations. This will be done in a way that maintains the independence and impartiality of the evaluation team, so that the resulting evaluation is as objective as possible. The Bok Center Research and Evaluation team will lead the evaluation and will explore with the program staff the strengths and shortcomings of various evaluation questions and of the various approaches that might be used to answer them. Good question design is a critical dimension of the quality of an evaluation. To maintain objectivity and to ensure scale validity, item development will be the responsibility of the evaluation team, not the program staff. We work closely with program staff to ensure that valid and reliable items are created to match the program’s evaluation questions. This is an iterative process. Once the evaluation’s substantive content has been reviewed by the program staff, the Bok Center evaluation team will finalize all instruments, collect and analyze data, and generate reports.
We as evaluators commit to the principles developed by the American Evaluation Association to guide the professional practice of evaluation:
- Systematic Inquiry: Evaluators conduct systematic, data-based inquiries about whatever is being evaluated.
- Competence: Evaluators provide competent performance to stakeholders.
- Integrity/Honesty: Evaluators ensure the honesty and integrity of the entire evaluation process.
- Respect for People: Evaluators respect the security, dignity and self-worth of the respondents, program participants, clients, and other stakeholders with whom they interact.
- Responsibilities for General and Public Welfare: Evaluators articulate and take into account the diversity of interests and values that may be related to the general and public welfare.
While most of the research conducted by Bok Research and Evaluation (R&E) is intended for internal use within the university, occasionally one of our clients will have the opportunity to publish results directly related to the work that R&E conducted for the client. If a client seeks to publish in a journal any aspect of the work to which R&E has substantially contributed, we have the following expectations:
- Depending on the nature of the data that is to be collected, IRB approval may be necessary. If Bok R&E is going to be substantially involved in data collection, analysis, and/or reporting, then the Director of R&E should be at least co-PI on the IRB project protocol.
- If data collected from the student body in FAS is to be published, approval must be received from the Dean of FAS.
If Bok R&E contributes significantly to the publication, our expectation is that the relevant team members will be authors on the paper. Our criteria for authorship are based on the “Uniform Requirements for Manuscripts Submitted to Biomedical Journals,” developed by the ICMJE and last updated in December 2016. Many journals voluntarily follow these recommendations. If all of the following criteria are met, we expect that the contributing Bok R&E team member(s) will be included as authors on the publication:
- “Substantial contributions to conception and design, or acquisition of data, or analysis and interpretation of data.
- Drafting the information product or revising it critically for important intellectual content; and
- Final approval of the version to be published.”