Program Evaluation Tools and Strategies for Instructional Technology

Saturday, June 23, 2007




Notes on:

NECC 2007 Program Evaluation Tools and Strategies for Instructional Technology

jsun@sun-associates.com 978-251-1600 x204

Doing program evaluation for seven history grants.

Redoing the Alabama tech plan; helping them develop their own measures so that they take ownership. Districts need to make meaning of the data being gathered for the state.



Handouts and PPT:

NECC 2007 Workshops



Evaluation is qualitative as well as quantitative; their work is mostly qualitative.



Many evaluations are purely summative, but evaluation should be both formative and summative.

By Definition, Evaluation…

Is both formative and summative

Helps clarify project goals, processes, products

Should be tied to indicators of success written for your project’s goals

Is not a “test” or simply a checklist of completed activities

Qualitatively, are you achieving your goals?

What adjustments can be made to your project to realize greater success?





Page 5 in the workbook: diagram (slide "A Three-Phase Evaluation Process")





Indicators drive the data collection

Don't just throw out the net and see what happens, though data collection is often done that way; instead, design the survey around what you know you are looking for.



Handout: tasks

1. initial meeting: getting to know partners

2. establish identity of the evaluation

3. establish project lead

4. create contract work (work schedules, procedures)

5. create eval committee

6. hold first eval committee meeting; its purpose is to create project benchmark indicator rubrics and data collection expectations

7. establish schedule for additional committee meetings throughout the year

8. establish reporting schedule

9. review and finalize rubrics and data collection tools

10. create data collection schedule

11. collect data

12. data analysis

13. reporting

Re #5: the presenter feels very strongly that the evaluator should work with a committee of stakeholders. Participants have to do the work of meeting the goals, so wouldn't you want them providing input on the evaluation process? If not students themselves, then parents as advocates for students.



Handout (slide nine): Project Sample, a diagram of a logic map called Supportive Reading Environments

  1. Project inputs: needs; what is the project addressing?
  2. Strategies
  3. Intermediate goals, outcomes, or objectives
  4. Ultimate project goal or outcome (or vision)

Evaluation questions echo the wording of the intermediate goals; the wording is "To what extent have...[wording of the goal]?" Don't word it as a yes/no question ("have they or haven't they").
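That "to what extent" wording is mechanical enough to capture in a template; here is a small illustrative Python sketch (my own, not something from the workshop) that wraps a goal statement in the open-ended phrasing:

```python
def eval_question(goal: str) -> str:
    """Wrap a project goal in open-ended 'to what extent' wording,
    avoiding a yes/no ('have they or haven't they') framing."""
    return f"To what extent have {goal}?"

print(eval_question("teachers integrated technology into daily instruction"))
# To what extent have teachers integrated technology into daily instruction?
```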



Rubrics have four levels; the highest level is the "Wow" level. Indicators are a mix of quantitative and qualitative.



Indicator statements criteria (slide 13)



We looked at some proposals and had to develop the logic map: needs, strategies, intermediate goals, ultimate goal. Then we spent time discussing the rubric for the ultimate goal.



To Summarize...

Start with your proposal or technology plan

Logic map the connections between actions, objectives, and goals

From your goals/objectives, develop evaluation questions

Questions lead to indicators

Indicators are organized into rubrics

Data collection flows from that rubric
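The chain just summarized can also be sketched as a data structure. This is a minimal illustration under my own assumptions; the class names, rubric labels, and sample goal are hypothetical, not from the workshop:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Indicator:
    statement: str
    # Four-level rubric, lowest to highest; the top level is the "Wow" level.
    rubric: Tuple[str, str, str, str] = (
        "beginning", "developing", "proficient", "wow",
    )

@dataclass
class Goal:
    text: str
    indicators: List[Indicator] = field(default_factory=list)

    def eval_question(self) -> str:
        # Open-ended "to what extent" wording, never yes/no.
        return f"To what extent have {self.text}?"

    def data_to_collect(self) -> List[str]:
        # Data collection flows from the rubric's indicators.
        return [ind.statement for ind in self.indicators]

goal = Goal(
    "teachers used technology to address student needs",
    [Indicator("teachers integrate technology into daily lessons")],
)
print(goal.eval_question())
# To what extent have teachers used technology to address student needs?
```

The point of the sketch is the direction of flow: goals generate the evaluation questions, questions generate indicators, and the indicators (organized into rubrics) determine what data gets collected.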

Evidence/Data Collection

Classroom observation, interviews, and work-product review

What are teachers doing on a day-to-day basis to address student needs?

Focus groups and surveys

Measuring teacher satisfaction

Triangulation <http://www.sun-associates.com/eval/samples/samplesurv.html> with data from administrators and staff

Do other groups confirm that teachers are being served?



Focusing on just a survey is not very reliable since it is self-reported data; it may be more cost-effective, but you need to triangulate.









Reports do no good if they are not disseminated. Share them with:

School committee

press releases

community meetings











