The California Social Work Education Center (CalSWEC) owns this program, and the California Macro Evaluation Subcommittee of the Statewide Training and Education Committee manages its training evaluation. The curriculum development oversight group develops the standards and processes for the curriculum, and a standardized evaluation form is then used each time training is provided.
This program is a child welfare training evaluation and strategic plan for assessing the child welfare training of personnel within the California Department of Social Services. This report covers the need for and requirements of child welfare training evaluation in California, along with the structure and processes used to develop the strategic plan. It also addresses the rationale for the plan, its specific provisions, including tasks and timeframes, and the progress made in implementing it. The plan was developed to assess and report the effectiveness of the training so that improvements can be made where necessary.
A system for tracking and reporting all case managers' completion of Core Training. Quality assurance for the development and delivery of revised Core Training in five content areas. Development and implementation of knowledge testing for the five content areas of Core Training. Development and use of skills assessment in one content area of Core Training. Assessment of the effectiveness of coaching as a strategy to increase transfer of learning for participants in Core Training. The findings from this evaluation will inform the development of the supervisory Common Core Training. Analysis of data at multiple levels of evaluation in order to build a “chain of evidence” about training effectiveness, so that broad questions about the impact of training on the achievement of agency and client outcomes can be answered. (CalSWEC, 2004)
Data about the effectiveness of training will be available at multiple levels (a chain of evidence) so that the overall question of training effectiveness can be better addressed. Data about training effectiveness will be based on rigorous evaluation designs. Curriculum writers and trainers will have data focused on specific components of training, allowing for targeted revisions of material and methods of delivery. Because child welfare training in California is developed and delivered by multiple training entities (RTAs/IUC, CalSWEC, counties), there are many variations of training. Evaluation provides a standardized process for the systematic review and assessment of these different approaches. (CalSWEC, 2004)
What issues were identified as a result of the evaluation? (Outcomes of the program evaluation)
Trainees were exposed to coursework on child maltreatment identification skills. The course was content valid and addressed the skills at the appropriate breadth and depth. Trainees found the course relevant and useful. Trainees learned the specific knowledge needed to identify child maltreatment (e.g., the legal definitions of abuse and neglect, the types of burns and fractures associated with maltreatment and those associated with accidents). Trainees began to master important skills, such as being able to: distinguish various types of external injuries (cigarette burns, rope marks, splash burns) from slides; recognize common indicator clusters in written case scenarios (e.g., child behaviors associated with sexual abuse); and examine a child for injuries. Trainees transfer this knowledge and skill to their jobs (e.g., correctly identifying types of injuries, accurately interpreting medical reports of injuries, making judgments about child maltreatment based on descriptions of child behaviors and physical evidence). Clients achieve better outcomes (e.g., children receive medical care when needed). Because the intervening variables are not controlled, competing explanations for trainee performance are still possible. For example, more workers may be accurately identifying child maltreatment because a particular supervisor emphasizes it, rather than because they learned about it in training, and clients may achieve better or worse outcomes for many reasons unrelated to training.
I believe this is a summative evaluation, because the focus is on the outcomes of the program, rather than a formative evaluation, which assesses the worth of a program while its activities are still taking shape. In this instance the evaluation was conducted by external program evaluators, contractors to CalSWEC for child welfare training evaluation.
This analysis of the case study will help me in planning my own program evaluation, or in assisting an external program evaluator, by giving me the experience needed to conduct this type of evaluation on my own. With this I will have the ability to start my own analysis and not feel overwhelmed about where to begin. Getting thrown to the wolves, so to speak, by having to work through these exercises in class really is a confidence booster once the task is completed. I know at first it hits you in the face like WHOA! But after I get started and chip away at it, I see that it is not such a monster after all.
Program Logic Model: ORI 105 Online Orientation and Student Success Evaluation Plan
Inputs
· Skills of instructional designers
· Experienced online instructors
· Approved course syllabus
· Student evaluations
· Professional development training/Online Teaching certificate
· Computer access for faculty, staff, and students
· ITS support offices
· Blackboard LMS
· Internet access
· Support of Administrative Council
Course Activities
· Use LMS and other tools to complete course requirements
· Use synchronous and asynchronous communication to collaborate with learners and instructors
· Retrieve and critically evaluate information on the Internet
· Assess online learning skills
· Instructor and course evaluation
Outputs
· Active and engaged learner participation in class discussion forums
· Multimedia presentation of group assignment
· Annotated bibliography project
· Reflection paper discussing the online learning experience
· Completed student and instructor evaluations
Outcomes
· Increased enrollment in online courses
· Increased success rate of online learners
· Improved quality of online learning and teaching at WCC
· Better-prepared online students
· Improved competence among online instructors
ORI 105 Online Orientation and Student Success is an orientation to the world of online learning that allows students to address any gaps in their readiness for success in online learning. The course is open to first-semester students at Wallace Community College and provides a wonderful learning experience for first-time online learners. It introduces students to the technologies, communication tools, and learning processes used in the online classroom. Students will: (1) be introduced to the differences between online and traditional learning; (2) identify their own individual learning styles and determine what adjustments, if any, may be required to succeed in an online course; (3) review the characteristics of successful online learners; (4) learn to use the vast resources of the Internet to facilitate learning; and (5) assess their own potential as online students in relation to these issues.
There has been much research and writing suggesting that dropout rates for online courses tend to be significantly higher than the dropout rates for traditional face-to-face courses. The reasons behind these outcomes commonly include one or more of the following:
- many students taking their first online course lack adequate computer skills;
- many students are relative newcomers to the Internet;
- some first-time online students have minimal or no prior experience integrating technology with human interaction in order to communicate effectively;
- many students who enroll in an online course do so without any way to assess whether this learning environment is appropriate for their learning style.
This evaluation is designed to determine the effectiveness of ORI 105 Online Orientation and Student Success in preparing students for success in the online classroom. The evaluation seeks to address questions from various stakeholders, including students, instructors, and administrators at the college.
Proposal of five evaluation questions that specify what my program evaluation is going to answer.
The five following questions will provide some information for my program evaluation on the ORI105 Online Orientation and Student Success Evaluation Plan.
1. What is the reasoning behind the development of this course?
2. What types of students participate in the class?
3. What are the intended objectives?
4. What is the intended outcome of the course?
5. What are the benefits of this course for the students?
The rationale for selecting the five questions above comes from my readings over the past few weeks: Fitzpatrick et al. (2010) point out that only a few select questions can be answered by any one particular method. However, if a single method can answer all of the questions posed, then it is up to the evaluator to select that single method of evaluation.
Clarify what is not being evaluated and why it should not be
The cost of the class is not going to be evaluated, because it is a required class, and many students who earn their GED through our workforce development program receive this class free of charge for passing their GED.
Identify what standards are reflected in the choice of evaluation questions
The standards reflected in the choice of evaluation questions come from (F2) Practical Procedures, (F4) Resource Use, and (A3) Reliable Information (Fitzpatrick, et al. 2010).
Identify which stakeholders should be involved in determining evaluation questions and explain why you think so. Explain what the role of the stakeholder should be in determining the evaluative criteria.
The stakeholders that should be included in choosing the evaluation questions are the instructors, school administrators, class participants, and administrators with expertise in giving orientation classes to students. The role of the stakeholders in determining the criteria is to stay in constant contact and communication with one another, as well as with the students in the class. Because communication is such an important factor, it can help stakeholders identify where a program is improving or whether it needs further monitoring.
|Stakeholders||Policy Makers||Operational Decisions||Provide Input for evaluation||Reaction||Interest only|
|Dean of Instructional Affairs||X||X||X||X|
- Stakeholders with decision authority
The findings of this evaluation will aid the Administrative Council and Dean of Instructional Affairs in making appropriate policy decisions that support the college’s mission and strategic plan.
- Stakeholders with direct responsibility
Division Directors and instructional designers may utilize the evaluation findings to assist them in making operational changes to improve curriculum development.
- Intended Beneficiaries
The primary beneficiaries of this evaluation will be the students who enroll in ORI 105 Online Orientation and Student Success course and the instructors who teach it.
- Disadvantaged Stakeholders
Because of student interest in more online courses, WCC has seen a dramatic increase in enrollment in its online courses and programs. This sustained popularity of online learning may eventually lead to a corresponding decrease in demand for traditional face-to-face courses on campus. Instructors and students who prefer traditional classroom teaching and learning may find themselves disadvantaged by reduced on-campus course offerings.
|Stakeholder||Informational Need||Use of Information|
|President’s Office||How many online students?||Annual report, marketing, recruiting|
|Dean of Instructional Affairs||What is the cost of the course?||Resource allocation, curriculum funding|
|Division Director||How many students register for online courses each semester? How many instructors teach online courses each semester? What is the percentage of students that successfully complete this course each semester?||Teaching assignments/course loads, recruitment|
|Students||What is the student/instructor ratio for this course? How accessible is the instructor for this course?||Selection of courses|
|Instructor||What percentage of students successfully complete this course each semester? What do online learners need to know to be successful in this course?|| |
|Design Team||Which groups of students are taking online courses? How many instructors are teaching online courses?|| |
|Accreditation Office||Does the course meet accreditation standards?||Program review|
|Faculty||Will students completing this course be better suited for online learning?||Technology challenge considerations|
|Administrative Council||Which courses are/should be approved for online delivery?||Review of programs|
|Stakeholder||Evaluation Question||Use of Findings|
|President’s Office||Are we adequately preparing students for online learning?||Annual reports, speeches, marketing, curriculum funding requests|
|Dean of Instructional Affairs||How effective are the college’s online courses?||Program improvement|
|Division Director||How effective is the instructor in an online environment? Are students reaching the desired outcomes?||Identify training/support needs|
|Students||Am I best suited for online learning?||Future course selection|
|Instructor||Are my online teaching strategies effective? Are my students learning?|| |
|Design Team||Does the course employ universal design concepts? Are course materials presented for various learning styles?|| |
|Accreditation Office||Does the course meet accreditation standards?||Program review|
|Faculty||Will students completing this course be prepared for online learning?||Technology integration considerations|
|Administrative Council||Are instructors effective in the online classroom? Are students achieving learning outcomes?|| |
EXPERTISE AND CONSUMER-ORIENTED APPROACHES
(Fitzpatrick, Sanders, & Worthen, 2010, pp. 142–149)
Strengths:
- Evaluation criteria are detailed and are assumed to be valued by the consumer.
- Can be used to evaluate products in various stages of development.
- Evaluations can incorporate the partners and the end users.
- Extensively used by government agencies and independent consumer companies.
- Scriven’s checklists account for needs, side effects, process, support for users, cost, and outcomes of performance.
- Emphasize the central role of expert judgment, experience, and human wisdom.
- Can provide efficient and effective recommendations based on expert evaluators.
Weaknesses:
- Identification of evaluation criteria can be quite difficult.
- The product’s stage of development may be unclear and may skew evaluation findings.
- Differences of opinion can cloud fact-finding or create bias during the evaluation.
- Not thoroughly discussed by professional evaluators.
- The evaluation process can be confined to the program outcome.
- Requires careful consideration of the types of experts needed to perform the evaluations.
- The presumed expertise of the experts can be a potential weakness in the evaluation.
PROGRAM-ORIENTED EVALUATION APPROACHES
(Fitzpatrick, Sanders, & Worthen, 2010, pp. 153–169)
The purposes of some set of activities are specified, and the evaluation then focuses on the extent to which those purposes, or objectives, are achieved.
Strengths:
- Many individuals have contributed to the development of this approach.
- Easy to use, since it revolves around the success or failure of the program objectives.
Weaknesses:
- The evaluation needs clear and precise objectives and a reasonable, measurable end point.
- There are numerous versions of this approach.
- Narrow focus on objectives and their measurement.
- Neglects program description: the need to gain an understanding of the setting in which the program operates and the effects of that setting on program success or failure.
- Oversimplifies the complexity of program delivery and context.
DECISION-ORIENTED EVALUATION APPROACHES
(Fitzpatrick, Sanders, & Worthen, 2010, pp. 172–188)
Strengths:
- Presents data to aid decision making.
- Focuses on managers and their needs.
- Provides a single path for completing a program evaluation.
- Offers a simple-to-follow process focused on making recommendations for decision makers.
Weaknesses:
- Tends to disregard stakeholders with less power.
- Evaluators could be described as “hired guns” who serve only the establishment.
- Evaluators may be unable to respond to questions or issues that may be significant, even critical, but that conflict with, or simply do not match, the concerns and questions of the decision maker who is the primary audience.
- Requires that the decisions, the program, and its setting remain stable and unchanging.
PARTICIPANT-ORIENTED EVALUATION APPROACHES
(Fitzpatrick, Sanders, & Worthen, 2010, pp. 189–230)
Strengths:
- Heavily involve stakeholders in the evaluation process.
- Evaluators are trained in research methods.
Weaknesses:
- Evaluators have to balance the advantages of undertaking a participatory approach in each new setting, considering the nature of the program, the organization, and the potential stakeholders to involve.
- Relies on the evaluator’s abilities rather than a set process.
- The feasibility, or manageability, of implementing a successful participative study.
- The credibility of the results to those who do not participate in the process; credibility of results without full participation.
- Relies on the competence of stakeholders to perform the tasks that some approaches call for.
Explain your choice of model for your program evaluation: In my opinion, the expertise- and consumer-oriented approaches fit best, because the course was designed by educators and is driven by student outcomes. This is a newly developed course and would benefit from an evaluation, since these evaluations can include the stakeholders and the end users.
For this week’s assignment, I chose a different set of questions and opted not to use the recommended questions. I did this mainly because I knew everyone would be asking and answering those same set of questions. So, I figured I would branch out a bit and actually try to grow myself a bit more by coming up with my own questions, straight from the hip.
I asked the following set of questions of two of my classmates, Gerard Apanewicz and Felicia McGhee, and I got a great response from both of them. Felicia replied to my questions after 6 p.m. on Sunday night, and I was without internet until today, so I am putting this together now for submission. Below are the questions that I sent to my classmates and a summary of their responses.
- Given your understanding of where we are in the program, what would you have expected to see happen that may be missing?
From both of my classmates I gathered that they have been online learners for a good while and miss the true “hands-on” aspect of learning, such as having an actual mock evaluation to go through as if it were a live evaluation. The program seems to be lacking in that department: rather than finding our own program to evaluate, we could be given a list of programs to pick from, or even be put into groups to conduct a program evaluation as a simulated team of evaluators, with a final group project submission, as in real life.
- How will you gather data for this evaluation? Would you use a single method or mixed methods?
The consensus among the three of us was a mixed method of gathering information, because we all agree that one method alone would not give the most accurate results.
- What evaluation method would you use to evaluate this program? Explain.
All parties have answered mainly that we would all use the participant oriented evaluation model, due to the major stakeholders being faculty, students, and the administration of the college.
- Explain what design model you would use based on your evaluation method.
Answers here were more diverse, with one classmate choosing the descriptive design model because the questions asked are descriptive in nature. The other answer I got was responsive evaluation, due to the responsiveness and flexibility reflected in the evaluation clock Stake developed to represent recurring events in a responsive evaluation (Fitzpatrick et al., 2010).
- Would you use a summative or formative evaluation type? Explain
On this question I got mixed answers as well: one answered formative, since this is an active program. The other answered summative, based on knowing some recent graduates and on summative evaluation being used to judge the worth of a program with a focus on its outcomes.
- Identify the stakeholders in this evaluation and what are their specific interests?
The general consensus was current students (us), prior students, faculty, and the administration. The current students have firsthand knowledge of the course in its present state. Prior students would offer insight into previous versions of the course, whether from a positive or negative view. Faculty would offer their expertise and knowledge and a look into the problems students have during the course. The administration has an interest in the program in two ways: first, they would provide the student contact information; second, they help provide funding and know the costs of the program as a whole.
- Who would be the key stakeholders in this evaluation? Are there any political contexts that may skew the outcome of the evaluation?
- Current Students
- Previous Students
- Administration/Funding Department
There could be political pressure from the faculty as well as the administration and funding department. The faculty may feel the course is performing well, and any change suggested might be taken as a slight to their performance or ego. From the admin/funding side, any evaluation that may force a change to the current way of doing things will have an impact on finances. There could be pressure to allude to a favorable outcome to avoid any changes in budget.
How did the experience of conducting each interview differ?
The experience of conducting each interview was as different as night and day. On one hand, I got a quick response, and the questions they wanted answered were sent out in a timely manner. On the other hand, I did not get a set of questions at all from the other classmate; I only got a response, and it came too late in the evening on the day the paper was due to do anything with, even if I had had internet.
What did you learn about the process of conducting an interview?
The process of conducting an interview in this capacity was much more challenging than conducting a job interview. I would much rather have a set time and place to meet than work via email, because there are so many procrastinators in the world today.
Worksheet: Sign-Off Form ORI 105
Project Name: XYZ System Upgrade
I have reviewed the following deliverables as of the date identified below:
- December 18, 2017
I have found these deliverables to meet with my approval, with the following exceptions:
- All deliverables meet with my approval.
I hereby give my approval to proceed with the evolution of these deliverables to the next
stage of development in order to meet the project objectives in a timely fashion.
I understand that any changes (additions, deletions, or modifications) to the fundamental
structure, underlying design, or the specific features of these deliverables might result in:
- Slippage of the completion date for these deliverables
- Additional resource requirements
- Additional costs
_Jarrod Craven___________________________________________ [Signature]
Jarrod Craven Project Manager
Corporate Program Evaluators
Date: _December 18, 2017____
Fitzpatrick, J., Sanders, J., & Worthen, B. (2010). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Boston, MA: Pearson.
Mohan, R., & Sullivan, K. (2006). Managing the politics of evaluation to achieve impact. New Directions for Evaluation, 112, 7–23. doi:10.1002/ev.204
Parry, C., & Berdie, J. (2004). California Social Work Education Center: Training evaluation framework report. Retrieved from http://calswec.berkeley.edu/calSWEC/MacroEvalFrameworkReportFinal.pdf
Ritter, L., & Sue, V. (2007). Selecting a sample. New Directions for Evaluation, 115, 23–28.