ABET Accreditation

The UCSC Electrical Engineering Department offers a bachelor of science (B.S.) degree in electrical engineering which is accredited by ABET.

In Fall 2013, 193 students were enrolled in the Electrical Engineering B.S. program.

From Fall 2013 to Summer 2014, 42 B.S. degrees in electrical engineering were awarded.

Program Learning Outcomes: Definition, Alignment and Assessment
Electrical Engineering Department

I. Introduction

The electrical engineering program is accredited by the Engineering Accreditation Commission of ABET. Accreditation matters in engineering because, for example, graduates of engineering departments often become professional engineers who are legally responsible for engineering decisions. The Electrical Engineering B.S. is the only ABET-accredited program at UC Santa Cruz. ABET accreditation is a demanding process aimed at continuous improvement of the programs involved, as is the PLO process that is the subject of this document. The concept of continuous improvement in an industry or an institution is often credited to W. Edwards Deming and known as the Deming method. Deming put the method into practice in Japan after World War II at the request of Gen. MacArthur, and it is credited with a very significant impact on the postwar recovery of Japan, helping a devastated country become the home of many of the highest quality products in the world. The EE Department has now been through two cycles of ABET accreditation with notable success: a virtually clean pass on both the 2003 and 2009 visits.

We plan to use our experience with the ABET process to implement the PLO assessment process for our graduate programs, extending the current undergraduate ABET process to our masters and doctoral programs. Our PLO assessment process for the graduate programs follows the guidance in the campus document entitled "Guidelines for the Development and Assessment of Program Learning Outcomes." The heart of such continuous improvement programs is illustrated on page 3 of the guidelines document and is reproduced in figure 1 below.

We propose to use our existing ABET accreditation process for PLO assessment in the undergraduate program, since it has already proved valuable in improving the EE department's undergraduate program. We describe it below in section II. Part of the existing ABET machinery is a continuous improvement program (CIP) committee that performs the analysis, findings and recommendations process; as discussed here, we will extend that committee's charge to include the PLO assessment process for the MS and PhD programs.

Fig. 1. Schematic illustration of continuous improvement (CIP) program process (after UCSC, Guidelines for the Development and Assessment of Program Learning Outcomes, 2013)

II. Undergraduate Program

A. ABET experience and example

For the undergraduate program we plan to make use of our existing ABET accreditation process, which serves the same continuous improvement function as the PLO assessment described in the UCSC guidelines; it already exists and has been shown to work effectively. To demonstrate its effectiveness, we include an example of the process output, namely recognition of shortcomings, directions for improvement and specific recommendations (Appendix A).

B. Definition of PLO's for undergraduate program

Table 1 lists the PLOs for the undergraduate program; they correspond to the ABET student outcomes, (a) through (k).

Table 1: Program Learning Outcomes for EE Undergraduate Program

  a. An ability to apply knowledge of mathematics, science, and engineering
  b. An ability to design and conduct experiments, as well as to analyze and interpret data
  c. An ability to design a system, component, or process to meet desired needs within realistic constraints, such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability
  d. An ability to function on multidisciplinary teams
  e. An ability to identify, formulate, and solve engineering problems
  f. An understanding of professional and ethical responsibility
  g. An ability to communicate effectively
  h. The broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context
  i. A recognition of the need for, and an ability to engage in, life-long learning
  j. A knowledge of contemporary issues
  k. An ability to use the techniques, skills, and modern engineering tools necessary for engineering practice

The correspondence between these outcomes and the example undergraduate outcomes shown in the UCSC Guidelines is close and obvious. The outcomes of Table 1 are well suited to engineering departments and are widely accepted in the USA and internationally (see www.abet.org).

C. Alignment matrix

The alignment matrix for the PLOs above is shown in Table 2 below. The courses used in this alignment matrix are the required courses for undergraduates in electrical engineering. As shown by the S and M entries, many courses support the outcomes listed above, but a smaller number are used to sample the results for outcomes assessment. Only those courses that are sampled for assessment are shown. We have grouped them into core EE courses, supporting courses and capstone experience courses. The EE department has used this method for outcomes assessment in the undergraduate program for some 10 years and has found that it works effectively. Two ABET visits over this time have been very successful, and the EE program has been accredited since its inception (the 2003 ABET visit accreditation was backdated to cover all graduates since the program started in 1999). An ABET visit is an intensive, in-person campus visit by an examiner, roughly equivalent to the site visit for a department's external review.

D. Assessment process

The assessment process is well characterized by the block diagram in figure 2.

Fig. 2. Block diagram of information sources and flow for outcomes assessment process. CIP refers to Continuous Improvement Program. Note similarities with figure 1.


Table 2: Alignment Matrix Linking EE Program Outcomes with Courses for EE Majors

(Courses marked S support an outcome. Courses marked M measure an outcome.)

| Required Courses                    | a | b | c | d | e | f | g | h | i | j | k |
|-------------------------------------|---|---|---|---|---|---|---|---|---|---|---|
| Core EE Undergraduate Courses       |   |   |   |   |   |   |   |   |   |   |   |
| EE 101: Intro. to Electronics/Lab   | M | M | S | S | S |   |   |   |   |   |   |
| EE 80T: Technology & How It Works   | S |   |   |   |   | S | S | M | S | M |   |
| EE 103: Signals & Systems           | M | S | S |   |   |   |   |   |   |   |   |
| EE 135: EM Fields & Waves/Lab       | S | M |   |   | M |   |   |   |   |   | S |
| EE 145: Properties of Materials/Lab | S | M | S |   | S |   | S |   |   |   | S |
| EE 151*: Communications Systems     | S |   | S |   | M |   |   |   |   |   | S |
| EE 171: Analog Electronics/Lab      | M | M | M |   | S |   |   |   |   |   | M |
| Supporting Undergraduate Courses    |   |   |   |   |   |   |   |   |   |   |   |
| CE 80E: Engineering Ethics          |   |   |   |   |   | M | S | S |   | S |   |
| CE 100: Logic Design                | S | S | M |   |   |   | S |   |   |   | M |
| CE 107: Math. Methods Sys. Anal.    | M | S |   |   |   |   |   |   |   |   | S |
| CE 185: Technical Writing           |   |   |   |   |   | S | M |   | S |   |   |
| Engr 27: Math. Meth. for Engineers  | M |   |   |   |   |   |   |   |   |   | S |
| Capstone Experience Courses         |   |   |   |   |   |   |   |   |   |   |   |
| EE 129A: Capstone Design Project    | S | S | S | S | S |   | S |   | M |   |   |
| EE 129B,C: Capstone Design Project  | S | S | S | M | S | M | M | M | S | M | M |
| EE 195: Senior Thesis               | S | S | S | M | S | M | M | M | S | M | M |

In figure 2 the last block contains the continuous improvement process (CIP) that involves the analysis of the data and the meeting of the CIP committee that formulates the recommendations. The process illustrated in figure 2 is done each year as required by ABET. A sample of the resulting recommendations is given in Appendix A. When the PLO assessment program is implemented for our graduate degrees, the department will have similar sets of recommendations for the graduate programs, issued on the assessment schedule discussed in section III below.

The rubrics for the assessment process in figure 2 are lengthy and not repeated here. They are similar to the rubrics in Section III below and can be found in their entirety in the EE Department’s Self-Study for the 2009 ABET visit, namely p. 245ff in “ABET Self-Study Report, 2009”, EE Dept., UCSC, June 26, 2009. A pdf copy can be supplied on request to the EE Dept. Manager.

III. Graduate Program

For the graduate program we define the program learning outcomes for the MS and PhD jointly, and likewise align the PLOs with courses, examinations and other documents jointly. However, we treat the MS EE and PhD EE assessments separately.

A. PLO definition

In the table below we define the five PLOs for the Masters program and the six PLOs for the PhD program. The additional PLO for the PhD program recognizes that the PhD degree should confer the ability to do independent research in the graduate’s future career, whereas the Masters degree does not require the same ability.

Table 3: PLOs for the Masters and PhD Graduate Programs

| EE Dept. PLO | MSEE | PhDEE |
|---|---|---|
| 1 | Proficiency in understanding fundamental knowledge in electrical engineering | Mastery of a broad range of fundamental knowledge in the field of electrical engineering |
| 2 | Proficiency in using electrical engineering expertise in an applications area | Mastery of electrical engineering expertise in an applications area |
| 3 | Ability to plan a research project and carry it successfully to completion | Ability to formulate a theoretical and/or experimental problem in electrical engineering and solve it |
| 4 | Ability to communicate research objectives, work and results in a competent technical report | Ability to communicate research objectives, work and results in competent technical presentations and reports |
| 5 | A high standard of professional and research ethics | A high standard of professional and research ethics |
| 6 |  | Ability to independently recognize an important research problem, plan a corresponding research project and carry it successfully to completion |

B. PLO alignment

The data sources for the assessment of the PLOs are selected for both direct and indirect evidence and are shown in Table 4 below. The rationale for the selection is summarized below:

  • PLO 1: The academic courses aligned with PLO 1 are the more fundamental or introductory courses, especially those typically taught every year or most frequently. Neither the Masters nor the doctoral program has specifically required courses; hence, we use an extended list of courses so that students in the several areas of emphasis covered by the EE Department are sampled.
  • PLO 2: The courses aligned with PLO 2 are the more specialized courses that emphasize applications of fundamentals in an applications area and are typically taught every other year or less frequently. Again, we use an extended list of courses so that students in the several areas of emphasis covered by the EE Department are sampled.
  • PLO 3: For this PLO we include one course in which a project is done, allowing assessment of students' ability to plan and complete a research project. The thesis and other evidence are also included.
  • PLO 4: Again, for this PLO we include one course in which a project is done, allowing assessment of a student’s ability to communicate the findings and conclusions of a project in a report. The thesis and other evidence are also included.
  • PLO 5: For this PLO we include assessment information from all sources in Table 4. Since this PLO concerns professional and research ethics, the assessment looks especially for cases where there are violations of professional and research ethics. Since violations are likely to be few, we include all sources so as to catch any that do occur.
  • All PLOs: For both the MS and PhD programs, all PLOs are assessed using the thesis itself as well as the advisor’s observation of the research practice that the student exhibits during the research leading to the thesis. In the case of the PhD program, two additional sources of assessment data are provided by the preliminary examination and the advancement to candidacy presentation.

C. MSEE Assessment

In our discussion of PLO assessment for the Masters program we first exhibit the matrix summarizing the assessment process (Table 5) and then move on to notes regarding the matrix and the rubrics of the assessment process.

1. Matrix: In Table 5 we show a summary of the PLO assessment process for the MSEE program, including timing, evidence sources, population sampled, assessment approach and schedule of evidence collection, analysis and reporting.

Table 4: Alignment of PLOs for the Masters and PhD Graduate Programs with Courses, Examinations and Other Documents

Integrated MS and PhD Curriculum Matrix for EE Department

| Assessment Data Sources               | PLO 1 | PLO 2 | PLO 3 | PLO 4 | PLO 5 | PLO 6 (PhD only) |
|---------------------------------------|-------|-------|-------|-------|-------|------------------|
| EE 211 Intro. Nanotechnology          | X     |       |       |       | X     |                  |
| EE 215 MEMS Design                    |       | X     | X     | X     | X     |                  |
| EE 216 Nanomaterials & Devices        |       |       |       |       | X     |                  |
| EE 221 Adv. Analog IC's               |       | X     |       |       | X     |                  |
| EE 222 High-Speed Low-Power IC Design |       | X     |       |       | X     |                  |
| EE 226 CMOS RF IC Design              |       | X     |       |       | X     |                  |
| EE 230 Optical Fiber Comm.            | X     |       |       |       | X     |                  |
| EE 236 Integrated Biophotonics        |       | X     |       |       | X     |                  |
| EE 241 Feedback Control               | X     |       |       |       | X     |                  |
| EE 250 Digital Signal Processing      | X     |       |       |       | X     |                  |
| EE 251 Digital Communication          |       |       |       |       | X     |                  |
| EE 252 Wireless Communications        |       | X     |       |       | X     |                  |
| EE 253 Information Theory             | X     |       |       |       | X     |                  |
| EE 262 Statistical Signal Processing  | X     |       |       |       | X     |                  |
| PhD Preliminary Exam                  |       |       |       |       |       |                  |
| Advancement to Candidacy (PhD only)   |       | A     | A     |       |       | A                |
| Thesis Defense (PhD only)             |       |       | A     | A     | A     | A                |
| MS or PhD Thesis                      | A     | A     | A     | A     | A     | A                |
| MS/PhD Research Practice              | A     | A     | A     | A     | A     | A                |

X = data is collected for PLO assessment; A = students demonstrate the PLO, and assessment evidence is collected.

Table 5: MS EE PLO Assessment Summary Matrix

| Year | PLO | Type of evidence and its source (note if it needs to be developed) | Population (who will be assessed) | Assessment approach & tools | When evidence will be collected | Analysis, report, recommendations |
|---|---|---|---|---|---|---|
| 2013-15 | PLO 3: Research project ability | Direct evidence: EE 215 project & MS thesis | All students graduating | Analytic rubric with specified standards for MS level | Fall 2013 through Spring 2015 | Fall 2015 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2013 through Spring 2015 | |
| | PLO 4: Communication | Direct evidence: EE 215 project & MS thesis | All students graduating | Analytic rubric with specified standards for MS level | Fall 2013 through Spring 2015 | Fall 2015 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2013 through Spring 2015 | |
| | PLO 5: Ethics | Direct evidence: EE 215 project, academic integrity | All students graduating | Analytic rubric with specified standards for MS level | Fall 2013 through Spring 2015 | Fall 2015 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2013 through Spring 2015 | |
| 2015-17 | PLO 1: Proficiency with fundamental knowledge | Direct evidence: grades in aligned courses, student evaluations & MS thesis | All students taking aligned courses & those graduating | Analytic rubric with specified standards for MS level | Fall 2015 through Spring 2017 | Fall 2017 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2015 through Spring 2017 | |
| | PLO 2: Proficiency with the application | Direct evidence: grades in aligned courses, student evaluations & MS thesis | All students taking aligned courses & those graduating | Analytic rubric with specified standards for MS level | Fall 2015 through Spring 2017 | Fall 2017 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2015 through Spring 2017 | |

2. Notes: In the survey of the MS thesis committees, referred to several times in the assessment matrix, a number of questions are addressed, e.g. thesis quality from a research standpoint, from a communications standpoint, etc., both from the perspective of the thesis document itself and from the interaction between the student and the committee members during the thesis research itself (thesis research practice).

Several times we refer to thesis quality. We plan to have the thesis committee not only pass or fail a thesis but also complete a survey for PLO assessment that addresses the thesis quality in terms of PLOs 3, 4 and 5, i.e. problem formulation and solution, communication and ethics. The assessment protocol details with respect to the surveys are being developed.

In Table 5 we refer to the EE215 project for assessment. To be clear, we use the project report from this course for assessment of PLOs 3, 4 and 5 because it speaks to a student’s skills in doing a research project, and it is direct evidence because it is graded.

3. MS PLO Rubrics

  • PLO 1: To assess students’ proficiency in fundamental knowledge of electrical engineering we use, as direct evidence, the average grades in aligned courses. Since EE graduate courses are typically taken pass/fail, we use as a metric the percentage of students passing the courses (scaled to 1 to 10, 20% weight). Further direct evidence is collected from department specific questions on the student evaluations (as with the undergraduate PLO assessment). For each graduate course we have an extended course description that contains the primary topics covered in the course and the required skills that students are expected to have when they finish the course. The department specific questions on student evaluations ask the students to rate the degree to which the primary topics are covered and the degree to which they acquired the required skills. We take the average of the student evaluation results for the aligned courses and put it on a 1 to 10 scale (40%). For indirect evidence we use a survey of the thesis committees during the evaluation period (proficiency in fundamental knowledge exhibited in the thesis is scaled to 1 to 10, 40% weight). The weighted sum of these three pieces of evidence must be greater than 8 for the PLO to be achieved. The assessment protocol details are being developed.
  • PLO 2: To assess students’ proficiency in using fundamental electrical engineering expertise in an applications area we use as direct evidence the average grades in aligned courses. Since EE graduate courses are typically taken pass/fail, we use as a metric the percentage of students passing the courses (scaled to 1 to 10, 20% weight). Further direct evidence is collected from department specific questions on the student evaluations (as with the undergraduate PLO assessment). For each graduate course we have an extended course description that contains the primary topics to be covered in the course and the required skills that students are expected to have when they finish the course. The department specific questions on student evaluations ask the students to rate the degree to which the primary topics are covered and the degree to which they learned the required skills. We take the average of the student evaluation results for the aligned courses and put them on a 1 to 10 scale (40%). In terms of indirect evidence we use a survey of the thesis committees during the evaluation period (proficiency in fundamental knowledge exhibited in the thesis is scaled to 1 to 10, 40% weight). The weighted sum of these three pieces of evidence must be greater than 8 for the PLO to be achieved. The assessment protocol details are being developed.
  • PLO 3: To assess students' ability to plan a research project and carry it successfully to completion we use, as direct evidence, the EE 215 class project report grades averaged on a scale of 1 to 10 (20% weight), the percentage of students completing a thesis (on a scale of 1 to 10, 10% weight) and an average of MS thesis advisors’ assessments of MS thesis quality in terms of research (70% weight). The thesis quality is rated on a scale of 1 to 10, with 8 being a satisfactory mark for achievement of PLO 3. The weighted sum of these three pieces of evidence must be greater than 8 for satisfactory achievement of PLO 3 (see the worked sketch following this list). The assessment protocol details are being developed.
  • PLO 4: To assess students' ability to communicate research objectives, work and results in a project report we use, as direct evidence, the EE 215 class project report grade average on a scale of 1 to 10 (20% weight) and an average of MS thesis advisors’ assessments of MS thesis quality in terms of communicating the thesis objectives, research work and results (80% weight). The thesis quality in terms of communication is rated on a scale of 1 to 10, with 8 being a satisfactory mark for achievement of PLO 4. The weighted sum of these two pieces of evidence must be greater than 8 for satisfactory achievement of PLO 4. The assessment protocol details are being developed.
  • PLO 5: The degree to which students have a high standard of professional and research ethics is evaluated by direct evidence: first, that there are no violations of academic integrity amongst the MS students in the evaluation period, and second, by proper credit to prior art and citation of the work of others in the text and references of the EE 215 project reports (on a scale of 1 to 10 as part of the report grade, 30% weight). In terms of indirect evidence, the MS thesis committee is surveyed to determine the degree to which the thesis gives proper credit to prior art, proper referencing of quoted material and proper credit to the work of others in a reference list (evaluated on a scale of 1 to 10, 70% weight). Any violation of academic integrity during the review period flags a failure to achieve this PLO. If there are no violations of academic integrity, a weighted score of 8 out of 10 is taken as satisfactory achievement of this PLO. The assessment protocol details are being developed.
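To make the weighted-sum scoring in the rubrics above concrete, the following minimal sketch illustrates the computation. It is illustrative only, not the department's assessment tool: the function name and the sample component scores are hypothetical, while the weights and the threshold of 8 follow the MS PLO 1 rubric, and all components are assumed to have already been scaled to the 1-to-10 range.

    # Minimal sketch of the weighted-sum PLO scoring described above.
    # Component scores are assumed to be pre-scaled to the 1-10 range;
    # the weights and the threshold of 8 follow the MS PLO 1 rubric text.

    def plo_score(components):
        """Return the weighted sum of (score, weight) pairs."""
        assert abs(sum(w for _, w in components) - 1.0) < 1e-9, "weights must total 100%"
        return sum(score * weight for score, weight in components)

    # Hypothetical MS PLO 1 inputs, each already scaled to 1-10:
    ms_plo1 = [
        (9.5, 0.20),  # pass rate in aligned courses (direct evidence)
        (8.4, 0.40),  # department-specific student evaluation questions (direct)
        (8.0, 0.40),  # thesis-committee survey (indirect evidence)
    ]

    score = plo_score(ms_plo1)  # 0.2*9.5 + 0.4*8.4 + 0.4*8.0 = 8.46
    print(score, score > 8)     # 8.46 True: the PLO is assessed as achieved

The same computation applies to each rubric in this document; only the list of components and their weights change from PLO to PLO.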

D. PhD Assessment

1. Matrix: In Table 6 we show a summary of the PLO assessment process for the PhD program, including timing, evidence sources, population assessed, assessment approach and schedule of evidence collection, analysis and reporting.

Table 6: PhD EE PLO Assessment Summary Matrix

| Year | PLO | Type of evidence and its source (note if it needs to be developed) | Population (who will be assessed) | Assessment approach & tools | When evidence will be collected | Analysis, report, recommendations |
|---|---|---|---|---|---|---|
| 2013-15 | PLO 3: Problem formulation & solution ability | Direct evidence: EE 215 project & PhD thesis | All students graduating | Analytic rubric with specified standards for PhD level | Fall 2013 through Spring 2015 | Fall 2015 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2013 through Spring 2015 | |
| | PLO 4: Communication | Direct evidence: EE 215 project, PhD Advancement to Candidacy exam & thesis defense | All students graduating | Analytic rubric with specified standards for PhD level | Fall 2013 through Spring 2015 | Fall 2015 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2013 through Spring 2015 | |
| | PLO 6: Ability to do independent research | Direct evidence: Advancement to Candidacy exam | All students graduating | Analytic rubric with specified standards for PhD level | Fall 2013 through Spring 2015 | Fall 2015 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2013 through Spring 2015 | |
| 2015-17 | PLO 1: Mastery of fundamental knowledge | Direct evidence: grades in aligned courses, student evaluations & Qualifying Exam | All students taking aligned courses & those graduating | Analytic rubric with specified standards for PhD level | Fall 2015 through Spring 2017 | Fall 2017 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2015 through Spring 2017 | |
| | PLO 2: Mastery in expertise application | Direct evidence: grades in aligned courses & student evaluations | All students taking aligned courses & those graduating | Analytic rubric with specified standards for PhD level | Fall 2015 through Spring 2017 | Fall 2017 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2015 through Spring 2017 | |
| | PLO 5: Ethics | Direct evidence: EE 215 project, academic integrity | All students graduating | Analytic rubric with specified standards for PhD level | Fall 2015 through Spring 2017 | Fall 2017 |
| | | Indirect evidence: survey of thesis committee | All students graduating | Survey of thesis committee | Fall 2015 through Spring 2017 | |

2. Notes: In the survey of the PhD thesis committees, referred to several times in the assessment matrix, a number of questions are addressed, e.g. thesis quality from a research standpoint, from a communications standpoint, from an ability to work independently, etc., both from the perspective of the thesis document itself and from the interaction between the student and the committee members during the thesis research itself (thesis research practice).

Several times we refer to thesis quality. We plan to have the thesis committee not only pass or fail a thesis but also complete a survey for PLO assessment that addresses the thesis quality in terms of PLOs 3, 4, 5 and 6, i.e. problem formulation and solution, communication, ethics and ability to do independent research. The assessment protocol details with respect to the surveys are being developed.

In Table 6 we refer to the EE215 project for assessment. To be clear, we use the project report from this course for assessment of PLOs 3, 4 and 5 because it speaks to a student’s skills in doing a research project, and it is direct evidence because it is graded.

3. PhD PLO Rubrics

  • PLO 1: To assess students’ mastery of fundamental knowledge of electrical engineering we use, as direct evidence, the average grades in aligned courses and the results of the qualifying examination. Since EE graduate courses are typically taken pass/fail, we use as a metric the percentage of students passing the courses (scaled to 1 to 10, 10% weight). Further direct evidence is collected from department specific questions on the student evaluations (as with the undergraduate PLO assessment). For each graduate course we have an extended course description that contains the primary topics covered in the course and the required skills that students are expected to have when they finish the course. The department specific questions on student evaluations ask the students to rate the degree to which the primary topics were covered and the degree to which the required skills were learned. We take the average of the student evaluation results for the aligned courses and put it on a 1 to 10 scale (20%). A further piece of direct evidence is the results of the qualifying examination. We use the percentage of the examination segments passed (put on a 1 to 10 scale), averaged over all students taking the examination during the assessment period (30%). In terms of indirect evidence we use a survey of the thesis committees during the evaluation period (proficiency in fundamental knowledge exhibited in the thesis is scaled to 1 to 10, 40% weight). The weighted sum of these four pieces of evidence must be greater than 8 for the PLO to be achieved. The assessment protocol details are being developed.
  • PLO 2: To assess students’ mastery of use of fundamental electrical engineering expertise in an applications area we use, as direct evidence, the average grades in aligned courses. Since EE graduate courses are typically taken pass/fail, we use as a metric the percentage of students passing the courses (scaled to 1 to 10, 20% weight). Further direct evidence is collected from department specific questions on the student evaluations (as with the undergraduate PLO assessment). For each graduate course we have an extended course description that contains the primary topics to be covered in the course and the required skills that students are expected to have when they finish the course. The department specific questions on student evaluations ask the students to rate the degree to which the primary topics are covered and the degree to which they learned the required skills. We take the average of the student evaluation results for the aligned courses and put them on a 1 to 10 scale (40%). In terms of indirect evidence we use a survey of the thesis committees during the evaluation period (mastery in the use of fundamental knowledge in an application area exhibited in the thesis is scaled to 1 to 10, 40% weight). The weighted sum of these three pieces of evidence must be greater than 8 for the PLO to be achieved. The assessment protocol details are being developed.
  • PLO 3: To assess students' ability to formulate an electrical engineering problem and carry it successfully to solution we use, as direct evidence, the EE 215 class project report grades averaged on a scale of 1 to 10 (20% weight), the percentage of students completing a thesis (on a scale of 1 to 10, 10% weight) and an average of PhD thesis committee assessments of PhD thesis quality in terms of problem solution (70% weight). The thesis quality with relation to PLO 3 is rated on a scale of 1 to 10. The weighted sum of these three pieces of evidence must be greater than 8 for satisfactory achievement of PLO 3. The assessment protocol details are being developed.
  • PLO 4: To assess students' ability to communicate research objectives, work and results in a presentation and report we use, as direct evidence, the EE 215 class project report grade average on a scale of 1 to 10 (20% weight) and an average of PhD thesis advisors’ survey assessments of PhD thesis quality in terms of communicating the thesis objectives, research work and results (50% weight). The thesis quality in terms of communication is rated on a scale of 1 to 10, with 8 being a satisfactory mark for achievement of PLO 4. We also include a survey of the members of the committees that evaluate the research prospectus presentation leading to advancement to PhD candidate status, in terms of PLO 4 (scale of 1 to 10, 30%). The weighted sum of these three pieces of evidence must be greater than 8 for satisfactory achievement of PLO 4. The assessment protocol details are being developed.
  • PLO 5: The degree to which students have a high standard of professional and research ethics is evaluated by direct evidence: first, that there are no violations of academic integrity amongst the PhD students in the evaluation period, and second, by proper credit to prior art and citation of the work of others in the text and references of the EE 215 project reports (on a scale of 1 to 10 as part of the report grade, 30% weight). In terms of indirect evidence, the PhD thesis committee is surveyed to determine the degree to which the thesis gives proper credit to prior art, proper referencing of quoted material and proper credit to the work of others in the text and reference list (evaluated on a scale of 1 to 10, 70% weight). Any violation of academic integrity during the review period flags a failure to achieve this PLO. If there are no violations of academic integrity, a weighted score of 8 out of 10 is taken as satisfactory achievement of this PLO. The assessment protocol details are being developed.
  • PLO 6: The degree to which students show an ability to do independent research is evaluated by two pieces of evidence: the advancement to candidacy exam and the PhD thesis itself (written document and defense). As part of the advancement to candidacy exam, the committee will be asked not only to pass or fail the candidate but also to evaluate the candidate's ability to independently formulate a research problem that will result in new knowledge in the subject area (on a scale of 1 to 10, 30% weight). The second piece of evidence is the PhD thesis itself; the evaluation of this PLO will be a segment of the survey of PhD thesis committees and will consider both the thesis document and the defense (on a scale of 1 to 10, 70% weight). The weighted sum of these two pieces of evidence provides the assessment of PLO 6, with a threshold of 8 being a satisfactory assessment of this PLO (a worked example follows this list). The assessment protocol details are being developed.
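As a worked example of the weighted sums in these rubrics, consider the PhD PLO 1 computation with hypothetical component scores, each already scaled to the 1-to-10 range: a course pass rate of 9.0 (10% weight), student evaluation results of 8.5 (20%), a qualifying examination segment pass rate of 7.8 (30%) and a thesis committee survey result of 8.2 (40%). These combine as

    0.1(9.0) + 0.2(8.5) + 0.3(7.8) + 0.4(8.2) = 0.90 + 1.70 + 2.34 + 3.28 = 8.22

Since 8.22 exceeds the threshold of 8, PLO 1 would be assessed as achieved for that evaluation period.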

IV. Assessment Schedule & Committee

The assessment process for the graduate programs (MS and PhD) runs on a 4-year cycle as indicated in Tables 5 and 6, beginning in the current academic year and continuing through the 2016-17 academic year. Reports on analysis and recommendations would be issued in the fall of 2015 (PLOs 3, 4 and 5 for the MS program and 3, 4 and 6 for the PhD program) and in the fall of 2017 for the remaining PLOs. The assessment process for the undergraduate program would continue to run on the one year cycle required by ABET.

The assessment process for the undergraduate and graduate programs would be run as shown in the block diagram of figure 2, in Section II and in Tables 5 and 6. The existing undergraduate CIP committee would be augmented to perform the analysis, findings and recommendation process for both the undergraduate and graduate programs by adding the graduate director to the existing CIP committee, i.e. John Vesecky (CIP Chair), Joel Kubby (Dept. Chair), Stephen Petersen, Don Wiberg, Ken Pedrotti (Undergrad Director) and Hamid Sadjadpour (Graduate Director). This committee contains the Department Chair, Undergraduate and Graduate Directors as well as representatives familiar with PLO assessment process and both graduate and undergraduate teaching.

Appendix A: Undergraduate CIP Recommendations for academic year 2012-2013

EE Department Continuous Improvement Program

Findings and Recommendations from CIP Oversight Committee Review

Academic Year 2012-2013

Committee members: John Vesecky (CIP Chair), Joel Kubby (Dept. Chair), Stephen Petersen, Don Wiberg and Ken Pedrotti (Undergrad Director)

The committee reviewed the findings of the various components of the CIP review process. Most outcomes were assessed as satisfactory; however, the outcomes discussed below were assessed as unsatisfactory or marginal. We also include information summarized from discussions of the CIP Oversight Committee and from exit interviews by the Department Chair, together with the recommendations that flow from it. After discussion and deliberation we present the following findings and recommendations to the EE Department faculty for action.

ABET Methodology: The committee noted gaps in faculty compliance with supplying information for outcomes assessment. Hence the following recommendations:

  • Send emails regarding data collection to each faculty member required to collect data that quarter.
  • Distribute a checklist for compliance with ABET data collection to all relevant faculty each quarter.
  • Generate and collect all missing data for academic year 2012-13.

Below we review only those outcomes that were not satisfactory, reporting our findings and recommendations. Following the specific outcomes we make further recommendations regarding the EE department undergraduate program.

ABET outcome b: Ability to design and conduct experiments as well as analyze the data

Outcome b was not satisfactorily achieved. The area where most improvement is needed is the electromagnetics courses. The department has two such courses, EE 135 and EE 136; in recent years a shortage of funding has not permitted EE 136 to be taught. For EE 135 we have the following findings and recommendations:

  • The laboratory needs a thorough revision including demonstrations and an emphasis on why electromagnetics are important for electrical engineers.
  • Revision of the curriculum was discussed, including moving to a two-quarter sequence (EE135A & B) -- Prof. Pedrotti to make specific recommendations to the faculty. The most important recommendation regarding the course is an emphasis on the applications of electromagnetics to electrical engineering practice, i.e. showing students why they are required to learn this material.

ABET Outcome f: an understanding of professional and ethical responsibility

As in some previous years, the CE80E assessment fell well short of the satisfactory achievement level, due to EE students' low grades and to student evaluations indicating that the required skills were not gained and the core topics were not well taught. Our findings and recommendations are given below:

  • This course has been a disappointment in terms of both the grades that EE students earn in the course and the degree to which they think they learn the required skills and that the core topics are covered. The course has been taught by a variety of lecturers – some good and some not so good.
  • In the exit interviews, considered below, we note that graduating students think that the ethics and professional responsibility taught in the CE80E course could be learned in a more effective way with fewer units.
  • We recommend a new method of achieving outcome f: teach the ethics material in CE/EE129A by expanding CE/EE129A to 5 units.

ABET outcome g: an ability to communicate effectively

This outcome was assessed as unsatisfactory based primarily on the report of the capstone design course (EE 129ABC). We have the following findings and recommendations:

  • The quality of student technical writing, and students' ability to construct a first-class technical report, has deteriorated relative to previous years.
  • Although the students have taken a technical writing course (CE185), they have little idea of how to generate a decent technical report.
  • We recommend that technical writing instruction be included in an expanded EE129A, as discussed below.

ABET outcome h: Broad education necessary to understand impact of engineering solutions in a global/societal context

This outcome was assessed as unsatisfactory based on information from the EE80T course on modern technology and how it works. We had the following findings and recommendations:

  • We found that students were taking this introductory course, intended for freshmen and sophomores, in their senior year, using it as an easy course to make their final year less difficult.
  • We recommend requiring EE80T for entry into the gateway circuits course (EE 101). This makes sense for two reasons: first, students would necessarily take EE80T earlier in their BS program, where it does the most good; second, the material in EE80T would set the stage for EE101, making that course seem more relevant to their education. We strongly recommend making this curriculum change, but typically waiving the requirement for students transferring from junior college on a case-by-case basis.

ABET outcome i: A recognition of the need for, and the ability to engage in, life-long learning

This outcome was assessed as unsatisfactory based on information from the EE129A course on capstone design. We estimate student recognition of the need for lifelong learning by the extent to which students in capstone design use professional literature sources and join the student branches of engineering societies. We have the following findings and recommendations:

  • Students in capstone design did not make appropriate use of the professional literature in doing their projects -- they typically used only a general search engine, such as Google.
  • Students in capstone design showed only moderate interest in joining student professional societies that promote lifelong learning.
  • We recommend including material on the use of the professional literature in the design class, together with recruitment opportunities for the student professional societies.

Recommendations resulting from the CIP Oversight Committee’s general discussion, exit interviews and review meetings.

  1. Our most important recommendation, stemming from the findings above, is to expand the first quarter of the capstone design course (EE129A) to 5 units and simultaneously drop the requirements for specialized courses in ethics and technical writing. Currently students are required to spend 10 units of their education on ethics and technical writing. We are particularly skeptical that the five-unit course on engineering ethics gives the required value for the time spent, especially in view of the lost opportunity to take more technical courses. If EE 129A were expanded to five units and the requirements for CE80E and CE80I were dropped, students would get ethics and technical writing instruction and practice more closely associated with the kind of technical work they are most likely to be engaged in during their careers. In addition they would have the opportunity to take at least one additional technical course and perhaps two. We think this would give students better value for money in their BSEE program.
  2. Feedback from students during exit interviews indicates that the discrete math course CE16 does not have much value for EE students. Hence, we recommend dropping it as a required course for BSEE degrees and including needed material in EE103.
  3. Move EE157 to the fall quarter so that it can be taken more easily as a prerequisite for the capstone design course. Currently EE 157 is given in the winter quarter and hence must be taken the year before capstone design in order to be relevant for that course. This move would bring the EE curriculum into line with the offering of CE118 (Mechatronics) in the fall quarter. Having both of these prerequisites in the fall quarter eases the workload for students in their final year when they take capstone design (CE/EE 129ABC). Further, moving EE 157 to the fall eases the current congestion in the winter quarter.
  4. A further recommendation regarding EE 157 is to make it a corequisite of EE 151. EE157 would fulfill students' desire for a lab to accompany EE 151 as well as serve as a prerequisite for capstone design.
  5. In past years there has been discussion of including the microcontroller course CE 121 as a requirement for BSEE students. Exit interviews indicated that CE118 (Mechatronics) got the most positive feedback of any course. This course contains a strong component on the use of microcontrollers in a robotic system, so including it in the EE curriculum would both satisfy student demand and give BSEE students microcontroller experience. Such experience would be a great benefit to EE students taking the capstone design course.
  6. Student feedback in exit interviews indicates that our subject matter tracks are not clearly defined. We should rethink our tracking system, e.g. using hardware and software tracks or hardware and analysis tracks, etc.
  7. Based on exit interview feedback we should consider dropping AMS 147 from the list of electives, as students did not find the course to be useful.
  8. As our number of EE students increases, we should look more carefully at our power engineering courses, perhaps developing a power engineering track in EE. We now have three courses: EE 175, 176 and 177. Two of these (Generation and Transmission, and Motors and Control) are currently taught by Steve Petersen and are successful and stable. However, the power electronics course has been taught by a variety of graduate and postgraduate students, varying from year to year, with the instructor often teaching it for the first time.
  9. The EE103 laboratory is useful to students, but instructors are lukewarm in supporting it. Unless the laboratory has a good TA it is not likely to be a success, so TA training is essential. Video lectures relevant to the laboratory and its experiments would aid both TA training and student learning. We recommend asking the Dean for media resource funds to make this recommendation viable.
  10. EE 101 has some of the same problems with its laboratories. The use of Yoni Savita’s book as a TA training and teaching aid would likely increase student interest in the labs. Further, the media resource funds discussed above (item 9) should probably include videos for EE 101 as well.
  11. A Google Docs site for the department with accounts for all EE faculty and instructors would provide a common collection point for course materials that would be valuable for all instructors and possibly TA’s. Some suggested material for such a repository would be laboratory instructional materials and videos, lectures, training aids for TA’s and instructors and a place for ABET material.

Respectfully submitted, ABET Continuous Improvement Program Review Committee 2012-13