Fundamentals of Engineering Examination Computer-Adaptive Version at Morgan State University

DOI: 10.17577/IJERTV9IS040210


Steve Efe (D.Eng.) and Mehdi Shokouhian (Ph.D.), Department of Civil Engineering

Morgan State University

Abstract:- The Engineer-in-Training (EIT) exam is the first step toward licensure as a professional engineer in the United States. The EIT Computer Based Testing (CBT) format is generally perceived as a positive trend toward a more innovative, constructed-response assessment that evaluates disciplinary knowledge while providing accessibility, unbiasedness, and speedier test delivery for state-wide assessment. This paper describes the interface design of a computer-based testing environment (FEBooth) used as a testing modality at Morgan State University (MSU), its implementation, its advantages over paper-based assessment (PBA), an evaluation and analysis of examinee performance and, where appropriate, additional issues such as students' perception of this method of assessment. To provide reliable measurements of student performance with predictive value for the real Civil EIT exam, a total of 20 questions in short-answer (SA) and multiple-choice (MC) formats were administered to 90 graduating seniors in 2016. Performance was strongly influenced by students' learning gain from classroom subjects and by their perceived ability to use FEBooth successfully, which affected their behavioral response. Findings revealed a preference for paper-based testing; 65% of the students exhibited computer anxiety, resulting in slower task completion and poorer performance. The study found that 35% and 55% of the students passed the CBT FEBooth and the PBA, respectively, at the first attempt, while the likelihood of success increased to 56% and 68% at the second attempt.

INTRODUCTION

During the past few years, computer-based testing (CBT) has gained popularity as an assessment modality and has been implemented in occupational fields for licensure, certification, and psychological testing [2]. Test security, the ability to create randomized questions from vast question pools, and the use of encrypted databases for stored questions and responses are major advantages of CBT [9].

The Fundamentals of Engineering (FE or EIT) exam, the first step to becoming a professional engineer (P.E.), is designed for recent graduates in the United States. The computer version of the FE offers innovations in testing and assessment: it can be taken independent of time (administered year-round at NCEES-approved test centers), and the FE test developers can subject individual candidates to the same test conditions. So far, the shift to computer-based testing is believed to have accommodated an increase in the number of test takers across the United States [6]. Conceivably, though, research has not validated performance differences arising from behaviors such as computer anxiety and slower task completion in comparison to the previously administered paper-based EIT exams. Frustration is more likely to occur among examinees in CBT than on traditional paper-based exams because of concerns over constraints such as degree of computer literacy, test difficulty, questions being tailored to examinee ability levels, and the inability to skip and review questions or change answers. While accredited training centers and hundreds of FE practice books are available, their assessment frameworks and test interface features differ from those of the real administered EIT exam.

In addition, a lack of familiarity with the exam format and a less comfortable test environment can hinder good performance in CBT exams. Cognitive concerns such as poor preparation, thought disruptions, and worry about how others will view the examinee if he or she does poorly raise anxiety [11]. Since the EIT exam is not yet a self-adapted test, poor performance could also result from the level of difficulty of the test questions. Student outcomes in the EIT CBT exam, as in other computerized testing, depend largely on student learning gain in the test areas, where student achievement refers to the status of subject-matter knowledge. Adequate preparation for the EIT exam influences outcomes and scores depending on learning-process variables such as the availability of technology for learning before the real exam is taken, including the time allotted for understanding the subject matter.

LITERATURE REVIEW

Computerized testing, a next-generation way of administering tests, provides numerous benefits and a level of productivity in testing not achievable with traditional designs such as the paper-and-pencil format [10]. Despite earlier indifference toward the advent of CBT and the opportunities it creates, the growing use of innovations such as the internet has gradually built support for computerized testing [1]. CBT is particularly common in licensure and certification because advanced technological capacity allows standardization of test administration conditions, customized feedback, test security, and immediate reporting [2]. Evidence from early studies suggests that success in computer-based assessment depends heavily on examinees being adequately accustomed to the format in which the examination is administered [4]. However, few studies report whether computer-based tests in engineering are equivalent to paper-based assessments under the same test conditions. Notably, poor performance in CBT has been attributed to constraints such as computer literacy, computer anxiety, test anxiety, and student learning gain prior to the examination [5].

Computer anxiety has been found to interfere with performance on computerized tests. It is a major issue affecting learning characteristics, since individuals learn and process information in different ways [7]. In educational settings, exam anxiety is common, and its effect on test performance can be detrimental and impair future opportunities [5]. Anxiety is a psychological and behavioral response to possible failure, reduced self-efficacy, a feeling of unpreparedness, fixation on the exam, and sometimes a lack of self-worth. Test anxiety depends largely on the extent to which students perceive the assessment as threatening. Previous research [8] reported that 28-33% of US students experience some form of test anxiety. This distorts true assessment of student ability and undermines the reliability of test scores. Other barriers to high scores in computer-based testing are inadequate learning gain in the test areas and weak self-regulated learning strategies [3]. Admittedly, student performance does not represent direct evidence of learning; nevertheless, engineering institutions in the United States must support critical thinking and diversity in learning that improve learning gains in preparation for the EIT CBT exam.

Successful completion of the Fundamentals of Engineering (EIT) exam is a vital outcome for engineering practice. The changes NCEES made to this high-stakes testing program, moving the EIT exam to computer-based delivery in 2014, have prompted sentiments about its opportunities and drawbacks, and it is unclear whether the innovation will produce results similar to those of paper-based exams. In engineering institutions, integration and implementation of computer-based testing are developing slowly and are lacking in many programs. Despite the positive acceptance of this assessment approach, little is known about the influence of computer literacy and anxiety on performance, which can result in misrepresentation of examinees' true skills.

METHODOLOGY

Participants

A sample of CBT FE exam takers at Morgan State University (n = 55) participated in 2016. These examinees were students close to finishing an undergraduate engineering degree and were expected to have the required level of proficiency for the exam.

Materials

A 12-week lecture and training series was delivered by a variety of staff in nine key subject areas in civil engineering: mathematics, statics, materials, dynamics, structures, strength of materials, fluid mechanics, geotechnical engineering, and project management. Students were required to take the FEBooth CBT exam at the end of each lecture to evaluate their performance, and those who completed the CBT exam then evaluated their experience by responding to a paper-based questionnaire. A total of 20 questions in short-answer (SA) and multiple-choice (MC) formats were administered in each subject area. Quiz results are reported to the internal web server of FEBooth.

Analysis

Data analysis was performed using descriptive statistics and appropriate inferential statistical tests (independent-samples t-test, one-way ANOVA, and two-way ANOVA) at the 0.05 alpha level with SigmaPlot, a scientific data analysis and graphing software package. Depending on the independent variable, results from the FEBooth CBT exam were compared with the paper-based exam, which had ten (10) questions printed on each page; examinees were expected to write the letter (A, B, C, or D) of their answer choice on a separate sheet.
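For illustration only, the kind of comparison described above could be reproduced in Python with scipy; the original analysis was carried out in SigmaPlot, and the score arrays below are hypothetical placeholders rather than the study data.

# Illustrative sketch (not the authors' SigmaPlot workflow) of the
# inferential tests named above. All scores here are hypothetical.
from scipy import stats

cbt_scores = [62, 58, 66, 71, 55, 64]   # placeholder FEBooth CBT scores
pba_scores = [65, 61, 70, 73, 60, 66]   # placeholder paper-based (PBA) scores

# Independent-samples t-test comparing the two test modes
t_stat, p_val = stats.ttest_ind(cbt_scores, pba_scores)
print(f"t = {t_stat:.2f}, p = {p_val:.3f}, significant at 0.05: {p_val < 0.05}")

# One-way ANOVA across three (hypothetical) anxiety groups
mild, moderate, severe = [64, 66, 61], [60, 58, 62], [52, 50, 53]
f_stat, p_val = stats.f_oneway(mild, moderate, severe)
print(f"F = {f_stat:.2f}, p = {p_val:.3f}, significant at 0.05: {p_val < 0.05}")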

The ex post facto research approach was used because the independent variables were already present prior to this study; the examinees' characteristics were not manipulated, and examinees were not randomly assigned.

Furthermore, students were asked about their exam preference (computer-based exam or paper-based exam) after the final exam and whether they had changed their opinion about EIT CBT exams. Students provided answers on a five-point Likert response scale (Figure 6).

The FEBooth Interface

The FEBooth user interface was designed with attention to aspects of test design that can directly affect examinee performance, such as timing or pacing, navigation, and automation of test assembly. Ease of navigation via back, next, and submit buttons, the visual style of navigation, and the ability to flag a question for later review were provided to give the examinee a positive perception. To avoid rapid guessing behavior and failure to reach certain questions, examinees are afforded adequate time and are prevented from submitting an exam with intentionally omitted questions. Each question is a uniquely identifiable module structured to prevent data errors, and questions are picked from a test bank or pool (Figure 2). This allows randomization of questions to protect the integrity of the exam. The question bank stores the question definitions, organized into categories (multiple choice, short answer, and true/false). Figure 3 shows a sample question from the pool: the left pane of the interface houses the question, while the right pane shows the reference formula or expression for that test question.

Figure 1: FEBooth Exam Home Interface

Figure 2: Sample FEBooth Database Structure

Figure 3: FEBooth Sample Question
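As an illustration of the randomized test assembly described above, the sketch below draws a fixed number of questions for one subject from a categorized question pool; the class and field names (Question, assemble_exam, and so on) are assumptions for illustration, not the actual FEBooth schema.

# Minimal sketch of randomized question selection from a categorized pool.
# Names and structure are illustrative, not the FEBooth implementation.
import random
from dataclasses import dataclass

@dataclass
class Question:
    qid: str          # unique identifier for the question module
    category: str     # "multiple_choice", "short_answer", or "true_false"
    subject: str      # e.g. "statics", "fluid_mechanics"
    prompt: str
    reference: str = ""   # formula/expression shown in the right pane

def assemble_exam(pool, subject, n_questions=20, seed=None):
    """Randomly assemble an exam for one subject from the question pool."""
    rng = random.Random(seed)
    candidates = [q for q in pool if q.subject == subject]
    if len(candidates) < n_questions:
        raise ValueError("Question pool too small for the requested exam length")
    return rng.sample(candidates, n_questions)   # randomized, no repeated questions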

Figure 4 shows a visual representation of the sequence of steps and decisions made by FEBooth during the exam process. A database structure was created to manage the exam data, efficiently measure examinee proficiency, and provide immediate performance feedback to examinees.

Figure 4: FEBooth CBT Exam Flowchart
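A minimal sketch of the immediate-feedback step in this flow is given below; the function and report fields are illustrative assumptions rather than the actual FEBooth database logic.

# Illustrative scoring/feedback step: responses are checked against an
# answer key and a report is returned on submission. Names are assumed.
def score_exam(responses, answer_key, passing_score=70.0):
    """responses and answer_key map question id -> chosen/correct answer."""
    answered = {qid: ans for qid, ans in responses.items() if ans is not None}
    correct = sum(1 for qid, ans in answered.items() if answer_key.get(qid) == ans)
    percent = 100.0 * correct / len(answer_key)
    return {
        "correct": correct,
        "total": len(answer_key),
        "score_percent": round(percent, 2),
        "passed": percent >= passing_score,
        "unanswered": sorted(set(answer_key) - set(answered)),
    }

# Example: immediate report after submission (hypothetical answers)
print(score_exam({"Q1": "B", "Q2": "D"}, {"Q1": "B", "Q2": "A", "Q3": "C"}))
# -> {'correct': 1, 'total': 3, 'score_percent': 33.33, 'passed': False, 'unanswered': ['Q3']}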

ANALYSIS

Analyses were performed to evaluate the following research questions:

  1. Performance of examinees in the FEBooth CBT exam due to anxiety (first attempt).

  2. Performance of examinees in the FEBooth CBT exam due to anxiety (second attempt).

  3. Influence of subject area and test mode on students' performance in the PBA and FEBooth CBT exams.

  4. Percentage of examinees who passed the FEBooth CBT and PBA exams in their first and second attempts.

  5. Whether students prefer the CBT exam over the PBA.

Research Question 1: What is the influence of anxiety on students' performance in their first attempt of the FEBooth CBT?

Hypothesis 1: There is no significant difference in the performance of students on CBT based on their anxiety level.

Table 1 shows the mean and variance of students' performance in the FEBooth exam for groups defined by their level of anxiety. The group with mild anxiety had a mean of 63.08 with a variance of 38.62, the group with moderate anxiety had a mean of 59.93 with a variance of 20.85, and the group with severe anxiety had a mean of 51.63 with a variance of 16.84. The severe-anxiety group thus had the lowest mean score (51.63), suggesting a difference in the performance of students with different levels of anxiety.

Table 2 shows F = 15.70 with a p-value of 0.000, which is less than the chosen alpha (0.05), indicating that the difference is statistically significant. The null hypothesis that there is no significant difference in the performance of students on the CBT based on anxiety is therefore rejected: F(2, 97) = 15.70, p < 0.05. This implies a significant difference in the performance of students with different levels of anxiety, in favor of candidates with mild anxiety.

Table 1: Descriptive statistics of the performance of students on the CBT based on their anxiety level (first attempt)

Anxiety     Count   Sum    Mean    Variance
Mild        63      3974   63.08   38.62
Moderate    29      1738   59.93   20.85
Severe      8       413    51.63   16.84

Table 2: One-way ANOVA of the performance of students on the CBT based on anxiety (first attempt)

Source            SS        df   MS       F       P-value   F crit
Between Groups    1002.41   2    501.20   15.70   0.00      3.10
Within Groups     3096.34   97   31.92
Total             4098.75   99
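As a consistency check (not part of the original SigmaPlot analysis), the F statistic in Table 2 can be reconstructed from the group counts, means, and variances reported in Table 1:

# Rebuild the one-way ANOVA in Table 2 from the group summaries in Table 1.
groups = {                      # count, mean, variance (from Table 1)
    "mild":     (63, 63.08, 38.62),
    "moderate": (29, 59.93, 20.85),
    "severe":   (8,  51.63, 16.84),
}
N = sum(n for n, _, _ in groups.values())                      # 100 examinees
k = len(groups)                                                # 3 anxiety groups
grand_mean = sum(n * m for n, m, _ in groups.values()) / N

ss_between = sum(n * (m - grand_mean) ** 2 for n, m, _ in groups.values())
ss_within = sum((n - 1) * v for n, _, v in groups.values())
F = (ss_between / (k - 1)) / (ss_within / (N - k))             # df = 2 and 97
print(f"SSB = {ss_between:.1f}, SSW = {ss_within:.1f}, F = {F:.2f}")
# -> SSB ~ 1002, SSW ~ 3096, F ~ 15.7, matching Table 2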

Research Question 2: What is the influence of anxiety on students' performance in their second attempt of the FEBooth CBT?

Hypothesis 2: There is no significant difference in the performance of students in their second attempt on CBT based on their anxiety level.

Table 3 shows the mean and variance of students' performance in the second attempt of the FEBooth exam for groups defined by their level of anxiety. The group with mild anxiety had a mean of 66.97 with a variance of 22.77, the group with moderate anxiety had a mean of 68.55 with a variance of 24.83, and the group with severe anxiety had a mean of 68.63 with a variance of 11.41. These results show that anxiety had little impact on scores in the second attempt.

Table 4 shows F = 1.32 < F crit with a p-value of 0.27, which is higher than the chosen alpha (0.05), indicating that the difference is not statistically significant. The null hypothesis that there is no significant difference in the performance of students on the CBT based on anxiety is therefore not rejected: F(2, 97) = 1.32, p > 0.05.

Table 3: Descriptive statistics of the performance of students on the CBT based on their anxiety level (second attempt)

Anxiety     Count   Sum    Mean    Variance
Mild        63      4219   66.97   22.77
Moderate    29      1988   68.55   24.83
Severe      8       549    68.63   11.41

Table 4: One-way ANOVA of the performance of students on the CBT based on anxiety (second attempt)

Source            SS        df   MS      F      P-value   F crit
Between Groups    59.66     2    29.83   1.32   0.27      3.09
Within Groups     2186.99   97   22.55
Total             2246.64   99

Research Question 3: To what extent do subject area and test mode influence students' performance?

Hypothesis 3: There is no significant difference in the performance of students based on subject area and test mode.

Table 5 shows the mean scores for the various subject areas under the CBT and PBA, respectively: Statics, 62.68 and 65.01; Materials, 63.87 and 68.92; Mathematics, 60.84 and 63.80; Dynamics, 50.32 and 52.40; Strength of Materials, 65.83 and 66.40; Structures, 66.30 and 69.47; Fluid Mechanics, 57.11 and 61.18; Geotechnical Engineering, 59.13 and 61.09; and Project Management, 62.18 and 62.38. Table 6 shows that for test mode, F = 124.03 and p = 0.00; for subject area, F = 221.93 and p = 0.00; and for the interaction of subject area and test mode, F = 5.34 and p = 0.00. All p-values are less than alpha = 0.05, so the differences are significant. The null hypothesis that there is no significant difference in students' performance based on subject area and test mode is therefore rejected; students' performance differs significantly by subject area and by test mode.
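For illustration, the two-way design reported in Table 6 (score by subject area and test mode, with their interaction) could be fitted in Python with pandas and statsmodels as sketched below; the data frame holds hypothetical placeholder scores, not the study data.

# Illustrative two-way ANOVA (subject area x test mode) with statsmodels.
# Scores below are hypothetical placeholders; the real design has 9 subject
# areas x 2 modes x 90 students per cell.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rows = []
for subject, (cbt, pba) in {
    "statics":   ([61, 64], [64, 66]),
    "materials": ([62, 66], [67, 70]),
    "fluids":    ([55, 59], [60, 62]),
    "geotech":   ([58, 60], [60, 62]),
}.items():
    rows += [{"score": s, "subject": subject, "mode": "CBT"} for s in cbt]
    rows += [{"score": s, "subject": subject, "mode": "PBA"} for s in pba]
df = pd.DataFrame(rows)

model = ols("score ~ C(subject) * C(mode)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects and interaction F, p-values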

Research Question 4: What percentage of examinees passed the FEBooth CBT and PBA exams in their first and second attempts?

Figure 5 shows that, of the 90 students whose performance was evaluated, 35% and 55% passed the CBT and PBA, respectively, in their first attempt, while 56% and 68% succeeded in their second attempt of the CBT and PBA exams, respectively.

Figure 5: Percentage passing the CBT and PBA exams in both attempts


Figure 6: Preference for CBT and PBA exams


Research Question 5: What percentage of students prefer the CBT over the PBA exam?

As shown in Figure 6, 27% of the students preferred a CBT exam, 60% preferred a paper-based assessment, and 13% indicated no preference for one test mode over the other after completing the first FEBooth exam. However, students' opinion of CBT improved after the second attempt: 40% of students felt more positive, 45% remained negative, and 5% remained indifferent toward CBT exams. The more positive opinion of the FEBooth computer-based testing is attributed to the immediate feedback students received on their exam performance.

Table 5: Descriptive statistics of the performance of students based on subject area and test mode

Subject Area               Mode    N     Sum     Mean    Variance
Statics                    CBT     90    5641    62.68   17.77
                           PBA     90    5851    65.01   6.75
                           Total   180   11492   63.84   13.56
Materials                  CBT     90    5748    63.87   32.66
                           PBA     90    6203    68.92   16.86
                           Total   180   11951   66.39   31.04
Mathematics                CBT     90    5476    60.84   12.47
                           PBA     90    5742    63.80   3.11
                           Total   180   11218   62.32   9.94
Dynamics                   CBT     90    4529    50.32   8.98
                           PBA     90    4716    52.40   3.23
                           Total   180   9245    51.36   7.16
Strength of Materials      CBT     90    5925    65.83   38.19
                           PBA     90    5976    66.40   53.86
                           Total   180   11901   66.12   45.85
Structures                 CBT     90    5967    66.30   48.75
                           PBA     90    6252    69.47   44.90
                           Total   180   12219   67.88   49.09
Fluid Mechanics            CBT     90    5140    57.11   11.76
                           PBA     90    5506    61.18   3.65
                           Total   180   10646   59.14   11.82
Geotechnical Engineering   CBT     90    5322    59.13   37.87
                           PBA     90    5498    61.09   15.41
                           Total   180   10820   60.11   27.45
Project Management         CBT     90    5596    62.18   1.72
                           PBA     90    5614    62.38   5.43
                           Total   180   11210   62.28   3.56
All subjects               CBT     810   49344   60.92   44.98
                           PBA     810   51358   63.40   40.37

Table 6: Two-way ANOVA of students' performance based on subject area and test mode

Source                     SS         df     MS        F        P-value   F crit
Test mode (Sample)         2503.82    1      2503.83   124.03   0.00      3.84727
Subject area (Columns)     35842.92   8      4480.37   221.93   0.00      1.944173
Interaction                862.46     8      107.81    5.34     0.00      1.944173
Within                     32340.42   1602   20.1875
Total                      71549.63   1619

CONCLUSIONS

A large number of the students who took the CBT exam in their first attempt believed that the paper-based exam experience was, in general, more favorable than the computer-based exam in terms of their ability to work in a structured manner, to overcome technical difficulties of use, and to concentrate. Students did better in their second attempt because they had become accustomed to the CBT mode and had developed confidence in their approach to taking computer-based exams.

The psychological unpreparedness of the CBT exam takers, their feelings of failure, and the expectations of others affected their performance in their first attempt of the CBT exam. Students appeared to feel more in control of their emotions when taking the computer-based exam in the second attempt, which suggests that overall student acceptance can improve with more experience of computer-based testing.

Paper-based exams remain the common test mode in many universities, which might underlie the failure of students in the EIT computer-based exams. Introducing computer and digital technologies to design and implement fully functional computer-based exams would alleviate anxiety and improve confidence before taking the NCEES Engineer-in-Training exam.


REFERENCES

1. Amenyedzi, F. W., Lartey, M. N., & Dzomeku, B. M. (2011). The use of computers and internet as supplementary source of educational material: A case study of the senior high schools in the Tema metropolis in Ghana. Contemporary Educational Technology, 2(2), 151-162.

2. Bergstrom, B. A., & Lunz, M. E. (1999). CAT for certification and licensure. Innovations in Computerized Assessment, 67-91.

3. Compeau, D. R., & Higgins, C. A. (1995). Computer self-efficacy: Development of a measure and initial test. MIS Quarterly, 189-211.

4. Gershon, R. C., & Bergstrom, B. (1991). Individual differences in computer adaptive testing: Anxiety, computer literacy and satisfaction.

5. Holder, S. D., & Gibson, R. (2000). Electronic versus paper-based testing in education. In Proceedings ISECON.

6. Kelly, W. E. (2007). Certification and accreditation in civil engineering. Journal of Professional Issues in Engineering Education and Practice, 133(3), 181-187.

7. Rezaie, M., & Golshan, M. (2015). Computer Adaptive Test (CAT): Advantages and limitations. International Journal of Educational Investigations, 2-5.

8. Sady, J. C. (2010). Test anxiety: Contemporary theories and implications for learning. Anxiety in Schools: The Causes, Consequences, and Solutions for Academic Anxieties, 7-26.

9. Vale, C. D. (2006). Computerized item banking. Handbook of Test Development, 261-285.

10. Wainer, H., Dorans, N. J., Flaugher, R., Green, B. F., & Mislevy, R. J. (2000). Computerized adaptive testing: A primer. Routledge.

11. Wall, D., & Horák, T. (2008). The impact of changes in the TOEFL examination on teaching and learning in Central and Eastern Europe: Phase 2, Coping with change. ETS Research Report Series, 2008(2).
