SIGNALS AND SYSTEMS ASSESSMENT: COMPARISON OF RESPONSES TO MULTIPLE CHOICE CONCEPTUAL QUESTIONS AND OPEN-ENDED FINAL EXAM PROBLEMS

Kathleen E. Wage, George Mason Univ., ECE Department, Fairfax, VA 22030 USA, k.e.wage@ieee.org
John R. Buck, Univ. of Massachusetts Dartmouth, ECE Department, N. Dartmouth, MA 02747 USA, johnbuck@ieee.org
Margret A. Hjalmarson, George Mason Univ., Graduate School of Education, Fairfax, VA 22030 USA, mhjalmar@gmu.edu
Jill K. Nelson, George Mason Univ., ECE Department, Fairfax, VA 22030 USA, jnelson@gmu.edu

ABSTRACT

The validity of the Signals and Systems Concept Inventory (SSCI) was evaluated by comparing students' performance on the SSCI to open-ended final exam problems. An assessment instrument is said to be valid to the extent that it measures what it was designed to measure. The SSCI was designed to measure students' understanding of core concepts in undergraduate signals and systems (S&S) courses through 25 multiple choice questions. The SSCI scores and final exam scores for more than 150 students in four sections of S&S at two schools were found to have a statistically significant correlation. A more detailed analysis was conducted with a pool of over 60 students at both schools. This second analysis compared detailed coding of students' responses on the final exam problems to their answers for specific SSCI questions assessing the same topic. This analysis found statistically significant correlations between SSCI questions and final exam problems for some convolution and Fourier transform problems. Results were mixed for the problem on Bode plots.

Index Terms— Assessment, signals and systems, concept inventory

1. INTRODUCTION

Engineering problem solving requires both conceptual and procedural knowledge. For example, signals and systems students must understand both the fundamental concept of convolution and the procedure for computing the convolution integral. Student understanding of convolution and other important topics can be assessed in various ways. One way to test conceptual understanding is to administer a concept inventory, which is a standardized multiple-choice exam. A concept inventory (CI) is designed so that the incorrect choices (distractors) represent common misconceptions. Ideally, CIs emphasize conceptual understanding rather than rote calculation. Procedural knowledge, such as how to compute a convolution integral, is often measured using problem-solving exercises. Analysis of student responses to open-ended problems reveals whether they understand how to implement the convolution operation.

This study compares student responses to a concept inventory with their responses to open-ended final examination questions for a signals and systems course. A key motivation for this work is the need to validate the concept inventory. An exam is said to be valid if it "measures what it was intended to measure" [1]. There are a number of different aspects of validity; see the article by Moskal et al. for a summary [2]. This study investigates the content validity of the questions by examining whether student responses to the inventory accurately reflect their understanding of the underlying concept.
Work funded by NSF Grants DUE-0512430 and DUE-0512636.

It also looks at criterion-related evidence for validity by correlating the inventory scores with other measures, such as final exam scores. Steif et al.'s analysis of the Statics Concept Inventory is an example of the type of validation study required [1, 3].

The focus of this study is the Signals and Systems Concept Inventory (SSCI) [4]. The SSCI is a 25-question exam designed to test knowledge of the core concepts in the undergraduate signals and systems curriculum taught in electrical and computer engineering. Development of the SSCI began in 2001, and as of 2010, 30 instructors have given the SSCI to more than 2600 students. The project website (http://signals-and-systems.org) provides a complete list of SSCI-related publications. Instructors can request a password to access the latest copies of the inventory.

This paper is a follow-on to an initial study presented at the Frontiers in Education Conference in 2007 [5]. The 2007 study compared the SSCI and final examination results for a single class of students at one university. The present study analyzes a larger data set obtained from four classes at two universities. The rest of the paper is organized as follows. Section 2 describes the data set and provides relevant details about the courses where data were collected. Section 3 examines the correlation between students' SSCI scores and their scores on the final examination. Following that, Section 4 compares student responses to three open-ended final exam problems with their responses to related questions on the SSCI. Section 5 summarizes the paper.

2. DATA SET

This study focuses on data from four undergraduate signals and systems classes taught at George Mason University (GMU) and the University of Massachusetts Dartmouth (UMD) between 2006 and 2009. Table 2 summarizes the course information, number of students, instructor, and textbook for each of the classes in the data set. Three of the classes were sections of ECE 220 taught by the first author at GMU in three different semesters. The remaining class was ECE 321, taught by the second author at UMD. Both ECE 220 and ECE 321 focus on continuous-time linear signals and systems. ECE 220 is open to both sophomores and juniors, and is often taken concurrently with courses in circuits and differential equations. ECE 321 is taken primarily by second-term juniors who have already taken a discrete-time signals and systems course, two semesters of circuits courses, and a differential equations course.

Both ECE 220 and ECE 321 were scheduled for two 75-minute lectures per week. ECE 220 also had one 50-minute recitation and one 100-minute laboratory session each week. The laboratory assignments consisted of Matlab exercises.


ECE 321 does not have a separate laboratory session, but students worked on Matlab projects in small groups outside of class. Both ECE 220 and ECE 321 were taught using active and collaborative learning (ACL) methods, such as those described in the 2004 article by Buck and Wage [6]. Specifically, class periods consisted of short lecture segments interspersed with in-class problems that students worked in small groups. See the ECE 220 course webpages on Kathleen Wage's website [7] for more information on how ACL methods were implemented and for sample in-class problems.

The SSCI was administered as a pre-test and post-test in all four classes. The pre-test took place during the first lecture, and the post-test was administered during the first hour of the final examination period. The first two classes used version 3.2 of the CT-SSCI, while the last two classes used version 4.11; there were 20 common questions between versions 3.2 and 4.11. Table 1 shows the SSCI results for the 4 classes.

Table 1. SSCI results for 4 classes (SSCI version, pre-test mean (std), post-test mean (std), and gain for each class).

Table 2. Summary of classes in the data set.
  Class  Course       N   Instructor  Textbook
  1      GMU ECE 220  40  Wage        [8]
  2      GMU ECE 220  39  Wage        [8]
  3      GMU ECE 220  49  Wage        [9]
  4      UMD ECE 321  26  Buck        [9]

Figure 1 shows the post-test difficulty indexes (percentage correct) for the 20 SSCI questions that are common to versions 3 and 4. While there are some differences between the four classes, the general behavior of the difficulty index curves is quite similar. An ANOVA test indicates there is no statistically-significant difference in the four class means for the subtest containing only the common questions. The ANOVA test and the difficulty indexes indicate that the student populations in these four classes are similar enough that it is reasonable to compare data from these different semesters and schools.

Fig. 1. Comparison of post-test difficulty indexes (percent correct) for the four classes. Plot shows results for the 20 CT-SSCI questions shared by versions 3 and 4.

Both the SSCI post-test and the open-ended final problems counted towards the final grade. The SSCI was worth between 5% and 7% of the grade, and the final was worth between 13% and 19% of the grade. Final exams for all classes were closed book. During the SSCI portion of the exam period, students could not use notes or calculators. The second part of the final exam consisted of standard open-ended analytical problems. Once the SSCI was complete, students could use three sheets of notes and a calculator for the second part of the final.

In addition to the SSCI data, classes 1 and 4 shared three questions on the problem-solving section of their final exams. The students' responses to these questions were coded and linked to the SSCI results using anonymous study ID numbers. A five-level coding scheme was used: 4=correct, 3=minor errors, 2=major errors, 1=wrong, and 0=no answer. Three of the authors coded the results independently. Differences were discussed and resolved, so that the final coding represents the group consensus. Section 4 compares the coded results for these final exam problems to related questions on the SSCI.

3. CORRELATION BETWEEN SSCI SCORE AND FINAL EXAM SCORE

Figure 2 shows scatter plots of scores on the CT-SSCI versus scores on the final exam for each of the four classes. Based on these plots, the SSCI scores appear positively correlated with the final exam (problem-solving) scores. Table 3 contains the correlation coefficients and the associated significance values for the SSCI/final exam comparison. The correlation varies between 0.42 and 0.76 for these four classes, and the analysis indicates these correlations are statistically significant at the 1% level (p <= 0.007 for all classes). These results indicate that, for this population, students who have greater conceptual understanding (as measured by the SSCI) perform better on open-ended exam problems.

Fig. 2. Scatter plots of final exam score versus SSCI score for the four classes in the study.

Table 3. Correlation: CT-SSCI vs. final exam (correlation coefficient and significance value for each class).

The correlation is not equal to 1, nor would we expect it to be, since conceptual understanding and procedural understanding are not necessarily correlated. Mazur found that student scores on a series of conceptual physics problems were uncorrelated with scores on a set of paired "conventional" problems [10]. The conventional problems could be solved using procedural knowledge (a standard "recipe"), but the conceptual problems could not be solved by rote.
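The correlation analysis above can be reproduced for any class with a few lines of statistical software. The sketch below is a minimal illustration in Python with scipy (the paper does not name the software it used), and the score arrays are hypothetical placeholders standing in for the per-student SSCI post-test and final exam percentages, not the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-student scores (percent correct) for one class;
# the actual study data are not reproduced here.
ssci_post = np.array([72, 56, 88, 64, 48, 80, 60, 76, 68, 52])
final_exam = np.array([78, 61, 90, 70, 55, 84, 58, 81, 73, 60])

# Pearson correlation coefficient and two-sided significance level,
# analogous to the "Correlation" and "Significance" entries of Table 3.
r, p = stats.pearsonr(ssci_post, final_exam)
print(f"correlation = {r:.2f}, significance = {p:.3f}")
```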
4. COMPARISON OF OPEN-ENDED FINAL EXAM PROBLEMS TO SELECTED SSCI QUESTIONS

This section compares student responses to three final exam problems (F1, F2, and F3) to their performance on SSCI questions that assess related concepts; the question numbers refer to versions 3.2 and 4.11 of the CT-SSCI. Classes 1 and 4 are used in this analysis.

4.1. Problem F1: Convolution

Figure 3 shows Final Exam Problem F1, which assesses students' understanding of the fundamental topic of convolution. Problem F1 asks students to compute the output of an LTI system when the input is two non-overlapping rectangles and the impulse response is a short square pulse. The correct answer is a tall triangular pulse followed by a shorter trapezoidal signal. Students must first recognize that they need to convolve these signals, and then identify the correct result for that convolution. This problem is linked with SSCI v3.2 Question 8 (Q8) and SSCI v4.11 Questions 13 (Q13) and 15 (Q15). Classes 1 and 2 took version 3.2 of the CT-SSCI and Classes 3 and 4 took version 4.11 of the CT-SSCI.

Fig. 3. Final exam Problem F1 (one part of a longer problem): "System 1 is a linear time-invariant (LTI) system with the impulse response h1(t) shown below. Determine and sketch the output y(t) of the system when the input is the signal x(t) shown below." The figure shows plots of the pulse h1(t) and the two-rectangle input x(t).

SSCI question Q8 asks students to identify the output of an LTI system when the impulse response is a unit amplitude rectangular pulse and the input signal is a square pulse. The correct answer is a trapezoid. The distractors for Q8 vary substantially in the level of misconception. Two distractors probe whether students understand what convolution means, since they give the results of adding and multiplying x(t) with h(t). The third distractor has the right trapezoidal shape, but the wrong extent. Students choosing this distractor would recognize that convolving the given signals should produce a trapezoid, but lack sufficient understanding to identify which trapezoid.

Feedback from the SSCI Design Team and student interviews indicated that Q8 on version 3 was too inconsistent in its distractors, mixing gross misconceptions (e.g., adding instead of convolving) and subtle misconceptions (getting the right shape but the wrong time extent). Based on this feedback, version 4 of the SSCI replaced Q8 from version 3 with two new questions, Q13 and Q15, designed to probe different levels of misconceptions about convolution.

Q13 was designed to be the more basic of the two. The question statement for Q13 is identical to Q8 from version 3. The design goal for Q13 was to probe whether students have the basic understanding that they need to convolve the input and impulse response to find the output, and what shape that convolution will have. As with Q8, two of the distractors represent the addition and the multiplication of x(t) and h(t). The third distractor is now a rectangular pulse, but with the same extent and location as the correct output. This final distractor probes for students who remember how to find the starting and ending times for the convolution of two finite signals, but do not recall how to actually compute a convolution.

Question Q15 was designed to probe for a deeper understanding of convolution than Q13. This problem also asks the students to specify the output of an LTI system given the input and impulse response. In this case, both the input and the impulse response are unit amplitude rectangular signals, but with different lengths. Again, the correct answer is a trapezoid; however, in this case, the correct answer is the only trapezoid given among the choices. The distractors are a trapezoid with the wrong extents for the linear and constant regions (i.e., the slope is wrong), a trapezoid with the wrong amplitude for the constant region, and a triangle. However, all three distractors have the same region of support on the time axis as the correct trapezoid. Obtaining the correct answer to Q15 requires a deeper level of understanding about convolution than Q13 requires.

Table 4 compares the coded responses of Class 1 students to Problem F1 with their answers to SSCI Q8. As the table shows, all students who were able to compute the correct answer to the open-ended problem selected the correct answer to Q8. The majority of students who made minor errors on the open-ended problem were also able to choose the correct answer to the conceptual question.

Table 4. F1 versus CT-SSCI Question 8, Class 1 (rows: SSCI Q8 response; columns: Problem F1 coding - correct / minor errors / major errors / wrong / zero).
  correct:                19  6  5  1  0
  wrong (shape ok):        0  1  5  2  0
  wrong (add sigs):        0  0  0  1  0
  wrong (multiply sigs):   0  0  0  0  0
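As a brief aside, the trapezoidal answers discussed above follow directly from the convolution integral. The worked case below assumes two generic unit-amplitude rectangular pulses of widths T1 <= T2, both starting at t = 0; it is not the specific signal pair used in Problem F1 or in the SSCI questions.

```latex
% Convolution of two unit-amplitude rectangular pulses,
% x(t) = 1 on [0, T_1] and h(t) = 1 on [0, T_2], with T_1 <= T_2:
\[
  y(t) = \int_{-\infty}^{\infty} x(\tau)\,h(t-\tau)\,d\tau =
  \begin{cases}
    t,             & 0 \le t \le T_1 \quad \text{(linear rise)}\\
    T_1,           & T_1 \le t \le T_2 \quad \text{(flat top)}\\
    T_1 + T_2 - t, & T_2 \le t \le T_1 + T_2 \quad \text{(linear fall)}\\
    0,             & \text{otherwise.}
  \end{cases}
\]
% The support is [0, T_1 + T_2]; the trapezoid collapses to a triangle
% only in the equal-width case T_1 = T_2.
```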

Some students who made major errors or were completely wrong on Problem F1 were able to answer Q8 correctly. Assuming that students who made minor errors had a correct understanding of convolution, Table 4 reduces to the 2 x 2 contingency table shown in Table 5. This contingency table can be analyzed using Fisher's exact test [11]. Fisher's test indicates that the results for Q8 and Problem F1 are statistically significantly correlated (p = 0.0003). Note that the only student who selected the SSCI distractor indicating he would add the input and the impulse response to obtain the output was also completely unable to solve the open-ended convolution problem.

Table 5. Contingency table for F1 vs. Q8, Class 1 (rows: SSCI Q8; columns: Problem F1 - correct / incorrect).
  correct:    25  6
  incorrect:   1  8

Tables 6 and 7 compare the coded responses of Class 4 to the open-ended Problem F1 with their responses to the two new convolution questions on v4.11 of the SSCI. As Table 6 shows, all but one student in Class 4 was able to answer Q13 correctly. The 25 students who chose the correct answer to Q13 gave responses to F1 that ranged from correct to completely wrong. The student who answered Q13 incorrectly also produced a completely wrong answer to Problem F1. Given the distribution of responses in Table 6, it is not surprising that Fisher's exact test of the corresponding 2 x 2 table rejects the hypothesis that student responses to Q13 and F1 are correlated. Table 7, comparing F1 and Q15, shows similar results. Most of the students were able to select the correct answer to Q15, but their responses to Problem F1 varied from correct to completely wrong. Fisher's exact test for the 2 x 2 table corresponding to Table 7 rejects the hypothesis that the results for Q15 and Problem F1 are correlated.

Table 6. F1 versus CT-SSCI Question 13, Class 4 (rows: SSCI Q13 response; columns: Problem F1 coding - correct / minor errors / major errors / wrong / zero).
  correct:                    5  8  7  4  1
  wrong (equal to h(t)):      0  0  0  1  0
  wrong (add sigs):           0  0  0  0  0
  wrong shape/right length:   0  0  0  0  0

Table 7. F1 versus CT-SSCI Question 15, Class 4 (rows: SSCI Q15 response; columns: Problem F1 coding - correct / minor errors / major errors / wrong / zero).
  correct:                    5  6  4  3  1
  wrong (slope is incorrect): 0  0  0  1  0
  wrong (max height wrong):   0  1  3  1  0
  wrong shape (triangle):     0  1  0  0  0

There are several possibilities for why Q8 correlates well with Problem F1 while Q13 and Q15 do not. First, the data set for Class 4 is relatively small; a larger data set may be required to fully sample the distractor space. Second, SSCI questions 13 and 15 are more focused on specific aspects of convolution than Q8. While the general five-level coding scheme ("correct", "minor errors", "major errors", "wrong", and "zero") worked well for the general Q8, it may not be appropriate for these more specific questions. For example, a coding scheme that ignores time axis errors might work better, since all of the Q15 answers have the same time extent. Third, it is possible that the cognitive level of Problem F1 is significantly higher than that of questions 13 and 15. This mismatch in level could explain the lack of correlation of the results, as noted in the conclusions (Section 5 below).
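As an illustration of the contingency-table analysis used here and in the remainder of Section 4, the sketch below applies Fisher's exact test to the 2 x 2 counts of Table 5. Python with scipy is one possible tool (the paper does not state which software was used); the two-sided p-value should come out near the reported value of 0.0003.

```python
from scipy import stats

# Table 5 (Class 1): rows = SSCI Q8 response, columns = Problem F1 outcome,
# after grouping the "correct" and "minor errors" codes together as correct.
table = [[25, 6],   # Q8 correct:   F1 correct, F1 incorrect
         [1, 8]]    # Q8 incorrect: F1 correct, F1 incorrect

# Two-sided Fisher's exact test of independence between the two outcomes.
odds_ratio, p_value = stats.fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.1f}, two-sided p = {p_value:.4f}")
```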
4.2. Problem F2: Bode Frequency Response Plots

Final exam Problem F2 and CT-SSCI Q22 (v3.2) and Q20 (v4.11) are also closely related. All of these questions require students to work with Bode frequency response plots. Problem F2, shown in Figure 4, asks students to sketch the Bode magnitude plot for a system function H(s) with two poles (at s = -1, -100) and one zero (at s = -10). Q22 and Q20 are the same basic conceptual question, differing only in formatting details. Q22/20 provides a Bode magnitude response for a system H(jω) and asks the student to identify the magnitude response of a new system obtained by multiplying H(jω) by an additional pole. The question is designed to assess whether students understand how the introduction of a new pole modifies the Bode plot. The distractors include one that adds a zero instead of a pole, one that changes the DC value of the Bode plot, and one that puts the pole at the wrong frequency.

Fig. 4. Final exam Problem F2 (one part of a longer problem): "A causal linear time-invariant system has the transfer function H(s) = (s + 10) / ((s + 1)(s + 100)). Sketch the Bode magnitude plot for this system."

Tables 8 and 9 summarize the results for the comparison of the open-ended Bode plot problem and the corresponding SSCI question. The results for Class 1 shown in Table 8 indicate that most students can correctly answer the conceptual question, but the coded results indicate that these same students exhibit varying levels of ability when it comes to sketching a Bode magnitude plot given the system function. Fisher's exact test for the corresponding 2 x 2 table rejects the hypothesis that the results for Problem F2 and SSCI Q22 are correlated. The results for Class 4 shown in Table 9 show a similar mixture of coded responses for the large majority of students who get Q20 correct, and very few students got the conceptual question wrong. Again, the hypothesis that the open-ended Bode plot problem and the SSCI question are correlated is rejected by Fisher's test.

Table 8. F2 versus CT-SSCI Question 22, Class 1 (rows: SSCI Q22 response; columns: Problem F2 coding - correct / minor errors / major errors / wrong / zero).
  correct:              10  11  6  0  0
  wrong (pole at 10):     1   2  1  2  1
  wrong (zero at 100):    2   0  1  0  0
  wrong (-40dB offset):   1   1  1  0  0

Table 9. F2 versus CT-SSCI Question 20, Class 4 (rows: SSCI Q20 response; columns: Problem F2 coding - correct / minor errors / major errors / wrong / zero).
  correct:               4  5  4  5  1
  wrong (pole at 10):    2  0  1  2  1
  wrong (zero at 100):   0  0  0  0  0
  wrong (-40dB offset):  0  0  0  1  0

As noted in the previous section, there are several possibilities for the lack of correlation between the conceptual question and the open-ended problem. It may be useful to consider a different coding scheme that is tailored to represent the distractors in Q22/20.
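For context on what a correct sketch for Problem F2 entails, the standard straight-line (asymptotic) analysis of the given H(s) is worked below; this is a textbook-style example, not an excerpt from the course's grading rubric.

```latex
% Asymptotic Bode magnitude analysis for H(s) = (s + 10)/((s + 1)(s + 100)):
\[
  |H(j\omega)|_{\mathrm{dB}} = 20\log_{10}|j\omega + 10|
                             - 20\log_{10}|j\omega + 1|
                             - 20\log_{10}|j\omega + 100|.
\]
% DC level: |H(j0)| = 10/(1*100) = 0.1, i.e. -20 dB.
% Break frequencies (rad/s) and asymptotic slopes:
%   omega < 1:          0 dB/decade (flat at -20 dB)
%   1 < omega < 10:   -20 dB/decade (pole at s = -1)
%   10 < omega < 100:   0 dB/decade (zero at s = -10 cancels the downward slope)
%   omega > 100:      -20 dB/decade (pole at s = -100)
```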

4.3. Problem F3: Fourier Transform

Final exam Problem F3, shown in Figure 5, is related to a collection of five questions on the CT-SSCI dealing with Fourier transform properties. On CT-SSCI v3.2, the relevant questions are 10-12, 15, and 16, while on CT-SSCI v4.11, these are Questions 9-11, 21, and 22. The content and distractors for these five questions were essentially unchanged between the two versions of the exam (one distractor on Q15 was changed based on a review of the version 3 data and feedback from the SSCI Development Team), although the formatting and wording were adjusted in an effort to make the questions easier to read with a cleaner layout.

Fig. 5. Final exam problem F3: "The signal p(t) has the Fourier transform P(ω) shown below. (a) Determine and sketch p(t). Use the definition of the inverse Fourier transform to determine this result. (You may use the Fourier transform table to check your result, but you will not receive full credit unless you prove your result using the definition.) (b) The signal p(t) is the input to the following system. Note that the frequency response H(ω) is shown below the system. Determine and sketch the Fourier transform of the signals at each point in this system. In other words, sketch S(ω), X(ω), Y(ω), and R(ω). Show your sketches and any other work below." The figure shows P(ω), a unit-amplitude pulse extending from -B to +B, and the block diagram of the system in part (b).

Part (a) of Problem F3 asks students to do a rote calculation of the inverse Fourier transform of a square pulse, while part (b) requires students to apply Fourier transform properties to analyze a new system. F3b requires conceptual understanding to deconstruct a rather complex problem into a set of subproblems. It is therefore reasonable that performance on F3b is statistically-significantly correlated with the SSCI subtest on Fourier transform concepts. On the other hand, F3a tests students' procedural knowledge; thus its results should not necessarily be correlated with the conceptual question results.

For this study we compare the results of F3a and F3b with scores on an SSCI subtest consisting of the five Fourier-transform-related conceptual questions. (For the earlier study [5] the subtest had six questions, but one of those questions was removed between versions 3 and 4 of the SSCI.) Figures 6 and 7 show the scatter plots of the Fourier subtest scores versus the coded responses for Problem F3 for Classes 1 and 4, respectively. The correlation coefficient and the statistical significance are shown in the title of each plot, and the size of the dots is indicative of the number of students with that response. Figure 6 indicates that for Class 1, the results for F3b have a correlation of 0.497 with the Fourier subtest, significant at better than the 1% level (p = 0.001), while the correlation of F3a with the Fourier subtest is not significant. The results for Class 4 shown in Figure 7 show similar behavior: the correlation of F3b with the Fourier subtest is 0.46 with significance level p = 0.018, while the correlation of F3a with the subtest is not significant. These results are consistent with the idea that F3a tests procedural knowledge and F3b tests conceptual understanding.

Fig. 6. Scatter plots of the results for Problems F3a and F3b for Class 1.

Fig. 7. Scatter plots of the results for Problems F3a and F3b for Class 4.
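For completeness, the rote calculation requested in F3a is sketched below, assuming (as the figure for Problem F3 indicates) that P(ω) is a unit-amplitude pulse extending from -B to +B. This is the standard textbook result, shown for the reader's convenience rather than as a reproduction of the official exam solution.

```latex
% Inverse Fourier transform of P(omega) = 1 for |omega| <= B (0 otherwise):
\[
  p(t) = \frac{1}{2\pi}\int_{-\infty}^{\infty} P(\omega)\,e^{j\omega t}\,d\omega
       = \frac{1}{2\pi}\int_{-B}^{B} e^{j\omega t}\,d\omega
       = \frac{e^{jBt} - e^{-jBt}}{2\pi j t}
       = \frac{\sin(Bt)}{\pi t}.
\]
% A sinc-shaped pulse with p(0) = B/pi and zero crossings at t = k*pi/B
% for nonzero integers k.
```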
5. CONCLUSIONS

An assessment of validity is crucial for any concept inventory. This paper evaluated the validity of the SSCI by comparing student performance on the inventory questions to their performance on a set of related open-ended exam problems. Analysis of a population of more than 150 students in four classes at two universities indicated a statistically-significant correlation between SSCI scores and associated final exam scores. A more detailed analysis of the final exam problems for two classes revealed significant correlations between the coded scores for open-ended problems and the relevant SSCI questions. Specifically, in Tables 4, 6, and 7 all of the students who got the correct answer on the open-ended problem also got the SSCI question correct. Results for other open-ended problems, notably the Bode plot problem, were not statistically-significant: many of the students who correctly solved the SSCI question had serious errors on the open-ended problem. The results from Section 4 raise the possibility that the SSCI questions were easier for many students in this pool than the open-ended problem-solving questions. This suggests that it might be revealing to examine the SSCI and final exam questions from the perspective of Bloom's taxonomy, to see whether some of the open-ended problems require the students to function at a higher cognitive level than the paired conceptual questions.

6. ACKNOWLEDGMENTS

We thank the National Science Foundation (NSF) for its support of the SSCI project through grants DUE-0512686 and DUE-0512430 under the Assessment of Student Achievement program. NSF also supported the initial development of the SSCI through grant EEC-9802942 to the Foundation Coalition. In addition, we thank the members of the SSCI Development Team for their input on the design and development of the SSCI.

7. REFERENCES

[1] Paul S. Steif and John A. Dantzler, "A Statics Concept Inventory: Development and Psychometric Analysis," Journal of Engineering Education, pp. 363–371, October 2005.

[2] Barbara M. Moskal, Jon A. Leydens, and Michael J. Pavelich, "Validity, reliability, and the assessment of engineering education," Journal of Engineering Education, pp. 351–354, July 2002.

[3] Paul S. Steif and Mary Hansen, "Comparisons between performances in a statics concept inventory and course examinations," International Journal of Engineering Education, vol. 22, pp. 1070–1076, 2006.

[4] Kathleen E. Wage, John R. Buck, Cameron H. G. Wright, and Thad B. Welch, "The Signals and Systems Concept Inventory," IEEE Trans. on Educ., vol. 48, no. 3, pp. 448–461, August 2005.

[5] John R. Buck, Kathleen E. Wage, Margret A. Hjalmarson, and Jill K. Nelson, "Comparing student understanding of signals and systems using a concept inventory, a traditional exam and interviews," in Proceedings of the 37th ASEE/IEEE Frontiers in Education Conference, Milwaukee, WI, October 2007, pp. S1G1–S1G6.

[6] John R. Buck and Kathleen E. Wage, "Active and Cooperative Learning in Signal Processing Courses," IEEE Signal Processing Magazine, vol. 22, no. 2, pp. 76–81, March 2005.

[7] "ECE 220 course materials," http://ece.gmu.edu/~kwage/teaching.html.

[8] B. P. Lathi, Linear Systems and Signals, Oxford University Press, Oxford, England, second edition, 2005.

[9] A. V. Oppenheim and A. S. Willsky with S. H. Nawab, Signals and Systems, Prentice Hall, Upper Saddle River, NJ, 1997.

[10] Eric Mazur, Peer Instruction: A User's Manual, Prentice Hall, Englewood Cliffs, NJ, 1997.

[11] Jerrold H. Zar, Biostatistical Analysis, Prentice Hall, Englewood Cliffs, NJ, fourth edition, 1999.
