ScanExam-II for New Users
Compiled from Scan Exam-II HELP topics, August 2003
1. Install Scan Exam-II Software (Scanex)
2. HELP within Scanex
3. Features Overview
4. Exam Format Menu
5. Getting Started
6. Edit Menu
7. View Menu
8. General Item Analysis
9. Transfer Scanex grades to Marks Management
10. Cheating Analysis
11. Quick Surveys
1. Install Scan Exam-II Software (Scanex)
Use a web browser to visit ssts.uwo.ca/network/software_resources/downloads.html and download the self-install package. Double click this file to execute the software installation. The Scanex software is subsequently run from:
Start > All Programs (All Apps) > Scan Exam-II (or look for a desktop shortcut in public rooms)
Note: It is important to uninstall any older versions of Scan Exam-II prior to installing the current version.
2. HELP within Scanex
The HELP menu command provides the best source of documentation for all the features and operations of the software. A Table of Contents and full text word/phrase searching are provided for finding information. To look up the help topic for any command that appears in the menus quickly, press the F1 key while the command is highlighted.
3. Features Overview
Scanex is an application for scoring, analysing, reporting and editing multiple choice exams that have been processed with an optical sheet scanner. Scan Exam-II is designed primarily to work with the University of Western Ontario standard multiple choice 180 question (A-E choices) Scantron Form No. F-13209-UWO and the extended multiple response 45 question (A-T choices) Scantron Form No. F-13622-UWO. (The latter form is no longer available for purchase from Western Office Supplies, but the software still supports reading it if you have copies in stock.) Information Technology Services, located in the Support Services Building, provides Scantron sheet scanning for all UWO instructors, although some Faculties, such as Social Science, Engineering and Business, also provide scanning services.
- The student answer sheets and master answer sheet are displayed on the screen as a replica of the Scantron Form. This makes both the error correction process and general browsing of student answer sheets intuitive and quick. The screen images are resizable and best suited to monitors that support resolutions of 800x600 and above.
- Student answer sheets are dynamically marked, showing the number correct, number incorrect, raw score, percentage score, a highlight swash through unanswered questions and a red mark highlighting questions answered incorrectly. As you edit a student answer, the answer key, the question weights, question alternatives, etc., the student's sheet is re-marked concurrently. Student numbers may also be validated against an authority file such as a Registrar's class list file or a Marks Management System file.
- Unanswered (blank) questions, multiply marked questions and student number coding problems are highlighted for quick identification. There is a coding error summary report for checking the integrity of the file, as well as a Find Next Error button useful for systematically locating problem sheets for subsequent editing. Note: multiply marked is defined as two or more responses for standard multiple choice exams, while for extended multiple response exams it means more responses than the correct answer defines.
- The Exam Analysis view provides standard descriptive statistics of the exam results including graphical depictions of letter grade and raw grade distributions.
- Exam question item analysis provides difficulty rating, point biserial, number answered correctly, number answered incorrectly and the distribution of responses to the question. Other analyses include item distracter analysis, item discriminator analysis and inter-item correlation analysis.
- Numerous options can be set easily from drop down menus which allow for control of marking method, question weights, question alternatives and weights, and question mapping (different versions of the same exam).
- Custom or Preset Question and Answer Choice mapping is available for defining up to nine additional exam layouts that can be either question rearrangements of the master exam questions or answer choice rearrangements of the master exam. This is often done to reduce cheating that might otherwise occur if identical exams were used in adjacent rows of the examination room. Preset mapping generates nine additional mappings for the user to choose from when setting up their exam, whereas Custom mapping allows the user to define the explicit question or answer choice mappings used. Use of mapping allows all variations of the exam sheets to be processed collectively, eliminating the need to process the different versions of the exam separately (a sketch after this list illustrates the idea).
- A tool to assemble the master exam, including the Custom or Preset mapped versions. Each of the exam questions must be prepared as a simple text file or HTML file in order to use this tool. Incorporation of graphics and picture files into questions is simplified by this approach.
- A tool to produce a simple text file of student numbers and their grades. This file is compatible for loading into the Marks Management System and other packages used by instructors for managing their student grades.
- Tools to produce an HTML file of the exam statistics and student numbers with individual grades and/or an HTML file of student exam answer selections. These files are suitable for viewing with web browsers such as Netscape or Microsoft Internet Explorer.
- A tool for investigating the possibility of cheating. The technique used is Answer Choice Match Analysis.
- A tool for conducting quick surveys as part of the examination process. Typically the student records his/her answers to opinion or priority type questions that are included at the end of the test or examination. The analysis of these questions provides percentage distributions of the question responses, and question means and standard deviations based on a fixed scale of 5(A), 4(B), 3(C), 2(D), 1(E), with N/A or blank responses excluded. Not available for the extended multiple response format.
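To make the question mapping idea above concrete, here is a minimal Python sketch (the names and data are illustrative assumptions, not Scanex internals) of translating a scrambled version's answers back into the master exam's question order before marking:

    # Illustrative sketch of question mapping: answers from a scrambled
    # exam version are rearranged into master exam question order so
    # that every version can be marked against the one master key.

    def unscramble(version_answers, question_map):
        """question_map[i] = master question index (0-based) presented
        at position i on this scrambled version."""
        master_order = [None] * len(version_answers)
        for i, ans in enumerate(version_answers):
            master_order[question_map[i]] = ans
        return master_order

    # Example: a version presenting master questions 2, 0, 1 in that order.
    print(unscramble(["A", "C", "B"], [2, 0, 1]))   # -> ['C', 'B', 'A']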
4. Exam Format Menu
Scanex supports two exam formats. These two formats cannot be mixed together in one DAT file. The most common format is the Standard Multiple Choice exam consisting of a 5-answer choice (A-E) format and uses the University of Western Ontario 180 question Scantron Form No. F-13209-UWO. The second is the Extended Multiple Responses exam consisting of a 20-answer choice (A-T choices) format and uses the 45 question Scantron Form No. F-13622-UWO. (The latter form is no longer available for purchase from Western Office Supplies, but the software still supports reading this form if you have copies in stock.)
Standard Multiple Choice
The standard multiple choice format offers up to 5 answer choices A through E per question and the student may record only one answer per question. If the student marks a question with two or more responses it is interpreted as ambiguous and recorded as a question mark (?). Blank and multiply marked questions are highlighted by Scanex as potential student coding errors. If left unchanged, a multiple response to a question is marked as incorrect. A question that is left blank is considered neither correct nor incorrect. By default (see HELP for other marking methods), a student's score is calculated as the sum of the individual question weights of all correct questions, and this score is expressed as a percentage of a perfect exam score. Where one or more alternate correct answers are defined for a question, a student's answer is marked correct if it matches one of the alternate choices and the score is adjusted using the weight assigned to the corresponding alternate answer.
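For concreteness, the default marking method just described can be sketched in a few lines of Python. This is an illustration only, not Scanex's actual code; the data shapes (lists of letters, '?' for multiple marks, None for blanks) are assumptions:

    # Illustrative sketch (not Scanex code) of the default marking
    # method: sum of weights of correct questions, no penalty for
    # wrong answers, score expressed as a percentage of a perfect exam.

    def mark_sheet(answers, key, weights, alternates=None):
        """answers: list of 'A'-'E', '?' (multiple marks) or None (blank).
        key: correct choices; None means question removed from marking.
        alternates: optional {question index: {choice: weight}}."""
        alternates = alternates or {}
        raw = perfect = 0.0
        for i, correct in enumerate(key):
            if correct is None:
                continue                          # question removed
            perfect += weights[i]
            if answers[i] == correct:
                raw += weights[i]                 # normal correct answer
            elif answers[i] in alternates.get(i, {}):
                raw += alternates[i][answers[i]]  # alternate correct answer
            # blank or '?' earns nothing; no penalty applied
        return raw, (100.0 * raw / perfect if perfect else 0.0)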
Extended Multiple Response
The extended multiple response exam format is similar to the standard multiple choice format except that each question has up to 20 choices A through T and the student may record multiple answer choices as necessary to answer the question. This is a very powerful format that is beyond the scope of this basic introduction. Please refer to the HELP for a full description.
When Scanex loads a DAT file it automatically senses the exam format. The screen view of each format simulates the appearance of the exam sheet.
5. Getting Started
The design and preparation of multiple choice exams is an artful and arduous task. Although Scanex is used primarily to "mark" exams, its considerable question analysis features provide useful feedback for post-exam adjustments and possible question refinements for future use.
Example 1 - Most Common Application
The most common use of Scanex is to edit and mark an exam where all questions are designed with a weight of 1.0, the marking method is a simple sum of weights with no penalty applied for wrong answers, and a single version of the exam is taken by all students, i.e. no scrambled versions. This scenario corresponds to the default option settings in Scanex. The instructor completes the Exam Answer Key scan sheet using student number 999999999 and simply includes it with the pile of student answer sheets that are given to the scanning service. Alternatively, the Answer Key may be entered after scanning if this is preferred for security reasons. After scanning, the sheets are returned to the instructor along with a diskette (or e-mail) copy of the file of scanned sheets.
The file of scanned sheets is referred to by Scanex as the "DAT" file, .DAT being the default file extension (e.g. EC150A.DAT) that Scanex expects when opening a file of scanned sheets. When Scanex opens a DAT file that includes the Answer Key, it recognizes the key by student number 999999999, senses the number of questions and marks the exam. Alternatively, the answer key may be entered/edited after the exam sheets are loaded.
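Purely as an illustration of the key recognition rule (the DAT record layout itself is fixed by the scanning service and is not reproduced here), a sketch along these lines would split the Answer Key from the student sheets once the records have been parsed:

    # Hypothetical sketch: separate the Answer Key (student number
    # 999999999) from the student sheets after a DAT file has been
    # parsed into records. The record shape here is an assumption.

    KEY_NUMBER = "999999999"

    def split_key(records):
        """records: list of dicts like {"student": "...", "answers": [...]}."""
        key, students = None, []
        for rec in records:
            if rec["student"] == KEY_NUMBER:
                key = rec                # recognized as the master Answer Key
            else:
                students.append(rec)
        return key, students             # key is None if entered after loading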
Usually the instructor disregards the initial exam results until all student sheets containing marking errors, i.e., questions which the scanner has read as blank or multi-marked (possibly poor erasures), have been investigated and corrected to the extent possible. This is the editing step. Response sheets having a questionable error are flagged on the screen with a teal coloured highlight over the specific areas of concern. Matching the corresponding student exam scan sheet to the one on the screen is a matter of looking through the pile of sheets, which were imprinted with a sequential serial number during the scanning process. The serial number shown on the screen provides a convenient lookup into the pile of response sheets, while the student number provides final confirmation that the correct sheet has been matched. If the scanner does not provide a printed serial number, refer to the sequential sheet count indicator that appears with the browse forward/reverse buttons and use it to determine which sheet in the pile to find. As highlighted questions are compared to the original sheet and corrected, the teal coloured highlight disappears. Each subsequent problem sheet is located by clicking the Find Next Error button. This process continues until all errors are corrected or until it is determined that any remaining errors are uncorrectable. The Edit menu provides a summary report of all scanning errors.
Use the View menu and select Exam Analysis to see the statistical profile of the exam and the detailed question analysis reports. The Print menu is used to print the exam analysis, a posting format for the student grades and, if desired, a report detailing individual student answers. Many instructors choose the option to produce posting information in HTML format for subsequent placement on their course web sites (see Tools menu).
The student grades may be output to a disk file using the Create DPC file command found in the Tools menu. The text file consists of lines containing a student number and percentage grade or raw score. This file is compatible for loading into the Marks Management System (MMS) or spreadsheet software such as Excel.
6. Edit Menu
A few functions are provided for viewing a summary of scanning errors, searching for students, and inserting sheets or deleting unwanted sheets from the file, along with specialized clipboard functions specifically for manipulating mapping codes (refer to the advanced topics guide for Mapped Exams).
Scan Error Summary
All sheets are checked for possible scanning errors and student completion errors, such as blank and multiply marked bubbles occurring in the student number field and question answer fields. If the Mapped Exams option is in effect, checking also occurs in the exam code field of all sheets; otherwise this field is ignored. When Mapping is on, exam codes must be coded exactly 111, 222, and so on up to 999. If the Sections Range option is in effect, checking also occurs in the section code field of all sheets; otherwise this field is ignored. When on, section codes must be of the format 001 to 999 and must fall within the specific range declared. Each sheet with errors is located by clicking the Find Next Error command button.
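As a rough illustration of these code checks (the authoritative rules are those in Scanex itself), the validity tests could be expressed as:

    # Illustrative only: the code validity rules described above.
    # Exam codes must be exactly "111", "222", ... "999"; section codes
    # must be three digits 001-999 and within the declared range.

    def valid_exam_code(code):
        return code in {str(d) * 3 for d in range(1, 10)}

    def valid_section_code(code, low, high):
        return code.isdigit() and len(code) == 3 and low <= int(code) <= high

    assert valid_exam_code("333") and not valid_exam_code("123")
    assert valid_section_code("005", low=1, high=10)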
Validate Student Numbers
Exam sheet student numbers may be validated against an authority file, which can be either a Registrar's Class List file (MMS compliant) or the course's existing Marks Management System (MMS) file itself. Any exam sheet student number that cannot be matched against the authority file is reported in a list on the screen. Subsequently, locating the invalid sheets must be done using the Find Student command, since the Find Next Error command button only locates coding or code range errors.
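Conceptually the validation is a set lookup. A minimal sketch, assuming (hypothetically) one student number at the start of each authority file line; the real MMS file layout may differ:

    # Minimal sketch of student number validation against an authority file.

    def invalid_students(sheet_numbers, authority_path):
        with open(authority_path) as f:
            known = {line.split()[0] for line in f if line.strip()}
        return [s for s in sheet_numbers if s not in known]  # the unmatched list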
7. View Menu
Three views are provided as described below.
Student Scan Sheets View
This is the default view when Scanex starts. It displays student exam answer sheets. To edit any of the bubbles simply click inside the bubble to shade or unshade it. The grading for the sheet is shown on the right side of the sheet and is dynamically re-graded as it is edited (i.e. corrections to responses or any of the option settings). Areas highlighted in teal signify that a question or student number needs to be investigated as a possible bad scan or student coding problem. Question numbers shown in red have been marked wrong as determined by the Answer Key and any appropriate mapping in effect.
On the left side of the screen there is a variety of navigational aids. At the top is a drop down list of all the student numbers shown in the order that they appear in the file, i.e. unsorted. Beneath is another drop down list that shows the sequential serial numbers printed onto each of the scan sheets as they were read through the scanner. Both these lists provide a convenient way to locate a particular sheet in the file. (See Find Student for searching).
The Find Next Error button provides a quick jump forward from the current sheet to the next sheet that has a question coding problem or student number coding problem. Coding errors are highlighted in teal. A blank Exam Code is not an error because a blank Exam Code field is typical for single version exams and is treated as equivalent to 000 if a mapped exam is in effect. Questions that are "off" in the Answer Key are always ignored.
The Browse Forward and Browse Back buttons permit sequential browsing forward and backward through the file. The Flip Page button turns over the sheet to view questions 61-180.
Exam Analysis View
There are descriptive statistics, general item analysis, inter-item correlations, distracter analysis and cheating analysis to choose from. The descriptive statistics and general item analysis are automatically regenerated whenever edits occur; the more computationally involved features, such as inter-item correlations, distracter analysis and cheating analysis, execute only when selected. The descriptive statistics and general item analysis are relatively quick processes, depending on the speed of the microcomputer and the number of answer sheets being marked. All the exam statistics use student percentage scores rounded to one decimal place, NOT raw scores.
Master Answer Key View
This view is identical to the Student Scan Sheets View except that all navigation buttons disappear. In addition, the student number is fixed at 999999999 and the EXAM CODE field is fixed as blank or 000. The master answers are set or edited by clicking the appropriate response bubbles. If a question is set to have no answer (blank, indicated by a teal highlight), then that question is disregarded in the marking process and in coding error detection. This becomes useful after an exam has been given and it is determined that one or more questions were sufficiently ambiguous that they should be removed from the marking process.
8. General Item Analysis
This analysis provides a tabular display of each question's difficulty, point biserial, number of students answering correctly and incorrectly, the correct answer, the distribution of answer choice selections and Cronbach's alpha (coefficient of reliability). A graphical display of the question difficulty and point biserial is also provided. If multiple exams (i.e. mapped exams) are used then split-out analysis is possible, and the split-out question statistics are always reported consistently in terms of the master exam question order. The following is given in a table format for each question.
Difficulty Rating = W/(C+W)
where
C is Number of Students Answering Question Correctly
W is Number of Students Answering Question Incorrectly
Note: blank answers are discarded for this calculation.
Difficulty values range from 0 (very easy) to 1 (very difficult). Scanex uses an increasing value scale to represent increasing difficulty of a question, which permits a consistent graphing of this value with the Point Biserials. See Point Biserial for further details.
Point Biserial = (Mp - Mq)/sd * SquareRoot(p*q)
where
Mp is mean score of students answering question correctly
Mq is mean score of students answering question incorrectly
sd is the standard deviation of the exam scores
p is the proportion of students answering question correctly
q is the proportion of students answering question incorrectly
Note: blank answers are considered incorrect only for the purpose of assigning scores to Mp, Mq and counts to p and q.
The Point Biserial can be thought of as the product moment correlation where one variable is dichotomous (the item is answered correctly or incorrectly) and the other variable is continuous and equal to the test scores. Point Biserial values range from -1 (answered correctly more often by students with low scores) to +1 (answered correctly more often by students with high scores). If a question with a near perfect point biserial (+1.0) were graphed as score against correct/incorrect, there would be a complete absence of graph points (students) corresponding to a high score with an incorrect answer, or a low score with a correct answer.
In general, relatively high point biserials are desirable; +.30 and above is very good. Any item with a point biserial near zero or negative should be carefully inspected. It is possible for an item to correlate near zero and still be a valid item, but more often than not the item is sufficiently ambiguous that as many good students get it incorrect as poor students get it correct. Sometimes this indicates that there are actually two equally correct answers for the question, which the instructor may then elect to define as an alternate correct answer using the Options menu. This doesn't make the question better; it only partially addresses the scoring anomaly. A negative point biserial indicates a very bad question, one where the majority of good students answer the question incorrectly while the majority of poor students answer it correctly, completely counter to what any exam question should produce. Sometimes a negative Point Biserial simply highlights a coding error on the master answer key, where the instructor has coded an incorrect choice as the correct answer. Unless there are grounds for keeping a question with a negative point biserial, such questions should be considered for removal (see Considerations for Removing Questions below). A graph appears beneath the question analysis, plotting the Point Biserial and Difficulty for each question.
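For concreteness, both statistics can be computed directly from the formulas given above. The following Python sketch is an illustration, not Scanex's code, with blanks handled per the notes above (discarded for the Difficulty Rating, counted as incorrect for the Point Biserial):

    import math
    from statistics import pstdev

    def difficulty(flags):
        # flags[j]: True (correct), False (incorrect) or None (blank).
        c = sum(1 for f in flags if f is True)
        w = sum(1 for f in flags if f is False)   # blanks discarded
        return w / (c + w) if (c + w) else 0.0    # 0 = very easy, 1 = very hard

    def point_biserial(flags, scores):
        # scores[j]: student j's exam percentage score.
        correct = [s for f, s in zip(flags, scores) if f is True]
        incorrect = [s for f, s in zip(flags, scores) if f is not True]  # blanks incorrect
        if not correct or not incorrect:
            return 0.0
        p = len(correct) / len(scores)            # proportion answering correctly
        q = 1.0 - p
        mp = sum(correct) / len(correct)          # mean score, correct answerers
        mq = sum(incorrect) / len(incorrect)      # mean score, incorrect answerers
        sd = pstdev(scores)                       # std deviation of the exam scores
        return (mp - mq) / sd * math.sqrt(p * q) if sd else 0.0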
Considerations for Removing Questions
Questions that produce a negative point biserial have poor testing characteristics and, as a simple rule of thumb, should be removed from the scoring by clicking them to blank on the Master Answer Key. Removal of even one question will cause individual student grades to fluctuate up and down. The decision to remove a question should be examined closely to see if there is an explanation for its behaviour. First, examine the question to determine whether the master answer is indeed the correct answer, i.e. rule out an instructor coding error. Second, ascertain whether the question may actually contain two legitimate correct answer choices, in which case the question may be marked using two correct answers (refer to Alternate Answers under the Options menu). Third, consider whether the question is deemed good by all other criteria and its poor statistical showing is perhaps more related to inconsistent coverage of course material by different instructors. If a decision is made to retain a bad question for those that got it correct but remove it for those that got it wrong, consider the use of the Bonus Question option.
9. Transfer Scanex grades to Marks Management
A tool (see Tools menu) will produce a text file of student numbers and grades or raw scores. The output file is compatible for loading into the Marks Management System (MMS) as well as spreadsheet packages such as Excel. The format of each line in the DPC file is a student number and grade (or raw grade) separated by a tab.
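Because the format is plain tab-separated text, it is easy to reproduce or post-process; a minimal Python sketch follows (the file name and grades shown are illustrative):

    # Minimal sketch of the DPC output format: one line per student,
    # student number and grade separated by a tab. Data is illustrative.

    def write_dpc(path, grades):
        """grades: list of (student_number, grade) pairs."""
        with open(path, "w") as f:
            for student, grade in grades:
                f.write(f"{student}\t{grade}\n")

    write_dpc("EC150A.DPC", [("123456789", 87.5), ("234567890", 64.2)])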
10. Cheating Analysis
A tool using Answer Choice Match Analysis is provided to assist with the investigation of cheating. This is an involved topic that is fully explained in the HELP of the software. The following is a brief introduction.
This tool may be used to guide an instructor in the investigation of cheating but it cannot prove cheating.
The statistical criterion used to detect and support assertions of cheating uses a default setting that is set quite high to avoid false suspicions. Nonetheless, there may be pairs of students identified by the process as suspected of cheating, and instructors are cautioned to disregard such results in the absence of other compelling evidence.
In general, same answer choice selection is expected to increase as the grades of the students being compared increase, because the incidence of correct answers necessarily rises. A bird's eye view graph of grade against number of matches illustrates this, even suggesting that a near linear relationship exists between the two, which comes as no real surprise. When comparing lower to middle scoring students, a large number of same answer choices is unusual because of the natural variability that occurs in wrong answer choices, as well as the natural variability in which questions are answered correctly. This natural variability is established by the students themselves and is not predetermined in any way. The empirical nature of this method is sensitive to how students actually respond on a particular exam, and it is more sensitive to identifying systematic rather than opportunistic cheating. Extreme values often suggest that students are cheating in a very co-operative manner. Where a single version of a test is used instead of multiple scrambled versions (refer to mapped exams in Scanex), this type of blatant cheating is quite feasible through the use of simple prearranged hand and finger signals, and especially feasible if an electronic communications device is used. Unfortunately, even multiple scrambled exams can be overcome if students are able to choose their own seating. Forced random seating placements can address this problem.
Provided the distribution of match counts is normal, the probabilities of observing match count Z values as extreme as 4 or greater are as follows.
Z=4 occurs about 3 in 100,000.
Z=4.7 occurs about 1 in 1 million.
Z=5 occurs about 3 in 10 million.
Z=6 occurs about 1 in 1 billion.
Z=7 occurs about 1 in 1 trillion.
Z=8 occurs about 6 in 10^16.
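These figures are one-sided tail probabilities of the standard normal distribution and can be verified with a few lines of Python (a check of the table above, not part of Scanex):

    # Verify the quoted one-sided tail probabilities of the standard
    # normal distribution: P(Z > z) = erfc(z / sqrt(2)) / 2.
    import math

    for z in (4, 4.7, 5, 6, 7, 8):
        print(f"Z={z}: P = {math.erfc(z / math.sqrt(2)) / 2:.1e}")
    # Z=4 -> ~3.2e-05 (about 3 in 100,000); Z=8 -> ~6.2e-16 (about 6 in 10^16).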
11. Quick Surveys
Including a short survey as part of the exam sheet processing can be a convenient way of gathering class opinions on the course content, social and political issues, etc. A typical use would have students record their answers to the survey questions beginning at the question after the last test question. Students would be informed that their participation in the survey is optional and that the survey questions do not form part of their test score (see Survey Tips in the HELP for other possible scenarios). The Quick Survey tool will calculate the percentage distributions for the survey question categories (A-E) as well as means and standard deviations based on a 5 point rating scale where A is 5, B is 4, C is 3, D is 2, E is 1, and blank or bad marks are N/A and excluded. Questions that use rating scales typically pose questions of opinion, e.g. (5-Strongly Agree to 1-Strongly Disagree), (5-Exceptional to 1-Poor), (5-High Priority to 1-Low Priority), etc. Quick Surveys will also calculate a matrix of Pearson Correlation Coefficients based on the above 5 point rating scale so that survey questions can be examined for the presence or absence of linear relationships, i.e. whether respondents rate consistently on a particular pair of questions. Refer to the HELP for more details.
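As a rough illustration of the rating scale arithmetic just described (Scanex performs this internally; the function and data here are assumptions):

    # Sketch of the Quick Survey scale: A=5 ... E=1; blank or bad marks
    # are N/A and excluded. Not Scanex code; names are illustrative.
    from statistics import mean, stdev

    SCALE = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1}

    def survey_stats(responses):
        """responses: list of 'A'-'E', or None for blank/bad marks."""
        valid = [r for r in responses if r in SCALE]
        if not valid:
            return {}, 0.0, 0.0
        ratings = [SCALE[r] for r in valid]
        # Percentage distribution over the valid responses (base is an assumption).
        dist = {c: 100.0 * valid.count(c) / len(valid) for c in SCALE}
        sd = stdev(ratings) if len(ratings) > 1 else 0.0
        return dist, mean(ratings), sd

    dist, m, sd = survey_stats(["A", "B", "B", None, "E"])  # m == 3.5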