During autumn 2002, a total of 13 English and Scottish universities were visited in a mini-project survey, ‘The experience of fresher students in mathematics diagnostic testing’. A wide range of diagnostic-testing procedures was in use at the universities visited. Several hundred students were observed during the survey, and 98 of them, selected at random, were asked to fill out a questionnaire immediately after completing their test.
Sections One to Four of the questionnaire covered administrative matters such as the student’s profile, preparation for the diagnostic test, its timetabling, perceived difficulty, execution, and any special features associated with taking a computerised test. This was followed, in Section Five, by a detailed set of questions about the student’s familiarity with mathematical topics. Section Six asked students to highlight the three topics that they found most difficult and unpleasant, and Section Seven asked them to rank their favoured strategies for follow-up assistance in mathematics.
Most students believed that the diagnostic test served a purpose and agreed that the timing of the test, right at the beginning of their studies, was suitable. Students wanted to be given information on how to prepare themselves for their degree, but fewer than two-thirds claimed to have received such information, and many felt that they could have been better prepared for the test.
A number of students raised questions about the precise purpose of the diagnostic test. Some replies suggest that they thought the test was designed to measure their underlying mathematical ability, as opposed to their present preparedness. Institutions must be quite clear about their intentions, and should state explicitly what the aim of the test is.
There is a deeper issue behind this. It needs to be established that the diagnostic test genuinely measures fundamental shortcomings in knowledge, rather than giving a false message of poor understanding when lack of revision is at fault. This raises the question of how seriously the outcome of the diagnostic test should be taken. The project holders suggested that diagnostic tests be evaluated in as much detail as possible, so that universities can advise on remedial measures.
In the cases where students claim to feel least confident (for example, complex numbers and discrete mathematics), it would be useful to see whether the test results actually bear out this unease.
There are variations between different universities in what students know. The knowledge of differential calculus may be a case in point.
With regard to student support, the most popular choice was support classes, but these can be notoriously difficult to timetable. Some students also favoured walk-in centres, a more expensive though more convenient option.
