Monday, January 27, 2014

Other Comments from Respondents


A number of general comments were made about the study--some
positive and some negative. The study was described alternately as
“very well done” and “interesting,” but also, by one respondent, as a
“complete waste of time.” Most of the comments were positive,
however, and many pertained more specifically to the kinds of
questions that were asked. The consensus seemed to be that the
questionnaire was not easy to complete. Moreover, faculty in the
several disciplines sometimes had different ideas as to what kinds of
questions would have been appropriate. For example, one English
faculty member noted the lack of questions on the use of language in
critical writing, and a computer science faculty member observed that
questions on abilities involved in formulating proofs, which are vital
to success in computer science, were only partially covered in the
questionnaire. An education faculty member noted that the survey did
a better job of assessing skills associated with hypothesis-testing
than with other research skills.
Along these same lines, a number of other respondents also
believed that the questions were more relevant to other disciplines
than to theirs. Several computer science professors, for example,
characterized the questions as oriented more toward argument than
problem solving, in which they had greater interest. An engineering
professor said that some of the questions were more pertinent to
educational research than to scientific or technical research, and one
English faculty member found that the questions seemed “geared to the
hard sciences.” Finally, some noted ambiguities or redundancies, or
lamented that the questions were “too fine.” Even with these
difficulties, however, most of the comments about questions were
positive: “Items seem especially well chosen,” “questions are
appropriate,” “questions were quite thorough,” “a good set of
questions,” “topics covered are critical,” and “your lists are right
on target.” The majority of comments, therefore, suggested that the
questionnaire was pitched at about the right level and included
appropriate kinds of reasoning skills.
A number of comments were made about the relationship between
subject matter and analytical skills, e.g., that successful problem
solving is predicated on having specific knowledge in a field. One
respondent believed that the questionnaire downplayed the importance
of “context effects” in favor of “strict reasoning ability,” and
another noted that the measurement of analytical abilities is quite
discipline specific. Another commented on the difficulty of measuring
analytical ability without regard to the amount of knowledge
available.
Several faculty commented on the development of analytical skills
in graduate school and on the differential importance of these skills
at various stages of graduate education. One respondent said, “I
rated entering behavior or behavior across the entire program
(courses, internships, dissertation). If I were to rate the
dissertation experience alone, the ratings would have been much
higher.” Many noted that by the end of their programs, skills would
be expected to increase and various reasoning errors could be expected
to occur less frequently: “Entering students are more likely to make
these errors and graduates to make far fewer.” Another said, “In some
sense, the essence of graduate training is analytical skills.” “These
are skills which students acquire. When they enter they make most of
the mistakes you mentioned. If they can’t learn, they leave the
program.” Another said, “I’m more concerned about the presence of
these behaviors after my course than before it. One simply does not
harshly judge a beginning student who makes an error, but one could be
very critical of a student about to finish a Ph.D. thesis....”
