Ways of Investigating Variables which Affect Student Learning
Debra Tedman and Bronwyn Ellis, University of South Australia
Both qualitative and quantitative approaches can be used to examine factors affecting students' learning. An understanding of these factors is necessary in order to provide true equity of access, participation and outcomes in a learning environment of increasing diversity. While qualitative methods are useful in investigations of small numbers of students, in a larger study quantitative methods can be used to search for relationships in the data and to assess their importance. Hierarchical linear modelling (HLM) provides a means of analysing the multilevel (individual, institutional and external) nature of the factors affecting student learning. This paper describes the methods used in a small qualitative study of influences on Indigenous participation in higher education as well as those used in a quantitative and qualitative study of senior secondary students' views and attitudes in regard to the interactions between science, technology and society. Some of the techniques of analysis used in the latter study could be applied to a larger study similar in content to the former.
Factors affecting students' participation in their learning environment have been considered in both authors' postgraduate research projects. Debra Tedman's PhD study employed quantitative techniques and, to a lesser extent, qualitative techniques, while Bronwyn Ellis's MEd study employed qualitative techniques. The variables discussed in Ellis's qualitative study could be analysed in more extensive further investigations by using quantitative techniques similar to those used in Tedman's study. These quantitative and qualitative methods of educational analysis are discussed below. The benefits of both of these methods are considered briefly in this paper and the factors found to affect students' participation in their learning environment are mentioned. Quantitative methods which could be applied to a larger qualitative study similar in focus to the one described are discussed.
Qualitative and Quantitative Research
Qualitative research does not involve the gathering of numerical data for subsequent statistical analysis. In qualitative research the researchers themselves (and by extension their questionnaires and insights) are the research instruments. Other characteristics of qualitative research are that it takes the natural setting as the direct source of data, that the data collected are descriptive, that it is concerned with process as much as with outcomes, that data are analysed inductively, and that participants' own meanings are of central concern (adapted from Bogdan & Knopp Biklen 1992, pp. 29-32).
Reality is seen as multiple and constructed; qualitative research does not assume that there is one single objective reality; in fact, it recognises subjectivity, and that research cannot be value-neutral: values influence the very questions that we ask (Sandelowski 1996). The term 'qualitative research' covers a variety of traditions and methods (Miller & Crabtree 1992), and researchers may be eclectic; it is also possible for basically qualitative studies to have a quantitative component or aspect.
Quantitative research involves description or measurement of quantity. The instrumentality of measurement was discussed by Kaplan, who stressed that failure to recognise the efficacy and power of measurement made for a kind of 'mystique of quantity' (Kaplan 1964, p. 172). From early times, some great scholars have believed in the power of numbers, and Plato was influenced by Pythagoras's famous saying 'All is number' (Strathern 1996). In the nineteenth century, Lord Kelvin believed that number had an intrinsic scientific value, since when one could measure something, one knew some information about it. Scientific progress was believed to owe much to a quantitative approach, although the qualitative approach was also a necessary component of the scientific endeavour (Kaplan 1964).
Data analysis by statistical methods enables the meaning of numbers, the outcomes of quantitative research, to be communicated to others. Unfortunately, the anxiety that statistical analysis can arouse discourages some from attempting quantitatively based research. However, statistical analysis is an extremely valuable research tool: it allows inferences to be drawn from samples to populations and, where studies are appropriately designed, supports conclusions about causal relationships.
Quantitative research is often concerned with whether a true measure of the effect of the treatment variables has been obtained for the subjects in the experiment, whereas qualitative research is more often concerned with whether generalisations to the whole population can be drawn from the findings obtained.
Both quantitative and qualitative research methodologies have relative advantages and disadvantages, involving a trade-off between precision and generality. Although precision is important in research, in some situations it is valuable to see things as a whole. The advantage of measuring is that it enables the researcher to use statistical techniques to search for relationships in the data and to assess their importance. Statistical analysis is best viewed as a tool that offers the researcher advice; the researcher then draws on his or her own expertise to make decisions.
A Qualitative Study
A qualitative approach was used in Ellis's 1997 Deakin University Master of Education project on institutional influences on the participation of Aboriginal students in higher education in South Australia. This study aimed to discover what shifts would be required in the areas of administration, pedagogy, curriculum, and other areas indicated by preliminary research to enable Indigenous students to participate in a form of higher education inclusive of their interests and valuing their knowledge base. Previous changes or attempts to move in these directions were also considered.
The approach used in this study can be described more specifically as phenomenological: the quality of people's university experience for them was sought. The central question addressed by this approach is, 'What is the structure and essence of experience of this phenomenon for these people?' (Patton 1990, p. 88). In order to understand this, enquirers 'must "bracket" their own preconceptions and enter into the individual's lifeworld and use the self as an experiencing interpreter' (Miller & Crabtree 1992, p. 24).
The first step was to ascertain from the literature and from people involved in higher education the current situation with regard to both the extent and the quality of Indigenous participation and what, in their opinion, were the main issues. Indigenous students, both current and former, were surveyed to gain an insight into the perceived positives and negatives of their experience of the higher education learning environment. From these positives and negatives the institutional influences on that experience, as distinct from personal, personality, family etc. factors, were identified. Relevant staff at the three South Australian universities were surveyed in similar fashion. In addition, unstructured interviews were conducted with some staff involved in Indigenous higher education and other education programs. Government policy documents, institutional policies and publications were also examined.
Structured Interviews and Questionnaires
There were four versions of the questionnaire (also used as an interview schedule in some cases): for current Indigenous students, for Indigenous students who had discontinued, for recent Indigenous graduates, and for Indigenous staff or non-Indigenous staff who worked (or had worked) with Indigenous students. The sample was purposeful, taking in as many participants as could feasibly be reached who belonged to one of these four groups. At the same time, efforts were made to ensure that responses represented a variety of situations (Patton's 'maximum variation sampling', Patton 1990). In all, fifty people were involved in structured interviews or responding to questionnaires.
Brief demographic profile details were followed by questions concerning students' motivation for doing university studies and participants' ideas about the positives and negatives of their university learning environment. With regard to the motivation question, it was not expected that answers would necessarily relate to institutional factors, but that they would provide background to the other responses and give glimpses of one of the yardsticks against which students would measure their subsequent university experience. All participants apart from non-Indigenous staff were asked about the things that they would change (or would have changed) if possible. Asking about desirable changes was an additional means of discovering what factors participants regarded as positives and negatives. The questions to both staff and former students about reasons for discontinuing were also designed to reveal other negative factors.
All these questions were open-ended, intended to elicit whatever factors were significant to the participants, 'to understand the world as seen by the respondents' (Patton 1990, p. 24), rather than limited to what the compiler believed would be significant. With the main questions were subsidiary questions or other prompts to encourage participants to think about various aspects of their university experience, rather than focusing on only the first factor that came to mind. For example, the questions regarding positives and negatives were each framed in three ways (regarding likes and dislikes, comfort and discomfort, factors conducive to and militating against study) in an attempt to widen the range of responses.
At least some of the questions were designed to be similar to questions that a potential university student could ask of a current or former one ('Why do/did you go to uni?' 'Do/Did you like it?' 'Why?'), so that the answers could give an indication of things that could be said in such conversations. Such talk could conceivably influence others' perceptions of university life and so possibly their future participation. On the other hand, questions asking people to rate a range of factors of university life, on a Likert scale or similar, could cover more ground but not necessarily reveal the relative importance of those factors for individual students.
Staff questionnaires sought staff perceptions of these factors, based where possible on anecdotal evidence and, in the case of Indigenous staff, their own feelings about the institutional environment. Staff were also asked to relate reasons given by students for undertaking a university course or for discontinuing. The latter comments were particularly valuable in compensating to some extent for the small number of responses from former students, and being a channel for data (albeit second-hand) relating to students who did not participate in the study.
Subsequent research attempted to be responsive to the information thus provided. In some cases participants' responses suggested other 'lenses' through which to examine what each institution currently offered; in fact, a primary aim of the questionnaires and interviews had been to ensure that factors in the university environment which were significant to the participants themselves were not overlooked. However, whether or not they mentioned anything 'new', their own perceptions were, of course, valuable in themselves.
Issues from unstructured interviews were identified and related to the literature. Index cards were made for these informants, for issues raised by them and others discussed in the literature, as well as for items in the working bibliography. This facilitated cross-referencing.
After clustering the responses to each section of the structured questionnaires/interviews, institutional factors were isolated and examined for common areas of comment. Responses relating to these areas were then grouped. This was done by physically cutting them up and attaching them under headings and subheadings on a large sheet of cardboard, and using highlighters of different colours to indicate positives and negatives. With a larger sample a software tool such as NUD*IST would have made this process manageable. Non-institutional factors were also grouped, for an institutional response to these, or lack of it, was considered to be significant. In some cases, the student's personal situation meant that something within the institution became a positive or negative factor for that student. Staff responses were grouped in a similar way but separately. The areas of comment were then described, including as much of respondents' contributions as feasible, and including positives and negatives relating to each area. From this grouping, a number of broad categories of institutional influence emerged.
Connections between topics raised in the unstructured interviews, the literature, the questionnaire/ structured interview responses, and institutional documents regarding Indigenous policies and programs, as well as more general statements of mission and goals, were identified as a basis for considering, firstly, current strengths and weaknesses of the institutions with regard to encouraging and facilitating Indigenous student participation and, secondly, possible areas for change and/or development.
A Quantitative and Qualitative Study
Modern societies are increasingly dependent upon science and technology. Thus, in Australia and many other countries, the need for informed public debate on the interactions between science, technology and society (STS) has been recognised by secondary schools and universities in the form of curriculum shifts towards the inclusion of objectives which emphasise these interrelationships. Tedman's doctoral study, discussed in this paper, used both qualitative and quantitative techniques to investigate the shift of the South Australian senior secondary science curricula towards STS. This study aimed to inform course and curriculum development as well as the development of teacher in-service programs. The views and attitudes towards STS of students, teachers and scientists were measured. The factors affecting students' views on STS, liking of science, and expectations to continue with university study after the completion of secondary school were also investigated. Structured interviews with South Australian senior secondary science teachers in the qualitative phase of the study enabled the collection of teachers' views, understandings and concerns in relation to this shift in the objectives of science curricula.
The Choice of Methods for Gauging Respondents' Views on STS
The questionnaire chosen for this study was the Views on Science, Technology and Society (VOSTS) questionnaire. Aikenhead and Ryan (1992) began their article on the development of the VOSTS items by citing a study which found that, when high school students were offered the chance to respond 'I do not understand', more than a quarter of Grade 11 and 12 students did so. Aikenhead and Ryan therefore suggested that, when students responded 'agree' or 'disagree' to a statement in a traditional Likert-type questionnaire measuring understandings of the nature of science, a number of those students simply did not understand what the statement meant.
This new instrument to monitor students' views on STS, VOSTS, was developed to reduce ambiguity by using empirically derived, multiple-choice items (Aikenhead and Ryan 1992). Thousands of Year-12 students contributed to the development of the VOSTS instrument by writing paragraphs about various issues in STS. Subsequently, these paragraphs were analysed to find common viewpoints or "student positions". A questionnaire was then developed from these viewpoints. Each VOSTS item comprises a statement and several student positions (Aikenhead, Ryan & Fleming 1989).
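The structure described above can be sketched as a small data structure. The item wording and positions below are invented for illustration, not taken from the actual VOSTS instrument:

```python
# Illustrative sketch (hypothetical item and wording): each VOSTS item
# pairs a statement with several empirically derived 'student positions'.
vosts_item = {
    "statement": "Science and technology strongly influence everyday life.",
    "positions": {
        "A": "Science and technology are essentially the same activity.",
        "B": "Science produces knowledge; technology applies it.",
        "C": "Technology can develop independently of science.",
        "I": "I do not understand.",
    },
}

def tally_positions(responses):
    """Count how many respondents chose each position on an item."""
    counts = {}
    for choice in responses:
        counts[choice] = counts.get(choice, 0) + 1
    return counts

print(tally_positions(["A", "B", "B", "C", "I", "B"]))
# → {'A': 1, 'B': 3, 'C': 1, 'I': 1}
```

Because each response is a position chosen from an empirically derived set rather than a point on an agree/disagree continuum, tallies of this kind, rather than mean scores, are the natural first summary of raw VOSTS data.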
Data Collection Methods
In this study a scaled questionnaire was used to measure and compare views, beliefs and attitudes towards STS. The main advantage of using a questionnaire for the collection of data is that a relatively large amount of data can be collected efficiently from a substantial number of respondents. Data from a large sample are desirable, since the larger the sample, the greater the chances that it will be an accurate representation of the total population. This method of data collection using scales also required careful consideration of the issues of validity and consistency, although it is beyond the scope of this paper to discuss these issues.
Information on students' and teachers' views was gathered by using a scaled VOSTS instrument. Aikenhead and Ryan (1992) had suggested, however, that the VOSTS items would not scale in the way that the previously used Likert-type responses did, believing that item response theory had not yet developed the mathematical procedures needed to analyse responses to VOSTS items; their study was therefore qualitative. For the purposes of Tedman's doctoral study, this view was considered to be in error. The instrument used to gather viewpoints on STS in this Australian study was significantly strengthened by the addition of a measurement component, which in turn strengthened the conclusions drawn from the quantitative phase of the study. Mathematical procedures for doing this had already been advanced (Masters 1988; Adams, Doig & Rosier 1991), and accepting Aikenhead's challenge to develop a measurement framework for the VOSTS items was of considerable significance.
During the field-work phase of the study, the scaled questionnaire was administered to 1278 students in the selected schools and 110 teachers. The group discussions with 101 teachers also occurred at this stage. Thirty-one professional scientists also completed the questionnaire. The strength and coherence of the STS views of the three groups of respondents were compared.
A comprehensive range of statistical techniques was used to analyse the data collected. This study aimed to produce a "master scale" for views on STS by scaling a selection of the VOSTS items. It was considered extremely important to analyse the data in a way which enabled significant correlations between factors at both the student and school levels and the coherence of students' views on STS to be shown.
The statistical computer packages QUEST (Adams & Koo 1993) and Statistical Package for the Social Sciences (SPSS) were used to enter and analyse the data in order to deal adequately with the large amount of quantitative data.
Pearson chi-square values and degrees of freedom
Possible relationships between variables were investigated by using the crosstabs function of the SPSS package. Contingency tables were used in this study to determine whether a relationship was significant: the Pearson chi-square value and its associated degrees of freedom were computed for each crosstabulation, and the table of critical values of chi-square (Ferguson 1959) was consulted to find the critical value corresponding to the degrees of freedom for that particular relationship.
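The chi-square computation underlying this procedure can be sketched as follows. The counts here are invented purely for illustration; in the study itself the crosstabulations were produced by SPSS:

```python
# Minimal sketch of the chi-square test of independence applied to a
# contingency table of observed counts (counts invented for illustration).
def chi_square(table):
    """Return (chi2, dof) for a contingency table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under the independence hypothesis
            expected = row_totals[i] * col_totals[j] / total
            chi2 += (observed - expected) ** 2 / expected
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, dof

# e.g. rows = school location, columns = liking of science (hypothetical)
observed = [[30, 10],
            [20, 40]]
chi2, dof = chi_square(observed)
# Compare against the critical value for dof = 1 at the 0.05 level (3.841)
print(round(chi2, 2), dof, chi2 > 3.841)
# → 16.67 1 True
```

The final comparison mirrors the use of Ferguson's table of critical values: a computed chi-square exceeding the critical value for the relevant degrees of freedom indicates a significant relationship.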
Analysis of variance
One-way analysis of variance on the three scales was used to determine the statistical significance of the effect of a number of variables on the strength and coherence of students' views towards STS. These variables included: (a) science subjects studied in 1995, (b) marks in science, (c) liking of science, (d) location of school, (e) type of school, (f) years of further education, (g) inclusion of science subjects in further education, (h) future course of study, (i) future occupation, (j) mother's occupation, and (k) father's occupation.
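The F statistic at the heart of one-way analysis of variance can be sketched in a few lines. The scores below are invented for illustration; they stand in for, say, scale scores grouped by one of the variables listed above:

```python
# Sketch of the one-way ANOVA F statistic: the ratio of between-group
# to within-group mean squares (data invented for illustration).
def one_way_anova(groups):
    """Return the F statistic for a one-way analysis of variance."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    group_means = [sum(g) / len(g) for g in groups]
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# e.g. scale scores for two types of school (hypothetical)
f = one_way_anova([[4, 5, 6], [7, 8, 9]])
print(round(f, 2))
# → 13.5
```

A large F, compared against the critical value for the relevant degrees of freedom, indicates that the group means differ by more than within-group variability alone would suggest.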
Hierarchical linear modelling
Educational researchers often need to take account of nested data, with students nested within classrooms, schools, universities, districts, states or countries. Individuals in the same group are more alike than individuals in different groups: different schools or universities, for example, have different teaching staff and equity procedures. In this study, student- and group-level characteristics influenced each other; for example, the sex of a student influenced the type of school (coeducational, single-sex boys', single-sex girls') attended. Single-level multivariate analysis assumes that the effect of one variable upon another is independent of any other effects in the model. To use analysis of variance with traditional statistical techniques, it is necessary to ensure that factors at one level are not influenced by factors at another level, which is ordinarily accomplished by the random allocation of students to groups. There was no random allocation of students to groups in this study, so the data were clustered in a hierarchical form, and traditional techniques of statistical analysis would therefore have been inappropriate.
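The degree to which individuals in the same group are more alike than individuals in different groups can be quantified as an intraclass correlation. The toy sketch below, with invented scores and balanced groups assumed, illustrates the quantity involved; it is not part of the HLM analysis itself:

```python
# One-way intraclass correlation, ICC(1), for balanced groups of equal
# size: the share of total variance lying between groups (data invented).
def icc(groups):
    """How much more alike are members of the same group than members
    of different groups? Assumes all groups have the same size."""
    n = len(groups[0])          # students per group
    k = len(groups)             # number of groups
    means = [sum(g) / n for g in groups]
    grand = sum(means) / k
    # Between- and within-group mean squares, as in one-way ANOVA
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    msw = sum((x - m) ** 2
              for g, m in zip(groups, means) for x in g) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)

# e.g. scores for students in two schools (hypothetical)
print(round(icc([[4, 5, 6], [7, 8, 9]]), 2))
# → 0.81
```

A non-trivial intraclass correlation signals clustered data of exactly the kind that violates the independence assumptions of single-level analyses.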
Hierarchical linear modelling (HLM) (Bryk, Raudenbush & Congdon 1994) overcomes this problem by incorporating variables from all levels in the analysis. In this study, the computer package HLM2 was used to examine variability simultaneously at different organisational levels and in cross-level interactions. HLM enabled the testing of effects which had been occurring both within and between schools. The initial steps in the multilevel analysis are described below and an overview of this method of statistical analysis and its benefits is given.
In this study, the model at the student level (Level-1) was built around explanatory variables which included student sex and father's occupation. The school level variables (Level-2) included type of school (coeducational, single-sex girls', single-sex boys'), governing body of school (government, non-government, Catholic), school location (metropolitan, non-metropolitan) and average socioeconomic status of school based on father's occupation. The effects of variables at these two levels on the strength and coherence of students' views on STS, the students' liking of science and the students' expectations to study science at the university level were analysed. The use of HLM also allowed the formulation and testing of hypotheses about cross-level effects and the consideration of both within- and between- school components (Bryk & Raudenbush 1992).
Complex regression models were then developed. The HLM2 package used an iterative maximum likelihood procedure to estimate the regression coefficients and variance components: the estimation steps were repeated until convergence, that is, until there was minimal change in the likelihood function for the parameter estimates generated by the program.
The multilevel modelling was, therefore, a procedure in which the researcher proposed exploratory models using variables hypothesised as possibly significant on the basis of the review of the literature on previous studies. Simple models were investigated first, and further variables were then included in a sequential process. After each step, the results were inspected to see whether the added variables were significant. The chi-square test was used to test formally the improvement of fit between the model specified at a particular step and the model from the previous step: a chi-square value significantly larger than its degrees of freedom indicated that a substantial amount of variance was being explained.
Hierarchical linear modelling works by calculating separate regression coefficients based on within-group analyses and uses these coefficients as outcome variables in a subsequent regression performed at the between-group level. Once all the student-level variables were entered into the Level-1 equation, with the strength and coherence of students' views on STS as an outcome, the initial multiple regression analysis was repeated for each of the 29 schools in the sample for the study. Next, the sets of regression coefficients yielded from this initial regression analysis served as outcome measures in subsequent regression analyses. In these second regression analyses, student-level and school-level variables were entered as predictor variables.
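The slopes-as-outcomes logic described above can be illustrated with a toy two-stage calculation. This is only a conceptual sketch with invented data, not the full iterative maximum likelihood estimation that the HLM2 package performs:

```python
# Toy sketch of two-stage 'slopes-as-outcomes' regression (data invented):
# stage 1 fits a regression within each school; stage 2 regresses the
# resulting coefficients on a school-level predictor.
def ols(xs, ys):
    """Simple least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Per school: a student-level predictor (e.g. coded SES) and outcome scores
schools = {
    "A": ([0, 1, 2], [1, 3, 5]),
    "B": ([0, 1, 2], [3, 5, 7]),
    "C": ([0, 1, 2], [5, 7, 9]),
}
school_predictor = {"A": 0, "B": 1, "C": 2}  # a coded school-level variable

# Stage 1: a separate within-school regression for each school
stage1 = {name: ols(xs, ys) for name, (xs, ys) in schools.items()}

# Stage 2: the school intercepts become outcomes, regressed on the
# school-level predictor
names = sorted(schools)
intercepts = [stage1[n][0] for n in names]
a2, b2 = ols([school_predictor[n] for n in names], intercepts)
print([round(stage1[n][1], 2) for n in names], round(b2, 2))
# → [2.0, 2.0, 2.0] 2.0
```

In this contrived example the student-level slope is the same in every school, while school intercepts rise with the school-level predictor; HLM estimates both levels simultaneously rather than in two literal passes, but the decomposition of effects is the same in spirit.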
In Tedman's study, one of the analyses investigated the influence of the socioeconomic status (SES) of students' fathers' occupations on the expectations of students in different types of schools to continue with the study of science at university. Examination of the regression coefficients in all of the types of schools showed a positive influence of high SES backgrounds on students' expectations to include science subjects in their university studies. However, this could be interpreted as an indication that students from high SES backgrounds are more likely to attend university, whether or not they study science: the apparent effect of high status backgrounds on decisions to continue with university studies which include science subjects might simply be a consequence of these students having a greater opportunity to pursue university study of any kind.
As in Tedman's study, in Ellis's study there were variables at more than one level, in this case the student level and the university level. For a further study involving the collection and analysis of a larger amount of information using quantitative techniques, hierarchical linear modelling could be used to investigate the effects of university-level variables (such as availability of tutorial support, library opening hours, or access to computers) on a student-level variable such as performance in a particular course. The interactions between variables at different levels could also be investigated in a further quantitative study of institutional influences on the participation in Australian higher education of particular groups of students, such as Indigenous students and targeted equity groups. The methodology has potential for other investigations involving attitudes and perceptions and interrelated multilevel factors affecting those attitudes and perceptions.
Qualitative and quantitative approaches can both play a part in investigating factors affecting the learning environment and outcomes for students from a range of backgrounds. In individual studies one or other may be chosen depending on the nature of the investigation, or they may have complementary roles to play within the one study. Hierarchical linear modelling is one quantitative technique that could profitably be used to show interrelationships of multiple variables at different levels.
Adams, R. J., Doig, B. A., & Rosier, M. 1991, Science Learning in Victorian Schools: 1990, The Australian Council for Educational Research Ltd, Hawthorn, Victoria.
Adams, R. J. & Koo, S. K. 1993, Quest - The Interactive Test Analysis System, Australian Council for Educational Research, Hawthorn, Victoria.
Aikenhead, G. S. & Ryan, A. G. 1992, 'The development of a new instrument: "Views on Science-Technology-Society" (VOSTS)', Science Education, 76, pp. 477-491.
Aikenhead, G. S., Ryan, A. G. & Fleming, R. W. 1989, Views on Science-Technology-Society, Department of Curriculum Studies, College of Education, Saskatchewan.
Bogdan, R. & Knopp Biklen, S. 1992, Qualitative Research for Education: An Introduction to Theory and Methods, 2nd edn, Allyn & Bacon, Boston.
Bryk, A. S., & Raudenbush, S. W. 1992, Hierarchical Linear Models: Applications and Data Analysis Methods, Sage Publications, Newbury Park, California.
Bryk, A. S., Raudenbush, S. W., & Congdon, R. T. 1994, HLM 2/3: Hierarchical Linear Modelling with the HLM/2L and HLM/3L Programs, Scientific Software International, Chicago.
Ferguson, G. A. 1959, Statistical Analysis in Psychology and Education, 2nd edn, McGraw-Hill, New York.
Kaplan, A. 1964, The Conduct of Inquiry, Chandler Publishing Company, New York.
Masters, G. N. 1988, 'Partial credit models', in Educational Research, Methodology and Measurement: An International Handbook, ed. J. P. Keeves, Pergamon Press, Oxford.
Miller, W. L. & Crabtree, B. F. 1992, 'Primary care research: a multimethod typology and qualitative road map', in Doing Qualitative Research: Multiple Strategies, eds B. F. Crabtree & W. L. Miller, Vol. 3 in Research Methods for Primary Care, Sage Publications, Newbury Park, CA, pp. 3-28.
Patton, M. Q. 1990, Qualitative Evaluation and Research Methods, 2nd edn (1st edn 1980: Qualitative Evaluation Methods), Sage Publications, Newbury Park, CA.
Sandelowski, M. 1996, Issues in designing qualitative research, presentation at the Whyalla Campus of the University of South Australia, 27 March.
Strathern, P. 1996, Plato in 90 Minutes, Constable & Co., London.
Strauss, A. & Corbin, J. 1990, Basics of Qualitative Research: Grounded Theory Procedures and Techniques, Sage Publications, Newbury Park, CA.