Development, validation and use of personal and class forms of a new classroom environment questionnaire

Barry Fraser, Campbell McRobbie and Darrell Fisher
Curtin University of Technology
In the early days of the study of human environments, Murray (1938) introduced the term alpha press to describe the environment as assessed by a detached observer and beta press to describe the environment as observed by those within that environment. These ideas were extended by Stern, Stein and Bloom (1956) to include perceptions of the environment unique to the individual (called private beta press) and perceptions of the environment shared among the group (called consensual beta press). Hence, even in these early studies of human environments, it was recognised that the perceptions of persons from different perspectives could lead to different interpretations of that environment.
Interest in the study of learning environments in classrooms was rekindled during an evaluation of Harvard Project Physics, which required the development of an instrument to assess learning environments in physics classrooms. This instrument, the Learning Environment Inventory (LEI, Walberg, 1968), asked students for their perceptions of the whole-class environment. At about the same time, Moos and Trickett (1974) had been developing a series of environment measures that included the Classroom Environment Scale (CES), which also asked students for their perceptions of the learning environment of the class as a whole. These two questionnaires provided considerable impetus for the study of classroom learning environments, were used extensively for a variety of research purposes, and provided models for the development of a range of instruments over the next two decades or so (see Fraser, 1994). Most of these instruments are available in an actual version, which asks respondents questions about the experienced learning environment, and a preferred version, which focuses on the learning environment ideally preferred by students.
Inherent in this approach is the assumption that there is a unique learning environment in the classroom that all students in a class more or less experience. Variations in scores on learning environment instruments were considered as error variance, with the class mean representing a good measure of the learning environment in the classroom. However, the assumption of a common learning environment experienced by all students within a classroom was challenged in the latter half of the 1980s. For example, in interpretive studies employing classroom learning environment instruments, classroom observations and interviews involving teachers and students suggested that there were groups of students (termed "target" students) who were involved more extensively in classroom discussions than the other students. These target students were found to have more favourable perceptions of the learning environment than those students less involved, suggesting that there could be discrete and differently-perceived learning environments within the one classroom (Tobin, 1987; Tobin & Gallagher, 1987; Tobin & Malone, 1989). One implication of these studies is that there is potentially a problem with using the traditional Class Form of learning environment instruments when studying differences between groups of students in a classroom (e.g., boys and girls), because these instruments elicit the student's perceptions of the class as a whole rather than the student's personal perception of his or her role in that classroom (Fraser & Tobin, 1991). Although classroom environment scales have been used to advantage in case study research (Tobin & Fraser, 1987; Tobin, Kahle, & Fraser, 1990), these studies suggested the desirability of having available a new form of instrument that is better suited than the conventional Class Form for assessing differences in perceptions held by different students within the same class.
Around the time when these studies were being carried out, the traditional teachers' role of transmitting the logical structures of knowledge to students was being questioned in favour of a view that meaningful learning is a personal cognitive process that actively involves the learner in making sense of world experiences in terms of the existing knowledge of the individual, and a social process in which this sense-making process involves negotiation and consensus building with others (Tobin, 1993; von Glasersfeld, 1989).
These studies and influences led Fraser, Giddings and McRobbie (1992) to propose a different form of a learning environment instrument which asked students for their personal perception of their role in the environment of the classroom rather than their perception of the learning environment in the class as a whole; these two forms are called the Personal Form and the Class Form, respectively. The first part of this paper summarises some of the previous research involving Class and Personal Forms of the Science Laboratory Environment Inventory (Fraser, Giddings, & McRobbie, 1995), while the second part reports current research involving the Personal and Class Forms of a new classroom environment questionnaire (Fraser, Fisher & McRobbie, 1996).
The first stage in constructing this instrument was the development of an actual and a preferred version of a Class Form of the instrument (Fraser, Giddings, & McRobbie, 1993). This involved field testing in six countries with 3,401 students in 183 senior high schools and 1,242 students in 42 university laboratory classes. The final scales in the Class Form are Student Cohesiveness, Open-endedness, Integration, Rule Clarity and Material Environment. Each of the five dimensions was validated by discussion with students and teachers about the salience of the scales and the wording of the items, by reference to concerns and findings expressed in the research literature on aspects of the science laboratory learning environment (e.g., Hegarty-Hazel, 1990; Tobin, 1990; Woolnough, 1991), by coverage of the categories identified by Moos (1979) for conceptualising all human environments (i.e., Relationship, Personal Development, and System Maintenance and System Change dimensions), and by factor analysis, internal consistency and discriminant validity analyses performed on data collected by administering a preliminary version to students.
The Personal Form of the instrument was devised by rewording items of the Class Form. For example, the item, 'In our laboratory sessions, different students do different experiments' became 'In my laboratory sessions, I do different experiments than some of the other students'. Similar transformations of preferred version items of the Class Form were made in developing a preferred version of the Personal Form.
The actual and preferred versions of the Personal Form of the SLEI were administered as part of a study involving the cross-validation of the Class Form among senior high school chemistry classes in Queensland, Australia (Fraser, Giddings, & McRobbie, 1995; Fraser & McRobbie, 1995). As part of this larger study which employed a matrix sampling design, all students responded to actual and preferred versions of the Class Form of the SLEI and a general aptitude test. The sample of students who responded to both the Personal and Class Forms of the instrument consisted of 516 students in 56 year 11 chemistry classes. This sample enabled a comparison to be made of these two forms of the SLEI.
The Personal Form of the SLEI had satisfactory validity and reliability on both the actual and preferred versions, which was comparable to that of the Class Form. For the actual version of the SLEI, the alpha reliability figures ranged from 0.71 to 0.86 when the individual student was used as the unit of analysis, and from 0.74 to 0.91 when the class mean was used as the unit of analysis. The alpha reliability figures for the preferred version of the Personal Form of the SLEI ranged from 0.64 to 0.84 when the individual student was used as the unit of analysis and from 0.70 to 0.85 when the class mean was used as the unit of analysis. Principal components factor analysis followed by varimax rotation for the actual and preferred versions of the Personal Form yielded the same five-factor structure for each form. Further, analysis of variance showed that each scale of the actual version of the Personal Form of the instrument differentiated between the perceptions of students in different classes. The eta2 values, which are the ratios of between to total sums of squares and represent the proportion of variance explained by class membership, ranged from 0.23 to 0.28 and indicated that each scale of the Personal Form of the SLEI differentiated significantly (p<0.1) between the perceptions of students in different classes.
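The eta2 statistic described above, the ratio of the between-class sum of squares to the total sum of squares, can be illustrated with a short sketch. The class labels and scale scores below are invented toy data, not the study's; note that the toy value comes out much higher than the 0.23 to 0.28 reported for the SLEI scales.

```python
# Hypothetical illustration of the eta-squared statistic described in the
# text: the ratio of the between-class sum of squares to the total sum of
# squares, i.e. the proportion of variance in a learning environment scale
# score explained by class membership. All data below are invented.

def eta_squared(scores_by_class):
    """eta^2 = SS_between / SS_total for scores grouped by class."""
    all_scores = [s for scores in scores_by_class.values() for s in scores]
    grand_mean = sum(all_scores) / len(all_scores)
    ss_total = sum((s - grand_mean) ** 2 for s in all_scores)
    ss_between = sum(
        len(scores) * (sum(scores) / len(scores) - grand_mean) ** 2
        for scores in scores_by_class.values()
    )
    return ss_between / ss_total

# Toy scale totals for students in three hypothetical classes.
classes = {
    "class_A": [28, 30, 27, 29],
    "class_B": [22, 21, 24, 23],
    "class_C": [26, 25, 27, 26],
}
print(round(eta_squared(classes), 2))
```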
In addition, cross-cultural validation information was collected by administering the Personal Form of the SLEI to 1,592 grade 10 chemistry students in 56 classes in Singapore (Wong & Fraser, 1996). For this sample, the alpha reliability figures were quite similar to those for the Queensland sample, using either the individual student or the class mean as the unit of analysis.
The Queensland sample was used in an investigation of differences between students' scores on the SLEI's Personal and Class Forms. This revealed that students had a more positive view of the learning environment when they responded in relation to the whole class than when they gave their perceptions of their personal role in the classroom environment.
One of the common lines of research with Class Forms of learning environment instruments in the last 25 years has been investigation of associations between characteristics of the learning environment and various student outcome measures (Fraser, 1994). The administration of the Class and Personal Forms of the SLEI along with an attitude outcome survey to the Queensland sample allowed a comparison to be made of the magnitude of attitude-environment associations for the Class and Personal Forms. Attitudes were assessed with a Likert scale covering a range of chemistry-related attitudes associated with the goals of laboratory teaching, namely, Attitude to Laboratory Learning, Nature of Chemistry Knowledge (testability and changing nature of science knowledge), Cooperative Learning and Adoption of Laboratory Attitudes (e.g., working safely, repeating observations, following instructions). Generally, the strengths of outcome-attitude associations were similar for the Class Form and the Personal Form.
Nevertheless, although the total variance explained is comparable for each form of the instrument, this does not imply that they account for the same variance in the outcome measures. A commonality analysis (Pedhazur, 1982) based on the squared multiple correlation for the five Class Form and five Personal Form scales in the actual version as predictors was performed separately for the same four attitude scales (Fraser & McRobbie, 1995), using the class mean as the unit of statistical analysis. The Class Form and the Personal Form each accounted for a sizeable proportion of the outcome variance which was unique to that form when compared with the other form. Further, the unique variance accounted for in the outcome measures was comparable to the common variance. Except for the Cooperative Learning outcome, both forms of the SLEI (Personal and Class) made a statistically significant unique contribution to the variance in the outcome measures. Similar results also were reported by Fraser, Giddings and McRobbie (1995) for other attitude outcome measures and an inquiry skills test, thus vindicating the development of separate forms of the questionnaire.
The first version of the new instrument contained the following 9 scales, each containing 10 items: Student Cohesiveness, Teacher Support, Involvement, Autonomy/Independence, Investigation, Task Orientation, Cooperation, Equity and Understanding. The new instrument employs the same five-point Likert response scale (Almost Never, Seldom, Sometimes, Often, Almost Always) as used in some previous instruments. Actual versions of the Class and Personal Forms were developed and administered to 355 students in 17 grade 9/10 mathematics and science classrooms in five Australian schools. Principal components factor analysis followed by varimax rotation, along with item analysis and estimation of internal consistency (Cronbach alpha coefficient) and discriminant validity (mean correlation of each scale with the other scales), resulted in the acceptance of a revised version of the instrument comprising 54 items in seven of the original scales (with the Autonomy/Independence and Understanding scales not holding up). In the second trial version of the new instrument, these 54 items in seven scales are embedded in an 80-item version with 10 items in each of 8 scales (with Autonomy/Independence being reinstated).
Table 1 reports statistical data relevant to the internal consistency reliability (Cronbach alpha coefficient) and discriminant validity (mean correlation of a scale with the other scales) for the Class and Personal Forms of the 54-item version of the new questionnaire. Appendix A shows the corrected item-scale correlations for the items in the new version that were retained from the previous trial version. Further, the factor analysis reported in Appendix B shows that these revised forms of the instrument each had a similar factor structure.
Table 1: Number of items, internal consistency (Cronbach alpha coefficient), discriminant validity (mean correlation with other scales), ability to differentiate between classes, and difference in scale means for Class and Personal Forms

| Scale | Form | No. of items | Cronbach alpha | Discrim. validity | ANOVA eta2 | Mean | SD | Difference in means |
|---|---|---|---|---|---|---|---|---|

*p < .05; **p < .01
This three-level approach provided a rich description of how the students perceived their personal role in the learning environment of the classroom, their perceptions of the learning environment for the class as a whole, and the differences between those perceptions. As part of instrument development procedures, these students also were asked to comment on any difficulties which they experienced in interpreting or understanding the items in the questionnaires and whether there were any additional items or issues that should have been included in the instrument as concerns for improving the learning environment in their classroom.
Below are some examples of responses given by students to explain differences between their perceptions of the whole-class learning environment and their personal perceptions of their role in that learning environment, in response to the initial open-ended question or to questions relating to the specific dimensions. These responses illustrate cases in which the Personal perception was more favourable, and others in which it was less favourable, than that for the Class as a whole:
There are parts of science I really like but other parts are boring. Some people in the class are like me in that they like some lessons but some people just don't care at all. They just muck around. (About the science class)
I would say that it is a friendly class, but some of the students are smarter than me and can understand everything better. So the way I see the class will be different to what they see. They have fun in practical work because they know what to do. Not understanding it spoils it for me. (Student Cohesiveness)
I know that we have to do all of our work and have it in on time. I always do my work and the homework. Sometimes, if I don't finish my work in class, I take it home and do it. The class? There are fools in the class and they don't want to do their work. They want to muck up and play around. (Task Orientation)
Table 2 provides student comments for selected items in the Teacher Support, Involvement and Task Orientation scales, which are all scales for which the differences between the means of the Class and Personal Forms were both statistically significant and of a magnitude to warrant further investigation. The item wording in Table 2 is for the Personal Form, and the student's responses to the Class Form and the Personal Form are shown.
Table 2: Student comments about their responses to Class and Personal Forms of some items

| Scale and item wording | Class response | Personal response | Student comment |
|---|---|---|---|
| The teacher takes a personal interest in me. | Often | Almost never | I said that because, whenever the teacher asks me a question, I usually answer it wrongly. So I guess the teacher avoids me and prefers someone who actually can answer the question. She is interested in all of her students, but I think that she chooses people that actually can answer the questions correctly. |
| The teacher goes out of his/her way to help me. | Almost always | Seldom | Some people need more help than others. If someone is behind, he will stop and wait for them to catch up. I normally don't need to ask many questions because normally I understand the work. |
| The teacher helps me when I have trouble with the work. | Often | Sometimes | Some people in the class need more help than other people. I don't think I really need that much help. |
| | Almost always | Seldom | Sometimes the teacher is not always available, because there are so many students. |
| Students give their opinions during class discussions. | Often | Sometimes | Some people put their hands up more than others. I just listen to everybody else. |
| | Almost always | Seldom | I said that because there are so many people in the class. |
| | Almost always | Seldom | The class as a whole answer questions, but I can't answer questions because I think that I am wrong and everyone will laugh at me. |
| The teacher asks me questions. | Almost always | Sometimes | He almost always asks questions to the whole class, but only sometimes he asks a particular student to answer and singles you out to ask for the answer. Most of the time, it is just to the whole class. |
| Students' ideas and suggestions are used during class discussions. | Almost always | Sometimes | Yes, the other people in this class would have better ideas than I would have. I am not really the creative type. |
| I ask the teacher questions. | Almost always | Sometimes | I sometimes ask the teacher a question if I need help, but the class asks questions just about all the time. |
| The teacher asks me to explain how I solve problems. | Almost always | Sometimes | He will ask different people in the class. It is just that he has so many students to ask that you only get asked sometimes. |
| | Almost always | Sometimes | If you answer the question, he will ask you how you did that and why that was what you said. I said he asks me sometimes because I don't try to answer his questions. But, if you do answer a question and especially if it is wrong, he wants to know why you put that down. |
| Getting a certain amount of work done is important to me. | Sometimes | Almost always | Yes, because most people don't really care if they do or not. They don't really care what they get in the test, but I do. |
| | Sometimes | Often | There are lots of kids who just don't do work (disruptive kids). But I always do work, because you just do. |
| I know the goals for this class. | Often | Almost always | I guess some people in the class don't know what they are aiming for, but I do. I know what I want to achieve in science. |
| | Sometimes | Almost always | Well for me, the goals are important because I want to get a good mark. But, some of the students couldn't care less and just want to get it over and done with. |
| I try to understand the work in this class. | Often | Almost always | I almost always try to understand because I don't want to do badly in the test, but there are students in the class who don't always try to understand. |
These student responses provide examples for which the Class Form response was more favourable than the Personal Form response, as well as other cases for which the Personal Form was more favourable. Explanations of student responses to the Class Form often were predicated on the identification of events or actions of small groups in the class rather than on the class as a whole, and this raises questions about the validity of Class Form responses representing the whole-class learning environment. Underlying many of the responses for which the Class Form response was more favourable than the Personal Form response was the idea that the individual student is only part of the class and therefore interactions with that individual student are necessarily less than the interactions with the whole class. Also, responses frequently reflected a desire on the part of the student not to become involved in the classroom actions for a variety of reasons. A further observation was that, on each of the scales, there was a large proportion of both negative and positive differences in scores between the Class Form and the Personal Form. For example, for the Task Orientation scale, this difference (Class Form minus Personal Form) was positive for 25.9% of students and negative for 65.9% of students, with 34.8% of students having difference scores which were equivalent to one or more points on the response scale. These ranges in scale score differences between the Class and Personal Forms of the instruments provide further support for the contention that there are groups of students within a classroom whose perceptions of the learning environment differ.
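The difference-score tabulation just described can be sketched as follows. The Class Form and Personal Form scale means below are invented, so the resulting proportions are illustrative only and are unrelated to the reported 25.9%, 65.9% and 34.8%.

```python
# Hypothetical sketch of the difference-score tabulation in the text:
# for each student, the Class Form scale mean minus the Personal Form
# scale mean, then the proportions of positive, negative, and large
# (one or more response points) differences. All scores are invented.

def difference_profile(class_scores, personal_scores, point=1.0):
    diffs = [c - p for c, p in zip(class_scores, personal_scores)]
    n = len(diffs)
    return {
        "positive": sum(d > 0 for d in diffs) / n,
        "negative": sum(d < 0 for d in diffs) / n,
        "at_least_one_point": sum(abs(d) >= point for d in diffs) / n,
    }

# Invented per-student scale means for one scale, eight students.
class_means = [3.2, 2.8, 4.1, 3.0, 2.5, 3.6, 4.0, 2.9]
personal_means = [3.9, 2.8, 2.9, 4.2, 3.8, 3.1, 4.0, 4.1]
print(difference_profile(class_means, personal_means))
```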
The size of the scale score ranges raises the question of how well the mean score on either form can represent the learning environment in a classroom. Fraser and Hoffman (1995) used qualitative and quantitative data to show how Personal Forms could be utilised in studying the classroom learning environment at different "grain sizes". They showed how individual students and the teacher could be investigated at the smallest grain size and how these environment scores can be aggregated to the class level. When appropriate, such aggregation also could be extended to the system level. However, where classes are composed of heterogeneous groups of students with respect to their perceptions of the learning environment, the aggregation of learning environment perception scores inevitably obscures differences between students and groups within that classroom.
The student responses for the Personal Form also showed that some students were responding in terms of their perceptions of their personal involvement in the classroom and, depending on the scale, were identifying factors that personally could influence their learning. Recent approaches to learning increasingly have recognised the role of social factors in knowledge construction. The responses to the Personal Form also show the extent to which students perceived themselves as participating in the construction of knowledge from a social perspective (e.g., the Student Cohesiveness, Cooperation and Involvement scales), both with the class as a whole and with their closer working groups. Accordingly, the Personal Form of the instrument has the potential to characterise the learning environment in a classroom from the perspective of recent views of learning. Taylor, Dawson and Fraser (1995) have constructed a Personal Form of a learning environment instrument specifically for the purpose of assessing constructivist emphases within classroom learning environments, and McRobbie and Tobin (in press) have utilised Personal Forms of learning environment instruments to characterise the learning environment in a chemistry classroom from a social constructivist perspective. Relative to earlier periods of environment research, Personal Forms of instruments are now being used increasingly in research on classroom environments.
The Personal Forms of scales displayed satisfactory factorial validity, internal consistency reliability and discriminant validity, and they were capable of differentiating between the perceptions of students in different classrooms. Interesting differences were found between mean scores on the Class and Personal forms for particular scales, and interviews with students helped to illuminate some of the reasons for these differences.
Overall, the findings reported in this paper provide convincing evidence that many respondents have differing perceptions of the classroom learning environment from the perspective of the whole class relative to their perceptions of their personal role in that class. However, research on the characteristics and associations of Personal Forms of learning environment instruments is still in its infancy, and much further research will be required before the implications associated with Personal Forms of instruments are understood fully. Meanwhile, the development of these instruments now makes the study of individuals or groups of students within a classroom more valid. It also opens the way for the utilisation of qualitative and quantitative data together to paint a more compelling picture of the learning environments of individuals and small groups of students.
Fraser, B. J. (1994). Research on classroom and school climate. In D. Gabel (Ed.), Handbook of research on science teaching and learning. New York: Macmillan.
Fraser, B. J., Fisher, D. L., & McRobbie, C. J. (1996, April). Development, validation and use of personal and class forms of a new classroom environment instrument. Paper presented at the annual meeting of the American Educational Research Association, New York, USA.
Fraser, B. J., & Hoffman, H. (1995, April). Combining qualitative and quantitative methods in a teacher-researcher study of determinants of classroom environment. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.
Fraser, B. J., Giddings, G. J., & McRobbie, C. J. (1992). Science laboratory classroom environments: A cross-national perspective. In D. L. Fisher (Ed.), The study of learning environments (pp.1-18). Launceston, Tasmania: University of Tasmania.
Fraser, B. J., Giddings, G. J., & McRobbie, C. J. (1993). Development and cross-national validation of a laboratory classroom environment instrument for senior high school science. Science Education, 77, 1-24.
Fraser, B. J., Giddings, G. J., & McRobbie, C. J. (1995). Evolution, validation and application of a personal form of an instrument for assessing science laboratory classroom environments. Journal of Research in Science Teaching, 32, 399-422.
Fraser, B. J., & McRobbie, C. J. (1995). Science laboratory classroom environments at schools and universities: A cross-national study. Educational Research and Evaluation, 1(4), 1-29.
Fraser, B. J., & O'Brien, P. (1985). Student and teacher perceptions of the environment of elementary-school classrooms. Elementary School Journal, 85, 567-580.
Fraser, B. J., & Tobin, K. (1991). Combining qualitative and quantitative methods in classroom environment research. In B. J. Fraser and H. J. Walberg (Eds.), Educational environments: Evaluation, antecedents and consequences. Oxford: Pergamon.
Fraser, B. J., & Treagust, D. F. (1986). Validity and use of an instrument for assessing classroom psychosocial environment in higher education. Higher Education, 15, 37-57.
Hegarty-Hazel, E. (1990). The student laboratory and the science curriculum. London: Routledge.
Hodson, D. (1990). A critical look at practical work in school science. School Science Review, 70, 33-40.
Lazarowitz, R., & Tamir, P. (1994). Research on using laboratory instruction in science. In D. Gabel (Ed.), Handbook of research on science teaching and learning. New York: Macmillan.
McRobbie, C. J., & Tobin, K. (in press). A social constructivist perspective on learning environments. International Journal of Science Education.
Moos, R. H. (1979). Evaluating educational environments: Procedures, measures, findings and policy implications. San Francisco: Jossey-Bass.
Moos, R. H., & Trickett, E. J. (1974). Classroom Environment Scale manual (1st ed.). Palo Alto, CA: Consulting Psychologists Press.
Murray, H. A. (1938). Explorations in personality. New York: Oxford.
Pedhazur, E. (1982). Multiple regression in behavioral research: Explanation and prediction. New York: Holt, Rinehart and Winston.
Stern, G.G., Stein, M. I., & Bloom, B. S. (1956). Methods in personality assessment. Glencoe, IL: Free Press.
Taylor, P., Dawson, V., & Fraser, B. (1995, April). Classroom learning environments under transformation: A constructivist perspective. Paper presented at the annual conference of the American Educational Research Association, San Francisco, CA.
Teh, G., & Fraser, B. J. (1995). Development and validation of an instrument for assessing the psychosocial environment of computer-assisted learning classrooms. Journal of Educational Computing Research, 12, 177-193.
Tobin, K. (1987). Target student involvement in high school science. International Journal of Science Education, 10, 317-330.
Tobin, K. (1990). Research on science laboratory activities: In pursuit of better questions and answers to improve learning. School Science and Mathematics, 90, 403-418.
Tobin, K. (Ed.). (1993). The practice of constructivism in science education. Washington: AAAS.
Tobin, K., & Fraser, B. J. (Eds.). (1987). Exemplary practice in science and mathematics education. Perth: Curtin University of Technology.
Tobin, K., & Gallagher, J. J. (1987). What happens in high school science classrooms. Journal of Curriculum Studies, 19, 549-560.
Tobin, K., Kahle, J. B., & Fraser, B. J. (Eds.). (1990). Windows into science classes: Problems associated with higher level cognitive learning. London: Falmer Press.
Tobin, K., & Malone, J. (1989). Differential student participation in whole-class activities. Australian Journal of Education, 33, 320-331.
von Glasersfeld, E. (1989). Cognition, construction of knowledge, and teaching. Synthese, 80, 121-140.
Walberg, H. (1968). Teacher personality and classroom climate. Psychology in the Schools, 5, 163-169.
Woolnough, B. E. (1991). Practical science: The role and reality of practical work in school science. Milton Keynes: Open University Press.
Wong, A., & Fraser, B. (1996). Environment-attitude associations in the chemistry laboratory classroom. Research in Science & Technological Education, 14, 91-102.
|Please cite as: Fraser, B., McRobbie, C. and Fisher, D. (1996). Development, validation and use of personal and class forms of a new classroom environment questionnaire. Proceedings Western Australian Institute for Educational Research Forum 1996. http://www.waier.org.au/forums/1996/fraser.html|