
Student perceptions of practical tasks in senior biology, chemistry and physics classes

Allan Harrison, Darrell Fisher
SMEC, Curtin University of Technology
David Henderson
Launceston College, Launceston
Regular laboratory work is regarded as an integral part of most science courses; however, a significant proportion of laboratory activities remain highly prescriptive and fail to challenge secondary science students. This study of senior high school biology, chemistry and physics laboratory environments drew data from student responses to the Science Laboratory Environment Inventory (SLEI) and a curriculum analysis of the implemented laboratory tasks. The study involved 387 biology, chemistry and physics students in 20 classes in Tasmania, Australia, who responded to the SLEI. The curriculum analysis was based on Lunetta and Tamir's (1979) Laboratory Structure and Task Analysis Inventory and Laboratory Task Analysis. The study found that the SLEI differentiated between the three subject areas in the following ways: students believed that physics was more open-ended than either biology or chemistry; rule clarity was greatest in chemistry; and biology was less integrated than either physics or chemistry. The Laboratory Structure and Task Analysis Inventory also confirmed the more open-ended nature of the physics investigations.


Content analysis of practical investigations in science laboratories has been undertaken in a number of previous studies. For example, Herron (1971) provided a means of categorising the levels of openness and emphasis on enquiry skills found in textbooks and laboratory manuals. A number of subsequent studies based on Herron's scheme (e.g., Hegarty, 1978; Tamir, 1991; Tamir & Lunetta, 1978) showed that most practical tasks in science laboratory manuals provided students with little or no opportunity for open-ended or enquiry learning.

This study adds to the analysis of practical investigations in that it combines a measure of science students' perceptions of their laboratory learning environment with an analysis of some of the practical tasks undertaken by the students. In addition to enabling a comparison of the kinds of practical tasks undertaken by biology, chemistry and physics students, such a study permits a comparison between the content of the laboratory activities and students' perceptions of the laboratory environment. In the study, further information was gathered on whether the instrument used to measure students' perceptions of the laboratory environment, the Science Laboratory Environment Inventory, was sensitive to different laboratory approaches.

Thus, the first aim of this study was to investigate whether there were differences in the perceptions of senior high school biology, chemistry and physics students of their actual science laboratory learning environments. The second aim was to investigate the practical tasks undertaken in these three subject areas and thirdly to make comparisons between the content of the laboratory activities and the learning environment perceptions.

The Science Laboratory Environment Inventory

In the past 25 years, much attention has been given to the development and use of instruments to assess the qualities of the learning environment from the perspective of the student (Fraser, 1986, 1994). The investigation of the effects of learning environment variables and student outcomes has provided a particular rationale and focus for the use of such instruments (Haertel, Walberg & Haertel, 1981). Other studies have investigated various determinants of learning environment (e.g., grade level, school type, curriculum adopted, wait-time, class size).

Laboratory work is seen as an integral part of most science courses and offers students a learning environment that differs in many ways from the 'traditional' classroom setting. Consequently, the Science Laboratory Environment Inventory (SLEI) (Fraser, McRobbie & Giddings, 1993) was developed to assess student perceptions of the psychosocial environment of science laboratory classes.

The initial development of the SLEI was guided by the following criteria. A review of the literature was undertaken to identify dimensions that were considered important in the unique environment of the science laboratory class. Guidance in identifying dimensions also was obtained by examining all scales contained in existing classroom environment instruments for non-laboratory settings (Fraser, 1994). By interviewing numerous science teachers and students at the senior high school level and asking them to comment on draft versions of sets of items, an attempt was made to ensure that the SLEI's dimensions and individual items were considered salient by teachers and students. In order to achieve economy in terms of the time needed for answering and scoring, the SLEI was designed to have a relatively small number of scales, each containing a fairly small number of items.

A set of items was written and subjected to several successive revisions based on reactions solicited from people with expertise in questionnaire construction and science teaching at the senior high school level. Item and factor analyses yielded the 35-item form of the instrument used in this study. The SLEI assesses Student Cohesiveness, Open-Endedness, Integration, Rule Clarity, and Material Environment. A description of each of these scales is provided in Table 1.

Table 1: Descriptive information for each scale of the
Student Laboratory Environment Inventory (SLEI).

Scale name Description

Student Cohesiveness Extent to which students know, help and are supportive of one another
Open-Endedness Extent to which the laboratory activities emphasise an open-ended, divergent approach to experimentation
Integration Extent to which the laboratory activities are integrated with non-laboratory and theory classes
Rule Clarity Extent to which behaviour in the laboratory is guided by formal rules
Material Environment Extent to which the laboratory equipment and materials are adequate

The SLEI was cross-nationally validated with a sample of 3,727 senior high school and university students in 198 science laboratory classes in six countries (Australia, United States, Canada, England, Israel, and Nigeria) (Fraser, McRobbie, & Giddings, 1993). Analogous validity analyses performed separately within each of the six countries suggested that researchers and teachers from six different countries can use the SLEI with confidence.

McRobbie and Fraser (1993a) used the SLEI in the first investigation into associations between student outcomes and classroom environment that was conducted specifically in science laboratory class settings. Generally, the findings showed that students' perceptions of classroom psychosocial environment accounted for appreciable amounts of variance in student outcomes beyond that attributable to student characteristics such as general ability. Of the five scales of the SLEI, Integration, the extent to which the laboratory activities are integrated with non-laboratory and theory classes, showed the strongest positive association with both students' cognitive and attitudinal outcomes.

The SLEI was used by McRobbie and Fraser (1993b) to develop a typology of science laboratory learning environments. Responses to the actual form of the SLEI by 4,596 students in 240 classes in four countries suggested that more than 90% of the classes could be assigned to one of eight distinct typologies. The types included such laboratory classrooms as those which were above average on each of the environment scores and could be said to have a moderately positive or supportive environment, those in which the environment was moderately negative, those in which there was a high degree of Integration and a low level of Rule Clarity and Material Environment support, and those which were labelled as supportive open-ended. Furthermore, they concluded that students' attitudinal outcomes varied according to the typology of the class.

Wong and Fraser (1994) used the SLEI with a sample of 1,592 high school chemistry students in 56 classes in Singapore. This study provided further cross-cultural validation of the SLEI. All scales of the SLEI with the exception of Open-Endedness were found to be positively related to students' attitudinal outcomes. Females were found to perceive their environment more favourably than did males on all scales except Open-Endedness where the reverse was true.

Fisher, Henderson and Fraser (in press) confirmed the reliability and validity of the SLEI in an investigation of associations between student perceptions of the biology laboratory environment and student outcomes with a sample of 489 senior high school students in 28 biology classes in Tasmania, Australia. Generally, the dimensions of the SLEI were found to be positively related to student attitude scores. In particular, students' attitude scores were higher in classrooms in which students perceived greater Student Cohesiveness, Integration, and Rule Clarity and a better Material Environment. It was concluded that if biology teachers want to promote favourable student attitudes to their class and laboratory work, they should ensure the presence of these SLEI dimensions in their classrooms.

Assessing laboratory tasks

It is generally agreed that student laboratory activities should be designed to develop the so-called 'higher' cognitive abilities that underpin scientific problem-solving skills (Woolnough, 1991). Despite this agreement, a significant proportion of laboratory activities remain highly prescriptive and fail to challenge secondary science students. Lunetta and Tamir (1979) therefore developed a set of protocols for analysing student laboratory activities which they used to systematically analyse a number of well-known physics, chemistry and biology courses.

The method uses two tabular checklists, the Laboratory Structure and Task Inventory and the Laboratory Task Analysis, to analyse laboratory activities. The Laboratory Structure and Task Inventory analyses laboratory activities from four perspectives: activity planning and design; student performance behaviours; student analysis and interpretation of results; and student application of laboratory findings. The four perspectives contain 23 questions that are designed to elucidate the manipulative, social and thinking behaviours that characterise scientific investigations. The Laboratory Task Analysis then looks at laboratory activities from the perspectives of structure (high-low cognitive level, open-ended, prescriptive), relation to text (i.e., timing) and mode of participation (i.e., individual, group, whole-class). The perspectives and questions are recorded in a table that teachers and researchers can easily use to analyse and describe particular laboratory activities, modules or courses.

Lunetta and Tamir applied the Laboratory Structure and Task Inventory and the Laboratory Task Analysis to a number of activities and courses and a modified version was used by Fuhrman, Lunetta and Novick (1982) to evaluate five upper secondary chemistry courses. Lunetta and Tamir's approach provides a comprehensive and convenient method for analysing a wide range of science laboratory activities and courses.


Method

This study focused on students in senior secondary classes in Tasmania, Australia. Students in Tasmanian government schools complete their high school education at the end of grade 10. Those students continuing their studies in grades 11 and 12 attend one of eight secondary colleges which offer only senior secondary courses. This study's sample was composed of students from three of the secondary colleges. A total of 387 students in 20 biology, chemistry or physics classes were involved. In order to assess their perceptions of the laboratory environment, each of these students responded to the SLEI.

Content analysis of practical manuals used by the Tasmanian biology, chemistry and physics students was undertaken using the method described by Lunetta and Tamir (1979). The textbook and laboratory manuals used in each class were obtained and systematically analysed using the protocols designed by Lunetta and Tamir. As the analyst had extensively used each of the textbooks in his own classes, they were not reread in detail; however, the laboratory manuals were analysed in detail because all the entries in the Laboratory Structure and Task Analysis Inventory related to the manual contents. The laboratory task analysis was completed by comparing the laboratory structure and task inventory results with the text content and the relative position of each laboratory activity (pre-instruction, integrated, post-instruction). The findings from this analysis are recorded in Table 4, which is an adaptation of Lunetta and Tamir's (1979) Laboratory Structure and Task Analysis Inventory.

Finally, the conclusions from the SLEI data analysis and the patterns observed in the laboratory activities analysis were compared to identify relevant similarities and differences.


Results

Fraser et al. (1993) reported that field testing of the SLEI in six countries confirmed the instrument's reliability and validity. The data presented in Table 2 for the present sample of 510 students provide further cross-validation information supporting the reliability and validity of the SLEI. Analysis of responses to the SLEI using the individual student as the unit of analysis revealed that each SLEI scale had an acceptable internal consistency, with alpha reliability coefficients ranging from 0.71 to 0.85. These values of the alpha coefficients were even higher with the class mean as the unit of analysis. Table 2 also reports the means and standard deviations for the sample. The means for the different subject disciplines were the more important in this study; however, it can be noted that the students perceived quite a favourable learning environment in their laboratories.

Table 2: Internal consistency (Cronbach alpha coefficient),
Means and Standard Deviations for the SLEI. N = 510

Scale                   Alpha   Mean   Standard deviation

Student Cohesiveness    0.71    3.90   0.61
Open-Endedness          0.85    2.72   0.54
Integration             0.75    4.14   0.66
Rule Clarity            0.79    3.54   0.67
Material Environment    0.71    3.86   0.61
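Internal consistency figures of this kind are straightforward to reproduce. The sketch below is not the authors' code; it simply applies the standard Cronbach alpha formula to a hypothetical matrix of six students' 1-5 responses on an invented three-item scale.

```python
# Cronbach's alpha for one scale: rows = students, columns = items.
# The response matrix below is HYPOTHETICAL illustration data, not SLEI data.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array-like, rows = students, columns = items in one scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of students' scale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

responses = [
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
    [3, 4, 3],
]
print(round(cronbach_alpha(responses), 2))  # 0.91 for these invented data
```

Values above about 0.7, as reported for all five SLEI scales, are conventionally taken as acceptable internal consistency.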

An examination of the means and standard deviations for the different subject areas, as illustrated in Table 3, indicated that there were differences between the subjects.

One-way ANOVA with subject as the main effect was then conducted to determine whether there were any significant differences between the means of the subjects. The results of these analyses are depicted in Figure 1.
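The one-way ANOVA with subject as the main effect can be sketched as follows. This is not the authors' analysis code, and the class mean scores are invented for illustration (the study's actual means appear in Table 3); the sketch only shows the standard F-statistic computation.

```python
# One-way ANOVA with subject (biology/chemistry/physics) as the factor.
# The class-mean scores below are INVENTED for illustration only.
import numpy as np

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of 1-D samples."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    n = sum(len(g) for g in groups)          # total observations
    k = len(groups)                          # number of groups
    grand_mean = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    f = (ss_between / (k - 1)) / (ss_within / (n - k))
    return f, k - 1, n - k

# Hypothetical class means on the Open-Endedness scale:
biology   = [2.4, 2.6, 2.5, 2.3, 2.7]
chemistry = [2.7, 2.9, 2.8, 2.6]
physics   = [3.3, 3.5, 3.4, 3.2, 3.6]

f, df1, df2 = one_way_anova([biology, chemistry, physics])
# The .05 critical value of F(2, 11) is about 3.98, so F above that
# indicates a significant subject effect for these invented data.
print(f > 3.98)
```

A post-hoc test such as Scheffé's, as used in the study, would then identify which pairs of subjects differ significantly.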

Figure 1 shows that physics students perceived their laboratory environment as more Open-Ended than did chemistry or biology students, and that chemistry was more Open-Ended than biology. Chemistry students perceived greater Integration between theory and practical work than did students in the other two subjects, and again biology was least integrated. Rule Clarity was greatest in chemistry classes and least in physics.

Figure 1 [1]: Student perceptions of the Open-Endedness, Integration and Rule Clarity in chemistry, biology and physics laboratory activities. Pictorial representation of results from one-way ANOVA - the arrow represents the significant associations (p = .05 level, Scheffe Test)

Table 3: Means and standard deviations for the three subject areas

Scale                                       Biology   Chemistry   Physics

Student Cohesiveness   Mean / Stand. dev.
Open-Endedness         Mean / Stand. dev.
Integration            Mean / Stand. dev.
Rule Clarity           Mean / Stand. dev.
Material Environment   Mean / Stand. dev.
N                                           184       96          138

The findings from applying the Laboratory Structure and Task Inventory and the Laboratory Task Analysis (Lunetta & Tamir, 1979) to the laboratory activities are presented in Table 4. Wherever possible, the presence or absence of an item is indicated by "A" or "ns" respectively: "A" means that the category was satisfied in a majority of the activities, while "ns" means that it was not stated or asked for. The entry "occ" indicates that the item was present in some cases, that is, its occurrence was sufficiently frequent to prevent its classification as "ns" but insufficient for it to be classified as "A". The entry "?" means that there was insufficient information in the activities to make a decision. Special entries were used for cases which did not seem to fit these four descriptors.

Table 4: Laboratory Structure and Task Analysis Inventory and
Laboratory Task Analysis (see key at end of table).

Task category and investigation number                                Biol     Chem          Phys

1.0 Planning and design
1.1  Formulates a question or defines a problem to be investigated    A        A             A
1.2  Predicts experimental results                                    ns       ns            ns
1.3  Formulates hypothesis to be tested in this investigation         ns       ns            A
1.4  Designs observation or measurement procedure                     ns       ns            A
1.5  Designs experiment                                               occ      ns            A
2.0 Performance
2.1a Carries out qualitative observation                              A        A             ns
2.1b Carries out quantitative observation or measurement              A        ns or A       A
2.2  Manipulates apparatus; develops technique                        A        A             A
2.3  Records results; describes observation                           A        A             A
2.4  Performs numeric calculation                                     ns       ?             A
2.5  Explains or makes a decision about experimental technique        ns       ns            A
2.6  Works according to own design                                    occ      ns            A
3.0 Analysis and interpretation of results
3.1a Transforms result into standard form (other than graph)          ?        A             A
3.1b Graphs data                                                      occ      ns            A
3.2a Determines qualitative relationship                              A        A             A
3.2b Determines quantitative relationship                             occ      ns            A
3.3  Determines accuracy of experimental data                         some     ns            A
3.4  Defines or discusses limitations and/or assumptions
     underlying the experiment                                        rarely   ns            A
3.5  Formulates or proposes a generalisation or model                 A        ns            A
3.6  Explains a relationship                                          A        A             A
3.7  Formulates new questions or defines problem based upon
     results of investigation                                         ns       ns            ?
4.0 Application
4.1  Predicts based on results of this investigation                  occ      ns            ?
4.2  Formulates hypothesis based on results of this investigation     occ      ns            ?
4.3  Applies experimental technique to new problem or variable        ns       ns            ?

Laboratory Task Analysis organisational categories
A. Structure
a.1  High degree                                                      some A   A             ns
a.2  Low degree, ie open                                              some A   ns            A
B. Relation to text
b.1  Precedes text                                                    A        where approp  A
b.2  Follows text                                                     A        ns            ?
b.3  Integrated with text                                             A        A             ?
C. Participation mode
c.1  Students work on a common task and pool results                  A        A             A
c.2  Students work on different tasks and pool results                A        ns            ?
c.3  Post-lab discussion required                                     occ      ns            ?

Key:
A     applies, asked for
ns    not stated or asked for
occ   applies in some cases
?     item not mentioned or there was insufficient detail to identify it
      (eg, open-ended physics activities were quite brief)

The Biology practicals consisted of short prescriptive investigations (usually contained in a marginal box beside the text) telling the students what to do and what to look for. Despite being prescriptive, the instructions were relatively open-ended in that they did not detail every process and observational step for the students. Examples of the biology investigations are Activity 1K (Australian Academy of Science, 1990, p.65) dealing with ecosystem components and Activity 4H (p.282) examining transpiration. Results were usually asked for as a list of items or examples, a diagram or an explanation. The placement of the activities suggested links to the textbook content and theory; however, these links were not made explicit. [Textbook and integrated investigations: Biology - The common threads (Australian Academy of Science, 1990)]

The Chemistry practicals (Bucat, 1984) were highly structured investigations that prescribed what the students should do and observe. The laboratory activities generally preceded or accompanied the in-text theory, with their most common position being integrated with the theory. Laboratory activities were described in shaded boxes in the main body of the text. Qualitative and quantitative results were asked for (depending on the activity) and these were usually directed towards answering a specific problem or supporting the content being learned. Most of the chemistry laboratory activities contained much detail because of the poisonous and corrosive nature of reagents and the possibility of violent reactions. Examples of the prescriptive nature of these investigations are Experiments 1.3.4 (reactions of acids on metals) (pp.17-18) and 1.10.6 (titrating permanganate versus oxalate) (pp.160-161). [Textbook and integrated investigations: Elements of chemistry - Earth, air, fire and water (Bucat, 1984)]

The Physics practicals were all open-ended, student-designed investigations in which the students were responsible for problem formulation, hypothesis development, experimental design, execution and interpretation. An example is provided in Figure 2. Problems were usually stated as a physics or everyday situation, and the students were given general guidelines and theoretical hints that helped them focus on the problem and relevant theory. The form in which the results were to be collected and presented was not prescribed. Student-directed laboratory investigations were mandated in the physics syllabus. [No common textbook or laboratory manual.]

The Laboratory Structure and Task Analysis Inventory and the brief descriptions of each textbook and its accompanying laboratory investigations summarise the similarities and differences between the three courses. The obvious difference was the overt open-endedness of the physics investigations in contrast to the prescriptive biology and chemistry courses (chemistry more prescriptive than biology). Safety considerations meant that the chemistry investigations contained more 'how-to' details than either biology or physics (biology more than physics). The open-ended problem-solving format of the physics course made its activities quite difficult to classify on Lunetta and Tamir's inventory; indeed, some aspects of the laboratory work could not be commented on with certainty because most of the method, data collection and data analysis was left for the students to decide.


  1. DESIGN an experiment to investigate the way in which acceleration down an inclined plane varies as the angle the plane makes with the horizontal is varied [C4].

  2. PERFORM the experiment eliminating errors due to parallax and record all relevant results [C5].

  3. ANALYSE the data collected to find a relationship between the acceleration and the angle [C6].

  4. If applicable, make a GENERALISATION based upon

    • your experiment, and
    • relevant theory [C7].

  5. CONVEY THE INFORMATION in a written report of the experiment [C2].

Codes like [C6] refer to the investigative skills and processes prescribed in the syllabus.

Figure 2: A typical physics laboratory activity.

Discussion and conclusion

The SLEI results strongly support the qualitative findings that the physics laboratory activities were more Open-Ended than the laboratory activities in either chemistry or biology. The Laboratory Structure and Task Analysis Inventory supports the view that, in physics, the locus of control lay with the students, and the SLEI results indicate that this had a significant positive effect on their feelings about the learning environment. The qualitative course analysis also suggested that biology was somewhat more open-ended than chemistry, and there is some evidence in Table 4 to support this view. However, this pattern was not evident in the Table 3 data. The Tasmanian physics syllabus requirement that students design and conduct their own laboratory investigations appears to have had a positive effect on this group of students' perceptions about the Open-Endedness of their physics laboratory environment.

The ANOVA results described in Figure 1 also indicate that in both chemistry and physics, laboratory work was more Integrated with theory than in biology. The integration of the chemistry laboratory activities within the textual materials informed students that each investigation contained important chemical ideas that were directly related to the concepts being explained. The same case cannot, however, be made for physics because there were no laboratory activities embedded in their textbook explanations. It might be suggested that the high degree of student control over laboratory activities served to embed the laboratory work in the theory. Given the high degree of student ownership of physics laboratory investigations, it may be conjectured that students found it relatively easy to integrate the theory with the laboratory work. This issue should be further investigated. It is probably easier to understand the students' low rating of biology's Integration on the SLEI. Many of the investigations did not appear to 'belong' in the theory; both their lack of detail and their peripheral placement suggest that they were non-essential. The qualitative and quantitative findings support a claim that laboratory investigations should be strongly integrated with course theory if students are to appreciate their relevance.

The third diagram in Figure 1 shows that chemistry investigations were perceived to have a higher degree of Rule Clarity than either biology or physics. The much greater detail provided in chemistry activities than in biology or physics seems to adequately explain this finding. As stated earlier, chemistry is highly prescriptive for safety reasons and students seem to reflect this in their SLEI responses. On the other hand, much research (e.g., Johnstone, 1991) suggests that too many rules inhibit learning in chemistry. Chemistry educators should heed this warning and try to develop chemistry courses that are open-ended without compromising safety.

Many science courses claim to develop students' investigative skills, and integration, open-endedness and non-rule based learning are recognised characteristics of scientific enterprise. Nevertheless, many "investigations as outlined in [science laboratory] handbooks do not live up to the 'scientific inquiry' goals of their designers" (Fuhrman et al., 1982, p. 565). The value of this combined quantitative / qualitative study is its demonstration that readily available probes can be used by teachers, administrators and course designers to measure the effectiveness of laboratory-based science courses. Indeed, regular field-testing of the efficacy of science laboratory courses is likely to foster more scientific perceptions and behaviours in secondary school students.



References

Australian Academy of Science (1990). Biology, the common threads. Part 1. Canberra: Australian Academy of Science.

Bucat, R. B. (Ed.) (1984). Elements of chemistry Vol ½. Canberra: Australian Academy of Science.

Fisher, D.L., Henderson, D.G. & Fraser, B.J. (in press). Laboratory environments and student outcomes in senior high school biology. The American Biology Teacher.

Fraser, B.J. (1986). Classroom environment. London: Croom Helm.

Fraser, B.J. (1994). Research on classroom and school climate. In D. Gabel (Ed.), Handbook of Research on Science Teaching and Learning. New York: Macmillan.

Fraser, B.J., McRobbie, C.J. & Giddings, G.J. (1993). Development and cross-national validation of a laboratory classroom environment instrument for senior high school science. Science Education, 77(1), 1-24.

Fuhrman, M., Lunetta, V. N., & Novick, S. (1982). Do secondary school laboratory texts reflect the goals of the "new" science curricula? Journal of Chemical Education, 59, 563-565.

Haertel, G.D., Walberg, H.J., & Haertel, E.H. (1981). Socio-psychological environments and learning: A quantitative synthesis. British Educational Research Journal, 7(1), 27-36.

Hegarty, E.H. (1978). Levels of scientific enquiry in university science laboratory classes: Implications for curriculum deliberations. Research in Science Education, 8, 45-57.

Herron, M.D. (1971). The nature of scientific enquiry. School Review, 79, 171-212.

Johnstone, A. H. (1991). Why is science difficult to learn? Things are seldom what they seem. Journal of Computer Assisted Learning, 7, 75-83.

Lunetta, V. N., & Tamir, P. (1979). Matching lab activities with teaching goals. The Science Teacher, 46(5), 22-24.

McRobbie, C.J. & Fraser, B.J. (1993a). Associations between student outcomes and psychosocial science environment. Journal of Educational Research, 87, 75-85.

McRobbie, C.J. & Fraser, B.J. (1993b, November). A typology for university and school science laboratory classes. Paper presented at the Annual Conference of the Australian Association for Research in Education, Perth.

Tamir, P. & Lunetta, V.N. (1978). An analysis of laboratory inquiries in the BSCS yellow version. The American Biology Teacher, 40, 353-357.

Tamir, P. (1991). Practical work in school science: An analysis of current practice. In B. E. Woolnough (Ed.), Practical science: The role and reality of practical work in school science. Milton Keynes, England: Open University Press.

Wong, A. & Fraser, B.J. (1994, April). Science laboratory classroom environments and student attitudes in chemistry classes in Singapore. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.


  1. Some figures were not available to the HTML editors at the time this file was prepared.

Authors: Allan Harrison and Darrell Fisher, SMEC, Curtin University of Technology, GPO Box U1987 Perth Western Australia
David Henderson, Launceston College, Launceston, Tasmania Australia 7250

Please cite as: Harrison, A., Fisher, D. and Henderson, D. (1997). Student perceptions of practical tasks in senior biology, chemistry and physics classes. Proceedings Western Australian Institute for Educational Research Forum 1997. http://www.waier.org.au/forums/1997/harrison.html

Last revision: 1 June 2006. This URL: http://www.waier.org.au/forums/1997/harrison.html