The effect of using laptop computers on achievement, attitude to science and classroom environment in science

Darrell Fisher and Ed Stolarchuk
Science and Mathematics Education Centre,
Curtin University of Technology
and St Hilda's School, Southport Queensland
This study was part of an evaluation of the effectiveness of laptop computers in grades 8 and 9 science classrooms in a sample of Australian independent schools. Effectiveness was determined in terms of the impact laptop computers had on laptop students' attitudinal and achievement outcomes and on their perceptions of the science classroom environment. Students' attitudes to science were assessed using a scale from the Test of Science-Related Attitudes (TOSRA), achievement was measured using scales from the Test of Enquiry Skills (TOES), and students' perceptions of the science classroom environment were assessed using the Science Classroom Environment Survey (SCES). These quantitative instruments were administered to 433 laptop and 430 non-laptop students in 14 independent schools across four Australian states. Descriptive statistics confirmed the reliability and validity of the SCES for science laptop classroom research. Qualitative data were collected by interviewing students and teachers in two of the fourteen schools. These data confirmed and offered explanations for the quantitative findings, which indicated that laptop science classrooms characterised by opportunities for individual students to interact with the teacher and by an emphasis on the skills and processes of inquiry best promoted positive student attitudes to science, while laptop science classrooms characterised by selective treatment of students least promoted students' cognitive achievement in science.
Students' perceptions of science classroom environment have been favourably associated with student attitude to science and student cognitive achievement in science (Fraser, 1994; McRobbie & Fraser, 1993; Fraser, 1991; Fraser, Walberg, Welch, & Hattie, 1987; Haertel, Walberg, & Haertel, 1981). The availability of proven instruments for assessing science classroom environment, student attitude to science and student cognitive achievement has allowed this study to proceed and thus contribute to our understanding of the effects of laptop computers.
Validation statistics for the short form indicate that the Cronbach alpha reliability coefficients ranged from 0.69 to 0.85, indicating high internal scale consistency. Each scale's ability to differentiate between the perceptions of students in different classrooms was confirmed by calculating a one-way ANOVA for each scale, using class membership as the main effect. The ANOVA eta2 statistic calculated for each scale, representing the proportion of variance due to class membership, ranged from 0.21 to 0.39 (p<0.001), indicating adequate scale differentiation. The mean scale correlations ranged from 0.15 to 0.34, indicating that each scale measured a sufficiently distinct dimension of the classroom environment. These reliability and validation figures confirmed that the short form of the student-actual version of the ICEQ is a reliable and valid instrument that could be used with confidence to measure students' perceptions of science classroom environment.
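The Cronbach alpha coefficient reported above is the ratio-based internal-consistency statistic computed over each scale's items. A minimal sketch of the calculation (the response matrix below is hypothetical illustration data, not the study's):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of scale totals
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical data: six students answering a five-item scale coded 1-5.
scores = np.array([
    [4, 5, 4, 4, 5],
    [2, 2, 3, 2, 2],
    [5, 4, 5, 5, 4],
    [3, 3, 3, 2, 3],
    [1, 2, 1, 2, 1],
    [4, 4, 5, 4, 4],
])
print(round(cronbach_alpha(scores), 2))
```

Values near the study's range of 0.69 to 0.85 indicate that the items within a scale vary together, i.e. that they measure a common construct.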
The short form of the ICEQ was the basis for the construction of the Science Classroom Environment Survey (SCES) used in this study. The major change made to this form of the ICEQ was that items were reworded, as necessary, to ensure each one was written in the personal form. For example, the item 'The teacher is unfriendly to students' was reworded to read 'The teacher is unfriendly to me'. This was done to elicit students' perceptions of their own experiences, rather than their perceptions of the experience of the class or group as a whole, which was thought to be important in classrooms where laptops are used. That individuals' perceptions are based on their own perspectives was recognised over four decades ago when Stern, Stein and Bloom (1956) differentiated between private beta press (unique individual perceptions) and consensual beta press (group perceptions).
|Table 1: Descriptive information for SCES scales|
|Scale||Description||Sample item|
|Personalisation||Emphasis on opportunities for individual students to interact with the teacher and on concern for the personal welfare and social growth of the individual.||The teacher helps me if I'm having trouble.|
|Participation||Extent to which students are encouraged to participate rather than be passive listeners.||I ask the teacher questions.|
|Independence||Extent to which students are allowed to make decisions and have control over their own learning and behaviour.||The teacher decides which students I work with.|
|Investigation||Emphasis on the skills and processes of inquiry and their use in problem solving and investigation.||I explain the meaning of statements, diagrams and graphs.|
|Differentiation||Emphasis on the selective treatment of students on the basis of ability, learning style, interests, and rate of working.||I move on to other topics if I work faster than other students.|
|Negotiation||Emphasis on opportunities for students to explain and justify to other students their newly developing ideas, and to reflect self-critically on the viability of their own ideas.||Other students ask me to explain my ideas.|
Adapted from Fraser (1990) and Taylor, Dawson, & Fraser (1995)
Moreover, the ICEQ did not address an aspect of the science classroom environment that was of interest, this being, whether the laptop science classroom allowed for more student interaction and peer learning/teaching. Various classroom environment instruments were reviewed to identify scales that could be used to meet the above need, and a suitable scale was found in the Constructivist Learning Environment Survey (CLES) (Taylor, Fraser, & White, 1994; Taylor, Dawson, & Fraser, 1995; Taylor, Fraser, & Fisher, 1997). This scale is referred to as the 'Negotiation' scale in the SCES. The final version of the SCES contains six scales, the five ICEQ scales and the one CLES scale. Each scale consists of five items, with some items being reverse scored. Each item is responded to on a five-point, Likert-type scale ranging from 1 (Almost never) to 5 (Very Often). Table 1 contains a description and sample items of each scale.
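On a 1 to 5 Likert scale, a reverse-scored item's contribution is conventionally obtained as 6 minus the raw response, so that 'Almost never' on a negatively worded item counts the same as 'Very often' on a positively worded one. A sketch of this scoring step (the function name and data are hypothetical, not from the instrument's handbook):

```python
def score_scale(raw, reversed_items):
    """Sum a five-item Likert scale, flipping the reverse-scored items.

    raw: five responses coded 1..5 (hypothetical data)
    reversed_items: zero-based indices of reverse-scored items
    On a 1-5 scale the flipped score is 6 - raw, so 1 <-> 5 and 2 <-> 4.
    """
    return sum(6 - r if i in reversed_items else r
               for i, r in enumerate(raw))

# e.g. the third item is negatively worded, responses are [4, 5, 2, 4, 5]:
print(score_scale([4, 5, 2, 4, 5], reversed_items={2}))  # 4+5+(6-2)+4+5 = 22
```

Each five-item scale scored this way yields totals between 5 and 25, which divide by five to give the per-item scale means reported later.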
An 'attitude to science' scale, simply referred to as the Attitude scale, was attached to the SCES. It was adopted from the 'Enjoyment of Science Lessons' scale of the Test of Science Related Attitudes (TOSRA) (Fraser, 1981) and contained five items.
Cognitive achievement was measured using three scales from the 'interpreting and processing information skills' section of the Test of Enquiry Skills (TOES) (Fraser, 1979), which was designed to test non-content specific enquiry skills of science students in grades 7 to 10. The three scales selected were 'Scales', 'Charts and Tables' and 'Graphs'.
A number of independent schools in all Australian states were contacted to determine whether they used laptop computers in grades 8 and 9 science and, if so, whether they would participate in this study. The final sample consisted of 863 students in 44 grade 8 and 9 science classes in 14 independent schools across four states. Of these students, 433 used laptops in 23 laptop science classrooms; the non-laptop sample consisted of 430 students in 21 non-laptop science classrooms. Each student in the sample completed the Science Classroom Environment Survey (SCES), with the attached Attitude scale and enquiry skills scales.
Qualitative data were collected from two of the 14 laptop schools, both of which had laptop and non-laptop classes. Laptop students and teachers were interviewed using pre-set questions, with some impromptu questioning to further explore the answers provided. Students and teachers were interviewed separately. In one school a group of eight students was interviewed in a single session; in the other, two groups of four students were interviewed in two separate sessions. In both cases this arrangement was determined by timetable constraints and student availability. A group of three teachers was interviewed in one school and a group of two teachers in the other. Student and teacher interviews were recorded on audio tape.
The reliability and validity of the SCES were confirmed by determining each scale's internal consistency (Cronbach alpha coefficient), discriminant validity (mean correlation of each scale with the other scales) and ability to differentiate between classrooms (ANOVA eta2). The associations between students' perceptions of science laptop classroom environment and students' attitudinal and cognitive achievement outcomes were investigated using both simple and multiple correlation analyses. The simple correlation (r) describes the bivariate association between a selected outcome and each scale of the instrument, in this instance the SCES. The multiple correlation, as expressed by the standardised regression weight (beta), describes the multivariate association between an outcome and a particular scale when all other scales are controlled.
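The distinction between r and beta can be illustrated on synthetic data: when two scales overlap heavily, both can show large simple correlations with an outcome, yet only one may retain a sizeable standardised regression weight once the other is controlled. A minimal sketch (all data and variable names are hypothetical, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 433  # matches the laptop sample size, for flavour only

# Two heavily overlapping "scale" scores and an outcome driven by the first.
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.2 * rng.normal(size=n)
y = x1 + 0.1 * rng.normal(size=n)

def r(a, b):
    """Simple (bivariate) correlation."""
    return np.corrcoef(a, b)[0, 1]

def betas(X, y):
    """Standardised regression weights: regress z-scored y on z-scored X."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    zy = (y - y.mean()) / y.std(ddof=1)
    coef, *_ = np.linalg.lstsq(Z, zy, rcond=None)
    return coef

X = np.column_stack([x1, x2])
print(f"r(x1,y)={r(x1, y):.2f}  r(x2,y)={r(x2, y):.2f}")  # both large
print("betas:", np.round(betas(X, y), 2))  # weight for x2 shrinks toward zero
```

This is the same pattern seen in the study's results: many scales show significant simple correlations with an outcome, but far fewer retain significant beta weights in the multivariate analysis.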
The effects laptop computers had on students' perceptions of science classroom environment were investigated by calculating effect sizes (ES) for each of the SCES scales. Effect sizes were calculated using Cohen's (1977) d formula, in which the difference between the two group means for each scale is divided by the pooled standard deviation. Effect sizes of 0.2 are considered small, 0.5 medium, and 0.8 large.
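A sketch of this calculation from summary statistics (the means and standard deviations below are hypothetical, not the study's values):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: difference of group means over the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical scale means/SDs for laptop (n=433) vs non-laptop (n=430) groups.
d = cohens_d(3.10, 0.60, 433, 2.95, 0.62, 430)
print(round(d, 2))  # 0.25: a small effect by Cohen's benchmarks
```

Because d is expressed in pooled standard-deviation units, it allows the laptop versus non-laptop differences to be compared across scales measured on the same instrument.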
The qualitative data were transcribed by hand and then typed and categorised in a form suitable for analysis and interpretation.
|Table 2: Internal consistency (Cronbach Alpha Coefficients), Discriminant Validity (Mean Correlation with other Scales), Ability to Differentiate Between Classroom (ANOVA eta2 ), Scale Means and Standard Deviations for Science Laptop Student Sample|
*p<0.01 student n = 433 class n = 23
Scale discriminant validity was confirmed by calculating the mean correlation of each of the instrument's six scales with the remaining scales. These correlations ranged from 0.08 to 0.34 using the individual student as the unit of analysis and from 0.08 to 0.45 using the class mean, indicating satisfactory scale discriminant validity. Each scale's ability to differentiate between the perceptions of students in different classrooms was confirmed by calculating a one-way ANOVA for each scale, using class membership as the main effect. The ANOVA eta2 statistic calculated for each scale, representing the proportion of variance due to class membership, ranged from 0.11 to 0.39 (p<0.01), indicating satisfactory scale differentiation.
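The eta2 statistic is simply the between-class sum of squares divided by the total sum of squares from the one-way ANOVA. A minimal sketch on hypothetical class data (three small classes, not the study's):

```python
import numpy as np

def eta_squared(scores_by_class):
    """eta^2 = between-class sum of squares / total sum of squares,
    i.e. the proportion of score variance attributable to class membership."""
    all_scores = np.concatenate(scores_by_class)
    grand_mean = all_scores.mean()
    ss_total = ((all_scores - grand_mean) ** 2).sum()
    ss_between = sum(len(c) * (np.mean(c) - grand_mean) ** 2
                     for c in scores_by_class)
    return ss_between / ss_total

# Hypothetical scale scores for three small classes with distinct means.
classes = [np.array([3.0, 3.2, 3.4]),
           np.array([2.0, 2.2, 2.4]),
           np.array([4.0, 4.2, 4.4])]
print(round(eta_squared(classes), 2))  # 0.96
```

Large eta2 values mean that much of the variation in a scale lies between classes rather than within them, which is what licenses using the scale to characterise whole classrooms.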
Scale means are reported in Table 2 as this is the first reported use of the ICEQ in science laptop classrooms. Using the individual student as the unit of analysis, scale means ranged from a high of 3.58 for the Independence scale to a low of 1.99 for the Differentiation scale, and from a high of 3.57 to a low of 1.97, for the same scales respectively, using the class mean as the unit of analysis. Fraser (1990) reported scale means, for the class mean unit of analysis, for grades 8 and 9 science students, ranging from a high of 3.60 for the Participation scale to a low of 2.20 for the Differentiation scale.
Both multiple correlation (R) statistics, between the set of SCES scales and each of the outcomes, are statistically significant. An examination of the simple correlation (r) results in Table 3 indicates that of the 12 possible relationships between science classroom environment and the outcome variables of attitude and achievement, 10 are statistically significant (p<0.05). This is over 16 times that expected by chance alone. A similar examination of the multiple correlation beta weights (beta), however, reveals that only three of the 12 possible relationships are statistically significant (p<0.05). This is five times that expected by chance alone.
The multiple correlation (R) statistic, at 0.69 (p<0.001), suggests that the association between students' perceptions of science laptop classroom environment, as measured by the SCES, and students' attitude to science is a strong one. Furthermore, the R2 statistic, at 0.48, indicates that 48% of the variance in laptop students' attitude to science is explained by students' perceptions of science classroom environment.
The simple correlation (r) data in Table 3 indicate that all the associations between students' attitudinal outcomes and the SCES scales are statistically significant. The associations for the Personalisation, Participation and Investigation scales are quite large, the association for the Negotiation scale is somewhat smaller, and the associations for the Independence and Differentiation scales are small. All of the associations are positive except that for the Differentiation scale. These findings suggest that positive student attitudes to science are promoted in science laptop classrooms where students perceive the classroom as being characterised by personalisation, participation, independence, investigation and negotiation. In contrast, students' attitude to science decreases in science laptop classrooms where students perceive the classroom as being characterised by differentiation, although, as noted earlier, this is a small association.
An examination of the attitudinal outcome standardised regression weight (beta) data indicates that only two of the six scales retain their statistical significance. This more conservative analysis suggests that, of the science laptop classroom characteristics earlier identified as promoting positive student attitudes to science, personalisation and investigation are the most influential.
|Table 3: Associations Between SCES Scales and Laptop Students' Attitudinal and Cognitive Achievement Outcomes in Terms of Simple Correlations (r ) and Standardised Regression Coefficients (beta )|
|Strength of Classroom Environment-Outcome Association|
|Multiple R Correlation||0.69**||0.33**|
*p<0.05 **p<0.001 n = 433 (laptops)
The qualitative data supported and offered several insights into the quantitative findings. For example, when students were asked why they thought they had a better attitude to science than did their non-laptop counterparts, their answers included comments such as "easier, different, new, a bit more fun" and "it's not such a drag and everything." Students also felt that correcting errors was easier, that they could complete their work more quickly (once they had learned how to use their laptops), that they were pleased with their finished product (it looked like a professional report), and that they experienced reasonable success at learning on their own through trial and error, although this learning related more to the laptop than to science.
The qualitative data collected from teachers included comments such as "averaging, spreadsheets, graphing, statistics - all that early stuff, it came alive" and "at the stage where there wasn't much statistical data involved, they just enjoyed writing it up on the laptops so it looked like a nice scientific report - it helped them take an interest in what was going on." Teachers also felt that they could offer students more individual help and that students appeared to be more motivated to work on their own and discover how to do various things, but again this was more to do with learning about computers than the scientific concepts being taught.
The multiple correlation (R) statistic, 0.33 (p<0.001), indicates a significant association between science laptop classroom environment, as measured by all the SCES scales, and student cognitive achievement. The R2 statistic indicates that 11% of the variance in students' cognitive achievement can be attributed to laptop students' perceptions of their science classroom environment.
The simple correlation (r) data indicate that four of the six correlations between students' cognitive achievement and the SCES scales are statistically significant. Of these four, the Personalisation, Participation and Independence scales are positively correlated and the Differentiation scale is negatively correlated. The correlation values are generally small, suggesting that, to a small extent, students' cognitive achievement is higher in science laptop classrooms where students perceive the classroom as being characterised by personalisation, participation and independence, and lower where students perceive the classroom as being characterised by differentiation.
The achievement outcome standardised regression weight (beta) data in Table 3 indicate that only one of the three scales earlier identified as statistically significant retains its statistical significance. This more conservative analysis suggests that student cognitive achievement is most strongly (and negatively) associated with science laptop classrooms perceived to be characterised by differentiation.
The qualitative data reinforced the existence of a less positive relationship between laptop use and cognitive achievement than between laptop use and attitude. Students indicated that laptops helped them with things such as making tables, spreadsheets, graphing, presentation, editing of work, projects, note taking, writing up investigations and organisation, but did not really make it easier to learn science. They felt laptops were used only as a tool for graphing, note taking and so on; they were not used to 'teach' any of the science material or topics. Some typical student comments follow:
Laptops don't really help with our knowledge of science, except for our knowledge of computers.

Perhaps if we had a science program in it we could go home and look at it and learn more from it. It would be good if you could put encyclopaedias, Encarta and stuff like that on them but they are too small for that.

Like they expect you to know everything on the computer and if you don't then you have to waste time finding out how to do it, and you miss out on class things.

Teacher comments included:

Ah, I think that to start with, the information they are putting in the computer, they don't actually take notice of it.

Teachers also felt that they often had to spend a disproportionate amount of time with students who could not use the computers efficiently. This led to some students feeling they did not get their 'fair share' of the teacher's time and that students were not treated equally in the class.
I guess what I'd say is that there are more questions about the actual computing than science, which is maybe a bit of a worry. This could actually detract from the learning of science - they are so busy trying to learn how to do a table they actually don't pay attention to the information going in.
We were teaching a lot of computer skills, like graphing which they then used later on - it was an unknown situation.
|Table 4: Effect Sizes (ES ) for Laptop Computers on the Science Classroom Environment as measured by the SCES Scales|
|Scale||Unit of Analysis||Effect Size - ES a|
a ES was calculated by subtracting the non-laptop mean from the laptop mean and dividing the difference by the pooled standard deviation, Cohen's d (1977)
An examination of the magnitude of the effect sizes indicates that for the scales of Independence, Investigation and Negotiation, at the class mean unit of analysis, the effects are small (0.15 to 0.20). The effect size for the Differentiation scale, 0.45, approaches the medium effect category at the class mean unit of analysis and is small, 0.25, for the individual student unit of analysis. All other reported effect sizes are negligible, at 0.10 or smaller.
The multiple correlation (R) statistics in Table 3 suggest that, of the two student outcomes, the association between students' attitudinal outcomes and their perceptions of science laptop classroom environment is just over twice as strong as that between students' cognitive achievement outcomes and their perceptions of science laptop classroom environment. Furthermore, the R2 statistics indicate that the percentage of variance in students' attitudinal scores explained by students' perceptions of science laptop classroom environment is over four times that explained in students' cognitive achievement scores.
The student and teacher interview data supported a more positive and stronger relationship between students' attitude and science classroom environment than between students' cognitive achievement and classroom environment. The use of computers resulted in students being more enthusiastic toward science, and made some of the mundane science-related tasks such as making tables and graphing 'come alive'. However, the qualitative data indicated that much was taught and learned about computers, often at the expense of science, and that the teacher was often perceived as spending an inordinate amount of class time with certain students at the expense of others.
The effect size data in Table 4 suggest that laptops have had minimal effects on students' perceptions of science classroom environment, especially at the individual student unit of analysis.
These findings are of practical significance as they indicate an overall positive association between students' perceptions of the science laptop classroom environment and their attitudinal and cognitive achievement outcomes. Schools considering the introduction of laptops into science classrooms would find this information valuable during their deliberations.
Cohen, J. (1977). Statistical power analysis for the behavioral sciences (Rev. ed.). New York: Academic Press.
Fraser, B.J. (1994). Research on Classroom and School Climate. In D.L. Gabel, (Ed.), Handbook of Research on Science Teaching and Learning. New York: Macmillan.
Fraser, B.J. (1991). Two Decades of Classroom Environment Research. In B.J. Fraser & H.J. Walberg (Eds.), Educational Environments: Evaluation, Antecedents and Consequences. Oxford: Pergamon Press.
Fraser, B.J. (1990). Individualised Classroom Environment Questionnaire: Handbook and Test Master Set. Hawthorn: The Australian Council for Educational Research Ltd, Radford House.
Fraser, B.J. (1981). TOSRA: Test of Science-Related Attitudes Handbook. Hawthorn: The Australian Council for Educational Research Limited.
Fraser, B.J. (1979). Test of Enquiry Skills Handbook. Hawthorn: The Australian Council for Educational Research Limited.
Fraser, B.J., & Walberg, H.J. (Eds) (1991). Educational environments: Evaluation, Antecedents, and Consequences. Oxford: Pergamon Press.
Fraser, B.J., Walberg, H.J., Welch, W.W., & Hattie, J.A. (1987). Synthesis of educational productivity research. International Journal of Educational Research, 11(2), 145-252.
Gardner, J., Morrison, H., & Jarman, R. (1993). The impact of high access to computers on learning. Journal of Computer Assisted Learning, 9, 2-16.
Haertel, G.D., Walberg, H.J., & Haertel, E.H. (1981). Socio-psychological environments and learning: a quantitative synthesis. British Educational Research Journal, 7, 27-36.
Loader, D. (1993). Reconstructing an Australian School. The Computing Teacher, 20(7), 12, 14-15.
McMillan, K., & Honey, M. (1993). Year One of Project Pulse: Pupils Using Laptops in Science and English. A Final Report. New York: Bank Street College of Education. (ERIC Document Reproduction Service No. ED 358 822)
McRobbie, C.J., & Fraser, B.J. (1993). Associations between student outcomes and psychosocial science environment. Journal of Educational Research, 87, 78-85.
Mitchell, J., & Loader, D. (1993). Learning in a Learning Community: Methodist Ladies' College case study. Jolimont: Incorporated Association of Registered Teachers of Victoria.
Rowe, H.A.H. (1993). Learning with Personal Computers. Melbourne: The Australian Council for Educational Research.
Shears, L. (Ed) (1995). Computers and Schools. Melbourne: The Australian Council for Educational Research Ltd.
Stern, G.G., Stein, M.I., & Bloom, B.S. (1956). Methods in personality assessment. Glencoe: Free Press.
Taylor, P., Dawson, V., & Fraser, B. (1995). Classroom learning environments under transformation: A constructivist perspective. Paper presented at the annual meeting of the American Educational Research Association, San Francisco.
Taylor, P.C., Fraser, B.J., & White, L.R. (1994). A classroom environment questionnaire for science educators interested in the constructivist reform of school science. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Anaheim.
Taylor, P.C., Fraser, B.J., & Fisher, D.L. (1997). Monitoring constructivist classroom learning environments. International Journal of Educational Research, 27(4), 293-302.
|Please cite as: Fisher, D. and Stolarchuk, E. (1998). The effect of using laptop computers on achievement, attitude to science and classroom environment in science. Proceedings Western Australian Institute for Educational Research Forum 1998. http://www.waier.org.au/forums/1998/fisher.html|