
In the document Go-Lab Deliverable D8.3 First trial report (pages 108-111)

PART I: STUDENT EVALUATION


9.2.5 Discussion of findings

Given the above findings, it is safe to conclude that the use of the ‘Experimental Error Tool’ had no significant effect on students’ understanding of ‘How Science Works’. A number of reasons can be identified for this:

a) Students’ understanding of ‘How Science Works’, and in this particular case students’ procedural and epistemic knowledge, is unlikely to change in the course of only one experiment; a sustained, long-term intervention is needed for this to occur.

b) The length of use of the Tool was not enough for it to have an effect, especially given that the students were totally unfamiliar with it.

c) The ‘Experimental Error Tool’ is in English, which may undermine its purpose as a scaffolding tool, at least for students who are not native speakers of English.

d) The ‘Experimental Error Tool’ itself included significant mathematics content, which needed to be understood before the tool could be used seamlessly to treat experimental values. The students may not have had enough time to acquire this understanding and thus never managed to fully appreciate its intended potential for the analysis of experimental values and their uncertainty.

Further, longer and repeated investigations are needed to establish the learning value of using the ‘Experimental Error Tool’.

Having said this, there were some indications that the intervention using the ILS ‘Radioactivity’ itself may have had some positive effects on students’ knowledge, especially on their procedural knowledge and, more particularly, on their understanding of the need for repeated measurements of a single variable. Again, additional, longer and repeated investigations are needed to confirm this more solidly.


10 The effect of the Go-Lab conclusion tool on students’ science learning

10.1 Abstract

The purpose of this study was to assess the effect of the Go-Lab Conclusion Tool on students’ science learning and inquiry skills. Two conditions were compared: a control condition, in which the Conclusion Tool was not included in the learning environment, and an experimental condition, in which the Conclusion Tool had been integrated into the learning environment. Before and after the intervention, students completed the same content knowledge test and the same inquiry skills test. Students’ post-test content knowledge scores in the experimental condition were marginally higher than those of the students in the control condition. Additionally, statistically significant differences between the pre- and post-test scores were found for “Identifying variables” and “Identifying and stating hypotheses” for both conditions. The Conclusion Tool appears to enhance students’ content knowledge. Since the tool allows students to gather in one place all the information they need to formulate their conclusions, it may make it easier for them to reach a valid conclusion. Further, the tool seems to prompt reflection on earlier learning products, which may imply a substantial meta-cognitive step before students can combine these products and formulate their conclusions. Future research should track student actions along the learning activities in the Conclusion phase to investigate whether learner pathways differ between the experimental and control conditions.

10.2 Introduction

The Conclusion phase has been identified as one of the phases involved in inquiry (Pedaste et al., 2015). In this phase, students reach conclusive statements concerning the research questions or hypotheses that are usually formulated at the beginning of an inquiry enactment, after some sort of investigation (e.g., exploration, experimentation) that results in the collection of appropriate data/evidence (Scanlon et al., 2011). In the case of an open research question, the Conclusion phase leads to specifying the relationship between the variables under study, whereas in the case of a hypothesis the Conclusion phase requires a conclusive remark, accompanied by the necessary evidence, as to whether the hypothesis is accepted or rejected.

Like most of the inquiry phases, the Conclusion phase is a challenging task to complete (van Joolingen & Zacharia, 2009; Zacharia et al., 2015). This difficulty arises because of the different factors that a student needs to consider before reaching conclusions.

Specifically, to reach conclusions a student needs to consider (a) the research question or hypothesis stated, (b) the data/evidence that emerged during the Investigation phase, usually through an experiment, and (c) the interpretations of the data, after they have been organised/represented and analysed (Pedaste et al., 2015). Research has highlighted a number of difficulties that students face during the Conclusion phase. For instance, students fail to consider all of the above together when synthesising their conclusions, or fail to consider all the evidence collected (for more details see Zacharia et al., 2015).

As one way to overcome the problems related to the Conclusion phase, researchers have advocated the provision of guidance, especially through the use of computer-supported inquiry learning environments, considering how difficult it is for a teacher to provide individual feedback and support to each student separately (Cho & Jonassen, 2012; Demetriadis et al., 2008; McNeill et al., 2006; Reiser et al., 2001; Veermans et al., 2006; Woolf et al., 2002; Zumbach, 2009). The literature of the domain shows that researchers have designed and developed different types of guidance, such as performance dashboards, prompts, heuristics and scaffolds, to support student learning when enacting inquiry in computer-supported learning environments. For instance, Woolf et al. (2002) developed the Final Case Review Tool to support students in the Conclusion phase. This tool allowed students to access a review of all their observations and tested hypotheses when creating their final report. McNeill et al. (2006) developed prompts in the form of statements (e.g., “Write a sentence that connects your evidence to your claim that…”) to support students when writing conclusive scientific explanations following the claim-evidence-reasoning structure, whereas Demetriadis et al. (2008) developed prompts in the form of questions, namely the observe prompt, the recall prompt and the conclude prompt, to support students in spotting the necessary information and accompanying it with proper reasoning for reaching solid conclusions. Both sets of prompts were found to have a positive effect on the quality of students’ conclusions. Veermans et al. (2006) referred to the Present evidence heuristic developed by Schoenfeld (1985) to support the Conclusion phase. This heuristic reminded students, when stating a conclusion about a certain hypothesis, to present evidence to support that conclusion. However, no empirical evidence was provided about the effectiveness of this particular heuristic.

In addition to prompts and heuristics, a number of scaffolds have been developed to support students when formulating conclusions. For example, Reiser et al. (2001) used in their BGuILE computer-supported learning environment a scaffold, namely the ExplanationConstructor tool of the investigation journal, which required students to directly connect their data and their explanations. Zumbach (2009) developed and used scaffolds that allowed students to represent their arguments (at a conclusion stage) with either a text editor tool or a graphical mind mapping tool. In the text editor tool, students were asked to classify pro and con arguments, whereas in the graphical mind mapping tool, students were asked to connect these arguments and mark them with “+” or “-”, respectively. The latter scaffold was also found to enhance students’ acquisition of knowledge.

Given all these conclusion-related tools and their positive impact on students’ formation of conclusions and learning, and given that studies examining the effect of conclusion-related tools on student learning are limited in number, we developed a conclusion tool in the context of the Go-Lab platform and aimed to examine its effectiveness. In particular, the purpose of this study was to assess the effect of the Go-Lab Conclusion Tool on secondary school students’ science learning and inquiry skills.
