
Measuring up: Canadian Results of the OECD PISA Study

The Performance of Canada’s Youth in Mathematics, Reading, Science and Problem Solving

2003 First Findings for Canadians Aged 15

Council of Ministers of Education, Canada
Human Resources and Skills Development Canada


Specific inquiries about this product and related statistics or services should be directed to: Client Services, Culture, Tourism and the Centre for Education Statistics, Statistics Canada, Ottawa, Ontario, K1A 0T6 (telephone: (613) 951-7608; toll free: 1 800 307-3382; fax: (613) 951-9040; or e-mail: educationstats@statcan.ca).

For information on the wide range of data available from Statistics Canada, you can contact us by calling one of our toll-free numbers. You can also contact us by e-mail or by visiting our Web site.

National inquiries line: 1 800 263-1136
National telecommunications device for the hearing impaired: 1 800 363-7629
Depository Services Program inquiries: 1 800 700-1033
Fax line for Depository Services Program: 1 800 889-9734

E-mail inquiries infostats@statcan.ca

Web site www.statcan.ca

Ordering and subscription information

This product, Catalogue no. 81-590-XPE, is published irregularly as a standard printed publication at a price of CDN $11.00 per issue. The following additional shipping charges apply for delivery outside Canada:

Single issue, United States: CDN $6.00
Single issue, other countries: CDN $11.00

All prices exclude sales taxes.

This publication is available electronically without charge through the internet at:

www.pisa.gc.ca

www.statcan.ca

www.cmec.ca

www.hrsdc.gc.ca

This product can be ordered by

Phone (Canada and the United States) 1 800 267-6677

Fax (Canada and the United States) 1 877 287-4369

E-mail order@statcan.ca

Mail: Statistics Canada, Dissemination Division, Circulation Management, 120 Parkdale Avenue, Ottawa, Ontario K1A 0T6

And in person at the Statistics Canada Reference Centre nearest you.

The paper used in this publication meets the minimum requirements of American National Standard for Information Sciences - Permanence of Paper for Printed Library Materials, ANSI Z39.48 - 1984.


Measuring up: Canadian Results of the OECD PISA Study

The Performance of Canada’s Youth in Mathematics, Reading, Science and Problem Solving

2003 First Findings for Canadians Aged 15

Authors

Patrick Bussière, Human Resources and Skills Development Canada
Fernando Cartwright, Statistics Canada

Tamara Knighton, Statistics Canada

Special Contributor

Todd Rogers, University of Alberta

Published by authority of the Minister responsible for Statistics Canada

© Minister of Industry, 2004

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without prior written permission from License Services, Marketing Division, Statistics Canada, Ottawa, Ontario, Canada K1A 0T6.

December 2004

Catalogue no. 81-590-XPE — No. 2 ISBN 0-660-19397-3

ISSN 1712-5464

Catalogue no. 81-590-XIE — No. 2 ISBN 0-662-38806-2

ISSN 1712-5472

Also available in French under the title: À la hauteur : Résultats canadiens de l’étude PISA de l’OCDE — La performance des jeunes du Canada en mathématiques, en lecture, en sciences et en résolution de problèmes — Premiers résultats de 2003 pour les Canadiens de 15 ans

Frequency: Occasional
Ottawa

Human Resources and Skills Development Canada, Council of Ministers of Education, Canada and Statistics Canada


Bussière, Patrick

The performance of Canada’s youth in mathematics, reading, science and problem solving : 2003 first findings for Canadians aged 15

(Measuring up : Canadian results of the OECD PISA study ; no. 2)
Issued also in French under title: La performance des jeunes du Canada en mathématiques, en lecture, en sciences et en résolution de problèmes : premiers résultats de 2003 pour les canadiens de 15 ans.

Available also on the Internet.

ISBN 0-660-19397-3 (paper)
ISBN 0-662-38806-2 (Internet)
CS81-590-XPE
CS81-590-XIE

1. High school students – Rating of – Canada.

2. Academic achievement – Canada – Statistics.

3. High school students – Rating of – Canada – Statistics.

4. High school students – Rating of – OECD countries – Statistics.

5. Educational evaluation – Canada – Statistics.

6. Programme for International Student Assessment.

I. Bussière, Patrick. II. Cartwright, Fernando. III. Knighton, Tamara. IV. Rogers, W. Todd. V. Statistics Canada. VI. Canada. Human Resources and Skills Development Canada. VII. Council of Ministers of Education (Canada). VIII. Series.

LB3054.C3 B87 2004 373.126’2’0971 C2004-988006-3


Acknowledgements

We would like to thank the students, parents, teachers and principals who gave of their time to participate in the 2003 OECD PISA study and the Youth in Transition Survey. The support for this Federal-Provincial collaborative project provided by members of the PISA-YITS Steering Committee and by the coordinators in each participating Ministry or Department of Education during all steps of the study is gratefully acknowledged. The dedication of the survey development, implementation, processing and methodology teams was essential to the project’s success and is appreciated.

This publication was prepared jointly by Statistics Canada, Human Resources and Skills Development Canada and the Council of Ministers of Education, Canada and was supported financially by Human Resources and Skills Development Canada.

The report has benefited from the input and comments of reviewers in provincial Ministries and Departments of Education; the Council of Ministers of Education, Canada; Human Resources and Skills Development Canada; and Statistics Canada. A very special thank you is extended to Danielle Baum for her indispensable help in preparing the manuscript for publication. The contribution of editorial, communications, translation and dissemination services staff of Statistics Canada, Human Resources and Skills Development Canada and the Council of Ministers of Education, Canada was essential to the project’s success and is appreciated.

Note of Appreciation

Canada owes the success of its statistical system to a long-standing partnership between Statistics Canada, the citizens of Canada, its businesses, governments and other institutions. Accurate and timely statistical information could not be produced without their continued cooperation and goodwill.


Table of contents

Acknowledgements

Introduction
    The Programme for International Student Assessment
    Why do PISA?
    Why did Canada participate?
    What is PISA 2003?
    Objectives and organization of the report

Chapter 1  The performance of Canadian students in mathematics in an international context
    Defining mathematics
    Canadian students performed well in mathematics
    Provincial results
    Mathematics skill levels
    Provincial variation in mathematics performance
    How does the performance of boys and girls compare?
    Achievement of Canadian students by language of the school system
    Comparison of mathematics performance in PISA 2003 and PISA 2000
    Summary

Chapter 2  The performance of Canadian students in reading, science and problem solving in an international context
    Defining reading, science and problem solving
    Canadian students performed well in reading, science and problem solving
    Provincial results
    How does the performance of boys and girls compare?
    Achievement of Canadian students by language of the school system
    Comparison of reading and science performance in PISA 2003 and 2000
    Summary

Chapter 3  The relationship between student engagement, student learning, and mathematics performance
    Engagement in mathematics
    Mathematics learning strategies and preferences for learning
    Summary

Chapter 4  The relationship between family characteristics, home environment, and mathematics performance
    Parental education, occupation, and student performance
    Socio-economic status and student performance
    Summary

Conclusion

Appendix A: PISA sampling procedures and response rates
    Table A1  PISA 2003 school and student response rates

Appendix B: Tables
    Chapter 1
        Table B1.1  Estimated average scores and confidence intervals for provinces and countries: COMBINED MATHEMATICS
        Table B1.2  Estimated average scores and confidence intervals for provinces and countries: MATHEMATICS SPACE AND SHAPE
        Table B1.3  Estimated average scores and confidence intervals for provinces and countries: MATHEMATICS CHANGE AND RELATIONSHIPS
        Table B1.4  Estimated average scores and confidence intervals for provinces and countries: MATHEMATICS QUANTITY
        Table B1.5  Estimated average scores and confidence intervals for provinces and countries: MATHEMATICS UNCERTAINTY
        Table B1.6  Variation in combined mathematics performance, Canada and the provinces
        Table B1.7  Percent of students at each level for provinces and countries: COMBINED MATHEMATICS
        Table B1.8  Gender differences by country and province: COMBINED MATHEMATICS
        Table B1.9  Gender differences by country and province: MATHEMATICS SPACE AND SHAPE
        Table B1.10  Gender differences by country and province: MATHEMATICS CHANGE AND RELATIONSHIPS
        Table B1.11  Gender differences by country and province: MATHEMATICS QUANTITY
        Table B1.12  Gender differences by country and province: MATHEMATICS UNCERTAINTY
    Chapter 2
        Table B2.1  Estimated average scores and confidence intervals for provinces and countries: READING
        Table B2.2  Estimated average scores and confidence intervals for provinces and countries: SCIENCE
        Table B2.3  Estimated average scores and confidence intervals for provinces and countries: PROBLEM SOLVING
        Table B2.4  Gender differences by country and province: READING
        Table B2.5  Gender differences by country and province: SCIENCE
        Table B2.6  Gender differences by country and province: PROBLEM SOLVING
    Chapter 3
        Table B3.1  Average score for indices of student engagement in mathematics: Canada and the provinces
        Table B3.2  Difference in mathematics performance between students with high mathematics engagement compared to students with low mathematics engagement, Canada and the provinces
        Table B3.3  Student engagement regression coefficients for females relative to males controlling for mathematics ability, Canada and the provinces
        Table B3.4  Average scores on indices of learning strategies and preferences for learning situations in mathematics, Canada and the provinces
        Table B3.5  Difference in mathematics performance between students with high levels of mathematics learning strategies and preferences for learning compared to students with low levels, Canada and the provinces
        Table B3.6  Average score for learning strategies and preferences for learning: low achievers versus high achievers, Canada and the provinces
    Chapter 4
        Table B4.1  Parental educational attainment, Canada and the provinces
        Table B4.2  Parental education and student performance in mathematics, Canada and the provinces
        Table B4.3  Distribution of parental education for Canadian students with high and low overall mathematics performance
        Table B4.4  Parental educational attainment and occupation, Canada
        Table B4.5  Parental occupation and student mathematics performance, Canada and the provinces
        Table B4.6  School SES and student performance in mathematics in Canada


Introduction

The Programme for International Student Assessment

The Programme for International Student Assessment (PISA) is a collaborative effort among member countries of the Organisation for Economic Co-operation and Development (OECD). PISA is designed to provide policy-oriented international indicators of the skills and knowledge of 15-year-old students1 and sheds light on a range of factors that contribute to successful students, schools, and education systems. PISA measures skills that are generally recognized as key outcomes of the educational process. They are not, however, the only expected outcomes nor are they solely acquired through education. The assessment focuses on young people’s ability to use their knowledge and skills to meet real life challenges. These skills are believed to be prerequisites to efficient learning in adulthood and for full participation in society.

PISA has brought significant public and educational attention to international assessment and studies by generating data to enhance the ability of policy makers to make decisions based on evidence. In Canada, PISA is carried out through a partnership consisting of Human Resources and Skills Development Canada, the Council of Ministers of Education Canada, and Statistics Canada.

PISA began in 2000 and focuses on 15-year-olds’ capabilities as they near the end of compulsory education. PISA reports on reading literacy, mathematical literacy, and scientific literacy every three years and provides a more detailed look at each domain in the years when it is the major focus. For example, mathematics was the major domain of PISA in 2003 and as such focused on both overall mathematical literacy and four mathematics sub-domains (space and shape, change and relationships, quantity, and uncertainty). Additionally, problem-solving skills were evaluated in PISA 2003. As minor domains in PISA 2003, only single measures of reading and science were available. On the other hand, more detailed information was available on reading and reading sub-domains in 2000, and more information will be available on science and science sub-domains in 2006.

The PISA Assessment Domains

PISA measures three domains: mathematical literacy, reading literacy, and scientific literacy. In addition, PISA 2003 measured problem-solving skills. The domains were defined as follows by international experts who agreed that the emphasis should be placed on functional knowledge and skills that allow active participation in society.

Mathematical literacy (hereafter referred to as mathematics):

An individual’s capacity to identify and understand the role that mathematics plays in the world, to make well- founded judgements and to use and engage with mathematics in ways that meet the needs of that individual’s life as a constructive, concerned and reflective citizen.

Reading literacy (hereafter referred to as reading):

An individual’s capacity to understand, use and reflect on written texts, in order to achieve one’s goals, to develop one’s knowledge and potential and to participate in society.


Scientific literacy (hereafter referred to as science):

An individual’s capacity to use scientific knowledge, to identify questions and to draw evidence-based conclusions in order to understand and help make decisions about the natural world and the changes made to it through human activity.

Problem-solving skills (hereafter referred to as problem solving):

An individual’s capacity to use cognitive processes to confront and resolve real, cross-disciplinary situations where the solution path is not immediately obvious and where the literacy domains or curricular areas that might be applicable are not within a single domain of mathematics, science or reading.

Education systems play a key role in generating the new supply of skills to meet this demand. The skills acquired by the end of compulsory schooling provide the essential foundation upon which we will develop the human capital needed to meet the economic and social challenges of the future. For more information, please refer to the PISA 2003 assessment framework.

Why do PISA?

The skills and knowledge that individuals bring to their jobs, to further studies, and to our society, play an important role in determining our economic success and our overall quality of life. The importance of skills and knowledge is expected to continue to grow. The shift from manufacturing to knowledge and information intensive service industries, advances in communication and production technologies, the wide diffusion of information technologies, falling trade barriers, and the globalization of financial markets and markets for products and services have precipitated changes in the skills the present and future economy requires. These include a rising demand for a strong set of foundation skills upon which further learning builds.

Elementary and secondary education systems play a central role in laying a solid base upon which subsequent knowledge and skills can be developed.

Students leaving secondary education without a strong foundation may experience difficulty accessing the postsecondary education system and the labour market and they may benefit less when learning opportunities are presented later in life. Without the tools needed to be effective learners throughout their lives, these individuals, with limited skills, risk economic and social marginalization.

Governments in industrialized countries have devoted large portions of their budgets to provide high quality universal elementary and secondary schooling.

Despite these investments, there is concern among these governments about the relative effectiveness of their education systems. To address these issues, member governments of the Organisation for Economic Co-operation and Development (OECD) developed a common tool to improve their understanding of what makes young people—and education systems as a whole—successful. This tool is the Programme for International Student Assessment (PISA).

Information gathered through PISA enables a thorough comparative analysis of the skill level of students near the end of their compulsory education.

PISA also permits exploration of the ways that skills vary across different social and economic groups and the factors that influence the level and distribution of skills within and among countries.

Why did Canada participate?

Canada’s participation in the PISA study stems from many of the same concerns as have been expressed by other participating countries. Canada invests significant public resources in the provision of elementary and secondary education. Canadians are concerned about the quality of education provided to their youth by elementary and secondary schools. How can expenditures be directed to achieve higher levels of skills upon which lifelong learning is founded, and to potentially reduce social inequality in life outcomes?

Canada’s economy is also evolving rapidly. For the past two decades, the growth rate of knowledge-intensive occupations has been twice that of other occupations.2 Even employees in traditional occupations are expected to upgrade their skills to meet the rising demands of new organisational structures and production technologies. Elementary and secondary education systems play a key role in generating the new supply of skills to meet this demand. The skills acquired by the end of compulsory schooling provide individuals with the essential foundation necessary to further develop human capital.

Questions about educational effectiveness can be partly answered with data on the average performance of Canada’s youth. However, two other questions can only be answered by examining the distribution of skills:

Who are the students at the lowest levels? Do certain groups or regions appear to be at greater risk? These are important questions because, among other things, skill acquisition during compulsory schooling influences access to postsecondary education, eventual success in the labour market, and the effectiveness of continuous, lifelong learning.

What is PISA 2003?

Forty-one countries participated in PISA 2003, including all 30 OECD countries3. Between 5,000 and 10,000 students aged 15 from at least 150 schools were typically tested in each country. In Canada, approximately 28,000 15-year-olds from about 1,000 schools participated across the ten provinces4. The large Canadian sample was required to produce reliable estimates representative of each province, and for both French and English language school systems in Nova Scotia, New Brunswick, Quebec, Ontario, and Manitoba.

The 2003 PISA assessment was administered in schools, during regular school hours, in April and May 2003. The assessment was a paper-and-pencil test lasting a total of two hours. Students also completed a 20-minute student background questionnaire providing information about themselves and their home, and a 10-minute questionnaire on information technology and communications, while school principals were asked to complete a 20-minute questionnaire about their schools.

As part of PISA, national options could also be implemented. Canada chose to add a 20-minute student questionnaire from the Youth in Transition Survey in order to collect more information on 15-year-olds’ school experiences, work activities and relationships with others.

Additionally, a 30-minute interview was conducted with parents.

Box 1
Overview of PISA 2003

Participating countries/provinces
    International: 41 countries
    Canada: 10 provinces

Population
    International: Youth aged 15
    Canada: Same; youth born in 1987

Number of participating students
    International: Between 5,000 and 10,000 per country, with some exceptions, for a total of close to 272,000 students
    Canada: Approximately 28,000 students

Domains
    International: Major: mathematics; minor: reading, science and problem solving
    Canada: Same

Amount of testing time devoted to domains
    International: 390 minutes of testing material organized into different combinations of test booklets 120 minutes in length; 210 minutes devoted to mathematics; 60 minutes each devoted to reading, science and problem solving
    Canada: Same

Languages in which the test was administered
    International: 32 languages
    Canada: English and French

International assessment
    International: Two hours of direct skills assessment through mathematics, reading, science and problem solving; a twenty-minute contextual questionnaire administered to youth; a school questionnaire administered to school principals
    Canada: Same

International options
    International: Ten-minute optional questionnaire on information technology and communications administered to students; ten-minute optional questionnaire on educational career administered to students
    Canada: Ten-minute optional questionnaire on information technology administered to students

National options
    International: Grade-based assessment; other options were undertaken in a limited number of countries
    Canada: Twenty-minute questionnaire on school experiences, work activities and relationships with others administered to students; thirty-minute interview with parents to collect detailed information on youths’ school experiences, parental education and occupation, labour market participation and household income


Objectives and organization of the report

This report provides the first pan-Canadian results of the 2003 PISA assessment of mathematics, reading, science, and problem solving. The information is presented at the national and provincial levels in order to complement the information presented in “Learning for Tomorrow’s World – First Results from PISA 2003”5. Wherever possible, an attempt has been made to put results into context through comparisons to student peers, internationally and within Canada.

Chapter 1 provides information on the relative performance of Canadian 15-year-old students on the 2003 PISA assessment in mathematics. It looks at the average level of performance on the overall mathematics scale as well as the four mathematics sub-domains, the distribution of achievement scores and proficiency levels in mathematics, gender differences, the differences between English-language and French-language school systems, and comparisons with PISA 2000. Chapter 2 presents information on the mean performance of Canadian students in reading, science and problem solving. Chapters 3 and 4 use PISA 2003 data to explore two themes related to mathematics performance. In Chapter 3, the relationship between student engagement in mathematics, student learning and mathematics performance is explored. Chapter 4 examines the impact of student socio-economic background on mathematics performance. Finally, the major findings and opportunities for further study are discussed in the conclusion.

Notes

1. OECD (1999), Measuring Student Knowledge and Skills: A New Framework for Assessment, Paris.

2. Lavoie, Marie and Roy, Richard (June 1998). Employment in the Knowledge-Based Economy: A Growth Accounting Exercise for Canada, Ottawa: HRDC Applied Research Branch Research Papers Series R-98-8E.

3. OECD countries include Australia, Austria, Belgium, Canada, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Japan, Korea, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Slovak Republic, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States. Partner countries include Brazil, Hong Kong-China, Indonesia, Latvia, Liechtenstein, Macao-China, Russian Federation, Serbia and Montenegro (Ser.), Thailand, Tunisia, Uruguay. Although the United Kingdom participated in PISA 2003, technical problems with its sample prevent its results from being discussed here.

4. No data were collected in the three territories and on Indian Reserves.

5. OECD (2004), Learning for Tomorrow’s World – First Results from PISA 2003, Paris.


Chapter 1

The performance of Canadian students in mathematics in an international context

This chapter compares the Canadian results of the PISA 2003 assessment in terms of average scores and proficiency levels. First, the performance of Canadian 15-year-old students is compared to the performance of 15-year-old students from the other countries that participated in PISA 2003. Second, the results of students’ performance in the ten Canadian provinces are analyzed. Third, the performance of boys and the performance of girls are compared for Canada and the provinces. Fourth, the performance of students enrolled in English-language and French-language school systems is compared for the five provinces in which the two groups were sampled separately. Finally, the results of PISA 2003 are compared with those of PISA 2000.

Defining mathematics

Mathematics performance as measured by PISA involves more than the ability to perform arithmetic computations. The assessment items also emphasized mathematical knowledge put to functional use in a variety of situations and contexts. This emphasis is reflected in the PISA definition of mathematics:

An individual’s capacity to identify and understand the role that mathematics plays in the world, to make well-founded judgements and to use and engage with mathematics in ways that meet the needs of that individual’s life as a constructive, concerned and reflective citizen.

Mathematics results are presented not only in terms of students’ overall mathematics performance but also for four mathematics sub-domains. These sub-domains are defined in terms of four content areas that cover the range of mathematics 15-year-old students need as a foundation for life. The OECD defined the four content areas for mathematics as follows:

Space and shape relates to spatial and geometric phenomena and relationships, drawing on the discipline of geometry. It requires looking for similarities and differences when analysing the components of shapes, recognising shapes in different representations and different dimensions as well as understanding the properties of objects and their relative positions.

Change and relationships involves mathematical manifestations of change as well as functional relationships and dependency among variables. It relates most closely to algebra. Mathematical relationships often take the shape of equations or inequalities, but relationships of a more general nature (e.g., equivalence, divisibility, inclusion) are relevant as well. Relationships are given a variety of different representations, including symbolic, algebraic, graphical, tabular, and geometrical representations. Since different representations may serve different purposes and have different properties, translation between representations is often of key importance in dealing with situations and tasks.

Quantity involves numeric phenomena as well as quantitative relationships and patterns. It relates to the understanding of relative size, the recognition of numerical patterns, and the use of numbers to represent quantities and quantifiable attributes of real-world objects (counts and measures). Furthermore, quantity deals with the processing and understanding of numbers that are represented in various ways. An important aspect of dealing with quantity is also quantitative reasoning, which involves number sense, representing numbers, understanding the meaning of operations, mental arithmetic, and estimating. The most common curricular branch of mathematics with which it is associated is arithmetic.

Uncertainty involves probabilistic and statistical phenomena and relationships that become increasingly relevant in the information society.

These phenomena are the subject of mathematical study in statistics and probability.

The mathematics scores are expressed on a scale with an average of 500 points for the OECD countries6 and about two-thirds of the students scoring between 400 and 600 (i.e. a standard deviation of 100).

While PISA is not a test of curriculum, the points on the mathematics scale can be interpreted in the context of the school environment. For example, 26 of the 30 OECD countries that participated in PISA 2003 had a sizable number of 15-year-olds in the sample who were enrolled in at least two different, but consecutive grades.

For these 26 countries, the OECD analyses revealed that one additional school year corresponds to an increase of 41 score points on the PISA combined mathematics scale7. For Canada, the OECD analyses revealed that one additional school year corresponds to an increase of 53 score points on the combined mathematics scale.
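To illustrate how these scale figures fit together, the short sketch below (an illustrative calculation only, not part of the PISA methodology) uses a normal approximation with mean 500 and standard deviation 100 to show why roughly two-thirds of students score between 400 and 600, and converts a score gap into approximate school-year equivalents using the 41-point (OECD) and 53-point (Canada) per-grade differences quoted above. The function name and the example 20-point gap are hypothetical.

    from statistics import NormalDist

    # Normal approximation to the PISA combined mathematics scale
    # (OECD mean 500, standard deviation 100), used for illustration only.
    scale = NormalDist(mu=500, sigma=100)

    # Share of students expected to score between 400 and 600:
    # one standard deviation either side of the mean, about two-thirds.
    share_400_600 = scale.cdf(600) - scale.cdf(400)
    print(f"Share between 400 and 600: {share_400_600:.2f}")  # about 0.68

    # Translate a score gap into rough school-year equivalents using the
    # per-grade differences reported above (41 points OECD-wide, 53 in Canada).
    def year_equivalents(score_gap: float, points_per_year: float) -> float:
        return score_gap / points_per_year

    print(round(year_equivalents(20, 41), 2))  # a 20-point gap is about 0.49 of a school year (OECD basis)
    print(round(year_equivalents(20, 53), 2))  # about 0.38 of a school year (Canada basis)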

One way to summarize student performance and to compare the relative standing of countries is to examine their average test scores. However, simply ranking countries by their average scores can be misleading because each average carries a margin of error. This margin of error must be taken into account when judging whether apparent differences in average scores are statistically meaningful. See the text box ‘A note on statistical comparisons’.

A note on statistical comparisons

The averages were computed from the scores of random samples of students from each country and not from the population of students in each country.

Consequently, it cannot be said with certainty that a sample average has the same value as a population average that would have been obtained had all 15-year-old students been assessed. Additionally, a degree of error is associated with the scores describing student skills, as these scores are estimated based on student responses to test items. We use a statistic, called the standard error, to express the degree of uncertainty associated with sampling error and measurement error.

The standard error can be used to construct a confidence interval, which provides a means of making inferences about the population means and proportions in a manner that reflects the uncertainty associated with sample estimates. A 95% confidence interval is used in this report and represents a range of plus or minus about two standard errors around the sample average. Using this confidence interval it can be inferred that the population mean or proportion would lie within the confidence interval in 95 out of 100 replications of the measurement, using different samples randomly drawn from the same population.

When comparing scores among countries, provinces, or population subgroups the degree of error in each average should be considered in order to determine if averages are different from each other.

Standard errors and confidence intervals may be used as the basis for performing these comparative statistical tests. Such tests can identify, with a known probability, whether there are actual differences in the populations being compared.

For example, when an observed difference is significant at the 0.05 level, it implies that the probability is less than 0.05 that the observed difference could have occurred because of sampling and measurement error. When comparing countries and provinces, extensive use is made of this type of test to reduce the likelihood that differences due to sampling and measurement errors will be interpreted as real.

Only statistically significant differences at the 0.05 level are noted in this report, unless otherwise stated. This means that the 95% confidence intervals for the averages being compared do not overlap. Due to rounding error, some non-overlapping confidence intervals share an upper or lower limit. All statistical differences are based on un-rounded data.
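The comparison rule described in this box can be made concrete with a minimal sketch. The figures and helper names below are hypothetical and purely illustrative; the report’s own comparisons were carried out on the full survey estimates. A 95% confidence interval is built as the estimate plus or minus about two standard errors (1.96 is used here), and two averages are treated as different only when their intervals do not overlap.

    # Minimal sketch of the comparison rule described above. All figures are
    # hypothetical; they are not estimates from the report.
    def confidence_interval(mean: float, std_error: float, z: float = 1.96):
        """Approximate 95% confidence interval: estimate +/- about two standard errors."""
        margin = z * std_error
        return mean - margin, mean + margin

    def significantly_different(mean_a: float, se_a: float, mean_b: float, se_b: float) -> bool:
        """True only when the two 95% confidence intervals do not overlap."""
        low_a, high_a = confidence_interval(mean_a, se_a)
        low_b, high_b = confidence_interval(mean_b, se_b)
        return high_a < low_b or high_b < low_a

    # Example: two hypothetical country averages and their standard errors.
    print(confidence_interval(532.0, 1.8))                  # roughly (528.5, 535.5)
    print(significantly_different(532.0, 1.8, 527.0, 2.1))  # False: intervals overlap
    print(significantly_different(544.0, 1.9, 527.0, 2.1))  # True: intervals are disjoint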


Canadian students performed well in mathematics

Overall, Canadian students performed well in mathematics, as illustrated in Figures 1.1 to 1.5. Listed in Table 1.1 are the countries that performed significantly better than Canada or equally as well as Canada on the combined mathematics scale as well as the four mathematics sub-domains. The average scores of students in the remaining countries that took part in PISA 2003 were statistically below that of Canada. Among 41 countries, only two countries performed better than Canada on the combined mathematics scale.

Table 1.1
Countries performing better than or about the same as Canada

Mathematics – combined scale
    Countries performing significantly better* than Canada: Hong Kong-China, Finland
    Countries performing as well* as Canada: Korea, Netherlands, Liechtenstein, Japan, Belgium, Macao-China, Switzerland

Mathematics – space and shape
    Countries performing significantly better* than Canada: Hong Kong-China, Japan, Korea, Switzerland, Finland, Liechtenstein, Belgium, Macao-China
    Countries performing as well* as Canada: Czech Republic, Netherlands, New Zealand, Australia, Austria, Denmark

Mathematics – change and relationships
    Countries performing significantly better* than Canada: Netherlands, Korea
    Countries performing as well* as Canada: Finland, Hong Kong-China, Liechtenstein, Japan, Belgium

Mathematics – quantity
    Countries performing significantly better* than Canada: Finland, Hong Kong-China
    Countries performing as well* as Canada: Korea, Liechtenstein, Macao-China, Switzerland, Belgium, Netherlands, Czech Republic, Japan

Mathematics – uncertainty
    Countries performing significantly better* than Canada: Hong Kong-China
    Countries performing as well* as Canada: Netherlands, Finland, Korea

* Differences in scores are statistically significant only when confidence intervals do not overlap. Countries performing about the same as Canada have a confidence interval that overlaps that of Canada’s.

Canadian students also performed well in the mathematics sub-domains (Figure 1.2 to Figure 1.5; Table 1.1). Only one country performed significantly better than Canada in the uncertainty sub-domain, while students from two countries performed significantly better than Canadian students in the quantity and the change and relationships sub-domains. Eight countries performed significantly better than Canadian students in the space and shape sub-domain.

Further examination of the performance of Canadian students in the four mathematics sub-domains provides insight into their relative strengths and weaknesses. A comparison of Canada’s relative performance across the four sub-domains shows that the strengths of Canada’s 15-year-old students are in the areas of change and relationships, quantity and uncertainty, while their relative weakness is in the area of space and shape.

Figure 1.1
Estimated average scores and confidence intervals for provinces and countries: COMBINED MATHEMATICS

[Chart: estimated average scores with 95% confidence intervals for each participating country and Canadian province. Note: The OECD average is 500 with a standard error of 0.6.]

Figure 1.2
Estimated average scores and confidence intervals for provinces and countries: MATHEMATICS space and shape

[Chart: estimated average scores with 95% confidence intervals for each participating country and Canadian province. Note: The OECD average is 496 with a standard error of 0.7.]

Figure 1.3
Estimated average scores and confidence intervals for provinces and countries: MATHEMATICS change and relationships

[Chart: estimated average scores with 95% confidence intervals for each participating country and Canadian province. Note: The OECD average is 499 with a standard error of 0.7.]

Figure 1.4
Estimated average scores and confidence intervals for provinces and countries: MATHEMATICS quantity

[Chart: estimated average scores with 95% confidence intervals for each participating country and Canadian province. Note: The OECD average is 501 with a standard error of 0.6.]

Figure 1.5
Estimated average scores and confidence intervals for provinces and countries: MATHEMATICS uncertainty

[Chart: estimated average scores with 95% confidence intervals for each participating country and Canadian province. Note: The OECD average is 502 with a standard error of 0.6.]


Provincial results

Most provinces performed well in mathematics (Figures 1.1 to 1.5). All provinces performed at or above the OECD mean on the combined mathematics scale and the mathematics sub-domains, with one exception: Prince Edward Island performed below the OECD mean in the space and shape sub-domain. Several provinces performed as well as the top-ranked countries. For example, on the combined mathematics scale the performance of students in Alberta, Quebec and British Columbia compared favourably with the performance of students in Hong Kong-China.

A note on interpreting provincial differences

Although PISA measures skills beyond the school curriculum, most mathematics skills are learned in school. Therefore, students in higher grades may have an advantage in mathematics simply because they have been exposed to more advanced topics. The figure below illustrates the differences in performance between 15-year-old Canadian students in grades 9, 10 and 11 who had not repeated any grades. As expected, the performance of students increased with increasing grade level, although there is substantial overlap among the grades.

Most students born in 1987 were in grade 10 in 2003. However, provincial educational policies on age of enrolment and grade repetition result in differences among the proportions of 15-year-olds enrolled in higher or lower grades. Quebec, for example, has a higher proportion of students from the 1987 cohort in grade 9 than other provinces. Interpretation of provincial differences in performance should consider that this report describes the performance of all 15-year-olds, as is the intent of PISA, and not the performance of 15-year-olds by grade.

[Chart: Distribution of overall mathematics score by grade level, Canadian 15-year-olds. Separate distributions are shown for grades 9, 10 and 11; horizontal axis: PISA overall mathematics score (200 to 800); vertical axis: proportion of students (0.00 to 0.06).]

Provinces generally fall into one of three groups when compared to the Canadian averages (Table 1.2).

The average performance of students in Alberta was significantly above the Canadian average for combined mathematics and the four mathematics sub-domains.

Students in British Columbia, Manitoba, Quebec, and Ontario performed about the same as the Canadian average, with one exception: students in British Columbia performed above the Canadian average in uncertainty.

Students in Newfoundland and Labrador, Saskatchewan, Nova Scotia, New Brunswick and Prince Edward Island performed significantly lower than the Canadian average across all mathematics scales.

Table 1.2
Provincial results in mathematics in relation to the Canadian average

Mathematics – combined scale
    Performing significantly better* than the Canadian average: Alberta
    Performing as well* as the Canadian average: Quebec, Ontario, Manitoba, British Columbia
    Performing significantly lower* than the Canadian average: Newfoundland and Labrador, Prince Edward Island, Nova Scotia, New Brunswick, Saskatchewan

Mathematics – space and shape
    Performing significantly better* than the Canadian average: Alberta
    Performing as well* as the Canadian average: Quebec, Ontario, Manitoba, British Columbia
    Performing significantly lower* than the Canadian average: Newfoundland and Labrador, Prince Edward Island, Nova Scotia, New Brunswick, Saskatchewan

Mathematics – change and relationships
    Performing significantly better* than the Canadian average: Alberta
    Performing as well* as the Canadian average: Quebec, Ontario, Manitoba, British Columbia
    Performing significantly lower* than the Canadian average: Newfoundland and Labrador, Prince Edward Island, Nova Scotia, New Brunswick, Saskatchewan

Mathematics – quantity
    Performing significantly better* than the Canadian average: Alberta
    Performing as well* as the Canadian average: Quebec, Ontario, Manitoba, British Columbia
    Performing significantly lower* than the Canadian average: Newfoundland and Labrador, Prince Edward Island, Nova Scotia, New Brunswick, Saskatchewan

Mathematics – uncertainty
    Performing significantly better* than the Canadian average: Alberta, British Columbia
    Performing as well* as the Canadian average: Quebec, Ontario, Manitoba
    Performing significantly lower* than the Canadian average: Newfoundland and Labrador, Prince Edward Island, Nova Scotia, New Brunswick, Saskatchewan

* Differences in scores are statistically significant only when confidence intervals do not overlap. Provinces performing about the same as Canada have a confidence interval that overlaps with that of Canada’s. Provinces within each cell are ordered from east to west.

Mathematics skill levels

The average scores reported in the previous section provide a useful but limited way of comparing performance of different groups of students. Another way to look at performance is to examine the proportions of students who can accomplish tasks at various proficiency or skill levels. This kind of analysis allows a further breakdown of average scores and an examination of groups of students who show similar abilities. In PISA, mathematics skill is a continuum – that is, mathematics skill is not something a student has or does not have, but rather every 15-year-old shows a certain level of mathematics skill. The mathematics skill or proficiency levels used in PISA 2003 are described in the text box ‘Mathematics proficiency levels’.

Figure 1.6 (based on data from Table B1.7) shows the distribution of students by skill level by country, and includes the Canadian provinces. Results for countries and provinces are presented in descending order according to the proportion of the 15-year-olds who performed at level 2 or higher.


Mathematics proficiency levels

Mathematics achievement was divided into six proficiency levels representing a group of tasks of increasing difficulty, with Level 6 as the highest and Level 1 as the lowest. Students performing below Level 1 (mathematics score below 359) are not able to show routinely the most basic type of knowledge and skills that PISA seeks to measure. Such students have serious difficulties in using mathematical literacy as a tool to advance their knowledge and skills in other areas. Placement at this level does not mean that these students have no mathematics skills. Most of these students are able to correctly complete some of the PISA items. Their pattern of responses to the assessment is such that they would be expected to solve less than half of the tasks from a test composed of only Level 1 items.

In PISA, students were assigned to a proficiency level based on their probability of answering correctly the majority of items in that range of difficulty. A student at a given level could be assumed to be able to correctly answer questions at all lower levels. To help in interpretation, these levels were linked to specific score ranges on the original scale. Below is a description of the abilities associated with each proficiency level. (Source: Organisation for Economic Cooperation and Development, Programme for International Student Assessment, PISA, 2003).

Level 6 (score above 668)

At Level 6 students can conceptualise, generalise, and utilise information based on their investigations, and modelling of complex problem situations. They can link different information sources and representations and flexibly translate among them. Students at this level are capable of advanced mathematical thinking and reasoning. These students can apply this insight and understanding along with a mastery of symbolic and formal mathematical operations and relationships to develop new approaches and strategies for attacking novel situations. Students at this level can formulate and precisely communicate their actions and reflections regarding their findings, interpretations, arguments, and the appropriateness of these to the original situations.

Level 5 (score from 607 to 668)

At Level 5 students can develop and work with models for complex situations, identifying constraints and specifying assumptions. They can select, compare, and evaluate appropriate problem solving strategies for dealing with complex problems related to these models. Students at this level can work strategically using broad, well-developed thinking and reasoning skills, appropriate linked representations, symbolic and formal characterisations, and insight pertaining to these situations. They can reflect on their actions and formulate and communicate their interpretations and reasoning.

Level 4 (score from 545 to 606)

At Level 4 students can work effectively with explicit models for complex concrete situations that may involve constraints or call for making assumptions. They can select and integrate different representations, including symbolic ones, linking them directly to aspects of real-world situations. Students at this level can utilise well-developed skills and reason flexibly, with some insight, in these contexts. They can construct and communicate explanations and arguments based on their interpretations, arguments, and actions.

Level 3 (score from 483 to 544)

At Level 3 students can execute clearly described procedures, including those that require sequential decisions. They can select and apply simple problem-solving strategies. Students at this level can interpret and use representations based on different information sources and reason directly from them. They can develop short communications reporting their interpretations, results, and reasoning.

Level 2 (score from 421 to 482)

At Level 2 students can interpret and recognise situations in contexts that require no more than direct inference. They can extract relevant information from a single source and make use of a single representational mode. Students at this level can employ basic algorithms, formulae, procedures, or conventions. They are capable of direct reasoning and of making literal interpretations of the results.

Level 1 (score from 359 to 420)

At Level 1 students can answer questions involving familiar contexts where all relevant information is present and the questions are clearly defined. They are able to identify information and to carry out routine procedures according to direct instructions in explicit situations. They can perform actions that are obvious and follow immediately from the given stimuli.
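Because each proficiency level corresponds to a published score range on the mathematics scale, the mapping from a scale score to a reported level can be summarized in a short sketch. This is only an illustration of the cut-points listed above; the operational PISA procedure assigns levels from students’ response probabilities, and the function name below is hypothetical.

    # Illustrative mapping from a PISA 2003 mathematics scale score to a
    # proficiency level, using the published cut-points listed above.
    def mathematics_proficiency_level(score: float) -> str:
        if score < 359:
            return "Below Level 1"
        if score <= 420:
            return "Level 1"
        if score <= 482:
            return "Level 2"
        if score <= 544:
            return "Level 3"
        if score <= 606:
            return "Level 4"
        if score <= 668:
            return "Level 5"
        return "Level 6"

    print(mathematics_proficiency_level(500))  # Level 3 (scores from 483 to 544)
    print(mathematics_proficiency_level(610))  # Level 5 (scores from 607 to 668)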

Figure 1.6
Percentage of students at each level of proficiency on the combined mathematics scale

[Chart: stacked bars showing the percentage of students below Level 1 and at Levels 1 through 6 for each country and Canadian province, presented in descending order of the proportion of 15-year-olds at Level 2 or above: Finland, Alberta, British Columbia, Korea, Ontario, Canada, Hong Kong-China, Netherlands, Manitoba, Quebec, Macao-China, Liechtenstein, Newfoundland and Labrador, Japan, Nova Scotia, Saskatchewan, New Brunswick, Australia, Switzerland, Iceland, New Zealand, Denmark, Belgium, Czech Republic, France, Ireland, Sweden, Prince Edward Island, Austria, Slovak Republic, Norway, Germany, Luxembourg, Poland, Spain, Hungary, Latvia, United States, Portugal, Russian Federation, Italy, Greece, Serbia and Montenegro (Ser.), Uruguay, Turkey, Thailand, Mexico, Brazil, Tunisia, Indonesia.]


Using these proficiency levels, students with very high and very low levels of proficiency can be identified.

Listed in Table 1.3 are the percentages of students who performed at Level 1 or below and the percentages of students who performed at Level 5 or above for each country and the ten provinces.

The lower group includes students who would have great difficulty continuing their studies in mathematics and carrying out daily life activities that involve applying mathematics skills. In contrast, students in the upper group are likely to be well prepared for both.

Compared to the OECD average, a significantly smaller proportion of Canadian students performed at Level 1 or below in mathematics. The Canadian proportion at Level 1 or below was approximately half the proportion for the OECD average (10% versus 21%, respectively). Only Finland had a significantly smaller proportion of students at Level 1 or below than Canada.

In contrast, a significantly higher proportion of Canadian students performed at Level 5 or above in mathematics. The OECD average was approximately 15%, five percentage points lower than the average for Canada. Four countries (Hong Kong-China, Belgium, Liechtenstein and the Netherlands) had significantly greater percentages of students with higher skills than Canada.

Turning to the provinces, the percentages of students who performed at Level 1 or below on the combined mathematics scale were, with the exception of New Brunswick and Prince Edward Island, similar to the percentage for Canada. The percentage of students in New Brunswick performing at Level 1 or below (14%) was significantly higher than the Canadian percentage but lower than the OECD average. The percentage of students in Prince Edward Island performing at Level 1 or below (18%) was significantly higher than the Canadian percentage and statistically the same as the OECD average.

The proportion of students in Alberta at Level 5 or above (27%) was significantly greater than the Canadian percentage (20%). The proportions of students in Quebec, British Columbia, Manitoba, and Ontario who performed at Level 5 or higher were comparable to the proportion for Canada.

Lower percentages of students in Newfoundland and Labrador, Prince Edward Island, Nova Scotia, New Brunswick and Saskatchewan performed at Level 5 or above compared to the Canadian percentage (Table 1.3).

However, with the exception of Prince Edward Island, the provincial percentages were statistically the same as the OECD average.
