
Students' questions play an important role in meaningful learning and scientific inquiry. They are a potential resource for both teaching and learning science. Despite the capacity of students' questions for enhancing learning, much of this potential still remains untapped. The purpose of this paper, therefore, is to examine and review the existing research on students' questions and to explore ways of advancing future work into this area. The paper begins by highlighting the importance and role of students' questions from the perspectives of both the learner and the teacher. It then reviews the empirical research on students' questions, with a focus on four areas: (1) the nature and types of these questions; (2) the effects of teaching students questioning skills; (3) the relationship between students' questions and selected variables; and (4) teachers' responses to, and students' perceptions of, students' questions. Following this, some issues and implications of students' questions for classroom instruction are discussed. The paper concludes by suggesting several areas for future research that have significant value for student learning.

Questioning is an integral part of meaningful learning and scientific inquiry. The formulation of a good question is a creative act, and at the heart of what doing science is all about. As Cuccio‐Schirripa and Steiner (2000) have stated, ‘Questioning is one of the thinking processing skills which is structurally embedded in the thinking operation of critical thinking, creative thinking, and problem solving’ (p. 210). Moreover, as we will show, students' questions play an important role in the learning process as they are a potential resource for both teaching and learning science.

Despite the capacity of students' questions for enhancing learning, much of this potential still remains untapped. Observational studies of classrooms by Dillon (1988) and of tutoring sessions by Graesser and Person (1994) found that students asked few questions, and even fewer in search of knowledge. As grade level increases, students ask fewer ‘on‐task attention' questions (Good, Slavings, Harel, & Emerson, 1987, p. 186), that is, questions that relate to the immediate task and draw attention to the asker. This decline probably occurs because students do not want to call attention to themselves or because teachers often do not encourage them to ask questions. Also, few students spontaneously ask high‐quality thinking or cognitive questions (Carr, 1998; White & Gunstone, 1992, p. 170), with most questions being factual, procedural, or closed in nature.

In recent years, however, there has been an increasing emphasis on the important role that language, discourse, and argumentation play in both the personal and social construction of scientific knowledge (e.g. Duschl & Osborne, 2002; Lemke, 1990). At the same time, there has also been a growing interest in the role of students' questions in learning science as questions are an essential component of discursive activity and dialectical thinking. A key, if not central, feature of scientific discourse is the role of questioning in eliciting explanations, postulating theories, evaluating evidence, justifying reasoning, and clarifying doubts. Put simply, the act of questioning encourages learners to engage in critical reasoning. Given that asking questions is fundamental to science and scientific inquiry, the development of students' abilities to ask questions, reason, problem‐solve, and think critically should, likewise, become a central focus of current science education reform (Zoller, Tsaparlis, Fatsow, & Lubezky, 1997).

The purpose of this paper, therefore, is to examine and review the existing research on students' questions and to explore ways of advancing future work into this area. In organising this review, we have structured it into four main sections. In the first section, we begin by highlighting the importance and role of students' questions in learning and teaching science. In doing this, we examine students' questions from the perspectives of both the learner and the teacher. We make the case for, and explain how, students' questions not only help students in the learning process, but also serve useful functions as a pedagogical tool for the teacher. Next, we review the empirical research that has been carried out on students' questions and consider the significance of the findings. This section is further divided into four sub‐sections, with each one addressing a different aspect of the research that has been carried out to date. Drawing on the findings of this research, we then discuss some issues and implications of students' questions for classroom instruction in the third section. Following this, in the fourth section, we conclude by giving some suggestions for future research that can be conducted in this area. We end the paper by summarising the salient points raised in the earlier sections. Our fundamental case, though, is that this is an undervalued domain of inquiry among the research community, which has significant value for student learning.

First and foremost, questions from students indicate that they have been thinking about the ideas presented and have been trying to link them with other things they know. The source of students' questions is a gap or discrepancy in the students' knowledge or a desire to extend their knowledge in some direction. The questions may stem from curiosity about the world around us, as well as from everyday events and interactions with real‐world issues. Students' questions may be triggered by unknown words or by inconsistencies between the students' knowledge and new information, which then engender cognitive dissonance (Festinger, 1957) or ‘epistemic curiosity' (Berlyne, 1954).

The value of students' questions has been emphasised by several authors (e.g. Biddulph, Symington, & Osborne, 1986; Fisher, 1990; Penick, Crow, & Bonnstetter, 1996). Questions raised by students activate their prior knowledge, focus their learning efforts, and help them elaborate on their knowledge (Schmidt, 1993). The act of ‘composing questions' focuses the attention of students on content, main ideas, and checking whether content is understood (Rosenshine, Meister, & Chapman, 1996). The ability to ask good thinking questions is also an important component of scientific literacy, where the goal of making individuals critical consumers of scientific knowledge (Millar & Osborne, 1998) requires such a facility.

For students, posing their own questions is a first step towards filling their knowledge gaps and resolving puzzlement. The process of asking questions allows them to articulate their current understanding of a topic, to make connections with other ideas, and also to become aware of what they do or do not know. In this regard, student‐generated questions are also an important aspect of both self‐ and peer‐assessment (Black, Harrison, Lee, & Marshall, 2002, p. 14). The skill of questioning is also important in problem‐solving and decision‐making (Pizzini & Shepardson, 1991; Zoller, 1987). Additionally, questioning has the potential to facilitate productive thinking in students (e.g. Gallas, 1995) and to enhance creativity and higher‐order thinking (Shodell, 1995); it is also a scientific habit of mind. The ability to generate interesting, productive ideas and answers depends on being able first to come up with good questions (Shodell, 1995). Indeed, as Graesser and Olde (2003) have remarked, ‘an excellent litmus test of deep comprehension is the quality of questions asked when [one is] confronted with breakdown scenarios' (p. 524).

While students' questions serve useful functions for learners, they are also helpful to teachers in prompting reflective thought and student engagement. Therefore, we examine the role of students' questions by distinguishing between the use of these questions in learning science and in teaching science. First, we consider how students' questions can benefit learners. Following this, we discuss the role of students' questions in teaching science.

For students learning science, their questions have the potential to (a) direct their learning and drive knowledge construction; (b) foster discussion and debate, thereby enhancing the quality of discourse and classroom talk; (c) help them to self‐evaluate and monitor their understanding; and (d) increase their motivation and interest in a topic by arousing their epistemic curiosity. Each of these is considered separately below.

How does question‐asking steer student learning and facilitate knowledge construction? Learning is a generative process requiring effort, in which learners actively construct their own meanings that are consistent with their prior ideas rather than passively acquire knowledge transmitted to them (Osborne & Wittrock, 1983, 1985). A guiding assumption of much of the research is that deep thinking and reasoning are fostered through contextualised answering of questions. In their ‘depth dynamic model', Chin and Brown (2000a) postulated how questions may help learners initiate a process of hypothesising, predicting, thought‐experimenting, and explaining, thereby leading to a cascade of generative activity and helping them to acquire and construct missing pieces of knowledge or resolve conflicts in their understanding. During this process, learning may occur through the formation and rearrangement of cognitive networks or schemata as students progressively construct explanations and answers to each question that they pose while working on tasks set by the teacher. Indeed, Chi, de Leeuw, Chiu, and Lavancher (1994) have demonstrated that eliciting self‐explanations (i.e. those spontaneously generated) does improve understanding. Thus, questions, particularly those asked in response to puzzlement and wonderment, may stimulate students to generate explanations for things that perplex them and to propose solutions to problems. These questions can occur spontaneously or in response to deliberate stimulation by the teacher, and can then trigger the use of deep‐thinking strategies that might not otherwise be invoked. In this way, they play an important role in engaging students' minds more actively.

When students engage socially in talk and activity about shared problems or tasks, their questions can stimulate not only themselves but also other group members to use the relevant thinking strategies and processes (e.g. hypothesising, predicting, explaining) in their search for an answer. Thus, the questions embedded in the conversation of peer groups help learners to co‐construct knowledge, thereby engendering productive discussion (Chin, Brown, & Bruce, 2002). This is consistent with views of learning that involve not only the individual but also the social construction of knowledge (e.g. Driver, Asoko, Leach, Mortimer, & Scott, 1994; Duit & Treagust, 1998), and that regard meaning‐making and knowledge construction as a dialogic and dialectical process (Alexander, 2005). If students build on each other's ideas productively when they collaborate with their peers, they enter the ‘zone of proximal development' (Vygotsky, 1978), using questions as one of the psychological tools for thinking. The questions embedded in the discourse of collaborative peer groups help to scaffold ideas, encouraging learners and their peers to reflect on their own thinking. They facilitate the negotiation of meaning in the ‘construction zone' (Newman, Griffin, & Cole, 1989, p. ix), and help learners co‐construct knowledge inter‐psychologically. This knowledge is then appropriated or constructed intra‐psychologically by the individual members.

From a social‐cognitive perspective, question‐generation is a constructive activity and an essential component of student discourse in ‘talking science’ (Hawkins & Pea, 1987; Lemke, 1990). In addition, questions can provoke discussion and debate about alternative viewpoints, stimulate students to consider the pros and cons of different perspectives of an issue, and foster the process of argumentation and critical thinking in science. The latter process, in particular, is essential to help students recognise faulty reasoning and invalid assumptions, construct hypotheses, generate explanations, identify evidence that supports or refutes a hypothesis, evaluate options in a logical manner, and make links between seemingly disparate ideas.

At the individual level, the act of asking oneself questions is a hallmark of a self‐directed and reflective learner. The questions can be used in ‘self‐talk' as ‘thought‐starters' to scaffold and prime students in their thinking (Chin, 2006). Vygotsky (1962, 1986) viewed private speech as the link in the transition from vocal speech to inner verbal thought. According to his theory of verbal self‐regulation, speech becomes verbal thought via three stages: external, private, and internal speech. During the ‘external' stage, the activity of the learner is directed by the verbalisations of an external agent, for example, a teacher. During the ‘private speech' stage, the learner internalises the external agent's verbal messages and talks aloud to him or herself. This private speech is overt, audible speech‐to‐self and gradually becomes more reflective as it progresses from an external to an internal activity. In the next ‘internal verbal self‐regulation' stage, the self‐verbalisations become inaudible, silent, and covert. This progression through the stages is referred to as a movement from an inter‐psychological to an intra‐psychological plane of functioning. Thus, from a Vygotskian perspective, self‐regulation is a linguistically guided process in which regulation‐through‐commands‐of‐others shifts developmentally to self‐regulation. Since one of the linguistic tools that can be used in an internal dialogue with oneself is a question, this verbalisation to self in the form of self‐questioning is a key process that aids the development of metacognition and self‐regulated learning habits (e.g. Manning & Payne, 1996).

King (1989) found that college students studying educational psychology could be helped to self‐question through training in asking specific questions about lecture content, which improved their comprehension of the material covered in lessons. Wong (1985) suggested that, besides asking questions that pertain to subject matter content, students may also ask themselves evaluative questions (e.g. ‘Is there anything I don't understand in this paragraph?') that help them to check how well they comprehend what they are studying. In this regard, self‐questioning may also stimulate students to self‐evaluate and monitor the status of their understanding, as well as help them to redirect their use of learning strategies – strategies that are essential for using assessment formatively (Black et al., 2002). Such a dialogue with oneself drives the mind to look for patterns and connections, establish relationships with prior knowledge, build bridges to new perceptions, and convert raw data into new meaning. Thus, as both a cognitive and metacognitive tool, self‐questioning is an integral part of self‐assessment and learning.

Graesser, Person, and Huber (1992) suggested that one of the mechanisms responsible for the generation of questions stems from a need to correct declarative knowledge deficits. This may occur when learners detect inconsistencies that exist between incoming information and their prior knowledge. One of the regulative actions undertaken to resolve this anomaly and initiate repair would then be to ask questions about the inconsistent information that is to be processed.

Finally, the act of question‐posing can also pique curiosity, as well as arouse motivation and interest in the topic under study. For example, Chin and Kayalvizhi (2005) found that about three‐quarters of the Grade 6 students in their study preferred investigating questions that they themselves posed to simply answering the investigative questions given in their practical activity books. These students reported feeling ‘happy', ‘excited', or ‘proud' about generating their own questions for the investigations, and described the experience of investigating their own questions as ‘thrilling', ‘fun', and ‘interesting'. This reflects the fact that asking questions about the nature of the material world, and being able to answer them, is the key feature of science – a process that has, since the Enlightenment, liberated society from the shackles of dogma and received wisdom.

Thus far, we have shown how students' questions can enhance the learning process for students themselves. However, how can students' questions assist in the teaching of science?

For teachers, students' questions raised in class have the potential to: (a) help the teacher diagnose students' understanding and tap into their thinking, thereby acting as aids in formative assessment to inform future teaching; (b) evaluate higher‐order thinking; (c) stimulate further inquiry into the topic under study via open investigations, problem‐based learning and project work; and (d) provoke critical reflection on classroom practice.

Students' questions provide insights into their knowledge, understanding, and puzzlement, and act as a window into their minds. Thus, such questions can help the teacher diagnose students' understanding by revealing the quality of students' thinking and conceptual understanding (White & Gunstone, 1992), conceptual difficulties, alternative frameworks, and confusion about concepts (Hadzigeorgiou, 1999), their reasoning (Donaldson, 1978), and what students want to know (Elstgeest, 1985). The type of question and the content embedded therein can also indicate the questioner's depth of thinking (Chin & Brown, 2000b). All these instances point to the potential use of students' questions in formative assessment (e.g. Bell & Cowie, 2001), where the teacher can gain some insight into the students' minds and provide the appropriate feedback. In this regard, students' questions allow two‐way ‘double feedback' in that they not only provide feedback to the teacher about students' thinking, but also allow the teacher to act on this information and subsequently provide responsive feedback to the students.

For example, Maskill and Pedrosa de Jesus (1997a) obtained information about Grade 9 students' learning difficulties with temperature, energy, heat, and kinetic theory by having the teacher stop the lesson from time to time and ask the students to write down any questions they had about the problems they were facing. The questions were a good source of information about each specific moment of the lesson and gave the teacher a wealth of material with which to organise future teaching around the students' needs. Watts and Alsop (1995) found that students' questions about energy, heat, and light (obtained through individual interviews, whole‐class work, and group discussions) were diagnostic of the state of students' thinking and revealed their unorthodox understanding of science; they were also indicative of the routes through which students were seeking understanding.

Students' question‐posing capability can also be used as a means of evaluating higher‐order thinking, as Dori and Herscovitz (1999) have found. In their study, Grade 10 students were required to pose questions while engaging in a variety of learning activities. The students' question‐posing capability was then evaluated using pre‐ and post‐test questionnaires in which the students were presented with a case study and asked to compose as many questions as they could about the case they had read. There was a significant increase in students' question‐posing capability (as indicated by the total number, orientation, and complexity of questions) at all academic levels of achievement.

Although the above studies show that students are able to ask questions as part of diagnostic or formative assessment that can then inform the teacher's future teaching, Olsher and Dreyfus (1999) found that the number of questions that junior high students could spontaneously ask about abstract concepts and ‘black box' molecular biochemical processes was limited, compared with questions that sought clarification of terms or that referred to the human and social aspects of the uses of biotechnologies. However, the students were able to ask questions relevant to the processes at later stages of the lesson, after some intense scaffolding. The authors concluded that, given their rudimentary knowledge, junior high school students could not be expected to spontaneously ask these questions and first had to learn the types of questions that one should ask about such processes. This finding suggests that, rather like argumentation, the skill of questioning requires some meta‐linguistic awareness, which must be explicitly addressed in teaching.

Other than their role in contributing to formative assessment and evaluation for the teacher, students' questions also have the potential to influence the curriculum by providing the impetus for inquiry. An alert teacher can harness students' questions and incorporate them into her teaching. Part of the content addressed during subsequent lessons could then be based on students' questions, and this could be motivational for students. Teachers could use these questions as ‘thought provokers’ for stimulating discussions (Maskill & Pedrosa de Jesus, 1997b).

Crawford, Kelly, and Brown (2000) reported on a teacher allowing 4th and 5th Grade students to initiate science explorations in which they posed intriguing questions and tested hypotheses about the behaviour of sea animals in a marine science observation tank. Keys (1998) found that when Grade 6 students worked in groups to generate their own questions for open‐ended science investigations, they pursued two main avenues for generating questions: varying the teacher‐directed activity, and inventing questions from their own imaginations. The former type of question essentially repeated the activity but changed one or more of the variables, while the latter, original questions arose from students' ideas about previous science lessons and personal experiences from everyday life. Students' questions determined the depth and breadth of the concepts to be learnt, the scientific processes to be used, and the cognitive difficulty of the investigation tasks. Allowing students to generate their own investigation questions stimulated curiosity and encouraged profound thinking about relationships among questions, tests, evidence, and conclusions. Likewise, Gallas (1995) reported having children ask questions and offer their own theories about the human body, and claimed that such an approach could build a community of learners whose questions and theories help to shape the emerging curriculum.

Chin and Chia (2004) investigated Year 9 students' sources of inspiration for their questions and how these questions guided the students in knowledge construction when carrying out biology project work in problem‐based learning. The sources of inspiration for students' questions and problems included: curiosity arising from personal encounters, family members' concerns, or observations of others (48.0%); wonderment about information propagated by advertisements and the media (25.0%); cultural beliefs and folklore (13.5%); and issues arising from previous lessons in the school curriculum (13.5%). Questions asked individually pertained to the validation of common beliefs and misconceptions, basic information, explanations, and imagined scenarios. The majority of students' questions were stimulated by sources outside of school. These findings imply that students' out‐of‐school everyday experiences offer rich opportunities for activating their learning. Teachers could anchor instruction around scenarios and questions from students' personal lives, thereby bringing about a better appreciation of the place of science in contemporary life.

The above study also found that students' learning was driven by their questions. These questions could be classified under four main categories and, where productive, served different functions in filling knowledge gaps and constructing knowledge. The four categories were: (1) information‐gathering questions, which mainly sought basic factual information; (2) bridging questions, which attempted to find connections between two or more concepts; (3) extension questions, which led students to explore beyond the scope of the problem, resulting in creative invention or application of the newly acquired knowledge; and (4) reflective questions, which were evaluative and critical, and sometimes contributed to decision‐making or a change of mindset. The ability to ask the ‘right' questions (i.e. those that allowed students to find appropriate answers to proceed to the next step of the inquiry), as well as the extent to which these could be answered, was important in maintaining a sense of ‘flow' (Csikszentmihalyi, 1990) and in sustaining students' interest in the projects. This suggests that students should be encouraged to ask questions from many different stances, especially when they reach an impasse. The above authors also proposed a model for the process of knowledge construction in question‐driven, problem‐based learning (Q‐PBL), which shows how students' individual questions and ideas are pooled collectively to direct subsequent inquiry. The model depicts how, when driven by curiosity, puzzlement, and knowledge gaps, students ask the different types of questions in an attempt to integrate their disparate pieces of knowledge into a more coherent whole.

Questions formulated by students during the development of group mini‐projects have also been found to be useful tools in the self‐management and organisation of group work. Rather than explore the content of students' knowledge and understanding from the questions they asked, Pedrosa de Jesus, Neri de Souza, Teixeira‐Dias, and Watts (2005) examined university students' procedural knowledge in the context of project‐based learning. Their findings showed that students' ‘organisational questions' performed several functions in structuring students' work, such as helping students to organise ideas, delimit the scale of the project, identify and discuss the many strands and sources of information available, and reflect on the project as a whole. The authors claimed that the questions contributed to students' engagement in chemistry, bringing about increased interaction between teacher and students, greater confidence and trust among students in asking questions, and therefore an improvement in the quality of classroom interaction in the learning and teaching of chemistry.

Watts, Alsop, Gould, and Walsh (1997) have suggested that powerful questions from students can provoke critical incidents for science teachers that prompt critical reflection about the nature of science and the processes of teaching and learning and, in addition, generate shifts in their thinking and classroom practice. Their examples illustrated how students' questions brought about conceptual change in two teachers. In the first case, that of a primary science teacher, students' questions made her aware of her inadequate subject matter knowledge and prompted her to address the gaps in her scientific knowledge. In the second case, that of a secondary teacher, students' questions were instrumental not only in highlighting gaps in her own understanding and forcing her to test her own constructs but also in changing her epistemological beliefs, in that ‘my view of science education has changed and I now think of it as not passing on scientific information but about coming to share (at some level) in consensually held theories' (p. 1033). This teacher also found that her attitude towards students' questions had changed in that she became more receptive to them.

A summary of the role of students' questions in learning and teaching science is given in Table .


So far, we have argued that students' questions have an important place in science instruction. We now review the research literature on students' questions in more depth, with the aim of synthesising common themes across disparate studies, analysing patterns and trends, and then teasing out some issues and implications for further discussion.

Compared with the literature on teacher questioning (e.g. Blosser, 1995; Rowe, 1987; Tobin, 1987), there has been relatively little research on students' questions. Dillon (1988) suggested that this dearth is not because there is a lack of interest in the topic, but rather that ‘investigators can scarcely find any student questions' and that children may be raising questions in their own minds, or asking questions of their friends, but not aloud in the classroom. This paucity of student questioning has been an enduring feature across different settings (e.g. Commeyras, 1995; Elstgeest, 1985; Graesser & Person, 1994).

Most of the early research on students' questions focused on text‐based questions and typically adopted a process‐product approach, which examined the relationship between discrete, observable teacher questioning practices and student outcomes such as achievement (Carlsen, 1991). However, given newer developments in the methodologies of educational research, and theoretical perspectives that encourage a sociolinguistic approach to studying the interactional nature of classroom discourse and social contexts, recent research on students' questions has become more diverse in its scope. Some of these more contemporary studies have also been based on current thinking that emphasises the social (Hodson & Hodson, 1998; Howe, 1996; O'Loughlin, 1992; Vygotsky, 1962, 1986), distributed (Pea, 1993), and situated (Brown, Collins, & Duguid, 1989; Hennessy, 1993; Wenger, 1998) nature of knowledge.

Another significant shift in the research has emerged from the work of Carlsen (1991), who suggested that three features of questions (namely context, content, and the responses and reactions of speakers) can be considered in the analysis of data on classroom questions from a sociolinguistic perspective. Context includes the description of speakers and their relationships to one another, as well as a description of the ways in which utterances by different speakers fit together in discourse. Content refers to what is being talked about and the subject‐matter knowledge involved. Since then, some studies (e.g. Chin, Brown, & Bruce, 2002; van Zee, 2000; van Zee, Iwasyk, Kurose, Simpson, & Wild, 2001) have attempted to consider these three features. This change in approach has enabled researchers to address the dynamics and active construction of meaning that the process‐product paradigm was unable to consider, thus providing us with a better understanding of the interactional processes involved in classroom talk. Also, unlike the earlier studies, which were often based on written questions or carried out under experimental conditions, some of the recent studies have included students' questions asked under naturalistic conditions. These questions were sometimes posed as part of oral discourse in the classroom, embedded in group discussions, or situated in whole‐class teaching contexts.

Emerging from an analysis of the literature, we found that research on students' questions has focused on the following four areas: (1) the nature and types of these questions; (2) the effects of teaching students questioning skills; (3) the relationship between students' questions and selected variables; and (4) teachers' responses to, and students' perceptions of, students' questions. Finally, given the importance of students' questions in both teaching and learning for understanding, a number of researchers have also explored different ways of eliciting and encouraging students to generate questions. Each of these areas is now considered in turn.

Scardamalia and Bereiter (1992) distinguished between text‐based questions and knowledge‐based questions. Text‐based questions are questions that students ask as part of reading a text, particularly in response to given cues and for which the answers can be found in the text, while knowledge‐based questions are spontaneously generated questions that spring from a deep interest or from an effort to make sense of the world and to extend knowledge in some direction. The above authors found that text‐based questioning tended to produce qualitatively distinct kinds of questions from knowledge‐based questioning. Knowledge‐based questions, which the Grade 5–6 students generated before studying the topic ‘endangered species' and which reflected things that they genuinely wondered about, were of a higher order than text‐based questions generated in response to textual materials. These questions were significantly superior in their potential contribution to knowledge as they focused on explanations and causes instead of facts, and required more integration of complex and divergent information from multiple sources. This finding led the authors to conclude that questions generated under knowledge‐based conditions hold greater educational potential than those produced in the context of text‐based questioning.

However, Miyake and Norman (1979) argued that it takes considerable domain‐specific knowledge to ask good questions, and that ‘to ask a question, one must know enough to know what is not known' (p. 357). Students, therefore, might find it difficult to ask educationally productive questions, especially at the beginning of their study of a topic, which is the point at which questions could have the most directive effect. Prompted by the question of whether asking educationally valuable knowledge‐based questions requires substantial prior knowledge, Scardamalia and Bereiter (1992) also compared the nature of the questions that students generated for the topic ‘fossil fuels' (one about which the students indicated they had little prior knowledge) with those generated for ‘endangered species' (one with which the students were more familiar). They found that lack of domain knowledge did not seem to hamper students in generating questions, as the number of questions asked was almost exactly the same under both conditions. However, there was a qualitative difference in the types of questions asked. Students asked mainly ‘basic information' questions for the less familiar topic, but concentrated on ‘wonderment' questions for the more familiar one. This finding suggests that a lack of domain‐specific prior knowledge may influence the kinds (but not necessarily the number) of questions that students ask, thereby lending some support to the assertion made by Miyake and Norman (1979). Scardamalia and Bereiter (1992) further argued that wonderment questions, which reflect curiosity, puzzlement, scepticism, or knowledge‐based speculation, have greater potential for advancing conceptual understanding than basic information questions, which grope for basic orienting information. The findings of this study suggest that different kinds of questions can direct the learning process to different extents.

Since different kinds of questions can challenge and stimulate the mind to different extents, questions can be classified according to the level of thought required to answer them. Bloom's taxonomy (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956), for example, comprises a hierarchy of levels that range from knowledge through comprehension, application, analysis, and synthesis to evaluation. Although this taxonomy was originally devised for formulating questions by the teacher as part of the objectives for teaching or assessment, it could also be applied to students' questions. The taxonomy has since been revised (Anderson & Krathwohl, 2001) to accommodate a more highly differentiated range of cognitive processes that are subsumed under six major categories, namely: remember, understand, apply, analyse, evaluate, and create.

Another schema for categorising students' questions was developed by Pizzini and Shepardson (1991), who suggested that questions could be one of three types (input, processing, output), using cognitive level as the criterion. Input questions require students to recall information or derive it from sense data; processing questions require students to draw relationships among data; and higher‐level output questions require students to go beyond the data in new ways to hypothesise, speculate, generalise, create, and evaluate. However, a problem with this scheme is that the labels ‘input', ‘processing', and ‘output' are not sufficiently descriptive of the cognitive levels that they purport to represent. These labels also do not indicate the cognitive processes associated with the questions. Furthermore, the use of only three descriptors does not reflect the full range of cognitive levels at which questions could be located.

A different way of classifying questions was proposed by Pedrosa de Jesus, Teixeira‐Dias, and Watts (2003), who used bipolar rather than uni‐polar constructs to categorise questions. These authors argued that questions classified on a uni‐polar construct are implicitly, if not explicitly, value‐laden in that asking higher‐level questions is always deemed better than asking lower‐level ones, and that such a classification does not allow for notions of context, situation, task, preference, intention, strategy, or goal. On the other hand, if questions are classified on a bipolar construct, each pole has ‘adaptive value so that the quality of the questions asked would depend on the nature of the situation; the learner's preferred style of working and the requirements of the task in hand' (p. 1028). Based on this conceptualisation, the above authors placed questions on a continuum ranging from confirmation questions at one end to transformation questions at the other, rather than defining them in terms of different levels of a hierarchy. Confirmation questions seek to clarify information and detail, attempt to differentiate between fact and speculation, tackle issues of specificity, and ask for exemplification and/or definition. Transformation questions, on the other hand, involve some restructuring or reorganisation of the students' understanding. They tend to be hypothetico‐deductive, seek extensions in knowledge, explore argumentative steps, identify omissions, examine structures in thinking, and challenge accepted reasoning. The authors emphasised that both kinds of questions are necessary and complement each other, with the type of question that is appropriate to ask depending on the nature of the situation and the requirements of the task in hand.

Yet another perspective on classifying students' questions was offered by Watts, Gould, and Alsop (1997), who described three categories of students' questions that illuminated distinct periods in the process of conceptual change: consolidation questions, where students attempted to confirm explanations and consolidate their understanding of new ideas in science; exploration questions, where they sought to expand knowledge and test constructs; and elaboration questions, where students attempted to examine claims and counterclaims, reconcile different understandings, resolve conflicts, test circumstances, and track in and around the ideas and their consequences. This taxonomy classifies questions according to the stages through which a student's understanding progresses. Although it reflects a developmental progression in students' thinking, it is also context‐dependent in that, for it to be meaningful, one needs to know when during the process of conceptual development the question was asked.

The typologies reviewed thus far pertain to questions asked while students were engaged in a variety of classroom tasks, but what of the kinds of questions that students pose specifically for investigative tasks? In the context of open investigations, studies have shown that not all questions asked by students are amenable to practical inquiry, that is, able to be answered first‐hand by the students designing and performing hands‐on investigations themselves. Although Fairbrother (1988) suggested that posing a question and formulating a hypothesis are often linked together, thereby leading to the identification of a practical problem, Watts and Alsop (1995) believed that many questions posed by students are ‘reasons for conceptualisation and for thinking aloud and are not expressed in a way that will obviously lead to an investigation'. The latter view was also shared by Swatton (1992), who argued that ‘not all questions raised by children are necessarily expressed in the formal terms amenable to empirical test'. This author believed that the teacher must then facilitate the translation of these questions into testable hypotheses. Symington (1980) also found that more than half of the questions that children asked did not lend themselves to practical investigation. Moreover, even questions that had been modified to make them more ‘investigable' were often not immediately useful in a practical ‘hands‐on' sense. Although students' ‘raw' questions do not always seem to lend themselves to practical investigations at first, with the teacher's help students are able, over time, to translate such questions into investigable ones (Roth & Roychoudhury, 1993).

To guide students in generating researchable questions on their own, Chin and Kayalvizhi (2002) proposed a typology of investigable and non‐investigable questions for use with open investigations. Investigable questions are those for which students can find out the answers by designing and performing hands‐on investigations themselves. Such questions allow students to generate and collect some original data and, finally, to draw a conclusion that answers the investigative question posed, on the basis of available first‐hand evidence. Investigable questions include comparison, cause‐and‐effect, prediction, design‐and‐make, exploratory, descriptive, pattern‐seeking, problem‐solving, and validation‐of‐mental‐model questions. Examples include ‘Which type of material is best for keeping water hot?' (comparison), ‘How does concentration affect the rate at which salt dissolves in water?' (cause‐and‐effect), ‘What would happen to the distance travelled by a toy car if I raise the height of the inclined plane?' (prediction), and ‘What kinds of insects live in our garden?' (descriptive).

On the other hand, non‐investigable questions include: (a) basic information questions that request simple information or basic facts and in which the answers can be found by referring to books, searching the Internet, or by simply asking someone; (b) complex information questions that probe an underlying mechanism and where explanations involve theorising at the molecular or sub‐microscopic level; and (c) philosophical and religious questions that may not be answerable by science.

In all of the studies discussed so far, students' questions were asked in the context of formal instruction in the classroom. But what about the nature and kinds of questions posed by children beyond the classroom walls? To characterise young people's spontaneous interests in science and technology, Baram‐Tsabari and Yarden (2005) analysed 1676 questions submitted by Israeli children to a series of television programmes. Questions were placed in one of the following categories: biology, physics, chemistry, earth sciences, astrophysics, nature‐of‐science inquiry, and technology. Questions pertaining to biology were the most popular (49.6%), with the relative frequency of questions categorised as ‘zoology’ decreasing and that of ‘human biology’ increasing with age. The interest of older students in human biology is well‐attested by a number of studies (e.g. Osborne & Collins, 2001). The second largest group of questions concerned technology (25%), followed by astrophysics (12.2%). Among the older students, there was a decrease in the number of biological questions and a concomitant increase in questions on technology. Questions related to earth sciences, physics, chemistry, and nature‐of‐science inquiry were less common.

The children's motivation for asking questions was, for the most part, ‘non‐applicative'. However, there were some shifts with age in the motivation for asking questions, with ‘applicative' questions (almost all of which were associated with biology or technology) rising steadily between the ages of 6 and 16. These ‘applicative' questions concerned utility, where knowledge is used to solve problems (e.g. ‘How can I lose weight in a few days?'). More than half of the questions were ‘factual', a little more than one‐quarter were ‘explanatory', and 13.5% were ‘applicative' in nature. Boys favoured ‘factual' and ‘methodological' questions, whereas girls asked for more ‘explanatory' and ‘applicative' types of information, the latter with reference to ‘personal use' and ‘health and lifestyle'. The main sources of the children's questions were ‘hearsay' and the ‘media', with few questions related to school science. There were also gender‐related differences, with questions from girls being predominantly biological in nature, and those from boys dominating in all the remaining categories.

What implication can we draw from this study's findings about students' intrinsic interest in the school science curriculum? The authors (Baram‐Tsabari & Yarden, 2005) suggested that given students' relatively low interest in chemistry and physics, as manifested by their self‐generated questions, it may be worth exploring teaching a number of such topics in the context of space science or technological applications, which were areas that elicited more questions and interest. Also, since questions about the ‘nature of science inquiry’ were fairly rare, such ideas may be more successfully addressed in the context of inventions and patents, an area of interest that drew mainly applicative questions from the children.

A subsequent study by Baram‐Tsabari, Sethi, Bry, and Yarden (2006) of children's questions submitted to an international Ask‐A‐Scientist Internet site based in the USA also found biology questions to be the most popular, and revealed similar gender‐related interests fitting the well‐known stereotypic preferences for specific topics. As they got older, the children asked more ‘school‐related' questions tied to assignments and textbooks, compared with ‘spontaneous' questions that were internally motivated by personal contexts. The authors suggested that children's science interests, as inferred from the questions submitted to such Web sites, could be used as ‘hooks' within the curriculum.

Table presents an overview of the studies that have been conducted on characterising the types of questions that students ask.


This review on the types of questions that students ask reveals a number of salient points. First, questions may be classified according to the different types and levels of cognitive processes that students are expected to use when they pose a question. One implication of such a classification scheme is that it could guide teachers to design classroom tasks that encourage students to pose questions that match the cognitive demand they seek. And, as questions of different cognitive complexity can direct students' conceptual development and advance their thinking to different degrees, these tasks should elicit questions which are pitched at higher levels of thought and which require the restructuring of ideas. However, given that students' question‐asking ability is also dependent on contextual factors such as their prior knowledge, questions that ask for basic information are also important in some cases. Second, students may not always know how to pose questions that are appropriate for practical investigations. One way of addressing this problem would be to teach students the differences between investigable and non‐investigable questions and illustrate these ideas with the use of specific examples. Third, students' questions, including those asked outside of school, are indicative of what students are interested in and would like to know more about in the world around them. Teachers could thus capitalise on such questions and use them to guide curriculum development in both formal and informal learning contexts.

In this section, we examine the effects of several intervention studies that aimed to teach students questioning skills and discuss the significance of their findings. These studies were carried out mainly in the contexts of: (a) reading science texts; (b) formulating researchable questions for science investigations; and (c) learning new content material through group discussions.

Most of the early research on student‐generated questions in science focused on improving students' reading comprehension of texts (e.g. Koch & Eckstein, 1991; Pearson, 1991). Koch and Eckstein (1991) found that college physics students' reading comprehension of texts improved when they were taught the skill of formulating questions about textual material. The strategy used, which the authors referred to as the ‘Questions Formulation Strategy', consisted of the Answer/Questioning (A/Q) method and the Peer Feedback (P/F) method. In the A/Q method, students summarised a text by formulating questions using a table with three columns. Column 1 contained questions that had answers in the text and which the student believed he or she understood. Column 2 listed questions whose answers were found in the text but which the student did not understand. Column 3 contained questions whose answers were related to, but not discussed in, the text. Having to separate their questions into three distinct categories helped students to distinguish between what they understood and what they did not, as well as between what was and was not stated in the text. Thus, this was an exercise in meta‐comprehension and metacognition. In the P/F method, students took turns posing questions to their peers, whose feedback helped to clarify fuzzy questions. The results showed significant gains in reading comprehension for both the A/Q and P/F groups over a control group, with a more substantial increase for the P/F group. Hence, this strategy of formulating questions raised students' awareness of their difficulties in reading comprehension and could be used as a self‐monitoring technique.

One finding that does not fit the general pattern demonstrating the value of student‐generated questions comes from the work of Pearson (1991). He compared the achievement of college biology students after they had used either teacher‐provided or self‐generated questions when reading science texts. Results indicated that, compared with using teacher‐provided questions, training students to generate and answer their own questions did have a favourable effect on their mid‐range (weekly quiz) performance. This finding suggests that these student‐generated questions and answers tended to induce processing of the prose material to a greater degree than teacher‐provided questions based on the same text. Nevertheless, such an effect was not observed for long‐range (summative examination) performance. To explain these seemingly contradictory results, the author hypothesised that ‘the weekly quizzes, which were forms of teacher‐provided questions (though not the same kind), compensated for the effects of training on student‐generated questions, making both groups equivalent over time with regard to student preparation for the summative exam (post‐test)' (p. 499). If this were the case, then from the standpoint of long‐range effects, it would be just as efficacious for teachers to provide their students with text‐based questions instead of training them to generate and use their own questions. However, given that this finding does not fit with the general findings of a large body of work, its credibility is questionable.

Although reading comprehension level was measured in the above studies, the students' questions were not evaluated for their cognitive levels. Inadequate analysis of the questions and the pathways through which they were formulated makes it difficult to discern the effects of instruction on question‐generation per se and elucidate the mechanisms that underlie the process. There may be stages in the students' formulation of questions that provide a window into the thought processes involved. For example, question construction may precede knowledge construction by stimulating the learner to integrate prior and new knowledge, make connections between disparate pieces of information, elaborate on what he or she knows, and construct explanations that include theories and models of how things work.

Other than studies pertaining to the reading of science texts, another group of researchers have been interested in studying the effects of teaching students how to formulate researchable questions for science investigations (e.g. Allison & Shrigley, 1986; Cuccio‐Schirripa & Steiner, 2000; Hartford & Good, 1982). For example, Hartford and Good (1982) found that teaching high school chemistry students questioning skills led them to ask more and better research questions. However, students' questioning skills were found to be independent of their level of Piagetian intellectual development, as assessed through their ability with probabilistic, combinatorial, and proportional reasoning, and the isolation and control of variables. The authors suggested that research questioning ability might depend on hypothetico‐deductive reasoning ability rather than upon various types of formal operational thought. However, such an argument would contradict the views of Lawson (2003), who has argued that hypothetico‐predictive reasoning is related to formal thinking.

Allison and Shrigley (1986) conducted a study in which they taught 5th and 6th Grade students to ask operational questions in science. Operational questions (Alfke, 1974) are questions that are amenable to practical investigation and that help students to manipulate variables in science experiments through eliminating, substituting, and increasing or decreasing the presence of a variable. The above authors found that when teachers modelled the asking of such questions, their students also asked more of these questions during science lessons. Students who experienced both teacher modelling and written practice also asked more operational questions than students in the control group.

Additional evidence that students who receive instruction on asking researchable questions are better able than a control group to generate such questions of a higher‐level nature comes from the work of Cuccio‐Schirripa and Steiner (2000) with Grade 7 students. Instruction for the experimental group included the definition, critical attributes, examples, and non‐examples of researchable questions. Students then practised writing researchable questions and were provided with feedback. The questions were rated on a scale ranging from level 1 to level 4, with level 1 being the lowest, requiring only answers that pertained to factual information or simple ‘yes' or ‘no' responses. At the high end, level 4 answers represented cause‐and‐effect relationships and required an experimental design in which variables were specific, measurable, and manipulable. The authors also found that high achievers in mathematics, reading, or science outperformed low achievers. Furthermore, the questions for topics in which students had high interest were of a higher level than those for low‐interest topics, suggesting that interest in a science topic motivates students to use questioning skills.

The findings of these three studies suggest that students can be explicitly taught how to ask researchable questions for science investigations with some degree of success. Teachers can design instructional tasks that involve the posing of operational questions, as well as teach students the characteristics of researchable questions. In addition, they need to model the asking of these questions, provide students with exemplars and non‐exemplars of such questions, and have students practise generating such questions. Also, while students' ability to ask research questions might be contingent upon their hypothetico‐deductive reasoning ability, having students pose questions for topics which interest them can elicit higher quality questions.

Another interesting approach to teaching students questioning skills involved guiding them with question prompts that were designed to help them access prior knowledge and make connections among ideas in interactive group settings. Such a study was carried out by King (1994) in the context of teaching students new content material. In the study, 4th‐ and 5th‐Grade science students used the strategy of ‘Guided cooperative questioning’ in a series of lessons on ‘systems of the body’. They made use of prompt cards, which consisted of thought‐provoking generic ‘comprehension’ and ‘connection’ question stems. Comprehension questions asked for a process or term to be described or defined (e.g. ‘What does … mean?’). Connection questions required students to go beyond what was explicitly stated in the lesson by linking two ideas together in some way (e.g. ‘What is the difference between … and …?’), or asking for an explanation, inference, justification, or speculation (e.g. ‘What would happen if …?’).

In small group discussions, the students posed their questions to each other and answered each other's questions. Students also received training in how to explain. Analysis of the students' performance on lesson comprehension tests for the material studied, post‐lesson knowledge maps, and verbal interaction during the study (as measured by the quantity and quality of questions asked and knowledge construction statements made) showed that the students trained to ask questions engaged in significantly more complex knowledge construction than the unguided and control groups. It was also found that while connection questions induced knowledge integration, factual questions induced only knowledge restatement. These findings lend support to the notion of a correspondence between level of questioning and level of knowledge‐construction activity, suggesting that the type of questioning used may determine the possible level of knowledge construction that occurs.

A similar study was carried out by King and Rosenshine (1993) with a group of 5th Graders learning about tide pools in a series of science lessons, the difference being that this study compared the use of structured, more fully articulated question stems such as ‘How are … and … similar?’ with less elaborated question starters that gave only signal words (e.g. Why?, What?, How?) to guide question generation. It was found that children who used highly elaborated question stems outperformed those using less elaborated stems and unguided questioners on explanations provided during discussion, post‐test comprehension, and knowledge mapping. These findings indicate that, in cooperative discussion contexts, structured and explicit guidance in asking thought‐provoking questions elicits explanations that, in turn, mediate and improve learning. One explanation is that the format of the question stems could have helped learners to generate thoughtful questions that induced them to think about and discuss their ideas in specific ways and to forge a variety of constructive links among these ideas. The above findings also show that even primary‐age students can be trained to use highly elaborated question stems to generate thought‐provoking questions about material presented in their lessons and, hence, that guided cooperative questioning is a viable socio‐cognitive strategy for use in classroom settings.

The accompanying table gives a summary of the studies on the effects of teaching students questioning skills.


The studies reviewed in this section were all examples of attempts to teach students questioning skills. In summary, the studies demonstrated that students' question‐asking capability could be fostered and made intentional through instructional interventions, and could lead to enhanced performance on a range of science tasks that included prose learning, investigations, knowledge mapping, as well as knowledge construction statements and explanations given during verbal discussions. More important, perhaps, is the finding that meaningful scaffolds provide a structure to guide students' thinking. These scaffolds include a table with specific columns that acts as a visual organiser; the linguistic form of operational questions (i.e. the wording and syntax used to construct such questions, for example, ‘How would the [dependent variable] change if we increase/decrease the [independent variable]?’); and elaborated question stems.

A survey of the research literature reveals that a number of researchers have also attempted to elucidate the link between student questioning and other variables such as: (a) achievement level and conceptual understanding; (b) learning approaches; (c) learning styles; (d) task condition and grade level; and (e) the nature of instruction – all of which are discussed below.

In his review of a number of Australian studies, Tisher (1977) reported that low levels of questioning and explaining on the part of the pupils were not associated with high levels of achievement. He concluded that ‘strategies used in the classroom and in curriculum must be such as to require more questioning and explaining on the part of the pupils' (p. 100). That student questioning is related to achievement level is further supported by the findings of King (1992), which showed that students trained to prepare for tests by generating and then answering their own questions significantly out‐performed comparable groups who were not taught questioning skills.

One aspect of a student's academic achievement is conceptual understanding of subject matter content. Harper, Etkina, and Lin (2003) investigated the types of questions that college physics students asked, the relationships among them, and whether there was a relationship between the types of questions and students' conceptual understanding. Students wrote a weekly report in which they answered three questions pertaining to what they had learnt and how they had learnt it, what questions remained unclear, and what questions they would ask their own students, if they were the professor, to find out whether the students understood the material. The raw number of questions asked had no significant correlation with any measure of conceptual achievement. However, students who asked high‐level questions received better scores on the conceptual performance test than those who asked only simple questions, indicating a direct relationship between depth of questioning and prior conceptual knowledge. Students who asked mainly questions about equations did not do as well as those who asked more coherence questions, suggesting that students with a fairly solid conceptual base asked questions that helped them connect various pieces of their knowledge. Furthermore, students who lacked conceptual knowledge and asked questions to fill in the gaps scored better on subsequent conceptual tests than those who did not. These findings indicate that simply encouraging students to ask questions does not necessarily result in better learning. Rather, high‐level questions involving coherence and limitations were related to greater conceptual understanding.

The results of the above studies, suggesting that there is a relationship between the quality of students' questions and achievement as well as their conceptual understanding, are important but not unexpected. One pedagogical implication of this finding is for teachers to design learning tasks that provide opportunities for students to ask questions that would help them link disparate bits of knowledge into a coherent whole. Such tasks might require students to pitch their thinking at levels which include the application, evaluation, and synthesis of ideas.

The kinds of questions that students ask may be related to the way they approach their learning tasks. To investigate the relationship between student‐generated questions and students' approaches to learning science, Chin, Brown, and Bruce (2002) studied student questioning during classroom discourse while Grade 8 students were engaged in hands‐on laboratory activities in small groups. They found that students' basic information questions, which focused on facts and procedures, were typical of a surface‐learning approach and generated little productive discussion. In contrast, wonderment questions, which included comprehension, prediction, anomaly detection, application, and planning questions, led students to wonder more deeply about their ideas. These questions were indicative of a deep learning approach and stimulated students to generate explanations, formulate hypotheses, predict outcomes, thought‐experiment, interrogate anomalous data, consider application of ideas, and plan next steps. Open‐ended, problem‐solving activities elicited more and a wider range of wonderment questions than teacher‐directed activities where students asked mostly procedural questions when following step‐by‐step instructions. Although the students did not always ask wonderment questions spontaneously, they were able to generate such questions when prompted to do so. Some students were more inclined to ask questions than others, and wonderment questions stimulated either the questioners themselves or another student to generate an answer.

The above findings suggest that teachers need to recognise the learning approaches (i.e. deep versus surface, reflecting mastery versus performance orientations) adopted by their students, be sensitive and responsive to the type and depth of questions that students ask, and encourage students to ask wonderment questions at a deep level. Also, since the nature of classroom tasks and the cognitive demands made of students influence the type of questions that students ask, teachers should present laboratory activities in ways that encourage problem‐solving rather than the formulaic following of instructions to obtain expected answers. Furthermore, since students' questions were not always spontaneously generated, teachers cannot leave students to their own devices to ask questions. Rather, they need to provide prompts and scaffolding, and explicitly orient students towards asking questions as part of class activities.

According to Kolb (1984, 1985), there are four types of learning styles or preferences, depending on how an individual perceives and processes information. A diverger is imaginative, looks at things from different perspectives, and is good at generating ideas and perceiving relationships. A converger likes deductive reasoning, enjoys experimenting with new ideas and working with practical applications, and has strengths in problem‐solving and decision‐making. The assimilator is systematic, analytical, adopts a logical approach, and likes to work with abstract concepts, models, and theories, while the accommodator prefers a practical ‘hands‐on’ approach, likes action, and takes risks.

Pedrosa de Jesus, Almeida, and Watts (2004) studied the relationships between students' questioning and learning styles using four case studies in a university chemistry course. They compared the quality and quantity of questions asked by these students who were identified as either a diverger, converger, assimilator, or accommodator, based on Kolb's experiential learning theory (Kolb, 1984; Kolb, Boyatzis, & Mainemelis, 2001) and his characterisation of learning styles (Kolb, 1985). Their findings on the disposition of learners to ask different kinds of questions showed that although a student may have a clear preference for a particular learning style, some students could still deploy all styles of learning and utilise diverse types of questions. This then led to a more holistic and effective way of learning. However, students at a lower stage of knowledge development lacked the sophistication to ask a variety of questions, leading the authors to conclude that the kind of questions that students raise is not only influenced by their learning style but also contingent upon their stage of knowledge development.

How does the condition under which students are presented with a task, together with their age, relate to their question‐asking capability? Costa, Caldeira, Gallástegui, and Otero (2000) sought to answer this question by studying how task condition and grade level (Grades 8, 10, and 12) influenced the quantity and quality of questions formulated by students while they processed science texts explaining natural phenomena that involved ‘clouds’ and ‘dissolved oxygen’. The authors were also interested in finding out what kinds of questions were asked. Three task conditions were chosen. In the ‘Class’ condition, the task was introduced as an activity aimed at developing the capacity to ask questions. In the ‘Examination’ condition, the task was presented as a test on question generation. Finally, in the ‘Extra‐academic’ condition, the questioning task was camouflaged as participation in a research project sponsored by the Ministry of Education and geared towards the improvement of science textbooks.

Given that students' perception of the importance of the assessment influences their learning strategies, the authors expected that students would regard generating questions under the examination condition to be more important than under the class condition, and that the quantity and/or quality of questions asked might improve accordingly. The results showed that the students were able to ask many written questions of variable quality, and that they were capable of generating a large volume of causal‐antecedent questions. However, no significant main effects were found for grade level or task condition with respect to the total number of questions or the number of deep reasoning questions. Nevertheless, a significant interaction of grade level by task was found, with the classroom condition promoting the asking of more high‐quality questions than the extra‐academic condition at the 12th‐Grade level. The authors concluded that complex interactions exist among the variables, which, in turn, influence the questioning behaviour of the students. They also suggested that although comprehension‐monitoring ability and anomaly detection improve with grade level, thereby leading to the expectation of an increase in the number of questions asked by older students, other confounding factors relating to a ‘knowledge deficit’ hypothesis would lead to fewer questions being asked by these students. Thus, although we might expect older readers to ask more questions than younger ones because of their higher metacognitive ability, they may not do so because they have relevant knowledge of the information provided in the texts and find the paragraphs more comprehensible.

Do the kinds of questions that students ask depend on the type of instruction that their teachers use? The studies reviewed here attempt to provide some answers to this question.

A study by Marbach‐Ad and Sokolove (2000a), for example, found that undergraduate biology students from active learning, co‐operative groups were able to pose better and higher level questions after reading chapters from a textbook than those taught in a traditional lecture format. The active learning class employed student‐centred, constructivist‐based, and interactive instructional approaches in which students worked in cooperative learning groups, were given problems for discussion, and were encouraged to ask questions about items and issues from assigned readings, lecture topics, and personal experience. Questions from students were voiced publicly (using wireless microphones), written in laboratory notebooks, or posed via email to the instructor. Students' questions were often used to initiate small‐group learning exercises and/or to launch whole‐class consideration of key biological concepts and processes. On the other hand, the traditional class was taught using a lecture format with little time allocated for open discussion and questions. Both classes were presented with a taxonomy of questions. Over time, the questions from the active learning group became more insightful, thoughtful, content‐related, and research‐oriented, and were not easily answered by consulting the textbook or another readily available source. In contrast, the quality of students' questions in the traditional class was largely unchanged. In a related study, Marbach‐Ad and Claasen (2001) reported that providing students with criteria to evaluate their own questions also holds promise for improving student questioning.

Hofstein, Shore, and Kipnis (2004) found that providing 11th‐ and 12th‐Grade high‐school students with opportunities to engage in inquiry‐type experiments in the chemistry laboratory improved their ability to ask high‐level questions, to hypothesise, and to suggest questions for further experimental investigation. Likewise, in a related study, Hofstein, Navon, Kipnis, and Mamlok‐Naaman (2005) investigated the number and cognitive level of questions asked by 12th‐Grade high‐school chemistry students, as well as the nature of the questions chosen by these students for further investigation after carrying out an experiment (in the form of a practical test) and after critically reading a scientific article. For the experiment, students' questions were considered low‐level if they related to the facts and explanations of the phenomenon being studied, while for the article, they were considered low‐level if they were closely based on the text and the answers could be found in it. In both the experiment and the reading of the article, questions were deemed high‐level if they could be answered only by further investigation, such as conducting another experiment or looking for more information on the Internet or in the chemistry literature. The authors found that students who learnt through the inquiry approach, and who had experience in asking questions in the laboratory, significantly outperformed the control group in their ability to ask more and better questions.

Pedrosa de Jesus, Teixeira‐Dias, and Watts (2003) and Teixeira‐Dias, Pedrosa de Jesus, Neri de Souza, and Watts (2005) reported on a study that attempted to change the atmosphere of traditional lecture and tutorial sessions and enhance the quality of teacher‐student and student‐student classroom interactions in university chemistry. To stimulate active learning and enhance the quality of classroom interactions, teaching incorporated students' questions in small group work tutorials, conference lectures that addressed topics of scientific, technological and social interest, practical laboratory sessions, and mini‐projects. Students were required to post questions into a question box, through a computer software system, via e‐mail, and in project workbooks. The protocols of the practical sessions were also less directive and prescriptive. ‘Top‐up’ marks were awarded to students who generated good questions. The authors reported that the curricular ‘tuning’ led to changes in the way students learnt. There was an increase in students' engagement in learning over time, as indicated by the number and quality of questions asked by students. By analysing the number and distribution of questions over time, the authors concluded that it was possible to create a questioning environment where students ask questions and receive answers as an integral part of everyday transactions, and that learners' questions can be a fruitful means of increasing student engagement in learning chemistry. However, since this study did not include a control group, it is possible that the increase in students' engagement was due to the students getting to know their lecturers better and developing a stronger knowledge base with which to engage in the subject.

The kind of questions that students ask has also been found to depend on whether students have been exposed to reading research papers. To examine the effect of studying through research papers on students' ability to pose questions, Brill and Yarden (2003) asked 11th‐ and 12th‐Grade high‐school biology students to pose questions, before, during, and after instruction, about what they found interesting to know about embryonic development. The researchers also monitored students' questions that were asked orally during the lessons. The questions were analysed using three categories based on Dillon's (1984) classification of research questions. These categories referred to the thinking level required to provide answers to the questions and were: (a) ‘properties’ (where answers described the properties of the subject in question); (b) ‘comparisons’ (where answers required a comparison between the subjects in question); and (c) ‘causal relationships’ (where answers related to finding the relation, correlation, conditionality, or causality of the subjects in question). Usually, questions from the ‘properties’ category referred to one variable, while those from the ‘comparisons’ and ‘causal relationships’ categories referred to at least two variables. Questions that raised some kind of criticism of the research were also included in the third category since they indicated contradictory relationships.

The above authors found that before learning through the research papers, students tended to ask only questions of the ‘properties’ category where answering the question required only declarative knowledge. The questions were also general (e.g. ‘What are the different stages of embryonic development?’) instead of specific in nature. In contrast, students tended to pose questions that revealed a higher level of thinking and uniqueness (e.g. ‘If we take a primary muscle cell at a certain stage in the differentiation process, could we create a muscle cell that would help a person who has a problem with his muscles?’) during or following instruction with research papers that combined both declarative and procedural knowledge. This change in the level and specificity of questions was not observed during or following instruction with a control group using a textbook. The authors suggested that learning through research papers may be one way to provide stimulus for question‐asking that would result in higher thinking levels and originality. The combination of theoretical background (declarative knowledge) and research methods (procedural knowledge) would not only allow a variety of possibilities and combinations in formulating questions, but also help students to become acquainted with the way scientists conduct research in developing the rationale for the research, formulating the research question, developing the methods, analysing the data, and arriving at conclusions.

The accompanying table presents a summary of the studies on the relationships between students' questions and selected variables.


Taken together, the results of the above studies show that the cognitive level of questions posed by students is, to some extent, dependent on the nature of instruction. Instruction that enhances students' ability to ask good quality questions includes adopting some or all of the following pedagogies: the use of active learning in cooperative groups; student engagement in inquiry‐type laboratory experiments; deliberately creating a questioning environment by providing a variety of opportunities for students to pose questions; and having students read research papers. What these pedagogies share in common is that they all explicitly require students to ask questions by immersing them in a learning environment that values questions. The students were engaged in thoughtful tasks where they experienced the various phases of inquiry that reflected the nature of science. These included being challenged by alternative points of view and debating them, justifying their assertions, and posing questions to resolve doubts or seek answers. In a nutshell, the students were placed in situations where they had to pose questions to steer and extend their own thinking.

Given the importance of students' questions in the science classroom for learning, how do teachers respond to students' questions? Tobin and Gallagher (1987) found that teachers responded differently to students' questions depending on who asked them, when they were asked, and other factors. To elicit questions, the teachers tended to call on a small number of target students, who projected themselves in whole‐class interactions and regularly contributed to classroom discourse by raising their hands, asking questions and calling out responses.

Watts, Alsop, Gould, and Walsh (1997) suggested that when responding to students' questions during instruction, teachers could ignore the questions, give their best answer, admit ignorance, redirect the question to the student, or change the question to an empirical one for investigation. At times, teachers may be caught in a situation where their students raise unexpected questions for which they are unable to offer an accurate answer. This issue was discussed by Yip (1999), who asked a group of 26 biology teachers how they would respond to such a situation. The strategies indicated by these teachers included: checking references (92%), suggesting an instant answer (85%), asking students to suggest answers (54%), ignoring the question (46%), and setting the question as homework (23%). Yip (1999) compared the pros and cons of each strategy and discussed their pedagogical implications. He also suggested that teachers could use a sequence of probing questions to lead students through a process of inquiry to construct their own answers.

Rop's (2002) ethnographic study of a high school chemistry teacher's responses to ‘student inquiry questions’ (defined as thoughtful, content‐related and driven by curiosity) found that such questions held both positive and negative meanings for the teacher. Although these intellectual questions were valued as indicators of students' motivation and a legitimate search for understanding, and provided the teacher with clues about students' interest, ability and comprehension, they nonetheless created an interruption to the normal flow of things, particularly when they were too far ‘off track’, ‘off‐the‐wall’, or only tangentially related to the lesson's content objectives. To the teacher, such interruptions posed threats to his control of classroom events, teaching efficiency, and his ability to cover the content of his course, and were thus perceived to be disruptive, distracting, intrusive and annoying. Thus, although science educators might encourage students' questions as part of the spirit of inquiry, it can be difficult for this to happen in schools where teachers face institutional curricular pressures to complete a given curriculum in a set time. Nevertheless, research would suggest that allowing time to pursue students' questions helps to develop students' interest and motivation.

What perceptions do students have about their experiences of asking questions in class? Watts and Pedrosa de Jesus (2005) found that students' feelings about question‐asking could be both positive and negative. While some students might enjoy asking questions and value the role that questions play in guiding and organising their thinking, others felt timid and embarrassed about posing questions. The authors discussed the interplay between the cognitive and affective dimensions of questioning and suggested that questions have an ‘iterative duality’ in that they can be both stimuli for and outcomes of variation and change in students' thinking. That is, students' questions can invoke, and be invoked by, feelings, and these feelings can then, in turn, generate and shape further questions.

In another study of ‘student inquiry questions’ in a high school classroom, Rop (2003) found that students asked such questions to ‘alleviate boredom and engage in intellectual challenges’, as well as to ‘fill an intellectual hunger to understand subject matter better’. These students wanted to challenge themselves to think at a higher level than was commonly required and it was far more interesting for them to think hard about difficult things than about the mundane tasks (such as memorising for a test, writing lab reports and doing assignments on time) that tended to fill their days. However, they also felt the social pressure to refrain from asking too many questions because of the intolerance and subtle disparaging responses of classmates, as well as their frustration with what they called the ‘teacher put off’ (p. 24). The latter referred to the teacher brushing off and not dealing with their questions. Despite the students' genuine desire to learn and extend subject matter knowledge, their questions usually received only scant attention from teachers, and were regarded as rather eccentric by classmates. These students felt that their questions were not always valued, encouraged, or given time to flourish. Consequently, teacher and peer responses often encouraged them to abandon their curiosity for social conformity.

The above findings on how teachers and students perceive student‐generated questions reveal an interesting ‘double irony’. On one hand, teachers appreciate the value of students' questions, especially if the questions show evidence of students' intellectual curiosity, interest, and conceptual engagement in the lesson. Yet, on the other hand, these very same teachers may not welcome students' questions because of the distractions these questions pose to the smooth running of their lessons, and the pressure that they feel in having to cover prescribed content within a specified time period and to prepare their students for high‐stakes tests. That there are students who would pose such inquiry questions is testimony to the potential for high‐level thinking occurring in science classrooms. However, these students who are curious about extended scientific ideas and motivated to ask questions beyond the delivered curriculum may find that their questions are not welcomed – either by their teachers or their peers.

The review of the literature indicates that there is substantial educational potential in student‐generated questions. How, then, can we maximally exploit students' questions as a potential resource for teaching and learning science? To address this issue, we raise the following three questions and then consider each one in turn:

1. What kinds of questions are considered ‘quality’ questions that we would like our students to ask, and how might this vary according to context?

2. What are the barriers to student questioning in the classroom?

3. How can we encourage students to ask questions?

Not all questions are of equal value. The cognitive level of a question is determined by the type of answer that it requires (Yarden, Brill, & Falk, 2001). Closed questions (i.e. those with a single, unambiguous answer) and questions requiring recall of information are easier to generate and easier to answer. Indeed, it is these types of questions, used to initiate what Lemke (1990) terms ‘triadic dialogue’ or Initiation, Response and Evaluation (IRE), that dominate the school science classroom. In contrast, open, imaginative questions that require reflection and understanding, both to frame and to answer, are more difficult to conceptualise as they demand deep processing of ideas, such as the reframing, application, or extension of taught ideas.

Graesser and Person (1994) described high‐level questions as those involving inferences, multi‐step reasoning, the application of an idea to a new domain of knowledge, the synthesis of a new idea from multiple information sources, or the evaluation of a new claim. Specifically, questions such as those described as ‘causal antecedent’ (Why …?), ‘causal consequence’ (What happens if …?), ‘goal orientation’ (What is the purpose of …?), or ‘instrumental/procedural and enablement’ (How …?) were considered by Graesser, Langston, and Baggett (1993) and Graesser and Person (1994) to be high quality ‘deep reasoning questions’, as they correlated with higher levels of cognition.

But perhaps Pedrosa de Jesus et al.'s (2003) definition of quality questions as that ‘combination of questions that most readily enable a learner to make meaning of the learning task’ is the most salient. Such questions form an efficient tool kit to generate good understanding and would comprise an optimal mixture of question types. The particular types of question that are meaningful to ask would depend on the context and the particular task in hand. For example, a student might ask ‘What?’, ‘How?’, ‘Why?’, and ‘What if?’ questions respectively if he or she is seeking more factual information about a topic, figuring out a procedure or the mechanism underlying a certain process, trying to explain and understand a phenomenon, or predicting the possible outcomes of a hypothetical scenario.

Thus, to actively engage with the subject at hand, learners should desire and be able to ask a set of interrelated ‘critical questions’ at the appropriate times (Browne & Keeley, 1998). For example, two critical questions pertaining to an argumentative task would be ‘What are the value conflicts and assumptions involved?’ and ‘How good is the evidence?’ Questions that are appropriate during decision‐making processes, when one has to decide among alternatives, might include ‘What are the pros and cons of each of these options?’ and ‘What would be the consequence if I chose option A (or B)?’ Encouraging students to ask a variety of critical questions that best suit a particular situational demand will foster the development of a deeper and more thoughtful approach to questioning.

There are several personal, psychological, and social obstacles that may prevent students from asking questions in the classroom. The number and type of questions that students ask may be influenced by their age, experiences, prior knowledge and skills, the attitude of the teacher, teaching style, nature of the topics, reward structure, classroom evaluative climate, and social interaction patterns (Biddulph & Osborne, 1982). Other factors include the student's knowledge of different types and levels of questions, the teacher's and peers' reactions to students' questions, as well as the supportive structures (physical, social, procedural, and logistical pertaining to space and time) that are in place in the classroom.

Furthermore, learners can be differentially curious, with some asking questions easily and others finding it more difficult to do so. This difference may depend on the individual's predisposition to taking risks, learning style, and ability to tolerate uncertainty (Pedrosa de Jesus et al., 2003). While some learners ask questions to minimise doubt and to restore an inner calm, others may be able to live with uncertainty and doubt without the need for explicit questions and answers. In addition, although learners may be eager to ask questions, the act of uttering a question publicly, under the scrutiny of others, may discourage them from doing so, as it may render them vulnerable to embarrassment, censure, or ridicule.

Graesser et al. (1992) have suggested that question‐generation may be stimulated by a knowledge deficit. This process comprises three distinct stages: anomaly detection, question articulation, and social editing. Factors that affect any of these stages may prevent the generation of questions. For example, shallow information‐processing would limit anomaly detection. Consequently, this failure to detect inconsistencies and contradictions with one's own knowledge would prevent asking a question. Individual variables such as achievement, motivation, and self‐esteem may also influence question‐asking, especially at the social editing stage. Good et al. (1987) found that average achievers asked more questions than low and high achievers. Also, the latter may be very sensitive to the cost to their self‐esteem of asking questions (Van der Meij, 1994).

In typical classroom settings, question generation is not a usual student role and students are more often expected to answer questions rather than to ask them. Teachers may also sometimes evade tackling students' questions because the questions are not direct or straightforward and may lie outside the teacher's sphere of knowledge (Watts & Alsop, 1995). This problem is faced particularly by teachers who view their roles as dispensers of knowledge. Thus, teachers who are unsure of their own knowledge base might tactically avoid or repress students' questions to avoid problematic issues (Woodward, 1992). However, if they choose to address only questions for which they know the answers and suppress the rest, they risk stifling the curiosity and creativity of their students. Teachers who were subjected to a didactic, knowledge‐based approach during their own experiences as students, who perceive science teaching as transmission of facts, and who feel that tight control is a necessary feature of teaching, are also unlikely to invite students' questions.

Good et al. (1987) and Wood and Wood (1988) have shown how teacher control of questioning constantly encourages student passivity. Systemic conditions such as school structures, relations between adults and students, and socialisation into institutional and situational authority roles, may also inhibit questioning. Thus, although students may be asking questions of themselves or of their friends, their questions may not be articulated or verbalised publicly in the classroom.

Teachers and students, particularly in high school classrooms, are also put in conflicting roles by differing agendas. Although an inquiry‐based approach is advocated in teaching and learning science, there are social, cultural, and institutional forces at play that may thwart the efforts of students who genuinely want to ask questions. Teachers' perceived and real time constraints, concern for covering content, and the accountability pressures that they face to teach to high‐stakes tests and national examinations are powerful forces that influence classroom questioning behaviours, limiting the time to digress. A solution to this dilemma would have to come from a change of teachers' mind sets and a restructuring of the educational context to redefine the criteria for what is considered as success in teaching and learning science.

According to Dennett's (1991, cited in Pedrosa de Jesus et al., 2003) concept of ‘epistemic hunger’, human beings are ‘informavores’ who need to ‘make meaning’ and understand their surroundings. To assuage this epistemic hunger, curiosity and a ‘spark’ are necessary. The spark is triggered when one encounters something unexplained, unconventional, or an incongruity, within the context of an appropriate body of knowledge, and not in vacuo. Models of question‐asking, such as the PREG model (PREG is part of the word ‘pregunta’, which means ‘question’ in Spanish) (Graesser & Olde, 2003; Otero & Graesser, 2001) predict that questions are asked when learners experience cognitive disequilibrium. According to the latter authors, ‘questions are asked when individuals are confronted with obstacles to goals, anomalous events, contradictions, discrepancies, salient contrasts, obvious gaps in knowledge, expectation violations, and decisions that require discrimination among equally attractive alternatives’ (p. 524). Thus, to stimulate students' question‐asking, one might set up some kind of cognitive disequilibrium in the classroom to spark students' curiosity.

How, then, can teachers foster a ‘culture of inquisitiveness’ in science classrooms and stimulate their students to ask questions? The use of cognitive conflict (Allison & Shrigley, 1986), real‐world problem‐solving activities (Zoller, 1987), problem‐based learning (Chin & Chia, 2004), or case studies (Dori & Herscovitz, 1999) have been reported as catalysts to question‐asking. Activities that pertain to the social dimension of science (i.e. socio‐scientific issues) also provide a dynamic arena for student questioning, dialogue, and debate. Using a combination of these strategies might, therefore, result in enhanced question‐asking by students.

Chin (2004) provided a review of ways to encourage students' questions. Strategies include the following: (a) providing students with suitable stimuli for them to ask questions; (b) modelling question‐asking; (c) providing question prompts or stems; (d) using a question taxonomy; (e) asking students to pose questions via a learning journal, weekly report, question board, question box, or on‐line computer systems; (f) establishing a question corner in the classroom to supply ‘questions of the week’; (g) including a ‘free question time’ and ‘brainstorm’ session during lessons; (h) setting ‘question‐making’ homework; (i) including question‐asking in evaluation; (j) using interactive instructional approaches where students work in collaborative groups to generate questions; and (k) creating a non‐threatening classroom atmosphere where students feel free to ask questions. Our analysis suggests that these strategies can be grouped around three themes outlined below.

Biddulph, Symington, and Osborne (1986) suggested providing students with suitable stimuli, modelling question‐asking, developing a receptive classroom atmosphere, and including question‐asking in evaluation. Likewise, White and Gunstone (1992) proposed the use of structuring or focusing strategies such as providing a stimulus (e.g. table of data or diagram) on which questions are to be based, providing an answer and asking for questions, and asking students to begin questions in a particular way (e.g. ‘What if …’, ‘Why does…’, ‘Why are …’, ‘How would …’) as such questions are more likely to be based on deeper thinking than simple recall.

The explicit teaching of question typologies is another way in which teachers can get students to pose questions. Teaching students categories of question types can make them aware that different kinds of questions elicit different thinking processes, which help to build answers in different ways and can lead to insight. For example, Chin (2006) suggested providing students with sample self‐questions that focus on specific cognitive processes (e.g. comparing, explaining, hypothesising, predicting, analysing, and inferring) that are aligned to the task in hand. Explanation‐based questions, for instance, would tap causal thinking and encourage logical justifications, hypothetical questions let students test their suppositions, analytical questions help them to find patterns and relationships in data, while evaluative questions guide them to make comparisons and reflect on the pros and cons of the object under consideration. These question prompts can be situated in contextualised activities across a range of different science tasks, and act as objects for reflection.

For example, for a decision‐making task where students have to select an option from among several possible alternatives, the overarching question would be ‘Which option is best after taking everything into consideration?’ (evaluating). Related subordinate questions include: ‘What are the options?’(identifying options), ‘What criteria are relevant and important to help me decide which option to choose?’ (identifying criteria), ‘What are the likely consequences of each option?’ (predicting), ‘How important are the consequences?’ (evaluating), and ‘What are the pros and cons of each option?’ (comparing). Use of such questions has the potential to direct students' thinking towards specific goals and sub‐goals, and to focus attention on different related aspects of the task in question.

To help university biology students recognise good questions and to explain what type of questions were considered to be high level, Marbach‐Ad and Sokolove (2000b) provided them with a taxonomy of questions. This taxonomy categorised questions into four major types: (a) those that do not make logical or grammatical sense or are based on a misunderstanding or misconception; (b) those that are about a simple or complex definition, concept, or fact that a student could have looked up in the textbook; (c) those that involve students using information beyond that given in the textbook (this includes moral, philosophical or socio‐political questions based on motives and intentions, as well as those seeking an evolutionary or functional explanation); and (d) those that involve students using higher‐level thinking skills (this includes thoughtful questions resulting from extended thought and synthesis of prior knowledge and information, as well as hypothetical and research questions).

To guide students to formulate questions that are amenable to practical investigations, students could be taught Alfke's (1974) model of ‘operational’ questions, which help students to identify and manipulate variables in science experiments. Since manipulating variables is central to understanding cause‐and‐effect relationships in scientific inquiry, operational questions are a means of encouraging students to think about such relationships. Chin and Kayalvizhi (2002) have also suggested a variety of ways to help students pose investigable questions. These include teaching students explicitly the nature of investigations, the concepts and language associated with investigations, the different types of investigations that can be carried out and, in particular, the criteria for a ‘good’ investigable question, as well as the syntax and semantics involved in crafting the investigative question. Students should also be taught to distinguish between investigable and non‐investigable questions.

Students can record their questions about things that puzzle them in a diary or learning journal, thus documenting a set of ‘I wonder’ questions (e.g. Kulas, 1995). Etkina (2000) discussed the use of the weekly report, which is a structured journal in which students answer three questions: (a) ‘What did you learn this week?’; (b) ‘What questions remain unclear?’; and (c) ‘If you were the professor, what questions would you ask to find out whether the students understood the most important material of this week?’. These questions not only encourage students to think about the gaps in their current knowledge, but also serve as an assessment tool and allow the instructor to modify subsequent instruction to address students' needs.

Similarly, Dixon (1996) described the use of a ‘question board’ to display students' questions relating to the topic being taught and described how these questions may be used as starting points for scientific investigations. Teachers can also establish a ‘problem corner’ in the classroom and encourage students to supply ‘questions of the week’ (Jelly, 1985). Watts, Gould, and Alsop (1997) suggested including specific times for questions such as a period of ‘free question time’ within a lesson or block of lessons, a question ‘brainstorm’ at the start of a topic, a ‘question box’ on a side table where students can put their questions, turn‐taking questioning around the class where each student or group of students must prepare a question to be asked of others, and ‘question‐making’ homework.

With the advance of new technologies and their increasing use in teaching and learning, computer systems can also be used to facilitate student questioning, for example through electronic discussion forums, email platforms, and intranet systems. Such systems do not require face‐to‐face interaction, and students can spend time thinking about how to phrase their questions (as well as their answers) before sending them to their teacher or peers. These computer systems also allow multiple responses from different individuals, and these responses do not have to be immediate, unlike questions posed in face‐to‐face interactions. In this respect, such electronic systems may afford opportunities for students to be more deliberative and thoughtful in crafting their questions and answers, and also allow shy students to have a voice. However, such systems may also be a double‐edged sword in that some learners may prefer more personal human‐to‐human classroom interactions, and may be disinclined to ask questions through electronic media.

Another way to encourage students' questions is to include question‐asking in evaluation. Eisner (1965) suggested that teachers should:

Ask students when they complete a unit to list as many questions as they can that they think would be important for obtaining a fuller understanding of the material they have just studied. Such a list could be scored for the number and quality of the questions, quality being defined by the relevance and centrality of the questions raised. (p. 628)

Including students' questions as part of evaluation was also reported by Zoller (1994) who developed and implemented the ‘Examination where the Student Asks the Questions’ (ESAQ) teaching/evaluation strategy for university chemistry undergraduates. In contrast to the traditional examination where students respond to questions prepared by the instructor, the students submitted their home‐prepared written questions to the instructor for grading. A few of the students' questions were selected by the instructor and redistributed to all students to serve as a student‐designed ‘take home’ examination. Besides returning their answers to the instructor for evaluation, students' answers could also be redistributed for peer correction or evaluation. The author reported that the ESAQ increased the course relevancy to students' needs, which, in turn, resulted in their increased interest and active involvement.

A number of peer‐learning approaches have been developed that use question‐asking and answering to structure group interaction at a high cognitive level. For example, in Guided Reciprocal Peer Questioning (King, 1992, 2002), students are provided with question‐starters to generate content‐specific questions in a focused discussion setting. Some questions stimulate students to express differing points of view and create socio‐cognitive conflict (Mugny & Doise, 1978). This engenders discussion where students attempt to reconcile differences and negotiate meanings through elaborating, explaining, justifying, and using other socio‐cognitive processes. Gallagher, Stepien, Sher, and Workman (1995) discussed the use of a ‘Need‐to‐Know’ worksheet where students work in collaborative groups to identify learning issues related to a given problem and to document their questions and ideas onto this sheet.

The classroom learning environment is also an important factor to consider. White (1977) has suggested that ‘praise should be given to those who invent questions, repression should be avoided’ (p. 125). To ask a question openly, especially among peers in a whole‐class setting, requires courage. Some students may not be comfortable with this risk‐taking and may be afraid that their questions may be considered ‘stupid’ or be laughed at by their classmates. A warm classroom climate with a low risk of censure, criticism, or ridicule is essential for promoting students' questions. Where teachers are intolerant of ‘stupid’ questions, students will be less forthcoming in venturing their questions, fearing that they may be dismissed as silly.

The above suggestions on the different ways of encouraging students to ask questions hold out the promise of a more interrogative and thoughtful approach to learning. However, there is a caveat. Simply encouraging students to ask questions, or even teaching them how to ask questions, may not necessarily result in better learning if the questions reveal a formulaic focus. For example, some students may simply go through the motions of using question stems as a model without actively thinking of questions that they have, thereby misleading the teacher into thinking that they are actively generating their own authentic questions. Marton and Saljo (1976) found that their attempts to induce a deep approach to learning, by forcing students to answer certain types of questions while reading text, resulted in a ‘technification’ of the learning process. Instead, the demand structure of the learning situation led to an extreme form of surface learning, in which students simply complied with the demands without engaging in deep thinking. Likewise, Arzi and White (1986) found that, after students were trained to ask good reflective questions in an attempt to engage them actively in thinking, some of the questions that students asked were simply glib modifications of the standard question stems that had been introduced to them. What is important is that the questions should be based on the students' genuine attempts at finding relationships and coherence in their search for understanding. Students should also understand the underlying purposes of the different kinds of questions and the type of thinking that each question can elicit.

What other aspects of students' questions are still untapped that have potential for future research? While research on students' questions has been progressively expanding in recent years, there are several aspects still to be explored. Research has shown that students are able to ask thoughtful questions on science content, provided certain facilitative conditions are met. These conditions include the use of scaffolds such as question prompts, stimuli that pique students' curiosity, and various supporting mechanisms such as computer‐aided teaching systems that prompt students to ask questions. In light of these findings, a broad question worth investigating is ‘How can questioning scaffolds (such as question prompts, curiosity‐provoking stimuli, and computer‐based supports) be used to foster students' questions in a variety of specific science learning contexts?’

Besides contexts involving tasks that pertain to reading science texts, formulating research questions for science investigations, project work and problem‐based learning (which have been studied by previous researchers), there are other areas incorporating student questioning that can be explored. These include: (a) group discussions and argumentation about students' ideas on various scientific topics; (b) students' decision‐making in a variety of situations such as during debates and role‐plays about scientific and socio‐scientific issues; (c) students' writing on tasks that involve either ‘learning‐to‐write’ or ‘writing‐to‐learn’; (d) learning through demonstrations and manipulative, interactive exhibits; and (e) informal learning beyond the classroom such as in science centres and museums.

Because talking science (Lemke, 1990) involves questioning, which is an important ‘thinking tool’, future research could look at how students' questions emerge and how they influence students' thinking. For example, the following research questions seem to offer promise: ‘How can teachers structure a particular task to foster a classroom discourse that stimulates question‐asking?’ and ‘Do students' questions guide them to engage in more critical thinking and argumentation about scientific ideas?’ The focus of such a study could be on devising specific, practical strategies that attempt to induce deep, critical thinking through student‐generated questions and then studying the effects of such questions on subsequent discourse and knowledge construction.

Studies could also be conducted on how the teacher can scaffold student discourse for specific science tasks and provide a task structure to foster question‐asking that would lead to productive inquiry as well as higher level cognitive and metacognitive talk. Such tasks might include designing an investigation, predicting the outcome of a demonstration or experiment, analysing and interpreting experimental data, constructing explanations for an observed scientific phenomenon, or making a decision about a debatable socio‐scientific issue that involves selecting an option from among a set of alternatives.

In addition to tasks that involve talking science, student‐generated questions could also be harnessed in writing science. Wallace, Hand, and Prain (2004) discussed the use of the Science Writing Heuristic (SWH) for enhancing students' learning from laboratory activities through writing‐to‐learn. This tool consists of a student template containing questions that prompt learners to generate and make connections between questions, observations, claims, and evidence for their claims, as well as to reflect on how their ideas have changed during the experience of the laboratory activity. Empirical research using the SWH has shown improvement in students' conceptual understanding. These authors also reported that the types of activities students found most helpful on other writing‐to‐learn tasks (e.g. writing journals, scientific posters, and brochures) included framing their own questions, explaining a process, and rewording scientific information for an audience other than the teacher. One implication of these findings is that having students pose questions to guide their own thinking and writing could also be extended to other classroom writing tasks such as answering structured and essay questions on formal assignments and tests, as well as writing project reports. Future research into how this may be accomplished also seems worthwhile.

A second area that is little explored but offers potential for further work concerns the contextual influences on student questioning, that is, the variables that shape students' question‐asking. By identifying the variables that facilitate or hinder question‐generation, we may learn how to better foster students' questions in the classroom. Some of these variables have been discussed earlier. However, there are others that have yet to be explored in more depth. These include the task demand and learner's goal (Graesser et al., 1993), different task types, learner's motivation, and student groupings.

How, for instance, might task demand and learner's goals influence students' question‐asking and the type of questions asked? It is conceivable that students' questions would vary according to the nature and demands of the assigned task (i.e. the task type). For instance, students may tend to ask causal questions if the task is to solve a causal problem, to explain a scientific phenomenon, or to engage in a design task. They may pose a variety of hypothetical, predictive, and analytical questions if they are confronted with an authentic investigative task. A classroom demonstration, especially one that involves a discrepant event that produces cognitive conflict, might invite predictive questions (‘What would happen if …?’) before the demonstration, and explanatory questions (‘Why does it happen like that?’) after it. A decision‐making task that requires students to choose from among a few alternative options would likely elicit questions of a comparative and evaluative nature. On the other hand, if students are given tasks that deal with information that is only descriptive, procedural, or that involves merely rote recall, then their questions may be relatively superficial or of a low order. In this regard, a question of research interest would be ‘How does the nature of students' questions vary with different types of task demands that require different kinds of thinking or cognitive processing?’ The answer to such a question would inform instructional practice in the area of task design, as teachers could then craft their tasks in a more purposive way to elicit the appropriate kinds of thinking processes that they hope to foster in their students.

It is possible that learners who are active processors searching for meaning would ask questions that are somewhat different from those who pose questions simply because they are asked to do so. Thus, learner's motivation may be another contingent factor influencing question‐asking, and appears to be an interesting variable to explore in future studies on students' questions.

More in‐depth research on the influence of different student groupings during collaborative tasks, and on students' reactions to being encouraged to ask questions, also seems fruitful. Given the issue raised earlier about whether it would be better to group an ‘inquisitive’ student with peers who ask few questions, a pertinent research question that can be addressed is ‘In what ways might students be grouped to ensure an optimal learning environment for productive student questioning and discussion?’ In addition, given individual differences in learning style, a study designed to answer the question ‘In what different ways might individual students react when they are specifically encouraged to ask questions during instruction?’ warrants attention. It is possible that the various suggested scaffolding strategies may benefit different individuals to differing extents. For example, some students may find it distracting or even stressful if they are explicitly asked to pose questions, and may not welcome such a requirement by the teacher. On the other hand, other students may need such support and find such an experience helpful in stimulating and nurturing their thinking.

Most studies on student‐generated questions have focused on those produced individually by students. It would be of interest to study how questions produced both individually and in a group setting interact in students' collaborative inquiry and the process of knowledge construction. In a classroom setting, such a study might be based in part on Scardamalia and Bereiter's (1992) model of how students' questioning ought to work in school. In this model, students work in groups. Each student produces, in consultation with group members and the teacher, one or more questions that the group is expected to pursue. Some of these questions become big, broad problem statements conceptualised at the macro level, while other more specific ones are subsumed at the micro level. Provision is also made for students to respond to an ‘I need to understand’ entry where they can enter anything they feel needs to be understood in order to answer the main question. Studying students' questions at both the individual and group level may provide insight into how students' questions interact both intra‐psychologically and inter‐psychologically.

This review has demonstrated that students' questions can help students to monitor their own learning, explore and scaffold their ideas, steer their thinking in specific directions, and advance their understanding of scientific concepts and phenomena. For teachers, these questions can be used as indicators of students' learning problems and provide diagnostic information about what students are thinking. Students' questions can also be harnessed for lessons that involve class discussions, argumentation, investigations, problem‐based learning, and project work.

The findings of previous research show that the explicit teaching of questioning skills to students can lead to improved performance on a range of science‐related tasks, including reading comprehension of science texts, formulating researchable questions for science investigations, learning new content material through cooperative group discussions, and asking questions at a higher cognitive level. Classroom instruction that treats learning as active, is built around cooperative groups, uses inquiry‐based laboratory work, provides a variety of scaffolded opportunities for students to pose questions, and incorporates the reading of scientific research papers has also been found to enhance students' ability to ask good‐quality questions.

Research also suggests that questions often do not emerge spontaneously from students. Rather, they have to be encouraged and teachers need to specifically employ strategies to elicit them. Ways of stimulating students' question‐asking include teacher modelling, question prompts and taxonomies; purposeful structuring of tasks through the use of physical supports, time for question generation, and targeted activities that induce students to supply questions; as well as the provision of social supports. Also, for their questions to provoke thoughtful, intellectual engagement and guide their learning, students need to ask deeper questions than the common transactional, procedural, and basic information questions.

Ultimately, however, it is the teacher who holds the key to providing an atmosphere that encourages or discourages students' questions. To nurture the spirit of inquiry in students and cultivate questioning as a habit of mind, a central role for any teacher, therefore, is to foster a classroom environment where it is intellectually, socially and academically rewarding for students to pose thoughtful questions. Given that the default pedagogy of most science classrooms across the globe is still one of transmission (Lyons, 2006), there is still considerable scope for pedagogies that exploit the potential value of students' questions and move towards Shodell's (1995) vision of the ‘question‐driven classroom’ – a classroom where each student takes an active role as questioner, and where Schwab's (1962) vision of science teaching as a process of ‘enquiry into enquiry’ may finally be realised.

Christine Chin is associate professor in the Natural Sciences and Science Education Academic Group, National Institute of Education, Singapore, where she is coordinator of the MEd specialisation in science education.

Jonathan Osborne holds the Chair in Science Education in the Department of Education and Professional Studies at King's College London.

Table 1. The role of students' questions in learning and teaching science.

Role of students' questions, with supporting references:

In learning science:
• Direct learning and drive knowledge construction (Chin & Brown, 2000a)
• Foster discussion and debate in classroom discourse (Chin, Brown, & Bruce, 2002)
• Help learners to monitor and self‐evaluate their understanding (Chin, 2006; King, 1989; Graesser, Person, & Huber, 1992; Wong, 1985)
• Increase students' motivation and interest in a topic (Chin & Kayalvizhi, 2005)

In teaching science:
• Help diagnose students' understanding in formative assessment (Elstgeest, 1985; Maskill & Pedrosa de Jesus, 1997a; Watts & Alsop, 1995; White & Gunstone, 1992)
• Evaluate higher order thinking (Dori & Herscovitz, 1999)
• Stimulate further inquiry into the topic via open investigations, problem‐based learning, and project work (Chin & Chia, 2004; Crawford, Kelly, & Brown, 2000; Gallas, 1995; Keys, 1998; Pedrosa de Jesus, Neri de Souza, Teixeira‐Dias, & Watts, 2005)
• Provoke critical reflection on classroom practice (Watts, Alsop, Gould, & Walsh, 1997)

Table 2. Nature and types of students' questions.

Study: Scardamalia & Bereiter (1992)
How students' questions were classified: Text‐based (asked in response to reading texts); Knowledge‐based (spontaneously generated), further divided into basic information and wonderment questions.
Key points/Implications: Knowledge‐based questions were of a higher order than text‐based questions and hold greater educational potential. Students asked mainly basic information questions for a less familiar topic, and wonderment questions for a familiar topic. Wonderment questions have greater potential for an advance in conceptual understanding relative to basic information questions.

Study: Anderson & Krathwohl (2001); Bloom, Engelhart, Furst, Hill, & Krathwohl (1956)
How students' questions were classified: Remember, understand, apply, analyse, evaluate, create (Anderson & Krathwohl); Knowledge, comprehension, application, analysis, synthesis, evaluation (Bloom et al.)
Key points/Implications: Questions are classified according to the level of thought (or cognitive processes) required for answering them.

Study: Pizzini & Shepardson (1991)
How students' questions were classified: Input (recall information or derive it from sense data); Processing (draw relationships among data); Output (hypothesise, speculate, generalise, create, evaluate)
Key points/Implications: The authors developed this taxonomy to compare the quantity and quality of students' questions in problem‐solving instruction versus teacher‐directed lab instruction. However, with only three levels, this scheme is not sufficiently descriptive of the cognitive levels it purports to represent.

Study: Pedrosa de Jesus, Teixeira‐Dias, & Watts (2003)
How students' questions were classified: Confirmation (seek to clarify information, ask for exemplification or definition); Transformation (involve some restructuring or reorganisation of students' understanding)
Key points/Implications: Both kinds of questions are necessary and complement each other. The type of question that is appropriate depends on the nature of the situation and the task at hand.

Study: Watts, Gould, & Alsop (1997)
How students' questions were classified: Consolidation (confirm explanations and consolidate new ideas); Exploration (seek to expand knowledge and test constructs); Elaboration (reconcile different understandings, resolve conflict)
Key points/Implications: Questions were classified according to the periods in the process of conceptual change.

Study: Chin & Kayalvizhi (2002)
How students' questions were classified: Investigable (amenable to first‐hand practical inquiry); Non‐investigable (do not lend themselves to hands‐on investigations)
Key points/Implications: Not all students' raw questions are amenable to hands‐on investigations. Students could be taught to distinguish between investigable and non‐investigable questions.

Study: Baram‐Tsabari & Yarden (2005)
How students' questions were classified: Field of interest (biology, chemistry, physics, earth science, astrophysics, nature of science, technology); Motivation for question (‘applicative’ versus ‘non‐applicative’); Type of information (factual, explanatory, methodological, evidential, open‐ended, application)
Key points/Implications: Questions pertaining to biology were most popular, especially for girls. Boys favoured ‘factual’ and ‘methodological’ questions, whereas girls asked more ‘explanatory’ and ‘applicative’ questions. Children's science interests, as inferred from their questions, could be used to guide curriculum development.

Table 3. Summary of studies on the effects of teaching students questioning skills.

Focus: Reading comprehension of texts
Study: Koch & Eckstein (1991). Grade level: College (n = 83). Subject: Physics.
Nature of instruction: Students summarised a text by formulating questions into distinct categories (A/Q method) or posing questions to their peers (P/F method). Practice was carried out over 16 weeks.
Key findings: There were significant gains in reading comprehension for both A/Q and P/F groups over a control group.

Focus: Formulating researchable questions for science investigations
Study: Hartford & Good (1982). Grade level: High school (n = 108). Subject: Chemistry.
Nature of instruction: During laboratory instruction over 12 weeks, students were taught to ask questions about manipulating experimental variables and the effects of these changes.
Key findings: The experimental group asked significantly more and better questions than the control group, as measured by the ‘Science inquiry assessment instrument’.

Study: Allison & Shrigley (1986). Grade level: Grades 5 & 6 (n = 72). Subject: Science.
Nature of instruction: Over three weeks, students were taught to ask ‘operational questions’ which manipulate variables in experiments through eliminating, substituting, and increasing or decreasing the presence of a variable.
Key findings: Students who experienced teacher modelling and/or written practice asked significantly more operational questions than the control group on a post‐test involving a demonstration about air pressure.

Study: Cuccio‐Schirripa & Steiner (2000). Grade level: Grade 7 (n = 181). Subject: Science.
Nature of instruction: Students received instruction (90 min) on the definition, critical attributes, examples, and non‐examples of researchable questions, as well as concepts related to the manipulation of variables and cause‐effect relationships.
Key findings: Students who received instruction were better able to generate researchable questions of a significantly higher level than a control group, based on a science question rating scale.

Focus: Learning new content material through cooperative group discussion
Study: King (1994). Grade level: Grades 4 & 5 (n = 48). Subject: Science.
Nature of instruction: Students used prompt cards containing question stems to ask and answer each other's questions in ‘guided cooperative questioning’ small‐group discussions on ‘systems of the body’. All instruction, training, practice, and testing were done in three weeks.
Key findings: Students trained to ask questions engaged in significantly more complex knowledge construction than unguided and control groups, based on lesson comprehension tests, post‐lesson knowledge maps, and verbal interaction.

Study: King & Rosenshine (1993). Grade level: Grade 5 (n = 34). Subject: Science.
Nature of instruction: Students learnt about ‘tide pools’ by asking thought‐provoking questions using more fully articulated question stems or less elaborated question‐starters that gave only signal words. Instruction, training, practice, and testing occurred in seven 1.5‐hour sessions over 2.5 weeks.
Key findings: Children who used highly elaborated question stems outperformed those using less elaborated stems and the unguided questioners (control group) during discussion, post‐test comprehension, and knowledge mapping.

Table 4. Summary of studies on the relationship between students' questions and selected variables.

Variable: Achievement level and conceptual understanding
Study: Harper, Etkina & Lin (2003). Grade level: University (n = 158). Subject: Physics.
Nature of study: Over a 10‐week period, the authors investigated the relationship between the types of questions that students asked on a weekly written report and their conceptual understanding of various topics.
Key findings: There was no significant correlation between the number of questions asked and conceptual achievement. However, students who asked high‐level questions received better scores on the conceptual performance test than those who asked only simple questions. Students who asked questions to fill in conceptual gaps scored better than those who did not.

Variable: Learning approaches
Study: Chin, Brown & Bruce (2002). Grade level: Grade 8 (n = 6). Subject: Science.
Nature of study: This was a nine‐week case study of two groups of students who used learning approaches ranging from deep to surface. The small‐group talk from their seven hands‐on laboratory group activities during regular lessons (together with their written work and interview transcripts) was analysed. The focus of analysis was on the relationship between students' questions and the nature of their thinking and actions during the knowledge construction process.
Key findings: Students' ‘basic information’ questions that focused on facts and procedures (and which were typical of a surface learning approach) generated little productive discussion. In contrast, ‘wonderment questions’ that focused on comprehension, prediction, anomaly detection, application, and planning (and which characterised a deep learning approach) led students to engage in more thoughtful ideas and group talk.

Variable: Learning styles
Study: Pedrosa de Jesus, Almeida & Watts (2004). Grade level: University (n = 4). Subject: Chemistry.
Nature of study: Using four case studies, the authors compared the quality and quantity of questions asked by the students who were identified as a diverger, converger, assimilator, or accommodator, based on Kolb's (1984, 1985) experiential learning theory and learning styles. Data analysis was based on the students' written and oral questions (obtained from the Intranet, Question box, and classroom observations over two semesters), and interviews.
Key findings: Although a student may have a clear preference for a particular learning style, he or she can still work across all modes of learning and move between many diverse types of questions if he or she has the capacity to integrate the four learning modes. However, if the student is at a lower stage of knowledge development, then he or she may not yet have the sophistication to ask a variety of questions.

Variable: Task condition and grade level
Study: Costa, Caldeira, Gallástegui & Otero (2000). Grade level: Grades 8, 10 & 12 (n = 289). Subject: Science.
Nature of study: Students asked questions on science texts under three conditions (class, examination, and extra‐academic) during a regular 50‐min class period. The questions were graded according to the quantity and quality of questions asked.
Key findings: No significant main effects were found for grade level or task condition, concerning the total number of questions and number of deep reasoning questions. However, there was a significant interaction of grade level by task, with the class condition producing more quality questions than the extra‐academic situation at Grade 12.

Variable: Nature of instruction (active learning versus traditional lecture)
Study: Marbach‐Ad & Sokolove (2000a). Grade level: University (n = 475). Subject: Biology.
Nature of study: Over one semester, students taught under two conditions were compared for their ability to ask questions on a text. The active learning class used interactive instructional approaches in cooperative learning groups, whereas the other class used a traditional lecture format.
Key findings: Students from the active learning class were able to pose better and higher‐level questions after reading chapters from the textbook than those taught in a traditional lecture format.

Variable: Nature of instruction (inquiry versus confirmatory labs)
Study: Hofstein, Shore & Kipnis (2004). Grade level: Grades 11 & 12 (n = 25 groups of 3–4 students each). Subject: Chemistry.
Nature of study: Over a two‐year period, students carried out lab experiments under one of two instructional formats: inquiry or control. Those in the inquiry group engaged in ‘open‐ended’ experiences while the control group followed procedural instructions.
Key findings: Students in the inquiry group asked more sophisticated questions involving more variables and quantitative components whereas those in the control group asked mainly low‐level, qualitative questions.

Study: Hofstein, Kipnis & Mamlok‐Naaman (2005). Grade level: Grade 12 (n = 111). Subject: Chemistry.
Nature of study: Over two years, students engaged in inquiry‐type experiments or traditional confirmatory labs. The number and cognitive level of students' questions were compared after they carried out an experiment and after reading a scientific article. Questions were ‘high‐level’ if they could be answered only by further practical investigation or looking for more information on the Internet or chemistry literature.
Key findings: Students who engaged in practical inquiry significantly outperformed the control group in asking more and better questions.

Variable: Nature of instruction (exposure to active learning pedagogies)
Study: Pedrosa de Jesus, Teixeira‐Dias & Watts (2003); Teixeira‐Dias et al. (2005). Grade level: University (n = 32 for year 2000/01; n = 100 for year 2001/02). Subject: Chemistry.
Nature of study: To stimulate active learning, teaching incorporated students' questions in small group work tutorials, conference lectures, lab sessions, and mini‐projects. Students were encouraged to ask questions through using a software system, email, question box, and project workbooks. No control group was used.
Key findings: There was an increase in students' engagement in learning over time, as indicated by the number and quality of questions asked by students.

Variable: Nature of instruction (exposure to reading research papers)
Study: Brill & Yarden (2003). Grade level: Grades 11 & 12 (n = 107). Subject: Biology.
Nature of study: The experimental group studied embryonic development through reading research papers while the control group used a textbook on genetics. Students asked questions before, during, and after instruction. Instruction was carried out in 30 hours over several weeks.
Key findings: Before reading research papers, students asked mainly general questions based on ‘properties’, which required only declarative knowledge. In contrast, during or following instruction, students' questions were of a higher level of thinking, based on ‘comparisons’ and ‘causal relationships’. There was no change in the level and specificity of questions with the control group.

• Alexander, R. 2005. Towards dialogic teaching. York, UK: Dialogos.
• Alfke, D. 1974. Asking operational questions. Science and Children, 11: 18–19.
• Allison, A.W. and Shrigley, R.L. 1986. Teaching children to ask operational questions in science. Science Education, 70: 73–80.
• Anderson, L.W. and Krathwohl, D.R., eds. 2001. A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives (Complete edition). New York: Longman.
• Arzi, H.J. and White, R.T. 1986. Questions on students' questions. Research in Science Education, 16: 82–91.
• Baram‐Tsabari, A., Sethi, R.J., Bry, L. and Yarden, A. 2006. Using questions sent to an Ask‐A‐Scientist site to identify children's interests in science. Science Education, 90: 1050–1072.
• Baram‐Tsabari, A. and Yarden, A. 2005. Characterizing children's spontaneous interests in science and technology. International Journal of Science Education, 27(7): 803–826.
• Bell, B. and Cowie, B. 2001. The characteristics of formative assessment in science education. Science Education, 85(5): 536–553.
• Berlyne, D.E. 1954. A theory of human curiosity. British Journal of Psychology, 45: 180–191.
• Biddulph, F. and Osborne, R. 1982. Some issues relating to children's questions and explanations (LISP(P) Working Paper No. 106). Waikato, New Zealand: University of Waikato.
• Biddulph, F., Symington, D. and Osborne, R. 1986. The place of children's questions in primary science education. Research in Science and Technological Education, 4: 77–88.
• Black, P., Harrison, C., Lee, C., Marshall, B. and Wiliam, D. 2002. Working inside the black box: Assessment for learning in the classroom. London: King's College London.
• Bloom, B.S., Engelhart, M.B., Furst, E.J., Hill, W.H. and Krathwohl, D.R. 1956. Taxonomy of educational objectives: The classification of educational goals (Handbook 1: Cognitive domain). New York: Longmans Green.
• Blosser, P.E. 1995. How to ask the right questions. Arlington, VA: National Science Teachers Association.
• Brill, G. and Yarden, A. 2003. Learning biology through research papers: A stimulus for question‐asking by high‐school students. Cell Biology Education, 2: 266–274.
• Brown, J.S., Collins, A. and Duguid, P. 1989. Situated cognition and the culture of learning. Educational Researcher, 18: 32–42.
• Browne, M.N. and Keeley, S.M. 1998. Asking the right questions: A guide to critical thinking. Englewood Cliffs, NJ: Prentice Hall.
• Carlsen, W.S. 1991. Questioning in classrooms: A sociolinguistic perspective. Review of Educational Research, 61: 157–178.
• Carr, D. 1998. The art of asking questions in the teaching of science. School Science Review, 79: 47–50.
• Chi, M.T.H., De Leeuw, N., Chiu, M.H. and Lavancher, C. 1994. Eliciting self‐explanations improves understanding. Cognitive Science, 18: 439–477.
• Chin, C. 2004. Students' questions: Fostering a culture of inquisitiveness in science classrooms. School Science Review, 86(314): 107–112.
• Chin, C. 2006. Using self‐questioning to promote pupils' process skills thinking. School Science Review, 87(321): 113–122.
• Chin, C. and Brown, D.E. 2000a. Learning deeply in science: An analysis and reintegration of deep approaches in two case studies of Grade 8 students. Research in Science Education, 30(2): 173–197.
• Chin, C. and Brown, D.E. 2000b. Learning in science: A comparison of deep and surface approaches. Journal of Research in Science Teaching, 37(2): 109–138.
• Chin, C., Brown, D.E. and Bruce, B.C. 2002. Student‐generated questions: A meaningful aspect of learning in science. International Journal of Science Education, 24(5): 521–549.
• Chin, C. and Chia, L.G. 2004. Problem‐based learning: Using students' questions to drive knowledge construction. Science Education, 88: 707–727.
• Chin, C. and Kayalvizhi, G. 2002. Posing problems for open investigations: What questions do pupils ask? Research in Science & Technological Education, 20(2): 269–287.
• Chin, C. and Kayalvizhi, G. 2005. What do pupils think of open science investigations? A study of Singaporean primary 6 pupils. Educational Research, 47(1): 107–126.
• Commeyras, M. 1995. What can we learn from students' questions? Theory into Practice, 34(2): 101–106.
• Costa, J., Caldeira, H., Gallástegui, J.R. and Otero, J. 2000. An analysis of question asking on scientific texts explaining natural phenomena. Journal of Research in Science Teaching, 37(6): 602–614.
• Crawford, T., Kelly, G.J. and Brown, C. 2000. Ways of knowing beyond facts and laws of science: An ethnographic investigation of student engagement in scientific practices. Journal of Research in Science Teaching, 37(3): 237–258.
• Csikszentmihalyi, M. 1990. Flow: The psychology of optimal experience. New York: Harper and Row.
• Cuccio‐Schirripa, S. and Steiner, H.E. 2000. Enhancement and analysis of science question level for middle school students. Journal of Research in Science Teaching, 37: 210–224.
• Dennett, D.C. 1991. Consciousness explained. London: Penguin Books.
• Dillon, J.T. 1984. The classification of research questions. Review of Educational Research, 54(3): 327–361.
• Dillon, J.T. 1988. The remedial status of student questioning. Journal of Curriculum Studies, 20: 197–210.
• Dixon, N. 1996. Developing children's questioning skills through the use of a question board. Primary Science Review, 44: 8–10.
• Donaldson, M. 1978. Children's minds. London: Falmer Press.
• Dori, Y.J. and Herscovitz, O. 1999. Question‐posing capability as an alternative evaluation method: Analysis of an environmental case study. Journal of Research in Science Teaching, 36: 411–430.
• Driver, R., Asoko, H., Leach, J., Mortimer, E. and Scott, P. 1994. Constructing scientific knowledge in the classroom. Educational Researcher, 23(7): 5–12.
• Duit, R. and Treagust, D. 1998. “Learning in science: From behaviourism towards social constructivism and beyond”. In International handbook of science education (Part 1), edited by B.J. Fraser and K.G. Tobin, 3–25. Dordrecht, The Netherlands: Kluwer Academic Publishers.
• Duschl, R.A. and Osborne, J. 2002. Supporting and promoting argumentation discourse in science education. Studies in Science Education, 38: 39–72.
• Eisner, E.W. 1965. Critical thinking: Some cognitive components. Teachers College Record, 66: 624–634.
• Elstgeest, J. 1985. “The right question at the right time”. In Primary science: Taking the plunge, edited by W. Harlen, 36–46. London: Heinemann.
• Etkina, E. 2000. Weekly reports: A two‐way feedback tool. Science Education, 84(5): 594–605.
• Fairbrother, R.W. 1988. Assessment of practical work for the GCSE. Nuffield, UK: Chelsea Curriculum Trust.
• Festinger, L. 1957. A theory of cognitive dissonance. Stanford, CA: Stanford University Press.
• Fisher, R. 1990. Teaching children to think. London: Simon and Schuster.
• Gallagher, S.A., Stepien, W.J., Sher, B.T. and Workman, D. 1995. Implementing problem‐based learning in science classrooms. School Science and Mathematics, 95(3): 136–146.
• Gallas, K. 1995. Talking their way into science: Hearing children's questions and theories, responding with curricula. New York: Teachers College Press.
• Good, T.L., Slavings, R.L., Hobson Harel, K. and Emerson, H. 1987. Student passivity: A study of question asking in K‐12 classrooms. Sociology of Education, 60: 181–199.
• Graesser, A.C., Langston, M.C. and Baggett, W.B. 1993. “Exploring information about concepts by asking questions”. In The psychology of learning and motivation, Vol. 29: Categorization by humans and machines, edited by G.V. Nakamura, R.M. Taraban and D. Medin, 411–436. Orlando, FL: Academic Press.
• Graesser, A.C. and Olde, B.A. 2003. How does one know whether a person understands a device? The quality of the questions the person asks when the device breaks down. Journal of Educational Psychology, 95: 524–536.
• Graesser, A.C. and Person, N.K. 1994. Question asking during tutoring. American Educational Research Journal, 31: 104–137.
• Graesser, A.C., Person, N.K. and Huber, J.D. 1992. “Mechanisms that generate questions”. In Questions and information systems, edited by T. Lauer, E. Peacock and A.C. Graesser, 167–187. Hillsdale, NJ: Erlbaum.
• Hadzigeorgiou, Y. 1999. On problem situations and science learning. School Science Review, 81: 43–53.
• Harper, K.A., Etkina, E. and Lin, Y. 2003. Encouraging and analyzing student questions in a large physics course: Meaningful patterns for instructors. Journal of Research in Science Teaching, 40(8): 776–791.
• Hartford, F. and Good, R. 1982. Training chemistry students to ask research questions. Journal of Research in Science Teaching, 19: 559–570.
• Hawkins, J. and Pea, R.D. 1987. Tools for bridging the culture of everyday and scientific thinking. Journal of Research in Science Teaching, 24: 291–307.
• Hennessy, S. 1993. Situated cognition and cognitive apprenticeship: Implications for classroom learning. Studies in Science Education, 22: 1–41.
• Hodson, D. and Hodson, L. 1998. From constructivism to social constructivism: A Vygotskian perspective. School Science Review, 79: 33–46.
• Hofstein, A., Navon, O., Kipnis, M. and Mamlok‐Naaman, R. 2005. Developing students' ability to ask more and better questions resulting from inquiry‐type chemistry laboratories. Journal of Research in Science Teaching, 42(7): 791–806.
• Hofstein, A., Shore, R. and Kipnis, N. 2004. Providing high school students with opportunities to develop learning skills in an inquiry‐type laboratory. International Journal of Science Education, 26(1): 47–62.
• Howe, A. 1996. Development of science concepts within a Vygotskian framework. Science Education, 80(1): 35–51.
• Jelly, S. 1985. “Helping children raise questions – and answering them”. In Primary science: Taking the plunge, edited by W. Harlen, 47–57. London: Heinemann.
• Keys, C.W. 1998. A study of grade six students generating questions and plans for open‐ended science investigations. Research in Science Education, 28: 301–316.
• King, A. 1989. Effects of self‐questioning training on college students' comprehension of lectures. Contemporary Educational Psychology, 14: 366–381.
• King, A. 1992. Facilitating elaborative learning through guided student‐generated questioning. Educational Psychologist, 27(1): 111–126.
• King, A. 1994. Guiding knowledge construction in the classroom: Effects of teaching children how to question and how to explain. American Educational Research Journal, 31: 338–368.
• King, A. 2002. Structuring peer interaction to promote high‐level cognitive processing. Theory into Practice, 41(1): 33–39.
• King, A. and Rosenshine, B. 1993. Effects of guided cooperative questioning on children's knowledge construction. Journal of Experimental Education, 61: 127–148.
• Koch, A. and Eckstein, S.G. 1991. Improvements of reading comprehension of physics texts by students' question formulation. International Journal of Science Education, 13(4): 473–485.
• Kolb, D.A. 1984. Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.
• Kolb, D. 1985. LSI Learning Style Inventory: Self‐scoring inventory and interpretation booklet. Boston: McBer and Co.
• Kolb, D., Boyatzis, R. and Mainemelis, C. 2001. “Experiential learning theory: Previous research and new directions”. In Perspectives on thinking, learning and cognitive styles, edited by R.J. Sternberg and L. Zhang, 227–247. Mahwah, NJ: Lawrence Erlbaum.
• Kulas, L.L. 1995. I wonder…. Science and Children, 32(4): 16–18.
• Lawson, A.E. 2003. The nature and development of hypothetico‐predictive argumentation with implications for science teaching. International Journal of Science Education, 25(11): 1387–1408.
• Lemke, J.L. 1990. Talking science: Language, learning and values. Norwood, NJ: Ablex.
• Lyons, T. 2006. Different countries, same science classes: Students' experience of school science classes in their own words. International Journal of Science Education, 28(6): 591–613.
• Manning, B.H. and Payne, B.D. 1996. Self‐talk for teachers and students. Needham Heights, MA: Allyn & Bacon.
• Marbach‐Ad, G. and Classen, L.A. 2001. Improving students' questions in inquiry labs. American Biology Teacher, 63(6): 410–419.
• Marbach‐Ad, G. and Sokolove, P.G. 2000a. Can undergraduate biology students learn to ask higher level questions? Journal of Research in Science Teaching, 37(8): 854–870.
• Marbach‐Ad, G. and Sokolove, P.G. 2000b. Good science begins with good questions. Journal of College Science Teaching, 30(3): 192–195.
• Marton, F. and Saljo, R. 1976. On qualitative differences in learning. I: Outcome and process. British Journal of Educational Psychology, 46: 4–11.
• Maskill, R. and Pedrosa de Jesus, H. 1997a. Pupils' questions, alternative frameworks and the design of science teaching. International Journal of Science Education, 19: 781–799.
• Maskill, R. and Pedrosa de Jesus, H. 1997b. Asking model questions. Education in Chemistry, 34(5): 132–134.
• Millar, R. and Osborne, J.F., eds. 1998. Beyond 2000: Science education for the future. London: King's College London.
• Miyake, N. and Norman, D.A. 1979. To ask a question, one must know enough to know what is not known. Journal of Verbal Learning and Verbal Behaviour, 18(3): 357–364.
• Mugny, G. and Doise, W. 1978. Socio‐cognitive conflict and the structure of individual and collective performances. European Journal of Social Psychology, 8: 181–192.
• Newman, D., Griffin, P. and Cole, M. 1989. The construction zone. Cambridge: Cambridge University Press.
• O'Loughlin, M. 1992. Rethinking science education: Beyond Piagetian constructivism toward a sociocultural model of teaching and learning. Journal of Research in Science Teaching, 29(8): 791–820.
• Olsher, G. and Dreyfus, A. 1999. Biotechnologies as a context for enhancing junior high‐school students' ability to ask meaningful questions about abstract biological processes. International Journal of Science Education, 21: 137–153.
• Osborne, J. and Collins, S. 2001. Pupils' views of the role and value of the science curriculum: A focus group study. International Journal of Science Education, 23(5): 441–467.
• Osborne, R.J. and Wittrock, M.C. 1983. Learning science: A generative process. Science Education, 67: 489–508.
• Osborne, R.J. and Wittrock, M.C. 1985. The generative learning model and its implications for science education. Studies in Science Education, 12: 59–87.
• Otero, J. and Graesser, A.C. 2001. PREG: Elements of a model of question asking. Cognition and Instruction, 19(2): 143–175.
• Pea, R. 1993. “Practices of distributed intelligence and designs for education”. In Distributed cognition, edited by G. Salomon. New York: Cambridge University Press.
• Pearson, J.A. 1991. Testing the ecological validity of teacher‐provided versus student‐generated postquestions in reading college science text. Journal of Research in Science Teaching, 28(6): 485–504.
• Pedrosa de Jesus, H., Almeida, P. and Watts, M. 2004. Questioning styles and students' learning: Four case studies. Educational Psychology, 24(4): 531–548.
• Pedrosa de Jesus, H., Neri de Souza, F., Teixeira‐Dias, J.J.C. and Watts, M. 2005. Organising the chemistry of question‐based learning: A case study. Research in Science and Technological Education, 23(2): 179–193.
• Pedrosa de Jesus, H., Teixeira‐Dias, J.J.C. and Watts, M. 2003. Questions of chemistry. International Journal of Science Education, 25(8): 1015–1034.
• Penick, J.E., Crow, L.W. and Bonnstetter, R.J. 1996. Questions are the answers. Science Teacher, 63: 26–29.
• Pizzini, E.L. and Shepardson, D.P. 1991. Student questioning in the presence of the teacher during problem solving in science. School Science and Mathematics, 91: 348–352.
• Rop, C.J. 2002. The meaning of student inquiry questions: A teacher's beliefs and responses. International Journal of Science Education, 24(7): 717–736.
• Rop, C.J. 2003. Spontaneous inquiry questions in high school chemistry classrooms: Perceptions of a group of motivated learners. International Journal of Science Education, 25(1): 13–33.
• Rosenshine, B., Meister, C. and Chapman, S. 1996. Teaching students to generate questions: A review of the intervention studies. Review of Educational Research, 66: 181–221.
• Roth, W‐M. and Roychoudhury, A. 1993. The development of science process skills in authentic contexts. Journal of Research in Science Teaching, 30(2): 127–152.
• Rowe, M.B. 1987. “Using wait time to stimulate inquiry”. In Questions, questioning techniques, and effective teaching, edited by W.W. Wilen, 95–106. Washington, DC: National Education Association.
• Scardamalia, M. and Bereiter, C. 1992. Text‐based and knowledge‐based questioning by children. Cognition and Instruction, 9: 177–199.
• Schmidt, H.G. 1993. Foundations of problem‐based learning: Rationale and description. Medical Education, 17: 11–16.
• Schwab, J.J. 1962. The teaching of science as enquiry. Cambridge, MA: Harvard University Press.
• Shodell, M. 1995. The question‐driven classroom. American Biology Teacher, 57: 278–281.
• Swatton, P. 1992. Children's language and assessing their skill in formulating testable hypotheses. British Educational Research Journal, 18(10): 73–85.
• Symington, D.J. 1980. Scientific problems seen by primary school pupils. Unpublished PhD thesis, Monash University, Australia.
• Teixeira‐Dias, J.J.C., Pedrosa de Jesus, H., Neri de Souza, F. and Watts, M. 2005. Teaching for quality learning in chemistry. International Journal of Science Education, 27(9): 1123–1137.
• Tisher, R.P. 1977. Practical insights gained from Australian research on teaching. Australian Science Teachers Journal, 23: 99–104.
• Tobin, K. 1987. The role of wait time in higher cognitive level learning. Review of Educational Research, 57(1): 69–95.
• Tobin, K. and Gallagher, J.J. 1987. The role of target students in the science classroom. Journal of Research in Science Teaching, 24(1): 61–75.
• Van Der Meij, H. 1994. Student questioning: A componential analysis. Learning and Individual Differences, 6: 137–161.
• Van Zee, E.H. 2000. Analysis of a student‐generated inquiry discussion. International Journal of Science Education, 22(2): 115–142.
• Van Zee, E.H., Iwasyk, M., Kurose, A., Simpson, D. and Wild, J. 2001. Student and teacher questioning during conversations about science. Journal of Research in Science Teaching, 38(2): 159–190.
• Vygotsky, L.S. 1962/1986. Thought and language. Cambridge, MA: MIT Press.
• Vygotsky, L.S. 1978. Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
• Wallace, C.S., Hand, B. and Prain, V. 2004. Writing and learning in the science classroom. Dordrecht, The Netherlands: Kluwer Academic Publishers.
• Watts, M. and Alsop, S. 1995. Questioning and conceptual understanding: The quality of pupils' questions in science. School Science Review, 76(277): 91–95.
• Watts, M., Gould, G. and Alsop, S. 1997. Questions of understanding: Categorising pupils' questions in science. School Science Review, 79(286): 57–63.
• Watts, M., Alsop, S., Gould, G. and Walsh, A. 1997. Prompting teachers' constructive reflection: Pupils' questions as critical incidents. International Journal of Science Education, 19: 1025–1037.
• Watts, M. and Pedrosa de Jesus, H. 2005. The cause and affect of asking questions: Reflective case studies from undergraduate students. Canadian Journal of Science, Mathematics, and Technology Education, 5(4): 437–452.
• Wenger, E. 1998. Communities of practice: Learning, meaning, and identity. New York: Cambridge University Press.
• White, R.T. 1977. An overlooked objective. Australian Science Teachers' Journal, 23: 124–125.
• White, R.T. and Gunstone, R.F. 1992. Probing understanding. London: Falmer Press.
• Wong, B.Y.L. 1985. Self‐questioning instructional research: A review. Review of Educational Research, 55: 227–268.
• Wood, D. and Wood, H. 1988. “Questioning versus student initiative”. In Questioning and discussion, edited by J.T. Dillon. Norwood, NJ: Ablex.
• Woodward, C. 1992. Raising and answering questions in primary science: Some considerations. Evaluation and Research in Education, 6: 145–153.
• Yarden, A., Brill, G. and Falk, H. 2001. Primary literature as a basis for a high‐school biology curriculum. Journal of Biological Education, 35: 190–195.
• Yip, D.Y. 1999. Implications of students' questions for science teaching. School Science Review, 81(294): 49–53.
• Zoller, U. 1987. The fostering of question‐asking capability: A meaningful aspect of problem‐solving in chemistry. Journal of Chemical Education, 64: 510–512.
• Zoller, U. 1994. The examination where the student asks the questions. School Science and Mathematics, 94: 347–349.
• Zoller, U., Tsaparlis, G., Fatsow, M. and Lubezky, A. 1997. Student self‐assessment of higher‐order cognitive skills in college science teaching. Journal of College Science Teaching, 27: 99–101.