Analysing Inquiry Learning in ‘Popular Culture 1945 – present’, a unit by the HTAA

Heading photo credit: Beatriz Carramolino, Instructional Design, available under a Creative Commons Attribution 2.0 Generic license

In this series of blog posts, I will critique a Stage 5 unit of work developed by the History Teachers’ Association of Australia (HTAA), Popular culture 1945 – present.

An overview of this unit of work can be found here.

The Inquiry Approach

In the following analysis, I will shed light on the inquiry approach of the learning program Popular culture 1945 – present by assessing the level of agency students are able to exercise within the activities outlined, the type of thinking called for by the program, and the way in which information is gathered, regarded and used throughout the inquiry process.

Analysis of inquiry learning within the program reveals a definite skew towards a teacher-directed approach to inquiry. When the program was assessed against Lupton's (2016, p. 5) inquiry learning continuum for K-12 students, it became apparent that most of the activities within the program fell towards the teacher-directed end of the continuum. The results of this assessment are presented in Figure 4.

Figure 4 – Results of the analysis of the program. Each learning sequence was plotted against the inquiry continuum developed by Lupton. The analysis of each learning sequence required an assessment of which type of inquiry (structured, guided or open) was evident within each stage (question formation, evidence, findings/argument, and communication). A dot was placed in the appropriate square in the table on the left to represent the learning sequence (where more than one type of inquiry approach is represented, due to multiple activities within a learning sequence, multiple dots have been placed as appropriate). The key for this figure can be seen in the table on the right.

In its ideal form, inquiry learning is student-driven, resulting in a personalised and active learning experience, and as such an ideal inquiry unit would reflect a high degree of student agency. An authentic inquiry approach engages higher-order thinking skills that result in the construction of new knowledge, allows for a transformative experience and holds the potential for students to gain a greater understanding of themselves. Analysis of the program reveals that students spend as much time in structured inquiry as in guided inquiry, but very little time in open inquiry. Structured inquiry is what Martin-Hansen (2002, p. 37) refers to as a "cookbook lesson": a process characterised by the teacher setting instructions for students to follow in pursuit of a predetermined end point. In the program under analysis, structured inquiry is evident in the pre-determination of the topic, inquiry questions and research direction, and in the provision of weblinks and PDFs of curated information to facilitate students' responses. The level of structure in this program is an obstacle to authentic inquiry because, as Martin-Hansen argues, structured inquiry "does not include much true inquiry" (2002, p. 37).

Interestingly, the Australian Curriculum defines Historical Inquiry as:

A process of investigation undertaken in order to understand the past. Steps in the inquiry process include posing questions, locating and analysing sources and using evidence from sources to develop an informed explanation about the past.

This definition goes further to support the argument that the inquiry process should allow students a greater degree of agency in developing inquiry questions and finding research material than is provided in this program.

A shift towards open inquiry can give students more ownership of and connection to their learning; however, there remains a place in the program for both guided inquiry and, to a lesser degree, structured inquiry. As Kuhlthau (2010, p. 4) argues, intervention in the inquiry process must happen at critical points in order for the teacher to support students in reaching real learning. Furthermore, Kirschner, Sweller & Clark (2006) make a compelling argument cautioning against "minimally guided instruction" based on the cognitive structures of the human brain.

As such, I recommend the inclusion of a model such as that presented by Martin-Hansen (2002, p. 35), called coupled inquiry. This model provides a persuasive middle ground in which guided inquiry and open inquiry are combined so that students benefit from a modelling of the inquiry process before being "handed the reins" to move through a more student-directed open inquiry process. This approach will play an important role in scaffolding students as they move through the program, and will open up more opportunities later in the program for open inquiry.

The flow-on effect of the limited student agency in Popular culture 1945 – present is a restriction of the level of thinking skills students are expected to exercise, as defined by Bloom's Revised Taxonomy. Much of the learning in the program could be described as a "copying and pasting" activity due to the level of teacher direction and information provision. This is something Kuhlthau (2010, p. 4) argues results in "little real learning". At this level of inquiry, students' cognition is restricted to lower order thinking skills. This theory-based argument is supported by the results of a practical analysis of the activities within the program against Bloom's Revised Taxonomy (see Figure 5).

Figure 5 – Mapping the activities within the program against Bloom's Revised Taxonomy, looking specifically at the cognitive dimension (adapted from a diagram from Learn NC).

In the above analysis, it must be noted that in all instances recorded as students engaging in 'Creating', the creative process is dictated by the provision of components; students are involved more in the creative arrangement of provided information into a predetermined 'new structure'. Whilst this fits into the category of 'Creating' under Bloom's Revised Taxonomy (see page 6), I would argue that these are weak examples of a creative process. Students are not generating or producing something based on their own inventions, imagination, or thought processes.

As such, I recommend that in revising the program, the activities that attempt to engage students in higher-order thinking be adjusted to provide less scaffolding and, in keeping with the recommendations above, allow students a greater degree of agency in both the development and the presentation of the results of their creative processes. Given the number of activities in the program that require students to analyse information, and the comparatively few opportunities for students to engage with 'evaluating', I recommend that, where appropriate, activities be expanded to include a greater degree of critical literacy.

Limited student agency within the program is also reflected in the way in which information is treated within the program, as defined by Lupton’s GeSTE windows (see Figure 6 for an overview of the elements of GeSTE applied in this section of the analysis). The provision of “research” information in the program reflects the idea, characteristic of the generic window (G), that information is “external, objective and codified” (Lupton & Bruce, 2013, p. 11). Many of the exercises ask students to merely slot information into definitions and categories that have already been determined. Whilst these characteristics are strong indicators that a considerable proportion of the program was written from the restricted perspective of the generic window, the presence of information practices specific to the discipline of history indicates that there is some consideration of the situated window (S) in the design of this program. The situated window is characterised by the idea that information is internal and subjective. This concept is reflected briefly in the inclusion of an activity where students are directed to seek information from their family members about life in 1945. This is a clear, if brief, departure from the dominance of a generic view of what information is and how it is found.

Figure 6 – The relationships between the Generic, Situated, Transformative and Expressive windows, and the approach to information that characterises each (in terms of what information is, how it is found and what it is used for).

There is very little evidence of the transformative window (T) in the program. This is unsurprising, given that the program sits towards the teacher-directed end of the inquiry-learning continuum. Within the scope of the transformative window, critical literacy and the questioning of implicit and explicit meanings and assumptions hold an important place. In providing the questions and information, and in setting the end-point of students' 'inquiry', this program does not allow for a critical literacy approach. Within Learning Sequence 6, students are directed to consider and discuss who is missing from the depiction of an Australian. Whilst this activity strongly engages with the critical literacy element of the transformative window, a significant part of the transformative window approach is that students "use information to question the status quo and challenge existing practice" (Lupton & Bruce, 2013, p. 15). There is no indication that students following this program are expected to take any transformative action based on their discussion.

There is no evidence of the presence of the expressive window (E) in the program in its current iteration.

I recommend that the learning sequences in the program be realigned to target the transformative window to a greater degree, in particular its critical literacy aspect. This shift of focus should include the addition of the missing "call to action" element of the transformative window within Learning Sequence 6 (discussed above). Given the relationship between the windows (G is encompassed by S, which is encompassed by T – see Figure 6), allowing the program to continue with a dominant generic window perspective (driven in part by the level of teacher direction in the inquiry approach) will prevent students from seeing other perspectives. By contrast, targeting the transformative window will allow for skills to be developed across all three (G, S and T) levels.

Due to the importance of the expressive window, which sits alongside the GST structure, in taking inquiry learning to a very personal level, I recommend that an activity be designed from the perspective of this window. Because the expressive window is deeply intertwined with questions of identity and self-expression, it would be appropriate to develop an activity that allows students to respond to the questions of representation and the definition of "Australian" raised in Learning Sequence 6.

In reference to the importance of higher order thinking skills, targeting the transformative window in particular is a very important inclusion in improvements to the program. The critical literacy required at the transformative level demands that students use the skills of evaluating and creating (Lupton & Bruce, 2013, p. 15), aligning with a heightened level on the cognitive dimension of Bloom's Revised Taxonomy.

As the program pulls away from the teacher-directed end of the continuum and becomes increasingly student-directed, this shift supports the idea that information is not external and codified, to be transferred from teacher (source) to student (vessel), but rather internal and subjective. This represents a move away from the perspective of the generic window and up the cognitive skills scale represented in Bloom's Revised Taxonomy, as it demands that students exercise higher order thinking skills.

Summary of concerns and recommendations

Concern: Too much of the program leans towards the teacher-directed end of the learning continuum.
Recommendations:
  • Less structured inquiry
  • Use coupled inquiry in an early learning sequence to allow for explicit teaching of the inquiry method and thus prepare students to engage in open inquiry in later learning sequences
  • Implement greater opportunities for students to formulate their own inquiry questions, decide how and where they will conduct their research, critically evaluate information and choose their method of presenting conclusions

Concern: Much of the program demands only lower order thinking skills of students, and where creating exercises are provided, they are overly controlled.
Recommendations:
  • Alter 'creating' exercises to provide less scaffolding and allow students a greater degree of agency
  • Adjust some 'analysing' exercises to engage students in 'evaluating'

Concern: A large proportion of the program reflects the limited perspective of the generic window in regards to what information is, how it is found and what it is used for. There are comparatively few instances where the perspective of the transformative window is engaged in the design of activities, and no activity has been designed from the perspective of the expressive window.
Recommendations:
  • Improve the activity within Learning Sequence 6 to better reflect the transformative window
  • Where appropriate, revise other activities to include the critical literacy aspect of the transformative window
  • Extend the activity in Learning Sequence 6 that questions who is represented by definitions of "Australian" to include the expressive window through an element of self-reflection

Inquiry Learning – Principles and Skills

In this section, I will analyse the ways in which the program under review facilitates the development of inquiry learning skills relating to questioning. Given that the inquiry learning process is grounded in the act of questioning, and given the number of question types and stages within that process, I developed a Prezi (see Figure 8) to make explicit the stages of questioning I have found to be the basis of the inquiry learning process. This Prezi is largely based on my reading of Lupton (2012), YouthLearn's 2016 article Inquiry-Based Learning: An Approach to Educating and Inspiring Kids, and Lupton's 2016 article on Inquiry Learning.

https://prezi.com/embed/0h1lzl1hotbt/?bgcolor=ffffff&lock_to_path=0&autoplay=0&autohide_ctrls=0&landing_data=bHVZZmNaNDBIWnNjdEVENDRhZDFNZGNIUE43MHdLNWpsdFJLb2ZHanI5ejZhRldLbEJkdnMzZ090K3A3WW5VR3hBPT0&landing_sign=oeGLSA6d_hSHBNfGWKb6U2HdajlzUXPyo8MdBJFtRNE

 Figure 8 

 

In assessing the program I found that there is much scope for improving opportunities for students to develop the questioning skills integral to inquiry learning. As has been discussed in the previous section, there is little opportunity for students to engage in the development of any type of question. Furthermore, at no point is any questioning framework made explicit nor am I able to identify any evidence that a particular questioning framework was used to guide the writing of the program.

The HTAA have not posed an essential question in writing the program, nor are students asked to generate one of their own. According to Wiggins (2007), an essential question encapsulates "the essence of the issue". Because it provokes deep thought, the inclusion of an essential question is an important element of authentic inquiry. Allowing students to generate their own essential question is a powerful tool in ensuring they feel ownership of their inquiry and, as such, their learning. For these reasons, I recommend that the program begin with students posing their own essential question after a discussion of the topic.

Regarding generative questioning, there are few opportunities for students to develop their own inquiry questions within the program, and where the opportunity is given, the process is controlled and directed. Allowing students to develop their own inquiry questions is important in ensuring a high level of student agency on the inquiry-learning continuum (Lupton, 2016, p. 5) and in engaging students in the learning process. To improve the program, I recommend that students be provided with the opportunity to develop their own inquiry questions.

In the program, students are not explicitly taught any questioning framework needed to provide them with the skills required for the effective generation of inquiry questions. Explicitly teaching students these questioning frameworks is important in providing them with the tools required to understand the inquiry process and to apply these skills in a range of situations. As such, I recommend that the questioning frameworks I have used in the development of my Prezi (see Figure 8) be explicitly taught to students.

In the program as it stands, students are provided with almost all “research” material in the form of curated information in PDFs and links to websites. This inclusion weakens the program in terms of the “research questions” stage of the inquiry learning process. Questioning frameworks surrounding the research stage of inquiry learning are not included in the program. I recommend that the questioning frameworks presented in my Prezi (see Figure 8) be exercised and taught explicitly in the program when it is re-written in order that students are able to aspire to a higher level of inquiry learning.

Regarding the evaluation stage of the inquiry learning process, the questioning skills students should demonstrate at this level relate to general information evaluation, discipline-specific evaluation, critical evaluation and an evaluation of information from the perspective of the Expressive window (see Prezi in Figure 8). Once again, explicit teaching of the types of questions students should ask within each area of evaluation is important to ensuring the effective mastery of inquiry learning processes.

In the program, students are rarely expected to evaluate information from the perspective of the generic window (where information is found through search strategies and regarded as external, codified and objective – see Figure 6). This can be partly attributed to the fact that the "research" information has been provided for most of the activities within the program. One exception is the activity The role of culture in Australia in learning sequence 1. This activity asks students to select a section of information from the Australian government's Department of Foreign Affairs and Trade website and, following six dot points provided, critically analyse what it says. These dot points deal with questions of evidence, viewpoints, purpose and audience. However, they are not addressed explicitly as a questioning framework that should be applied to all sources of information. Furthermore, this is the only real instance of evaluation of information from the generic perspective in the program. In learning sequence 3, students are advised that it is "important for you, as a historian, to make sure that you verify all the information you see…", yet no framework or suggestions are provided as to how this should be carried out. To improve the generic critical evaluation of information in this program, I recommend that students be explicitly taught the questioning framework proposed by Davis (1996), outlined in my Prezi (see Figure 8).

Critical evaluation questions that reflect the viewpoint of the situated window (see Lupton (2016), p. 11) are characterised by a discipline-specific approach. For the critical evaluation of information in the program to include this perspective, the program must include an element of historical source analysis. This is largely absent from the program as it currently stands. In the activity John Bywater interview in learning sequence 5, students are provided with information about a historical event, presented as a personal perspective, and are cautioned about the potential for bias in personal perspectives. Whilst the program does provide a list of questions that should be asked when considering a personal perspective, this list is not taught explicitly as a tool to be applied in future when students are considering a source of information such as this (which is contrasted against 'fact'). In re-writing the program, I recommend that students be taught and required to implement the questioning framework for historical sources developed by the HTAA (2013) (summarised in my Prezi in Figure 8).

In regards to critical evaluation questioning as viewed from the perspective of the transformative window (see Lupton (2016), p. 11), information in the inquiry process should be examined through "questioning power and agency" (Lupton, 2016, p. 11). Within the program under analysis, the activity in learning sequence 6 touches on this. The activity asks students to critically evaluate the depiction of "being Australian" in history, to question who is included and who is silenced, and to assess Australian identity in popular culture. It therefore reflects critical evaluation as viewed from the transformative window, with its overtly political and consciousness-raising character (Lupton, 2016, para. 4). I recommend that the program's revision integrate more instances of critical evaluation of information as viewed from the perspective of the transformative window, as this is an important aspect of inquiry learning and demands the higher order thinking skill of evaluation as defined by Bloom's Revised Taxonomy.

The fourth aspect of the evaluative questioning skills required for inquiry learning is the set of questions used to interrogate information from the expressive window (see Lupton (2016), p. 12). As Lupton (2016, p. 12) points out, "the expressive window sees inquiry and information literacy as an expression of oneself". This viewpoint encompasses considerations of self-awareness and identity. I have summarised the sorts of questions students should ask when evaluating information from an expressive window perspective in my Prezi (see Figure 8). These lines of interrogation are absent from the program under analysis, and as such, I recommend that in re-writing the program, the analysis of sources extend to this level and this questioning framework be explicitly taught to students.

The next phases of the research cycle of inquiry learning see students using questioning frameworks to interpret sources, draw conclusions and present findings. In my Prezi (see Figure 8), I have adapted the questioning framework developed by Gourley (2008) – specifically the 'Sorting out' stage (summarised here (p. 10) by Lupton). The program under analysis does not lead students through an 'interpretation' phase in the course of their research; there is no mention of questioning frameworks that can guide this process, nor are students explicitly taught these frameworks in order to gain skills in this stage of questioning within an inquiry process. The absence of this framework, as well as of one surrounding the development of a conclusion, should be amended through the implementation of an explicit process within the research activities of the program that covers both the interpretation of information stage and the reporting stage (as outlined in Gourley's (2008) 'Sorting out', 'Going further' and 'Making conclusions' phases – summarised by Lupton (2016) here). Furthermore, as outlined in my Prezi (see Figure 8), the questions that should be asked in the preparation of conclusions extend to a consideration of the Expressive window. This framework should also be implemented in the program when it is re-written.

Overall, the program is not set up as a series of inquiry learning research cycles, nor is there any evidence that an inquiry learning or information literacy model was actively referred to in the development of the program. Reformatting the program to include the recommendations outlined above will better align it into such a cycle, and provide students with a more engaging, rewarding and personalised learning experience.

Summary of recommendations

 Recommendations
Regarding Essential Questions: Alteration of the program to allow students to begin by posing their own essential question after a discussion of the topic
Regarding Inquiry Questions: Provision of the opportunity for students to develop their own inquiry questions, with this process supported through the implementation and explicit teaching of the questioning frameworks I have used in the development of my Prezi (see Figure 8 – Inquiry Questions).
Regarding Research Questions: Implementation and explicit teaching of the questioning frameworks as presented in my Prezi (see Figure 8 – Research Questions).
Regarding generic Evaluative Questions: Expansion of the program to allow students to be explicitly taught the questioning framework proposed by Davis (1996), outlined in my Prezi (see Figure 8).
Regarding situated Evaluative Questions: Implementation and explicit teaching of the questioning framework for historical sources developed by the HTAA (2013).
Regarding transformative Evaluative Questions: A greater requirement for students to critically evaluate information from the perspective of the transformative window.
Regarding expressive Evaluative Questions: Extension of the analysis of sources to this level in the re-written program, with this questioning framework explicitly taught to students.
Regarding interpreting sources and drawing conclusions: Implementation of a questioning framework within the program to guide students in these processes. The questioning frameworks to be implemented are to draw from Gourley (2008) and a consideration of the Expressive window. There should be explicit teaching of these frameworks.

Inquiry Learning and the Australian Curriculum

In this section I will analyse the program in the context of the expectations put forward by the discipline of History and the Australian Curriculum.

As pointed out by Lupton (2016, para. 1), each discipline approaches inquiry in a unique way due to the nature of knowledge construction, the types of sources used within the discipline, and the objectives of inquiry within that discipline. The scope and sequence of skills specific to the discipline of History, as presented by the Australian Curriculum, that are pertinent to this analysis of inquiry learning are listed in Figure 9 below. Each entry includes an analysis of the program on the basis of these expectations for Year 10 History students and inquiry learning.

Please note: much of the analysis of the program in the following table is a reiteration of the conclusions and recommendations stated in Part 1 (The Inquiry Approach) and Part 2 (Inquiry Learning – Principles and Skills) of this post. There are a number of reasons why I have not integrated this aspect of the analysis into the previous sections:

  1. Much of the clarity of the previous sections would be lost if this content was integrated
  2. I believe there is merit in structuring the analysis of the program according to the expectations of the Australian Curriculum, as opposed to the perspective presented when the inquiry learning model acts as the organising agent for the analysis (as is seen in the previous section)
  3. Presenting the correlation between the Australian Curriculum and the recommendations already made (as arguments for improving inquiry learning in the program) provides a strong conclusion to the analysis

Please note: the elaborations for each content description have been considered in this analysis.

Figure 9 – Historical skills, content descriptions and analysis of the program

Historical skill: Historical questions and research
Content description:
  • Identify and select different kinds of questions about the past to inform historical inquiry
Analysis of program: This content description indicates an expectation that students will be active in the development of their own inquiry questions at Year 10 level. As outlined in part 1, this is not yet a feature of the program. It has been recommended in part 1 that students be afforded greater agency over the development of inquiry questions.

Historical skill: Historical questions and research
Content description:
  • Evaluate and enhance these questions
Analysis of program: Following on from the observation and recommendation reiterated above, when students are provided with more opportunities to exercise agency over the development of inquiry questions (and thus the program reflects a more student-centred inquiry approach), it has been recommended that a questioning framework to guide the generation of these questions be implemented and explicitly taught. This recommendation can be found in part 2, and relates to the framework discussed by Rothstein and Santana. Within this framework are steps that require students to both evaluate and enhance the inquiry questions they are developing. Furthermore, a framework for what makes a good inquiry question was proposed in the Prezi (see Figure 8), based on the work of YouthLearn (2016) and Jackson (2013).

Historical skill: Historical questions and research
Content description:
  • Identify and locate relevant sources, using ICT and other methods
Analysis of program: This content description indicates an expectation that students in Year 10 will locate and access sources of information independently. As discussed in parts 1 and 2 of this analysis, the majority of the "research" material students use in the course of the program is provided for them in the form of curated information blocks in PDFs and links to websites. As recommended in part 2, students should have greater agency over the research process in regards to locating, accessing and searching for information.

Historical skills: Analysis and use of sources; Perspectives and interpretations
Content descriptions:
  • Identify the origin, purpose and context of primary and secondary sources
  • Evaluate the reliability and usefulness of primary and secondary sources
  • Identify and analyse the perspectives of people from the past
  • Identify and analyse different historical interpretations (including their own)
Analysis of program: These four content descriptors reflect an expectation set out by the Australian Curriculum that relates to a situated evaluation of information, namely the evaluation of sources from the perspective of the historical discipline. As indicated in part 2 of this analysis, for the majority of the program (with the exception of the John Bywater interview in learning sequence 5, discussed in part 2), this element of the inquiry process is absent. To rectify this absence, it is recommended that the questioning framework for historical sources developed by the HTAA (2013) be taught explicitly to students and implemented in the course of source analysis within the program.

Historical skill: Perspectives and interpretations
Content description:
  • Process and synthesise information from a range of sources for use as evidence in an historical argument
Analysis of program: This expectation set by the Australian Curriculum for students working at Year 10 level in History reflects the importance of implementing a questioning framework around the interpretation of information. This framework does not exist in the program at present. As set out in the Prezi (see Figure 8), the recommendation was made to integrate the 'Sorting out' section of Gourley's (2008) framework (summarised here (p. 10) by Lupton) into the program, and to teach this framework explicitly so that students are best equipped to develop these skills.

Historical skill: Explanation and communication
Content description:
  • Select and use a range of communication forms (oral, graphic, written) and digital technologies
Analysis of program: This relates to the agency of students to determine their inquiry learning process. As discussed in part 1, the program largely dictates how students will present the outcomes of their research. As such, the recommendation was made that, in the re-writing of the program, students be given greater opportunities to exercise agency over all parts of the inquiry learning process.

 

My next blog post will focus on bringing together all of the recommendations made throughout this post, and re-writing the HTAA program, Popular culture 1945 – present, to reflect a more authentic inquiry learning process based on the theories cited throughout this analysis.


3 Comments

  1. Hi Nina,

    Thank you for allowing me to comment on your blog.

    It is clear that you have a great understanding of this unit. Your analysis was thorough and very informative and I think you have applied the theories and concepts really well. Additionally, you have created a great balance between text and graphics/images. Your graphics were visually appealing and easy to read, as well as clearly articulating your analysis.

    I only have two minor comments. First, I noticed that the formatting of your table (Figure 9) is distorted, making the text in the first two columns (Historical Skills and Content Descriptions) difficult to read. Second, while I really like that you used a green font to highlight your comments throughout the post, it is almost the same colour green as your hyperlinks. This means that when a hyperlink is included in a recommendation, the hyperlink is not as prominent as it could be. Otherwise an excellent post!

    All the best.

    Renee : )

