Artificial Intelligence in schools

The AI future for lesson plans is already here


Another of our intermittent blogs during the #AARE2022 conference
If you want to cover a session at the conference or want to respond to something you’ve heard, please email jenna@aare.edu.au to check in. Thanks!

George Siemens gave the 2022 AARE conference keynote speech yesterday morning and the takeaway was this: artificial intelligence (AI) is already changing education in Australia whether we’re ready for it or not. He went on to suggest that unless we’re mindful, we, as researchers and teacher educators, may not have a seat at the table in deciding what the future of education will look like. This blog post is about the future of AI-generated lesson plans and how we might think about this development today and into the future. Think of it as a concrete case study of exactly what George was talking about yesterday morning, where the conclusions might apply to other areas of education.

What do today’s AI-generated lesson plans look like?

AI-generated lesson plans are already better than many people realise. Here’s an example generated through the GPT-3 deep learning language model:

Prompt: A lesson plan for English for teaching 16 year old students about the idea of “juxtaposition” within Romeo and Juliet.

Output from GPT-3 (verbatim, with formatting provided by the algorithm):

Background

Juxtaposition is the act of placing two things side by side, often for contrast. In literature, authors often use juxtaposition to create a certain mood or atmosphere, or to emphasize certain qualities of a character or situation.

In Romeo and Juliet, Shakespeare uses juxtaposition extensively to contrast the two lovers and their families, the Montagues and the Capulets. He also uses juxtaposition to contrast the tragic events of the play with the light-hearted moments.

Objectives

By the end of the lesson, students will be able to:

– Understand the concept of juxtaposition

– Identify examples of juxtaposition in Romeo and Juliet

– Explain how juxtaposition is used to create mood and atmosphere in the play

Materials

– A copy of Romeo and Juliet

– A whiteboard and markers

– A teacher-created handout with examples of juxtaposition from the play (optional)

Procedure

1. Introduce the concept of juxtaposition to the class. Explain that authors often use this literary device to create contrast and emphasize certain qualities.

2. Ask the students to think of an example of two things that are often juxtaposed. Write some of their responses on the board.

3. Tell the students that they are going to be looking for examples of juxtaposition in Romeo and Juliet. Explain that they should pay attention to how the contrast created by the juxtaposition contributes to the mood and atmosphere of the play.

4. Give the students time to read through the play, or assigned sections of the play if time is limited. As they are reading, they should be looking for examples of juxtaposition.

5. Once the students have finished reading, ask them to share the examples of juxtaposition that they found. Write these examples on the board.

6. Finally, ask the students to explain how the juxtaposition affected the mood and atmosphere of the play. What did it emphasize about the characters or the situation?

Extension Activity

Have the students find additional examples of juxtaposition in other works of literature, or in everyday life. They can share their examples with the class and explain how the contrast created by the juxtaposition contributes to the overall effect.

Imagine that you’d set a task for a class of second-year Bachelor of Education students to create a lesson plan with that same prompt. How do you think the AI response would compare to some of your students?

Anybody can generate their own lesson plans for a year level, subject area, and topic that interests them with a similar prompt: https://beta.openai.com/playground
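For those who prefer to script this rather than use the Playground, a similar prompt can be sent to GPT-3 programmatically. This is a minimal sketch only: it assumes the `openai` Python package (as it worked in 2022) and an API key in the `OPENAI_API_KEY` environment variable, and the model name and sampling parameters are illustrative rather than a recommendation.

```python
# Sketch: generating a lesson plan with the GPT-3 completions API (circa 2022).
# The model name and parameters below are illustrative assumptions.
import os

def build_prompt(age: int, subject: str, topic: str, text: str) -> str:
    """Compose a lesson-plan prompt like the one used in this post."""
    return (f"A lesson plan for {subject} for teaching {age} year old "
            f"students about the idea of \"{topic}\" within {text}.")

def generate_lesson_plan(prompt: str) -> str:
    import openai  # pip install openai
    openai.api_key = os.environ["OPENAI_API_KEY"]
    response = openai.Completion.create(
        model="text-davinci-002",  # a GPT-3 family model available in 2022
        prompt=prompt,
        max_tokens=700,            # enough room for a full lesson plan
        temperature=0.7,           # some variety between generations
    )
    return response["choices"][0]["text"]

print(build_prompt(16, "English", "juxtaposition", "Romeo and Juliet"))
```

Swapping the age, subject, and topic arguments reproduces the exercise for any year level or subject area.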

Some concerns

It is trivially easy to generate useful lesson plans using this technique. One obvious upshot is that setting assessment tasks for any students in initial teacher education that involve them creating lesson plans isn’t a great idea any more—it’s too simple for them to generate one. Yet there are new opportunities that arise:

  • Why not get students to generate a few lesson plans, look at the patterns, and write something about the essential structure of this thing that we call a ‘lesson plan’?
  • Why not get them to take a generated lesson plan and improve it, annotating the reasons why their changes have made it better?

Another legitimate concern that arises is that inservice teachers might start to use the next generation of AI-generated lesson plans (which will undoubtedly be an order of magnitude more powerful) without critique—or worse, that some jurisdictions might actually request that teachers use such an approach in future.

A word that we need to look to is “design”

The issues that AI raises for lesson plans, and in many other places in education, can be addressed through a consideration of design. When design in education is done well (whether that’s learning by design, design thinking, co-design, or within the subject area named “design”) it always places an emphasis on two things:

  1. Authentic problems: such that the learner must always construct an interpretation of the problem before they can address it
  2. Process and rationale: such that the output that the student produces is impressive only if their process and rationale support what they’ve done.

When assessments include these two ingredients, educators can give students free rein to use whatever tools they have at their disposal, and the adoption of AI stops being a concern: when students are assessed on their process rather than their output, they can use whatever tools are available. The challenge is integrating the use of such tools into solving problems through collaboration, critical thinking, cultural understanding, and creativity.

Design as a response to “what should be taught”

George Siemens concluded his presentation by (controversially) suggesting a list of what should be taught in the context of an AI future. A summary/interpretation of his key points:

  • Beingness: what it means to be human in the world, the interconnectedness of all things
  • Systems thinking: how systems change and what complexity is about
  • Technology and how to use it: machine learning and data literacy, computational thinking, collaborating with non-human intelligences

Increasingly, design has become a part of education: design for learning, learning by design, design thinking, and so on. The epistemic fluency to design using computational tools in a way that enriches material life and human culture is at the root of all three of these areas.

For any subject area, teaching using a design approach shifts the focus from knowing content to knowing process. It becomes less about how to get from A to B in a straight line and more about knowing how to frame problems, use tools, and communicate outcomes. More design in education provides one way of responding to this increased presence of AI in education, whether we’re ready for it or not.

It might even provide a response to George’s provocation about McKinsey, Deloitte, or Microsoft trying to get in on a slice of the education sector. Education conceived as design—process rather than output—prioritises the humans involved in the enterprise and makes it harder to sideline educators.

Dr Nick Kelly is a Senior Lecturer in Interaction Design at the Queensland University of Technology, in the School of Design. He is a genuinely cross-disciplinary researcher spanning the fields of Design and Education. He conducts research into design cognition (how designers think), metacognition in learning (how teachers and learners develop their metacognitive abilities), and places where these two things come together (design pedagogy, design for learning, learning by design, design of learning technologies). His specialisation is in the design, facilitation, and analysis of online communities.

Dr Kelli McGraw is a Lecturer in the Faculty of Creative Industries, Education Social Justice at QUT. Currently teaching units in Secondary English curriculum, pedagogy and assessment, her prior experience includes teaching high school English and debating in Southwest Sydney, NSW. Kelli researches the fields of English curriculum studies, secondary school assessment, teacher identity, digital literacy, popular culture and new media texts.

Five thoughtful ways to approach artificial intelligence in schools

The use of artificial intelligence in schools is the best example we have right now of what we call a sociotechnical controversy. Partly as a result of political interest in using policy and assessment to steer the work being done in schools, partly due to technological advances, and partly due to the need for digital learning during COVID lockdowns, controversies have emerged regarding edtech and adaptive technologies in schools.

An emerging challenge for research has been how to approach these controversies, which require technical expertise, domain expertise and assessment expertise to fully grasp the complexity of the problem. That is what we term a ‘sociotechnical controversy’: an issue or problem where one set of expertise is not enough to fully grasp, and therefore respond to, the issue at hand. A sociotechnical problem requires new methods of engagement, because:

  1. No one set of expertise or experience is able to fully address the multi-faceted aspects of a sociotechnical controversy.
  2. We need to create opportunities for people to come together, to agree and disagree, to share their experience and to understand the limits of what is possible in a given situation. 
  3. We have to be interested in learning from the past to try to understand what is on the horizon, what should be done and who needs to be made aware of that possible future. In other words, we want to be proactive rather than reactive in the policy space.

We are particularly interested in two phenomena seemingly common in Australian education. The first of these concerns policy, and the ways that governments and government authorities make policy decisions and impose them on schools with little time for consideration, resourcing devoted to professional preparation and awareness of possible unintended consequences. Second, there tends to be a reactive rather than proactive posture with regard to emerging technologies and potential impacts on schools and classrooms. 

This particularly pertains to artificial intelligence (AI) in education, in which sides tend to be drawn between those who proselytize about the benefits of education technology and those worried about robots replacing teachers. In our minds, the problem of AI in education could be usefully addressed through a focus on the 2018 controversy regarding the use of automated essay scoring technology, which uses AI, to mark NAPLAN writing assessments. We focus on this example because it crystallised much about how AI in schools is understood, how it is likely to be used in the future, and the impacts it could have on the profession.

On July 26 2022, 19 academic and non-academic stakeholders, including psychometricians, policy scholars, teachers, union leaders, system leaders, and computer scientists, gathered at the University of Sydney to discuss the use of Automated Essay Scoring (AES) in education, especially in primary and secondary schooling. The combined expertise of this group spanned: digital assessment, innovation, teaching, psychometrics, policy, assessment, privatisation, learning analytics, data science, automated essay feedback, participatory methodologies, and emerging technologies (including artificial intelligence and machine learning). The workshop adopted a technical democracy approach which aimed not at consensus but at productive dialogue through tension. We collectively experimented with AES tools and, importantly, heard from the profession regarding what they knew would be challenges posed by AES for schools and teachers. Our view was that, as AI and tools such as AES are not going away and are already widely used in countries like the United States, planning for their future use is essential. The group also reinforced that any use of AI in schools should be rolled out in such a way as to place those making decisions in schools, professional educators, at the centre of the process. AI and AES will only be of benefit when they support the profession rather than seek to replace it.
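AES systems differ widely in sophistication, but the underlying idea can be sketched in a few lines: surface features of an essay are mapped to a score by weights that, in a real system, would be learned from a corpus of human-marked scripts. The features, weights and bias below are purely illustrative assumptions, not those of any real NAPLAN marking engine:

```python
# Toy automated essay scoring (AES) sketch. Extract simple surface features,
# then combine them with weights standing in for a trained model.

def features(essay: str) -> list[float]:
    words = essay.split()
    n = max(len(words), 1)  # guard against empty input
    return [
        float(len(words)),                    # essay length in words
        len({w.lower() for w in words}) / n,  # vocabulary diversity
        sum(len(w) for w in words) / n,       # mean word length
    ]

# Hypothetical weights and bias; a real system would fit these to
# thousands of human-marked essays.
WEIGHTS = [0.01, 2.0, 0.5]
BIAS = 1.0

def score(essay: str) -> float:
    """Weighted sum of features: the simplest possible scoring model."""
    return BIAS + sum(w * f for w, f in zip(WEIGHTS, features(essay)))
```

Even this caricature makes the workshop’s concern concrete: the model rewards proxies for writing quality rather than quality itself, and such proxies can be gamed, which is exactly why professional scrutiny of these tools matters.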

Ultimately, we suggested five key recommendations.

  1. Time and resources have to be devoted to helping professionals understand, and scrutinise, the AI tools being used in their context. 
  2. There needs to be equity in infrastructure and institutional resourcing to enable all institutions the opportunity to engage with the AI tools they see as necessary. We cannot expect individual schools and teachers to overcome inequitable infrastructure such as funding, availability of internet and access to computational hardware. 
  3. Systems that are thinking of using AI tools in schools must prioritise Professional Learning opportunities well in advance of the rollout of any AI tools. This should not be loaded on top of an already time-poor profession.
  4. Opportunities need to be created to enable all stakeholders to participate in decision-making regarding AI in schools. It should never be something that is done to schools, but rather supports the work they are doing.
  5. Policy frameworks and communities need to be created that guide how to procure AI tools, when and how to use AI, and why schools might choose not to use AI in particular circumstances.

From working with diverse stakeholders it became clear that the introduction of AES in education should always work to reprofessionalise teaching and must be informed by multiple stakeholder expertise. These discussions should not only include policymakers and ministers across state, territory, and national jurisdictions but must recognise and incorporate the expertise of educators in classrooms and schools. A cooperative process would ensure that diverse stakeholder expertise is integrated across education sector innovation and reforms, such as future AES developments. Educators, policymakers, and EdTech companies must work together to frame the use of AES in schools as it is likely that AES will be adopted over time. There is an opportunity for Australia to lead the way in the collective development of AES guidance, policy, and regulation. 

Link to whitepaper & policy brief: https://www.sydney.edu.au/arts/our-research/centres-institutes-and-groups/sydney-social-sciences-and-humanities-advanced-research-centre/research.html

Greg Thompson is a professor in the Faculty of Creative Industries, Education & Social Justice at the Queensland University of Technology. His research focuses on the philosophy of education and educational theory. He is also interested in education policy, and the philosophy/sociology of education assessment and measurement with a focus on large-scale testing and learning analytics/big data.

Kalervo Gulson is an Australian Research Council Future Fellow (2019-2022). His current research program looks at education governance and policy futures and the life and computing sciences. It investigates whether new knowledge, methods and technologies from life and computing sciences, with a specific focus on Artificial Intelligence, will substantively alter education policy and governance.

Teresa Swist is Senior Research Officer with the Education Futures Studio and Research Associate for the Developing Teachers’ Interdisciplinary Expertise project at the University of Sydney. Her research interests span participatory methodologies, knowledge practices, and emerging technologies. She has a particular interest in how people with diverse expertise can generate ideas, tools, and processes for collective learning and socio-technical change.

Artificial Intelligence (AI) in schools: are you ready for it? Let’s talk

Interest in the use of Artificial Intelligence (AI) in schools is growing. More educators are participating in important conversations about it as understanding develops around how AI will impact the work of teachers and schools.

In this post I want to add to the conversation by raising some issues and putting forward some questions that I believe are critical. To begin I want to suggest a definition of the term ‘Artificial Intelligence’ or AI as it is commonly known.

What do we mean by ‘Artificial Intelligence’?

Definitions are tricky because the field is so interdisciplinary; that is, it relates to many different branches of knowledge including computer science, education, game design and psychology, just to name a few.

I like the definition offered by Swedish-American physicist and cosmologist Max Tegmark. He describes Artificial Intelligence systems as being ‘narrowly intelligent because while they are able to accomplish complex goals, each AI system is only able to accomplish goals that are very specific.’

I like this definition because it mentions how complex AI can be but makes us focus on the reality that AI is narrowly focused to fulfill specific goals.

We already live in a world full of AI systems including Siri, Alexa, GPS navigators, self-driving cars and so on. In the world of education, big international companies are currently working on, or already marketing, AI systems that develop “intelligent instruction design and digital platforms that use AI to provide learning, testing and feedback to students”.

We need to begin to pay attention to how AI will impact pedagogy, curriculum and assessment in schools, that is, how it will impact end users (teachers and students). There is a lot to think about and talk about here already.

Artificial Intelligence in Education

Conversations about Artificial Intelligence in Education (AIEd) have been going on for many years in the world of education. This year the London Festival of Learning, organised by Professor Rose Luckin and her team, brought together scholars from around the world in the fields of AIEd, Learning at Scale (large-scale online learning platforms) and the Learning Sciences.

Closer to home, the NSW Department of Education has been on the front foot in raising awareness of AIEd in a series of papers in its Future Frontiers agenda. This is a compilation of essays that canvass “perspectives from thought leaders, technology experts and futurists from Australia and around the world.” These are helpful articles and thought pieces. They are worth checking out and can serve to inform nascent conversations you might want to have about AIEd.

Questions for schools and teachers

It is important for researchers and teacher educators like myself to explore how AIEd will supplement and change the nature of teachers’ work in schools. We need to understand how this can be done in education so that human intelligence and the relational roles of teachers dominate.

How will schools be involved? And how could the changing education landscape be managed as the subject of AIEd attracts more attention?

Leading research scientist and world expert in AIEd at University College London, Professor Rose Luckin (who incidentally is a former teacher, school governor, and AI developer/computer scientist), captures the core argument when it comes to school education. She says it is more about how teachers and students will develop sufficient understanding of AIEd so that it can be augmented by human intelligence when determining what AIEd should and should not be designed to do. For example, Luckin suggests that if purely technological solutions dominate the agenda, then what AIEd can offer for change and transformation in teaching and learning will be limited.

The Australian Government’s Innovation and Science Australia (2017) report, Australia 2030, recommends prioritisation of the “development of advanced capability in artificial intelligence and machine learning in the medium- to long-term to ensure growth of the cyber–physical economy”.

It also lists Education as one of its “five imperatives for the Australian innovation, science and research system” that will equip Australians with skills relevant to 2030, thus highlighting the need to understand the implications of AIEd for schools.

Critical moment for school education

There is conclusive international evidence that we are at a critical moment for setting clearer directions for AIEd in school education.

With crucial questions being asked internationally about AIEd, and local reports like Australia 2030 published, we must start to probe Australian policy makers, politicians, school principals, students and parents, as well as the teaching profession more broadly, about such vital issues. Indeed the NSW Department of Education held a forum to this end in 2017 and I understand more are planned.

Schools are one focus of the agenda, but how are teacher education programs in universities preparing preservice teachers for this future? Are we considering questions of AI in our preparation programs? If we need to lift the skill levels of all school students to work in an AI world then what changes might we need to make to accommodate AI in school curriculum, assessment, pedagogy, workload and teacher professional learning?

The debate about robots replacing teachers is not the main event. There will be assistants in the form of dashboards, for instance, but humans will still do all the things that machines cannot do.

Moreover, there is a great need for deeper understanding of learning analytics. There are also questions of opaque systems, bias in algorithms, and policy/governance questions around data ethics. Such topics could form foundational programs in teacher education courses.

More hard questions

What implications do AIEd and automated worlds have for school infrastructure? How can higher education and industry support schools to be responsive and supportive to this rapidly changing world of AI?

Leaping back to the London Festival of Learning for one moment, Professor Paulo Blikstein, from Stanford University, in his keynote address painted a grim picture of the dangers that lie ahead and he told his audience that it is time to ‘make hard choices for AIEd.’

He explained a phenomenon of We Will Take It From Here (WWTIFH) that happens to researchers: tech businesses tell researchers to ‘go away and play with their toys’ while they take over and develop the work technologically … taking over things “in the most horrible way”. Blikstein outlined how most tech companies use algorithms that are impenetrable and do not consult with the field. With few policy or ethical guidelines in the US overseeing decision making in these areas, it is a “dangerous cocktail”, described by Blikstein’s formula of:

WWTIFH + Going Mainstream + Silicon Valley Culture + Huge Economic Potential = DANGER.

I agree with his caution: people in positions of power in teaching and learning in education need to be aware of the limitations of AI. It can help decision makers but not make decisions for them. This awareness becomes increasingly important as educational leaders interact and work more frequently with tech companies.

In teacher education in Australian universities we must begin to talk more about AIEd with those whom we teach and research. We should be thinking all the time about what AI really is, and not naïvely privilege AI over humans.

As you might sense, I believe this is a serious and necessary dialogue. There are many participants in the AIEd conversation and those involved in education at all levels in Australian schools have an important voice.


Dr Jane Hunter is an early career researcher in the STEM Education Futures Research Centre, Faculty of Arts and Social Sciences at the University of Technology Sydney. She was a classroom teacher and head teacher in schools both in Australia and the UK. Jane is conducting a series of STEM studies focused on building teacher capacity; in this work she leads teachers, school principals, students and communities to better understand and support education change. Her work in initial teacher education has received national and international recognition with a series of teaching awards for outstanding contributions to student learning. She enjoys writing and her research-based presentations at national and international conferences challenge audiences to consider new and alternate education possibilities. A recent podcast with Jane on AIEd can be heard here. Follow her on Twitter @janehunter01
