We hear a lot about AI in education in the US on here, so can I just tell you about how it's working out in the schools sector in England? It's interesting! It's about data consultants, private education outfits and think tanks. Slow thread...
Public education itself is a more wildly ambitious innovation than anything the tech industry has ever, or could ever, offer. Working to fulfill the promise of that innovation can help us be clear-eyed about what a volatile, diminished substitute Alpha School and its ilk are peddling.
There is of course this fantastic piece from Audrey Watters - but it's paid subscribers only (do subscribe! it's worth it!) and I need something for students ... 2ndbreakfast.audreywatters.com/ai-literacy-...
Maybe I do need to do that! Aside from all the co-option of AI literacy (even 'critical') for corporate interests it's such a get-out clause to avoid institutions and governments taking responsibility.
Can anyone recommend a blog, newsletter or open access article on critical AI literacy? Even better if it critiques the promotion of AI literacy as a way to avoid AI risks and harms!
It's like the people pushing AI don't know or care how humans work. We relate through shared effort and shared struggle. Teachers want to be understood, students want to understand. Each learns from and instructs the other. That IS the relationship.
Finding from UCU member survey
Is AI helping or harming your work?
From classroom tools to performance monitoring, over 1,700 UCU members shared how AI is creeping into every corner of post-16 education.
Read the full report on how AI is already impacting members' working lives: tr.ee/rldkpe
No human doctor has the time to do real-time tracking of all their patients' data. This is clearly predicated on diagnosis and prescription carried out by AI doctors too.
I'm increasingly of the opinion that measuring time saved on specific tasks misses the point entirely - we need to ask if and how that notional time saved is actually used instead.
The pick'n'mix statement bank approach to report writing, much loathed by parents, has paved the way for this further automation.
There are multiple AI/automation discussions around the UK public sector where you can see this simplistic multiplication of "potential savings for a human task" by "number of tasks" going on.
Kinda fine if it's just for PR, but if people are building it into business cases then it could easily damage services.
The important question isn't whether time can be shaved off a specific task, but how that time is then used, and the experience or impact on *overall workload*
Just heard from a teacher: because AI can speed up report writing, they are now given less time to do it and expected to do more. Claims that AI will solve the teacher workload crisis need to look at how automated productivity has played out for workers in the past. #luddite #praxis
Another uninvited and unwelcome AI intrusion that actively subtracts rather than adds value. Nobody asked for or needs this! Who benefits from this?
And yes, maybe this could be helpful as a scaffold for going on to read the chapter in more depth. But really - how likely is that to actually happen in the context of current university education?
As well as not really an accurate summary of the argument.
The summary of Ch.8 in Digital Timescapes by @robkitchin.bsky.social is so bland it's useless: 'The development and adoption of digital devices and technologies are reshaping the temporal relations and organization of work and labour, leading to new dynamics such as the gig economy and automation.'
How are we supposed to continue encouraging deep reading when our library's ProQuest Ebook Central has a 'Research Assistant' AI summary of the book chapter popping up at every moment, like an unwelcome and uninvited guest?
We are looking for a Postdoctoral Researcher to contribute to our work on evaluating whether digital technologies have good societal outcomes - the Digital Good Index.
We set out some ideas about how the DGI might take shape on our website: digitalgood.net/dg-research/...
Quite!
And just ... why bother being a teacher if you spend more time fact-checking text extrusion machine outputs than developing your own thinking?
Transparency and accountability are core professional principles, but the responsibility is entirely placed on education staff.
Why let providers and companies off the hook like this?
Doing this thoroughly and consistently will likely take *more time* than that saved (so it won't get done)
The new DfE guidance for teachers stresses that *you* (the teacher) are at all times wholly 'responsible for both the input and output of AI tools'. Who's responsible for the bit in the middle?!
Among the many horrific things I have heard recently - I just discovered that Oak National has an AI lesson planning tool. But my absolute favourite bits are that a) it gets stuff wrong, b) it suggests hilariously bad tasks, and c) it's literally no help unless you already know the history.
This looks great! Would be great to use this for discussions with my Digital Education masters students.
*rolls eyes forever*
AI as a teacher time saver is largely a mirage once you add all the additional work accompanying it
Perfectly succinct statement of the problem with AI fuelled research and learning.
Which raises an obvious question ... why so much energy, resource and rush to AI-ify it all? And which teacher voices are being heard and amplified by political and tech interests?
Very interesting counter to all the 'tech/AI will solve education' hype - a small-scale survey found that a majority of US teachers do NOT expect technology to improve pupil outcomes, behaviour, attendance, or teacher satisfaction. teachertapp.com/articles/tea...