Please Note: This is not a criticism of educators who use voice assistants as tools in their classrooms. It concerns the risk of allowing the wake word "Alexa" in the education sector and the importance of implementing appropriate wake words (e.g., "Amazon" and "Computer").
When You Know Better You Do Better
Educators and those in educational leadership bear the enormous responsibility of ensuring that best practices are consistently executed across all domains of the education sector, including voice technology. Educators play a pivotal role in creating collaborative, inclusive learning environments, and data-driven decision-making drives continuous improvement in equitable opportunities for all students.
We believe the education sector must ensure best practices in voice technology by adopting a policy change encouraging schools to use non-human names as wake words. Voice assistants hold potential as teaching tools, but not at the cost of marginalizing and dehumanizing students.
Inappropriate vs. Appropriate Practices
Does this follow the code of educational ethics and standards?
Does it adhere to best practices in education technology?
"Ethics in education sets the standards of what's acceptable and what's not, therefore, protecting the interest of both teachers and students." ~Ecole Globale
The data strongly indicate that the wake word "Alexa" poses multiple risks for students named Alexa (and similar names) in the education sector.
If there were no evidence of negative impact from implementing the wake word "Alexa," educators could be reassured that its integration followed best practices in educational technology. Strategies for handling negative outcomes can be modeled, but we must not lose sight of the core issue, which directly impacts not only Alexas but other students as well.
Educators seek to discover the root cause of issues children face in all facets of their development in order to create a path toward successful outcomes. Upon examining these outcomes (name changes and identity erasure, harassment, name shaming, child development impact, bullying, wake word restrictions, etc.) because of an AI device, what would be your justification for continuing to use "Alexa" in your classroom?
It is critical that we never lose the ability to examine all angles before implementing new technology. In this case, beyond a shadow of a doubt, the root cause is using a human name as a wake word for a voice assistant. When educators identify the root cause of an issue, it's instinctive to begin developing solutions.
Let's do the math. If 20 students each completed the "Ask Alexa" skills worksheet shown above once a week, the name Alexa would be uttered 300 times.
Students will use these skills more than once, however, so the name Alexa would be uttered far more than 300 times. Let's take it one step further.
Suppose only one class per grade level (K-5) in an elementary building engages in this "Alexa education." 300 utterances per class multiplied by six classrooms results in "Alexa" being said 1,800 times. And we're wondering why some Alexas don't want to be Alexa anymore?
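The arithmetic above can be sketched in a few lines. Note the per-worksheet prompt count of 15 is an inference from the article's own figures (300 utterances ÷ 20 students), since the worksheet itself isn't reproduced here.

```python
# A minimal sketch of the classroom math above.
STUDENTS_PER_CLASS = 20
PROMPTS_PER_WORKSHEET = 15   # assumption, inferred: 300 / 20 students
CLASSROOMS = 6               # one class per grade level, K through 5

# Utterances of "Alexa" per class, per weekly worksheet session
per_class = STUDENTS_PER_CLASS * PROMPTS_PER_WORKSHEET

# Utterances across the whole elementary building, per week
building_total = per_class * CLASSROOMS

print(per_class)       # 300
print(building_total)  # 1800
```

And that is the floor, not the ceiling: repeat sessions, multiple classes per grade, or daily use would multiply these totals further.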
As educators, we need to be asking ourselves:
✔Why are we singling out one child's name?
✔What are the potential long-term effects?
✔What message are we sending to children?
✔Is there a better approach?
Is this the vision we aspire to employ in the education sector?
Have you ever seen a child's name singled out to such an extreme degree?
Have you ever seen a child's name reduced to this level?
Evidence shows children are transferring the behaviors they employ with their voice assistants at home onto girls named Alexa.
What transference behaviors could arise in your classroom if one of your students is named Alexa?
Many say bullying is the issue, not the device. It is critical that educators and the public recognize the distinction: bullying is a secondary issue. The primary issue derives from Amazon choosing a human name for its voice assistant. If Amazon used a non-human wake word by default, as Google does, these issues wouldn't exist.
Let's reverse the scenario. What if educators decided that children with a particular name would not have equal access to learning how to read?
✔Alexa was the 32nd most popular name at launch in 2014.
✔Emily was the 32nd most popular name in 2020. (Source: Baby Center)
What if educators left Emilys out of the equation?
Parents would be outraged. Educators would be fired. Lawsuits would occur.
The irony? Alexas are being left out of the equation in the Voice in Education Tech world, yet few are batting an eye.
This isn't an apples-to-apples comparison, but it demonstrates that in other aspects of life, we wouldn't tolerate this level of collateral damage to our children.
"Keep parents in the loop with Alexa!"
"Stay connected to school with Alexa!"
"Alexa, write my IEPs."
"Alexa, do my lesson plans."
"Alexa, teach my students."
"Alexa, organize my classroom."
"Alexa, mute my students."
Not only is this dehumanizing and damaging to young Alexas; it also sends children the message that commanding people to do things is acceptable.
Wearing "Alexa" t-shirts and using "Alexa" cups and mugs in the school setting need to be eliminated when the evidence shows harmful impact.
Identifying the root cause of complications that surface in the education sector ensures that best practices are consistently implemented. A root cause analysis (RCA) is a problem-solving exercise undertaken when a problem has surfaced and, more importantly, needs to be resolved. If the root cause of an issue is not identified, the issue is likely to recur.
~ an educator on Twitter