Have you been wondering whether you should allow your students to rely heavily on tools like ChatGPT as a substitute for their own education? I faced that question when I saw Jensen Huang’s declaration that children should no longer bother to learn coding because AI will do it for them. Huang also said that AI can make everyone a programmer and that, in the near future, all we will need in order to program is our native language. So, before you pack your bags and stop teaching your students programming or any other subject, let’s look at the implications of what Mr. Huang was saying for students and for society in general.
Looking for more information, I found an article by Nathan Anecone titled “Jensen Huang is Wrong About Coding.” In it, Mr. Anecone lays out a systematic analysis of everything that is wrong with Mr. Huang’s declaration. He points out that relying heavily on AI while completely neglecting one’s own education and training can lead to an AI-driven “ignorance explosion” or “knowledge collapse.” Socrates once warned that it is a mistake to think you possess knowledge just because you own a library of books you haven’t yet read but might someday. Still, even if you don’t yet possess the knowledge in a book you own, the potential to acquire it is there. With AI over-reliance, you never actually learn anything; that attitude leaves you overdependent on an external source, in this case a machine.
To prove his point, he cites the example of over-relying on GPS to find our way around. Even though GPS makes it easier to reach a destination, being unable to apply our own navigation skills when it fails leads to plenty of navigational frustration. Research shows that drivers who rely on memory to pick a route tend to have a larger hippocampus, a brain structure implicated in memory. And losing control of navigation is nothing compared to handing our logical thinking skills over to AI.
As an educator, I have learned that students love a challenge. When they solve a problem, the experience is deeply gratifying, especially after overcoming a great deal of frustration to conquer a tough challenge. If we follow Mr. Huang’s advice, we will deprive children of this whole area of human experience. And if we continue his line of thought, why bother teaching students to paint or to write essays when AI can do the job for them?
Besides, if you never learned programming or any other subject because you deferred to AI, how will you even find the right words to talk about it? How will you evaluate or make use of the results without understanding them? By asking the AI to understand them for you? Any programmer will tell you that struggling to explain what they need in words is often far more time-consuming and inefficient than simply coding it.
My intention here is not to discourage students from using AI. AI can be a great help in coding and education, but using it as a substitute for students’ own knowledge can be detrimental to their future and to society at large.
When OpenAI released ChatGPT and teachers were panicking that it might upend education as we know it, I pointed out that it could also be an opportunity for teachers to shift from a traditional teaching methodology that AI can easily replicate to a deep learning methodology that AI is far less capable of taking over. A big limitation of AI is that it cannot think outside the box: it can learn over time from pre-fed data and past experience, but it cannot be genuinely creative in its approach. And even if, in an ideal world, it may one day be capable of such deep learning, stripping ourselves of the ability to critique what AI suggests we do would be very dangerous for society.