Why I’m not too worried about generative-AI making our students less skilled

In recent years, the rise of artificial intelligence and machine learning has revolutionized the way we interact with technology. One of the most impressive developments in this field is the creation of language models such as ChatGPT, a tool that is changing the way students of computer programming learn and study. In this blog post, we will explore the ways in which ChatGPT is affecting students of computer programming and why it is such a valuable tool for those who are seeking to improve their skills in this field.

Dear reader of this blog post, I must now come clean: The previous paragraph was written by ChatGPT to the request: “Write a blog post about how ChatGPT is affecting students of computer programming.” I will now use my own voice and cognitive abilities.

My colleague Brian Jackson recently wrote an insightful and entertaining blog post here for The Humanities Center about ChatGPT and the angst and hand-wringing that many college instructors and K-12 teachers are doing because they worry it will replace the work done by students. The anxiety is high among those who rely heavily on student writing to assess learning. As an instructor of technical tools for linguistic analysis, I worry less than my colleagues about ChatGPT ghost-writing essays, but one of my potential concerns is the use of ChatGPT to write computer code.

I was hired by BYU’s Department of Linguistics in 2017 to contribute to the technical course offerings that teach students to use computers to analyze human languages. Among other topics, I teach students to write computer code in the Python programming language to analyze human language production in large amounts of text. For example, I teach students to write Python code that takes as input 1,600 text files and returns as output frequencies of three linguistic features of their choice (e.g., number of subject pronouns or number of contracted words) in eight registers of English found online (e.g., song lyrics, Wikipedia articles, opinion blogs). In another assignment, I ask students to write a Python script that automates the retrieval of comments below YouTube videos, categorizes each comment as positive or negative, and ranks how positive or negative its sentiment is. In yet another, I ask students to write a Python script that tests the correlation between the frequency of linguistic features of their choice (e.g., number of ALL CAPS words or number of quotation marks) in 120,000 Yelp reviews and the number of stars those reviews received.
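To give a flavor of the first assignment, here is a minimal sketch of counting linguistic features by register. It uses two invented in-memory sample texts rather than the actual 1,600 files, and the feature definitions are simplified for illustration; it is not the assignment’s solution code.

```python
import re

# Simplified illustration: count two linguistic features per register.
# The register names and sample texts below are hypothetical.
SUBJECT_PRONOUNS = {"i", "you", "he", "she", "it", "we", "they"}

def feature_counts(text):
    """Count subject pronouns and contracted words in a text."""
    # Tokenize on word characters, keeping contractions like "can't" whole.
    tokens = re.findall(r"\b\w+(?:'\w+)?\b", text.lower())
    return {
        "subject_pronouns": sum(t in SUBJECT_PRONOUNS for t in tokens),
        "contractions": sum("'" in t for t in tokens),
    }

samples = {
    "song_lyrics": "I know you know we can't stop now",
    "wikipedia": "The engine was patented in 1876 and it transformed industry",
}

for register, text in samples.items():
    print(register, feature_counts(text))
```

In the real assignment the texts would be read from files and the counts normalized by text length before comparing registers.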

While ChatGPT’s ability to write Python code is impressive, I’m not too worried. To my request “Write Python code to calculate the frequencies of a list of words” it created the following code, which accomplishes the task well:

words = ['apple', 'banana', 'apple', 'cherry', 'banana', 'apple', 'date']

freq = {}
for word in words:
    if word in freq:
        freq[word] += 1
    else:
        freq[word] = 1

print(freq)

Output:

{'apple': 3, 'banana': 2, 'cherry': 1, 'date': 1}

Additionally, ChatGPT was nice enough to offer a detailed explanation in prose of the logic of the code:

In this code, we start by creating a list of words called words. We then create an empty dictionary called freq to store the frequencies of each word.

We then iterate through each word in the list words. For each word, we check if it is already in the freq dictionary. If it is, we increment its frequency by 1. If it is not, we add it to the freq dictionary with a frequency of 1.

Finally, we print out the freq dictionary, which contains the frequencies of each word in the words list.
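Incidentally, this count-up-in-a-dictionary pattern is so common that Python’s standard library provides a one-line version, `collections.Counter`, which students often discover later:

```python
from collections import Counter

# Counter does the same tallying as the loop above in a single call.
words = ['apple', 'banana', 'apple', 'cherry', 'banana', 'apple', 'date']
freq = Counter(words)
print(dict(freq))  # {'apple': 3, 'banana': 2, 'cherry': 1, 'date': 1}
```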

Should my level of anxiety increase? Should I ban my students from accessing ChatGPT? How would I even enforce that? 

Here’s why I’m not too worried: I think we’re at a “horseless carriage” moment, or rather at another horseless carriage moment. Blacksmiths who relied heavily on shoeing horses for their income had to change and adapt to new technology: the steam engine, and later the internal-combustion engine. Many blacksmiths became the first automobile mechanics. Since its public release in 2006, has Google Translate put translators out of business? No, at least not yet. Translators now use machine translation as one of their tools to masterfully convey meaning from one language to another. In like manner, I believe that instructors who rely heavily on writing and (simple) computer programming to assess student learning will likely have to change and adapt to a new technology: human-like generative-AI. 

So, what’s an instructor to do? I think the most immediate thing is to ask ourselves questions. What is the purpose of assignments and assessments? Are we currently producing students who can write prose and program computers for themselves? Should we shift our focus to producing students who can produce good writing and solve text and speech processing tasks with computer code, regardless of how they go about producing it? What do employers and graduate school admission committees want? What can humans still do better than AI? How can I help my students use the new technology well? What flaws are there in the new technology that I should warn my students about?

My perspective is that ChatGPT is simply another tool that can help my students learn to write Python code in order to analyze human language. ChatGPT is the current kid on the block, but I imagine a new kid will make an appearance soon and wow (v.) the world like ChatGPT is doing right now. 

Perhaps the most important question for instructors is: How will I help my students prepare for the ever-increasing role of generative AI that we see emerging? Technology is always advancing. Perhaps the most important skill that instructors can teach their students is adaptability, and that skill is probably best taught by example.

This blog post was written by Earl Kjar Brown, Professor of Linguistics at BYU. 
