A student in the University of Washington’s Suzzallo Library in Seattle. (UW Photo)

The sudden rise of ChatGPT — the AI-powered chatbot released by OpenAI last fall — has educators across the country rethinking how they teach and assess student work. The University of Washington is among them, providing guidance to its faculty on how to navigate technology’s latest impact on education.

The UW’s Center for Teaching and Learning, which supports the advancement of the school’s teaching community, has issued strategies for instructors to help them communicate with students, set expectations, and develop assignments in the age of ChatGPT and other AI-based tools.

The strategies are broken into several areas, including:

  • Set clear policies for the use of AI in specific courses;
  • Communicate the importance of college learning;
  • Assess a student’s process of learning as much as (or more than) the outcome;
  • Acknowledge that the struggle is part of learning;
  • Consider teaching through AI-based tools.

The center says instructors who prohibit AI-based tools such as ChatGPT, and who suspect a student of academic misconduct, can file a report with Community Standards and Student Conduct.

The guidelines, first published Jan. 17, attempt to balance the benefits and drawbacks of artificial intelligence, addressing the logistical and ethical challenges of AI while recognizing that technology can be “a vital part of advancing knowledge.”

AI-based tools like ChatGPT “have the potential to either advance learning or shortchange students,” UW spokesperson Victor Balta told GeekWire. “Our instructors are exploring how AI-based tools can be used to facilitate learning and help students think critically about digital literacy and the accuracy of information.”

At the same time, he said, “students who use AI-based tools as shortcuts to complete assignments shortchange themselves.”

While the constant evolution of ChatGPT can make its use difficult to detect, the UW says faculty are paying careful attention and believe some students are using the tool to complete their work.

ChatGPT has generated equal parts intrigue and concern with its ability to quickly answer complicated questions and instantly produce content — including software code and student essays.

The bot builds on existing natural language technology developed by OpenAI, the San Francisco-based company backed by Microsoft, whose cloud computing platform powers the back-end for OpenAI products.

Across the country, AI is causing a “huge shift” in teaching and learning, prompting educators at all levels to react, the New York Times reported this month.

In some cases, educators are redesigning courses to stay ahead of the technology. For example, at some universities, professors are phasing out take-home, open-book assignments, which seem vulnerable to chatbots, opting instead for in-class assignments, handwritten papers, group work, and oral exams.

Seattle Public Schools joined a growing number of school districts banning ChatGPT on all school devices, saying that the district “does not allow cheating and requires original thought and work from students.”

OpenAI CEO Sam Altman recently addressed concerns about a new wave of plagiarism in schools, saying that AI will require everyone to adapt.

“We adapted to calculators and changed what we tested for in math class, I imagine,” Altman told StrictlyVC’s Connie Loizos. “This is a more extreme version of that, no doubt, but also the benefits of it are more extreme, as well.”
