Teaching Students To Use AI More Effectively

A robot studying at a library.
(Image credit: Image by CreativeCanvas from Pixabay)

Rather than discourage students from using generative AI, Dr. Jennifer Parker has developed a tool to help them critically assess what it generates. Her FLUF test is a scoring system for evaluating the accuracy, applicability, and usefulness of AI outputs.

The tool was inspired by digital literacy frameworks such as the CRAP test and the five key questions for media literacy. It draws on Parker’s three-plus decades in K-12 education and her current role as Faculty Development Coordinator at the Center for Teaching Excellence at the University of Florida.

“The FLUF framework can be used to help you create better prompts because it can assist you in identifying the who, what, where, when, why, and how, and what goes into a good prompt,” Parker says. It also provides a guide for evaluating the finished product and deciding whether you need to adjust your prompts further.

Here’s everything teachers need to know about the FLUF test and using AI with students more effectively.

Utilizing AI In The Classroom

The FLUF test stands for format, language, usability, and fanfare. Parker, who developed the framework while working with educators at the University of Florida, explains that format refers to the layout and length of a product created by ChatGPT or another AI tool, while language measures tone and phrasing. Usability covers credibility and consistency, and fanfare addresses the audience the output is reaching: Is it appropriate for the setting? Is it entertaining? Does it incorporate anecdotes?

Each of these elements receives a plus or minus score, and the final total makes the output easy to evaluate critically. The goal is a zero FLUF score and a fluff-free final output, Parker says.
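The article doesn’t spell out the arithmetic behind the plus/minus scores, but one plausible reading is a simple tally. The sketch below assumes a minus flags an element as still fluffy and that “zero FLUF” means no element is flagged; the element names come from Parker’s framework, but the scoring convention here is an assumption, not her published rubric.

```python
# Hypothetical sketch of FLUF-style scoring; the plus/minus convention
# here is an assumption, not Parker's published rubric.
FLUF_ELEMENTS = ("format", "language", "usability", "fanfare")

def fluf_score(ratings):
    """Count the elements rated "-" (i.e., still fluffy).

    ratings: dict mapping each element name to "+" or "-".
    A score of zero would mean a fluff-free output.
    """
    return sum(1 for element in FLUF_ELEMENTS if ratings.get(element) == "-")

# Example: language and fanfare still need work, so the score is 2
# and the prompt should be revised to address those two elements.
score = fluf_score(
    {"format": "+", "language": "-", "usability": "+", "fanfare": "-"}
)
```

Under this reading, a nonzero score points directly at which elements of the prompt to rework before the next attempt.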

The FLUF Test is available free on Parker’s website, and she encourages educators to start using it for their own AI experiments and with students. The test works with any product a generative AI tool creates.

“You can use it on an image, you can use it on a text. You can use it to critique a video work,” Parker says. “You're really looking for: Is the result your intended purpose? And if it's not, what do you need to do better with your prompts to speak to those elements of the FLUF test so that you really are getting the results that you intended.”

AI Prompts and Necessity

Researchers at the University of Florida and Central Michigan University, where Parker works as an adjunct instructor, are conducting studies of the FLUF test with students. In the meantime, Parker says using it has taught her how to write better prompts, and she has also developed a prompt generator template based on the FLUF test.

For example, Parker has found that a common mistake is writing prompts that are too short and lack specificity.

“Sometimes people write just a quick prompt, as if they were Googling something,” Parker says. “But with AI, you have to be really descriptive and detailed.”

You should include information about who and what the final product is for, and be specific about the desired tone and format, Parker says.
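Parker’s own prompt-generator template isn’t reproduced here, but her advice — name who the product is for, what it’s for, and the desired tone and format — can be sketched as a small template function. The field names and wording below are illustrative assumptions, not her actual template.

```python
def build_prompt(task, audience, purpose, tone, fmt):
    """Assemble a detailed prompt from the who/what/tone/format details
    Parker recommends including. A hypothetical sketch, not her template."""
    return (
        f"{task} "
        f"The final product is for {audience} and will be used to {purpose}. "
        f"Write in a {tone} tone, formatted as {fmt}."
    )

# A short, Google-style query ("photosynthesis study guide") becomes
# a descriptive, detailed prompt:
prompt = build_prompt(
    task="Draft a one-page study guide on photosynthesis.",
    audience="ninth-grade biology students",
    purpose="review before a unit test",
    tone="friendly, encouraging",
    fmt="a bulleted outline with short definitions",
)
```

Filling in each field forces the writer to answer the who, what, why, and how questions up front rather than leaving the AI to guess.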

For Parker, these are the kinds of AI literacy skills students should be developing in K-12 and college. For her, the answer to whether students should use AI tools is a resounding “Yes.”

“I think students should always use AI. I think that it's a tool that's here, just like the internet's here. I don't think we should hide from it,” she says. “What I do think we should do is teach them how to use it effectively and thoughtfully, and we also need to get teachers to engage in using it to have them create authentic products."

Erik Ofgang

Erik Ofgang is a Tech & Learning contributor. A journalist, author and educator, his work has appeared in The New York Times, the Washington Post, the Smithsonian, The Atlantic, and Associated Press. He currently teaches at Western Connecticut State University’s MFA program. While a staff writer at Connecticut Magazine he won a Society of Professional Journalism Award for his education reporting. He is interested in how humans learn and how technology can make that more effective.