Guide To Teaching and Learning

Generative AI Tools: The Basics

What they are and how they work in really simple terms. You can view this document as a recorded 5-minute presentation. This is a very short and very simple introduction to generative AI tools like ChatGPT and Midjourney.

There are a number of terms that come up when talking about generative AI tools; here is a short glossary.

  • Generative AI tools are large language models built on a transformer architecture. They analyze billions of pieces of information drawn from the internet and use those patterns to predict an output in response to an input (a toy sketch of this prediction step follows this list).
  • They are “trained” to improve the accuracy of their predictions in order to generate (hopefully) meaningful outputs.
  • Transformer-based models power tools like ChatGPT (language generation) and Codex (code generation); image generators like Midjourney rely on related generative models.
  • They are extremely fast, thanks to this architecture and the enormous computing power behind them.
  • They can analyze whole sentences and paragraphs at a time, which makes them feel very conversational and human-like.
  • And their outputs feel authoritative, even when they may be factually incorrect.
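
To make the “prediction” idea concrete, here is a toy Python sketch of next-word prediction. The word list and probabilities are invented purely for illustration; real models learn probabilities over enormous vocabularies from their training data and consider far more context than a single preceding word.

    import random

    # Toy "model": for a given current word, the probability of each next word.
    # These numbers are invented for illustration only; a real model learns
    # probabilities over tens of thousands of tokens from its training data.
    next_word_probs = {
        "the": {"cat": 0.4, "dog": 0.35, "idea": 0.25},
        "cat": {"sat": 0.5, "ran": 0.3, "slept": 0.2},
        "dog": {"barked": 0.6, "ran": 0.4},
    }

    def generate(start_word, length=5):
        """Repeatedly pick a likely next word: that is all 'generation' is here."""
        words = [start_word]
        for _ in range(length):
            options = next_word_probs.get(words[-1])
            if not options:
                break  # no learned continuation for this word
            choices = list(options.keys())
            weights = list(options.values())
            words.append(random.choices(choices, weights=weights)[0])
        return " ".join(words)

    print(generate("the"))  # e.g. "the cat sat": fluent, but never checked for truth

The point of the toy: the program produces fluent-looking word sequences without any notion of whether they are true, which is why the outputs can feel authoritative even when they are factually wrong.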

Text-Based Models

  • Text-based models appear to “converse” with the user. In reality, they respond to text prompts by predicting which words are most likely to follow other words, drawing on patterns learned from the billions of pieces of text they were trained on.
  • They generate text that responds to your prompts with what would read as great confidence coming from a person, so it can be tempting to assume everything they return is true and factually correct.

The more we remember not to anthropomorphize these tools in the language we use to talk about them, the more we remind ourselves and our students that they are tools. Not people.

Image-Based Models and Multi-Modal AI

  • Image-based models can generate sophisticated-looking images based on a few words in a prompt (though at least for now, they often produce images with weird distortions, particularly of faces and hands).
  • In addition, multimodal AI can appear to “see” images and “understand” what it’s “seeing.” For example, a tool can be prompted to design a shoe and then to describe, critique, and refine its own design with minimal human intervention.
  • They can also “hear,” and their speech recognition is far more accurate than that of earlier voice assistants like Siri and Alexa.

What to be alert to:

Be very wary of anthropomorphizing these tools. Interacting with them can feel very much like interacting with a human being, but they are NOT:

  • Thinking
  • Understanding
  • Creating

They are putting words and phrases together based on the probability that certain words appear near other words, patterns learned from the data they were trained on. So while they really do appear to be thinking and understanding and creating, they’re not. At least not yet….

Always verify

  • The researchers who created these tools say that the tools can “hallucinate,” returning non-factual – but seemingly true – results. 
  • Students will be tempted to trust that the outputs are accurate because, especially in the language tools, they seem so human and so confident.

How to use them

  • It’s important that we do use them ourselves.
  • The best way to learn how to use them is to actually use them: play around and see how these tools work and what they can do.
  • Students will be using them in your courses.
  • Students will be living in a world where they need to know how to use these tools effectively.
  • Discuss their impact with your students.

Why use them?

  • Students can submit written or graphic work and request feedback from the tool, which they can then use to refine their work, documenting the process as they go, of course.
  • Students can submit text they’re struggling with and ask for a summary. English language learners can check their use of idioms.
  • Faculty can use them to get suggestions for learning activities in their courses, to provide preliminary feedback on student drafts, and to help design grading rubrics.

“Prompt Engineering”

It’s important to practice prompt engineering, that is, the writing, or rather the CRAFTING, of prompts, because a prompt in these tools is essentially this:

Computer code, in natural, everyday language

  • The better you are with words, the better your “code.” Think carefully about what you want the tool to generate and which words will best get you those results.
  • Telling the tool that it’s a university professor teaching a course on X and asking for suggestions for student learning activities may produce some surprising – and surprisingly useful – results (see the sketch below).
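
The role-setting prompt above can simply be typed into ChatGPT’s chat box. For those who prefer to see the programmatic form, here is a minimal sketch assuming the OpenAI Python library (openai >= 1.0) and an API key in the OPENAI_API_KEY environment variable; the model name, course topic, and prompt wording are placeholders, not recommendations.

    # Minimal sketch of a role-setting prompt sent to a chat model.
    # Assumes the OpenAI Python library (openai >= 1.0) and an API key in the
    # OPENAI_API_KEY environment variable; model name and wording are placeholders.
    from openai import OpenAI

    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            # The "system" message sets the role: a university professor.
            {"role": "system",
             "content": "You are a university professor teaching an undergraduate course on introductory statistics."},
            # The "user" message is the actual request.
            {"role": "user",
             "content": "Suggest three in-class learning activities that help students recognize sampling bias."},
        ],
    )

    print(response.choices[0].message.content)

Notice that the “code” here is mostly plain English: the craft is in choosing the role and phrasing the request, which is exactly what prompt engineering means.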

They’re here. And they’re not going away

The more you experiment with these tools, and the more intentionally you experiment with them alongside your students, the more confident you and your students will feel about how these tools can contribute to effective learning in your course.
