Artificial Intelligence in Teaching & Learning
Join the AI Learning Community! Four meetings during Fall 2023; attendance in person or via Zoom.
Getting to know AI
The field of Artificial Intelligence (AI) is progressing rapidly and in bewildering ways. Applications of AI have become common in almost every sort of human interaction, including teaching and learning. You’ll find guidance below for thinking about the role of AI in your classes, writing your own AI policy, and responding to unauthorized AI use.
What is AI?
AI is an umbrella term for any computer program that simulates cognitive processes. Many different AI platforms are now accessible for public use. ChatGPT is perhaps the most widely used among university students and of most concern to faculty. ChatGPT and similar AI technologies use predictive algorithms to generate written text. They work by analyzing a large dataset of text to learn the patterns and relationships among words and phrases. When given a prompt or a starting point, these programs can generate text that is similar to text they have seen before. They can answer one-off prompts or engage in dialogue. It is often hard to distinguish their output from human writing.
What can AI not do?
There’s still quite a bit that AI and ChatGPT can’t do. These limitations may help you understand the technology’s capabilities and inform how you design assignments. For instance (for now), ChatGPT cannot fact-check; engage in self-reflection; write about anything that happened after 2021; create infographics, interactive maps, videos, or memes; make predictions about the future; browse or summarize content from the internet (it was built with a specific data set and does not search the internet); cite specific examples or quotations from another text; or draw connections between course content and visual materials.
When and why might students use AI?
Some students use AI in situations similar to those in which they might formerly have engaged in plagiarism. They may resort to ChatGPT because of a heavy course load or work schedule, overwhelming family obligations, or because they don’t understand the assignment. They may use ChatGPT because they are unaware of what is allowed and what is not.
Your students may also use ChatGPT in productive ways:
- To edit their writing for grammar, spelling, and/or writing mechanics
- To seek explanations for complex ideas and theories
- To brainstorm
Whether you discourage or encourage the use of AI, you should talk to your students about your expectations early and often—on the first day of the quarter and in preparation for all major assessments.
Approaching AI use in your teaching
Take a proactive approach to AI. Regardless of your own feelings about it, AI is here to stay; eventually, you will need to find ways either to work with it actively or to minimize its downsides. The application and utility of AI will inevitably differ across disciplines. In disciplines where professional practitioners use AI to save time on tedious tasks, for instance, teaching practical AI use may give students a future employment advantage. In disciplines such as writing, however, it may detract from students’ opportunities to develop skills. Your approach to AI use should address the specificities of your discipline, your department, and your professional practice as a scholar and teacher. Keep in mind, too, that your students are looking to you as a model for standards of scholarship within and beyond your discipline. If they perceive apathy in your approach to the ethical use of AI, they may follow your lead.
How can I incorporate AI into my teaching?
Some instructors have chosen to incorporate AI use into their assignments with the proviso that students clearly cite where they have used it. Here are just a few of the wide-ranging possibilities for this technology:
- Have students use AI to generate text based on a prompt that you provide, then ask them to critique the results at the level of content, writing, or both.
- Ask students to compare and contrast AI-generated text with human-generated text.
- Invite students to use AI for components of assignments that a professional in the field would be likely to use AI for (e.g., simple coding).
- Use AI to generate conversations about ethics and technology, and how the topics will increasingly converge in education. Who has access to technologies like AI and will they always be “free” and equally available? Who might be helped and harmed by AI? How can the technology be used to promote educational equity?
- Have students interact with AI on topics from the course and analyze and critique what it generates. Do small changes to prompts result in significant differences in output? Are there significant omissions in the text that AI generates, and if so, what might be the causes for the omissions? Ask your students what they learned from interacting with AI and how it might influence their future writing both in terms of how they express their ideas and arguments, and their use of voice.
- Design multipart assignments that require students to use AI (e.g., in generating synthesis) and reflect on what it produces.
- Create assignments in which students engage in debate or argument with AI and reflect on what they learned.
- Use AI to generate writing prompts or quiz questions based on the course materials. Invite students to critique what AI did and didn’t get right.
- Ask students to “remix” their work creatively using AI (such as writing in the voice of an author or historical figure they’re studying) and to reflect on the new version.
- Use AI as a resource for students to gain feedback on their writing by developing a rubric that students can submit with their writing to an AI program.
- Invite students to consider the nature of tone and voice. For example, if AI consistently generates text that reads as if it were written by a robot, what precisely are the elements of a piece of writing that give it tone and voice? How can students articulate what’s missing?
Drafting an AI use policy for your class
Should you include an AI-use policy in your syllabus? Absolutely. Craft an AI policy that aligns with the specific needs and ethics of your discipline, your department, and your course. Rather than providing institutional language, we offer the customizable formula below for constructing your own policy. Sample syllabus language follows this guide to give you some ideas for creating a customized version that suits your needs. You can also refer to statements used in courses at UC Santa Cruz or to this crowd-sourced document that contains syllabus language samples from higher-ed institutions all over the U.S.
Note that unless your AI policy is uniformly prohibitive (including tools such as grammar and spelling checkers), your guidelines will be clearer to students as a standalone statement than as a brief part of your general academic integrity policy. An effective policy includes:
- A discipline-specific explanation of the role of AI in learning and practice
- When and how AI resources may and may not be used in your course
- When and how AI resources should (if you allow their use in certain circumstances) be cited or acknowledged
- How you will handle cases of suspected or confirmed AI use that contravenes your policy
- A list of resources students can use if they need help understanding how to interpret or use your policy, including an invitation to attend office hours
Putting It All Together
- Follow equity best practices by using welcoming and inclusive language and writing in the first person; framing your policy as a commitment to helping students grow their capacities will be more persuasive than threats of discipline.
- Review your policy carefully to make sure you haven’t incorporated any requirements that are unenforceable or illegal (in particular, the non-consensual use of AI detection software, which is currently prohibited by the University of California due to its infringement of students’ privacy).
Sample Syllabus Language
The samples provided below are from a CSE course and a Sociology course, respectively. Your message to students about the use of AI (and academic integrity in general) should be specific to your course and discipline and reflect your teaching philosophy, the priorities of your department, and the conventions of your field.
Collaboration and AI Policy
- In this class, I ask that you complete your coding assignments without using AI-generated sources in any way: neither for pseudocode nor to augment or write your code. For example, you cannot prompt an AI to generate code for the assignment and then rewrite the code based on the AI-generated code.
- You cannot use AI-generated sources for any reason during a quiz or an exam. If you submit an exam that appears to be written with AI sources, I will ask you to meet with me to discuss your answers.
- You are welcome to use AI-generated tools for brainstorming and conceptual understanding about high-level concepts, similar to asking a colleague something like “What are some reasons to use depth-first search versus breadth-first search?” Of course, note that these tools may hallucinate information and they may not provide truthful insights or information. These queries are only allowed outside of (1) coding assignments and (2) quizzes and exams.
- I may create an assignment that will ask you to critique content that is generated by AI. If this occurs, I will provide clear assignment-specific AI-use guidelines within the prompt.
- Students may reference online message boards (e.g. Stack Overflow) to debug errors in their code, but may not reuse code from these sources.
- Any external sources (excluding references linked in the assignment) should be cited using comments (##) in the relevant code block.
- If you submit work that appears to have been written by another source without attribution, I will ask you to meet with me to discuss your thinking and problem-solving. If, after our conversation, I conclude that it’s more likely than not you did not personally complete an assignment that you submitted under your name, I may refer you to your college provost for further conversation.
- If you have questions about AI use and/or collaboration and proper attribution of people’s work, please come ask me! Citing work, especially in computer science, is not intuitive, and part of my role is to help you learn those conventions.
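To make the policy’s citation convention concrete, here is a hypothetical sketch (the function and the cited source are invented for illustration): an idea borrowed from an external source is acknowledged with `##` comments placed directly in the relevant code block, as the sample policy asks.

```python
## Source: adapted from the iterative binary search described in our
## course textbook's chapter on searching (hypothetical citation shown
## in the "##" comment style the sample syllabus requires).
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1   # target lies in the upper half
        else:
            hi = mid - 1   # target lies in the lower half
    return -1
```

Placing the citation on the code block itself (rather than in a separate document) keeps the attribution attached to the exact lines it covers, which makes the student’s borrowing easy for a grader to audit.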
A Word About Integrity
Integrity – other people’s perception of your word as true – is one of the most valuable assets you can cultivate in life. Being attentive to integrity in academic settings allows others to trust that you have completed work for which you are taking credit. This is symbolic of the public trust from which you will benefit in your future occupation and activism after you graduate from UCSC.
The creativity of your words, expression, understanding, and knowledge matters a great deal in your work as a sociologist, and it matters to me. My AI policy reflects the emphasis our discipline places on original thought and scholarship.
- In this class, I ask that you complete your work without using AI-generated sources to augment, think through, or write your assignments.
- There is one exception: you are welcome to use AI tools for pre-submission editing (spell-check and grammar-check) as long as you do not use them for thinking or drafting.
- On rare occasions, I may create an assignment in which I ask you to critique content generated by AI; if this occurs, I will provide clear assignment-specific AI-use guidelines within the prompt.
- If you submit work that appears to have been written using AI sources, I will ask you to meet with me to discuss your thinking and writing process. If, after our conversation, I conclude it’s more likely than not that you did not personally complete an assignment you submitted under your name, I may refer you to your college provost for further conversation.
- If you have questions about AI use and/or proper attribution of other people’s work, please come ask me! Scholarly citing is not particularly intuitive, and part of my role is to help you learn those conventions.
Detecting and Responding to Unauthorized AI Use
How do I know whether a student has used AI on an assignment?
While conventional plagiarism is often easy to identify, text produced by AI may be indistinguishable from other student work. Some AI-generated text contains the telltale signs of a large language model: plausible but illogical arguments, invented “facts,” prose that feels flat. But trying to detect AI-generated text is a losing game, and it will only become more difficult as the technology advances.
May I use an AI detector?
There are programs that detect AI-generated text with varying levels of accuracy, but they may only be used at UC Santa Cruz if the tool is hosted locally with data protected from external access, if the tool is contracted through campus Purchasing, or if you obtain prior approval from students. If none of these criteria is met, the use of AI detectors may violate the Family Educational Rights and Privacy Act (FERPA), which prevents the unauthorized sharing of student work. The guidance is explained in detail in this March 20, 2023 communication to faculty.
A less direct—but still informative—approach is to enter your own assignment prompt into ChatGPT a few times to see what it produces. Platforms like ChatGPT don’t generate the same response twice, but they do generate similar responses across multiple iterations. Certain words and phrases reappear; arguments and analyses are markedly similar. You may compare AI-generated text to student submissions without prior approval. If applicable, share the output with your teaching team and ask them to familiarize themselves with it.
A note of caution: If you plan to use a detection tool, proceed carefully, as detection tools occasionally produce false positives, which can harm student-instructor relations when false accusations are made.
What should I do if I suspect unauthorized AI use?
If you suspect that a student’s submission was generated by AI, start with a conversation. Students who completed their own work are generally able to explain and defend their process. Assume the best of your students and consider your evidence carefully before coming to conclusions about a student’s behavior. Just as an AI detector can return a false positive, your intuition is fallible. An unfounded accusation against one student can damage your relationship with all of your students.
If you are still convinced that the student did not complete the assignment in accordance with your AI/academic integrity policy, follow up with the actions identified in your policy, such as no credit for the assignment or referral to the college provost. Describe this process clearly in your syllabus to ensure that your students perceive it as being fairly applied across the class.
If you choose to pursue institutional disciplinary action based on suspected academic misconduct, be sure to clearly document the reasoning behind your suspicions.
Proactive approaches to minimizing unauthorized AI use
There is no way to prevent AI use in your classroom, but you have more influence than you might realize on students’ thinking and decision-making. The following is a non-exhaustive list of strategies you might consider incorporating in concert with one another:
- Promote the values of academic integrity: doing your own work, acknowledging (and citing) sources of ideas and information, improving with practice, honesty and openness.
- Discuss AI’s limitations and biases with your students
- Discuss the professional and personal implications of failing to acquire the skills/mastery that your assessments are meant to confer
- Ask students to sign an academic integrity pledge at the beginning of the quarter and/or as they submit each assignment
- Write a thoughtful academic integrity policy beyond AI use that clearly conveys to students the value that you and your discipline place on professional trust and ethics
- Inoculate against cheating by appealing to students’ sense of personal honor and desire to attain mastery
- Explain the pedagogy behind growth through mental effort: in short, you have to put real effort into learning how to write/think if you want to own that knowledge in the long term
- Explain why students should want to know how to write – beyond passing the class (for example, in what other adult contexts could fluid writing prove useful?)
- Explain the consequences of breaking interpersonal trust and the difficulties in repairing it; make clear that you want to be able to experience reciprocal trust in your students
- Design assignments that encourage students to leverage their creativity and personal experiences
- Use writing prompts that are less compatible with AI text generators (here is one such source)
- Build a peer-review process into major assignments, and require that students incorporate and address their peers’ feedback in subsequent drafts
- Break larger assignments into smaller, scaffolded assignments to decrease the anxiety surrounding high-stakes assessments and to give you and your teaching team the opportunity to make early interventions if you suspect AI use
- Ask students to reflect, as part of an assignment, on their writing process
- Remind students that there could be unanticipated future academic or career consequences to having represented AI-generated writing as one’s own
What we’re reading about AI right now
- “Teaching with Text Generation Technologies” (WAC Clearinghouse, Tim Laquintano, Carly Schnitzler, and Annette Vee)
- “Why Professors are Polarized on AI” (Inside Higher Ed, Susan D’Agostino)
- “I’m a student. You have no idea how much we’re using ChatGPT” (Chronicle of Higher Education, Owen Terry)
- “Resources for Exploring ChatGPT in Higher Education” (Bryan Alexander)
- “Teaching About the Use of Generative AI” (University of North Carolina, Chapel Hill)
- “Caught Off-Guard by AI” (Chronicle of Higher Education, McMurtrie and Supiano)
- Book: An Introduction to Teaching with Text Generation Technologies
- Intersecting Bloom’s Taxonomy with AI (Oregon State University)
- Recommendations to Guide the University of California’s Artificial Intelligence Strategy (University of California Office of the President)
- Students outrunning faculty in AI use (Inside Higher Ed, Lauren Coffey)
- Chatbots may “hallucinate” more often than many realize (New York Times, Cade Metz)
- How to help students avoid getting duped online — and by AI chatbots (EdSurge, Jeffrey R. Young)
Campus communications about generative AI
Event: Let’s Talk About ChatGPT (March 21, 2023)
Faculty presentations from Leilani H. Gilpin, Assistant Professor of Computer Science and Engineering; Amy Vidali, Chair & Associate Teaching Professor, Writing Program; Zac Zimmer, Associate Professor of Literature; and Jennifer Parker, Professor of Art.