AI Best Practices for Faculty

Examine Privacy and Data Collection Practices

Before encouraging or asking students to use AI tools in their work, review each tool's privacy policy to understand how it collects personal information and data. Some tools collect login data, track usage, and run other analytics. This advice also applies to you as a faculty member: if you plan to input your existing prompts or materials into an AI tool, consider your own data privacy. 

If you are considering requiring students to use an AI platform to complete coursework, ensure that the tool complies with relevant privacy laws and university policies. Under regulations such as FERPA, students must not be required to identify themselves to third parties through any online service that is not under contract with the university unless proper safeguards are in place. 

Clarify Expectations for AI Use

In the absence of guidance from the instructor, the use of AI text generators may be considered a violation of academic honesty policies. Instructors have the discretion to allow the use of such tools but must do so explicitly: stating what you expect students to use (or not use) in their work helps resolve questions about permissible support. Keep in mind that AI detection tools, including Turnitin, remain unreliable at differentiating between human- and AI-generated text, so they are not useful for determining academic honesty. As you frame your expectations, review resources on initiating academic honesty conversations and guidelines on the academic honesty process. 

In addition to setting your expectations, consider discussing AI tools with your students. 

  • Because AI is an emerging technology, students will have varying levels of familiarity with these tools. While AI tools like ChatGPT and Gemini can produce grammatically correct text or attempt to solve problems, they may be unreliable and limited in scope. Discuss with students the uses and limitations of AI tools more broadly, along with your perspective on their use in your class. 

  • AI tools have implications beyond the classroom. Consider talking with students about how to be engaged consumers of AI content—for example, how to identify trusted sources, read critically, and understand privacy concerns. 

Adapt Assessments

Given the rapid development of AI tools, making any assessment entirely free from AI influence can be challenging. Beyond setting expectations, you might consider adapting your assessments to reduce the effectiveness of AI-generated responses. Reflect on what you want students to gain from the assignment and share your expectations with them. Is the focus solely on the end product, or does the creation process hold significant value? 

  • Consider scaffolding assessments into smaller components (e.g., proposal, annotated bibliography, outline, first draft, revised drafts), allowing students to develop their ideas progressively. 

  • Ask students to reference the course textbook, supplementary readings on the Learning Management System (LMS), class discussions, or discussion boards in their work. 

  • Provide opportunities for students to relate what they're learning to their own lives and experiences, incorporating unique perspectives. 

  • Consider adapting assessments to include multimedia submissions, such as audio or video components. Tools like VoiceThread allow students to provide audio, visual, and video content. Additionally, social annotation tools like Perusall or collaborative platforms like Google Docs can be used for responding to readings or materials. 

  • Assign writing tasks during class (e.g., reading reflections at the beginning of class or exit tickets). Encouraging students to organize their ideas in writing during class can also enhance engagement in discussions and group activities.