MSU Denver Policy

Since Fall 2023, MSU Denver has expected that every course explicitly address generative AI use in the syllabus or other course information. Clear syllabus language reduces ambiguity, supports academic integrity, and helps students focus on learning rather than guessing what is allowed.

Syllabus language is necessary, but it is not sufficient. Regardless of which approach you choose, it is best practice to restate AI expectations in assignment instructions, especially for major assessments and any activity where permitted and prohibited uses differ.

Start with a Practical Choice

The Generative AI Taskforce recommends choosing from a simple spectrum of approaches to student AI use. Any option can be rigorous and fair when it is communicated clearly and reinforced at the assignment level.

For deeper insight and examples to support your decision, see the Generative AI Syllabus Language and Considerations document.

Allowed

Under this approach, the use of generative AI is generally allowed or encouraged across most coursework. For example, students may use AI tools to support their work, provided they clearly disclose and explain how those tools were used, and assignments continue to assess student reasoning, decision-making, and learning rather than surface-level polish or presentation.

Mixed

Under this approach, different parts of the course may forbid, allow, or encourage the use of generative AI. For example, students may be permitted to use AI for certain assignments but not others, depending on the task, with assignment instructions clearly stating what is allowed and requiring disclosure whenever AI use is permitted.

Restricted

Under this approach, the use of generative AI is generally forbidden unless an assignment explicitly requires it. For example, students may only use AI when directed to do so, with any unapproved use treated as an academic integrity concern, and assessments designed to emphasize visible thinking, process, and individual student work.

Course-Level AI Policy Options

Below are copy/paste options adapted from the Taskforce “syllabus spectrum.” Each option includes a short version and a more detailed version. Instructors should modify language to fit the course, discipline, and specific assignment design.

Option 1: Allowed

Short Version: Students may use generative AI in any assignment in this course. All use of generative AI must be cited or explained.

Detailed Version: Students are free (and even encouraged) to use generative AI in all aspects of the course. Generative AI can be used when and wherever the student feels it is appropriate to enhance their learning. If generative AI is used as part of work that will be submitted, its use should be briefly but accurately explained in a submission statement.

Option 2: Mixed

Short Version: Using generative AI is encouraged in some aspects of this course, allowed in others, and prohibited in the rest. The instructions included with every assignment will address the use of generative AI. When generative AI is used, you will be asked to briefly but accurately explain how it was used in a submission statement.

Detailed Version: The use of generative AI is encouraged in some aspects of this course, allowed in others, and prohibited in the rest. Students are encouraged to explore any and all tools that help with their learning, including generative AI. If generative AI is used as part of work that will be submitted, its use should be briefly but accurately explained in a submission statement.

Generative AI may be used for specific parts of the coursework, for example:

  • Generating an outline for a piece of writing
  • Getting feedback and editing help on your original writing

The following assignments, for example, should be completed without the use of generative AI:

  • Reflection essays
  • Responding to classmates in a discussion forum
  • Answering questions on any quiz, test, or exam

The use of generative AI in these cases will be treated as a violation of academic integrity.

Option 3: Restricted

Short Version: Students may only use generative AI in this course when an assignment explicitly calls for it. Use in any other assignment will be treated as academic dishonesty.

Detailed Version: The use of generative AI is restricted to rare and specific circumstances in this course. Students should not use such tools unless they are specifically called for in an activity, assignment, or assessment. Any use of generative AI in the creation of submitted work for this course will be treated as a violation of academic integrity.

Assignment-Level AI Policy Options

Even when a syllabus policy is clear, students benefit from assignment-level guidance about generative AI use. Reiterating expectations within individual assignments is especially important in mixed approaches and helps reduce confusion and improve consistency.

For each assignment, instructors should clearly indicate whether generative AI use is allowed, allowed with disclosure, restricted, or not permitted. This information should appear directly in the assignment instructions.

When AI use is permitted, assignment instructions should clarify:

  • How AI tools may be used (e.g., brainstorming, outlining, or revision)
  • Which uses are not allowed (e.g., generating final analysis, answering quiz questions, or completing personal reflections)

In assignments where AI use is allowed, requiring a brief disclosure supports transparency and accountability. Students should be reminded that they remain responsible for:

  • Accuracy and reasoning
  • Verifying facts and sources
  • Making final decisions about their work

A disclosure typically includes:

  1. The AI tool used
  2. How it was used
  3. What the student changed or decided independently
  4. Any sources that were verified, when relevant

Providing assignment-level AI expectations supports fair evaluation and helps ensure assessments reflect student learning in an AI-present environment.

What Counts as Generative AI (and What Does Not)

From a practical standpoint, it is not always possible to draw a sharp line between generative AI and other digital tools, because these technologies exist along a continuum of automation, machine learning, and AI. Tools such as voice-to-text, spellcheck, and grammar assistance may incorporate AI features without functioning as full generative systems.

Because of this ambiguity, instructors are encouraged to clearly define what counts as generative AI for their course and to provide students with practical guidance on what is permitted. This is especially important when accessibility or assistive technologies may be in use.

A useful rule of thumb is to focus on impact rather than technology: if a tool meaningfully generates, rewrites, or substitutes for student thinking or content creation, it should be treated as generative AI for the purposes of the course.

Key Considerations for Any Course Policy

When defining AI expectations, instructors should consider the following:

Accessibility and Accommodation

Some assistive technologies used by students may include AI features. Course language should account for accessibility needs and provide a clear process for students to seek clarification or accommodations.

Amplification of Existing Bias

Generative AI systems can reflect and reinforce biases present in their training data. Instructors should consider how this may affect learning, assessment fairness, and student outcomes.

Data Privacy and Protection

Prompts and outputs may raise data privacy or legal concerns. Students should be instructed not to enter sensitive or protected information into unapproved AI tools.

Equity and Access

If AI tools are permitted or required, instructors should consider whether all students have equitable access and whether tool use could create unfair advantage.

Common Questions