1
Michigan Tech’s Research Policy on AI Tools can be found on the Research website.
Faculty should also be aware of any guidance from their funding agencies and publishers on the use of such tools in the proposal development process and on projects.
When using GAI tools, faculty should:
- Document where and how GAI tools are used. This documentation may be needed to show how their use aligns with policies from funding agencies and publishers.
- Understand the limitations of such systems. Researchers need to ensure that these systems report accurate and repeatable results, and they must understand the biases such tools may contain or produce, particularly when working with human subjects data.
- Protect their data under all applicable data privacy laws and regulations (e.g., FERPA, HIPAA, export controls).
Adapted from the University of Michigan Generative AI Resource website.
2
Michigan Tech currently allows instructors and programs to decide whether and how GAI may be used in classroom settings.
See also Question 10 on the MTU AI Student FAQ page.
Given this flexibility, instructors should clarify to students what constitutes acceptable and unacceptable use of such tools and systems in their syllabi and assignments. Sample syllabus statements are available here.
3
The federal government has issued the Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence | The White House. Additional resources on federal agencies’ current stance on AI include the following:
- FACT SHEET: Vice President Harris Announces OMB Policy to Advance Governance, Innovation, and Risk Management in Federal Agencies’ Use of Artificial Intelligence | The White House
- White House releases AI guidance for agencies
- Artificial Intelligence: Agencies Have Begun Implementation but Need to Complete Key Requirements | U.S. GAO
- GAO-24-105980 Highlights, Artificial Intelligence: Agencies Have Begun Implementation but Need to Complete Key Requirements
- Please also view the MTU Working Group’s Agencies AI Guidance.
4
The AI WG has assembled resources in several academic areas; however, we encourage faculty to consult their professional disciplinary organizations to determine whether more specific guidance is offered for their discipline.
5
The AI WG has collected several sample syllabus statements, offering language that reflects a variety of approaches faculty might take. The key issue is being clear and transparent with students about your stance on AI use in the course.
6
The wide range of GAI/AI tools available (and being introduced at a rapid pace) means that instructors need to be prepared to:
- Understand the Capabilities of GAI Tools: Students are largely ahead of faculty in their use of AI. While it can be tempting to disregard or “opt out” of understanding AI, it is important to become familiar with the AI tools students might employ to complete work in your course. In addition, faculty can and should identify uses of AI that are compatible with their course goals and find ways to integrate those uses into their courses to prepare students for AI use in their future fields of work.
- Develop Critical Engagement with the Range of Tools Relevant to Your Discipline: For example, a first step might be to give a GAI tool your task/assignment prompt and see what kind of material it generates. But instructors must be prepared to do more than copy and paste their assignments. Prompt engineering is a skill that draws on critical and rhetorical orientations to language and technology, and, approached through repeated effort and revision, effective prompt engineering can elicit vastly different responses from AI.
- Understand Varied Approaches: Instructors must recognize that there is not a single right way to approach AI in their courses, programs, or scholarship. Approaches relevant and useful for writing studies will differ from those in data science, philosophy, chemical engineering, and the visual arts. This does not mean, however, that there are no general principles. Learn more about what principles of transparency, accountability, and ethical use look like in your field of study.
- Encourage Discussion within Your Academic Unit: Build in opportunities for faculty in your academic unit to share their approaches to classroom use of AI (policy statements, pedagogical uses, guidance to students, etc.), particularly for courses that are sequential or prerequisites. It is important to understand the student experience of navigating the curriculum and whether and how AI tool usage might be disruptive or confusing (or interfere with the development of key skills and knowledge necessary for success in the program).
- Browse Pedagogical Resources: Pedagogical guidance on AI is increasing, and campus-based opportunities such as ongoing professional development, certificates/courses, and digital repositories will only grow. Commit to exploring these resources and developing your own skills in these areas.
  - Auburn Course
  - MTU course in development
- Think through the Implications of Detection Programs: Many universities have opted not to employ commercial products that promise to detect the use of AI tools in student work. Review, for example, Vanderbilt University’s article about its decision to disable AI detection software in its course management platform to understand the issues these tools raise for academic integrity and classroom dynamics. Detection software is not yet accurate enough to justify the number of false positives it produces, and its use erodes the relationship of trust and the focus on good-faith learning between students and teachers.