ChatGPT & Generative AI @ USI
Following up on the communication about the use of ChatGPT and other generative Artificial Intelligence (AI) tools, which was sent out in January 2023 by the Deputy Rector, this text provides an update on USI's position on the use of AI tools.
USI's overall strategy is to encourage a creative, critical and responsible use (or creation) of generative AI tools within its community and to foster experimentation, critical thinking and open discussion with and about these tools in teaching, research, transfer and other fields of work at USI.
General rule
As a general rule, the use of generative AI tools is thus permitted, unless explicitly prohibited for a specific activity (e.g. an exam or assignment). However, their use must always be correctly acknowledged (“I used it” / “how I used it”).
USI thus invites instructors and thesis advisors to proactively discuss the use of generative AI tools with their students and to state explicitly what is and is not permitted for the various activities (e.g. “for my exam the use of generative AI is not allowed / is allowed”).
Working group
To address issues related to the use of generative AI tools and to support decision-making, USI's Rectorate has established a trans-faculty working group. The goals of the group are to:
- Propose guidelines on how generative AI can be used creatively
- Provide teaching activities on the use and limits of generative AI tools
- Collect case studies and best practices in various fields and raise awareness about AI-related misconduct (e.g. regarding originality)
- Act as a point of contact for any questions or requests regarding the use of generative AI tools
- Provide advice and policy inputs to USI management
Currently, the group is composed of members of the Faculties and the administration and is coordinated by the eLearning Lab. To cover the whole USI community, the participation of students is also envisaged.
For any questions regarding the working group, please contact
[email protected]