Numerous AI detection tools have emerged in response to technologies such as ChatGPT, but none of these tools has been reviewed for accessibility, privacy, and security. In Fall 2023, RTL led a pilot of the AI detection tool within Turnitin (Berkeley's Academic Integrity platform available in bCourses). Results from the pilot were inconsistent, so we've decided not to enable the tool campuswide. We will continue to review AI detection options as they develop, but will only adopt solutions that have been fully vetted and tested.
Note that using any third-party AI detection tool could lead to FERPA, privacy, and copyright violations, because these tools require faculty to input examples of student work into third-party software. We do not recommend that instructors rely on AI detection tools to identify AI usage in student writing; instead, we encourage faculty to talk with their students about appropriate (and inappropriate) uses of AI in their courses. Learn more about Understanding AI Writing Tools and Their Uses for Teaching and Learning on the Center for Teaching & Learning website.
Transparency about GenAI Tools
Before adopting any tool that might utilize GenAI, we'll work with vendors to ensure transparency. In addition to campus procurement procedures around security, privacy, and accessibility, we’ll also ask the following AI-specific questions, guided by 1EdTech’s AI Tools Framework:
- GenAI Utilization: Does any component of this tool use GenAI?
- User Awareness: Is the use of GenAI clear to users?
- Opt-Out Options: Can users opt out of the GenAI component of the tool?
- Data Handling:
  - What happens to the inputted data?
  - Will it be used to train future models?
  - Do any additional third parties have access to that data?
- Model Training and Improvement:
  - How was the AI model trained?
  - How will the model continue to be improved?
By addressing these questions, we aim to ensure transparency and hold vendors accountable for the proper handling of sensitive data. As third-party vendors release AI features, we’ll ask the questions above before enabling those features.
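As an illustration only, these AI-specific questions could be recorded in a structured checklist that a reviewer completes for each vendor. The sketch below is hypothetical: the class name, field names, and example values are not drawn from 1EdTech's AI Tools Framework or from any campus procurement system.

```python
# Hypothetical sketch of a per-vendor GenAI review checklist.
# Field names and example values are illustrative only; they do not
# represent an actual campus review record or the 1EdTech schema.
from dataclasses import dataclass, field


@dataclass
class GenAIVendorReview:
    tool_name: str
    uses_genai: bool                       # GenAI Utilization
    genai_use_clear_to_users: bool         # User Awareness
    users_can_opt_out: bool                # Opt-Out Options
    data_handling_notes: str               # What happens to the inputted data?
    data_used_for_training: bool           # Will it train future models?
    third_parties_with_access: list[str] = field(default_factory=list)
    model_training_notes: str = ""         # How was the AI model trained?
    model_improvement_notes: str = ""      # How will the model be improved?

    def open_questions(self) -> list[str]:
        """Return framework questions the vendor has not yet answered."""
        unanswered = []
        if not self.data_handling_notes:
            unanswered.append("Data Handling: What happens to the inputted data?")
        if not self.model_training_notes:
            unanswered.append("Model Training: How was the AI model trained?")
        if not self.model_improvement_notes:
            unanswered.append("Model Improvement: How will the model continue to be improved?")
        return unanswered


# Example usage with made-up values:
review = GenAIVendorReview(
    tool_name="Example EdTech Tool",
    uses_genai=True,
    genai_use_clear_to_users=True,
    users_can_opt_out=False,
    data_handling_notes="Prompts retained for 30 days, then deleted.",
    data_used_for_training=False,
)
print(review.open_questions())
```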
Partnerships with Faculty
For any centrally deployed AI tools, we will partner with the Center for Teaching & Learning (CTL), the Academic Senate, and faculty to first implement a pilot study that identifies the potential impact on instruction, including improvements to student learning as well as risks and vulnerabilities. The outcome of the pilot will be evaluated by campus leadership and the Academic Senate and shared publicly with the campus in advance of implementation.
Many educational technology companies are experimenting with AI applications for their tools. At Berkeley, we hold existing licenses for many of these tools and are actively assessing which new AI applications may benefit our campus community. If you have questions about these applications or their potential use cases, our team would be happy to discuss them.
Review a list of Generative AI tools available through UC systemwide and UC Berkeley licensing. To access tools with the protections of UC Berkeley licensing, users must authenticate through CalNet and, in some cases, request a Berkeley-provided license.