ITS is keeping pace with the evolution of Generative Artificial Intelligence (GenAI) by exploring the technology, considering potential applications, and developing guidelines to ensure its responsible use. In collaboration with University partners, ITS is striving to turn this advancement into an asset for our campus community while staying vigilant of the potential risks.
Generative AI Overview
What is Generative AI?
Generative AI (GenAI) is a type of artificial intelligence able to create novel content such as text, images, or sounds. GenAI technology can be used in a variety of ways, from aiding researchers and students in their academic pursuits to streamlining administrative tasks.
Ethical Considerations
As with any new technology, there are risks. AI-generated content can be misused, for example to produce misleading information or deepfakes. Ethical concerns also arise around AI authorship.
As with many digital tools, it's crucial to be aware of GenAI's limitations, such as potential biases in algorithms and its inability to replace human insight and connection. Faculty should provide guidelines about when the use of AI in coursework is appropriate and when it is prohibited, and students should follow the expectations outlined by each instructor, in accordance with the University Academic Integrity Policies.
Data Security and Privacy
Be aware that any information entered into an AI system (or Large Language Model) may not only be processed but also retained and used by the AI to generate answers for others. This means that if you enter personal information about yourself or any confidential Boston College information, that information may be stored and potentially shared with or sold to others.
Important:
- Do not use your BC credentials (BC username, password, or any BC email address) to sign up for publicly available Generative AI tools. When you use your BC email address to sign up for online services, even if they are free, you may be putting your personal information and Boston College data at risk. Not all companies meet BC's security standards when it comes to protecting user data.
- 'Confidential' and 'Strictly Confidential' data, as defined by the Boston College Data Security Policy, should not be used in any online AI tool.
Official Guidelines
Use of AI tools must comply with all existing University policies.
Faculty considering the use of Generative AI themselves, or possible use by students in their classes, should refer to the Center for Teaching Excellence's page.
Acquisition of new AI software or subscriptions, like any other software, is subject to the "GetTech" process.
Review the ITS Overview Statement on Generative AI Tool Usage.
Educational Resources @ BC
The Center for Digital Innovation and Learning offers a toolkit, Engaging with AI.
The Center for Teaching Excellence offers guidance on syllabus statements and considerations for course design at AI in teaching and learning.
University Libraries created a Research Guide to Generative AI for students.
Online self-paced learning is available through LinkedIn Learning for students, faculty, and staff.
Instructor-led training on generative AI is available to faculty and staff. See the ITS Training website for the latest offerings.
Microsoft Copilot
Microsoft Copilot with Data Protection is available for current students, faculty (full time and part time), and staff (computer users) to engage and experiment with AI in a protected environment. NOTE: Students under 18 years old will not have access to Copilot with Data Protection at this time due to restrictions by Microsoft. See the FAQ below for more details.
Copilot provides data protection within the BC community, when used with BC credentials as described below. Microsoft states: "User and organizational data are protected, chat prompts and responses in Copilot are not saved, Microsoft has no eyes-on access to them, and they aren't used to train the underlying large language models." Other tools, like ChatGPT, do not currently offer this kind of data protection for BC.
To get started with Copilot:
- Go to: copilot.microsoft.com
- Click Sign in.
- Enter your username@bc.edu
- When prompted, select Work or school account.
- On the BC Single Sign On (SSO) login page, enter your BC credentials (the username and password you use for Agora Portal) and complete BC 2-Step Verification.
- Once logged in, you will see a shield in the upper right (next to the New chat button), indicating that your information is protected.
Data Security Considerations: 'Confidential' and 'Strictly Confidential' data, as defined by the Boston College Data Security Policy, should not be used in the Microsoft Copilot platform, or any online AI tool, without University review. Additionally, this agreement with Microsoft MAY change in the future, and users should take notice of any changes in the Terms of Service published by Microsoft or notices from Boston College about the privacy of the Microsoft Copilot platform.
Additional Information:
- Learn more about Copilot from Microsoft.
- For assistance accessing Copilot, contact the Help Center at 617-552-HELP (4357) or help.center@bc.edu, or your local Technology Consultant.
Copilot FAQ
Microsoft Copilot is an AI-powered chat platform connected to the web. Microsoft refers to it as an AI companion. It is a Generative AI (GenAI) platform designed for individual use to generate text, images, and audio.
- ITS is providing access to the Microsoft Copilot chatbot available at copilot.microsoft.com.
- Copilot for Microsoft 365 is not included as it requires a separate per user/per month subscription fee.
- GitHub Copilot is not included as it requires a separate per user /per month subscription fee.
- Copilot in Windows is disabled by default at Ļć½¶Šć because currently it does not provide the same data protections as Microsoft Copilot on the web.
- ITS and the Provost's Office are providing Copilot as an opportunity to explore AI in a BC-specific environment that is safe and secure.
- As part of a license agreement between ITS and Microsoft, BC's version of Microsoft Copilot has data protection. Specifically, this means that 1) prompts and responses aren't saved, 2) Microsoft has no eyes-on access to organizational data, and 3) chat data isn't used to train the underlying large language models.
- While there are numerous AI platforms, by licensing Microsoft Copilot, ITS is able to provide eligible members of the BC community access to features that are normally only available via user subscription.
- Microsoft Copilot is available as part of a license agreement with BC, so your data is secure. You can upload content to it, and can be assured that your data is protected and will not be used by the AI to train its algorithm. However, confidential data should not be used in any AI tool (see data security question/answer below).
- Since Microsoft Copilot is built on the same OpenAI technology that powers ChatGPT, BC users get access to that technology without having to pay separately for it.
'Confidential' and 'Strictly Confidential' data, as defined by the Boston College Data Security Policy, should not be used in the Microsoft Copilot platform, or any online AI tool, without University review. Additionally, this agreement with Microsoft MAY change in the future, and users should take notice of any changes in the Terms of Service published by Microsoft or notices from Boston College about the privacy of the Microsoft Copilot platform.
ITS and the Provost's Office are actively exploring a wide range of options to suit the many needs of the BC community. Different groups, such as researchers and administrators, may require different AI tools, so other platforms will be made available as needs arise.
At this time, Microsoft does not allow Commercial Data Protection to be enabled for users under 18. Users under 18 will still be able to access Copilot, but without data protection. Therefore, these users will not see the green indicator that says "protected" in the top right-hand corner of the screen when logged into Copilot.
Google Gemini
Google Gemini with data protection is available for current students, faculty, and staff to engage and experiment with AI in a protected environment. Note: Students under 18 years old will not have access to Google Gemini at this time (due to restrictions by Google).
Gemini provides data protection within the BC community, when used with BC credentials as described below. Google states: "Data from Gemini activity will not be used nor reviewed by humans for model improvement." Other tools, like ChatGPT, do not currently offer this kind of data protection for BC.
To get started with Gemini:
- Sign in to your BC Google account using your BC email and Secondary Password.
- Go to gemini.google.com.
- Click Chat with Gemini.
- Read the Terms & Privacy policy, then click Use Gemini.
- Click Continue.
- Notice that a shield appears next to the prompt, which indicates your data is protected.
Data Security Considerations
'Confidential' and 'Strictly Confidential' data, as defined by the Boston College Data Security Policy, should not be used in the Google Gemini platform, or any online AI tool, without University review. Additionally, this agreement with Google MAY change in the future, and users should take notice of any changes in the Terms of Service published by Google or notices from Boston College about the privacy of the Google Gemini platform.