Things to Consider When Using AI at Work
Overview
Artificial Intelligence (AI) tools such as Microsoft CoPilot can help employees save time, improve writing, analyze data, and generate creative ideas. However, these tools also introduce important considerations related to data privacy, ownership, and compliance. This article outlines best practices and institutional expectations for using AI responsibly within SIUE’s technology environment.
This guidance applies to both chat-based AI (e.g., prompts in CoPilot) and built-in AI features such as meeting recap, transcription/captions, suggested replies, “analyze data” in Excel, generative fill in creative apps, translation, and code assistants.
Access and Availability
Microsoft CoPilot is available within Microsoft 365 applications, including Word, Excel, Outlook, Teams, and OneDrive.
If you’ve recently been assigned a CoPilot license, you may need to log out and back in to see CoPilot appear in the web versions of these apps.
What Counts as AI in Our Tools?
When we say “AI,” we mean any feature that interprets your content and produces or reshapes output—whether or not you typed a chatbot prompt. Examples include:
- Summaries and recaps of emails, chats, or meetings
 - Transcription, captions, translation, and auto-classification
 - Suggested replies, writing/rewrite help, and smart compose
 - Data analysis/explanations (e.g., Excel “analyze data”, chart/story generation)
 - Generative image/audio/video edits (e.g., background removal, generative fill)
 - Code suggestions and documentation generation
 
Data Access and Privacy
CoPilot operates within Microsoft 365’s existing security and permissions model. It can reference the same data sources you have access to (OneDrive, Outlook, Teams, SharePoint) when responding to a prompt, but it does not independently browse, store, or continuously monitor all of your data.
Data processed by CoPilot in Microsoft 365 remains within SIUE’s Microsoft tenant. However, standard data-handling policies still apply: never include confidential or otherwise restricted data in AI prompts unless you are authorized to do so.
The same privacy expectations apply to “one-click” or background AI features (e.g., auto-summaries or captions) even if you didn’t open a chat window.
You control what CoPilot can access by:
- Managing file and folder permissions in OneDrive, SharePoint, and Teams.
 - Being intentional with your prompts—avoid sharing or referencing information that is confidential or sensitive.
 - Reviewing collaboration permissions before using CoPilot with shared documents.
 
Example: If a spreadsheet in OneDrive is shared with your team, CoPilot may be able to draw from that data when generating summaries or analyses.
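The simplest way to confirm who can see a file is the Manage Access pane in OneDrive or SharePoint. For those who prefer to script the check, the minimal sketch below lists a file’s sharing permissions through the Microsoft Graph API; the file path is a hypothetical placeholder, and the access token is assumed to come from an approved SIUE sign-in flow with a delegated Files.Read permission.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<Microsoft Graph token from an approved sign-in flow>"  # placeholder
FILE_PATH = "Documents/team-budget.xlsx"  # hypothetical OneDrive path

# List the sharing permissions on a OneDrive file (driveItem) addressed by path.
resp = requests.get(
    f"{GRAPH}/me/drive/root:/{FILE_PATH}:/permissions",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Each permission entry shows who (or what kind of sharing link) has access and at what level.
for perm in resp.json().get("value", []):
    roles = ", ".join(perm.get("roles", []))
    who = (
        perm.get("grantedToV2", {}).get("user", {}).get("displayName")
        or perm.get("link", {}).get("scope", "sharing link")
    )
    print(f"{who}: {roles}")
```

The takeaway is simply that CoPilot honors the same permissions model: any person or sharing link with access to the file can have CoPilot draw on it as well.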
Responsible Use Guidelines
- Do not enter confidential, private, or restricted data (e.g., student records, health information, proprietary research) unless explicitly authorized; see the sketch after this list for a quick way to scrub obvious identifiers before prompting.
 - Avoid asking AI to generate or summarize content that you wouldn’t otherwise share.
 - Check for accuracy—AI-generated responses can be helpful but may not always be correct or complete.
 - Maintain authorship, attribution, and accountability—even when AI assists, you are responsible for the final content’s accuracy and compliance.
 - Use approved tools only—limit SIUE data to vetted solutions (e.g., Microsoft CoPilot within Microsoft 365) that have passed security review.
 - Treat non-chat outputs the same as chat responses—meeting recaps, transcriptions, captions, suggested replies, code suggestions, and generative edits are all AI-produced and subject to the same data-handling rules.
 
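As a practical illustration of the first guideline, the sketch below shows one way to scrub obvious identifiers (email addresses, phone numbers, nine-digit ID numbers) from text before pasting it into a prompt. The patterns are illustrative assumptions, not an SIUE tool, and no automated filter replaces your own review or the data-handling policies above.

```python
import re

# Illustrative patterns only; this will not catch every kind of sensitive data
# and is not a substitute for SIUE data-handling policies or your own review.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "id_number": re.compile(r"\b\d{9}\b"),  # e.g., a nine-digit university ID
}

def redact(text: str) -> str:
    """Replace likely identifiers with placeholders before sharing text with an AI tool."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REMOVED]", text)
    return text

draft = "Contact Jane Doe at jdoe@siue.edu or 618-555-0101; student ID 800123456."
print(redact(draft))
# Contact Jane Doe at [EMAIL REMOVED] or [PHONE REMOVED]; student ID [ID_NUMBER REMOVED].
```
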
Data Ownership and Compliance
SIUE’s software agreements require that data ownership remains with the University. When using AI tools:
- The University retains the right to extract all institutional data and revoke vendor access at any time.
 - Vendors must meet security and compliance standards for offsite data storage and processing; users may also need to follow vendor-specific configuration and usage practices to keep those protections effective.
 - Distinguish approved Microsoft services from other AI vendors: CoPilot within our Microsoft 365 tenant includes contractual and regulatory safeguards under our enterprise agreement. Other AI tools may lack these assurances—use them only if they have passed SIUE’s security and compliance review.
 - Add-ins, plugins, extensions, and connectors (including CoPilot extensions or external model integrations) are treated as third-party vendors and must pass SIUE security/compliance review before use with University data.
 - Even if a vendor claims compliance, responsibility ultimately rests with the University and individual users to handle data properly.
 
Privacy, Contracts, and High-Risk Data
Many AI tools—and existing apps that add AI features—use customer inputs and outputs to train and improve their models. Because of this, users should avoid sharing personal or sensitive information with AI tools and should not input moderate or high-risk SIUE data (as defined by the SIUE Data Classification Policy), or SIUE intellectual property, unless all of the following are in place:
- Clear understanding of data use and model training
  - Verify how the service uses SIUE data, including whether the provider trains models on customer prompts/outputs and whether an opt-out is available and enabled for SIUE.
  - Confirm data retention/deletion practices, data residency/location, subcontractor (subprocessor) access, and logging/audit options.
- A signed contract that protects SIUE data
  - SIUE-executed agreement with appropriate protections (e.g., no training on SIUE data by default; confidentiality; breach notification; timely deletion/return of data; liability/indemnification; and compliance with applicable regulations such as FERPA/GLBA/HIPAA, where relevant).
- Security and privacy review
  - Completion of SIUE’s security compliance review by ITS, coordinated through Purchasing.
  - Where personal or regulated data is involved, obtain the necessary privacy review/approval through ITS, in consultation with the Office of General Counsel as appropriate.
- Intellectual property consultation (if applicable)
  - When research IP, inventions, or commercialization are involved, consult the Office of Research & Projects in the Graduate School before using AI tools.
 
Important Account Guidance
- Do not process University data with personal or non-SIUE accounts (e.g., consumer Copilot, ChatGPT, or personal Google, OpenAI, or Anthropic accounts).
 - Use only SIUE-provisioned accounts within approved services that have completed SIUE’s reviews and have an SIUE contract in place.
 
If a vendor or feature has not completed these steps, do not use it with moderate/high-risk data or SIUE intellectual property. When in doubt, contact ITS for guidance.
Key Risks to Keep in Mind
AI systems are powerful, but their casual, conversational feel can make it easy to overlook risks. Be mindful of the following:
| Risk Area | Description | 
|---|---|
| Data Ownership | Clarify ownership of data and outputs. SIUE data and derivatives must remain under institutional control. | 
| Unintended Disclosure | Generated output could inadvertently expose IP, personal information, or research data. Avoid including restricted data in prompts. | 
| Compliance | Users can be held responsible for breaches of protected data, even when a breach is unintentional or results from misuse of an AI tool. | 
| Vendor Transparency | Not all vendors clearly disclose how their AI collects, stores, or references data. Use only SIUE-approved tools (e.g., Microsoft CoPilot) for University data. | 
Note on meetings: AI features that generate recaps, action items, or transcripts process the content of the meeting. Avoid discussing or recording regulated or confidential topics unless authorized and necessary, and confirm attendees understand the use of AI features.
Balancing Innovation and Risk
AI has enormous potential to enhance productivity and innovation. The technology is still evolving, and much about its inner workings remains proprietary. Use AI thoughtfully—augmenting, not replacing, your expertise—and exercise extra caution with research, intellectual property, or regulated data. When in doubt, consult ITS before using AI in high-stakes contexts.
Summary
- Be mindful of what data you include in prompts; follow SIUE data-handling policies.
 - CoPilot processes data within SIUE’s Microsoft tenant; still treat sensitive data with care.
 - Verify AI-generated information and keep final ownership and accountability.
 - Use vetted, approved tools; seek guidance from ITS when uncertain.
 
By balancing innovation with thoughtful data stewardship, we can take advantage of AI’s benefits while protecting SIUE’s people, research, and mission.
