Things to Consider When Using AI at Work

Artificial Intelligence (AI) tools such as Microsoft Copilot can help employees save time, improve writing, analyze data, and generate creative ideas. However, these tools also introduce important considerations related to data privacy, ownership, and compliance. This guidance applies to all faculty and staff, including those who use AI in teaching, course design, grading, and research. This article outlines best practices and institutional expectations for using AI responsibly within SIUE’s technology environment.

This guidance applies to both chat-based AI (e.g., prompts in Copilot) and built-in AI features such as meeting recap, transcription/captions, suggested replies, “analyze data” in Excel, generative fill in creative apps, translation, and code assistants.

Access and Availability

Some Microsoft Copilot features, including a secure chat bot, are available to everyone, along with limited Copilot features in Microsoft 365 applications such as Word, Excel, Outlook, Teams, and OneDrive.

The Microsoft 365 Copilot add-on license can be purchased by your department. If you’ve recently been assigned a Copilot license, you may need to sign out and back in before Copilot appears in the web versions of these apps.

License options for Microsoft 365 Copilot

If you or your department are interested in exploring additional AI tools, please contact ITS before starting a trial or demo.

What Counts as AI in Our Tools?

When we say “AI,” we mean any feature that interprets your content and produces or reshapes output—whether or not you typed a chatbot prompt. Examples include:

  • Summaries and recaps of emails, chats, or meetings
  • Transcription, captions, translation, and auto-classification
  • Suggested replies, writing/rewrite help, and smart compose
  • Data analysis/explanations (e.g., Excel “analyze data”, chart/story generation)
  • Generative image/audio/video edits (e.g., background removal, generative fill)
  • Code suggestions and documentation generation

In instructional settings, AI features can assist faculty in developing assignments, generating test questions, drafting feedback, summarizing readings, or translating and formatting course materials. The same principles of data privacy, authorship, and accuracy apply to instructional uses of AI. Faculty should review all AI-generated materials before sharing them with students and ensure they comply with accessibility, copyright, and academic integrity policies.

Data Access and Privacy

When logged in with your SIUE E-ID/password, Copilot operates within Microsoft 365’s existing security and permissions model. It can reference the same data sources you have access to (OneDrive, Outlook, Teams, SharePoint) when responding to a prompt, but it does not independently browse, store, or continuously monitor all of your data.

Data processed by Copilot in Microsoft 365 remains within SIUE’s Microsoft tenant. However, standard data-handling policies still apply—never include confidential or encumbered data in AI prompts unless authorized.

The same privacy expectations apply to “one-click” or background AI features (e.g., auto-summaries or captions) even if you didn’t open a chat window.

You control what Copilot can access by:

  • Managing file and folder permissions in OneDrive, SharePoint, and Teams.
  • Being intentional with your prompts—avoid sharing or referencing information that is confidential or sensitive.
  • Reviewing collaboration permissions before using Copilot with shared documents.

Example: If a spreadsheet in OneDrive is shared with your team, Copilot may be able to draw from that data when generating summaries or analyses.

Responsible Use Guidelines

  • Do not enter confidential, private, or restricted data (e.g., student records, health information, proprietary research) unless explicitly authorized.
  • Avoid asking AI to generate or summarize content that you wouldn’t otherwise share.
  • Check for accuracy—AI-generated responses can be helpful but may not always be correct or complete.
  • Maintain authorship, attribution, and accountability—even when AI assists, you are responsible for the final content’s accuracy and compliance.
  • Use approved tools only—limit SIUE data to vetted solutions (e.g., Microsoft Copilot within Microsoft 365) that have passed security review. The SIUE responsible use policies (2D4 and 3C10) prohibit unauthorized installation or use of software, particularly software that may create a security risk on University computer facilities.
  • Treat non-chat outputs the same as chat responses—meeting recaps, transcriptions, captions, suggested replies, code suggestions, and generative edits are all AI-produced and subject to the same data-handling rules.

Instructional Use and Grading 

  • Faculty may use approved AI tools to assist in creating instructional materials or drafting student feedback, but all content and grades must be reviewed and finalized by the instructor. 

  • Do not upload identifiable student work, grades, or evaluations into AI tools unless they are within approved SIUE systems that meet FERPA and data security requirements. 

  • Be transparent with students when using AI tools to support course materials or feedback. 

  • Avoid using AI tools to make evaluative or qualitative judgments about students’ work or participation. 

Data Ownership and Compliance

SIUE’s software agreements require that data ownership remains with the University. When using AI tools:

  • The University retains the right to extract all institutional data and revoke vendor access at any time.
  • Vendors must meet security and compliance standards for offsite data storage and processing; users may need to follow specific configuration and data-handling practices to keep those safeguards effective.
  • Distinguish approved Microsoft services from other AI vendors: Copilot within our Microsoft 365 tenant includes contractual and regulatory safeguards under our enterprise agreement. Other AI tools may lack these assurances—use them only if they have passed SIUE’s security and compliance review.
  • Add-ins, plugins, extensions, and connectors (including Copilot extensions or external model integrations) are treated as third-party vendors and must pass SIUE security/compliance review before use with University data.
  • Even if a vendor claims compliance, responsibility ultimately rests with the University and individual users to handle data properly.

Privacy, Contracts, and High-Risk Data

Many AI tools—and existing apps that add AI features—use customer inputs and outputs to train and improve their models. Because of this, users should avoid sharing personal or sensitive information with AI tools and should not input moderate or high-risk SIUE data (as defined by the SIUE Data Classification Policy), or SIUE intellectual property, unless all of the following are in place:

  1. Clear understanding of data use and model training
    • Verify how the service uses SIUE data, including whether the provider trains models on customer prompts/outputs and whether an opt-out is available and enabled for SIUE.
    • Confirm data retention/deletion practices, data residency/location, subcontractor (subprocessor) access, and logging/audit options.
  2. A signed contract that protects SIUE data
    • SIUE-executed agreement with appropriate protections (e.g., no training on SIUE data by default; confidentiality; breach notification; timely deletion/return of data; liability/indemnification; and compliance with applicable regulations such as FERPA/GLBA/HIPAA, where relevant).
  3. Security and privacy review
    • Completion of SIUE’s security compliance review by ITS, coordinated through Purchasing.
    • Where personal or regulated data is involved, obtain the necessary privacy review/approval through ITS, in consultation with the Office of General Counsel as appropriate.
  4. Intellectual property consultation (if applicable)
    • When research IP, inventions, or commercialization are involved, consult the Office of Research & Projects in the Graduate School before using AI tools.

Important account guidance

  • Do not process University data with personal or non-SIUE accounts (e.g., consumer Copilot, ChatGPT, personal Google, OpenAI, Anthropic).
  • Use only SIUE-provisioned accounts within approved services that have completed SIUE’s reviews and have an SIUE contract in place.

If a vendor or feature has not completed these steps, do not use it with moderate/high-risk data or SIUE intellectual property. When in doubt, contact ITS for guidance.

Key Risks to Keep in Mind

AI systems are powerful but conversational and informal by design, which can make it easy to overlook risks. Be mindful of the following:

  • Data Ownership: Clarify ownership of data and outputs. SIUE data and derivatives must remain under institutional control.
  • Unintended Disclosure: Generated output could inadvertently expose IP, personal information, or research data. Avoid including restricted data in prompts.
  • Compliance: Users can be held responsible for breaches of protected data, even if unintentional or caused by misuse of AI tools.
  • Vendor Transparency: Not all vendors clearly disclose how their AI collects, stores, or references data. Use only SIUE-approved tools (e.g., Microsoft Copilot) for University data.

Note on meetings: AI features that generate recaps, action items, or transcripts process the content of the meeting. Avoid discussing or recording regulated or confidential topics unless authorized and necessary, and confirm attendees understand the use of AI features.

Balancing Innovation and Risk

AI has enormous potential to enhance productivity and innovation. The technology is still evolving, and much about its inner workings remains proprietary. Use AI thoughtfully—augmenting, not replacing, your expertise—and exercise extra caution with research, intellectual property, or regulated data. When in doubt, consult ITS before using AI in high-stakes contexts.

In the classroom, AI can enhance learning experiences by supporting creativity, accessibility, and feedback. Faculty are encouraged to use AI to augment—not replace—their expertise and engagement with students. As with any instructional tool, maintain transparency with students about when and how AI is used, and ensure all materials align with SIUE’s academic integrity standards.

Summary

  • Be mindful of what data you include in prompts; follow SIUE data-handling policies.
  • Copilot processes data within SIUE’s Microsoft tenant; still treat sensitive data with care.
  • Verify AI-generated information and keep final ownership and accountability.
  • Use vetted, approved tools; seek guidance from ITS when uncertain.

By balancing innovation with thoughtful data stewardship, we can take advantage of AI’s benefits while protecting SIUE’s people, research, and mission.

Frequently Asked Questions

Can I put university data into AI tools like ChatGPT or Gemini?

Do not enter SIUE confidential, private, or restricted data into public AI tools, especially when using personal accounts. This includes student records, HR information, health information, research data, and internal financial data. Only use tools that have been reviewed and approved by SIUE, following the Acceptable Use Policy (2D4) and the Information Security Policy (3C10).

Is it safe to use Microsoft Copilot with my SIUE account?

When you are signed in with your SIUE E-ID, Copilot in Microsoft 365 works within SIUE’s existing security and permissions model. It can reference the same content you have access to in OneDrive, Teams, Outlook, and SharePoint, and data stays in our Microsoft tenant. However, you should still avoid including moderate or high-risk data unless the tool and use case have gone through the appropriate contract and security review.

Can I use AI to draft emails, reports, or other documents?

Yes, for routine, low-risk work. AI can help with wording, structure, and summarizing non-sensitive information. You must:

  • Review and edit all AI-generated content before sending or publishing.
  • Verify facts, dates, and references.
  • Make sure the final document still reflects SIUE’s tone, policies, and expectations.

Remember that you, not the AI tool, are responsible for the final content.

How accurate is AI-generated content?

AI tools can sound confident but still be wrong, incomplete, or out of date. Always double-check:

  • Data, statistics, and technical details.
  • Citations and references.
  • Policy or legal statements.

Use AI as a starting point, not as an authority.

Can I use AI to summarize or analyze sensitive documents?

No. Do not upload or paste documents that contain moderate or high-risk data (such as student records, HR files, health information, or confidential research) into AI tools unless all contract, security, and privacy reviews are complete and the tool has been approved for that specific use.

Can I use AI to help with coding or scripts for SIUE systems?

AI can assist with generic code examples and troubleshooting. However, you should not paste internal configuration files, passwords, connection strings, or detailed network/server information into external tools. Be especially careful not to expose anything that could reveal system vulnerabilities.

What about students using AI for coursework?

AI use in courses is determined by academic programs and individual instructors. Faculty should clearly state expectations in the syllabus and assignment instructions (for example, whether AI tools are allowed, restricted, or prohibited, and how use should be documented). Students must follow course policies and any academic integrity guidelines established by their school or department.

Who owns the data and content when AI is used?

SIUE retains ownership of institutional data and work products created in the course of university business, including content created with AI support. Approved vendors must meet contractual requirements that protect SIUE data and allow the University to retrieve and delete institutional data as needed.

Can I use my personal AI accounts for university work?

Using personal AI accounts is permitted for your own learning or truly generic drafting/upskilling. But if the work relates to SIUE business, involves SIUE data, or will be shared as part of your job, you should use SIUE-approved, reviewed tools.

Who can I contact if I’m unsure whether something is appropriate to put into an AI tool?

If you are unsure whether a tool or use case is appropriate, please contact ITS for guidance.

For questions involving research data, intellectual property, or compliance requirements, ITS may collaborate with the Office of Research and Projects or other campus units as needed.



Keywords:
CoPilot, ChatGPT, Claude, AI, Anthropic, Meta, xAI, Gemini, DeepSeek, guidelines 
Doc ID:
155403
Owned by:
System U. in Southern Illinois University Edwardsville
Created:
2025-10-09
Updated:
2025-12-22
Sites:
Southern Illinois University Edwardsville