
Responsible use of AI at Marshall University

Using non-approved AI tools with University Data
Employees, affiliates, and third-party agents of Marshall University must use only approved tools and services when storing, processing, or transmitting Institutional Data. Technology tools and services, even those provided at no cost to the University, must be reviewed under ITP-3: Technology Governance and Procurement Review. This includes personal productivity technologies, such as artificial intelligence (AI) tools that process and retain data (e.g., meeting recording and transcription services, large language models (LLMs), small language models (SLMs), image processors).
Microsoft 365 Copilot Data Security
Microsoft 365 Copilot operates within a secure environment that aligns with enterprise-grade compliance standards to protect Marshall University’s institutional data. It is built on Microsoft’s core principles of privacy, compliance, and security. Data processed by Copilot remains within the organization’s Microsoft 365 environment and adheres to strict security protocols, including role-based access controls and encryption. Institutional data is not used to train Copilot’s underlying large language models, ensuring the confidentiality of sensitive information.
Additionally, Microsoft 365’s existing security features, such as Multi-Factor Authentication (MFA) and Conditional Access policies, extend to Copilot Chat. These safeguards help prevent unauthorized access and reduce the risk of data breaches. Within Microsoft 365, MUIT can implement governance policies that restrict access to AI features for sensitive or private workloads. By combining Copilot Chat’s capabilities with Microsoft’s robust security framework, Marshall University can harness the power of AI while protecting sensitive data.

Avoiding Plagiarism
Plagiarism has long been a concern in higher education, but the advent of AI tools has added new dimensions to the issue. AI-powered writing assistants, such as ChatGPT, can generate text that students might use to complete assignments without proper attribution. Marshall University Libraries offers guidance on using AI in research and avoiding plagiarism.

Smart Use of AI Starts with Ethics
Marshall University supports the responsible use of AI to enhance learning and research while upholding academic integrity. As a student, here is how to use AI tools ethically:
- Academic Integrity: Use AI to support—not complete—your work. Submissions must reflect your own understanding and effort.
- Privacy & Security: Log in with your university credentials and handle data responsibly.
- Fairness & Accuracy: AI may contain biases or errors. Always verify and fact-check AI-generated content.
- Transparency: Acknowledge when AI tools assist with your work (e.g., research, writing, summarizing).
- Human Oversight: Apply critical thinking. AI is a support tool—not a substitute for your judgment.
- Continuous Learning: Stay informed about AI ethics and explore resources to deepen your understanding.