The FSU Reading-Writing Center and Digital Studio recognize that generative AI tools, including large language models and related systems such as ChatGPT, Copilot, Google Gemini, and DALL-E, are being used in educational settings.
Generative AI tools are computer systems that can produce (or generate) forms of traditionally human expression as digital content, including language, images, video, and music (MLA-CCCC Task Force). We encourage faculty and students to read about the affordances and limitations of AI writing technologies as they relate to their subject areas, creative and research purposes, and teaching objectives. We list some additional resources at the bottom of this page.
In the RWC-DS, our priority is to work with students on their short- and long-term learning goals through individualized sessions. We recognize that students have questions about AI writing tools, and we are training our staff about responsible usage and critical awareness practices. We also acknowledge the academic integrity and copyright concerns that are commonly associated with the use of generative AI and promote critical, informed, and responsible practices regarding its use.
Consultants’ Responsibilities
Writing center consultants will honor the professor’s assignment sheet and/or syllabus guidelines when working with students. If a professor prohibits AI tools, consultants will not use those tools during a session. If a professor allows AI tools, consultants will ask to see the assignment to ensure the session follows any guidelines.
If an assignment allows for AI tools, consultants may use them in a session in ways they consider appropriate and only if the student requests it. We will also remind writers about the limitations of AI tools (see Limitations below). However, not all consultants are comfortable using AI tools; those who are not may conduct the session without them or refer the student to another staff member.
Because consultants see student writing before it is submitted, consultants are not required to report unpermitted AI use to the Office of Academic Integrity; they are, however, encouraged to report this to the writing center director.
Here is a brief overview of consultant responsibilities:
- Refer to faculty and course guidelines about the use of generative AI.
- Guide students in assessing the credibility and accuracy of information that generative AI produces.
- Help students recognize potential bias and the lack of diverse perspectives and linguistic diversity in generative AI outputs.
- Discourage the use of genAI to create texts that students submit in their courses, and address concerns directly with students.
- Discuss the consequences of using genAI for full text submission without proper citation.
- Support students in appropriately citing genAI.
Can students bring AI-generated work to the Reading-Writing Center?
Yes, but it is the student’s responsibility to disclose any AI use to the consultant; otherwise, the consultant may not think to ask about the course’s AI policy. If a student notes that AI tools were used to generate work for an assignment that does not allow those tools, the writing consultant will let the student know that using AI tools could violate the course’s academic integrity policy and assignment guidelines. In that case, the student and consultant can brainstorm new ideas and create a plan to complete the assignment without AI tools.
Attribution
In accordance with the Office of Academic Integrity, the RWC-DS recommends that instructors clearly explain whether AI writing technologies can or cannot be used for assignments. If the use of AI writing technologies is permitted, the RWC-DS upholds attribution and/or citation that aligns with the course guidelines or assignment instructions. This policy also extends to multimodal forms of communication including, but not limited to, the use of AI-generated images. We ask that students bring in their assignment guidelines if ChatGPT or another AI tool is used for a project.
MLA Style’s guidance for citing generative AI states that you should:
- cite the tool whenever you paraphrase, quote, or incorporate into your own work any content (whether text, image, data, or other) that it created.
- acknowledge all functional uses of the tool (like editing your prose or translating words) in a note, your text, or another suitable location.
- take care to vet the secondary sources the GenAI tool produces.
MLA Style Example:
“What is an autoethnography?” prompt. ChatGPT, 13 Feb. version, OpenAI, 8 Mar. 2024, chat.openai.com/chat.
The APA Style blog released an entry that covers how to cite AI software within the text, in an appendix, and/or in online supplemental materials.
- quoting ChatGPT’s text from a chat session is more like sharing an algorithm’s output; thus, credit the author of the algorithm with a reference list entry and the corresponding in-text citation.
APA Style Example:
OpenAI. (2024). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat
By discussing the importance of attribution, we also hope to discourage unethical uses of AI tools including but not limited to:
- relying exclusively on a generative AI writing tool to generate the majority of ideas and content without engaging the ideas, analyzing and shaping the material, or conducting supplementary research.
- using a generative AI writing tool to produce an entire piece of writing that will be submitted as original work where the generative AI writing tool is not acknowledged or attributed.
- failing to disclose AI involvement in the creation of your work.
Limitations and Risks of AI Tools
As with any technology, there are opportunities and limitations. Generative AI writing tools are evolving, and it is important to recognize that each tool has limitations including potential biases and the ability to produce false information. Here are some of the concerns surrounding AI tools and academic writing:
- Hallucinations and a tendency to prioritize pleasing the user over accuracy
- Erasure of linguistic diversity through the promotion of Standard American English
- Missed learning opportunities if outputs are relied on too heavily
- Presentation of writing as a product rather than a process
- Increases in surveillance over student work
- Unequal access to these technologies (and their advancements) can further socioeconomic inequalities
- Environmental damage from non-renewable resources used for processing power
Ongoing Conversations about GenAI
The Reading-Writing Center and Digital Studio will revisit this statement as we learn more about new developments. The RWC-DS is not liable for irresponsible uses of generative AI tools.
Additional Resources
Academic Integrity and Artificial Intelligence: A Guide for Faculty
MLA-CCCC Joint Task Force on Writing and AI, Working Paper 1: https://aiandwriting.hcommons.org/working-paper-1/
OpenAI’s Usage Policies: https://openai.com/policies/usage-policies
UNC Chapel Hill's Generative AI in Academic Writing
Last Updated August 2024