AI & Academic Writing

Artificial intelligence systems that generate sustained conversations in response to user input ("generative AI") have captured headlines recently. Such composing tools offer challenges and opportunities for students and faculty who produce and assess writing projects. After all, writing is a creative process; it requires critical thinking, engaged reading, synthesis, and the iterative processes of seeking and responding to feedback from human readers. 

Generative AI promises “efficiencies,” but should these tools become (co)authors of our compositions? What do the texts on which the models are trained tell us about genres of human communication? Who “owns” the text that we feed into these tools, and how will our own questions and prompts be used by the creators of the next generation of AI tools? What is lost and what is gained when students turn to these tools to support their research, drafting, translating, and/or revising?

This page does not answer all of these questions (or the many others these tools raise, including questions about privacy and intellectual property, environmental impact, and labor), but it does offer a few starting points for thinking about AI & academic writing. Below, we provide answers to some frequently asked questions that faculty teaching writing may have. We invite you to contact us to ask additional questions and/or to share your pedagogical strategies for engaging with AI composing tools.

See also the university’s AI in Education site and [U]Tech’s AI page for additional events, tools, and resources.

If you don’t know much about what generative AI is or how it works, we encourage you to investigate it as a first step in deciding how you feel about it in the context of your teaching and learning. You might begin with [U]Tech's Learn More about Gen AI page.

In the context of teaching writing at the university, we recommend that you begin with the Modern Language Association (MLA) and Conference on College Composition and Communication (CCCC, part of the National Council of Teachers of English, NCTE) Joint Task Force on Writing & AI.

This group has authored several working papers, including an Overview of Issues & Principles, recommendations for AI Policy Development, and advice for Building a Culture of AI Literacy on campus.

The Office of the Provost maintains the university's AI in Education page, which provides information about CWRU’s position relative to generative AI, including links to the AI Task Force and other campus initiatives.

[U]Tech and its Teaching and Learning Technologies (TLT) team provide resources for using the generative AI tools that the university has investigated and purchased for our educational use. You can find a list of available tools, along with instructions for accessing and using them, on [U]Tech’s AI - Artificial Intelligence page. TLT also provides consultations, Canvas training, and other tutorials and resources to help you educate yourself on generative AI.

The University Center for Innovation in Teaching and Education (UCITE) offers seminars and resources to support you in teaching with generative AI on its webpage: Teaching and Learning with Generative Artificial Intelligence Systems.

Annette Vee, Associate Professor & Director of Composition at the University of Pittsburgh, writes a newsletter for Norton, "AI & How We Teach Writing," that offers practical, thoughtful, and readable advice about generative AI in writing contexts. This post addresses critical AI literacy, which she defines as "the ability to understand, apply, and assess AI operations, uses, and outputs." According to Vee:

Students should know how to use AI well—have some literacy in it, if you will. They should also be able to discern when to use it, when to avoid it, and how to evaluate it—that’s the critical part of “critical AI literacy.”

The Modern Language Association (MLA) and Conference on College Composition and Communication Joint Task Force on Writing & AI recommends that instructors support Student AI Literacy (2024) to help developing writers make thoughtful and critical use of generative AI tools, to understand their inner workings and outputs, and to be educated on the impacts of AI on society, labor, and the environment. In order for students to be prepared for the workforce and to play an active role in their communities, they need to be able to use and understand AI. 

Scholars in Writing Studies have also articulated reasons to remain skeptical of and resistant to generative AI tools, and have developed a set of resources for “Refusing Generative AI.”

If you are using a written assignment to test a student’s knowledge or application of concepts, then you might want to prohibit the use of generative AI, which can readily produce such information and thereby interfere with your student’s demonstration of learning.

But, if you are using writing as a tool for thinking, developing understanding, and/or sharing ideas and insights, generative AI might have a place in the writing process. Like other digital tools, generative AI can be helpful as an assistive device for students who struggle with processes such as reading/understanding complex texts, brainstorming, organizing, or editing.

Clearly, generative AI has both interesting possibilities and real disadvantages. While you probably do not want a student to put your assignment into ChatGPT and then simply submit the output to you for a grade, you might want to support the thoughtful exploration and use of generative AI tools in the classroom.

In the context of your course, you might:

  • discuss the limitations and opportunities that AI provides us as students, writers, and human beings
  • provide students with opportunities to experiment with generative AI tools, asking them to reflect on how they are helpful and not helpful to their learning and writing
  • encourage students to develop skills that will allow them to use generative AI more effectively (e.g.,"prompt engineering")
  • engage students in critical and ethical deliberation around the impacts of generative AI on your field and/or their future professions 

Good writing assignments engage students intellectually and personally, giving them incentive to develop their own ideas and insights, and to share them with others. 

Meaningful writing assignments 

  • require students to engage with critical and integrative thinking
  • relate clearly to students’ interests and experiences
  • engage students in conversation about their writing processes and topics
  • involve processes and scaffolding that allow students to complete a project over time, with feedback from you
  • have a clear audience, purpose, and expectations

The Writing Program is happy to consult with you to develop writing assignments and assessments that emphasize the value for students of producing their own writing as a process of critical thinking, self-expression, and participation in important disciplinary conversations.

These efforts can promote our values as a community, decrease the temptation to misuse generative AI, and support students with AI literacy so they can make informed choices about how and when to use AI. For more information, please email writing@case.edu.

Each faculty member must decide what policy is appropriate for their courses, a decision that ultimately depends on what you want your students to learn, what skills and knowledge you want them to gain, and what you believe about the impacts of generative AI. The CWRU AI Taskforce has developed example AI Policies Suitable for Course Syllabi.

However you decide to engage (or not) with generative AI in your courses, CWRU’s Academic Integrity Policy prohibits submitting AI-generated text as one’s own, just as it prohibits claiming authorship for text that was originally produced by another person.

We recommend that faculty make clear to students which (if any) uses of AI are acceptable in their courses. Because this may vary from one course to the next, we also recommend that students discuss any intended uses of AI tools with their instructors before they use them.

We also suggest that instructors use this new circumstance to explicitly discuss the value, purpose, and practice of academic integrity, and to examine the ways in which generative AI use can be academically dishonest. Through this approach, instructors can critically examine generative AI and whether or not it aligns with our values as a community, rather than focusing on “catching” students who may be using the tools in ways that do not uphold community values. For a rich consideration of this approach to academic integrity in the age of generative AI, watch this video featuring Antonio Byrd, Assistant Professor at the University of Missouri-Kansas City and a member of the MLA-CCCC Joint Task Force on Writing and AI.

The Writing Program is happy to consult with you to develop writing assignments and assessments that emphasize the value for students of producing their own writing as a process of critical thinking, self-expression, and participation in important disciplinary conversations. These efforts promote our values as a community, decrease the temptation for writers to misuse AI tools, and support AI literacy so that all writers can make informed choices about if, how, and when to use generative AI. 

If you are concerned that a student has misused generative AI on a writing assignment, we recommend that you talk with the student about how they produced the work that they submitted. You might ask about their writing process, the content of the piece, and/or the specific tools they used to write.

Don’t assume that you “can just tell” that a text was written by generative AI; the technology’s very purpose is to learn from and emulate human-generated writing, so both human judgment and detection services will always lag behind the generative AI tools themselves. Moreover, AI detectors are ineffective and problematic for the reasons described in the AI Detection Services FAQ below.

If, after discussing your concerns, you and the student cannot come to a resolution together, you should follow the protocol set forth in the Academic Integrity Policy under “Discussing, Reporting, and Adjudicating Violations.”

Writing scholars have long debated the merits and limitations of plagiarism detection software (e.g., Turnitin), and these conversations continue with the advent of new AI-detection tools, which have not been proven to be accurate or reliable. In a 2023 study of fourteen different AI-detection tools, published in the International Journal for Educational Integrity, the authors conclude that such tools are "neither accurate nor reliable."

"Our findings strongly suggest that the 'easy solution' for detection of AI-generated text does not (and maybe even could not) exist. Therefore, rather than focusing on detection strategies, educators continue to need to focus on preventive measures and continue to rethink academic assessment strategies." (Weber-Wulff et al, 2023)

Furthermore, there is evidence that generative AI detectors perpetuate biases against already-minoritized students, resulting in disproportionate and false accusations of plagiarism against Black students, non-native speakers of English, and neurodiverse individuals. The Northern Illinois University Center for Innovative Teaching and Learning surveys the evidence and provides resources that promote the teaching of AI Literacy.

The Writing Program does NOT recommend using AI detection services. Our preferred approach focuses on setting classroom expectations, designing assignments and activities that promote academic integrity, and meeting with students to discuss their work and writing processes when concerns arise.

If, however, a faculty member does decide to use an AI detection service, we encourage them to consider how they will interpret the results, how they will make clear to students when such tools will be used, and what they will do with the reports that are generated. For example, consider including something on your syllabus similar to this recommended language about the use of Turnitin “originality reports” from Yale University’s Poorvu Center for Teaching & Learning.

The Writing Program offers faculty consultations focused on effective writing instruction and provides support to student, faculty, and staff writers through the Writing Resource Center. We host workshops, lectures, and other activities for CWRU community members interested in writing instruction.

Our faculty have developed a range of approaches to teaching in the context of generative AI. We would be happy to consult with you about your experiences, including on questions such as:

  • Are timed, in-class writing assignments (e.g., “blue book” exams) a good way to ensure that students don’t misuse generative AI?
  • How might we approach teaching critical reading skills when students are using generative AI to summarize course texts?
  • What is “prompt engineering” and should we be teaching it?
  • How might generative AI be used for brainstorming and prewriting? 
  • When should we allow/encourage students to use generative AI tools as part of their drafting/revising processes?
  • What are the costs and benefits of using generative AI to respond to students’ work?

Please contact us for more information or to schedule a consultation.

The following list is not exhaustive; please share additional resources with us!