Final Assignment – Grammarly and Writing Instruction

The last time I taught English 12, I gave my students an in-class writing assignment as a means of evaluating their ability to write an academic-style essay. The hope was that having students write during class would mitigate the potential for plagiarism, ghostwriting, or other forms of academic dishonesty. Marking through one batch of these in-class essays, I came across one that was virtually textbook perfect in terms of mechanics. There were some odd choices among the quotations, and there was little in the way of voice or tone, but the mechanics were flawless. Something seemed odd about an essay written in only an hour being this polished. Fortunately, the student came to see me for feedback. I pointed out the few areas where tone could be improved but otherwise complimented them on their near-flawless grammar. They thanked me and credited Grammarly for helping them edit. While I had not explicitly forbidden grammar-checking software (after all, every word processor now contains rudimentary spelling and grammar checks), something felt dishonest about using artificial intelligence to check grammar in an in-class setting. I was trying to assess the student’s writing ability, not the computer’s.

However, upon further reflection, why shouldn’t students be allowed to use a program like Grammarly? Students in math class are often allowed, if not encouraged, to use calculators to help solve equations. English students should likewise be able to use the technological tools at their disposal to communicate more effectively. The nature of my writing instruction and assessment needed to change to account for technological advancement. The question became “how?” To figure out how writing instruction and assessment needed to change, I first needed to understand Grammarly better and how it could be embraced in the classroom.

Users can access Grammarly’s services directly through its website (www.grammarly.com) by uploading documents or pasting text, through a downloadable desktop version, or through installable plugins for a variety of platforms, such as Google Docs, Microsoft Office, and several web browsers. Grammarly has been providing its editing services since 2009 through free and paid premium options (Dong & Shi, 2021). Both the free and premium services provide feedback on a user’s writing by highlighting various grammatical, stylistic, and tonal issues. Grammarly also provides an overall score based on correctness, clarity, engagement, and delivery. Each issue it identifies is colour coded according to these categories. For example, a mistake that impacts the piece’s “correctness” is underlined in red and identified as a grammatical or spelling error. Alongside the colour-coded identification is a recommendation on how to address the issue, together with a grammatical explanation. Only the premium version, however, offers suggestions on how to improve engagement and delivery based on user-selected variables.

Although Grammarly does not disclose its actual software construction, it is designed as artificial-intelligence software that uses algorithms to process patterns of written language, identify grammatical issues, and make suggestions for correction (Fitria, 2021). The algorithm relies on an ever-growing body of writing and user input to continually adapt to stylistic choices in addition to the already established body of grammatical knowledge (Fitria, 2021). Whereas earlier studies from 2016 cited “250 error types,” a more recent 2021 study found that Grammarly has access to upwards of 400 grammatical structures (Dodigovic & Tovmasyan, 2021). Even if students do not have access to the premium services that address style and tone, the free version can still provide instant, potentially useful feedback on their writing.

Despite the program’s potential, the actual impact of Grammarly on student writing is mired in contradiction (O’Neill & Russell, 2019). Some studies suggest that lower-level writers benefit most because the feedback is instant, while others contend that the grammar suggestions are most impactful for upper-level writers, who can understand and discern between suggestions (O’Neill & Russell, 2019). Similar contradictions arise in the research on its use with English as a Foreign Language learners compared to native English speakers (O’Neill & Russell, 2019). Several studies suggested that while Grammarly generally improved students’ immediate writing scores, their actual learning of grammatical structures was largely unchanged (Dodigovic & Tovmasyan, 2021). Much depended on how students approached the program. Some blindly accepted everything it suggested, typically because they lacked confidence in their ability to understand the grammatical constructs themselves, while others were more selective and, distrusting the program, accepted only roughly half the suggestions (Koltovskaia, 2020). The students who gained the most from Grammarly’s immediate feedback were those who approached the program with a healthy level of scepticism, had existing grammatical knowledge, and were willing to research suggestions they were uncertain about (Koltovskaia, 2020). Student distrust of Grammarly is not entirely unfounded. In trying to determine the accuracy of Grammarly’s feedback, Dodigovic and Tovmasyan (2021) found that, across seven essays input into Grammarly, the accuracy of identified errors ranged from just over 58% to 84%. Not being able to entirely trust the program’s accuracy seemed to leave students uncertain about which suggestions to accept and which to reject.
Even with this wide range in error detection, students gain an instant means of correcting basic mechanical issues they might not have noticed on their own. The issue lies less with the program and more with how it can be incorporated effectively into writing instruction.

On a surface level, Grammarly has been suggested as a means of quickly fixing low-level mechanical issues, allowing teachers to focus on higher-level organizational, stylistic, and cognitive elements of writing (Koltovskaia, 2020). From this perspective, Grammarly truly is English’s version of the calculator. It does not matter whether the student knows why the preposition is redundant or why 3×5=15; what matters is that the student can combine complex ideas and provide analysis in the form of an essay, or reason through the order of operations in an equation. This perspective, however, would foster only behavioural engagement from the student, not holistically support student learning (Koltovskaia, 2020). It would not help students who blindly accept Grammarly’s suggestions or who reject half of them out of distrust. To make the most of tools like Grammarly, students need to develop effective cognitive engagement while not being discouraged by the potential volume of feedback (Koltovskaia, 2020). Since it cannot be assumed that all students are inherently motivated to seek out extra explanation alongside the program’s suggestions, the next best thing is to develop a mixture of behavioural, cognitive, and affective engagement patterns in the classroom (Koltovskaia, 2020).

One suggestion for addressing these three areas of engagement with automated feedback comes from Reva Potter and Dorothy Fuller (2008), who conducted an action research project combining Grammarly-type programs with student inquiry. They used students’ draft writing as the grammar-check input and had students compile a list of the common errors they encountered. From that list, students selected the errors they were most interested in addressing in their writing. Rather than simply having the program correct the work, students then explored the names of the issues flagged in their writing. The project encouraged students to actively explore the feedback the software provided rather than blindly accept its corrections. Students also explored the fallibility of the software, making a game out of trying to trick it into false positives and false negatives. In the process, they learned how to make effective use of their grammar checker while developing the habit of engaging critically with its suggestions rather than accepting them blindly.

Granted, Potter and Fuller’s (2008) study was conducted with grade seven students. When I discussed the role of Grammarly and assessment in senior-level English classes with my colleague and mentor, I learned that she, too, has altered her approach to writing instruction. Rather than trying to assess against grammar-editing software, she has begun including more stylistic and rhetorical modes of effective writing in her instruction. Using a food metaphor with her students, she compares the minimal ministry writing standards to a bowl of unappealing oatmeal goop: it has all the basic nutrients one needs for communication and survival, but it is not very appealing. Those standards can be relatively easy to attain through blind use of Grammarly – essentially my experience with the near textbook-perfect but soulless essay. By comparison, like a beautifully plated display of sushi, people often want their food – and their writing – to have a more complex flavour and pleasant presentation; hence the need for rhetorical and stylistic devices. Throughout the semester, her students have been finding and emulating famous examples of a variety of rhetorical devices. In doing so, they are engaging more critically with how they use language and have begun to include those structures in their writing and discussions. While she continues to include mechanical coherence in her assessment, she can move away from it as a core benchmark of student writing. Instead, she can focus on the human element inherent in writing that critics of Grammarly-type programs have emphasized – the nuances, subtleties, and design choices of communication (Roscoe et al., 2017). Her students can use Grammarly, in addition to her instruction, to improve their writing in a more multifaceted way, and she can then address their cognitive engagement with the program (Koltovskaia, 2020) through individualized feedback.
In this way, her instruction and guidance complement and enhance Grammarly’s use among her students rather than reducing it to a blindly followed calculator. Rather than a nuisance that threatens to make writing instruction obsolete, Grammarly becomes a tool that enhances students’ overall product through more meaningful writing.

Grammarly has immense potential to aid student writing. Its ability to provide instant feedback on core elements of writing means that discerning students have an additional mode of grammatical instruction with which to develop their skills. However, it is no magic bullet that will inherently improve student writing. Students still require guidance on how to use such programs effectively, as well as the desire to learn and improve. Simply relying on Grammarly to take care of mechanics is not an optimal use of the program. Writing instruction increasingly needs to take a multifaceted approach and harness the effectiveness of programs like Grammarly by incorporating them into instruction. To truly get the most out of these programs, they must be combined with instruction, not merely used as an English calculator that simply generates the right grammatical answer.

Dodigovic, M., & Tovmasyan, A. (2021). Automated writing evaluation: The accuracy of Grammarly’s feedback on form. International Journal of TESOL Studies, 3(2), 71. https://doi.org/10.46451/ijts.2021.06.06

Dong, Y., & Shi, L. (2021). Using Grammarly to support students’ source-based writing practices. Assessing Writing, 50, 100564. https://doi.org/10.1016/j.asw.2021.100564

Fitria, T. N. (2021). Grammarly as AI-powered English writing assistant: Students’ alternative for writing English. Metathesis: Journal of English Language, Literature, and Teaching, 5(1), 65. https://doi.org/10.31002/metathesis.v5i1.3519

Koltovskaia, S. (2020). Student engagement with automated written corrective feedback (AWCF) provided by Grammarly: A multiple case study. Assessing Writing, 44, 100450. https://doi.org/10.1016/j.asw.2020.100450

O’Neill, R., & Russell, A. (2019). Stop! Grammar time: University students’ perceptions of the automated feedback program Grammarly. Australasian Journal of Educational Technology, 35(1), 42-56. https://doi.org/10.14742/ajet.3795

Potter, R., & Fuller, D. (2008). My new teaching partner? Using the grammar checker in writing instruction. English Journal, 98(1), 36-41.

Roscoe, R. D., Wilson, J., Johnson, A. C., & Mayra, C. R. (2017). Presentation, expectations, and experience: Sources of student perceptions of automated writing evaluation. Computers in Human Behavior, 70, 207-221. https://doi.org/10.1016/j.chb.2016.12.076
