Beyond Grammarly: Preparing Students for an AI World

Thought Leadership

October 7, 2024


J.C. Polanco, Esq., MBA

President & CEO, Council on Legal Education Opportunity, Inc.

Even before the rise of ChatGPT, everybody in the academic world was aware of the power—and the peril—of artificial intelligence.

I’ve seen this in my own work in higher education. In addition to heading the Council on Legal Education Opportunity (CLEO), a pipeline program for diversifying law school enrollment, I teach business law at the University of Mount Saint Vincent in the Bronx and history and Latino studies at the Borough of Manhattan Community College in the CUNY system. For years, my colleagues and I have used a web tool called Turnitin.com to assess student writing for plagiarism.

If a student paper isn’t original work, the tool provides a remarkably detailed look at which uncredited sources it draws on and what percentages of the paper those sources make up. Students know the ground rules, and anybody who might be tempted to cut corners is aware that, along with being unethical, plagiarism is likely to be detected by this computer tool.

To be sure, AI can be used for constructive educational purposes, not just for playing defense. Students and faculty alike regularly use Grammarly, an AI-driven web tool, to improve their written work. It flags outright errors and also gives guidance on how to write more effectively by, for example, using the active voice. I’m a big fan of Grammarly and have used it on everything from my own essays to exam questions and course syllabi.

It’s been clear for a while, then, that although there’s a danger of students using AI as a crutch, it can also be a tool for good. Today, with ever-more-sophisticated AI tools available, it’s increasingly evident that legal educators need to be more aware of how to give students the best guidance on the appropriate and effective use of those tools.

Personal statements, which have been getting increasing attention since the Supreme Court banned using race as an admissions factor in higher education, are a particular minefield. A student eager to show in her law school application how she overcame obstacles might be tempted to ask ChatGPT to create an essay about growing up Dominican in New York City. But that approach is likely to yield a result that is not only dishonest, and likely detectable, but also generic and boring. At CLEO, we urge students to make their essays personalized and authentic. That might mean getting really specific about, say, growing up on Creston Avenue in the Bronx, across the street from Lowe’s, and struggling to get homework done in a noisy apartment building.

Legal educators can also offer would-be attorneys a crucial piece of guidance: the same push for personalization in law school applications will stand them in good stead when they practice law. The profession will surely change in some respects as sophisticated AI tools take on more routine tasks like drafting simple wills. But AI is no substitute for getting to know clients and their preferences. Building a successful legal practice will always require building relationships, which might involve something as fundamental as talking to estate planning clients about their funeral preferences and how family dynamics might affect their decisions on distributing property.

And if educators are looking for a real-world example of the dangers of the AI era in law, they won’t have to look far to get students’ attention: In June 2023, a judge fined two New York lawyers who used ChatGPT to create a legal brief they submitted to the court—and that brief turned out to include six fictitious case citations.

"AI's power in the classroom? Not just automating tasks, but sparking new ways for students to think critically and problem-solve."

The exciting news for educators is that, along with a certain amount of worry, the brave new world of AI may provide some truly great teaching tools. Those might include a series of carefully constructed questions designed to prompt students to think through their assumptions and craft compelling arguments: an AI version of the Socratic method. Building this kind of systematic, logical thinking is particularly useful preparation for the LSAT, which emphasizes logical reasoning rather than acquired knowledge.

Consider examples like these, which are either already in use or can easily be created:

  • AI is particularly good at creating personalized or scenario-based exercises. Professors can develop AI-powered chatbots or virtual assistants trained to simulate conversations or written feedback from specific Supreme Court justices or other legal experts. Students could interact with these AI agents to explore different perspectives and anticipate how judges might rule on cases. As the technology becomes increasingly sophisticated, neither time nor place need be a barrier to this kind of creative intellectual exercise: imagine AI walking students through how Justice Sotomayor might have ruled in Plessy v. Ferguson or how Justice Thomas would decide Loving v. Virginia.

  • Artificial intelligence can help students apply their emerging legal knowledge to a range of other AI-generated scenarios. For example, an AI system could generate a detailed fact pattern based on a real-world legal issue. Students would then be asked to analyze the case, identify relevant laws and precedents, and propose a particular line of argument or course of action. They could receive AI-generated feedback from a range of different perspectives.

  • Beyond the help with writing mechanics provided by tools like Grammarly, sophisticated AI instruction can provide personalized feedback and guidance to students on their legal writing, research, and analysis. An AI system could analyze a student’s work, identify areas for improvement, and offer tailored suggestions for strengthening their skills.

  • Last, AI promises to transform the search for “best fit” internships and jobs. I’ve already seen plenty of students leverage AI to find internships or job placements that align with their interests and career goals. An AI-powered system—including tools now embedded in platforms like LinkedIn—can very quickly analyze a student’s background, preferences, and qualifications to suggest relevant opportunities. It can also provide specific insights on whether they might be a good fit for a particular opening at a particular organization.

As educators look to the future, the key to keeping AI in perspective is to weigh both its downsides and its potential, so that we can give students guidance that is realistic and optimistic. There’s little doubt that AI will continue to automate certain tasks and replace particular kinds of human interaction. But we’re also just beginning to see how much it can enhance the learning experience, foster critical thinking, and prepare students for the realities of their careers, whether in my own world of law or in many other fields.