Generative AI tools, like ChatGPT, are becoming common in education, and many students arrive at colleges and universities with some experience using them. Often, though, that experience is tinged with concern—worries about academic integrity or fear of doing something “wrong.” I wanted to teach students how to use AI responsibly and explore how it could help me as an instructor, so I integrated it into a first-year seminar this fall.

Overall, it was a positive experience for me and for the students. The seminar was part of a common, required experience for first-year students with learning goals related to understanding the concept of a global system, as well as major themes such as cultural competence, the pernicious effects of colonialism, and the relationship between humans and the natural world, among others. The course is intended to be writing-intensive and is taught by Elon faculty from every department across campus. Each faculty member can choose the specific course material and themes, provided they meet the learning objectives. These classes are 21-person seminars.  

How I Used AI in My Class 

I used AI in two ways: 1) for individual projects and 2) as part of a group activity. 

For the individual project, students wrote a major paper in which they combined their life experiences with academic research to produce a piece about the US class system in the style of a newspaper feature article. I started them out with a prompt they could use with an AI tool, like ChatGPT, and I showed them how to use it. In essence, the provided prompt directs the AI to act as a thought partner in crafting a new, task-specific prompt.

In an effort to teach AI prompting skills, I supplied them with a generation prompt as follows: 

Screenshot of ChatGPT 4o mini AI tool with prompt text: "Act as an expert prompt writer, with a goal to help me craft the best possible prompt for my needs. The prompt will be used by you, [Name of AI]. Do the following: 1. Your first response will be to ask me the topic and purpose of the prompt. I will provide my answer, and we will work together to clarify until you provide me with the best possible prompt for the task. 2. Based on my response, you will create: a) A tentative prompt that is clear to a human and likely to prompt [Name of AI] to do the needed task. b) You must ask relevant questions to solicit additional information to improve the prompt. After I answer the questions, provide an updated prompt and a new list of questions. At any time, I may tell you that I am done, and this will end the iterations. Begin."
Prompt text sent in ChatGPT 4o mini.

I then went through the exercise in front of the class, asking the AI (Microsoft Copilot in this case, since students can access it without an account) to create a prompt that would generate an outline for their upcoming paper.  

When prompted in this way, the AI asked the students questions, and then it was able to suggest an outline based on their answers. That outline became the starting point for their papers. I also permitted them to use Grammarly for grammar and mechanics, after pleas from the students. Interestingly, most of my students were accustomed to using Grammarly but had not previously considered it to be an AI tool. Previous in-class writing had shown me that many of my students struggle with college-level writing. They make frequent basic errors in spelling, punctuation, and sentence structure if they write without support. Because basic writing was not a learning objective for the class, I thought using the tool would help them focus on bigger-picture issues in the assignment. The AI helped them brainstorm and organize their ideas, but I still reviewed their outlines myself and gave feedback. The AI was a tool, not a replacement for my guidance. 

In the group activity, students worked together on a hypothetical case study I provided about immigration to the United States. The case studies represented different scenarios that might cause a person or group to immigrate to the US, including fleeing a war, escaping political persecution, looking for employment opportunities, following family, and seeking a better education for children. Students researched the process of immigration for their assigned case, and then I gave them a specific AI prompt that asked them questions they could answer based on their research. The prompt then instructed the AI tool to write a letter to the fictional family in their case explaining the necessary steps for immigrating to the US. 

The prompt was: 

Screenshot of Microsoft Copilot AI tool with prompt text: "Act as a counselor for a family hoping to immigrate to the United States from [country] because of [reason]. You are writing a letter to the family, offering advice on what it will take to get a visa, to make the move, and what kind of life they are likely to find when they arrive. You will advise them to move to a particular US city. To get the content for the letter, you should ask me up to 15 questions, one at a time. The letter should be no more than 1000 words and should be in plain English, understandable by a US 6th grader. Begin."
Prompt text sent in Microsoft Copilot.

The students then wrote a reflection on the AI’s performance. This activity helped them think critically about how well the AI communicated, and where it fell short. 

What Students Learned 

The reflection assignments for the AI activities were the most valuable part, and I learned that many of my students had never considered AI output critically before. In their reflection writing, they were able to describe its uses as a tool, and they also pointed out its flaws. Some of my more reflective students were even able to look at the AI's output and consider how further prompts might generate better results. For all the students, I saw evidence that the reflection helped them think about what makes communication clear and effective. 

One thing that surprised me was how easily students took to using AI as a thought partner, although they told me they had not used it in this way before. I found my students more willing to share early, unpolished ideas with the AI. Through the years, students have always had access to me and to resources like the campus writing center, but some students said they preferred the low-pressure nature of working with a machine. In future implementations, I might revise the essay-support prompts to encourage kinder language in the AI's feedback. Of course, I still gave feedback on students' outlines before they started drafting. The AI was there to support their work, not replace human interaction. 

AI also made my job as an instructor easier, saving me time on some class preparation. For example, over the semester, I asked an AI tool to turn my lecture notes into guided note sheets for students. My students lately seem to be leaving high school with less proficiency at taking their own notes, so I think these sheets help them attend to and retain what we discuss in class. I also used AI to create a draft of the handout for the immigration activity. While I edit and revise all AI output, having a starting point saved me time. 

This gave me more time to focus on making the class interactive. The group of students I had this semester was easily distracted, so I needed to find ways to keep them engaged. AI helped me design activities that got them involved and participating while staying on topic, which made a big difference. 

AI has strengths and weaknesses when it comes to implementation in the classroom. AI can make learning more interactive. It gives students quick feedback on their ideas and helps instructors free up time for teaching. It also encourages students to think critically about what makes good writing or communication. For first-year students, this is an important skill. There are challenges, though. The biggest one is setting clear rules for how AI can be used. Some students used AI when it wasn’t allowed. Those in-class writing samples from early in the semester made it easier to spot work that didn’t match a student’s usual writing style. Clear policies and consistent enforcement are key. 

What Research Says 

Emerging literature has found some uses for AI in these kinds of courses. Research shows that AI can help students brainstorm and write, but some students find AI suggestions hard to review and prefer to work on their own (Cummings, Monroe, and Watkins 2024). In other studies, scholars have found that AI tools improved student engagement and AI literacy (Martin and Nesbit 2024). Chatbots have also served as 24/7 teaching assistants that help students learn (Abdelhamid and Katz 2020). These studies show that AI can be valuable, but its success depends on how it's used. 

Why First-Year Seminars Are a Good Fit for AI 

First-year seminars are meant to set students up for success in college. They introduce important skills and tools that students will use throughout their education. AI is becoming one of those tools. Teaching students how to use it responsibly and thoughtfully makes sense in a course designed to give them a strong start. 

Advice for Instructors 

If you want to use AI in your courses, here are some tips: 

  • Learn the tools yourself first. Spend time experimenting with AI tools like ChatGPT or Grammarly. Know what they can and can’t do. 
  • Set clear policies. Decide when and how students can use AI, and communicate your rules clearly. 
  • Be ready for misuse. Some students may use AI when it’s not allowed. Collect in-class writing samples or use other strategies to address this. 
  • Require reflection. Ask students to explain what the AI did well, what it struggled with, and how it helped their learning. Giving them specific reflection prompts and modeling this process in class can really help.  

References 

Abdelhamid, Sherif, and Andrew Katz. 2020. "Using Chatbots as Smart Teaching Assistants for First-Year Engineering Students." Paper presented at the 2020 First-Year Engineering Experience, East Lansing, Michigan. https://doi.org/10.18260/1-2--35782.  

Cummings, Robert E., Stephen M. Monroe, and Marc Watkins. 2024. "Generative AI in First-Year Writing: An Early Analysis of Affordances, Limitations, and a Framework for the Future." Computers and Composition 71. https://doi.org/10.1016/j.compcom.2024.102827.

Martin, Angela, and Trevor Nesbit. 2024. "Embracing the Use of Generative AI in a First-Year Information Systems Course." Paper presented at the CITRENZ 2023 Conference, Auckland, 27–29 September. https://doi.org/10.34074/proc.240102.  

Tong, Hakan, Ahmet Türel, Habibe Şenkal, S. Feyza Yagci Ergun, Orkan Zeynel Güzelci, and Sema Alaçam. 2023. "Can AI Function as a New Mode of Sketching: A Teaching Experiment with Freshman." International Journal of Emerging Technologies in Learning (iJET) 18 (18): 234–248. https://doi.org/10.3991/ijet.v18i18.42603.  


About the Author  

Amanda Sturgill, Associate Professor of Journalism at Elon University, is a 2024-2026 CEL Scholar focusing on the intersection of artificial intelligence (AI) and engaged learning in higher education. Connect with her at asturgil@elon.edu.  

How to Cite this Post  

Sturgill, Amanda. 2025. "Using Generative AI in First-Year Seminars." Center for Engaged Learning (blog). Elon University, January 10, 2025. https://www.centerforengagedlearning.org/using-generative-ai-in-first-year-seminars/.