11/12/2025 | News release | Distributed by Public on 11/13/2025 01:01
A four-hour online course for undergraduates in the basics of generative artificial intelligence is among Boston University's initiatives to acclimate students and staff to this cutting-edge technology. Photo by Urupong/iStock
Boston University is launching a four-hour online "AI at BU" student certificate to help BU undergraduates explore the power of generative AI. Now open for registration (via the MyBU Student Portal), the interactive certificate introduces students to the fundamentals of artificial intelligence.
BU officials say the initiative is part of the University's broader commitment to preparing students for an AI-driven future and is led by the AI Development Accelerator (AIDA), in collaboration with the Institute for Excellence in Teaching & Learning and supported by BU Virtual. The certificate is free to obtain, and students will be encouraged to share their completion in pursuit of jobs and internships as proof of their AI skills. The certificate is not a for-credit course.
"Students are hungry for good information about generative AI, and this course delivers it in a fun and engaging format," says Joseph Bizup, Arts & Sciences associate dean for undergraduate academic programs and policies and an associate professor of English. "What I especially like is its scope and evenhandedness. It addresses the history of AI as a technology, why instructors might adopt different policies regarding AI in their courses, and issues of sustainability and ethical use."
John Byers, recently appointed to help shape BU's AI strategy as the executive director of AIDA, talked to BU Today about the certificate and the resources available for faculty and staff. Formerly the founding chief scientist and director at the technology incubator Cogo Labs, Byers is a CAS professor of computer science, with a secondary appointment on the Faculty of Computing & Data Sciences.
Byers: Across campus, nearly everyone can use AI to simplify routine tasks. For instance, students navigating the BU Hub, the University's general education program, can use AI tools to filter thousands of course options based on their interests, preferred departments, time blocks, and frequency of course offerings. The same tools can assist academic advisors, helping them guide students through BU's expansive curriculum offerings with greater ease and accuracy.
Faculty and staff are also finding practical ways to use AI in their daily work. In biomedical engineering, one professor created an AI assistant to help manage hundreds of emails from prospective master's students, automating responses so he can focus on higher-level engagement. In research, faculty are using AI to review complex grant proposals, ensuring compliance with federal guidelines and identifying missing elements faster.
We have spoken with faculty, students, and staff about their experiences with AI, and we have found the full spectrum, from seasoned experts to those just beginning to explore what's possible. Many faculty are already experimenting, incorporating it into their teaching and research in creative ways, inviting students to collaborate with AI in the classroom in disciplines from STEM fields to philosophy.
We also launched TerrierGPT, an AI-powered chatbot platform that gives the entire BU community, from faculty to students to staff, equitable access to foundational AI models that support learning, discovery, and innovation.
Byers: The certificate is self-paced and organized into six modules, each taking about 40 minutes to complete. Students who complete the program will earn a digital certificate recognizing their achievement. This credential can also be shared on LinkedIn and other platforms to highlight their AI skills.
The opening module focuses on the basics of prompting with AI, such as learning how to ask the right questions and probing for answers. Students will practice using TerrierGPT and other generative AI applications to summarize documents, generate and answer questions, upload and analyze class materials, streamline workflows, and craft clear and purposeful prompts and queries.
The next part is critical evaluation, or understanding that AI's outputs are not always accurate. When we teach AI tools, we're honest about their limits. They can hallucinate, give incorrect facts, or steer you in the wrong direction. A big part of the course is learning to treat AI like a collaborator: interrogating its answers, checking sources, spotting errors, and refining prompts until the output improves.
Byers: We spend substantial time on ethics and responsible use: when and how to disclose AI use, how to avoid passing off machine-generated work as human-authored, and how poor disclosure can damage trust.
The module on bias also tackles one of the most important aspects of AI literacy: recognizing that AI systems can reflect the biases of the data they were trained on. They can reproduce gender, racial, or cultural bias found in their training data. The course explores how to identify and screen for these biases and apply critical judgment when interpreting AI-generated content.
And the course adds a touch of fun with "AI life hacks": creative ways to use generative AI beyond the classroom or workplace, whether planning a trip, organizing a project, or exploring a hobby. I like birding, for instance, so I might ask: I'm interested in seeing a quetzal; where should I go?
Yes, this is an important academic integrity issue. Right from the start, students learn that every professor will have their own AI policies, and it is their responsibility to understand and follow those guidelines. Using AI to write or generate work without citation and passing it off as your own is plagiarism, plain and simple.
But not all misuse is intentional. The more subtle challenge is overreliance: letting AI do the thinking for you. We don't want students to default to AI as a replacement for their own learning. For example, one engineering instructor asks students not to use AI for prototyping ideas, since creativity and problem-solving are part of the assignment. However, she encourages using AI later to check calculations or polish the work; spell-checking a document after you have written it, for instance, is absolutely OK. This kind of thoughtful use of AI enhances learning rather than replacing it.
We ask the students to build a code of AI conduct for themselves, to reflect on how and when they want to use AI. This includes thinking about broader impacts, like the environmental cost of AI. Heavy use consumes significant computing energy. The course encourages mindful use, like asking five well-formed questions instead of 100 unfocused ones.
Faculty are eager to understand what students will be learning and to make sure they're up to speed. I'm going to be leading several 90-minute "in studio" workshops, starting with an initial one for faculty on November 17. Registration is required. We're going to offer another one in mid-December, and we expect more to follow after intersession.
The workshops will be in person, in a studio classroom, and will include a guided tour of the online student course, so faculty can see firsthand what their students are learning. Participants will also get hands-on practice with TerrierGPT, learning effective prompting techniques and exploring key topics, such as responsible use, critical evaluation, and iterative refinement. The session will conclude with a collaborative exercise in which faculty design or strengthen AI policies for their own courses.
We will also be developing a parallel workshop for staff, focused on how AI can enhance their day-to-day work and job functions. While it will begin with the same foundational generative AI concepts, we will also move into helping staff identify where AI can simplify routine tasks, improve efficiency, and support creative problem-solving, essentially taking existing workflows and mapping them onto AI.
We approach AI at BU as a beneficial technology when used thoughtfully and responsibly. We're not evangelizing or insisting that everyone use it. There's no requirement that faculty, staff, and students use AI in their work.
We see AIDA as a service organization-we're here to support anyone who's curious, uncertain, or eager to learn more. Whether you're experimenting for the first time, looking to advance your understanding and learn from others, or even skeptical but open to learning, we want to make that exploration accessible. Most important, we want to be thoughtful about how we all engage with AI.
The reality is that AI is already a part of our daily lives. It is hard to avoid. If you're using Google, Netflix, or Yelp, you're engaging with AI-driven systems. Even simple tools like predictive text or autocorrect and spell-check are forms of AI. Generative AI is just another step in the evolution of these technologies, and learning how it works helps to demystify it.
Our goal is not to convert skeptics, but to encourage open dialogue and to create awareness. As educators and professionals at BU, we share a responsibility to stay informed about how digital tools are shaping our world. Developing a clear sense of AI's possibilities, limits, and risks allows us to teach and lead with integrity as this technology continues to evolve.
The interview was edited for clarity and brevity.
BU Launches Online AI Course For Undergrads; Additional AI Resources for Faculty, Staff