As artificial intelligence (AI) rapidly transforms the landscape of education, faculty at Brooklyn College—and around the globe—are debating both its promise and its pitfalls. From simplifying time-consuming administrative tasks to sparking new debates about academic integrity, AI is already reshaping how professors teach and how students learn.

But one thing is clear: AI isn’t going anywhere. That presents an exciting, once-in-a-generation opportunity for educators and administrators to lead the way by establishing thoughtful practices, systems, and protocols that can shape the future of education for the better.

Mariya Gluzman brings a unique perspective to the evolving conversation around AI in higher education. Currently an instructional designer in the Brooklyn College Library & Academic IT department, she is at the forefront of integrating AI into academic practice.

Over the past year, she has led numerous hands-on workshops for students, staff, and faculty, helping the campus community navigate the opportunities and challenges of AI. Most recently, in July, she led a series of practicums for staff and faculty to explore and apply AI tools to their work in course design, instruction, and assessment. In May, she also presented a case study on NotebookLM and led a hands-on demo session on the tool at the third annual Teaching and Learning With AI Conference, hosted by the University of Central Florida.

Gluzman’s insights go beyond tech and training; she is also a seasoned educator. For more than two decades, she taught in the Department of Philosophy as an adjunct lecturer, where she combined her expertise in instructional design with a passion for innovative, student-centered pedagogy.

We asked Gluzman about her work at Brooklyn College—and how her dual roles as an educator and instructional designer inform her vision for the future of AI in education.

What does AI literacy mean to you, and why is it important for faculty to embrace it for themselves and their students?

The notion of AI literacy often depends on context. Typically, it means understanding generative AI’s capabilities and limitations, plus effectively using various AI tools. In academic settings, however, it requires additional considerations, especially around academic integrity and social justice.

For students, AI literacy involves learning to discern appropriate and inappropriate uses of AI in their studies. This goes beyond simply picking the right tool or formulating a solid prompt; it also means understanding potential downsides, even with legitimate uses. The goal for students is to leverage AI to enhance their work and intellectual growth, not to outsource their thinking and decision-making to a machine.

For faculty, AI literacy includes yet another essential dimension: the responsibility to actively shape and direct AI’s role in teaching and learning.

What do you feel are the most effective ways professors can integrate AI tools like ChatGPT into their teaching without compromising academic integrity?

There are countless possibilities, but put succinctly, I would group AI applications for instructors into two main categories: enhancing student learning and streamlining faculty work.

To enhance student learning, AI can help create instructional scaffolding for assignments, develop accessible and multimodal content, generate tutorials and assignment templates, and gamify learning activities. For faculty efficiency, AI can assist with building test question banks, creating grading rubrics, and better aligning course content with learning outcomes.

Of course, faculty also have to remember that ethical AI implementation means being transparent about their own use of this technology while remaining careful and diligent.

What are the biggest misconceptions professors tend to have about AI’s capabilities in the classroom?

Some professors believe that not engaging with AI will somehow delay its presence in higher education, or that without explicit permission, students will simply avoid using this technology. The reality, however, is that AI is already part of the academic landscape, and students are using it, often without proper guidance.

Another key concern some faculty have is that using AI for research, instruction, or assessment means outsourcing one’s cognitive labor or sacrificing academic freedom. This need not be the case. Consider how we use other advanced tools: We rely on sophisticated kitchen equipment to create complex dishes, but we still design the meal; we drive cars with adaptive cruise control and parking assist, yet we remain firmly in control of our destination.

Generative AI can help us to be more intentional and strategic in our work, allowing us to focus on higher-order tasks. Using these tools responsibly helps us actively shape their integration into learning.

You presented at the third annual Teaching and Learning With AI Conference. What were your biggest takeaways from that event?

One important takeaway is how some fundamental challenges in higher education persist, even with AI’s emergence. For example, during my presentation, a faculty member raised a classic concern: knowing if students are actually doing the required readings. This issue isn’t new. Unless professors dedicate class time to close reading activities, it’s difficult to be certain. This problem is even more pronounced in online asynchronous courses. Ultimately, we can only design varied learning activities and assessments that offer clues about student engagement with assigned texts. AI tools can make it easier to build these activities.

Another key insight is that most faculty just need to see compelling examples of how AI can genuinely help students. If they see AI being used to boost engagement with course materials or peers, help students acquire essential academic or life skills, or clarify important terms—all without adding extra burden on instructors—they’re often eager to adopt it. Once they see a useful application that truly benefits their students, they’ll take that idea and run with it.

What have you and your colleagues been doing to enhance the understanding of AI on campus?

During the last academic year, I facilitated numerous faculty workshops on a range of AI topics. We covered everything from creating and enforcing AI policies and safeguarding assignments against AI plagiarism, to designing AI-enhanced student assessments and integrating AI literacy into course curricula.

Beyond workshops, a few trailblazing colleagues and I launched an informal group. We meet sporadically to share our work, brainstorm AI use cases, and foster ongoing conversation about AI integration.

Our Library & Academic IT team also developed an AI faculty guide. Additionally, the Center for Teaching and Learning created and moderated a “Mythbusting AI” panel for the 2025 Faculty Day Conference, exploring common assumptions about AI. We’re eager to continue these efforts, ideally with even greater faculty input and participation in the upcoming academic year.

You have called getting everyone up to speed about AI an urgent matter. Why?

The urgency of getting everyone to engage with AI, beyond just “getting up to speed,” is being felt on two distinct fronts.

First, many industries are rapidly adopting generative AI. Given the population Brooklyn College serves as a public university, and given our mission, it's crucial that we help our students succeed in today's job market. They need to be just as prepared as applicants from brand-name colleges who've had every opportunity to work with cutting-edge technology. This requires a nuanced understanding of AI's benefits and costs.

The second is the stewardship of this technology, especially its role in higher education. Many school administrations are already shaping policies and tool choices for their communities. But how many of these critical decisions are truly informed, let alone shaped, by faculty input? We had no say in whether this technology should even exist or be used in academic contexts. Considering AI’s rapid development, we risk losing a once-in-a-lifetime opportunity to direct and shape its trajectory here at Brooklyn College, and in higher education generally, through informed debate and engagement.