Thought Leadership

Managing the Risks and Rewards of AI in Postsecondary Education

By Roy Daykin, CFO (retired), with editorial support from AI

 

As postsecondary institutions explore how to leverage artificial intelligence (AI), CFOs are uniquely positioned to navigate the complex balance between its risks and rewards. In my view, the real challenge with AI in higher education is not just about technological adoption—it’s about survival. Institutions that fail to embrace AI risk falling behind, while those that move forward will likely encounter internal resistance, particularly around implementation and integration. 

History shows us that every major technological innovation—whether it was spell check, calculators, computers, or online learning—faced skepticism before becoming essential. Today, AI is no different, and the reluctance to embrace it, especially among faculty, mirrors the resistance seen in past advancements. 

To fully understand AI’s potential, we need to examine both its risks and rewards and explore a thoughtful, strategic approach to adoption. 

 

The Risks: Internal Resistance and Misuse

One of the primary risks I see with AI adoption is the internal disruption it will cause, particularly within faculty ranks. Many instructors are understandably resistant to students using AI for assignments, fearing it will lead to academic misconduct. But much like how calculators and word processors eventually became trusted tools, AI, too, can be integrated in ways that enhance learning rather than detract from it. 

One real-world example of misuse was a case where AI-generated images placed well-known political figures into fabricated criminal situations. While the capacity for misinformation and manipulation exists, it’s essential that institutions teach their students and faculty how to navigate these risks. Part of higher education’s role must be to educate students on how to differentiate legitimate uses of AI from deceptive ones. 

 

The Rewards: Efficiency, Scalability, and Improved Services

Let’s turn to the rewards, which, I believe, far outweigh the risks if managed correctly. AI has the potential to make organizations more operationally efficient and scalable, reducing the burden on human staff for routine tasks while enabling them to focus on higher-order challenges. Automation is a major part of AI’s value proposition, and it can significantly improve the accuracy and speed of operations. 

One area where AI is already showing promise is in student services. Increasingly, students expect 24/7 access to support without having to call or visit an office. AI-powered chatbots can answer basic inquiries, guide students through administrative processes, and even help manage their mental health concerns by providing immediate, round-the-clock access to resources. 

On the operational side, AI can streamline tasks such as heating and security management, finance and human resources processes, resource allocation, and course registration systems. For example, at Southern Alberta Institute of Technology (SAIT), we began experimenting with AI in our admissions and registration systems, which allowed us to better manage student data and course scheduling. This created a more seamless experience for students and reduced the workload for our staff. 

 

Building a Culture of Experimentation

One of the most critical strategies for managing AI’s risks is fostering a culture of experimentation. Too often, institutions, particularly public ones, are risk-averse because of concerns about reputation and political fallout. This can lead to poor decision-making or, worse, stagnation. As leaders, we need to encourage our teams to experiment with AI, but within a framework of strong policies and clear guidelines. 

Involving both faculty and students in developing AI-related policies is essential. I recently read about an institution whose faculty worked with students to create guidelines on how AI could be used in the classroom. This collaborative approach helped mitigate concerns and fostered an environment of shared responsibility and improved trust. Engaging students in this process helps ensure policies are grounded in real-world applications, making them more likely to succeed. 

Additionally, institutions need to build comfort with risk. This means not punishing individuals when things go wrong but instead encouraging them to learn from the experience. I’ve seen too many promising innovators get slapped down when an initiative didn’t go as planned. Over time, this creates a culture of risk aversion, which ultimately stifles progress. 

 

AI as a Competitive Advantage

Demographic shifts are creating a new reality in both Canada and the U.S. As our domestic populations of postsecondary students shrink, institutions must find ways to stay competitive. AI can offer institutions a competitive edge, both by improving student services and by attracting international students who expect cutting-edge technology in their learning environments. It can also provide personalized support for First Nations students who face unique challenges and opportunities in their educational journeys. 

For institutions that embrace AI, the rewards are plentiful. I believe institutions that successfully adopt AI will see not only higher enrolment numbers but also more satisfied students and improved operational efficiency. By offering better services and leveraging AI to meet students’ evolving expectations, these institutions will be well positioned to thrive in a competitive landscape. 

However, this will require more than just adopting AI for operational improvements. Faculty will need to embrace AI, both in their classrooms and in their research. Just as calculators eventually became a standard tool, AI must be integrated into the learning process. Institutions that can help their faculty overcome this resistance will find themselves ahead of the curve. 

 

Taking a Strategic Approach to AI

AI is here to stay, and institutions must approach it strategically. As CFOs, we are responsible for ensuring that our institutions not only survive but thrive in this era of rapid technological change. AI offers us a path to greater operational efficiency, improved student services, and a competitive edge in a shrinking student marketplace. However, this requires careful management of the risks, a willingness to experiment, and a commitment to involving all stakeholders in the process. 

By embracing AI thoughtfully and strategically, we can ensure our institutions remain relevant and competitive, both today and in the future. 

 

This content is part of a series that explores the impact and implications of AI in higher education, created through a collaboration between a human subject matter expert and a large language model. 

 

 

Roy Daykin

With over 35 years of experience in postsecondary education, Roy Daykin has guided some of Canada’s top institutions toward operational excellence and student-centered innovation. Recently retired from Southern Alberta Institute of Technology (SAIT) after six years as Chief Financial Officer, Roy led finance, technology, human resources, and facilities and capital planning, aligning operations with SAIT’s student-first mission. His career includes pivotal roles at Okanagan College and Langara College, where he served as President & CEO pro tem. A Chartered Professional Accountant with a Master’s in Leadership, Roy remains dedicated to improving educational processes as a member of CampusWorks’ Education Advisory Board.