(Steve here: This is Part 2; Part 1 can be found here. If you didn’t read Part 1, this column can be understood by itself. Part 3 will come out next week.)
Since ChatGPT launched in November last year, it has sparked a flurry of opinion pieces.
ChatGPT is artificial intelligence (A.I.) software that lets you enter a natural-language prompt, and it will write a response. For example, a prompt could be, “Write a 1,200-word college admission essay that highlights what I learned about leadership from the student council, the chess club, and theater, and how I had to grow up quickly when my mother died from cancer when I was 12. I have a 3.2 GPA.”
The generated essay is here. The result needs editing: it stopped short of 1,200 words for some reason, got a few facts wrong, and won’t win an award. Still, I’ve seen far worse writing from business leaders.
One of the main questions people are asking is how students will use it to cheat and how to catch them when they do.
For those of us thinking about the future of higher education, that’s the wrong long-term question.
Question Me This
Rather, we should be considering the following:
How can we use A.I. to customize education to help students learn subjects more thoroughly?
What will A.I. mean for skills we teach around technical writing, business writing, and other less creative forms of written communication?
Do we need to train professors in how to use A.I.?
How can we ethically use A.I. for the first drafts of projects?
What courses will we need to develop to teach people how to interact more effectively with A.I. prompts? One of the challenges of working with ChatGPT and Microsoft’s new A.I.-empowered Bing (which is not yet widely available) is knowing what questions to ask. This will evolve quickly.
How will A.I. like Stable Diffusion, which generates art images, impact how we teach art? (The images in this newsletter were created by A.I.) How can we teach students to use A.I.-generated images to make more compelling presentations?
How will A.I. impact the need for textbooks? How can we incorporate A.I. into Open Educational Resources (OER: free, publicly available teaching and learning resources in the public domain) to help students, faculty, and the public?
How can A.I. bots be used to provide better student services, including academic advising, career advising, admissions, financial aid, and other areas? In 2016, Georgia Tech started using bots as student advisors in one of its classes—many students couldn’t tell the difference. At one point, the bot was voted best student advisor.
How do we develop new disciplines, like A.I. Forensics, to help tell the difference between human-created and A.I.-generated material?
What ethics courses need to be developed across disciplines to help faculty, students, and staff understand the proper uses of A.I.?
The list goes on.
These questions are all important and will need to be addressed.
However, for higher education policy makers, leaders, and reformers, there are bigger considerations.
The biggest questions for the future:
Will A.I. require us to rethink fundamental assumptions about how we teach and how students learn?
How can we use A.I. to help develop high-quality, low-cost degree programs?
How will this ability disrupt institutions that are already at risk?
The Dead City
Whichever way the A.I. revolution plays out, there will be a lot of creative destruction in higher education. Here are predictions of how some of this will unfold.
The response:
Large institutions with strong brands and solid funding will be the slowest to adapt, as they can afford to wait to see how things play out. However, they are not immune to enrollment pressures and won’t be able to sit on the sidelines for long.
Some smaller institutions struggling to stay open will embrace A.I., just as early adopters of online education (e.g., Southern New Hampshire University and Arizona State) now dominate that market. This focus will lead to significant reorganizations, closing and merging departments, resulting in faculty layoffs while presenting new opportunities for others.
Midlevel institutions, too big to fail but not quite big enough to thrive, will be in the hardest spot. They won’t have the flexibility to change quickly or the budgets to implement A.I. rapidly enough to gain a competitive advantage.
There will be fierce debate about how much A.I.-created class content should be, and will be, approved by faculty groups, academic bodies, and accreditation agencies. Because these organizations tend to favor incumbents and traditional ways of teaching, they will lag behind the best practices developed by faculty and in the general marketplace.
The upside:
With good use of A.I., colleges and universities can improve student retention and outcomes; early experiments, like Georgia Tech’s advising bots, point in that direction.
Schools that successfully implement A.I. in recruiting will have an advantage. Better identification and tracking of prospects will produce a better enrollment yield.
Colleges and universities that can bring A.I. into the online learning space quickly will have a major advantage.
A.I. processes could help streamline the business side of the house, saving time and money. That money can be reinvested elsewhere or used to keep costs in check.
Competition and Fallout:
More non-traditional competitors will enter the online learning space, taking full advantage of A.I. to offer high-quality, job-specific credential programs. They will use every tool available to keep costs down.
The increasing number of non-traditional options will encourage more students to forgo the traditional college route because they do not see a return on the time, cost, and debt involved. This will especially apply to first-generation students and working adults.
That will accelerate the pace of college mergers and closings, already exacerbated by COVID, a decline in traditional-aged college students, and increased competition from online and non-traditional offerings.
Expenses:
The demand for A.I. talent, like IT talent today, will hit college and university employment budgets.
Universities will dump millions of dollars into technology to remain competitive.
As with many new technology rollouts, campuses will be ill-equipped to manage them, and there will be massive cost overruns. (This isn’t a ding on higher education; it happens in business as well.)
Consulting firms will ramp up their A.I. offerings for higher education, again costing institutions millions.
Many experiments at the campus level and in companies that serve higher education will fail. While a natural part of the process, it can be harmful if not managed properly.
This list of predictions is by no means exhaustive; I’m sure you can think of other areas that will be impacted.
The point is to start thinking about it now so you can manage it and not have it manage you.
The question now is, what can you and your institution do to prepare for what is coming?
I’ll cover that in Part 3.