Education · Failure

Why a University's $18M Adaptive Learning Platform Failed Its Students

Large Public Research University

$18M investment

11% faculty adoption

−22% student satisfaction

Outcome: shelved

The Challenge

Facing pressure to improve graduation rates and compete with online education providers, a large public research university secured an $18M grant to build an AI-powered adaptive learning platform. The vision: personalize learning paths for 45,000 students across 200+ courses, identifying struggling students early and adjusting content difficulty in real time. The provost championed the initiative and set a two-year timeline aligned with an accreditation review cycle.

The Approach

The university hired an external technology firm to build the platform, selecting a vendor based primarily on an impressive demo rather than a rigorous evaluation process. Faculty were informed of the initiative via email and invited to "onboarding sessions" that were actually one-way presentations. The platform was designed to replace core elements of the learning management system faculty had used for a decade. No pilot courses were run. The system launched simultaneously across 50 introductory courses affecting 12,000 first-year students. IT staff received minimal training on maintaining the system, and no plan existed for model updates as course content changed.

The Results

Within the first semester, only 11% of faculty assigned to the platform actively used it—the rest found workarounds to continue teaching as before. Students in platform courses reported 22% lower satisfaction than peers in traditional courses, citing confusing navigation and recommendations that did not match their professor's teaching. The adaptive algorithms, trained on data from the vendor's other clients (primarily community colleges), produced inappropriate difficulty adjustments for a research university population. IT staff could not debug model behavior or update course content mappings. After three semesters and $18M spent, the platform was quietly shelved. Graduation rates did not improve.

Seven Pillar Insights

Leadership Alignment

Top-down mandate from the provost without faculty co-design created active resistance from the people the system depended on most.

Capability Building

No internal team could maintain, update, or troubleshoot the platform after the vendor's engagement ended, creating an $18M orphan system.

Continuous Evolution

No plan existed for updating models as courses evolved, meaning the AI became less accurate every semester as curricula changed.

Key Lessons

1

Faculty are not end users to be trained—they are domain experts whose input must shape the system from day one

2

AI models trained on one educational context rarely transfer to another without significant adaptation

3

Simultaneous rollout across 50 courses created 50 simultaneous failure points with no opportunity to learn and iterate

4

Building without internal technical capability to maintain and evolve the system guaranteed obsolescence

Ready to Avoid These Pitfalls?

Take the AI Leadership Assessment to identify your organization's strengths and vulnerabilities.

Want expert guidance on your AI strategy?

Schedule a consultation with Neil to explore how these lessons apply to your organization.

