Case Study

How a University Cut Student Services Enquiries by 43% — Without Increasing Headcount

A mid-sized university's student services team was fielding 240 avoidable calls and emails per week — students asking about enrolment deadlines, fee payment procedures, housing applications, and exam schedules. Information that existed on the website but was unfindable. After deploying Keyspider AI Search and AI Assistant, avoidable enquiry volume fell 43% in one semester.

12 min read · Higher Education · March 2025

43%

reduction in avoidable student enquiries in one semester

University student services teams are, in many respects, the most efficient human search engines in higher education. Students who cannot find enrolment deadlines, fee payment portals, housing application windows, or exam timetables call the student services desk — and skilled staff answer the same questions hundreds of times per week. This is not a staffing problem. It is a search problem. One university solved it with Keyspider. In a single semester, 43% fewer students needed to call.

The University

The university enrols approximately 18,500 students across undergraduate, postgraduate, and research programmes at two campuses. Its student services function — housed in a central Student Experience office — handles enquiries across enrolment, fee payment, financial aid, housing, student health, careers, international student support, accessibility services, and academic administration. The office employs 28 full-time staff, of whom 14 are in direct student-facing roles.

The university's digital ecosystem had grown to reflect its organisational complexity: a main university website, a student portal (authenticated), a learning management system, a separate postgraduate admissions site, a housing portal, a financial aid microsite, and eight faculty websites. Each had been developed independently, each had its own navigation logic, and none were connected by a shared search layer. A student trying to find the deadline to withdraw from a course without academic penalty might encounter the information on the student portal, the academic calendar PDF, the faculty handbook, or a FAQ page on the main website — depending on which entry point they happened to use.

An enquiry analysis conducted by the Student Experience office at the start of Semester 2 in 2024 quantified the problem. Of the 560 weekly student contacts — calls, emails, and in-person visits — 240 (43%) were categorised as 'navigational or informational': students seeking information that was published somewhere on the university's digital estate. The loaded cost per enquiry — staff time, supervision, infrastructure — was calculated at $22. Annual cost of avoidable enquiries: approximately $274,560.
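For readers who want to sanity-check the figure, the arithmetic is straightforward. The sketch below reproduces the estimate in Python, assuming a 52-week year; the report does not state the exact basis the university used.

```python
# Back-of-the-envelope reproduction of the avoidable-enquiry cost estimate.
# The 52-week year is an assumption; the report does not state its exact basis.
weekly_avoidable = 240    # navigational/informational enquiries per week
cost_per_enquiry = 22     # loaded cost per enquiry, in dollars
weeks_per_year = 52

annual_cost = weekly_avoidable * cost_per_enquiry * weeks_per_year
print(f"Estimated annual cost of avoidable enquiries: ${annual_cost:,}")
# Estimated annual cost of avoidable enquiries: $274,560
```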

560

weekly student contacts across all channels

43%

were navigational — information already published online

$274K

estimated annual cost of avoidable enquiries

8

separate digital platforms with no unified search

The Student Experience: Why They Called Instead of Searched

The student services team had a detailed understanding of why students were calling rather than self-serving. They had tracked it for two years through their ticket categorisation system. The pattern was consistent: students attempted to find information on the website, failed after two or three search attempts, and called. The search failure was not random — it clustered around four recurring themes.

The first was platform fragmentation. Students didn't know whether the information they needed was on the main website, the student portal, the LMS, or a faculty site. Searching on the wrong platform returned no results, even when the information existed on another. Students had no way to search across all platforms simultaneously.

The second was terminology mismatch. The university's content used institutional language: 'discontinuation of enrolment', 'academic transcript request', 'tuition payment instalment plan'. Students searched using their own language: 'drop out', 'get my grades', 'pay in instalments'. Without semantic understanding, keyword search couldn't bridge the gap.
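One common way to bridge that gap is to compare queries and content in an embedding space rather than by keyword overlap. The sketch below uses the open-source sentence-transformers library purely as an illustration of the idea; it is not a description of Keyspider's internals, and the model name is an arbitrary example.

```python
# Illustrative sketch (not Keyspider's implementation): matching student
# phrasing to institutional content titles with sentence embeddings.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # example model choice

content_titles = [
    "Discontinuation of enrolment",
    "Academic transcript request",
    "Tuition payment instalment plan",
]
student_query = "how do I pay in instalments"

query_vec = model.encode(student_query, convert_to_tensor=True)
title_vecs = model.encode(content_titles, convert_to_tensor=True)

scores = util.cos_sim(query_vec, title_vecs)[0]
best_match = content_titles[int(scores.argmax())]
print(best_match)  # -> "Tuition payment instalment plan"
```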

The third was PDF opacity. Critical deadline information — the academic calendar, the fee payment schedule, the housing application timeline — was maintained as downloadable PDFs. Keyword search indexed the PDF filename, not the content. A student searching for 'enrolment deadline' would receive a PDF titled 'Academic Calendar 2024-25' with no indication of whether the relevant deadline was on page 1 or page 12.
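Making PDF content searchable starts with extracting text page by page, so a result can point to the page that actually contains the deadline rather than to the file as a whole. The sketch below uses the pypdf library and a hypothetical filename purely for illustration; it says nothing about how Keyspider's indexer is built.

```python
# Illustrative sketch: extracting per-page text from a PDF so deadline
# content (not just the filename) can be indexed and cited by page.
from pypdf import PdfReader

reader = PdfReader("academic-calendar-2024-25.pdf")  # hypothetical file
pages = []
for number, page in enumerate(reader.pages, start=1):
    text = page.extract_text() or ""
    pages.append({"page": number, "text": text})

# A query as simple as "enrolment" can now be located by page:
hits = [p["page"] for p in pages if "enrolment" in p["text"].lower()]
print(f"'enrolment' appears on pages: {hits}")
```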

The fourth was recency. Students, particularly at key periods in the academic calendar — start of semester, exam period, end of year — were searching for time-sensitive information: 'when does housing open?', 'is the fee payment deadline extended?'. Keyword search cannot prioritise current content; it returns results by relevance to the query terms, not by temporal relevance to the student's situation.
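Search engines that do handle this typically blend text relevance with a freshness signal. The sketch below shows one generic way to do that, using an exponential decay on document age; the weights and half-life are arbitrary examples, not Keyspider settings.

```python
# Illustrative recency blending: combine a text-relevance score with an
# exponential freshness decay so current announcements outrank stale ones.
import math
from datetime import datetime, timezone

def blended_score(text_score: float, published: datetime,
                  half_life_days: float = 30.0, recency_weight: float = 0.3) -> float:
    age_days = (datetime.now(timezone.utc) - published).days
    freshness = math.exp(-math.log(2) * age_days / half_life_days)
    return (1 - recency_weight) * text_score + recency_weight * freshness

# Same text relevance, different publication dates:
older = blended_score(0.8, datetime(2024, 2, 1, tzinfo=timezone.utc))
newer = blended_score(0.8, datetime(2025, 2, 20, tzinfo=timezone.utc))
print(newer > older)  # True: the more recent document scores higher
```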

"Students are smart. They search first. But when they can't find the answer in two tries, they call us. They're not being lazy — the search genuinely isn't giving them what they need. We were a backup for a failing search engine."

Head of Student Experience

FERPA Compliance: The Non-Negotiable Requirement

The university's deployment of AI search introduced a critical compliance consideration. FERPA (the Family Educational Rights and Privacy Act) prohibits the disclosure of personally identifiable information from student education records to unauthorised parties. Any search system that connected public-facing university content with the authenticated student portal — which contained individual enrolment records, grade histories, and financial aid details — presented potential FERPA exposure if access controls were not correctly implemented.

How Keyspider handles FERPA in higher education deployments

Keyspider uses architecturally separate indexes for public-facing content and authenticated student content. The public search widget — accessible from the main university website without login — can only return results from the public index. Student portal search, accessible only after authentication, draws from the authenticated index and is filtered by the individual student's permissions. The separation is architectural, not query-time: it is impossible for a public search to return a result from the authenticated index, regardless of query. This approach satisfies FERPA requirements without relying on access control logic that could be misconfigured.
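In code terms, architectural separation means the public search surface simply has no path to the authenticated index, as opposed to sharing one index and filtering results at query time. The sketch below is a generic illustration of that idea, with invented content and names; it is not Keyspider's actual API.

```python
# Generic illustration of architectural index separation (not Keyspider's API).
# The public widget is wired only to the public index; no query, bug, or
# misconfigured filter can make it read from the authenticated index.
class SearchIndex:
    def __init__(self, name: str, documents: list[dict]):
        self.name = name
        self.documents = documents

    def search(self, query: str) -> list[dict]:
        q = query.lower()
        return [d for d in self.documents if q in d["text"].lower()]

PUBLIC_INDEX = SearchIndex("public", [
    {"text": "Enrolment for Semester 2 closes on 14 March.", "source": "academic-calendar"},
])
STUDENT_INDEX = SearchIndex("student", [
    {"text": "Your financial aid instalment is due on 1 April.", "source": "student-portal"},
])

def public_widget_search(query: str) -> list[dict]:
    # The public surface never references STUDENT_INDEX at all.
    return PUBLIC_INDEX.search(query)

def portal_search(query: str, permitted_sources: set[str]) -> list[dict]:
    # Authenticated search additionally filters by the student's own permissions.
    return [d for d in STUDENT_INDEX.search(query) if d["source"] in permitted_sources]
```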

The university's privacy officer and general counsel reviewed Keyspider's FERPA architecture documentation and data processing agreement before procurement was approved. The architectural index separation — verified through a technical walkthrough — was the critical factor in clearing the deployment for sensitive student-facing content.

What Was Deployed

The deployment comprised two components. The first was Keyspider AI Search on the public-facing university website and the eight faculty sites — a unified search index covering all public content, accessible without authentication. The second was Keyspider AI Assistant on the student portal — an authenticated conversational AI interface drawing from the student-facing content index, allowing enrolled students to ask multi-step questions about their specific situations.

The AI Search deployment went live first, at the beginning of Week 2 of Semester 2 — chosen deliberately to coincide with the period when student enquiry volume was highest (start of semester, orientation week, enrolment confirmation deadline). The AI Assistant deployment followed in Week 6, after four weeks of AI Search analytics had been used to identify the most common multi-step query patterns that would benefit from conversational handling.

Total deployment time from contract signature to both components live: 13 days. The university's IT team needed no server provisioning, no CMS changes, and no integration work with the student management system — Keyspider indexed the published content surfaces, not the back-end systems.

Results: One Semester

43%

reduction in navigational and informational enquiries

92%

student satisfaction with AI search experience

68 sec

median time-to-answer for self-served queries

18 hrs

per week recovered for complex student casework

Enquiry Volume

At the end of Semester 2 — approximately 14 weeks after deployment — total weekly student contacts to the Student Experience office stood at 420, down from 560. Of this reduction, 120 contacts per week were directly attributable to the AI search deployment: the navigational and informational category fell from 240 per week to 120, with the remaining 20 coming from smaller declines in other categories.

The categories with the largest reductions were enrolment and academic calendar enquiries (down 52%), fee payment and financial aid process questions (down 47%), and housing application procedure enquiries (down 44%). The smallest reduction was in mental health and welfare referral contacts — 3% — consistent with the expectation that AI search addresses information-seeking, not welfare support needs.

Student Satisfaction

End-of-semester student experience surveys collected 2,840 responses — 15% of enrolled students. The digital services section of the survey asked specifically about the AI search experience. Ninety-two percent of students who had used the search rated it positively. The most frequent qualitative comments described finding the right form, deadline, or procedure 'without having to call', with specific praise for AI-generated answers that 'actually explained what I needed to do next' rather than returning a list of links to navigate.

Staff Impact

The Student Experience team lead calculated that the reduction in navigational enquiries had recovered approximately 18 staff hours per week — time previously consumed by answering questions the website should have been able to answer. This capacity was reallocated to complex student support cases: academic appeals, financial hardship applications, international student visa enquiries, and accessibility service coordination. In the same semester, the team resolved 22% more complex cases than in the equivalent semester of the prior year, with no change in headcount.

The AI Assistant Difference for Complex Queries

The AI Search deployment solved the single-question problem well. But student enquiries are frequently multi-step: 'I'm an international student on a student visa — if I withdraw from two of my four enrolled units, what happens to my visa status and my scholarship?' This question requires synthesising information from immigration compliance guidance, academic regulations, and financial aid conditions — three separate content areas.

AI Assistant addressed this. For queries involving multiple policy intersections, the conversational interface allowed students to provide context ('I'm on a student visa'), ask follow-up questions, and receive synthesised, cited guidance that drew from all relevant content areas simultaneously. The interface made explicit that it was providing general guidance based on published policy — not individual legal or immigration advice — with every answer citing the source document and recommending follow-up with the relevant specialist for individual circumstances.
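Behind an interface like this typically sits some form of retrieval-augmented generation: relevant passages are pulled from each policy area, and the model is instructed to answer only from those passages and to cite them. The sketch below shows that pattern in generic terms; the source names and prompt wording are invented for illustration and are not Keyspider's implementation.

```python
# Generic retrieval-augmented prompt assembly with citations (illustrative only).
retrieved = [
    {"id": "visa-compliance-guide",  "text": "International students must maintain a full-time enrolment load unless a reduced load is approved..."},
    {"id": "academic-regulations",   "text": "Withdrawal from units after the census date is recorded on the academic transcript..."},
    {"id": "scholarship-conditions", "text": "Scholarships require continued enrolment in the minimum credit load..."},
]

question = ("I'm an international student on a student visa - if I withdraw from "
            "two of my four enrolled units, what happens to my visa status and my scholarship?")

context = "\n\n".join(f"[{doc['id']}]\n{doc['text']}" for doc in retrieved)
prompt = (
    "Answer the student's question using only the sources below. "
    "Cite the id of each source you rely on, note that this is general guidance "
    "based on published policy, and recommend the relevant specialist for "
    "individual circumstances.\n\n"
    f"Sources:\n{context}\n\nQuestion: {question}"
)
# `prompt` would then be passed to whichever answer-generation model is in use.
```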

In the four weeks following AI Assistant activation, 34% of students who engaged with the chat interface reported that they would otherwise have called or emailed the student services team. For a team where staff time was the binding constraint, this was the metric that mattered most.

Ready to give your student services team their time back?

Book a demo with our higher education team. We'll configure Keyspider on a sample of your student-facing content and show you what self-service looks like for your students.

Book a Demo
