Collaboratory

Improving usability for a widely used higher ed engagement platform.

Project Details

Background Context

Using UX Research to Improve Data Collection in a SaaS Platform Used by 40+ Universities

Collaboratory is a SaaS platform used by 40+ U.S. higher education institutions to manage and track over 3,000 community engagement activities. Despite widespread adoption, the platform faced critical usability issues around navigation, data entry, and support resources. These challenges negatively impacted user satisfaction and platform efficiency.

As part of a 3-month UX research project, our team partnered with Collaboratory to uncover usability pain points and deliver actionable recommendations.

Tools: FigJam, Figma, Google Sheets, Qualtrics

The Problem

Users struggled to navigate the platform, complete tasks efficiently, and access support when needed

Collaboratory users — faculty, staff, Community Partners, and software administrators — struggled to:

  • Navigate the platform efficiently

  • Enter data without confusion or redundancy

  • Find help resources when needed

These issues increased cognitive load, reduced efficiency, and limited the platform’s ability to scale its impact across institutions.

My Role

Led research from problem discovery to synthesis

  • Designing and conducting semi-structured user interviews

  • Running a heuristic evaluation of the platform’s core features

  • Facilitating usability testing sessions

  • Synthesizing data through affinity diagramming and persona development

  • Communicating insights and recommendations to the client

Research

1. User Interviews

Users rely on fragmented tools and face barriers to efficient collaboration

We interviewed four faculty and engagement staff members across four universities to explore how they manage community engagement data and navigate the Collaboratory platform. Participants represented key institutional roles: faculty coordinators, engagement officers, and software administrators. Each brought distinct perspectives and responsibilities that shaped how they experienced the platform.

Key Themes:

  1. Fragmented Workflows
    Users juggle tools like Google Sheets, internal websites, and manual emails to track engagement, leading to inefficiencies and inconsistent data.

  2. Limited Access for Community Partners
    Faculty are burdened with extra administrative work because partners can’t enter data directly, creating bottlenecks and added responsibility.

  3. Unclear Terminology and Onboarding
    Users were confused by terms like “Activity” and “Partnership” and found the documentation insufficient for self-guided use.

  4. Lack of Storytelling Tools
    Participants want ways to visualize and present engagement data more meaningfully—for reports, grants, and broader institutional impact.

2. User Personas

Our synthesis revealed clear opportunities for alignment with user expectations

We synthesized interview data into three key user personas representing core institutional roles. These personas helped us prioritize user needs and align our usability findings and design recommendations.

🧑‍🏫 Faculty Coordinator

  • Goals: Track and report community engagement across departments

  • Pain Points: Tedious manual entry; struggles explaining the platform to collaborators

  • Needs: Streamlined workflows, stronger support materials, and better visibility into reporting outcomes

👩🏽‍💼 Engagement Officer

  • Goals: Support data entry and follow-up across multiple initiatives

  • Pain Points: Reliant on others for updates; no direct access for partners

  • Needs: Role-based permissions, access delegation, and in-platform communication tools

🧑‍💻 Software Administrator

  • Goals: Manage backend system and permissions for users

  • Pain Points: Limited configuration options; reactive support structure

  • Needs: Admin dashboard, user role customization, clearer system feedback

3. Competitive Analysis

Competitor platforms set usability expectations that Collaboratory did not meet

We conducted a competitive review of five platforms (Salesforce, Eventbrite, GivePulse, and others) to identify UI patterns and missed opportunities.

Insights

  • Several platforms offered advanced reporting features and dashboards that supported data storytelling (e.g., Visible Network Labs, Salesforce).

  • Most competitors included collaboration features with role-based access controls, something Collaboratory lacked.

  • Platforms like Eventbrite used cleaner search/filter flows and progressive onboarding, setting usability expectations that Collaboratory did not meet.

4. Heuristics + Usability Testing

Core workflows showed friction, especially for users unfamiliar with the platform

We conducted a heuristic evaluation using Nielsen’s 10 Usability Heuristics and ran usability testing with four student participants unfamiliar with the platform. Our goal was to identify major navigation and interaction breakdowns in key tasks such as searching, onboarding, and accessing help resources.

Heuristic Findings

  • No progress indicators on multi-step forms

  • Inconsistent icons, typography, and layout design

  • Lack of a return button or task shortcuts

Usability Test Data

  • Only 1/4 users located the Help Center without assistance

  • 3/4 users struggled with applying search filters

  • Avg. task time: 3.5 mins (target was under 2 mins)

Final UX Recommendations + Screens

Redesign Search Functionality

  • Users struggled to find and apply filters on the search page, often missing the “update” step.

    • Improve visibility: Place search bar on homepage and use sticky scroll behavior

    • Reduce friction: Auto-apply filters instead of requiring a separate “update” button


      “I thought I applied the filter, but nothing changed.”

Image of the filter bar showing the “Update” button—users frequently missed this step, leading to failed searches

Clarify Help Center and Onboarding

  • Users overlooked help resources or couldn’t find what they needed.

    • Centralize support: Consolidate help content with plain language and video tutorials

    • Boost visibility: Increase text size and contrast on key Help Center elements


      Only 1 of 4 users found the Help Center unaided

Help Center link buried in the footer; only 1 of 4 users discovered it without guidance.

Simplify Menu Navigation

  • Navigation layers and missing shortcuts added friction to task flows.

    • Flatten IA: Reduce unnecessary menu depth

    • Add shortcuts: Include a persistent back button so users can recover easily within the platform’s link-based navigation


      “I wish there was a back button.”

Nested and redundant menu items (e.g., “Activities” listed twice) increased cognitive load and confusion.

Improve Data Entry Flow

  • Form interactions lacked clarity and guidance.

    • Provide feedback: Use progress indicators and inline error messages

    • Clarify completion: Add visual status icons (e.g., green checkmarks)


      2 users skipped required fields without realizing it

Required field (“Units”) left blank without error message—users were able to continue without completing the form.

Enable Role-Based Access for Community Partners

  • Faculty felt overwhelmed managing platform data alone.

    • Distribute responsibility: Allow partners to input their own information

    • Reduce admin burden: Expand permissions for non-institutional collaborators


      “I spend hours inputting for other departments.”

Final Presentation

Impact

Improving Engagement on the Collaboratory Platform

Although implementation fell outside our project timeline, our research offers clear benefits:

  • Reduced user frustration by aligning features with user mental models

  • Lowered onboarding barriers for new users and stakeholders

  • Improved long-term usability and institutional retention

Reflection

This project sharpened my ability to guide end-to-end UX research—from generative inquiry to evaluative testing—with a focus on actionable outcomes.

What I Learned

  • To tell compelling stories with user data

  • To advocate for usability through visual and verbal communication

  • To balance stakeholder goals with user-centered priorities

This experience strengthened my passion for solving complex, system-level UX problems and confirmed my readiness to bring thoughtful, research-driven work to other platforms.