AI CyberSec Learning Assistant

May 16, 2025

An AI-powered learning assistant designed for the learning platform TryHackMe, guiding learners through complex cybersecurity content with clarity and confidence. The experience uses contextual nudges, adaptive feedback, and motivational prompts to reduce friction, keep learners on track, and improve engagement without overwhelming them. The goal was simple: make learning feel supported, not solitary.

7000+

hours of time saved

The process

The design process balanced user empathy with data-driven decision-making. We combined behavioral insights, product analytics, and rapid prototyping to design an AI experience that feels natural, helpful, and motivating. Every design decision prioritized clarity, timing, and trust — ensuring the assistant enhanced learning rather than interrupting it.

01

Discovery

/research and insights

We explored how learners move through hands-on labs, where they get stuck, and what causes drop-off or frustration. By combining usage data, common learning paths, and qualitative insights, we identified key moments where timely guidance could make the biggest impact.

02

Design

/concepts and execution

We designed an assistant that feels like a guide, not an instructor, offering help only when it’s useful. The experience focused on clear language, lightweight prompts, and contextual suggestions that adapt to what the learner is doing in real time.

03

Testing & Iteration

/feedback and refinement

Early concepts were tested through prototypes and live experiments, refining tone, timing, and visibility of nudges. We continuously adjusted the experience to ensure it supported learning without becoming distracting or intrusive.

04

Delivery

/handoff and launch

The assistant was integrated directly into the learning flow, working alongside existing labs and dashboards. Design and engineering collaborated closely to ensure the experience scaled reliably across different content types and learner levels.

05

Scaling

/growth optimization

Post-launch, the system evolved through ongoing iteration, informed by engagement metrics and learner behavior. The assistant continues to improve as new content, features, and learning patterns emerge.

Performance at scale

87%

Satisfaction score

37+

Countries

17M+

Active users

[Active-user growth chart, 2021–2025]