Turning Experience into Learning That Makes Sense

I'm Jordan Brown, a creator and educator. My work focuses on making learning more transparent, more engaging, and easier to apply. I design experiences that help people make sense of complex information, stay present, and connect learning to real life.

About Me

Skills

  • Learning experience and curriculum design

  • Storytelling and creative direction

  • Facilitation and community engagement

  • Accessibility and user-centered design

Design Perspectives

  • Play, curiosity, and creative exploration

  • Mindfulness and awareness in everyday moments

  • Learning from my children and everyday relationships

  • Travel and learning through experience

How This Shows Up in My Work

  • Learning experiences that feel intentional and human

  • Clear structure paired with room to explore

  • Design choices that respect time, attention, and context

  • Content that connects ideas to real life

Experiences That Shape My Design

  • Travel - study abroad, work abroad, family travel

  • Service - volunteering, mentorship, restaurant industry, hosting events focused on media, culture, and technology

  • Relationships - parenthood, family business partnerships, community engagement

Tools & Craft

  • Articulate Storyline and Rise

  • Adobe Captivate and Creative Cloud

  • Blackboard Ultra and Canvas LMS

  • Figma, Miro, Canva

  • Notion and lightweight planning tools

Portfolio

Each project is part of an ongoing collection. These examples show how I approach learning design through clarity, experience, and real-world use.


Ergonomics Training
A scenario-based course that helps ergonomics practitioners support employees in both remote and in‑office setups.
View project


Survey Data Evaluation and Revamp
This project strengthened our evaluation system by updating survey tools and questions, giving us more flexible data and clearer insights to improve courses.
View project


Wildfire Smoke Awareness
This training gives employees straightforward steps to stay safe during wildfire smoke events, reinforcing the need for accessible learning in real-world emergencies.
View project


Gen AI Video
This project introduced DMV employees to Microsoft Copilot Chat through an accessible training video, translating state AI policy into clear guidance staff could apply immediately.
View project


Digging Deep: Remixing Research
This microlearning experience helps participants assess and analyze art, culture, and everyday media they already find interesting. Learners explore how research exists across music, visuals, and cultural artifacts, building media literacy and critical thinking skills along the way. The project reflects my belief that meaningful learning can be found in all forms of culture.
View project

Check Out the View Film Series
This community film series was created to give people space to talk through films that explore personal and complex topics. Hosting these events made me more aware of how environment, facilitation, and shared experience shape engagement, especially among people meeting for the first time. By blending learning design, entertainment, and community-building, participants enjoyed new films while forming meaningful connections.
View project

Wildfire Smoke Awareness

The Challenge
California wildfires increasingly impact air quality across the state, placing DMV field employees and customers at risk. Managers needed clear guidance on how to interpret Air Quality Index (AQI) data, follow Cal/OSHA wildfire smoke regulations, and communicate safety measures to their teams.
Existing information was policy-heavy, scattered across documents, and difficult to apply during real-time wildfire conditions. The goal was to translate safety regulations into practical actions managers could take immediately to protect employees.

What I Did

Translated Complex Safety Policy into Practical Guidance

  • Interpreted wildfire smoke protection requirements from Cal/OSHA Section 5141.1.

  • Organized the material into clear decision points based on AQI levels.

  • Built a simple action framework for managers: monitor air quality, communicate risk, implement protections.
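
The action framework above can be sketched as a simple mapping from AQI readings to manager actions. This is an illustrative sketch only, not the actual training content; the thresholds follow Cal/OSHA Section 5141.1, which triggers protections at a PM2.5 AQI of 151 and mandatory respirator use above 500.

```python
# Illustrative sketch of the manager decision framework -- not DMV code.
# Thresholds follow Cal/OSHA Section 5141.1: protections begin at a
# PM2.5 AQI of 151; respirator use becomes mandatory above AQI 500.

def wildfire_smoke_actions(aqi: int) -> list[str]:
    """Map a current PM2.5 AQI reading to the manager action framework:
    monitor air quality, communicate risk, implement protections."""
    actions = ["Monitor current and forecast air quality"]
    if aqi >= 151:
        actions += [
            "Communicate wildfire smoke risk to employees",
            "Reduce smoke exposure where feasible",
            "Provide N95 respirators for voluntary use",
        ]
    if aqi > 500:
        actions.append("Require respirator use for outdoor work")
    return actions
```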

Designed Learner-Centered Microlearning

  • Structured the training into short modules focused on real workplace scenarios.

  • Used practical examples showing how managers communicate wildfire alerts to staff.

  • Added quick reference tools to help leaders check AQI levels and respond quickly during smoke events.

Created Practical Job Aids

  • Developed a quick reference guide for AQI thresholds and required actions.

  • Included steps for providing N95 respirators and documenting training requirements.

  • Added support resources, including information on the Employee Assistance Program and safety contacts.

Focused on Clear Communication

  • Wrote short, direct messages managers could use to notify employees about wildfire smoke risks.

  • Included examples of effective vs. ineffective safety communication.

  • Designed content at an accessible reading level so it could be easily understood during emergencies.

Impact

  • Provided managers with clear steps to protect staff during wildfire smoke events.

  • Simplified regulatory guidance into actionable workplace procedures.

  • Supported a culture of safety by helping leaders confidently communicate health risks and protections.

Reflection

If expanding this project, I would add interactive AQI simulations where managers practice making decisions based on changing air quality levels. I would also incorporate more scenario-based practice for communicating with employees during high-risk wildfire conditions.

Skills Demonstrated
Instructional design • Policy translation • Workplace safety training • Scenario-based learning • Job aid design • Stakeholder collaboration • Accessibility-focused writing • Learner-centered design

Gen AI Video

The Challenge
Microsoft Copilot Chat rolled out to 9,000+ DMV employees with zero prior AI experience. Staff needed immediate training on state AI policies (Executive Order N-12-23, SIMM 150) without misusing confidential data or causing compliance violations. The constraint: make complex policy engaging and actionable.

What I Did

Researched & Restructured Policy Content

  • Identified that initial guidance was too restrictive and unclear.

  • Researched best practices from Harvard and Stanford AI policies.

  • Reorganized approved uses into two categories: Content Creation vs. Content Refinement.

  • Simplified the core principle: "The task doesn't matter—the data does."

Designed Learner-Centered Content

  • Opened with storytelling, not policy: "Every day, you serve millions of Californians..."

  • Created side-by-side examples: approved vs. not approved scenarios.

  • Used visual checkpoints (magnifying glass highlighting accuracy, appropriateness, compliance).

  • Softened punitive language based on stakeholder feedback about performance management concerns.

Produced Accessible Training Materials

  • 8-minute video with professional narration

  • Full closed captions (SRT format)

  • WCAG 2.1 compliant transcript

  • Complete storyboard with slide-by-slide narration

Impact

  • Positioned AI as a productivity tool rather than a threat; stakeholders praised the approach.

  • Clarified the ambiguous state policy for immediate staff application.

  • Enabled 9,000+ employees to use AI tools safely and compliantly.

Skills Demonstrated
Instructional design • Script writing • Video production • Policy translation • Stakeholder collaboration • Accessibility compliance • Content restructuring • Learner-centered design

Survey Data Evaluation and Revamp

The Challenge
Course evaluations existed, but feedback was inconsistent and difficult to analyze across courses. Surveys were often too generic, which limited our ability to understand whether learners actually gained the intended skills or where specific courses needed improvement.

What I Did

Aligned Evaluations with Learning Objectives

  • Reviewed course objectives and mapped them to evaluation questions using the Kirkpatrick evaluation framework.

  • Designed a structure that combines generic learning experience questions with course-specific questions tied directly to learning outcomes.

Built a Scalable Question System

  • Organized courses into themes to streamline question design.

  • Created reusable question sets that could be applied across courses while still allowing targeted feedback for individual training topics.

  • Developed a prototype survey inside Blackboard to test how the questions function in a live course environment.

Tested the Learner Experience

  • Implemented the evaluation in a pilot course and monitored learner responses.

  • Focused on clear, behavior-based questions that ask learners what they can do differently after training, not just whether they liked the course.

  • Collected early learner feedback to refine the approach.

Impact

  • Piloted the evaluation with 20 learners, generating actionable feedback on course clarity and skill application.

  • Shifted course evaluations from general satisfaction surveys to learning-outcome-based feedback.

  • Established a repeatable structure that can scale across multiple DMV training courses to support continuous improvement.

Reflection & Iteration

  • The pilot showed that evaluation questions can measure learning outcomes more effectively than general satisfaction.

  • Next iterations would focus on improving data visualization and scaling the system across more courses to compare results over time.

Skills Demonstrated
Instructional design • Learning evaluation strategy • Kirkpatrick model application • Survey design • Data-informed course improvement • Learner-centered design • Rapid prototyping • Stakeholder collaboration

Facilitation

The Challenge
People often need spaces to talk about personal and complex topics, but starting those conversations can feel intimidating, especially among strangers. The goal of this community film series was to create an environment where participants could explore meaningful themes through film and connect with others in a comfortable setting.

What made it hard:

  • Encouraging open discussion among people meeting for the first time

  • Balancing entertainment with thoughtful conversation

  • Creating a welcoming environment that supports engagement

What I Did
I designed the series to blend learning, entertainment, and community-building. Each event featured a film that explored personal or challenging topics, followed by guided discussion. I focused on creating a safe, inviting space and used facilitation techniques to help participants share ideas and listen to different perspectives.

The experience included:

  • Shared viewing: films as a starting point for conversation

  • Facilitated discussion: prompts to guide meaningful dialogue

  • Community-building: opportunities for participants to connect beyond the film

Impact
Participants enjoyed discovering new films while forming real connections. Feedback showed that the mix of entertainment and structured discussion helped people feel comfortable sharing their thoughts. I learned how environment and facilitation shape engagement, and how shared experiences can bring people together. Next time, I would add digital tools to continue the conversation after each event.
