Concourse

After Boeing's Starliner software failures, NASA needed a better way to verify mission requirements. I designed Concourse, a platform that gives systems engineers confidence in their assessments through automated data aggregation and traceability.

Role

Design Lead, Researcher

Contribution

User Research, Prototyping, Interaction Design, Usability Testing, Product Roadmap

Timeframe

7 months

Recognition

2021, 2025 NASA Honor Award winner

Image of a mockup of a Requirement data object page

Solution

We built a 0→1 product that eliminated manual Excel workflows requiring 10-15+ hours of data aggregation per assessment cycle, allowing engineers to focus on analysis instead of data wrangling.

Scaled from 1 to 5 NASA engineering teams across multiple centers since the 2021 deployment, becoming the standard platform for cross-program requirements assessment.

Won NASA Ames Honor Awards in 2021 and 2025 recognizing the platform's innovation and measurable impact on mission-critical workflows.

Animated GIF showing the cursor navigating to the three main page types of increasing data fidelity.

01

Showcase data at different levels of fidelity

Concourse uses progressive disclosure, presenting data to users in different roles across three levels of fidelity.
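
As a rough illustration, a role-to-fidelity mapping could look like the TypeScript sketch below. The role and fidelity names are hypothetical stand-ins, not Concourse's actual data model.

```typescript
// Hypothetical sketch: three page types of increasing fidelity,
// selected by the viewer's role. Names are illustrative only.
type Fidelity = "summary" | "rollup" | "detail";

interface Viewer {
  role: "manager" | "lead" | "engineer";
}

// Managers land on summary dashboards; engineers start at full detail.
function defaultFidelity(viewer: Viewer): Fidelity {
  switch (viewer.role) {
    case "manager":
      return "summary";
    case "lead":
      return "rollup";
    case "engineer":
      return "detail";
  }
}
```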

Animated GIF showing the cursor clicking a data object trace link, which smooth-scrolls the page to where that data object's details are located further down.

02

Follow the trail across data objects at different levels

Concourse aggregates related data objects on a single page and links them together, allowing users to quickly follow the trail.
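
In web terms, a trace link like this can be a simple in-page smooth scroll. Here's a minimal TypeScript sketch, assuming a hypothetical `data-trace-id` attribute and `detail-` element IDs rather than Concourse's real markup.

```typescript
// Minimal sketch of an in-page trace link: clicking a data object's
// link smooth-scrolls to that object's detail section further down
// the page. Element IDs and attributes here are hypothetical.
function followTrace(objectId: string): void {
  const target = document.getElementById(`detail-${objectId}`);
  target?.scrollIntoView({ behavior: "smooth", block: "start" });
}

// Wire up every trace link on the page.
document.querySelectorAll<HTMLAnchorElement>("[data-trace-id]").forEach((link) => {
  link.addEventListener("click", (event) => {
    event.preventDefault();
    followTrace(link.dataset.traceId!);
  });
});
```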

03

Concurrently view data and make assessments

Concourse allows engineers to enter assessments of the data that roll up into higher-level views.
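
One simple way to picture a roll-up: a parent requirement's status is the worst status among its children's assessments. The TypeScript sketch below assumes hypothetical status names and an invented severity ordering, not Concourse's actual assessment scheme.

```typescript
// Illustrative roll-up: the parent's status is the worst status
// among its children. Statuses and their ordering are assumptions.
type Status = "compliant" | "at-risk" | "non-compliant" | "unassessed";

const severity: Record<Status, number> = {
  compliant: 0,
  "at-risk": 1,
  "non-compliant": 2,
  unassessed: 3, // an unassessed child blocks a "compliant" parent
};

function rollUp(children: Status[]): Status {
  return children.reduce(
    (worst, s) => (severity[s] > severity[worst] ? s : worst),
    "compliant" as Status,
  );
}

// e.g. rollUp(["compliant", "at-risk"]) === "at-risk"
```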

Understanding existing pain points

We started by conducting interviews with systems engineers, uncovering their existing workflow, their pain points, and what the first major deployment needed for feature parity.

Research Insights

🕒 Minimize time spent manipulating data

Engineers were frustrated by the amount of time they spent aggregating and maintaining data in Excel.

🔗 Following data trails is a cumbersome task

Being able to follow the traceability between data objects is crucial to the assessment process.

🥗 Data integrity is key

In a dynamic and complex data environment, engineers are always looking to confirm data freshness and integrity.

From Insights to Solutions

As we began our iterative design process, there were a number of considerations we wanted to keep in mind.

Design Considerations

01

Automate as much as possible

Where are the opportunities to automate so our users can spend more time on actual data analysis?

02

Simplify Workflows

How can we reduce cognitive load by providing the right fidelity of data at the right time and place?

03

Data confidence

How might we give our users confidence in the tool regarding data freshness?

We iterated from low- to high-fidelity prototypes, involving stakeholders as co-designers by having them create their own mockups. The final high-fidelity clickable prototype was tested in think-aloud usability sessions focused on key tasks to uncover issues in critical flows.

Learnings and Feedback

This new NASA web platform was deployed during the pandemic in 2020 and has since continued to scale to multiple teams across the agency. We've been collecting feedback and iterating on the design ever since.

Qualitative Feedback

  • More intuitive workflows
  • Higher user confidence
  • Time savings
  • Enhanced community accessibility

“Before Concourse, maintaining data integrity was unsustainable — time-consuming and error-prone — and sharing up-to-date results with our technical community was difficult.

Now, Concourse performs the data integration, facilitates our assessment process, and publishes our results through a well-designed web interface.

Our team can focus on core work, and results are available on-demand to anyone in our community.”

🏆 2021, 2025 NASA Honor Award winner

Since the 1.0 deployment, we've expanded the Concourse application into a platform that now supports 5 NASA engineering teams across multiple centers.

Lessons Learned

🛠️ Start with simpler features

We initially invested significant resources into building a custom filtering feature, only to learn through research and usage data that it was rarely used. In hindsight, an incremental rollout—starting with basic filtering capabilities—would have better matched user needs and helped us avoid over-engineering.

🤝 Co-design with stakeholders

Stakeholder input proved crucial to the final design, but we realized we could have saved significant time and reduced rework by engaging them earlier as co-designers.