Researchers at Honda’s 99P Labs tasked our team with developing a test bed for controlling, monitoring, and diagnosing problems in an autonomous F1TENTH vehicle.

Within that effort, my team was responsible for collecting user needs, prototyping, and ultimately developing a Diagnostic Interface in React for the final test bed.

  • Project Manager & Lead UI Developer

  • This prototype was the semester-long deliverable for the Rapid Prototyping course at Carnegie Mellon, in partnership with Honda & 99P Labs.

  • January - May 2023

  • Myself, Pat Vaidos, Anishwar Tirupathur, Elise Wang, Helena Spencer, Favour Adesina, Yichen Han, Mason Xiao

Context

Autonomous vehicles are an emerging technology that demands rigorous testing. Ensuring an autonomous vehicle can reliably make safe decisions in real-world scenarios is critical.

A cornerstone of developing AI-driven autonomous vehicles is repetition.

Repeated testing, analysis, and redeployment across varied environments and scenarios. Of course, setting an in-progress self-driving vehicle loose on public roadways is… undesirable.

How might we create a simulation of real-world driving conditions, while keeping the design safe and scalable?

Make it! … Ten Times Smaller


This is the F1TENTH. It’s a little autonomous vehicle, so named because it is 1/10th the size of a Formula One race car.

While often used for autonomous vehicle racing, its open-source platform also gave our team the leeway to treat it like a real vehicle in simulated driving conditions.

The F1TENTH uses an array of onboard sensors to capture data about its state and surroundings, including its position, speed, and acceleration.

Outlining the Process


Before embarking on the project, our class of 50+ students divided into 9 teams.

As part of the HCI team, I worked on the conceptualization & design of the interfaces, as well as front-end development later on.

From beginning to end, our teams were given about twelve weeks to conceptualize, design, and deliver a working prototype to our stakeholders. Given this tight timeline, rapid iteration was key.

Our team followed a double-diamond model while designing the test track.

To keep other teams on track with our process, we aligned the double diamond with 3 larger phases that the rest of the teams could follow:

Phase 1: Conceptualization

Workshops


In the initial weeks of the project, the HCI team conducted several workshops.

At these early stages of conceptualization, cross-team collaboration was critical.

These workshops were open to members of all teams, across specializations. We were able to fill in each others’ knowledge gaps, assess feasibility of ideas, and really nail down what was possible to achieve in this timeline.

Through these workshops, our team took away several key design requirements for a Diagnostic UI and Track:

Diagnostic UI must be Responsive

Users should not only receive live data, but also be able to directly control the track & vehicle through the Diagnostic UI in case something goes wrong.

Track should accommodate Variation

The track should be able to handle diverse scenarios, ranging from typical traffic lights to unforeseen pedestrian encounters.

Track should be Modular

In more complex test scenarios, sections of the track should be interchangeable, allowing for the integration of programmable obstacles.

Track should be Easy to Assemble

Pieces of the track should be lightweight and durable for rapid assembly in whatever space is available.

Phase 2: Design

Baseline & Visionary Scenarios


To clarify how the Track & Diagnostic UI could be used, a teammate and I storyboarded one baseline and one visionary scenario for the final design.

The Baseline Scenario represented what we deemed achievable within our short timeframe.

The Visionary Scenario was more speculative, outlining what a future iteration might look like.

Throughout this phase, our team presented iterations of these scenarios several times, both to our internal teams and to Honda stakeholders.

Internally, we collected feedback and refined the scenarios to align more closely with the team's perception of achievable goals within the given timeframe.

Furthermore, we collaborated with Honda stakeholders to determine the essential information required for diagnosing and analyzing testing data from autonomous vehicles in their context.

Initial Prototypes


Live feed & position monitoring. Visual vehicle status information.

To evaluate our diagnostic prototype, we engaged with a group of five participants well-versed in data analytics and the administration of autonomous vehicle testing.

We brought the prototypes to life with Wizard of Oz techniques and encouraged our participants to think aloud. Then, we synthesized our interviews into the following takeaways:

Tests must be Retrievable

The recordings and data collected from previous tests should be easily accessible for retrieval and analysis.

Users want a Modular UI

Stakeholders require varying information architecture based on the nature of the test. The UI should accommodate these different views effectively.

Phase 3: Implementation

Front-End Development


As lead developer on the UI, I set up the Git repository for code production, selected our development libraries, and built the primary components of the interface.

For this project, I chose React because it allowed different components to be developed simultaneously with minimal code conflicts. I also decided to use the Material UI library to save time by building on well-established design patterns rather than redeveloping common components.
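
To give a sense of how this played out in code, here is a minimal sketch of the kind of self-contained status card each developer could own, assuming React with Material UI v5. The component and prop names are illustrative, not the production code.

```tsx
// Minimal sketch of one dashboard card, assuming React + Material UI v5.
// Component and prop names here are illustrative stand-ins.
import * as React from "react";
import { Card, CardContent, Typography } from "@mui/material";

interface VehicleStatusProps {
  velocity: number;     // m/s, reported by the F1TENTH
  acceleration: number; // m/s^2
  yaw: number;          // degrees
}

// Each developer could own a self-contained card like this one,
// so parallel work rarely touched the same files.
export function VehicleStatusCard({ velocity, acceleration, yaw }: VehicleStatusProps) {
  return (
    <Card>
      <CardContent>
        <Typography variant="h6">Vehicle Status</Typography>
        <Typography>Velocity: {velocity.toFixed(2)} m/s</Typography>
        <Typography>Acceleration: {acceleration.toFixed(2)} m/s²</Typography>
        <Typography>Yaw: {yaw.toFixed(1)}°</Typography>
      </CardContent>
    </Card>
  );
}
```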

Given the needs we established in Phase 2, I worked with developers from other teams to build out various components for the User Interface.

For example, I requested graph components from members of the Data Analysis team for the test retrieval pages, while the position-tracking component was developed by the Ground Systems Software team and quickly integrated into the main dashboard.
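
The sketch below illustrates that composition pattern in a Material UI grid layout; the inline placeholder components stand in for the Data Analysis and Ground Systems contributions and are not the actual dashboard code.

```tsx
// Illustrative sketch of how externally developed pieces slot into the
// main dashboard as ordinary React components (Material UI v5 assumed).
import * as React from "react";
import { Grid, Paper } from "@mui/material";

// Placeholder for the graph component contributed by the Data Analysis team.
const TelemetryGraph = () => <Paper sx={{ p: 2 }}>Telemetry graphs</Paper>;
// Placeholder for the position tracker contributed by Ground Systems Software.
const PositionTracker = () => <Paper sx={{ p: 2 }}>Track position</Paper>;

export function MainDashboard() {
  return (
    <Grid container spacing={2}>
      <Grid item xs={12} md={6}>
        <PositionTracker />
      </Grid>
      <Grid item xs={12} md={6}>
        <TelemetryGraph />
      </Grid>
    </Grid>
  );
}
```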

Final Interface


Vehicle Monitoring

Through the Diagnostic Dashboard, users can actively monitor vehicle data streamed directly to the interface in real time.

The vehicle reports its own Yaw, Roll, Pitch, Velocity, and Acceleration. Various position sensors surrounding the track report the vehicle’s position.

Additional tabs allow the user to control obstacles in the track environment, or view the console log coming directly from the F1TENTH vehicle.
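
The write-up doesn’t detail how telemetry reaches the browser; as one plausible sketch, a small React hook could subscribe to a WebSocket bridge publishing JSON frames. The endpoint and field names below are assumptions, not the project’s actual wiring.

```tsx
// Hedged sketch of live telemetry delivery, assuming a WebSocket bridge
// that publishes one JSON snapshot per message. URL and fields are hypothetical.
import { useEffect, useState } from "react";

interface TelemetryFrame {
  yaw: number;
  roll: number;
  pitch: number;
  velocity: number;
  acceleration: number;
}

export function useTelemetry(url = "ws://localhost:9090/telemetry") {
  const [frame, setFrame] = useState<TelemetryFrame | null>(null);

  useEffect(() => {
    const socket = new WebSocket(url);
    // Each message is one telemetry snapshot from the vehicle.
    socket.onmessage = (event) => setFrame(JSON.parse(event.data));
    return () => socket.close(); // clean up when the dashboard unmounts
  }, [url]);

  return frame;
}
```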

Simulation Review

From the sidebar, users can switch to a review tab, where they can search for previous tests to revisit.

Along with a camera recording of the test, the Diagnostic Dashboard also displays graphed data from the vehicle over the course of the run.
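
As a rough illustration of how the review tab could retrieve archived runs, the hook below fetches from a hypothetical REST endpoint; the route and response shape are assumptions rather than the project’s actual API.

```tsx
// Hedged sketch of test retrieval for the review tab.
// The /api/tests route and the TestRun shape are hypothetical.
import { useEffect, useState } from "react";

interface TestRun {
  id: string;
  recordedAt: string; // ISO timestamp of the run
  videoUrl: string;   // camera recording of the test
  series: { t: number; velocity: number; acceleration: number }[]; // graphed data
}

export function useTestRuns(query: string) {
  const [runs, setRuns] = useState<TestRun[]>([]);

  useEffect(() => {
    fetch(`/api/tests?search=${encodeURIComponent(query)}`)
      .then((res) => res.json())
      .then(setRuns)
      .catch(() => setRuns([])); // keep the review tab usable if the archive is unreachable
  }, [query]);

  return runs;
}
```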

Demo

An action shot of a certain designer 👀 trying to grab a cable hanging from the ceiling

F1TENTH, just about to hit the track

The demo setup, with the requisite last-minute hanging cables from the light fixtures

The F1TENTH traversing the track being monitored on the Diagnostic UI

Our F1TENTH on the day of the demo

Closeup of the UI with the live camera feed and vehicle data
