CGA
Grace Alexander

Hi, I'm

Grace — Caroline Grace Alexander, Senior UX Researcher

I'm a senior user researcher based in San Francisco. I love being creative and spending time outdoors.

I'm a Senior UX Researcher at Apple.

These projects are personal work that reflect how I think and create outside of Apple.

At work, I focus on mixed-methods research, with the goal of grounding product decisions in real user insight.

Portfolio
Lamp with a wood base and translucent porcelain shade

Design Through Constraints.

A two-year material study in translucent porcelain. A lamp no one had made before, iterated toward a reproducible process.

I was living in an old San Francisco apartment with warm, flattering light — most of it coming through frosted glass fixtures and incandescent bulbs. I wanted to recreate that warm, atmospheric glow with a ceramic lamp.

I'd heard that porcelain, thrown thin enough, can be translucent. That became the starting point: two years of making, testing, and iterating toward a ceramic light cover that glows.

01

Prove the Concept

Translucent porcelain cup placed on top of a light fixture — proof of concept

The first question was whether porcelain could be thrown thin enough to let light through. I made a test piece, and it worked — the translucency was clear, and the concept held.

Here you can see the test piece placed on top of a light fixture in my house.

Next steps

With the concept proven, I needed to design the full lamp — how the lighting component would fit inside, how the parts would connect, and how to make it a standalone object.

02

Develop a Plan

Hand-drawn sketch showing the planned lamp design — base, shade, and lighting fixture placement

With the concept proven, I sketched out a plan. I needed to figure out how to fit the lighting component inside the piece, how the parts would connect, and how to make it sit flat on a surface. The goal was a fully standalone lamp — one self-contained object you could plug in and use.

Next steps

Build it and test it — see how close the reality was to the plan.

03

Initial Design

Second prototype with translucent porcelain shade
Second prototype with shade removed

The first version got the overall look right, but had real problems. The light fixture sat separately from the rest of the piece, the cord exit felt unresolved, and the components didn't come together as one. Only about one in every five shade attempts came out properly translucent.

Next steps

Improve the shade success rate and find a method for making it consistently. Integrate the components so the lamp reads as one unified object.

04

Iterate on the Shade

Lamp with ceramic base and translucent ceramic shade
Inside the lamp with the light off and shade removed

After trying hand building, carving, and other methods, I landed on slip casting, and took a slip casting class to learn the technique properly. I threw the lampshade form on the wheel, made a plaster mold from it, and used the mold to cast extremely thin porcelain shells.

Traditionally, slip is left in a mold for 30–40 minutes to build up the walls. For these, I pulled it at five — well outside the standard parameters of the method, but the only way to get the walls thin enough to glow.

Next steps

Raw white porcelain looked right for the base, but getting it to fit the shade precisely was difficult. Clay shrinks about 15% during firing, and starts shrinking as soon as it begins to dry. The shade used a liquid slip clay while the base was thrown from a different porcelain, so they had different shrinkage rates and were always at different stages of dryness. Aligning the two consistently was a persistent problem.

05

Iterate on the Base

Lamp with a wood base and translucent porcelain shade
Wooden base carved out to fit the ceramic lampshade, with lighting fixture attached inside

While ceramics is the medium I'm most comfortable in, it was time to try a material that could solve some of the issues a ceramic base introduced. I took a wood carving class, learned to use the lathe, and switched the base to wood. Wood made it straightforward to install the lamp hardware and drill a clean hole for the cord — both things that had always been messy in ceramic.

The bigger gain was fit. With a wood base, I could bring the fully fired shade to the lathe and carve the groove to match it exactly — test-fitting the shade as I went, shaving a little more until it sat properly. The shrinkage problem went away.

Slip casting for the shade, wood turning for the base — after two years of experimentation, I had a consistent, replicable method for making the light.

Result

A standalone, plug-in lamp: one object, repeatable to make, with the warm glow I'd set out to recreate.

Portfolio
Conversations with Women — zine cover art

The Empathy Interviews.

A research pilot exploring everyday emotions, designed to help people feel seen and curious about each other.

I've always been curious about what other people's lives actually feel like from the inside.

This project is a research pilot designed to surface those feelings. The goal is twofold: to help people feel seen in their everyday emotional experience, and to build curiosity about the lives of others.

Designing the five questions took real care. Each one had to be open enough that participants could take it anywhere — no specific emotions named, no examples that might anchor their thinking. I wanted each person to describe their own experience in their own words, without being steered toward what I expected to hear.

I piloted the study with people I know before expanding to strangers. Even in the pilot, the range of feelings people named surprised me. I'd expected more overlap with my own experience. The variety was a useful reminder of how differently we each move through the world.

01

Tell me about your day so far today — what's happened?

02

Is there a feeling you've been feeling a lot recently?

03

Tell me about a time recently when you felt that way.

04

How do you feel about the role that feeling plays in your life?

05

In the future, what do you want your relationship to that feeling to look like?

Take the questions to strangers! Interviewing people I don't know will be a different challenge. I'm curious what feelings come up, and whether I can create the same sense of comfort with someone I've never met.

Full interviews coming soon.

Coming soon
Portfolio
TreeID app icon — a tree illustrated in dots

The AI Design Experiment.

I replaced the product team with AI and kept real usability testing in the loop. The result was a usable app.

This project explores what happens when AI and usability testing are connected in a closed loop with no designer or developer in the middle. The premise is simple: an AI generates a prototype, real users test it, and their feedback goes straight back into the AI to generate the next version.

Tools like UserTesting.com already let you connect to users on demand. In a fully realized version of this loop, a design could go from prototype to tested to revised without a human decision point in between. And since most usability testing doesn't require specialized expertise, this could be an inexpensive way to iterate.

I believe humans should be in the loop with AI. This project is a deliberate experiment to see what happens when they're not.
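The loop itself is simple enough to sketch in a few lines. The code below is a toy simulation of the cycle, not the experiment's actual tooling: in practice the "AI" step was Claude and the "test" step was live usability sessions, so every function name and issue string here is a stand-in.

```python
def usability_test(prototype):
    """Simulated test session: users report whatever issues remain open."""
    return sorted(prototype["open_issues"])

def ai_revise(prototype, findings):
    """Simulated AI step: generate the next version with reported issues fixed."""
    return {"version": prototype["version"] + 1,
            "open_issues": prototype["open_issues"] - set(findings)}

def run_loop(initial_issues, rounds):
    """Cycle: test, collect findings, feed them straight back to the AI."""
    prototype = {"version": 1, "open_issues": set(initial_issues)}
    for _ in range(rounds):
        findings = usability_test(prototype)
        if not findings:          # no complaints left: the loop converges
            break
        prototype = ai_revise(prototype, findings)
    return prototype

lamp = run_loop({"placeholder icon", "scroll affordance"}, rounds=5)
```

In this idealized version the loop converges as soon as users stop reporting issues; the experiment below shows how much messier that is when a real model decides what counts as a finding.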

Diagram of an iterative product loop cycling through usability testing, user feedback, development, and new design
App home screen with a tree photo and two action buttons: Take a Photo and Choose from Library
App results screen identifying the tree as a White Oak with detailed information about the species

It took more iterations than I expected, but the loop produced a working app.

  • Claude optimized for the feedback it received rather than design best practices. It gave users what they asked for, not what a skilled designer would know to do from the start.
  • It didn't weigh findings by frequency; feedback from one user carried the same weight as feedback from three.
  • Users could complete every task, but this experiment shows usable and well-designed aren't necessarily the same thing.

I was impressed with the output. I think one of the main limiting factors was the lightweight prompts I used. With more prompt structure around how to weigh and incorporate findings, the loop could get to a usable point faster and more reliably.
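One way to add that structure (a hypothetical sketch, not what the experiment used) is to aggregate findings across sessions before handing them to the model, so a one-off comment never carries the same weight as an issue every participant hit:

```python
from collections import Counter

def prioritize_findings(sessions, min_users=2):
    """Count how many users raised each finding; keep only those reported
    by at least min_users, ordered from most-reported to least."""
    counts = Counter(f for session in sessions for f in set(session))
    return [f for f, n in counts.most_common() if n >= min_users]

# Illustrative sessions (issue names are invented, not real study data).
sessions = [
    ["placeholder icon", "scroll affordance"],
    ["placeholder icon", "habitat map"],
    ["placeholder icon"],
]
prioritize_findings(sessions)  # only the issue all three users raised survives
```

Even a threshold this crude would have surfaced the placeholder-icon issue in round one and dropped the single-user recommendations the AI kept acting on.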

01

Iteration One

Iteration 1 home screen — camera viewfinder UI with a question mark placeholder, a button to identify a tree, and the prompt 'point your camera at any tree or leaf'
Iteration 1 results screen — tree identified as a White Oak, with an About section and a fun fact about the species

Research findings

  • Core flow worked — users could identify a tree without friction
  • "?" placeholder icon flagged as unfinished by all users
  • Results page didn't afford scrolling — one user never found the fun fact or CTA
  • Consistent ask for more depth: reference photos, tappable match cards, a habitat map

AI's recommendations

  • Remove the redundant Recent strip from the home screen
  • Add a sticky "Identify Another Tree" button outside the scroll view
  • Make each match card expandable with a Wikipedia detail sheet
  • Link the habitat card to an Apple Maps search

Three of the four recommendations were each raised by only one user — not enough to be a reliable finding.
The AI also missed the "?" icon entirely, despite all three users raising it independently.

02

Iteration Two

Iteration 2 home screen — camera viewfinder with a green circle placeholder, the prompt 'point your camera at any tree or leaf', a button to identify a tree, and a History button in the top right
Iteration 2 results screen — top matches showing White Oak at 91% confidence, followed by Bur Oak and Chinkapin Oak, with an About section and an option to identify another tree

Research findings

  • Landing page unanimously called "funny" or unpolished
  • Users kept expecting to press the main button again rather than choose from the dialog

AI's recommendations

  • Redesign the landing page with a centered hero structure — app name and tagline at the top, illustration in the middle, buttons anchored at the bottom
  • Replace the single CTA with two explicit buttons ("Take Photo" / "Choose from Library"), eliminating the dialog entirely
  • Add reference photos to expanded match cards
  • Remove confusing scroll-hint chevrons

Removing duplicate text and prioritizing information visually were changes in the right direction.
But the AI was still overgeneralizing — several recommendations came from two users rather than three. The threshold for what counted as a "finding" was still low.

03

Iteration Three

Iteration 3 home screen — green circle placeholder, the tagline 'identify any tree instantly', two action buttons to take a photo or choose from library, and a History button in the top right
Iteration 3 results screen — tree identified as White Oak at 91% confidence, with two other oak species as alternate matches, an About section, and a button to identify another tree

Research findings

  • Identify flow and results screen stopped drawing complaints — a meaningful improvement from round one
  • One issue raised by all three users: a black "?" box on top of the green circle (tree emoji not rendering in the iOS simulator)

AI's recommendations

  • Replace the emoji with an SF Symbol (tree.fill) for reliable rendering across all iOS devices
  • Nudge the fun fact text back from secondary grey to primary — the deprioritization from the previous round had gone slightly too far

The app was usable, but it still hadn't solved the icon issue, which was raised in every round of testing. The AI kept overgeneralizing UI changes from single data points while the one issue flagged by every user went unprioritized. The loop can work, but an initial design pass could save a lot of iterations. I incorporated these final recommendations to reach the result shown above.

I love to build beautiful, functional things.

I'm a UX researcher and designer with an engineering background.

At Northwestern, I studied Computer Science and Design.
Within the CS curriculum, I focused on Human-Computer Interaction and Machine Learning. I was a TA for five quarters across different CS classes. This taught me how to break down complex topics and explain them clearly, and gave me my first real experience in mentorship and building relationships through teaching.

I started at Apple in a cross-functional rotational program.
Across my rotations, I worked in product management, front-end development, and user research, which gave me a strong foundation for working across disciplines. From there, I grew into research roles of increasing scope, leading research on Apple TV+ before building Apple Legal's research practice from the ground up.

Outside of work, you'll likely find me at my ceramics studio.
I also love backpacking and rock climbing in California's mountains.

Grace Alexander throwing a pot on the wheel
Handmade ceramic teapot and mug set
Handmade ceramic citrus juicer
Handmade ceramic plant pot with leafy plant

Let's connect!

If you're hiring for research, want to compare notes on mixed-methods work, or just want to say hi — I'd love to hear from you.

View of the Golden Gate Bridge rendered in dots