Community Spotlight

How to design empathetic user experiences when working with Augmented Reality and Computer Vision

Abhas Misraraj
Product Designer, iNaturalist

In this interview, one of our community members, Jonathan Su, sat down with Abhas Misraraj, Product Designer at iNaturalist, for a conversation about how he designed the user experience of Seek, an app by iNaturalist that helps people identify and learn about the plants and animals around them.

Abhas, welcome! Tell us a little bit about yourself and how you got into your current role.

I studied wildlife biology in college and worked as a wildlife biologist for a few years after graduating, but always did design as a hobby. I eventually realized that I wanted creative work to be my full-time job, so I took the leap and joined Tradecraft, where I got to consult for startups in San Francisco and build a portfolio centered on end-to-end product design, specializing in AR/VR.

Since 2018, I’ve been working as the product designer at iNaturalist. As the sole designer, I get to wear a lot of hats, working on UX, UI, branding, marketing design, and even product management. As a designer with a background in biology, I was uniquely equipped to translate the platform’s biological concepts into easy-to-understand UI. I also lead Seek by iNaturalist, an app that allows users to identify plants and animals using augmented reality.

In 2018, our team decided to train a computer vision model to identify species from iNaturalist observations using machine learning. This model has been integrated into both iNaturalist and our second app: Seek by iNaturalist! Where iNaturalist attracts users more experienced with nature and biology, Seek lowers the barrier for anyone curious about nature to learn, and is kid-friendly. It acts as a gamified offshoot of iNaturalist that allows users to identify species on-screen, in real time, using augmented reality.

What design considerations do you need to keep in mind when creating experiences around computer vision and augmented reality?

Any experience where objects in the real world are enhanced by assets laid on top of them is augmented reality (AR): this could be anything from road signage and post-it notes to Instagram face filters and placing Pokémon in your bedroom. When designing for AR, you have to keep in mind that the background could be any color, so text and overlaid assets need to be clearly visible on any background and adaptable to different smartphone camera resolutions.
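To make the "visible on any background" point concrete, here is a minimal TypeScript sketch of one way to pick a legible overlay treatment by sampling the pixels behind a label. The luminance threshold and colors are illustrative assumptions, not Seek's actual implementation.

```typescript
// A minimal sketch (not Seek's actual implementation) of one way to keep
// overlaid text legible on an arbitrary camera background: sample the pixels
// behind the label, estimate their luminance, and pick a text color plus a
// semi-transparent scrim accordingly.

interface RGB { r: number; g: number; b: number } // channel values in 0-255

// Relative luminance per the WCAG definition, in the range 0-1.
function luminance({ r, g, b }: RGB): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Given pixels sampled from behind the label, choose an overlay treatment.
function overlayStyle(samples: RGB[]): { textColor: string; scrim: string } {
  const avg = samples.reduce((sum, p) => sum + luminance(p), 0) / samples.length;
  return avg > 0.5
    ? { textColor: "#1a1a1a", scrim: "rgba(255, 255, 255, 0.55)" } // bright scene
    : { textColor: "#ffffff", scrim: "rgba(0, 0, 0, 0.45)" };      // dark scene
}

// A mostly dark forest background gets white text on a dark scrim.
console.log(overlayStyle([{ r: 34, g: 51, b: 30 }, { r: 20, g: 28, b: 22 }]));
```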

Computer vision (CV) involves using machine learning to train a model to identify specific traits or objects from visual input data. Key design problems for CV generally deal with limitations of the technology itself and communicating those limitations to the user. In the case of Seek, the model is trained on 20,000 species, but it isn't able to identify everything.

Computer vision can often seem like a black box, which can lead to frustration on the users’ end. We combat this in Seek by providing on-screen identification that changes as a user scans their environment, giving users better insight into how the technology works.
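As a rough illustration of how on-screen identification can stay useful even when the model isn't sure, here is a small sketch that falls back to a coarser taxon when species-level confidence is low. The ranks, score shape, and 0.7 threshold are illustrative assumptions, not Seek's actual values.

```typescript
// A small sketch of identification that stays useful at any confidence level:
// if the model isn't confident at species level, fall back to the finest
// coarser taxon (genus, family, ...) that clears a threshold.

type Rank = "species" | "genus" | "family" | "order" | "class";

interface RankedScore { rank: Rank; name: string; score: number } // score in 0-1

// Ordered from most to least specific.
const RANKS: Rank[] = ["species", "genus", "family", "order", "class"];

function onScreenLabel(scores: RankedScore[], threshold = 0.7): string {
  for (const rank of RANKS) {
    const best = scores
      .filter((s) => s.rank === rank)
      .sort((a, b) => b.score - a.score)[0];
    if (best && best.score >= threshold) {
      return rank === "species" ? best.name : `Looks like something in the ${rank} ${best.name}`;
    }
  }
  return "Keep scanning to get a closer look";
}

// Each new camera frame produces fresh scores, so the label updates live as
// the user moves closer or changes their angle.
console.log(onScreenLabel([
  { rank: "species", name: "Sciurus carolinensis", score: 0.42 },
  { rank: "genus", name: "Sciurus", score: 0.81 },
  { rank: "family", name: "Sciuridae", score: 0.95 },
]));
```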

How do you design an experience around imperfect computer vision?

Generally, the public is still unfamiliar with how computer vision works and may be skeptical of or apprehensive about it. Here are some tactics I've learned for creating a better experience.

  • Give the user a better understanding of how the CV model works. Seek uses on-screen, real-time identification that allows users to interact and see how it responds to their environment.
  • Help the user understand that the technology is not perfect. Clear language and help text in our app allow the user to understand and empathize with Seek’s imperfect model rather than expecting an identification for every species they observe.
  • Give the user a clear route to get the answers they need if the CV can’t provide them. When Seek can’t identify an organism, we encourage users to post to iNaturalist, where our community can provide an identification (sketched below).
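Here is a small sketch of that escape hatch: when a scan ends without a confident species-level match, the app routes the user to a clear next step instead of a dead end. The types, threshold, and message are hypothetical, not Seek's actual API.

```typescript
// A small sketch of the escape hatch described in the last tactic: when a
// scan ends without a confident species-level match, give the user a clear
// next step instead of a dead end.

interface ScanResult { speciesName?: string; confidence: number } // confidence in 0-1

type NextStep =
  | { kind: "identified"; speciesName: string }
  | { kind: "suggest-post-to-inaturalist"; message: string };

function nextStep(result: ScanResult, threshold = 0.7): NextStep {
  if (result.speciesName && result.confidence >= threshold) {
    return { kind: "identified", speciesName: result.speciesName };
  }
  return {
    kind: "suggest-post-to-inaturalist",
    message: "We couldn't identify this one. Post your photo to iNaturalist and the community can help!",
  };
}

// A low-confidence scan produces the "post to iNaturalist" suggestion.
console.log(nextStep({ confidence: 0.35 }));
```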

What are three lessons you've learned while designing this cross-platform AR experience that have made you a better designer?

1. Designing a cross-platform AR experience means that users may have wildly different experiences depending on their hardware. Some older devices can’t handle the load, so it’s important to create alternatives for users on devices where AR may not work. If a Seek user has a device that can’t run the AR camera, they are automatically prompted to upload images from their camera roll (see the sketch after this list).

2. Users may have different levels of familiarity and understanding with taxonomy and biology. Making these concepts clear, easy to understand, and fun helps users learn more effectively. In Seek, I added badges and challenges, and packed the app with digestible information to get curious naturalists of all ages more excited to learn.

3. Assume the user is on your side. Users want to see this technology succeed, so give them the tools to solve and understand problems they may face. We make it clear that Seek doesn’t always have the answers, and provide options for them to follow their curiosity and find answers on their own.
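And here is the hardware fallback from the first lesson sketched out: if the device can't run the live AR camera, the app routes the user to a camera-roll upload flow instead. The capability fields and mode names are hypothetical stand-ins, not Seek's actual implementation.

```typescript
// A minimal sketch of the hardware fallback from lesson 1: if the device
// can't run the live AR camera, route the user to a camera-roll upload flow.

type CaptureMode = "live-ar-camera" | "camera-roll-upload";

interface DeviceInfo {
  hasCamera: boolean;
  supportsRealTimeClassification: boolean; // e.g. enough memory and a recent-enough chipset
}

function supportsLiveAR(device: DeviceInfo): boolean {
  return device.hasCamera && device.supportsRealTimeClassification;
}

// Decide once at startup which capture flow to offer the user.
function chooseCaptureMode(device: DeviceInfo): CaptureMode {
  return supportsLiveAR(device) ? "live-ar-camera" : "camera-roll-upload";
}

// An older phone that can't keep up is routed straight to uploads.
console.log(chooseCaptureMode({ hasCamera: true, supportsRealTimeClassification: false }));
```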

