OVR Technology CEO Aaron Wisniewski Interview on “The Power of Scent”
Johnny Rodriguez and I sit down virtually with Aaron Wisniewski, CEO of OVR Technology, for a nearly hour-long chat on how the “Power of Scent” truly makes #VirtualReality a complete multi-sensory experience.
OVR wants to use scent to the fullest for more engaging, emotional, and effective virtual experiences. Learn more about how they strive to improve our real lives through their scentware technology in #VR in training, simulations, health, wellness, and more.
Watch the full interview on YouTube, or read the interview transcript below:
Full Interview Transcript
Elisha Terada: Hello. We have a guest today from OVR Technology. Aaron, you’re the CEO of OVR Technology. And Johnny, I already got to experience this ahead of time; the OVR Technology device let me smell inside the VR headset. And I know most people watching haven’t experienced this before, so we are going to do some of the demo first, and then we’ll go through an introduction of OVR Technology and ask some interesting questions.
How are you doing today?
Aaron Wisniewski: Good. Good. It’s a pleasure to meet both of you and get a chance to talk about our sense of smell and technology, and how we’re working together to pave the way for a new way to experience digital media.
Thanks for having me on.
Johnny Rodriguez: Definitely.
Elisha Terada: We have Johnny, who’s a Strategic Innovation Director at Fresh Consulting. Johnny-
Johnny Rodriguez: Hello.
Elisha Terada: –Are you excited to see the OVR?
Johnny Rodriguez: I’m a little jealous. So for those that are joining, Elisha and I are actually not in the same geographical location, he’s in Seattle and I’m in Austin, Texas.
And so I did not have a chance to experience the technology, but man, I’ve been seeing you guys in the news, Aaron, like all year. I’ve seen some really cool demos that have popped up on my YouTube feed; the algorithm knows that I’m interested in the kind of innovative stuff you guys are doing.
So I’m super excited to hear more and dive a little deeper. You’ve covered a lot of what your company does and the kind of innovative stuff you’re doing with scent. So yeah, I’m super excited. Elisha.

Elisha Terada: I’m Elisha, and I’m the Technical Innovation Director at Fresh.
Elisha Terada: And we love doing these things where we get to try new technology and also ask deeper questions. On today’s agenda, we are going to go through the demo, talking about how it works, visually, so that people understand how this can be experienced in person.

And we’re going to go through some introduction of OVR and its technology. We’ll ask Aaron some questions, and we’ll get into some interesting explorations that we want to do, which I think is very rare in other interviews. I think we’re probably unique in that.
I think smell in VR is such a new concept that not many people have experienced it, and people might understand in their head what it might be, but we thought we’d do a show and tell. This is footage that I recorded using the OVR headset. On top you can see the headset, which kind of looks like a Meta Quest 2 that people have probably seen in person. It’s actually by Pico, and right under the device you’ll see the black device with the light around my nose. I would say that’s the heart of OVR Technology: it produces the scent. And in the footage, you can see that I’m looking at the roses right in front of me inside the VR experience. So that’s why it’s OVR, right? It’s VR and the scent device combined.
As I immerse myself into the virtual world and look at the rose and get close to it, I can smell the rose. It’s magic. And in my mind, I understood how it might work, but until you experience it, you never really understand the immersive experience they will create for you.
Johnny, what’s your impression just by looking at it, as a person who hasn’t quite experienced it yet?
Johnny Rodriguez: For what it’s providing, it’s a very discreet device and it does seem like there’s been a lot of detail and intentionality around what the device looks like and how it fits onto headsets. I can see that this will fit on different headsets, right?
You did have the Meta Quest versus the Pico, or whatever the headset might be; it’s got a strap that fits, and it gets away from the cameras and that kind of thing. But also it’s placed in a location where it’s really close to your nose.
Johnny Rodriguez: That’s pretty cool. And I can tell that it’s not tethered, so it doesn’t have a cable; it’s a Bluetooth connection, is what I found out from that. But it’s just natural. I don’t think it’s adding a lot of weight or anything from that side of things. So you’re just having your normal experiences, but then you’re providing this new kind of layer, the layer of immersiveness.
I think that’s really fascinating. So yeah, discreet, easy-to-use, it looks really high end.
Elisha Terada: And I’m going to go to the first question, which is: what is OVR? From the website, the gist I got is that it’s scent technology for virtual reality, and the scent makes the virtual experience more engaging, more immersive, more emotional, more effective.

And we would like to hear from you what it means to be engaging and immersive, and how it is more effective and emotional.
Aaron Wisniewski: Yeah, absolutely. I’m gonna do this a little bit out of order, but since we just saw that demo: one of the things that isn’t completely apparent from that demo, and that I think is one of the things we’re most proud of about what we’ve been able to engineer, is that what you were experiencing was spatial scent there.
So for people watching it, they’re seeing flowers, they’re seeing that campfire smoke and they think, okay, if that’s what I’m seeing, then that’s what I’m smelling. But actually for the flowers, for example, you weren’t able to smell the flower until you leaned over and got close to it, or you picked one up and brought it to your nose.
Aaron Wisniewski: So having that spatial quality is something that’s never really been done before, and it makes it a lot more immersive, and it allows your brain to tie the smell to these virtual assets, which makes it a lot more realistic too. So I just thought it’s cool to point that out as one of the primary features of our platform: it’s spatial, it’s wearable, and it’s personal.
Now, I can’t wait to dive into those things, but I’d love to start and rewind a little bit and just talk about smell for a minute. Like, why smell? Cause I think that most people are pretty familiar with their sense of smell when it comes to buying maybe a perfume or an air freshener, or when something is off in the fridge.

Or maybe some folks are familiar with their sense of smell recently because of COVID; they may have lost their sense of smell. So there’s this kind of general understanding of smell and how we experience smell, but there is this whole other world, this whole other side, to our sense of smell that most people aren’t aware of, and what an incredible power it has over how we think, how we feel, our motivations, our memories, and our behavior.
Aaron Wisniewski: And so even though it’s subconscious, we’re not noticing it, every single breath that we take every day, we’re getting information from the world around us and it’s influencing how we think, how we feel and how we behave. So the kind of genesis of OVR was given that smell is so powerful, how can we use it to make these digital experiences, whether it’s VR, AR, XR, just traditional digital experiences, how can we make them more powerful by introducing senses other than sight and sound?
And we also asked ourselves: yes, given that smell is powerful, can we use immersive technology like VR to actually amplify and focus the inherent value of scent on mood, behavior, and emotion? So the way that we did that was by creating this technology platform and starting OVR based around it. This technology platform is the first ever scent technology that’s spatial, personal, and wearable. So you can experience scent in virtual reality for the first time.
Elisha Terada: That’s really cool. And it actually leads into the next slide that we had.
How does olfaction influence our VR experience? You talked about some of the general overview, but if you have some examples that you can talk about; some of what I understood is that it can be used in first-responder use cases.

And if you could expand on some of those use cases, that would be really great.
Aaron Wisniewski: There’s a number of ways to look at the benefit of smell in these virtual experiences. And so the overarching theme is that no matter what the experience is, the addition of smell is going to make it more immersive. It’s going to make it more emotional. It’s gonna make it more engaging. And then for most of ’em, it’s gonna make it more effective. Now you can measure effective in a number of different ways, but, just the addition of more senses is going to make these experiences better.
And because scent is the only one of our senses directly connected to the limbic system in our brain, which is the memory and emotion center, it’s obviously gonna make these experiences a little bit more engaging, emotional and immersive. We really focus on a few different categories of immersive experience that we like to support. Health and wellness, being a big one, kind of arts and entertainment being a big one.
And I think an obvious one, social connection. How can we make our social interactions a little bit more engaging and emotional. And then kind of education, training and simulation also. So if you look at an overview of the benefit of scent, you’ve got added immersion on one side of the Venn diagram, and then you’ve got the inherent value of scent on the other side of the Venn diagram.
So for example, an immersive training scenario for field medics, right? For the Air Force or the military, you can go through a training simulation to do something that’s either too expensive, too dangerous, or too complex to do in real life. And by adding your sense of smell, not only are you more likely to commit that experience to memory, so you acquire those skills more effectively, but it’s more realistic. So your eyes are engaged, your ears are engaged, and your nose is engaged. And while visual and auditory memory actually degrade pretty quickly, scent memory is stable for a much longer period of time.
We’ve all had these experiences where we’re walking down the street, or something happens, and we catch a whiff of a smell and instantly we’re transported back to childhood, or a core memory. Our sense of smell is really good at associating an emotional memory with a particular experience, object, person, time, or place, which is helpful for these training scenarios, but also in general; if you think about social interactions and wellness, the link to emotion and memory is really powerful.
A couple other use case examples: we developed an immersive meditation program that we call INHALE. You’re in these beautiful nature scenarios, you have a guided meditation, and that meditation uses scent not just to make it more realistic and immersive, but to help guide your breath and your focus for these meditations.
There’s a number of hospitals, clinics and detox and rehab facilities that are using this INHALE platform as a non-pharmacological intervention to help manage the effects of pain, stress, and anxiety. And so if you’re feeling anxious or if you are perceiving pain, getting yourself into a VR headset and replacing the negative sensory input with positive sensory input, then you’re able to manage the impact of that pain, stress and anxiety quite a bit.
In fact, we worked with the University of Vermont Medical Center on a study that they did for inpatient psychiatry, where they used olfactory virtual reality a few times a week for about eight minutes. They took self-reported data from the participants right before, right after, and three hours after this experience.

And the reported feelings of pain, stress, and anxiety right before the experience were a nine out of 10, and immediately after the experience were a three out of 10. And what might be even more interesting is that three hours after the experience, they remained at a three out of 10. So we’re not a medical device.
We’re not FDA anything, we’re not treatment. But what this indicates to us is that when you combine scent and immersive technology, you can dramatically shift your mood and your psychological perception in ways that are beneficial to you if you’re experiencing pain, stress, and anxiety.
So we found that really interesting.
Elisha Terada: Yeah. Compared to-
Johnny Rodriguez: -that’s fascinating.
Elisha Terada: -maybe like a typical aromatherapy type of thing, where you just darken your room and maybe you light a candle. How is it different in the VR experience versus, maybe in reality, trying to simulate a similar, calm mindset?
Aaron Wisniewski: It’s a great question. It can get a little complex, I’ll try not to get too nerdy right off the bat. But when you think about something like aromatherapy, which we are not. Aromatherapy is based on the assumption that there’s something inherent in these plants, the aroma of these plants that has a physiological effect on you, a healing physiological effect.
Now without getting into the science of whether that may or may not be true, what we’re focusing on is how our relationship to certain smells either preexisting ones or ones that are built in VR affect us, affect our mood and affect us psychologically because of our relationship to those smells.
Especially in the US, we often think of lavender as a relaxing smell. In fact, there’s nothing inherently relaxing about lavender, right? When we are born, we have no relationship to smells. All our relationships to smells are learned. We don’t really like or dislike any smell when we’re babies, but over time we learn subconsciously to have these relationships with smells.

For example, my mother always used to put a lavender sachet under my pillow before bedtime, so I developed a relationship over time-
Elisha Terada: Interesting.
Aaron Wisniewski: -that lavender means bedtime for me. What we can do with VR, which is really cool is instead of waiting for that relationship to be built over a lifetime, we can create a virtual scenario that immediately and permanently links a particular smell to particular mood or behavior.
And so in this program, INHALE, for example, as you’re meditating and getting to a calm state, you begin to associate that calm state with these certain smells. And so those smells in turn have that effect on you either in VR or out of VR. So it’s a compounding effect and your brain when it comes to smell, absolutely loves context and association.
So if you just smell a smell without any sort of visual, auditory, or emotional context, it doesn’t really have that big of an effect on you. But if you have visual and audio context and you’re emotionally engaged, then it has a really powerful effect. And we see this in the positive, like the example I just explained, but we also see it in the negative: veterans, for example, who have suffered trauma during combat and have symptoms of PTS, post-traumatic stress. Often the smells are one of the things that will trigger that emotional response the longest, even after therapy, which is also why USC, for example, is using virtual reality and smell together to help combat vets with PTSD manage those symptoms and go through kind of a rehabilitation program. So scent has this ability to genuinely affect your psychology, your state of mind, your mood, but only through your association with those scents. And that’s the main difference in how we approach scent for health and wellness or for training and simulation versus something like aromatherapy.
A lot of this work that we’ve done has also been in partnership with Dr. Rachel Herz, who is the leading olfactory neuroscientist in the country. She’s our chief scientific advisor. She’s kind of the authority on how smell affects your behavior, your psychology, and your physiology.
A lot of the ways we’ve developed the product, the hardware, software, and scentware, have been using the research that we know about how scent works biologically and how it affects our psychology as well.
Elisha Terada: Very fascinating to hear that you can learn to associate with the new scent and also possibly unlearn negatively associated smell and then relearn and transform it into new association.
That’s really interesting. I didn’t think about the notion of lavender being society teaching you-
Johnny Rodriguez: -like a cultural
Elisha Terada: Yeah
Johnny Rodriguez: -effect. That’s really interesting. I think when you start to add the layer that you mentioned, the spatial component, the proximity component to this, it becomes really fascinating. One of our first interviews we did, Aaron, was with a guy named Bruno Larvol.
He has a company called Larvol. And he has been running his company in the Metaverse for over a year. Quite a lot of people are basically jumping in; I think it’s actually Spatial, the metaverse environment, as well as Meta Workrooms.

I think those are the two platforms he uses. And he basically says that there are some studies happening because of that; they’re asking things like, hey, is your brain being rewired because you’re doing this, being completely immersed and doing this for your business?
And he brought up one of the biggest takeaways, which was proximity, and how, with being more virtual in the last few years, given the circumstances, we’ve gone to a 2D version, but we’re losing the empathy, we’re losing the trust, and we’re losing the caring. And then add the layer of proximity to that.
And what he’s been finding is that proximity leads to empathy. So then I’m just imagining, like when you add the layer of proximity, the spatial component to the scent technology and all of the research and all of the data that you just gave us there, just the layer of immersion and empathy that adds.
Johnny Rodriguez: I could see that being a huge factor and all that. Just really powerful.
Aaron Wisniewski: It’s a really good point. When you lose your sense of smell, it’s called anosmia. A lot of people suffered that during COVID, and Brooke Jarvis wrote this really great article on it for the New York Times.
It was earlier this year. And so she talks to a lot of people who have lost their sense of smell, and they describe what that’s like as like living in black and white or behind a sheet of glass. And this idea that their sense of normalcy has disappeared with their sense of smell.
They didn’t really understand how powerful their smell was until it was gone. And I actually think that there is, to a degree, some of that happening in digital and in the Metaverse, where we have this incredible connectivity, but it’s only audio-visual for the most part.
And so to really bring a human element, a connectedness and a proximity, you really wanna engage your other senses, and smell being such a powerful, emotional sense, that’s really critical to have in there. And in fact, your sense of smell is fully formed and functioning even before you’re born; at 10 weeks old in the womb, your sense of smell is working.
Aaron Wisniewski: So you’re already getting those connections. And then when you’re born, you don’t recognize your parents through how they sound or how they look right away; for the first few months, smell is how babies recognize their mother and vice versa. There are these kind of big, obvious things you can do with smell, but then there are the smaller things, like how does smell affect how close we feel to each other and how connected we are to the world.

There’s a researcher, her name’s Laura Barry. She’s doing some really interesting research right now at Trinity College Dublin. She’s studying this phenomenon called psychological distance, which is used a lot in product marketing. Psychological distance is how close you feel to someone or something; there’s the physical distance, like how physically distant you are, but also how emotionally close you feel. So she’s using OVR right now to study how scent affects the phenomenon of psychological distance in the Metaverse. The results will publish early next year, and I’m really interested to see how that works. I just think it’s one of the pieces to the puzzle when it comes to a really meaningful, effective Metaverse in the future.

Johnny Rodriguez: I completely agree with that. That’s really fascinating.
Elisha Terada: Yeah. I was imagining it’d be funny if I could smell Johnny as I get closer to his avatar in the Metaverse.
Aaron Wisniewski: Yeah, exactly right. One of the things we’re working on right now is having almost like a scent DNA. And if you wear a cologne or a perfume, I think that’s a little example of how you are socially signaling to other people.
Johnny Rodriguez: Yeah.
Aaron Wisniewski: You wanna smell a certain way, but now with this technology, you can do that in the Metaverse too, where you can program your scent DNA (we’re working towards this; it isn’t something you can do right now): the thing that you want to represent you, from a scent standpoint, in the Metaverse, and make it interoperable, something that you can carry with you from platform to platform, that you can share directly with friends and family, or in these larger spaces.
It’s probably a good time to talk about how this thing actually works. The same way that you can have digital assets in Warcraft or Second Life or Roblox or whatever, you can do the same thing with scent using our software plugin. So basically, any digital asset you can tag as a scent-enabled asset.
And so when someone gets close to it, it’s basically a collision system: someone gets close to it, it triggers the scent, and we’ve designed it so that as you get closer to an object, the intensity of the scent increases. So you can orient yourself spatially in these digital environments using smell as well as sight and sound.
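The collision-and-proximity behavior Aaron describes can be pictured with a short sketch. To be clear, this is not OVR’s actual plugin API; it’s a hypothetical illustration (all names invented) of tagging an asset as scent-enabled and scaling intensity with distance, assuming a simple linear falloff.

```python
import math
from dataclasses import dataclass

@dataclass
class ScentAsset:
    """A digital asset tagged with a scent (hypothetical example, not OVR's API)."""
    scent: str              # scent channel this asset triggers
    position: tuple         # (x, y, z) in world space
    radius: float           # distance at which the scent becomes detectable

def scent_intensity(user_pos, asset):
    """Return an intensity from 0 (outside the radius) to 1 (at the asset)."""
    dx, dy, dz = (u - p for u, p in zip(user_pos, asset.position))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist >= asset.radius:
        return 0.0                        # no "collision": scent stays off
    return 1.0 - dist / asset.radius      # linear falloff as you approach

rose = ScentAsset("rose", position=(0.0, 1.0, 0.0), radius=2.0)
print(scent_intensity((0.0, 1.0, 1.9), rose))   # barely detectable at the edge
print(scent_intensity((0.0, 1.0, 0.2), rose))   # strong, close to the flower
```

A real engine would hook this into its physics triggers rather than polling distances, but the idea is the same: proximity drives intensity, so you can localize an object by smell.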
Right now we’re pretty VR focused; we think VR, because it’s so immersive, is probably one of the most powerful ways to experience this. But we’re working towards not just VR, but AR, mobile, desktop, tablet, et cetera. So you get the benefit of scent in all these different digital platforms.
Elisha Terada: Very fascinating. It gets to the acceleration of combining multiple emerging technologies as well. And I want to get back to this slide, actually, just so people understand how this all works together. Basically, there’s the headset that I was wearing. It’s by Pico, but it can be virtually any headset device.

And you’ll essentially be wearing a scent device near your nose. And right now you’re seeing, call it like an ink cartridge, almost like a scent cartridge?
Aaron Wisniewski: Yeah. That’s a great way to put it. It’s the scent cartridge that you’re holding in your hand right there in the demo; you just pop it in.

Elisha Terada: Yep. You just pop it in; it was pretty easy to pop in. The cartridge itself actually smelled really good for some reason.
Johnny Rodriguez: And that’s where the nanoparticles are coming out and being released. Okay.
Elisha Terada: You just put it with the band. I assume this is just more like one way to put it on and you’ll have probably different ways that you’re probably designing on how it attaches to different devices in the future. Especially as the form of the VR headset changes, where’re excited to see the Meta Quest Pro hopefully announced next month and see how that transforms, from the bulky, big headset experience.
Aaron Wisniewski: So maybe smaller and as such OVR might have to adopt two different ways of attaching, wearing- maybe there’s attachment that you can buy.We’re launching our new form factor at CES in January, which as you mentioned, the shape and size of the headsets is changing. It’s getting smaller and smaller. The difference between AR and VR that line is blurring, we’re moving more towards the glasses. So our new form factor doesn’t attach to the headset at all. You actually wear it almost like a microphone, so you can wear it interesting with any headset.
It doesn’t have to mount to the headset. You can wear it with any AR VR headset, or you can also interface with mobile desktop or tablet too.
It’s a little teaser there for CES.
Johnny Rodriguez: Yeah, that’ll be exciting. We will both be there next year, so we’ll make sure we stop by and capture some video of that as well. That’ll be great.
Aaron Wisniewski: Awesome.
Elisha Terada: Yeah, that will be pretty exciting. I hope people get to actually try it in person. It was a really great demo that I had, and I want more people to try it.

You have five scenes pre-programmed inside the VR headset. And I had people outside of Fresh Consulting just try it out; most of them were really impressed and had never even used a VR headset in the past. That was actually shocking to me, because I was just introducing it as, oh, you can smell inside of VR.
Elisha Terada: And they were like, oh, I never used VR in the past. Oh, then it’ll be the most immersive experience in VR that you’re gonna have on the first try. But we are very curious: how has it been received? I’m pretty sure you guys have had hundreds of people try it already.

Aaron Wisniewski: Thousands of people, probably, at this point have tried it.
Aaron Wisniewski: And I would say almost exclusively, the responses are similar: joy, delight. They’ve been really positive reactions, and that’s not necessarily consistent with how people react to the idea of it before they try it. I think there are a lot of people who hear about scent in VR, scent in digital, and they’re like, oh totally,

that’s inevitable. Obviously we would want that. And there are also a lot of people who are definitely not convinced, who are very skeptical of why it would matter, why it’s important, and of the quality of the experience that you can get when scent is spatial like this. But I would say almost everyone who tries it, as soon as they get in there, within a few seconds of experiencing it-
Aaron Wisniewski: There’s that aha moment where they go, oh yeah, not only is this really cool, but this is the way it should be. And that’s exactly what we’re hoping for. Even for the folks who tried olfactory virtual reality before they’d ever tried virtual reality, it’s like trying color TV without starting in black and white, right?

Like, yes, if you watch black and white TV, it does the trick in a certain way; you get the information. But why would you? It’s so much better with color. And I think VR is so much better with scent. There are some interesting reactions, too, about the expectation of scent in these virtual environments that I think speak a lot to the psychology of scent. When we first started the company, one of our primary goals when it came to scent and scentware was: how can we reproduce scents of the real world as accurately as possible in VR? And that’s what we did for the first year or so of the company.
I’ve been working in scent for a very long time. We used our sensory expertise and very fine-tuned, complex analytical lab equipment to analyze each scent molecule by molecule, rebuild them as accurately as possible, and then put them in these environments.
Aaron Wisniewski: What we found was that even when a scent is, quote unquote, accurate, as in we’ve measured it in real life, we’ve recreated it molecule for molecule in the lab, and we know empirically that this is the same scent, when we put it in an environment and someone tries it, it doesn’t resonate. They go, mmm, it’s not quite right.
So what we discovered is that there’s an uncanny valley of scent, and by trying to recreate scents accurately, we were actually breaking immersion more often than we were creating it. There are a few reasons for that. One, because scent is so subjective: if I tell you I’ve recreated the smell of the forest, each of you has a very particular memory formed in your head about what the forest smells like.
I’m here in Burlington, Vermont, the forest outside my house smells a particular way on a particular day on a particular time of year. Might be very different in Seattle or Austin. So if I tell you its forest, it sets an expectation in your brain of a very exact memory that only you have that nobody else in the world does.
So when you smell it and it’s not exactly that, it actually sends off a red flag in your brain that something might be wrong: my expectation and the reality don’t match; that’s a problem. That happens in VR in general, too. If the physics don’t line up, if the visuals aren’t very good, it doesn’t meet your expectations.
Aaron Wisniewski: It’s not a very positive or effective experience; same thing with scent. So what we realized was that instead of trying to recreate the real world and just port it over into digital, which is interesting in certain cases and still something we wanna do, it’s actually definitely not the most interesting thing you can do with it, right?

If we’re just repeating what’s out there in VR, I don’t know, where’s the fun in that? The whole point of VR is that anything is possible, right? Anything’s possible in the Metaverse. And so what the Metaverse needs, what VR needs, is not smells that we’re used to, where we can check a box and go, yes, that’s accurate.
Aaron Wisniewski: What it needs is an entirely new alphabet of scents. Now, in the real world, we can experience trillions of different smells. Literally trillions of different smells. We might not need to create a trillion smells in the Metaverse, but if we can create a few hundred or a few thousand options, then we can essentially recreate the variety of smells that we experience in the real world, in the Metaverse, in VR, and form new associations and new memories that are digitally native.
And that’s what’s most exciting for me, because that has the most potential and it’s limitless. That’s what we’ve done. And what we’re working on is our scent cartridge, which has nine primary odors we’ve identified: the nine categories that you can combine not to create as many realistic scents as possible, but just the most variety. Period.
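One way to picture how a small set of primaries yields huge variety is to treat a scent as a vector of intensities over the nine odor channels, much as a color is a vector over red, green, and blue. This is a hypothetical sketch, not OVR’s actual formulation: the channel names are invented, and real odor perception does not combine this linearly.

```python
# Hypothetical nine primary odor channels (names invented for illustration).
PRIMARIES = ["floral", "woody", "smoky", "citrus", "earthy",
             "sweet", "herbal", "marine", "musky"]

def mix(*scents):
    """Blend scent vectors channel by channel, clamping to a 0..1 hardware range."""
    return [min(1.0, sum(channel)) for channel in zip(*scents)]

# Each scent is nine intensities, one per primary channel.
rose     = [0.8, 0.0, 0.0, 0.1, 0.0, 0.3, 0.1, 0.0, 0.1]
campfire = [0.0, 0.6, 0.9, 0.0, 0.2, 0.0, 0.0, 0.0, 0.0]

blended = mix(rose, campfire)   # roughly "rose near a campfire"
print(blended)
```

Even at coarse steps, nine channels give an enormous combinatorial space (ten levels per channel is already 10^9 combinations), which is the “most variety, period” idea; whether a given blend is perceived as the intended scent is a separate, perceptual question.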
Like, a cool analogy for this is flavors, right? I’m gonna call them artificial flavors. There’s gonna be a lot of reaction to that term “artificial”; put that word aside for a minute. The point of the story is that in 1850, there was no such thing as artificial flavor. You got what nature gave you.
Aaron Wisniewski: Nature gave you all kinds of amazing things. Then in the early 1850s, some scientists discovered a few artificial flavors. They discovered cherry, banana, grape, and apple, I think. And they exhibited them at some kind of world’s fair, and they put ’em in hard candies, and people tried these hard candies.
And they were like, ooh, apple hard candy. It didn’t really taste like apple, but it was delicious. And so they said, oh, I love this. And from that point on, there was a split: we were able to imitate or recreate nature in a certain way, but really we had this whole new palette. And you fast forward to today: flavors are in everything. Whether they’re listed in the ingredient panel or not, flavors are in everything.
They define the food and the drink that we consume, and we love them. And there are very conceptual flavors and there are very realistic-seeming flavors, but there’s this whole new alphabet, this whole new way to create flavor experiences, where we’re no longer just constrained to nature. And that’s what we’re working on creating for the Metaverse.
That might have been a little bit rambling, but it’s such an important distinction, and it’s something I’m so excited about: the future of digital is not just a replica of the real world. It’s only limited by your imagination, and because we can do that with sounds and sights, now we can do that with smells also.
Elisha Terada: I was actually imagining the situation with food coloring, where if you, let’s say, create avocado or ube yam ice cream, maybe in reality the color comes out very faint, but we like to see a strong purple or a strong green to feel like, yes, I am eating that flavor of ice cream, even if it’s done artificially.
Aaron Wisniewski: Yeah. There’s some science to it and there’s some art to getting the right multisensory experience. The right visuals, the right physics, the right audio, the right smell: all these pieces get combined together. And when they’re done right, you have a remarkable experience. There’s nothing quite like it. If they’re done wrong, it might not be so compelling.
Johnny Rodriguez: That’s true. That makes a lot of sense. I really like what you were saying about the ABCs of scents as well, being able to say, hey, these are the building blocks, and the points around the uncanny valley aspect.
We’ve actually covered that a lot with avatars, thinking about what we look like, personal identity, and how you wanna represent yourself. But anyways, when you think about scents, it’s true, there’s an element there. It’s a net new experience.
Johnny Rodriguez: It might be a new smell, but it might generally get you to that smell, like, this is the smell of blank, right? This is how roses smell. But it wouldn’t be exactly what the individual creating the scent would think. It may be very different from what you smell or what Elisha smells.
I thought that was really interesting.
Aaron Wisniewski: Thank you.
Elisha Terada: Yeah.
We can get a little bit more technical here than maybe on the other shows. One of the questions that some of us had as we were experiencing the VR headset is: can you mix two scents to produce one?
And this might be a somewhat obvious question for some people, but as you think deeply about it, it might be more challenging than it seems. For example, there may be a bacon smell that this cartridge can already produce, and maybe there is an egg smell that you can produce. But if you combine them together, do they actually smell like bacon and eggs together?
Or would there be some challenges in combining certain smells? I’m thinking of combining paint colors: if you just combine them without thinking, it just becomes a muddy black in the end. It gets mushed together and doesn’t blend the way you might think.
Aaron Wisniewski: So the answer, although it’s not a very satisfying answer, is maybe, and sometimes. There are a few ways to look at this. Unfortunately, when it comes to smell, 1 + 1 rarely equals 2. Now, there is some predictability, and there’s actually some really interesting research going on right now, including Alex Wiltschko at Google, trying to understand, from the shape of the molecule, how molecules might combine to create a sensory perception. But there’s also the antagonist effect of smell molecules. If it were literally eggs and bacon and you combined them, it would smell like bacon and eggs; that one we already know.
But often, when you combine two scents where you think you’d be able to predict the result, you get something completely different. If you combine cotton candy and peanuts, you’d think you’d get sweet candied peanuts, but you get fish.
And that’s a strange-sounding example, but there are instances where you get these very peculiar off odors. So that’s one piece of the question: is there a predictable way to combine things to create new things? And once again, it’s very hard, using a theoretical model, to predict the sensory outcome of combining two molecules or groups of molecules.
The other part of the question is: is there an RGB or CMYK of scent, right? Are there base smells or molecules that are able to produce all of the combinations that we smell? And the answer there is no. Unlike light, where we can use RGB or CMYK to produce millions of colors, there are so many different odor molecules that look and behave differently, fall into different sensory categories, and can combine in a trillion different ways, that there’s no kind of RGB.
Each human has about 400 different types of olfactory receptors, these kind of lock-and-key-style structures in the nose. When we breathe in, scent molecules latch onto them in different combinations, we get a sensory perception, and we associate that sensory perception with a certain feeling or person or experience.
So unfortunately there’s no RGB or CMYK. We can, to some degree, infer what different combinations will produce, and we have a lot of information from research, fragrance companies, and flavor companies about what has worked and what hasn’t. But there’s always an X factor in what you’re gonna perceive.
And then there’s even a Y factor: you might have a different perception of that new smell combination than me because of previous associations you have with components of it. So I’m not sure that completely answers the question or the spirit of the question, but the way that we approach it is this: we know that there are gonna be unpredictable outcomes, but can we select nine base compounds where we know that whatever those outcomes are, they fall on the right side of three axes?
One is positive valence. Can we, with a good degree of certainty, know that you’re not gonna actually create something disgusting, and be able to say, yeah, it’s probably gonna fall in these categories? So, positive valence. The second is some degree of familiarity. If we create something so novel that you’ve got zero perception of how to even categorize it, you might not even be able to smell it, which is a weird thing to say, but there are examples of certain molecules that don’t fit into any category.
And some people literally can’t perceive them. So we need some degree of familiarity. And then the third axis is arousal: is it arousing or is it sedating? We look at those three axes, and so we know that if we start with these nine compounds, no matter what gets created, or what you create, it’s gonna fall in the area that we want it to, even if we can’t predict what that smell from that combination will be.
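To make the three-axis idea concrete, here is a toy Python sketch of screening scent blends against positive valence, familiarity, and arousal. Everything below (the compound names, scores, thresholds, and the linear averaging) is invented for illustration; it is not OVR’s actual data or model, and as Aaron notes, real scent perception is non-linear.

```python
# Toy model: place each hypothetical base scent on three axes, estimate
# where a blend lands, and check it stays in an acceptable region.
from dataclasses import dataclass

@dataclass
class BaseScent:
    name: str
    valence: float      # -1 (disgusting) .. +1 (pleasant)
    familiarity: float  #  0 (alien)      ..  1 (instantly recognizable)
    arousal: float      # -1 (sedating)   .. +1 (arousing)

def blend(components):
    """First-order guess at a blend's axes: the weighted average of its parts.

    components is a list of (BaseScent, weight) pairs. Real perception is
    non-linear (1 + 1 rarely equals 2), so this is only a rough screen.
    """
    total = sum(w for _, w in components)
    v = sum(s.valence * w for s, w in components) / total
    f = sum(s.familiarity * w for s, w in components) / total
    a = sum(s.arousal * w for s, w in components) / total
    return v, f, a

def in_safe_region(v, f, min_valence=0.0, min_familiarity=0.3):
    """Acceptable if the blend is not unpleasant and not totally alien."""
    return v >= min_valence and f >= min_familiarity

# Two invented base compounds and a 2:1 blend of them:
citrus = BaseScent("citrus", valence=0.8, familiarity=0.9, arousal=0.5)
woody = BaseScent("woody", valence=0.4, familiarity=0.7, arousal=-0.2)

v, f, a = blend([(citrus, 2.0), (woody, 1.0)])
print(in_safe_region(v, f))  # True: this blend passes the screen
```

The point of the screen is the same as Aaron’s: you cannot predict the exact smell, but you can bound where it lands.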
Elisha Terada: Interesting. I’m imagining a musical scale, where if it’s the scale of G major, no matter what note you play within the scale, it kind of goes well. But if you introduce flats or sharps, out-of-scale notes, it can sound like, whoa, that’s a weird sound.
I don’t want to hear this weird musical note. Is it similar?
Aaron Wisniewski: Oh, that’s a great way to put it. And in fact, music is a really good analogy to smell in many ways. Even the language that we use when describing scents: we use notes and chords, a lot of the same language. So you combine different scent notes, and a scent note might be green, or floral, or musky, or something like that.
We combine those notes to create a chord, a finished smell. Using our platform, you can create a single smell where you get one impact, or you can create a smell that transforms over time, the same way you can compose a musical track. And using the language around smell, or the language around sound, to describe how smell plays a role in digital experiences is great, right?
Sound plays two roles. If you’re watching a movie, for example, sound effects give realism and immersion to the movie, while the soundtrack provides mood, emotion, tension, and all those kinds of intangible things. And you can use smell in exactly the same way in a virtual environment: to give a degree of realism and context, but also to provide mood and emotion.
Johnny Rodriguez: Wow, that’s pretty cool. Elisha, in some of the examples you showed, I think it was a fire and then marshmallows. And when you showed the example of the bacon and the eggs and asked, can those be combined, it’s interesting that what was probably happening was that you were only introducing a single smell with the technology, probably fire or something. But because in his mind he remembers the smell of fire and marshmallows together, he might have gotten a hint of the marshmallow even though it wasn’t there, because of the memory association. He was like, oh, it smells a little bit sweet or something. That was interesting.
Because you mentioned two smells, and in reality there might have just been one. So there is power in how you do that: you add a note of this and a note of that to suggest something that’s actually not there.
Aaron Wisniewski: That’s another great example. And it goes back to the uncanny valley, right?
The same way that your brain is pretty good at filling in gaps in visual information, your brain can also, to a certain degree, fill in perceptual information, right? So instead of trying to make exactly the right smell, we give you a direction, and then your brain creates an association that fills in that last 5, 10, 20%.
We did this interesting experiment, too, where we created a generic citrus smell. We combined a lemon and an orange and a lime, very generic citrus. And then in a virtual environment, we gave someone a lemon, an orange, a lime: these different virtual assets with the same smell.
This was a very informal thing that we did, but we asked people to describe the accuracy, and it was identical across all of them. Your brain smelled something citrusy, it saw a lemon, and it said, spot on. Same thing with an orange, same thing with a lime, same thing with all of them. So what that proved to us is that if you give your brain a little nudge in the right direction, it fills in the rest of the information.
Elisha Terada: I think what was really fascinating for me in the demo was the smell of the campfire, when I smelled that barbecue-like, smoky smell. Because I’m so used to smelling something more generally pleasing, right? Like lavender: oh, it’s supposed to be good and relaxing.
I never smell something like a campfire, where it’s just a smoky smell. It’s not necessarily meant to make me feel positive in an artificial, aromatherapy way. It’s, oh wow, that’s a really interesting smell. And I wanna ask this question: can you simulate chemical reactions?
So for example, we talked about bacon and eggs. There are different kinds of food. Take a steak: you can grill the steak, but you can also maybe bake the steak. Maybe you don’t wanna boil the steak, but you could, and you can cook it in different ways. Could OVR Technology, at least in theory, simulate a chemical reaction? Say, add steak and fire, add steak and grill, add steak and apple chips: would they smell slightly similar, but a little bit different? It’s a different question from the earlier example of just two independent objects.
In this case, it’s an object, or food, and a chemical reaction combined.
Aaron Wisniewski: Yeah. So I think I understand what you’re asking, and the answer is yes. Actually, Sarah, our Head of Scentware, is working on some research around this too. And once again, like a lot of this, there are a couple of different ways to look at it.
Take a raw piece of meat versus a grilled steak, right? The smell is an indicator of certain chemical reactions; in the case of the steak, the Maillard reaction, which uses proteins and sugars to create that kind of caramelized effect. Cooking is chemistry.
So what are the different kinds of smells that indicate changes in the state of matter, right? Protein coagulation, caramelization, burning, fermentation. You can classify these modifiers to add to a particular base aroma. You can still look at a raw steak and a grilled steak as two completely different entities, or you can have these modifying smells that work to alter your perception of the base smell in a way that gives you the impression of a different matter state altogether.
And that doesn’t just apply to food, right? This is something that’s been in the perfume industry for a very long time. You come up with a base fragrance, and then let’s say you wanna add sensuality, or freshness, or liveliness, or comfort. It’s not just about chemical reactions; it’s about changed states, if that’s the right word. It can be very physical, like grilling a steak, or it can be a little more emotional, like sensuality or comfort. And by adding those modifiers in the right proportion, you get a new impression that’s reminiscent of the old. You can present it as its own discrete thing, or, using VR, you can tell the story of that change. Like you said, there are some similarities between this question and the previous one, but this is more about whether there are modification smells that tell the story of changing the state of matter, and the answer is yes.
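The modifier idea can be illustrated with a toy model that treats a smell as a set of weighted notes and a state change (raw to grilled) as a layer added on top of the base. All of the note names and weights below are invented for illustration; real perfumery and OVR’s scentware are far more involved than a dictionary merge.

```python
# Toy "modifier" model: layer state-change notes onto a base smell.
def apply_modifier(base, modifier, strength=1.0):
    """Add the modifier's weighted notes to a copy of the base smell.

    base and modifier are dicts mapping note name -> weight; strength
    scales how strongly the modifier is applied.
    """
    result = dict(base)
    for note, weight in modifier.items():
        result[note] = result.get(note, 0.0) + weight * strength
    return result

raw_steak = {"meaty": 0.8, "metallic": 0.3}
# Invented notes suggesting Maillard-style browning:
grill = {"roasted": 0.9, "smoky": 0.6, "caramelized": 0.5}

grilled_steak = apply_modifier(raw_steak, grill, strength=0.8)
print(grilled_steak)
```

In a VR scene you could ramp `strength` from 0 to 1 over time to tell the story of the steak cooking, rather than swapping between two discrete smells.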
Elisha Terada: Fascinating. Yeah, I was very curious about those questions because people like us, who are engineers ourselves, write software and build VR experiences. And OVR has this SDK where, hey, you can program the experience that you want your customers to have.
We are very interested in an RGB of scent. And I know you said that’s difficult, but maybe using the nine base scent combinations, how can we be guided to create a chord? Say we want people to experience picking up meat and grilling it on the barbecue. Without being scent experts, is there a guide on how to go about creating that experience?
Maybe there is a guide that we still have to read and understand, but as external people who don’t have all the knowledge that you guys have in house, how do we produce those smells? That’s why we’re interested.
Aaron Wisniewski: Yeah, awesome question. Just to frame it: the plugin that we have, if you’re building a project in Unity or Unreal, allows you to essentially tag assets as scent-enabled. And we’ve structured the plugin to be as user-friendly as possible.
So it behaves a lot like spatial audio: you’ve got an object, you’ve got a listener, and you basically apply an invisible geometry to the virtual asset. As the user collides with the exterior of that geometry, it begins to trigger scent. The closer you get to the interior of the geometry, the more intense the scent is.
And then when you move yourself or the geometry away, you stop smelling it. So that’s the physics of it. Now, what you’re talking about is how you select the right combination of scents, or the right sequence of scents over time. For one, we would love to help you or anyone else and guide you through that process.
And also, the reality is you’re learning, and so are we. We actually don’t necessarily have all the answers. A lot of it is gonna be trial and error, especially because of the unpredictability of scent, where we go, what is the right scent for this moment, or this asset, or this environment?
So I think a lot of it is trial and error. And just like an audio engineer or a musician who’s engineering a soundtrack, you’re actually learning to play an instrument in a certain way. So it’s not the best answer for really engineering-minded people, but hey, there’s not an if-then statement for everything.
There’s a lot of artistry in trying to create the right smell at the right moment. But I’d say we love having these conversations with developers and engineers about really small details, because even little things like the geometry that you put around the rose that you experienced, for example: we played with the size and shape of that geometry quite a bit before we felt that the interaction was authentic.
At first, it took too long for the scent impact to hit you. But if the geometry’s too big, then you don’t really associate the smell with the object. There’s a creativity and a play involved in getting just the right thing. And the more you mess around with it and the more you discover, the more we learn from you also.
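The proximity mechanic Aaron describes (an invisible geometry around a scent-enabled asset, with intensity rising toward the interior) can be sketched in a few lines. The function name, the spherical geometry, and the linear falloff here are illustrative assumptions, not OVR’s actual Unity/Unreal plugin API.

```python
# Sketch of a proximity-triggered scent: a sphere of the given radius
# around the emitter, with intensity 0 outside and ramping linearly to
# 1 at the center, like a spatial-audio attenuation curve.
import math

def scent_intensity(user_pos, emitter_pos, radius):
    """Return 0.0 outside the geometry, rising linearly to 1.0 at the center."""
    dist = math.dist(user_pos, emitter_pos)
    if dist >= radius:
        return 0.0  # user hasn't collided with the geometry yet
    return 1.0 - dist / radius

# Walking toward a virtual rose with a 2 m scent sphere:
rose = (0.0, 0.0, 0.0)
print(scent_intensity((3.0, 0.0, 0.0), rose, 2.0))  # 0.0, out of range
print(scent_intensity((1.0, 0.0, 0.0), rose, 2.0))  # 0.5, halfway in
print(scent_intensity((0.0, 0.0, 0.0), rose, 2.0))  # 1.0, at the source
```

Tuning the radius is exactly the size-and-shape play Aaron mentions: too small and the scent arrives late, too large and you stop associating it with the object.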
From an engineering standpoint, there’s actually an opportunity to contribute, to help write the guidelines and create the best practices around how you engineer proximity based on the types of scents and things like that.
Johnny Rodriguez: How does that feedback loop work, or how would it look for developers? Would they be reaching out to you by email? Do you guys have a Discord or a Slack channel? How do you get that feedback from engineers, or how would you gather it?
Aaron Wisniewski: Discord primarily.
You can definitely reach out to us via email as well, but we mostly communicate with developers via Discord. My co-founder and CTO, Matt, has spent a lot of focus on this: trying to understand, and to the best that we can, structure and categorize that feedback in a way that lets us make the developer tools a little more intuitive and create some standards around how to use this, while still leaving lots of room for discovery and creativity.
Anyone who is working on a project with us, whether it’s for health and wellness or for marketing or entertainment, knows that we’re pretty heavily involved on the Discord channel. We’ve got a lot of documentation about how to use it, and best practices. And in that feedback loop, we’ll capture feedback, sometimes make adjustments in real time, but capture it all, and then continue to release new features, functionality, bug fixes, and improvements.
Elisha Terada: Yeah, it’s really great for engineers to have a community like that, where we can join, learn, and exchange ideas. And speaking of creativity, I wanna spend the last bit of our time talking about how your technology could work alongside, or together with, other technologies.
Earlier, I know we mentioned mobile and PCs and the Metaverse. Maybe I’ll start, and you guys can jump in with some thoughts. This is just like brainstorming, live. So for example, AR is maybe a natural branching segment from the VR experience. We talked about AR and VR even combined together as MR in the latest headsets that are probably coming up soon. And in AR, I was imagining: what if you had, let’s say, a card printed with a symbol, like a tracker, and the tracker generates the visual of a rose? I could place it anywhere on my desk, and then as I get close to the tracker, I could actually smell the rose.
That’s just one of the ways you could possibly combine this with other technologies. And feel free to jump in. We talked about the Metaverse and having a scent. Like, I can smell Johnny. Maybe I can get close, and I smell Johnny here.
Johnny Rodriguez: You talked about the scent DNA and individuals having DNA.
I think about some of the Metaverse environments that we’ve been testing and experimenting with, and how that hasn’t really crossed my mind. But then I think about walking into a store, which sometimes has a scent, or walking into my grandparents’ house.
I know that sounds weird, but there is a way to create memories and things like that. There are a few different environments out there, but it could be interesting to see that in something like Meta Horizon Worlds, where you’re building a world, but then you can actually have scent be a part of the location, like the room, or even your NFTs, right?
As you approach it, pairing an NFT with a scent, and then there’s a DNA to that. It could be really interesting.
Aaron Wisniewski: These ideas are perfect. They’re amazing. Some of it we’ve thought about before; some of it’s new, which is even better. That AR functionality that you talked about, Elisha, is definitely one of the things that we’re really excited about.
Having the tracker and being able to have the rose is a great example of being able to superimpose something beautiful, like a rose, on an inanimate object. Can you imagine just walking down the street, and every single fire hydrant, trash can, whatever, things that have to be there but aren’t necessarily part of the beautiful design?
You have an augmented layer over that, so when you walk by, you get good smells, more interesting smells, and you can interact with them. And the same thing with people. Right now, if we were to wear a fragrance in the real world, we would be emitting fragrance molecules, and you would smell them if you walked by us. Now reverse that: you’re emitting a digital signal, and when someone walks by you, their wearable triggers that smell. It’s digital perfume. I’ll tease a little bit more about CES: one of the cool things that we’re releasing in the consumer product next year is a creator tool. So all of these tools are not just for VR creators anymore. Anyone can have the digital scent creation tool to create these moments or sequences and share them with each other, even become a digital perfumer if you wanted to, and be able to send them to friends or post them in a marketplace.
And so there are all of these kinds of augmented and Metaverse applications that we see potential for in the future, and I get really excited about that.
Elisha Terada: Yeah, we would love to get our hands on it once that comes out. Even in beta form, we would love to try it out and give feedback. Earlier we talked about digital scent, and I thought that’s a perfect use case for web3.
If I could commit my scent DNA, right, my unique DNA, into web3. Maybe it’s a little bit weird, but that could be turned into some kind of token, right? Like an NFT: oh, I can give you my scent DNA. And maybe it’s weird if it’s attached to me; maybe it makes sense for some other use case.
Maybe it’s not a human-based scent DNA, but there are some interesting use cases where you can tokenize data like that and exchange it.
Johnny Rodriguez: Yeah.
Aaron Wisniewski: No, I think that’s a great one. I definitely do not claim to have this all figured out, but we have talked about scent DNA and tokenizing it, and how that may or may not relate to NFTs. Even the privacy and data rights around smell is something that we’ve talked about since day one. We don’t have all of it figured out yet, but it’s something that certainly needs to be addressed at each of these stages.
Yeah, it might be interesting to have a history of my scent DNA. It probably changes as I age, if I’m capturing it as I age, or as I place myself in different environments, or decide to represent myself differently in smell. Having a history, a chain of that, is interesting.
Elisha Terada: And you already mentioned mobile and PC. Could you expand on that a little bit: how you’re trying to support that outside of the VR headset?
Aaron Wisniewski: Yep. The form factor itself is changing a little bit. The base technology is the same, but the form factor is changing to make it wearable with any sort of digital interface or consumption layer.
So you’ll be able to experience digital content, whether web3, Metaverse, AR, or XR, with the same piece of technology. There are really two changes. One, there’s a form factor change, with some optimizations and product improvements that Fresh is actually assisting us with. And by the way, we’re in the business of moving molecules here.
The fluid dynamics, the electronics: this is really complicated. So thank you for being part of our development process and our creative brainstorm sessions, and for actually bringing this into reality. The form factor and the hardware components are being reorganized and optimized a little bit.
And then our software compatibility is expanding a little bit. We’re gonna be launching a mobile app that allows more people to access AR and XR functionality on their phones, as well as on VR headsets. We believe that although VR is probably the most immersive and maybe the most compelling way to experience it, it is just one part of the Metaverse and web3. And we really want scent not to live in one corner of it, but to be experienced by as many people as possible.
Elisha Terada: Thank you. Yeah, and we’re gonna be closing here very soon. I would love to see AI and ML play a role in the future, maybe generating smells that humans didn’t think of; maybe the AI can figure something out.
So the last question I wanna ask, Aaron, is: what’s next for you? You already talked about coming to CES, and other exciting things I assume are happening soon.
Aaron Wisniewski: Yeah, we’ve touched on some of it for sure. What’s really capturing my attention and my excitement is CES in 2023: some of the new things we’ll be able to do with technology in 2023, and, at least as exciting, who we’ll be able to partner with in 2023 too. Right now, OVR is its own ecosystem. I’m looking forward to partnering with more developers, more creative people in general, game and entertainment studios, more brands, more health and wellness, and more researchers.
So we’re really expanding our partnership network, and we’re excited to hand this tool over to all these creative people and see what they can create, or what we can create together, on our new platform.
Elisha Terada: Yeah, and I’ve listed some of the social media channels that you guys have. What’s the best way for people to reach out to you and learn more about how they can partner with you?
Aaron Wisniewski: I would say the best way is through our website: the contact page on our website, email@example.com. But feel free to reach out on any of these social platforms as well, especially Twitter or LinkedIn. Whoever is listening out there, we wanna hear from you. We wanna hear what’s interesting to you, and we’d love to partner with you, whether you’re a creator or a giant brand or company.
Elisha Terada: Yeah. Aaron, thank you so much. We learned so much from you, and we’re excited to see you at CES 2023.
Aaron Wisniewski: It was a real pleasure getting to speak with you, and I look forward to continuing to work with you and seeing you at CES.
Johnny Rodriguez: Yeah. Thank you very much.