The Virtual Reality Ecosystem from Gravity Sketch to Theia to HP
Communicating early stage design to users and clients is always challenging. Imagine the benefit of being able to not just share a visualization of what is in the designer’s mind, but to also allow others to experience it firsthand. Welcome to the power of virtual reality for design and development. Joining Tom Hazzard and Tracy Hazzard on today’s episode are Oluwaseyi “Shay” Sosanya from Gravity Sketch, Stephen Phillips from Theia Interactive, and Joanna Popper from HP. These three companies specialize in different aspects of virtual reality, from VR 3D design to 3D printing to production. Tune in to this episode to learn more about what these companies are doing in the VR world and the whole creative workflow.
Watch the episode here:
Listen to the podcast here:
From VR 3D Design to Production: Creating, Visualizing, and Experiencing Design in Virtual Reality with Oluwaseyi (Shay) Sosanya, Stephen Phillips and Joanna Popper
We’re going to talk about VR 3D design. This is exciting and it’s an unusual episode format for us.
It’s jam-packed because we have three different companies covering three different aspects of VR 3D design as an overall process, but also of the VR-3D-design-to-3D-printing process. We’re breaking it up into three pieces. First, we’re going to talk about the design side of it by talking to Gravity Sketch. We’re then going to move to a user of it who is combining visualization tools and other things to do space and product visualization utilizing VR technology. We’re then going to talk to Joanna Popper from HP about some of the technology they’re using all along the way to develop the tools utilized in the visualization process. For all of us who go out there, have some fun and check it out. We’re going to do that through the whole process. I was a VR skeptic, Tom. You know that.
Once upon a time, you were a VR skeptic. You went to a VR conference and you came back and I’m like, “Who are you? What happened to you?” You were brainwashed after you got back from this conference because you were so taken with VR and such a proponent of what it’s going to do.
I’m going to broaden that by saying XR. That’s what I was taken with. I was taken with the idea of AR and VR and this mixed reality idea of combining the things as well, and also adding in the 3D print visualization component. This is the ideal for me and this is what got me jazzed and excited. It went from something that seemed so gamer-oriented to something much more application-oriented. Someone like me, without detailed heavy-duty CAD skills, could use it in the design process to get someone to visualize what’s in my head, to push that together, show it to someone, and walk around it. They could then take it from there and do the final engineering, the detail work, and whatever else was needed, and/or get it out to the printer. The idea that I need less and less technical skill and only have to rely on my design and creativity skills is valuable to me at the end of the day. That’s why we are highlighting this VR 3D design process: it allows those things to happen together because I can push, pull, touch, and point to the design, and communicate better in creating the design result that I’m looking for in the product.
VR, AR, XR has come a long way and there are some real practical applications for the design of products. There are practical applications and software for visualization and helping to communicate designs in huge projects in less expensive ways to get budgets approved, or to make things happen much quicker and less expensively. The computer hardware itself has come a long way. Who are we going to talk to first?
We’re going to talk to Shay Sosanya from Gravity Sketch. Shay is a London-based design engineer. He’s passionate about traditional forms of making, which is valuable because if you don’t know how stuff is made, how are you going to design tools to make it? He has mass production and digital production process experience. He holds an MSc in Innovation Design Engineering from Imperial College of London where he met his Cofounder, Daniela Paredes Fuentes. He’s the CEO of Gravity Sketch. He’s focusing on the challenges. Traditional digital tools have to empower designers of all disciplines at all levels to become more creative by transforming workflows. His mission is to make emerging technologies more accessible to the broader creative community by building tools that lower the barriers to entry with human-centric user experiences.
With an intro like that, I can’t wait to conduct this interview, so let’s go there.
Shay, welcome to the show. Thanks so much for joining us.
Thanks for having me, Tracy.
This whole series has been interesting for us and getting to explore new technologies, software, hardware, all new things in the process of producing this episode on XR, as we termed it from our conversation. How long have you been working in this industry and when did you found Gravity Sketch?
We founded Gravity Sketch in 2014 on the back of a thesis project. I was a professional returning to academia with a background in design for manufacturing. I had worked for a big original parts manufacturer in Taipei, so I had close experience with what it takes to bring something to market and all the different CAD cycles you have to go through from the design team to manufacturing. One of the things we wanted to do in academia was to see if there’s a bridge between the design world and the CAD and engineering world. We felt that everyone speaks through sketches, but those sketches are 2D. As industrial designers, we sketch in a more emotional, expressive way. As engineers, we tend to do the plan views, the three views, and maybe an isometric view. If there’s a way to sketch in 3D, there’s no miscommunication. What you see is what you get. That was the nugget of what we wanted to address.
It’s a lot more valuable but time-consuming as you know.
What’s been fascinating to me is learning about your company and seeing a lot of the videos that you guys have posted about how your software works, though I have not had the opportunity to try it myself. A number of years ago, we were able to try some things with Microsoft and the HoloLens and some rudimentary creating of 3D objects in a VR environment. At the time, it was fascinating but it struck me that it wasn’t intuitive and they hadn’t worked out a lot of the user interface. Watching the videos of what your software does, it seems natural. You have to learn how to use the controllers and what buttons and commands are going to do things, but I can see how drawing in that 3D space sketching is a freeing experience. I’m fascinated by it. You must have spent a lot of time experimenting, trying to figure out what was going to be the most intuitive way to create objects. Can you share some about that development?
During the development phase, we wanted to strip away most of the interface that we see when we jump into even a Photoshop or a CAD package, where you have drop-down windows, commands that you need to type, or hotkeys. We wanted to focus on the creative experience itself. Picking up a pencil and putting it to paper, everyone knows how to do that, from my four-year-old niece to my mother who’s a school principal. What we wanted to do was figure out how that experience translates into 3D. Since we use our hands and we gesture, we have a foundation that we can build an experience on top of. We came up with a couple of rules that guided how we implement the user experience. One was that everything the user does has to be physically based. Gestural interaction was fundamental for us because we did want to span that chasm of people’s skill levels. We also wanted to have minimal to no language in the application. There’s no need for you to read something or click to access a feature.
The last principle that we want to implement was this immediacy. When you’re creating in traditional CAD tools, you might define a spline, then a center line and then punch in the degree of rotation that you might want to rotate around that center line. That’s three steps. Can we cut those three steps out with one fluid stroke? All these things were the guiding principles for us to go through that process of defining what the right UX scheme is. It’s also helpful that we have industrial designers and manufacturing engineers as the founders of the company. We have quite a breadth of experience in both the Photoshop world, as well as the CAD world, but we don’t want to be the only guinea pigs. Being in close proximity to our university, we were able to crowdsource a lot of testing. We would have these open nights where we had set up VR.
Back in 2014, it was novel. You set up an Oculus DK1 and then in 2015, 2016, you’d have the HTC Vive. You put people through that experience. What was great is that you had a swarm of students who are looking in the Master’s program, thinking about how they can redefine their creative workflow and their artistic approach. They’re experimenting and I can’t stress enough how important I feel students, artists, designers are when testing out an early-stage product because they are looking to stress test this thing. They want to push it to its limit. They’re not falling back on conventional norms. They’re not constrained or confined by industry. At the Master’s level, you get a flavor of people who have been in the industry before and want to run away from it and people who have never been in the industry before. They were completely free. That also helped us govern what was working and what wasn’t working.
How do you get these entrenched industrial designers or engineers who are used to their technology and their CAD to switch and look at the opportunity? Someone like me who has great design skills and all of that, I usually express and sketch it to Tom. He then takes it for the rest of the way and we collaborate from that point forward. I was frustrated with learning CAD because it was too time-consuming for me that it inhibited my part of the creative process. I could see how someone like me would be like, “I can take this on. I could learn this.” I could see someone who’s used to their CAD program being reluctant. What did you find as you were testing it?
One thing that we don’t want to do is replace CAD. That’s not at all on our agenda. Tracy, your role with Tom is still very much intact, but the distance between your idea and the finished product is much closer because you’re handing off a 3D sketch to Tom. He gets it when he pulls it up in whatever CAD software he’s using. We’ve started to build integrations and bridges so that you can pull a Rhino file out of a Gravity Sketch file directly. You can edit it in Tom’s native environment. You can design in your native fluid sketching environment.
That sounds fun because now you’re talking our language. We do happen to be Rhino users here. You might have noticed that. That invites that collaboration because often that’s what we found over the years that the engineers, designers, and manufacturers all get into their silos. They aren’t speaking the same language and that design has difficulty in communication unless you’re extremely good at the communication process. That’s what we’ve had to hone over the years in order to get through the vision and keep it moving through the process.
On both sides, we both sketch and that sketch is nothing but a communication tool. We’re communicating what’s in our brains. We use the most immediate means possible, which is 2D but with the advent and accessibility of this technology, we’re able to do all that napkin doodling all the way through to a nice resolved industrial sketch all in 3D. It removes the need to do 7 or 8 different perspectives.
That’s one of the things that I found exciting, studying your software and seeing examples of objects that people have created with it. I see the creative benefits of being in this 3D virtual environment and drawing in three dimensions, not just in two dimensions, all the different tools, and then the way you manipulate certain things. I was thinking back to some of the office chairs that we’ve designed that are mesh fabric but a plastic molded frame. Creating that frame and seeing some of the tools that you have created where when you’re drawing and creating these forms, how you’re able to push and pull, manipulate them and then have it simultaneously be symmetrical and even as you’re drawing. I was thinking, “That would save a lot of time.” In conventional CAD, it can be hard sometimes to create the actual form that you have in your mind’s eye, but here it seems easy to create it.
Based on my experience as a designer and engineer, what we see is that once we move into the CAD phase, or we want a digital 3D prototype, we start to land on things that we’re not quite comfortable landing on, like, “What’s this radius here? How thick should this be?” We’re still in that fuzzy phase and we want to stay in that phase a little bit longer. To be able to carry that into 3D is quite powerful. That way you can push and pull things so that when you get to the point where you need to 3D print it or bring it to physicality, you can then address, “We need to thin this wall. We need to thicken this up a little bit here.” If you start from those units and those dimensions, it does stifle the creativity, or at least it stops you from fully being able to express yourself. There’s a nice little area between the 2D and 3D CAD worlds that we think this fits nicely within.
We are used to working remotely with our teams. We have a big team over in China that we’ve utilized from time to time. You can’t always be there in person to collaborate. There’s a lot of remote working conversation going on. Does this have a multi-user virtual component? Can we collaborate using it at the same time?
Absolutely. We’ve been testing this in a silo with one of our customers in the automotive space and they have a design team in Turkey, one here in the UK, as well as Detroit. You could guess it’s only a number of companies. It’s been great because it wasn’t a technology constraint so much, especially with the small size of data that we’re producing at Gravity Sketch file format. It was mainly around the actual culture of design. How do I restrict you from moving this curve, but allow you to move this curve? It’s some of those problems that we were trying to solve. That’s what held us back from deploying this at a wider scale, but the great news is we should have a beta version that’s accessible to about everyone in our network. It should be testable for you in the near future.
That fascinates me because we have had those issues before where the manufacturer wants to make adjustments to it but they go too far. They create a use constraint or a problem with what we intended the consumers to be able to do. It takes watching the drawings and process. If you could constrain those ahead of time and make sure that they can’t be adjusted, then they have to be innovative and figure out a workaround. I love the idea that you’re taking it to that level. You were talking about your niece and your mom being able to access that, and I love that broad age range. Is it accessible from a pricing standpoint too? Are there various levels of it so that it’s not just pros?
This is another thing about launching a product in the VR space. The majority of people that VR, or most of these companies, are targeting now are gamers, with the exception of potentially HP, who has a deep seat in the enterprise industry. Oculus, Vive, all of these are packaged and branded almost as gaming consoles. For us to gather a wide breadth of users to test, once we were outside of academia and more in the remote testing phase, we needed to offer something that allows people to get easy access and start using it in a more casual way. We also need to make sure our UX layer is accessible enough as well, easy enough to jump in and start sketching. It’s been a product of that: bringing more people into the ecosystem before this becomes the third display on your desk as an engineer, so we can define what feels right. Going back to that UX problem, that’s what we’ve been focusing on. It’s amassing a lot of people on the platform, getting them to use it, understanding how it’s used, and having them give us feedback. They may not be our core end-users but they can engage and enjoy the experience.
I also think it’s a great entry point. I’ve written a lot of articles on this when you come in at that younger age group, as you’re talking about college students, which is fantastic. When we were in college, CAD systems were brand new. While we had a CAD lab that everyone had access to at Rhode Island School of Design, it wasn’t a part of the curriculum yet because the teachers and the professors hadn’t had the opportunity to integrate it into the programs. We were at that edge of it. That’s where VR is as well. It’s not integrated into most programs, but to create that access starts to get someone thinking about the design process included in it.
What we’ve seen with academia is students go out and purchase the headset themselves because it’s half the price of an iPad. They’ve gone out and purchased a Quest headset, and the app is roughly $30 on the Quest store. They’ve almost flipped it. They’ve gone out and got the hardware themselves, started using it themselves, and now they’re introducing it to their instructors. We get a lot of migration from the students to the instructors to us, as opposed to us going out and reaching out.
It’s the same thing in 3D printing. That’s the model we saw going early on in 3D printing.
While I can completely relate to the benefits of this tool from a creative and communication perspective and a workflow in the earliest stages of design, I’m curious to ask a little bit more on a technical level about either what you’re already achieving or what you’re planning to do in terms of workflow taking it from that sketch to then reality. I know you mentioned a little bit about it, but a little more detail. It struck me from what I saw anyway. While the creative opportunities are there and to realize what’s in your mind’s eye, I was not sure how accurately sized things are. How easy is it to do that in Gravity Sketch? Is that not important? Is it proportion more than specific accuracy? Does that happen later? Can you help share a little bit about your intention there?
Whenever you build a tool, it’s the users who define what that tool is used for. We’ve realized that more now than ever with tens of thousands of users on the platform. Initially, it was very much about form-finding, figuring out the right shape and proportion, and then going to scale it in another tool somewhere else. More and more we’re getting people who don’t have any pre-existing CAD knowledge and want to understand the proportions and the measurements, and this is a real demand. What we don’t want to do is compromise on some of the things that we said we were going to keep as our guiding principles. So we spend a lot of time figuring out, how do you proportion things and make it feel natural, easy, and intuitive? A tape measure is one of the things that we’ve been working with. It’s still not perfect because to measure something across a large distance you have to scale it down first. That’s one aspect of it. The users start to define what the actual tool is used for. We need to address these things.
Another aspect here is that we’ve created everything on top of NURBS, non-uniform rational B-splines, essentially the underlying math accessed through SOLIDWORKS, Rhino, Plethora, and other CAD packages. You can export it seamlessly into another package. Likewise, you can import something in. What we try to encourage users to do, more than trying to measure out everything before they start sketching, is, “Why don’t you bring in a mannequin, the chassis of a car, or the rough tooling bed for your 3D printer, and then use that as the volume that you want to sketch within?” That way you understand some rough proportions, but you don’t necessarily need to focus on the numbers. There’s an area here that we still need to work on, I’ll be honest about that, but we do capture workflows and we do figure out ways of going around that. That’s where the creatives are best and hopefully, we don’t need to figure it out. It’s the students, the creatives, and the community that are helping us figure that out.
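For readers curious about what the NURBS math Shay mentions looks like in practice: the “B-spline” part of NURBS is typically evaluated with de Boor’s algorithm. The sketch below is purely illustrative, not Gravity Sketch’s or Rhino’s actual implementation; the function name and the sample curve are made up for the example, and real CAD kernels also handle the rational weights (the “R” in NURBS) that this simple version omits.

```python
# De Boor's algorithm: evaluate a degree-p B-spline curve at parameter x.
# Illustrative sketch only; real CAD kernels add rational weights and far
# more careful knot handling.

def de_boor(x, degree, knots, ctrl_pts):
    """Evaluate a B-spline curve at x (knots non-decreasing, 2D points)."""
    p = degree
    # Find the knot span k with knots[k] <= x < knots[k+1]
    k = max(i for i in range(len(knots) - 1) if knots[i] <= x)
    # Copy the p+1 control points that influence this span
    d = [list(ctrl_pts[j + k - p]) for j in range(p + 1)]
    # Repeatedly blend neighboring points toward the point on the curve
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - knots[j + k - p]) / (knots[j + 1 + k - r] - knots[j + k - p])
            d[j] = [(1 - alpha) * a + alpha * b for a, b in zip(d[j - 1], d[j])]
    return tuple(d[p])

# A clamped quadratic B-spline (here equivalent to a Bezier arc)
point = de_boor(0.5, 2, [0, 0, 0, 1, 1, 1], [(0, 0), (1, 2), (2, 0)])
print(point)  # (1.0, 1.0)
```

Because both tools speak this same spline math, a stroke drawn in one package can round-trip into another without being approximated away, which is what makes the Rhino hand-off Shay describes possible.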
That’s fantastic and don’t get me wrong, I wasn’t trying to take anything away from what you are doing or have done. Rhino is my primary CAD software because of the age I am, when I started and what I learned along the way. It’s not perfect by any means either, but it is grounded in finite and tangible mathematics that you can manufacture things that do fit and interface with other parts existing or created. I’m thrilled to hear that you can go back and forth between the two because I could see myself easily 3D sketching in your software and bringing it into Rhino as a creative process and then getting it to the point from there, having that framework to work with to create something that is a completely closed object that I can then have made.
Rhino is our number one inspiration on the 3D CAD side, with the way that they’ve built the product and their philosophy. Bob McNeel has been at it forever and he’s not a get-rich-quick kind of guy. He’s built a community around the product, and it’s the community that’s helping define what Rhino is with the Grasshopper integration. That’s something that we want to build in and aspire to as well. We want to belong to the Rhino family but also create a family around Gravity Sketch of additional plugins in the future. If there’s a specific 3D printing workflow, who better than someone from the community to help determine how that should roll out? We’re on the same line of vision, and something that we’ve done to bring our two worlds even closer is we’ve created a plugin. You can take a Gravity Sketch file and drag and drop it into Rhino. It repopulates using Rhino’s CAD engine. You don’t even have to worry about clicking and converting anything. The layers are exactly as you left them in Gravity Sketch. You can pick up in Rhino where you left off in Gravity Sketch.
That’s a pleasant surprise because I didn’t know that researching for this interview. I liked the product even more. What’s the best headset that I could buy?
This sounds like a lot of democratization of design, which I love because this is my thing. Someday in my retirement, I’m going to learn CAD, but now I’m not. I’m going to go use Gravity Sketch and go over it that way or something. Thank you for that. In the back of my mind, I’ve always thought this is holding back my vision. I can’t get it all the way through. You’re creating the ability to do that and I love that. Thinking about how that goes all the way through to it, are you working on that 3D print process? If I could skip Rhino and all of that, then I can go straight into 3D print. Now I can start to visualize what I’m looking at and what I want, assuming it’s small enough.
We’ve explored that quite a bit, especially in the early days of Gravity Sketch. You think about 2014, 2015, that was the 3D printing year. A lot of things happened in the industry. It wasn’t something we could close our eyes to. What we realized when we started to dive into that 3D printing ecosystem was that you have to understand what type of printer you’re going to use, what type of technology, and what type of material. You can’t design for 3D printing without understanding that digital manufacturing process. We haven’t built intuitive tools to go straight from Gravity Sketch to printing. What we have done is, for the person who has that inherent knowledge, they can create a pretty watertight mesh within Gravity Sketch. They can then export that as an OBJ and take that directly to Cura or whatever slicing software they’d like.
We’ve also implemented subdivision modeling, which allows you to push and pull quads instead of triangles. That’s always going to remain closed as long as you stitch it together. You have much more confidence in the fact that whatever you made in Gravity Sketch can be sent out to a printer competently. Every stroke in Gravity Sketch is a small tube, and it’s a closed small tube. Even if you were to make a mess of strokes, each individual stroke is fully watertight. You can print that.
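The “watertight” property Shay describes, every edge of the mesh shared by exactly two faces, is straightforward to check yourself before slicing. Here is a minimal sketch in Python; the helper function is hypothetical, not part of Gravity Sketch, Cura, or any slicer’s API.

```python
from collections import Counter

def is_watertight(faces):
    """faces: triangles given as vertex-index triples. A closed, manifold
    mesh has every undirected edge shared by exactly two triangles."""
    edge_counts = Counter()
    for a, b, c in faces:
        for u, v in ((a, b), (b, c), (c, a)):
            # Sort so (u, v) and (v, u) count as the same undirected edge
            edge_counts[tuple(sorted((u, v)))] += 1
    return all(n == 2 for n in edge_counts.values())

tetrahedron = [(0, 1, 2), (0, 3, 1), (1, 3, 2), (2, 3, 0)]  # closed solid
open_strip = [(0, 1, 2), (0, 2, 3)]                         # flat, open surface
print(is_watertight(tetrahedron))  # True
print(is_watertight(open_strip))   # False
```

Real slicers run far more elaborate repair passes than this, but if a quick check like this fails on an exported OBJ, the model is unlikely to print cleanly.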
That makes so much sense when you think about it because you’re not in a two-dimensional environment, you’re in a three-dimensional environment. Why should any line be a line in theoretical space with no volume to it? No matter what the scale, it should have some volume. That must have been an epiphany there. That’s brilliant.
It sounds like I might still need Tom, which is good when we still have a partnership going on here for the 3D print process. To get to Tom’s question, did you create this so that it’s tech agnostic from a VR hardware standpoint? Did you make it device specific? How does that work for you?
Being in the VR ecosystem and raising money as a startup in that ecosystem, we need to show that we can have credible growth. The best way to do that is to be agnostic. We support essentially every VR device out there; we even explored and experimented with the Magic Leap in the AR space. We have an iPad product that’s in the works. That will come out relatively soon. The idea is that it’s democratic. It’s not to move towards a point where you don’t need Tom and Tom doesn’t need Tracy. It’s to create a much more synergistic workflow between you two so you can produce far better products far faster. In order to do that, VR can’t just exist in a silo. It exists next to your desktop computer. It exists next to your iPad, next to your iPhone or smartphone, whatever it may be. We want to make sure that we create an experience that can transition between those 3 or 4 different types of devices. We’ve also created a Cloud platform where you can send up the model and then bring it down and view it on your phone or tablet.
That’s going to make remote team working even better as well. I’m glad you talked about that because the more successful companies, and we’ve seen this through the 3D print hype cycle, are the ones that remain broad in what they do, have created a better bridge to all the things that everybody’s already using, and raise fewer objections in the process. It’s a better business overall. We talked a little bit about the HP products. Do you use the backpack, and have you tried them?
We asked HP for a loaner and we got the stuff with the lockdown situation.
The backpack sounds cool. There are some ideas like, how much freedom is that going to create in that process and how much fun is that going to be? We’ll have to hear about it. You’ll have to give us a little update or send us a video once you manage to get one and show us how that’s working. The last thing I want to cover is use cases. People like to understand how it’s being used. Do you have some use cases that you could share with us?
I have a few use cases. I’ll try to rattle through four that I find compelling and exciting. There’s the obvious automotive use case, which is cool. We get to be in a lot of amazing automotive design studios from Italy to Detroit. What they end up doing is bringing in the car chassis. You have a wheelbase that you’re going to start from. Whether it’s a combustion engine or an electric vehicle, you’re starting from some rough engineering diagram or dimensions. You then have safety requirements. You can bring all that information into Gravity Sketch from whatever CAD system you’re using. It’s usually CATIA. The designers then sketch on top of that, creating loads and loads of iterations. They’ll create 7 or 8 different vehicles as wireframe sketches in Gravity Sketch. They’ll have design conferences and reviews.
That’s way faster than the clay modeling. I’ve seen it because I used to work in that industry.
The most interesting and compelling use case was when one of the designers pulled himself up into the driver’s seat. He did an ergonomic study. It wasn’t even about creating the actual design. He was like, “This is how far I need the steering wheel. This is how far I need the shifter.” The designer, at one-to-one scale, at the napkin sketch phase, being able to understand where things belong in the cockpit of the vehicle. I found that amazing. Another use case in the same vein was a drone manufacturer who was bringing in the CAD model from SOLIDWORKS and doing all the cabling. They zoom themselves up to the size of a hand and do all the cabling within the model, then export the IGES file and apply the cabling in the CAD package. They were able to do all the cable routing in Gravity Sketch because it’s much more intuitive to do 3D cable wire routing with 3D software.
I would think to see what I’ve seen visually about how your tool works, drawing that line, the right diameters, and the cable routing, that it’d be perfect.
The last two cases are around the industrial design workflow for the human body. Footwear has been an interesting use case for us. If you check our Instagram, you’ll see a lot of great footwear examples. We have professional footwear designers from some of the biggest brands using Gravity Sketch to explore concepts. It’s not so much about, “I need to make this manufacturable.” It’s more like, “I’m a footwear designer. I don’t have a trusted CAD package of choice. I’ve been mainly working in Illustrator and now this is something that speaks to me. It allows me to explore the 3D shapes that are in my head.” There’s one footwear designer who sent a model over to a factory in Vietnam that also uses Rhino. He was able to shave off about 25% of his workflow by communicating with that factory through 3D as opposed to through precise 2D Illustrator sketches.
The last example is a little bit of a combination of everything. It’s a designer who scanned his own head, brought it up in Gravity Sketch, and was able to create a helmet around it. A perfect ergonomic fit, exactly the dimensions of the foam, and even the aerodynamic nature of the helmet, all custom and tailor-made. That’s the future of design, 3D printing, and virtual reality: being able to have that level of detail and customization. Our technologies are here now. We’re seeing these use cases pop up all the time. It’s about the designers who want to take the bull by the horns and run with it.
Thank you for empowering designers.
Thank you for supporting us. The community has embraced us and we’re trying to do our best to do the same.
Shay, thanks so much for joining us. We appreciate your insights.
Thank you, Tracy and Tom. I appreciate your time.
That was mind-blowing with what Shay had to share with us on the design tools and how they’re progressing. I can’t wait for us to talk about how they’re being applied and what they’re doing to create businesses, environments, and products.
We’re going to hear from Stephen Phillips of Theia. Stephen leads the development of Theia’s advanced VR and AR visualizations for enterprise clients in the AEC space, especially in the architectural space. He graduated from university with a Game Development degree and multiple honors. Gamification became a big thing for him, and using the Unreal Engine, he pioneered VR applications for luxury hotel design reviews, medical facility walkthroughs, real estate pre-leasing, and construction site approvals. They have done award-winning product demonstrations. Stephen manages custom projects and works with the company leaders to bring the future of real-time rendering to traditional industries and very expensive projects. This is big stuff. It’s lowering the barrier to entry for big projects because there are fewer surprises down the road. I don’t want to get too far ahead of myself. You’re going to learn a lot from Stephen in our discussion with him. Let’s go to that.
Stephen, welcome to the show. We’re glad to talk to you about XR. I’m going to call it XR because you’re more than VR.
It covers quite a bit these days between hardware and software and everything in between. I’m glad to be here.
Let’s define that for our readers. You mentioned it briefly there. Everybody has heard about VR and AR, and now XR. Can you please give us a little XR 101?
XR is becoming the go-to term because it doesn’t matter what you’re developing for, you’re probably going to be developing it in a similar way. The XR stands for Extended Reality, which means the whole domain of anything that brings virtual and real-world into some blend. I used to think of it as X being a variable, any type of reality like fill in the blank reality.
We’re not talking about extreme reality. We’re talking about any kind of reality. That’s broad. Bringing in the what. That’s what we’re here for, the what of it. You’ve been working in this world for quite some time and how has it changed the process of design?
A lot has changed quickly. At my business, we’ve been doing VR, XR, AR, whatever it was called at the time, for a few years. Initially, we were doing a traditional game development workflow: 3D modeling in an entertainment-type workflow and then bringing it into a game engine for rendering. Once it’s there, you can put it on any particular platform. Your design process depends on your targeted performance or your targeted visual appeal for these different platforms, whether you’re making an environment that needs to surround the user or an object that needs to sit in a real-life environment. I’d say the way you go about creating those experiences hasn’t necessarily changed from a fundamental standpoint of where you start, but now you end up using VR applications themselves to go full circle and develop content that way. That’s the more interesting thing: developing your VR apps within VR is now possible. It’s come full circle, and it’s exploratory and interesting to be a part of that.
That’s a mind-bender, designing VR within the VR world.
It has interested us for a while. We did an episode on HoloLens early on, so we couldn’t reveal all that we learned and saw, but the idea of designing within a VR environment is an interesting concept to us. I want to touch base and get some background on what Theia is doing so that people have an understanding of where you’re coming from in this viewpoint. Give us a little background on what Theia is doing.
What Theia does is we’re a bit of a services company, a consultant for XR applications. Any business can come to us and say, “I have a problem to solve. I have a thing to be visualized. I have an application that needs to be built, and we want it to be immersive, interactive, or visually interesting,” and we can build them that solution. We’ve done quite a few architectural walkthroughs, entire properties or buildings that people can explore in 3D before they’re built, which can serve many purposes. We’ve done some training and simulation applications, for fixing up an airplane or something to that effect. We’ve even built an augmented reality video game where you get to walk around with a virtual pet dog, toss treats for it, play fetch, and things like that. We cover all the bases. It’s targeted at kids. We almost put in a feature where you’re teaching your kids that owning a dog is difficult, and you do have to pick up the virtual poop every once in a while.
It sounds like this design work is still suited for, I’m going to say, a slightly more expensive model. Whatever it is needs to be too expensive to build without visualizing it first, whether that’s building out an entire interior space or, as I’ve seen, oil pipelines. It’s not only expensive if you build it and it’s wrong; it’s expensive because you have to redo it. It sounds like we’re still in that stage.
That is correct. It is getting easier and easier to build stuff in VR. Maybe for lower and lower-cost things, it’s worth it to get a quick preview here and there. For our business, it’s been about the huge costs involved. We have people come to us and say, “We’re building a $2 billion casino and there are 200 rooms in it. We need to make sure that every square inch of this can be inspected in virtual reality to make sure there’s not a mistake.” They pay us to spend a lot of time and money on preparing that intricate virtual walkthrough. These days, it’s easier to get into that. Even we can do it a lot faster, and just about anybody can don a headset and move 3D files around. I can see it being applied to a lot more types of products and projects going forward.
Have you seen people using it to create any smaller objects? I’m sure when you create an environment, you don’t just create the architecture. Maybe you create things that are within the architecture to make it seem more real. How much are you seeing people create smaller objects? Something that you might be able to 3D print, for example.
For VR applications, I haven’t seen as many uses of that. What you bring to mind is 3D sculpting, which has been around for a while; I’ve seen people who do 3D sculpting for jewelry pieces. There are now more and more ways to do that 3D sculpting within VR, so that stuff is starting to come back around. A lot of the VR business so far has been about taking data sets, taking big things, and putting them into VR. But you could have a gigantic piece of jewelry in front of you, sculpt it intricately, and then have it 3D printed somewhere else to be manufactured.
I would think that it’s starting to change the design process as you’re going through it in a greater way and the workflow of everything. If we’re starting back at the visualization, it’s re-informing how everything gets to output. What are you seeing along the way of that workflow?
For us, the thing about VR and these other immersive technologies is that they reveal a lot more than you thought about the product you’re developing. We’ve had people come to us with a specific goal and thing they need visualized. They’ll say, “Here are the assets to look at. Here’s the purpose of this application.” Once they put on a VR headset, they’ll say, “This isn’t what I thought it was at all.”
How often does that happen with clients in real life?
It happens a lot, and it’s better to have those realizations early in the process. We see a lot of them take place. We had someone ask us for a training and simulation experience for a type of hospital room. Once they were developing it, they realized that some things were out of reach and they needed to modify the design of the room. Those are things that only come out of having that immersive experience.
I would imagine then it shortened the building cost overall because you don’t have that, “I have to retrofit every single hospital room at this point. I’m trying to makeshift some hack to get it to work for people.”
We’ve had a lot of those realizations happen where the 3D models were all correct and designed properly, but once you get in there and look around naturally, something isn’t quite the right fit, and they go back and re-evaluate the design files. It would have cost them a lot of money to make any changes after the fact, like you said. A lot of customers building these big products, like a hotel, build physical mock-ups in order to prototype them and understand the hand-reach issues that you can discover in VR. We can speed up the initial iterative process and skip a couple of those physical mock-ups along the way, each of which has a discrete cost associated with it.
It’s reminding me of my early experience as a designer in practice. You learn lessons quickly that you’ve got to get things as inexpensively as possible in the proper scale and context in order to be able to know what you’re going to develop in the long run is right. You often get surprised when you create something on paper and the computer in CAD. It’s tough when you’re creating something in isolation because there’s no context around it. What you’re doing is giving people a realistic context.
A realistic context where you can get in there and immediately understand something instead of trying to interpret a 2D image. 3D software is available as open source, and there are lots of 3D web viewers that you can spin a 3D model around in. As soon as you step into a headset, or take that 3D model and put it into your living room through your cell phone, suddenly the scale of it makes a huge difference. The time it takes to walk around the thing makes a huge difference.
I’ve even had that surprise many times when I’m creating something very small and sometimes objects that I would design and create are small. In the computer, I think, “This is the right scale.” Even though you have every measurement tool available on earth, once you 3D print it, we’re like, “This is twice as big as I thought it was going to be. What the heck happened here?” We experienced it on a small scale, but I’m sure you experienced it on a large scale as well.
Yeah, that medium range. Furniture specifically has been a struggle for us, especially in the early days of VR. We want to sell VR products. We want to tell people that it’s hyper-accurate, that it’s the best thing ever, that it gives you that great immersion. And yet, because it’s so new and technical problems can arise, people are always skeptical. We’ve had customers come to us and say, “We specified that this chair’s seat is exactly so many inches from the ground. When I’m in VR, it feels smaller than that.” I go back to my team, worried that we made the thing incorrectly, and we measure it in our 3D software. We go into VR and measure it. We craft a measuring tool and check the metric there. We line up a real-life chair, sit in it, and it’s totally correct. It’s just that people doubt the 3D object more than the physical object.
That’s because the chair is too short to begin with. It’s the chair itself. This happens often; we have done a lot of work in the furniture industry, back with my work with Herman Miller and the cataloging. The problem with visualization tools is that a lot of them were inaccurate, so there’s this carry-over where people are skeptical about what you’re creating here. The reality is, most of the time they would specify these things out of a visual catalog. They’ve never sat in it. They don’t know what it feels like. They look at it in the space and they’re like, “This is wrong. It’s dwarfed in this space, or it’s too low. It’s not comfortable.” And you’re stuck with it because you spent eight weeks ordering that furniture to arrive in the perfect material. I can see how furniture is a challenge, and textures and materials are a challenge too. I would imagine that’s another complication of furniture.
Furniture has a lot of aspects to it. Some customers want a piece of furniture to act as a placeholder, but some people build custom furniture where they’ve specified every single material and sourced it from different regions of the world. They’ll shoot us samples to make sure we can accurately represent it. It can be complicated.
Is this a technology to the point where you could sit on a chair in virtual reality and feel how high off the floor it is or is it all visual still?
You can sit, with the understanding that if there’s nothing there, you’d be faking it. You’d be squatting down, and the measurements would be right and the eye level would be right, but it wouldn’t feel the same. You could also line up a real chair with the real dimensions and use it as a proxy. Then you’re comfortably in a chair and can understand what it’s like to see the world from that chair.
You can get the viewpoint. It’s starting to make me think about the opportunities here and the benefits of being able to use it. It’s complicated by the fact that, as you’re pointing out, you have to hand-create all these assets. If the furniture companies and the product companies aren’t creating VR models for you, you don’t have all of these pieces to drop into a space, and you suffer from the fact that the visualization costs you a lot and is time-consuming; it’s not as quick as you want it to be. That goes back to the big opportunity here: for people to start looking at the asset side of whatever product they’re creating, whether it’s a piece of furniture or a doorknob. The opportunity is to make sure that we start to create three-dimensional assets of them so that they are useful to a company like yours, to that design process, to the specification process.
The good news is it’s all going in that direction, luckily. Everyone’s design software has been 3D for a while, but historically it’s been locked down in expensive, proprietary, and awkward engineering applications, not the tiny little OBJ file you can send to a friend. Some furniture manufacturers do make their models available online for people like us to use, but then there’s a quality discrepancy: I need high quality for good visuals, while someone else needs low quality to run performantly. Everyone’s in that arms race, balancing what they need.
They are starting to standardize on, “Here are the three different sizes. Here are the three different model types. Here are the formats.” It’s always an issue.
Some people are pushing that and are committed to making high-quality, performant models available in a wide variety of file formats. It is happening behind the scenes. It’s all good news going forward. A lot of this VR and 3D application revolution has contributed to that, because there are a lot of free and low-cost tools to either create 3D models from scratch or create scenes from other people’s models, as well as 3D printing. Everyone wants to be able to grab an asset and print it right away instead of needing to build it from scratch every time. This whole community has been driving the industry to come up with better solutions to make it easier to use those real-life references.
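To make the quality-tier discussion above concrete, here is an illustrative sketch (not from the interview; the cube asset and the function are hypothetical). The "quality discrepancy" between a visuals-grade model and a performance-grade model usually comes down to polygon budget, and even a minimal Wavefront OBJ parser can measure it:

```python
def obj_stats(obj_text):
    """Count vertices and triangulated faces in Wavefront OBJ text."""
    vertices = triangles = 0
    for line in obj_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":          # geometric vertex
            vertices += 1
        elif parts[0] == "f":        # face: an n-gon yields (n - 2) triangles
            triangles += (len(parts) - 1) - 2
    return vertices, triangles

# A stand-in asset: a unit cube stored as six quads. A real "high-quality"
# furniture model would have orders of magnitude more faces than its
# low-poly counterpart; the parser makes that gap measurable.
QUAD_CUBE = """\
v -1 -1 -1
v 1 -1 -1
v 1 1 -1
v -1 1 -1
v -1 -1 1
v 1 -1 1
v 1 1 1
v -1 1 1
f 1 2 3 4
f 5 6 7 8
f 1 2 6 5
f 2 3 7 6
f 3 4 8 7
f 4 1 5 8
"""

v, t = obj_stats(QUAD_CUBE)
print(f"cube: {v} vertices, {t} triangles")  # 8 vertices, 12 triangles
```

A pipeline deciding whether an asset fits a VR performance budget could run a count like this before import; the thresholds themselves would be project-specific.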
That’s always the promise that I saw. I was a VR skeptic. I went away to a presentation and came back, and Tom was like, “What Kool-Aid did you drink?” It was the opportunity of this idea that you would create such an immersive environment that I could grab something and then go 3D print it, or grab something and go buy it instantly in a gamified environment, whatever that might be, and I thought, “That makes sense to me.” Nowadays I’m looking at this shelter-in-place world that we might have to get accustomed to from time to time, and thinking about being able to be more immersive in our shopping and selection processes. That’s going to be critical because we can’t always go out and get it.
The big eye-opener for me was when Apple released ARKit back in 2017 and Google followed shortly thereafter. Many retailers and furniture makers, like IKEA and others, now build digital versions of all of their content to get it in the hands of consumers, so they can see it in context in the real world and make better buying decisions. I have seen quite a few presentations from companies like this who give a return on investment of “we sold 50% more of the products that were shown in 3D.” It’s still early days to make a final judgment on how valuable it is in terms of dollars, especially in the ecosystem we’re living in, but there’s also the sheer accessibility of everyone’s cell phones opening windows into these amazing virtual worlds. It seems like the best way to make that happen is to get it in the hands of more people.
Once I see it on me, I want it. There is that demand now, even if we can’t quantify it. I see that as a big opportunity in the future. What would you love for us and our readers to know about Theia? What are you pioneering and working on that’s going to help all of us learn how the design process and workflow, and how technology, are going to change things for us?
The way we’ve been doing our business for years, whether it’s incredibly secure training operations or previewing a house that is yet to be sold, is by focusing on game engine technology, primarily Unreal Engine, which is free and incredibly powerful for taking 3D content and creating immersive experiences or video games, and leveraging that to satisfy all these customers of ours. The next step for Theia is that we’re coming out with a toolset for Unreal Engine developers to more easily create their own VR experiences. Many of our customers, and the people that we know, want to build an interactive VR experience where they can bring in an environment or a product, get in the headset, and trust that someone else can get in the headset too so they can talk about it. It’s not that easy, especially if you’re going down the path of something like Unreal Engine, which might require some custom programming. We’re releasing a version of a product, Optim, which is a quick start to get into virtual reality, see that content, and communicate more easily with design review tools and voice communication. We’re excited about empowering that type of customer.
We’re not trying to do a commercial here for HP, but we want to make sure people understand, because we’ve been talking about Z by HP and the technology for creating texture maps and other things like that. How have the tools you’ve gotten from them been able to help you?
In terms of VR technology, the inside-out tracking that you get with something like the Reverb is cool because it allows you to explore a space naturally and quickly instead of needing a more complicated base-station setup. It’s easy to throw it on. In our office, when we have many developers packed into one space, we otherwise have to set up, install, and configure all these different things to make sure everyone gets a consistent VR experience. With the Reverb, you can throw it on and it’s going to work at your computer. The HP product we love a lot is the VR backpack, because they’re not only VR machines but good desktop computers; that’s the one I’m working on right now without even using it for VR. Being able to throw on the backpack and walk around is incredibly freeing because it’s that perfect bridge beyond the standalone VR headsets, which are limited in terms of performance. You can’t get amazing, stunning, realistic content on those standalone headsets. When you have the GPU power you need in a VR backpack, it’s a little bit more of that limitless feeling, where you can make something that not only looks super immersive but lets you run around and look around, even if you’re doing something boring like architecture.
You don’t just walk around and look at all those pipelines. I’m glad you mentioned that because we don’t personally have the experience of being able to use it. It’s helping you in the development process as well as at the end use, where you strap your client in and make them walk through their own space. That’s helpful too.
It’s helpful for the end client, and for us being able to put on a backpack or put on a headset quickly is great. Being able to recommend it to the customer is always a nice thing where we have a guaranteed experience. We know that it’s a nice tight package and a consistent look and feel, and it’s going to work exactly how we say it’s going to work. You don’t have to worry about trying to set up your own weird configurations of custom computers or buying one from Best Buy.
We’ve been talking about VR for a long time. I feel that there are some people out there who are skeptical and they’re like, “This VR thing has never taken hold and it’s never going to tip.” What do you think the biggest challenge still is to get that into the mainstream process of where everybody’s working in a 3D world?
The trouble with VR has always been putting on the headset. It’s somewhat dirty; we have to be extra thoughtful about viruses and how close things have been to our noses and mouths. It’s a little bit hot. It isolates you from people in a way. The end dream is that regardless of whether you’re doing VR in a headset or AR on a phone, there’s eventually going to be a blending of the two, where you’ll have ultra-lightweight transparent glasses or contact lenses. These days, the closest would be something like the HoloLens, where you can get the best of both worlds. As far as comfort and usability, it’s about that isolation, or feeling tied down by a headset. There’s also the accessibility of it as far as price, and the price of these headsets has come way down. Now that we have these standalone headsets, you don’t even need the computer in the first place. It’s far more accessible. That’s wonderful news for sure.
I would think the cell phone is going to help democratize this to a degree. The backpack is a much more powerful computing engine that gives you a different experience. We’re holding in our hands a phone that has more power than was on the spaceships in the late ’60s and ’70s that had gone to the moon. I don’t think it’s a stretch to think that there’s going to be a way for more and more people to experience this coming down to a handheld device. Would you agree?
I completely agree. A few years ago, we had cell-phone-powered VR headsets that were thought to be the future, and they worked fairly well. It’s long been hypothesized that when Apple releases its first AR headset, it will be a lightweight device that plugs into your phone. As you said, we have so much power built into these teeny-tiny devices. You’d be surprised what you can do with them. The other driver, which is more interesting when you dig in behind the scenes, is that the benefits of cell phone technology don’t have to be consumed through the cell phone. All of those standalone VR headsets are using cell phone screens, batteries, and processors that have been mixed, matched, and maybe rebuilt from the ground up out of that drive for handheld devices.
Thanks, Stephen, for coming on the show. We appreciate your time and we look forward to keeping up and seeing what Theia is going to keep doing in the future.
Thanks so much. This has been fun.
I’m jazzed up about all the creations and everything, but Stephen got us a little bit excited about what do we use? Where are the tools? We rushed to talk to Joanna Popper from HP. She’s the Global Head of Virtual Reality for Location-Based Entertainment, and she’s got some great background.
It’s something we’ve been referring to called the backpack.
The HP Reverb, and you might see it virtually behind her, so be looking for that. If you’re reading this, you might want to go check out that video on YouTube. Joanna Popper is a Hollywood and Silicon Valley media executive. She has a media background, and it’s interesting that she’s moving into this heavy innovation. She’s leading HP’s initiatives for Go-To-Market and Location-Based Entertainment for Virtual Reality. Think about what that title says. There’s some cool, fun stuff coming up where we can go experience it in person, which may be why there’s a backpack involved. Previously she was the Executive Vice President of Media and Marketing at Singularity University and the VP of Marketing at NBC Universal. Joanna developed a TV show partnership between NBC and Singularity for a new TV series on technology and innovation. She was selected as 1 of 50 Women Who Can Change the World in Media and Entertainment. She’s a Top Women in Media: Game Changer and a Top Women in Media: Industry Leader. She is phenomenal, and I’m excited about the technology. I can’t wait for us to talk to her. She’s the expert, so we’re going to hear about it from her. Let’s go to that interview.
Joanna, thank you so much for joining us. I’m excited to talk about hardware this time. We’ve been talking about systems and software, but not hardware.
I’m glad to be here. You’ve had some great guests so far.
People who are reading this, she’s got a cool virtual background that’s going on with the headset. That’s the HP Reverb. We’ve talked a little bit about this virtual reality world and designing it and other things, but the technology, the hardware has come a long way. Tell us how much that has changed in the last few years alone.
At HP, we’ve been doing some exciting things. HP was part of the forming of Silicon Valley; it is credited with being the first Silicon Valley company and with making Silicon Valley what we know it to be now. Since then, HP has been all about inventing and reinventing technology and the future. We see technologies such as VR, AR, immersive computing, data science, AI, 3D printing, and manufacturing as being that future of computing. That’s why HP has focused on virtual reality, and why we’re investing deeply and coming out with great products that enable and empower people. What we’re seeing a lot of in virtual reality right now is that VR empowers you to collaborate, connect, create, learn, and even game. Your guests have been talking a lot about VR enabling them to create. At HP, we’re on our third-generation headset. The headset behind me is the Reverb Generation 2. We did our first headset with Microsoft, the Windows Mixed Reality headset, which launched a few years ago.
We launched the Reverb Generation 1, and the focus of that headset came from listening closely to our partners and the industry about what they want to see in a headset. A couple of things stood out, and a lot of them came from this product development, architecture, engineering, construction, and creation community, which is the community we’re talking about here. One was that people are looking for very high resolution in their headset. If you’re doing product design, it’s important that the quality and the realism be as good as possible. The Reverb Generation 1 was 2160 x 2160 per eye. That’s 2X the resolution of any other leading headset. We’re talking virtually ultra-real. We were also looking at making the headset more comfortable and more ergonomic.
Especially at 1.1 pounds, if somebody is doing product design review or development in the headset for many hours, it’s comfortable for them. It’s also easy to get on. With Windows Mixed Reality inside-out tracking, you don’t need a base station, so it’s fairly easy to set up. We have the Reverb Generation 2 on its way. It’s a partnership with Valve and Microsoft, and I’m excited about three great tech companies each bringing what they’re best at to this new headset. The pre-orders are up and ready, and we’re excited. We took everything that was good about Reverb Generation 1 on the resolution side and continued to iterate and improve on some of the other areas, with the help of our partners at Microsoft and Valve.
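The "2X the resolution" claim is easy to sanity-check with quick arithmetic. The 2160 x 2160 per-eye figure is from the interview; the 1440 x 1600 per-eye baseline below is an assumption standing in for other leading PC headsets of that era, not HP's stated comparison point:

```python
# Per-eye pixel counts: Reverb figure from the interview, baseline assumed.
reverb_px = 2160 * 2160    # HP Reverb Gen 1, pixels per eye
baseline_px = 1440 * 1600  # hypothetical comparison headset, pixels per eye

print(f"Reverb:   {reverb_px:,} px/eye")
print(f"Baseline: {baseline_px:,} px/eye")
print(f"Ratio:    {reverb_px / baseline_px:.2f}x")  # comes out to about 2x
```

On those assumed numbers the ratio lands right around 2, which is consistent with the "2X" framing; against a different baseline the multiplier would of course differ.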
This is interesting because we’ve been seeing the same goal throughout every interview we’ve done with the entire HP team in this series. These collaborations and partnerships are at such a deep level. That integration with Microsoft and Valve is going to take it to that much deeper a level because you’re tapping into each organization’s deep learning rather than reinventing everything yourself. That’s refreshing, and it’s what we’ve been finding out about here.
Valve has been in the VR business for years; they’ve been making VR optics, researching, and creating. We’re tapping into their knowledge around optics and spatial audio, and they’re bringing to the table their knowledge around the ergonomics of headsets. Microsoft is bringing that best-in-class six-DoF inside-out tracking. With the two companies working together, we were able to make the integration between Windows Mixed Reality and SteamVR pretty seamless. Those are some of the exciting things we have on the headset side.
I want to talk about the backpack because the freedom of that sounds exciting. We’ve gotten to play a little bit with it. We’ve gotten on some of those ones where they have rigs that you’re all attached to in a grid. The ones where you feel like you’re wandering around the room and you’re not sure if you’re okay. The backpack sounds amazing in terms of freedom.
The backpack is a great product. It is designed so that you have the full capability of powerful, high-performance virtual reality and all the freedom and flexibility of being able to move around. You can take a tethered headset like the Reverb, with its 2160 x 2160 per-eye high resolution, wire it into the backpack, and put the backpack on your back. The backpack has an NVIDIA 2080 card, so it’s super powerful, with the latest Intel on the CPU side as well, and you can move around. They’ve been used by location-based entertainment venues to create these fun environments for people to engage with their family and friends in full immersion. On the architecture, engineering, and construction side, or in product dev, there are some amazing examples. You had Stephen on from Theia Interactive; they created one for us. One of our big philosophies at HP is that we drink our own champagne. It’s not just about working with partners and clients and having them do VR projects or products; we also use VR at work and in different workflows.
We were building our new Houston office, and we had Theia create what the office would look like so that the employees would get a sense of their new office, using virtual reality as a preview of what they were going to come to when the office opened. The employees were able to put on our HP VR headset, put on the backpack, and walk around the construction site, seeing in the headset, “This is where the hall is, the offices, the cafeteria.” You could have that experience of imagining what the office would be in the future. It didn’t yet exist, but in virtual reality, it did.
We’re all going to have to re-imagine our offices. This might be a good way of re-imagining the retrofit of those offices.
That’s one of the things VR is powerful at: creating environments that may not yet exist, whether in architecture, engineering, and construction, like we were discussing with Stephen, or in product design, like we were discussing with Shay. Having the ability to make pipelines, workflows, and product design reviews much more efficient and productive is, in the end, saving companies a lot of money and creating strong ROI.
Did anybody record the architectural walkthrough? Is that possible, so we can see it?
I don’t know if there’s an actual video of it, but we can show you visuals of what the virtual office looks like.
That’s just a touch on all the great things that are going on: the real interaction happening not just between HP and the partners you’re working with, but also with the customers and the access to those customers. In some cases, as with Gravity Sketch, the partners are the touchpoints through to the customers working with the tools. How do you use that feedback loop you’re creating to feed the redesigns and inform the next versions?
I’ll give you an example. With the original backpack, we had two versions. One had an NVIDIA GeForce card and was designed for gamers, and one had a Quadro card and was designed for commercial use. The story is that we first made the version of the backpack for gamers and started getting calls from our enterprise clients saying, “I want one of those. We want that too.” We took the first one, which was Omen branded, and created an HP Z branded version for the commercial clients. We got feedback around the card inside, the harness, and the fit. We took that feedback and created the Generation 2 backpack, the HP G2 version. We’re taking all the customers’ feedback on how to improve it and what to make better.
I love that because you’re creating a rich environment with all the partners in it. The industry is moving along with everybody, but you’re also in tune with the end user. That’s creating the proper feedback loop, and you’re not only doing it with early adopters. You’re also doing it with people who are entrenched in the marketplace. They’re building digital manufacturing facilities. They are doing all of these things already. They’re not just early adopters in that fun, gamer way that we saw inform some bad technology in the last decade, which made us all go, “This VR thing is never going to take off.” You have to tap into the people who feel that they could use it every single day.
There’s a similar story on the headset, because we had the original WMR headset. We got a lot of feedback from all different segments, industries, and use cases: people using the headsets to learn, to connect, and to game. What’s interesting is that some of the feedback differs by use case, but a lot of it is similar. We learned what people most want, what they liked about each iteration, and what they wanted us to keep improving. Our Reverb G2 is an amazing product insofar as it takes everything that was great about Generation 1 and then, one by one, rights the things people said, “I wish this were different.” We get that feedback directly from customers in all the different places.
I’m envisioning a world, and I hate to say this because I don’t want to jinx it, but if sequestering ever has to happen again, we’re all going to gather together virtually, and you’re going to be right on the tipping point of that.
We do. I would say those of us who work in the industry, and all of our partners and clients, are already engaged in virtual worlds, and that use has accelerated considerably during this period. On our own team, we hold staff meetings in VR quite often. We give presentations in VR. We may have even done some of the product design and quality review for the headset we’ve been talking about in VR. That’s on the corporate or professional side. On the more personal side of work, we all used to have those water cooler moments, whether going to lunch with someone or walking by them and grabbing a coffee. Now we don’t have that because many of our offices are closed, and those of us lucky enough to be able to work from home don’t get those moments. So we’re creating these VR hangs where we go into some of the social XR spaces and play basketball, go bowling, and play dodgeball together. We can even fly, which is funny, because even if we were all back at the office, we could maybe play basketball or grab a coffee, but we still couldn’t fly. VR gives you that superpower of being able to fly. We even held one of our employees’ retirement parties in virtual reality.
It’s a whole new world we’re talking about.
It’s going to give new meaning to the Minecraft and unicorn dance parties we’ve had to throw for our two daughters. They’re going to get a virtual reality aspect; we’re going to have to up our game. Joanna, thank you so much for joining us, and thank you for bringing the partners in with us so that we could get a fuller view of what’s going on in this XR world and get excited about it again. As these technologies keep moving, whether it’s 3D printing, XR, VR, or anything else, we get a little jaded, like, “We’ve been talking about this forever,” and stop paying attention to how far they’ve come. Thank you for sharing that with us.
Thank you for inviting us all on. It was great. I love what you are doing with the show. Keep it up. It’s excellent.
From VR 3D Design to Production: Creating, Visualizing, and Experiencing Design in Virtual Reality — Final Thoughts
We’ve covered VR in a broad way. We’ve also covered not just VR 3D design, but we’ve covered the creative workflow in those processes and democratizing design with VR.
I was blown away in many ways by each of the interview subjects we talked to. With the first one especially, Shay and Gravity Sketch, I am impressed by what they’ve done, how they’ve done it, why they did it, and the perspective they’re coming from. As a Rhinoceros user myself for many years, I am thrilled at how seamlessly Gravity Sketch integrates with Rhino, among other CAD programs, with direct import and export. I’m dying to use it myself. Unfortunately, I haven’t been able to. We were supposed to try out some of these things and go down to HP in San Diego to do that.
COVID-19 presented some challenges there, and we weren’t allowed to go in and spend time with them, so I can’t speak from personal user experience yet, but I’m dying to. I think about all the different ways people are creating objects in the videos I’ve been through and studied in detail on their YouTube channel. The creative possibilities, and the speed with which you can communicate an idea from your mind’s eye into the computer, are fantastic.
Even the creative workflow between the two of us would be so much better. We could have that push-pull, live discussion about the items before you had to go through so much design engineering to get the forms created, before you could even share it with me so I could have my input and ruin it.
It’s faster using Gravity Sketch before you have me change everything I was doing.
It’s not just about how well you can communicate with your own team, and that’s where Theia comes in. That’s where Stephen excited me: the idea that in a post-COVID world, many of your workers are remote and you’re no longer interacting with many of your clients in person. We’ve known that for almost a decade now. We work more remotely with our clients, our factories, and everyone else than we do in person, and we have for years. We’ve been looking at that for a long time. Anything you can do to make communication about a product or a design clearer in that process means less time and less money lost.
Speeding up the process of getting a product, an environment, or a whole building pipeline into the marketplace is going to be hugely valuable. I love what they’re doing there as well because it creates flow-through from Gravity Sketch at the beginning, on the design side, to Theia in the middle with visualization and communication to the marketplace, the customer, and your team. Then we get into the technology that makes it all happen: the tools and the equipment.
Joanna is amazing, and what’s going on at HP is cool. They’ve got some good things going there. I love that all along the process, everything has been agnostic. Even HP isn’t reinventing everything. They’re working in partnership with everyone so that they’re creating a more robust industry, and the products, hardware, and software that support everything are more universal for everyone. That’s going to make the whole process easier. This is what we found in 3D printing, and now they’re doing it on the VR side as well.
No question. The cooperation and collaboration across disciplines, across different parts of the workflow, and across the hardware and software is refreshing to see. It’s in the best interest of the industry as a whole and of each of the disciplines and people across the entire process.
I am excited about our new XR world.
So am I, and I wasn’t as excited about it as you were when you came back from that VR conference years ago, but I am being convinced, and I’m looking forward to using it myself. I hope you’re as excited as we are. Thank you for reading this blog. I hope you enjoyed it and got a lot out of it, but there’s more.
We’re only halfway through our series. We’ve got more coming, some exciting things that we’re going to shift into and start talking about on the design engineering side. We’re going further and deeper into that. We’re looking forward to being with you for the rest of this series.
Thanks. We’ll be back for the next episode. Talk to you soon.
Get Even More!
- How Gravity Sketch Works
- Workflow Examples: Industrial Design and the Latest Tools
- Co-Creation in Action
- Footwear Design
- Gravity Sketch at F8
- Theia Highlights
- Case Study: Nvidia and the Hard Rock Hotel and Casino
- SteamVR Reverb
More About HP:
Capture and create with Z by HP, inspiring you toward your next creative breakthrough with the Z portfolio, designed and built to improve the way you create. Discover the latest ZBook to help you with your next creative project.
Experience your design with HP Multi Jet Fusion. HP’s Multi Jet Fusion technology and solutions reinvent design and manufacturing, unlocking the full potential of 3D printing and breaking down the barriers to 3D printing adoption across industries through materials innovation. For more details about Multi Jet Fusion technology, click here.