3DLive on the Apple Vision Pro: Q&A with Tom Acland


3DExcite’s CEO explains how Dassault Systèmes’ visionOS app works and why it’s a crucial part of the next-generation 3DExperience platform.

Dassault Systèmes recently announced 3DLive, an upcoming app for the Apple Vision Pro headset that will bring spatial computing to users of the 3DExperience platform.

Scheduled for release this summer, 3DLive is part of Dassault Systèmes’ next-generation concept of “3D UNIV+RSES”, a strategy that leans heavily on the merging of physical and virtual reality.

To learn more about 3DLive, Engineering.com sat down with Tom Acland, CEO of 3DExcite at Dassault Systèmes. He explained how the visionOS app works, why Dassault chose to collaborate with Apple, and how 3DLive fits into the 3D UNIV+RSES strategy.

Tom Acland, CEO of 3DExcite. (Image: Tom Acland via LinkedIn.)

The following transcript has been edited for brevity and clarity.

Engineering.com: What’s 3DLive all about?

Tom Acland: The release that we’re making in the summer of this year really consists, from a product perspective, of two components. There’s the 3DLive app, which is going to be available on the Apple Vision Pro. It’s the way that people access the information which is published from the 3DExperience platform.

The other half is the ability to create use-case-focused scenarios to help people in business collaborate with each other. And that toolchain is resident on the 3DExperience platform. So using the components which are on the 3DExperience platform, you can aggregate different pieces of the virtual twin which are relevant in the context of a particular use case.

Is that a new app within 3DExperience?

We’re leveraging technology which was already there on the 3DExperience platform, but we’ve been able to extend it to make the experiences that you publish spatially accessible.

Specifically, there’s an app called Creative Experience which is part of the Experience Creator role. And that app has been available for many years already. It’s typically used by engineering teams who need to explain the value of what it is that they’re doing. It’s also available in the 3DExperience Works portfolio as Product Communicator.

[Related: How to use 3DExcite’s Creative Experience]

So Solidworks users will be able to use this tool?

Yeah, and they already use it today.

For what?

You can craft experiences for use in a 2D context. You can also generate portable content from the experience that you’ve created. So for example, if you need to create specific content which you’re going to use on your website, or animations, videos, those things, those can also be generated from the same application and using the same toolchain.

Could you tell me more about 3DExcite?

3DExcite is one of the Dassault Systèmes brands. It helps manufacturers take their products to market. So we help our manufacturing clients express the value of the inventions that they’re coming up with on the 3DExperience platform.

Obviously a big part of that is the storytelling. So how does this particular innovation help the people that it’s designed to serve? In a Solidworks world, for example, where you have people making machine tools, you have a similar challenge. How do I show my customer what it is that I’m developing?

So is 3DLive a marketing tool?

Well, if you think about traditional marketing, that’s often tied up with advertising. But as these products become more sophisticated, for example, more software defined, the way that you show the value of the product to a customer is not just through advertising. You have to be able to illustrate and explain new features.

For example, you’ve just released a product update over the air. You might need some content which appears in the app which goes with the product, so that users can understand this new feature. So you can create advertising content, but you can also create content which is useful for end users, and that’s really the key.

What we’re seeing because of software definition and the speed of change is that it’s increasingly important that you define what the value is for the customer as early as possible. So you could look at this as a way of capturing requirements from a customer-centric perspective. So you’re not just writing things down, you’re modeling what the outcome of that experience is going to be so that you can show it to someone: “Is this what you want?” You can engineer it and then make sure that your engineering matches what you’re aiming for.

So being customer-centric is not just about communication outwards, it’s about communication inwards to everyone who’s building that product, so that everyone understands what it is we’re trying to make.

Dassault Systèmes’ promo video for 3DLive.

How closely did you work with Apple to develop the new app?

The idea goes well back before the collaboration with Apple. But what is special about the technology that Apple has developed for spatial computing is that you have a very powerful set of capabilities on the Apple Vision Pro, in terms of processing, in terms of sensors, in terms of the OS, which allows you to deliver those experiences in a very true-to-life fashion. And they’re easy to use.

The collaboration with Apple goes back over a year. They were actually at 3DExperience World last year. They came to visit us. We’d already started conversations. And it’s been a journey that’s been going on for over a year to work out exactly how 3DExperience can interact with and work with the Apple Vision Pro.

I think people sometimes talk about these things generically as a headset, right? But we see the Apple Vision Pro as not just another headset. It’s a different type of capability, which is a function of the hardware, but also the software which is powering those kinds of experiences. So we don’t really see this as a case of just swapping out one headset for another. The VR thing’s been done before, but this is a next-generation capability for putting people inside the model.

How so? How does this Vision Pro app compare to VR experiences on other headsets?

There are a whole lot of specifics about the Apple Vision Pro capabilities which I’m not going to go into myself, but I’ll tell you about the benefits in terms of what the difference is. If we’re talking about the use cases which are typically addressed in VR today in conjunction with the 3DExperience platform, you’re often talking about design type situations where you’re looking at the exterior shell, the physical design of the product. And that’s typically a function of configuration, materials and geometry.

[Related: Should engineers buy the Apple Vision Pro?]

What we’re doing with the Apple Vision Pro is radically different, because you’re actually looking at all of the facets of the interaction with that thing, including kinematics, including systems information, and putting that in the context of an end user benefit. So it’s a much richer experience that you can create, and you can really get a sense of how the thing that you’re building is going to help the people it’s designed to serve. It’s not just a tool for designers. It’s a tool for everybody who needs to understand the benefit of a particular process or a particular product itself.

So this isn’t an existing capability being ported to a new headset?

No, it’s an entirely new thing. And it’s just the start. The whole idea that we’re trying to address in working with the Apple Vision Pro on the 3DExperience platform is a pillar of gen seven.

[Gen seven refers to 3D UNIV+RSES, “the seventh generation of representation of the world introduced by Dassault Systèmes”.]

So it’s a strategic aspect of the next generation of the 3DExperience platform, which is designed to help people design better products to deliver more value to their customers, but also help customers understand what it is that they’re getting. If you’re selling a robot, for example, the customer may not understand how the robot’s made, but they want to understand how the robot’s going to fit their specific use case. So it’s as much to help the customers understand the value of the product that’s being engineered as it is a tool for the engineer to make a better product.

How will users access the 3DLive app, and what will it cost?

In an enterprise context, if you’re deploying Apple technology, you typically have an enterprise app store. Your devices themselves are often managed through device management, so you have a very similar experience to what you would have as a consumer, but the applications available to you as a user of an enterprise are curated by your IT department. And that’s using standard Apple technology for making iPhones, iPads, etc. part of the enterprise ecosystem.

So the app is going to be available to people by those means, on the enterprise app store for companies who’ve deployed this process. And there is no additional charge expected for having that app available in that way. Sign in through your 3DExperience ID and it’s up and running.

How you then discover those experiences, how they’re organized, is part of the value of the process. It’s not just the experience itself, it’s how you access it in context, so that people who are part of that work group can look at the things that they need to see together.

Do you plan to bring this technology to other XR headsets akin to the Apple Vision Pro, like Samsung’s Project Moohan?

The idea of spatial computing—or sense computing, as we call it, because we think it could become broader in the next 20 years—is still an emerging field. So there may be other technologies by Apple or by other people which are relevant. And of course we want to embrace the best of the market to be able to execute on the Dassault Systèmes vision for sense computing.

That said, there’s something unique about the level of integration in the Apple stack. This is my personal view. If you are able to combine that very, very sophisticated hardware with the OS, with the experiences that are deployed to that device, you can achieve completely different things than when you have, let’s say, an ecosystem where the OS is separate from the device.

The ability to create that sense of stability, where everything is locked in place, is what you need if you want to, say, walk up to a machine and press a button and the virtual system responds in the right way. That’s very, very hard to achieve if you have dozens of different devices all nominally conforming to a spec. So we see that the technology that Apple has brought to market is at the moment leading not just because of the hardware that’s inside, but because of the approach. It’s because of the fact that you’ve got that close integration between the software and the hardware on the device. It allows you to do completely different things. And we don’t really see too many other companies at the moment with that level of capability.

So we’ll see what happens with the space. It’s likely to evolve, and there’ll be new types of devices, but obviously we want to work with the ones that actually achieve the objectives of Dassault Systèmes and 3DExperience.

You gave the example of walking up to a machine and pressing a button. Is that a capability of this app?

Yes. In one of the demos there’s a training scenario that’s an example of how a maintenance engineer who’s designing maintenance procedures would create a little boot camp for an operator to run through that procedure virtually. And you can imagine that if you’re trying to get a new line stood up, or you’re trying to turn over a line, or even an entire factory, there are going to be hundreds of those specific use cases. And in that environment, it’s very important that you have a sense of being in the place and things behave the way that they’re going to behave.

So yeah, if there is an actuator in the context of a particular instruction that you’ve got to work through, that will be active, and you’ll be able to interact with it like in the real world. Likewise, if you have a screen in there that’s going to show you your work instruction, for example, it’ll have the actual work instruction that you’re going to encounter in the real world. So you’re really trying to give people a sense of proximity to the real world so that they really understand what it is they’re going to do.

If I had something like a TV stand in the app, could I go up to it and move it up and down?

Yes, absolutely. Kinematics is one of the things that makes a big difference in terms of traditional digital content creation versus the approach we’re taking here. Because the way that you make the experience is derived straight from the CAD, it has all of the kinematics and so on available to it, to make sure that the way those things are represented is true to the engineering.

And it’s quite possible that there’s a bit of back and forth between the engineering team and the customer. Things change. You don’t want to have to go back to the start again, export all the CAD again, go through that loop, which typically takes a long time for every single engineering change. You want to be able to just update that specific aspect, like the kinematics, and then it’ll be available to you within minutes to be able to show that update to the customer.

If I’m in the headset and my colleague next to me updates the model, will that change propagate to 3DLive?

One of the other aspects of gen seven is the virtual companion. Virtual companion is about giving people superpowers through the use of generative AI or AI in general, but also about being able to automate processes that previously were done manually.

So the objective is to do exactly as you described: those processes which are already repeatable and manageable can also be automated, so that you can essentially run them in the cloud fully automatically.

I can’t tell you that’s all going to be there in the summer, but that’s exactly the intent. Once you’ve created those scenarios and you’ve created the relationships between those scenarios and the CAD, you don’t have to come in every time and run it again manually.

What about collaboration? Could both of us be in a headset and work on the same thing at the same time?

That will be available at the release in the summer. There are still a few kinks being worked out there, but that is absolutely the idea. It’s one of the things that we see as being most in demand in those immersive environments, the ability to be colocated in a virtual space with somebody else.

You use the term sense computing. How do you see different senses being incorporated into spatial computing?

We don’t know 100% yet. I think touch haptics is probably next, in terms of being able to convey the idea of surface texture. Smell, I’m not so sure. That’d be kind of cool, but we’ll see how long it takes us to get there.

What else excites you about 3DLive?

I think it’s the direction of where spatial computing is going and why it’s important to see spatial computing as a function of virtual twins.

At the moment we’re talking mostly about creating virtual representations of something which is going to arrive in the future. But you can reverse the polarity of that. In the future, you’ll be able to superimpose on the real world things which are coming from the virtual, so you’ll be able to actually explain to people how devices or how products are composed, how they work, by reverse engineering the real world and getting back to where the information came from.

I think that is a super exciting outlook, because you’re not just talking about going from virtual to real. You’re talking about going from real to virtual as well. And to do that you need to be able to create continuity from the virtual to the real. The devices are able to recognize things precisely because they’ve been trained on the information which is inherent in the virtual twin.

In order to be able to do that recognition, you need to have a well-defined model, which will allow these spatial computing devices generally to identify objects and then associate them with information that isn’t necessarily immediately visible.

So you’re going to see the virtual and the physical worlds kind of blend together, not just in terms of engineering and design, but in terms of use, and maybe in terms of circularity. Like, what else could that thing be if I were to deconstruct it? What elements of that could I take out? How could I recycle and how could I use them some other way? That’s part of what our purpose is, to make sure that there’s more value out of less resources that get consumed.

How does 3DLive fit into the concept of 3D UNIV+RSES?

I think the underpinning construct is the idea of moving from data up to representation of knowledge. CAD or IoT information, for example, unless it’s contextualized in a scenario which is meaningful to somebody, remains a little bit abstract, which makes it difficult to leverage. What does it mean semantically? Not just as a number, but what does it mean? And also, how is that knowledge used by people to create something new? And that’s the know-how element that occurs when people work together around a set of known concepts.

So you’re not modeling just the product. You’re talking about how the product interacts with other products and people in the context of its use. What happens to it in the real world? What can we learn from its actual interactions with the real world to make the design better? And that means we have to model the context to a certain extent at the same level of fidelity as we would have typically modeled the product in the past. And that’s quite an exciting new era, because we’re going to be modeling factories, we’re going to be modeling hospitals, we’re going to be modeling any place where these products add value to people’s lives, not just the products themselves. And I think that’s a sort of a step change in how we think about designing things for the real world.

So there’s a lot going into gen seven, which is about elevating what’s been done so far on the 3DExperience platform into the era of AI by adding meaning to data through experiences like we’ve been talking about with sense computing. And I think this is going to be quite an exciting journey as these things evolve all around us.


