How it works
XRStar is a proprietary simulation platform that gives developers full control over every phase of the development pipeline.
Starting from references such as images, videos, laser-scanning data and CAD files, we create all the 3D models optimized for AR/VR. CAD data can be imported directly into the platform. The 3D content is developed to be usable independently of the final platform (VR or AR).
- Team operation roles
- Caching system
- VR/AR interoperability
- Interface with external systems
- Same content for AR and VR
- CAD import
- Blended experience
- Shared experience
The main module involved in a simulation is the instructor station, a coordinator that holds all the scene data and can modify the behavior of the actors involved in the simulation to reproduce real situations exactly.
The augmented-reality and virtual-reality clients connect to the instructor station and, thanks to a sophisticated caching system, download all the data needed for the simulation at runtime and save it locally. In this way the complexity is moved from the front-end clients to the back end.
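The client-side caching described above can be sketched as a download-once store: the client asks the instructor station for an asset only when it is not already present locally. This is a minimal illustration, not the actual XRStar API; the class and function names are assumptions.

```python
import hashlib
from pathlib import Path


class AssetCache:
    """Illustrative client-side cache: each simulation asset is
    downloaded from the coordinator once, then served from local
    storage on every later request."""

    def __init__(self, cache_dir, fetch_fn):
        self.cache_dir = Path(cache_dir)
        self.cache_dir.mkdir(parents=True, exist_ok=True)
        # fetch_fn stands in for a network call to the instructor station.
        self.fetch_fn = fetch_fn

    def _path_for(self, asset_id):
        # Hash the id so any asset name maps to a safe file name.
        return self.cache_dir / hashlib.sha256(asset_id.encode()).hexdigest()

    def get(self, asset_id):
        path = self._path_for(asset_id)
        if path.exists():                  # cache hit: no network round-trip
            return path.read_bytes()
        data = self.fetch_fn(asset_id)     # cache miss: ask the back end
        path.write_bytes(data)
        return data
```

With a scheme like this, a headset that rejoins the same simulation pays the download cost only for assets it has never seen, which is what moves the heavy lifting to the back end.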
Because a role is defined for each client, users with different skill levels (beginner, skilled, supervisor) can be connected to the same simulation. In particular, the supervisor role allows an expert user to constantly monitor the work performed by less experienced users.
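Per-client roles of this kind are typically backed by a capability table. The sketch below shows the idea; the role names come from the text, while the action names and the `can` helper are illustrative assumptions, not the platform's real API.

```python
from enum import Enum


class Role(Enum):
    BEGINNER = "beginner"
    SKILLED = "skilled"
    SUPERVISOR = "supervisor"


# Illustrative capability table: which actions each role may perform.
CAPABILITIES = {
    Role.BEGINNER:   {"run_guided_procedure"},
    Role.SKILLED:    {"run_guided_procedure", "run_free_procedure"},
    Role.SUPERVISOR: {"run_guided_procedure", "run_free_procedure",
                      "observe_any_client", "annotate_session"},
}


def can(role, action):
    """True if a client with this role is allowed to perform the action."""
    return action in CAPABILITIES[role]
```

The supervisor's extra capabilities (here, observing any connected client) are what let an expert monitor less experienced users in the same session.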
The XRStar platform can be interfaced with external systems such as PLCs using different protocols, like REST or OPC-UA. In this way, data and signals can be received directly from the real system and injected into the VR/AR simulation to reproduce the exact behavior of the real environment at runtime.
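Injecting external signals amounts to mapping each tagged value from the real system onto the corresponding simulated actor. A minimal sketch, assuming a JSON payload polled from a REST gateway and a scene object exposing a `set_signal` method (both the payload shape and the method name are illustrative):

```python
import json


def inject_signals(payload_json, simulation):
    """Map a JSON payload from a real system (e.g. polled from a PLC
    gateway's REST endpoint) onto the simulated scene.

    `simulation` is any object with a set_signal(tag, value) method --
    an illustrative stand-in for the platform's scene API."""
    payload = json.loads(payload_json)
    for signal in payload["signals"]:
        # Each signal carries the tag of a real device and its live value.
        simulation.set_signal(signal["tag"], signal["value"])
```

In production this would run continuously (or use OPC-UA subscriptions instead of polling), so the simulated plant tracks the real one at runtime.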
The platform also allows simulations to be used to evaluate the ergonomics of environments and systems through the blended experience.
A VR experience can be hard to describe to those who aren’t wearing the headset, and regular 2D videos shot from the user’s first-person perspective don’t do justice to the real experience.
The blended experience, using a green screen, makes it possible to share the VR experience directly and to evaluate, from a second point of view, the user’s level of immersion and the ergonomics of the VR scene.
VR allows the creation of a photo-realistic, three-dimensional virtual copy of a real environment, such as a building, industrial plant or piece of equipment. Users can execute in real time all the actions they could perform in the real world, simulating the operation of different equipment and processes.
This digital twin is a bridge between the physical and digital world, a complete virtual model of a process, product or service.
This pairing of the virtual and physical worlds enables training, data analysis and a deeper understanding of systems.
- 3D VR engine
- Dynamic chemical-physical simulation
- Mechanical-physical simulation
- Immersivity and stereoscopy
- Fluid/gas dynamic simulation
- Fire simulation
- Volumetric weather conditions
Our 3D Augmented Reality systems are cutting-edge technologies that enable effective industrial solutions. They enable the creation of applications which manage information layers made of 3D assets enriched with images, charts, video and other elements.
Augmented Reality systems can be applied to industrial plants and product design, as well as in many other fields (medicine, research, communication, etc.). Operators gain access to valuable information and are supported in activities and task completion, e.g. maintenance, repair and overhaul (MRO).
We support different AR devices, in particular Microsoft HoloLens versions 1 and 2. Using the HoloLens SDK in our platform, we have developed and customized different interaction modes and features.
We support all the basic HoloLens input gestures and personalized voice commands for every need. We have also developed modules for barcode/QR-code scanning and image recognition for marker-based applications.
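A personalized voice command is, at its core, a phrase bound to an application action. The dispatcher below illustrates that binding in a platform-neutral way; it is a sketch under stated assumptions, not the HoloLens SDK's actual speech API.

```python
class VoiceCommandRouter:
    """Illustrative dispatcher: map recognized phrases to callbacks,
    the way a custom voice command is bound to an application action."""

    def __init__(self):
        self._commands = {}

    def register(self, phrase, callback):
        # Store phrases case-insensitively so recognition casing
        # does not matter.
        self._commands[phrase.lower()] = callback

    def on_recognized(self, phrase):
        """Called by the speech-recognition layer with the heard phrase.
        Returns True if the phrase matched a registered command."""
        callback = self._commands.get(phrase.lower())
        if callback is None:
            return False       # phrase not registered; ignore it
        callback()
        return True
```

The same pattern extends to QR-code results or recognized markers: the recognition layer emits an identifier, and a table maps it to the content or behavior to trigger.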
Projected reality (PR) renders virtual objects directly within or on the user’s physical space. A key benefit of PR is that the user does not need to wear a head-mounted display. Instead, with the use of spatial displays, wide-field-of-view and possibly high-resolution images of virtual objects can be integrated directly into the environment. For example, the virtual objects can be realized by using digital light projectors to paint 2D/3D imagery onto real surfaces, or by using built-in flat panel displays.
The projection technology can turn any space into a kind of interactive display by projecting images on different surfaces and objects.
Through projected reality, every object can become a canvas, creating a visual narrative.
Our “3D Assisted Reality” (ASR) solution enables the delivery of real-time remote support to Field Operators for interventions and operations through voice- and video-call features.
The Remote Center supports Field Operators by delivering information on which tasks and activities have to be executed and completed.
Operators receive and visualize instructions directly on dedicated devices (wearables) and can execute tasks and operations more effectively and efficiently.
We have developed our platform SDK to allow third parties to build scenarios and training procedures natively in XRStar.
Thanks to the SDK it is also possible to modify the native components and objects of our platform to extend their behavior and customize each solution according to the customer’s needs.
Continuous updates to the SDK keep users up to date with the new features introduced in the platform, which they can use to customize their solutions.
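To make the idea of building a training procedure through an SDK concrete, here is a hypothetical sketch of a scenario as an ordered list of steps, each completed when a check passes on the current scene state. None of these names come from the real XRStar SDK; they are illustrative assumptions only.

```python
class Step:
    """One step of a training procedure: a description for the trainee
    and a predicate that decides whether the step is done."""

    def __init__(self, description, check):
        self.description = description
        self.check = check


class Scenario:
    """Illustrative training scenario: an ordered list of steps,
    each completed when its check passes on the scene state."""

    def __init__(self, name):
        self.name = name
        self.steps = []

    def add_step(self, description, check):
        self.steps.append(Step(description, check))
        return self  # allow chained scenario-building calls

    def progress(self, state):
        """Return how many consecutive steps are completed, in order."""
        done = 0
        for step in self.steps:
            if not step.check(state):
                break
            done += 1
        return done
```

A third party could then assemble a procedure by chaining `add_step` calls, and the platform would report progress as the trainee manipulates the scene.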