Pushing artificial intelligence ahead: the practical business value of cognitive visualizations
Industrial Internet and edge computing
An important advantage of cognitive visualizations is the immediate contextualization of data. That capability applies to evaluating the various components of live IT systems, as applications in the Industrial Internet exemplify. Mizufuka says, “AR glasses enable data to have contextual relevance, meaning a dashboard can ‘live’ over a physical pump or piece of equipment. This will enable a field worker to visually scan a plant and see KPIs over physical equipment.”
The immediacy with which such visualizations indicate whether a system component is functioning properly is unparalleled among field applications. The approach is also useful within the enterprise. Haggar references a company involved with security software that uses AR technologies for “data visualization around security threats to your exterior network.”
The capability to rapidly visualize data takes on renewed significance for edge computing, which, Boris says, “is often complementary to some machine or device, say a complex inspection machine in an auto assembly plant.” Critical visualizations provide immediate visibility into data assets without transporting them to a centralized cloud, which can introduce latency and bandwidth concerns. Despite the elaborate design of certain headsets, smart glasses are widely considered wearable devices that are expanding the Internet of Things while emphasizing its edge capabilities. “VR and, more so, AR can be viewed as the edge compute that attaches to the carbon-based asset called the human, forming the #IIoP or Industrial Internet of People,” Boris says.
XR may well become the proverbial “killer app” the IoT needs to attain mainstream adoption. The natural correlation between AI, the IoT and cognitive visualizations is underscored by Mizufuka who says, “AR headsets will be the browser for the Internet of Things.”
Use cases for cognitive visualizations via edge computing applications in the Industrial Internet include troubleshooting a manufacturing line attended by numerous pieces of equipment all generating sensor data. When trying to determine any broad number of causes for failure, “being able to have data that you can get to quickly with a device that doesn’t require you to take yourself away from whatever it is that you’re doing is really useful,” Wheelwright says.
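The edge pattern described above can be sketched in a few lines: compute a KPI locally from raw sensor readings and surface only the out-of-range equipment to the worker's overlay, rather than shipping every reading to a central cloud. This is a minimal illustration, not any vendor's API; all names, values, and thresholds are hypothetical.

```python
from statistics import mean

def kpi_summary(readings, low, high):
    """Summarize a window of sensor readings into a single KPI record."""
    avg = mean(readings)
    return {
        "avg": round(avg, 2),
        # what an AR overlay might color green (healthy) or red (failing)
        "in_range": low <= avg <= high,
    }

def flag_anomalies(equipment_streams, low=40.0, high=80.0):
    """Return only the equipment whose KPI falls outside the healthy band."""
    summaries = {
        name: kpi_summary(readings, low, high)
        for name, readings in equipment_streams.items()
    }
    return {name: s for name, s in summaries.items() if not s["in_range"]}

streams = {
    "pump_a": [55.1, 56.3, 54.8],   # healthy
    "pump_b": [91.0, 93.5, 92.2],   # running hot
}
print(flag_anomalies(streams))  # only pump_b would be forwarded or highlighted
```

The design point mirrors the quote: the filtering happens at the edge, so the worker sees the relevant KPI in place without the round trip to a centralized system.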
Of all the data management mainstays cognitive visualizations improve, data modeling is likely the most pervasive. In general, cognitive visualizations affect data management by delivering macro-level benefits that are typically realized within the microcosm of individual IT systems. That relationship is most clearly expressed in XR’s data modeling effects.
Sony Green, head of business development for Kineviz, says, “There’s two different avenues as we see it: There’s the simulation and the abstraction [of data models].” In both of those use cases, cognitive visualizations furnish 3-D perspectives, as opposed to standard 2-D models. Simulation is a powerful tool for industrial applications and fields such as construction, automotive and even aviation, enabling users to better visualize products for any assortment of purposes.
Haggar describes a state railway use case in which 3-D modeling is influential for retrofitting locomotive cockpits. Utilizing AR capabilities, workers create 3-D models of a particular cockpit by scanning it and determining, for instance, where to position a new LCD screen. “That process, which they would do with regular pictures and tape measures, would take three to five days,” Haggar says. “Using AR, that whole thing now takes less than one day.”
However, 3-D abstractions are perhaps even more utilitarian because they can abet conventional data modeling by providing visibility into intricate data elements that are otherwise difficult to visualize. Green details a case in which algorithms were used for bioinformatics data for “separating clusters and being able to see them in 3-D and then being able to stand inside of that. You get these sorts of natural structures that correlate to the relationships between the different data points.” The same approach is useful for both creating data modeling schema and adjusting them to include new requirements or sources.
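The 3-D abstraction Green describes can be approximated with a standard dimensionality-reduction step: project high-dimensional data down to three principal components so that clusters which overlap on a flat chart separate visibly in 3-D space. The sketch below uses plain PCA via singular value decomposition on synthetic data; it is an illustration of the general technique, not Kineviz's implementation.

```python
import numpy as np

def project_to_3d(X):
    """Center the data and project it onto its first three principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered matrix yields the principal directions in Vt
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    # n_samples x 3 coordinates, ready to render as a 3-D point cloud
    return Xc @ Vt[:3].T

rng = np.random.default_rng(0)
# two synthetic clusters in 10 dimensions, standing in for bioinformatics data
a = rng.normal(0.0, 0.5, size=(50, 10))
b = rng.normal(3.0, 0.5, size=(50, 10))
coords = project_to_3d(np.vstack([a, b]))
print(coords.shape)  # (100, 3)
```

Rendered in an immersive scene, those three coordinates become the “natural structures” a viewer can stand inside, and the same projection can be rerun as new sources are added to the model.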