The Ohio State University Corporate Engagement Office


Point Prognostics: Closed Loop Particle Forecasting Platform for Decision Support and System Prognostics

Software & Information Technology
Machine Learning / Artificial Intelligence
College
College of Engineering (COE)
Researchers
Kumar, Mrinal
Yang, Chao
Licensing Manager
Zinn, Ryan
614-292-5212
zinn.7@osu.edu

T2019-029: A scalable, adaptive computational platform (software) that performs accurate, predictive computer simulations in less time, with the end goal of supporting a decision-making agency

The Need:

A report by IoT Analytics valued the predictive maintenance (PdM) market at $1.5 billion in 2016 and anticipated annual growth of 39% to $10.96 billion by 2022. Particle methods (broadly known as Monte Carlo, or MC, methods) are a class of computational algorithms used in PdM to predict the expected behavior of complex systems or processes. MC tools are popular for their simplicity and their scalability to complex problems. The use of fixed-size "particle ensembles," however, leaves the simulations unable to provide performance guarantees when quantifying system uncertainty: there is no way of knowing, except through retrospective evaluation, how accurate a generated forecast is. Because existing MC tools cannot guarantee a desired level of accuracy, users often "over-compute" by running larger ensembles in the hope that they remain adequate over the entire forecasting horizon, or they build an adequate simulation through repeated trial and error. For moderately to highly complex systems, this is time consuming and computationally burdensome, and it diverts resources away from other important tasks. Moreover, over-computing still provides no guarantee of accuracy. There is, therefore, a need for a platform that delivers system forecasts with guaranteed performance on well-defined quantities of interest while consuming the least possible time, so that available resources are used most efficiently.
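
To see why a fixed-size ensemble cannot guarantee accuracy up front, consider the short Python sketch below. It is purely illustrative and not part of the platform; the toy degradation model and the chosen quantity of interest are assumptions. It estimates the same quantity of interest with ensembles of several sizes: the Monte Carlo standard error shrinks only as roughly 1/sqrt(N), so the ensemble size needed for a given accuracy becomes apparent only in retrospect.

    import numpy as np

    rng = np.random.default_rng(0)

    def estimate_qoi(n_particles):
        """Toy forecast: estimate P(degradation state > 2) by Monte Carlo."""
        # Illustrative stand-in for a propagated system state.
        states = rng.lognormal(mean=0.0, sigma=0.5, size=n_particles)
        qoi = np.mean(states > 2.0)                          # MC estimate of the QoI
        std_err = np.sqrt(qoi * (1.0 - qoi) / n_particles)   # shrinks ~ 1/sqrt(N)
        return qoi, std_err

    for n in (100, 1_000, 10_000, 100_000):
        q, e = estimate_qoi(n)
        print(f"N={n:>7}: QoI estimate = {q:.4f}  (std. error ~ {e:.4f})")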

The Technology:

To address this need, researchers in The Ohio State University’s Laboratory for Autonomous and Data-Driven Systems (LADDCS) have created a closed-loop, adaptive particle forecasting framework that delivers guaranteed accuracy in the estimation of system-specific quantities of interest (QoIs). The forecasting system continuously monitors its own performance by measuring the accuracy of its QoI estimates and comparing it with a user-defined accuracy level. When the ensemble-based simulation is found to be underperforming, the platform formulates and solves a sequence of optimization problems that enhance the ensemble’s efficiency. This makes the platform adaptive and yields trustworthy forecasts by keeping the accuracy of QoI prediction within user-defined bounds at all times; the value of this feature manifests as “front-end error control”, with the user at the helm. As an additional, optional feature, when the platform is found to be over-performing, particles can be removed from the ensemble to improve computational efficiency, with each particle’s probability of removal inversely related to its significance to the current ensemble. Together, these features deliver trustworthy, reliable predictions in minimal time, eliminate guesswork, and enable robust decision making.
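
The simplified Python sketch below illustrates this closed-loop idea in miniature. It is not the patented algorithm: the one-step propagation model, the error monitor, the ensemble-growth step (plain resampling rather than the platform's optimization problems), and the significance-weighted thinning rule are all illustrative assumptions.

    import numpy as np

    def closed_loop_forecast(particles, propagate, qoi_and_error, tol,
                             n_steps, rng=None, grow=0.5, slack=0.5):
        """Simplified closed-loop adaptive particle forecast (illustrative only).

        propagate(particles, rng) -> particles    : user-supplied one-step model
        qoi_and_error(particles)  -> (qoi, error) : QoI estimate and error measure
        tol                                       : user-prescribed error bound
        """
        rng = rng or np.random.default_rng()
        history = []
        for _ in range(n_steps):
            particles = propagate(particles, rng)        # forecast one step
            qoi, err = qoi_and_error(particles)          # monitor accuracy

            if err > tol:
                # Under-performing: enlarge the ensemble. (The platform instead
                # solves a sequence of optimization problems at this point.)
                n_new = max(1, int(grow * len(particles)))
                clones = particles[rng.integers(0, len(particles), n_new)]
                particles = np.concatenate([particles, clones])
            elif err < slack * tol:
                # Over-performing: thin the ensemble, removing particles with
                # probability inversely related to a crude significance score
                # (here, distance from the ensemble mean).
                significance = np.abs(particles - particles.mean()) + 1e-12
                p_remove = 1.0 / significance
                p_remove *= 0.2 / p_remove.max()         # cap the removal rate
                keep = rng.random(len(particles)) > p_remove
                if keep.sum() > 10:                      # never empty the ensemble
                    particles = particles[keep]

            history.append((qoi, err, len(particles)))
        return particles, history

Supplied with a propagation model and an error measure for the chosen QoI, such a loop keeps the reported error under the prescribed tolerance by resizing the ensemble at every step, which is the behavior described above as front-end error control.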

Commercial Applications:

  • Prognostics of performance and failure in:
    • aviation (jet) engines, oil-drilling equipment, power-generation plants, electric automobile systems (e.g., useful battery life), manufacturing systems, and reliability of chemical and nuclear processes
  • Decision-making support (e.g., wind farms or structural design)
  • Space Situational Awareness (SSA)

Benefits/Advantages:

  • Self-monitoring capability that tracks the accuracy of its forecasts
  • Self-correcting capability
  • Front-end error control (eliminates guesswork in delivery of accurate forecasts)
  • Scalability of forecast computation
  • Minimum run time for client-prescribed accuracy: cuts run time in half for complex simulations
  • Ability to work with systems of very high complexity: can tackle systems with hundreds of thousands of system variables
  • Ability to generate prescriptive analytics: pinpoints potential failure locations due to its physics-based architecture
  • Ability to improve client’s internal data-driven models by integrating with learning tools
  • Ability to keep customer data safe through use of appropriate communication protocols and encryption in the user interface

Research Interests:

The Ohio State University laboratory that developed this technology has expertise in the quantification of uncertainty in complex engineering systems. Its members perform theoretical, computational, and experimental research in multi-agent autonomous systems, evidential sensor fusion, and randomized algorithms for modeling, uncertainty forecasting, optimization, and control. The lab focuses on applications in scalable prognostics, collaborative autonomous robots, and space situational awareness, and is open to collaboration on further products and lines of investigation.