
Predictive Process Model Monitoring using Recurrent Neural Networks (2011.02819v3)

Published 5 Nov 2020 in cs.LG

Abstract: The field of predictive process monitoring focuses on case-level models to predict a single specific outcome such as a particular objective, (remaining) time, or next activity/remaining sequence. Recently, a longer-horizon, model-wide approach has been proposed in the form of process model forecasting, which predicts the future state of a whole process model through the forecasting of all activity-to-activity relations at once using time series forecasting. This paper introduces the concept of \emph{predictive process model monitoring} which sits in the middle of both predictive process monitoring and process model forecasting. Concretely, by modelling a process model as a set of constraints being present between activities over time, we can capture more detailed information between activities compared to process model forecasting, while being compatible with typical predictive process monitoring objectives which are often expressed in the same language as these constraints. To achieve this, Processes-As-Movies (PAM) is introduced, i.e., a novel technique capable of jointly mining and predicting declarative process constraints between activities in various windows of a process' execution. PAM predicts what declarative rules hold for a trace (objective-based), which also supports the prediction of all constraints together as a process model (model-based). Various recurrent neural network topologies inspired by video analysis tailored to temporal high-dimensional input are used to model the process model evolution with windows as time steps, including encoder-decoder long short-term memory networks, and convolutional long short-term memory networks. Results obtained over real-life event logs show that these topologies are effective in terms of predictive accuracy and precision.

Summary

  • The paper introduces a novel predictive process model monitoring approach (PPMM) that uses RNNs to forecast process constraints in execution windows.
  • It details two architectures, encoder-decoder LSTMs and ConvLSTMs, achieving over 90% precision in predicting declarative process constraints.
  • The evaluation on BPI Challenge datasets demonstrates practical efficiency and highlights ConvLSTMs’ superior performance despite increased computational demands.

Predictive Process Model Monitoring using Recurrent Neural Networks

The paper "Predictive Process Model Monitoring using Recurrent Neural Networks" (2011.02819) advances the domain of predictive process monitoring, introducing an innovative approach known as Predictive Process Model Monitoring (PPMM). This essay details the methodology, implementation, and evaluation of this technique, focusing on its performance in practical applications using recurrent neural networks.

Introduction to the Methodology

PPMM bridges the gap between predictive process monitoring, which focuses on individual case predictions, and process model forecasting, which offers model-wide state predictions. The core idea is to represent a process as a set of declarative constraints between activities that evolves over time, enabling detailed and robust predictions through recurrent neural networks (RNNs). The technique, Processes-As-Movies (PAM), jointly mines and predicts declarative constraints across process execution windows.

PPMM leverages constraints expressed in Linear Temporal Logic (LTL) and modelled via Declare templates, building a detailed feature space conducive to machine learning. RNNs, specifically encoder-decoder LSTMs and convolutional LSTMs (ConvLSTMs), analyze these features to predict the constraints of subsequent execution windows.
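As a rough sketch of how Declare templates can be turned into a binary feature space, the toy example below checks two common templates, response and co-existence, over a small trace. The three-activity alphabet and the two template checkers are simplified stand-ins for illustration, not the paper's actual template set or mining procedure.

```python
from itertools import permutations

ACTIVITIES = ["A", "B", "C"]  # hypothetical activity alphabet

def response_holds(trace, a, b):
    """Declare 'response(a, b)': every occurrence of a is eventually followed by b."""
    return all(b in trace[i + 1:] for i, x in enumerate(trace) if x == a)

def coexistence_holds(trace, a, b):
    """Declare 'co-existence(a, b)': a occurs in the trace iff b occurs."""
    return (a in trace) == (b in trace)

def window_features(trace):
    """Binary feature vector: one entry per (template, ordered activity pair)."""
    feats = []
    for a, b in permutations(ACTIVITIES, 2):
        feats.append(int(response_holds(trace, a, b)))
        feats.append(int(coexistence_holds(trace, a, b)))
    return feats

# 6 ordered pairs x 2 templates = 12 binary features for this window
print(window_features(["A", "B", "A", "C"]))
```

In the full approach, one such vector is computed per window of a trace, so a running case becomes a sequence of constraint vectors that the RNN can consume.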

Implementation Details

Feature Generation

Process traces are divided into windows, and each window is mined for LTL constraints, yielding a feature space that captures process evolution. Feature vectors derived from the presence or absence of these constraints are fed into the RNNs. The constraints include common declarative constructs such as absence, chain response, and co-existence.
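The windowing step can be sketched as follows: a trace is split into contiguous windows, and each window becomes an activity-by-activity "frame" with one channel per template, in the spirit of the Processes-As-Movies view. The `responded_existence` checker and the even-split strategy are illustrative assumptions, not the paper's exact mining procedure.

```python
import numpy as np

def responded_existence(sub, a, b):
    """Declare 'responded existence(a, b)': if a occurs in the window, b does too."""
    return a not in sub or b in sub

def split_into_windows(trace, n_windows):
    """Split a trace into n_windows contiguous, near-equal sub-traces."""
    bounds = np.linspace(0, len(trace), n_windows + 1, dtype=int)
    return [trace[bounds[i]:bounds[i + 1]] for i in range(n_windows)]

def trace_tensor(trace, n_windows, activities, templates):
    """Build a (n_windows, |A|, |A|, |templates|) binary tensor: entry
    (w, i, j, k) is 1 when template k holds between activities i and j
    in window w -- one 'frame' per window."""
    t = np.zeros((n_windows, len(activities), len(activities), len(templates)))
    for w, sub in enumerate(split_into_windows(trace, n_windows)):
        for i, a in enumerate(activities):
            for j, b in enumerate(activities):
                if i == j:
                    continue
                for k, holds in enumerate(templates):
                    t[w, i, j, k] = holds(sub, a, b)
    return t

tensor = trace_tensor(list("ABACBC"), 3, ["A", "B", "C"], [responded_existence])
print(tensor.shape)  # (3, 3, 3, 1)
```

Stacking the frames along the window axis gives exactly the kind of temporal, high-dimensional input that the video-inspired architectures below expect.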

Neural Network Topologies

Two neural network architectures are extensively employed:

  1. Encoder-Decoder LSTMs: This architecture reduces the input’s dimensionality through a series of encoding layers, and then reconstructs the desired output through corresponding decoding layers. It is adept at modelling sequences of fixed-length constraint vectors.
  2. Convolutional LSTMs: Exploiting convolutional layers alongside LSTMs enables the network to process high-dimensional inputs, maintaining spatial correlations over time. This is particularly advantageous for the PAM approach, mimicking the convolutional techniques used in video analysis.

Both approaches are benchmarked on real-life event logs from the BPI Challenge 2012 and 2017 datasets.

Evaluation

Comparative Performance

Performance in predicting constraints was evaluated using metrics such as area under the ROC curve (AUC), precision-recall (PR) curves, and F-score. The ConvLSTM architecture generally outperformed the encoder-decoder LSTMs, achieving higher average precision (over 90% in certain settings), which indicates a superior ability to handle the high-dimensional, grid-structured constraint representations.
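Since each cell of the predicted constraint grid is a binary label, these metrics reduce to standard multi-label scoring. The sketch below computes them with scikit-learn on a small hypothetical vector of ground-truth constraint activations and predicted probabilities (the numbers are made up for illustration).

```python
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score, f1_score

# Hypothetical ground truth and predicted probabilities for one predicted
# window, flattened from the activity x activity x template grid.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_prob = np.array([0.9, 0.2, 0.8, 0.35, 0.3, 0.1, 0.7, 0.55])

auc = roc_auc_score(y_true, y_prob)               # ranking quality
ap = average_precision_score(y_true, y_prob)      # area under the PR curve
f1 = f1_score(y_true, (y_prob >= 0.5).astype(int))  # thresholded at 0.5
print(round(auc, 4), round(ap, 4), round(f1, 4))
```

PR-based scores are the more informative choice here: most activity pairs satisfy no constraint in a given window, so the label matrix is sparse and plain accuracy would be dominated by easy negatives.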

Constraint-specific Results

The evaluation revealed that unary constraints (e.g., absence, init) are predicted with high accuracy, underscoring the model's proficiency in capturing fine-grained process details. ConvLSTMs, in particular, demonstrated robust performance across different window sizes and window counts, although encoder-decoders offered competitive results in more focused scenarios.

Computational Efficiency

The neural networks were tested on standard desktop setups, delivering timely computations suitable for practical applications. Despite ConvLSTMs demanding more computational resources, their enhanced prediction accuracy justifies the trade-off in most applications.

Conclusion

PPMM offers a novel hybrid approach to predictive process monitoring, effectively balancing model-wide temporal forecasting with case-level prediction. Its architecture, particularly with ConvLSTMs, promises scalable and precise constraint-based predictions of process models.

Future research could refine feature generation strategies, explore more nuanced windowing techniques, and further optimize neural network architectures. Deploying this methodology in real-time scenarios could also unveil further insights into its efficacy and benchmark improvements in predictive process monitoring tasks.
