
Kalman Filter Auto-tuning through Enforcing Chi-Squared Normalized Error Distributions with Bayesian Optimization (2306.07225v1)

Published 12 Jun 2023 in cs.RO, cs.SY, and eess.SY

Abstract: The nonlinear and stochastic relationship between noise covariance parameter values and state estimator performance makes optimal filter tuning a very challenging problem. Popular optimization-based tuning approaches can easily get trapped in local minima, leading to poor noise parameter identification and suboptimal state estimation. Recently, black-box techniques based on Bayesian optimization with Gaussian processes (GPBO) have been shown to overcome many of these issues, using normalized estimation error squared (NEES) and normalized innovation error squared (NIS) statistics to derive cost functions for Kalman filter auto-tuning. While reliable noise parameter estimates are obtained in many cases, GPBO solutions obtained with these conventional cost functions do not always converge to optimal filter noise parameters and lack robustness to parameter ambiguities in time-discretized system models. This paper addresses these issues by making two main contributions. First, we show that NIS and NEES errors are only chi-squared distributed for correctly tuned estimators. As a result, chi-squared tests alone are not sufficient to ensure that an estimator has been correctly tuned. We therefore extend the familiar NIS and NEES consistency tests to also penalize deviations of the error statistics from the expected chi-squared distribution. Second, this cost measure is applied within Student-t process Bayesian optimization (TPBO) to achieve robust estimator performance for time-discretized state-space models. The robustness, accuracy, and reliability of our approach are illustrated on classical state estimation problems.
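The NIS consistency check at the heart of the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's method; it is a hypothetical scalar random-walk example showing the basic fact the first contribution builds on: when the filter's assumed noise covariances match the true ones, the per-step NIS values follow a chi-squared distribution with 1 degree of freedom (mean 1), and a mistuned filter pushes the NIS mean away from that value. All names and the model choice are illustrative assumptions.

```python
import numpy as np

def run_filter(q, r, zs, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter; returns the per-step NIS values.

    Assumed model (for illustration only):
        x_k = x_{k-1} + w,  w ~ N(0, q)
        z_k = x_k + v,      v ~ N(0, r)
    NIS_k = nu_k^2 / S_k, chi-squared with 1 DOF when the filter's
    q and r match the true noise statistics.
    """
    x, p = x0, p0
    nis = []
    for z in zs:
        p = p + q            # predict: covariance grows by process noise
        nu = z - x           # innovation
        s = p + r            # innovation covariance
        nis.append(nu * nu / s)
        k = p / s            # Kalman gain
        x = x + k * nu       # update state
        p = (1.0 - k) * p    # update covariance
    return np.array(nis)

# Simulate the true system.
rng = np.random.default_rng(0)
q_true, r_true, n = 0.1, 0.5, 2000
x = np.cumsum(rng.normal(0.0, np.sqrt(q_true), n))
zs = x + rng.normal(0.0, np.sqrt(r_true), n)

# Well-tuned filter: mean NIS sits near the 1-DOF chi-squared mean of 1.
nis_good = run_filter(q_true, r_true, zs)
# Overconfident (mistuned) filter: mean NIS inflates well above 1.
nis_bad = run_filter(q_true * 0.01, r_true * 0.01, zs)
print(nis_good.mean(), nis_bad.mean())
```

The paper's first contribution goes a step further than this mean check: matching the chi-squared *mean* is necessary but not sufficient, so its cost function also penalizes mismatch between the empirical NIS/NEES distribution and the chi-squared shape.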
