[1] LECUN Y,BENGIO Y,HINTON G.Deep learning[J].Nature,2015,521(7553):436-444.
[2] BENGIO Y.Learning deep architectures for AI[M].Boston:Now Publishers Inc,2009:1-127.
[3] HOCHREITER S,SCHMIDHUBER J.Long short-term memory[J].Neural Computation,1997,9(8):1735-1780.
[4] YIN W,KANN K,YU M,et al.Comparative study of CNN and RNN for natural language processing[J/OL].[2022-06-10].https://arxiv.org/abs/1702.01923.
[5] LI S,LI W,COOK C,et al.Independently recurrent neural network (IndRNN):building a longer and deeper RNN[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.Salt Lake City:[s.n.],2018:5457-5466.
[6] FU R,ZHANG Z,LI L.Using LSTM and GRU neural network methods for traffic flow prediction[C]//2016 31st Youth Academic Annual Conference of Chinese Association of Automation (YAC).Wuhan:IEEE,2016:324-328.
[7] CHUNG J,GULCEHRE C,CHO K H,et al.Empirical evaluation of gated recurrent neural networks on sequence modeling[J/OL].[2022-06-10].https://arxiv.org/abs/1412.3555.
[8] SHEWALKAR A.Performance evaluation of deep neural networks applied to speech recognition:RNN,LSTM and GRU[J].Journal of Artificial Intelligence and Soft Computing Research,2019,9(4):235-245.
[9] VASWANI A,SHAZEER N,PARMAR N,et al.Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems.Long Beach:[s.n.],2017:6000-6010.
[10] KNUDSEN E I.Fundamental components of attention[J].Annual Review of Neuroscience,2007,30(1):57-78.
[11] ZAHEER M,AHMED A,SMOLA A J.Latent LSTM allocation:joint clustering and non-linear dynamic modeling of sequence data[C]//34th International Conference on Machine Learning.Sydney:[s.n.],2017:3967-3976.
[12] CUI Y Q,HE Y,TANG T T,et al.A new target tracking filter based on deep learning[J].Chinese Journal of Aeronautics,2022,35(5):11-24.
[13] BECKER P,PANDYA H,GEBHARDT G,et al.Recurrent Kalman networks:factorized inference in high-dimensional deep feature spaces[C]//36th International Conference on Machine Learning.Long Beach:[s.n.],2019:544-552.
[14] AL BITAR N,GAVRILOV A I.Neural networks aided unscented Kalman filter for integrated INS/GNSS systems[C]//2020 27th Saint Petersburg International Conference on Integrated Navigation Systems (ICINS).Saint Petersburg:IEEE,2020:1-4.
[15] JONDHALE S R,DESHPANDE R S.Kalman filtering framework-based real time target tracking in wireless sensor networks using generalized regression neural networks[J].IEEE Sensors Journal,2018,19(1):224-233.
[16] SATORRAS V G,AKATA Z,WELLING M.Combining generative and discriminative models for hybrid inference[C]//Proceedings of the 33rd International Conference on Neural Information Processing Systems.Vancouver:[s.n.],2019:13820-13830.
[17] KRISHNAN R G,SHALIT U,SONTAG D.Deep Kalman filters[J/OL].[2022-06-10].https://arxiv.org/abs/1511.05121.
[18] WANG R,LIU M S,ZHOU Y,et al.A deep belief networks adaptive Kalman filtering algorithm[C]//2016 7th IEEE International Conference on Software Engineering and Service Science (ICSESS).Beijing:IEEE,2016:178-181.
[19] JUNG S,SCHLANGEN I,CHARLISH A.Sequential Monte Carlo filtering with long short-term memory prediction[C]//2019 22nd International Conference on Information Fusion (FUSION).Ottawa:IEEE,2019:1-7.
[20] COSKUN H,ACHILLES F,DIPIETRO R,et al.Long short-term memory Kalman filters:recurrent neural estimators for pose regularization[C]//Proceedings of the IEEE International Conference on Computer Vision.Venice:[s.n.],2017:5524-5532.
[21] REVACH G,SHLEZINGER N,VAN SLOUN R J G,et al.KalmanNet:data-driven Kalman filtering[C]//2021 IEEE International Conference on Acoustics,Speech and Signal Processing (ICASSP).Toronto:IEEE,2021:3905-3909.
[22] REVACH G,SHLEZINGER N,LOCHER T,et al.Unsupervised learned Kalman filtering[J/OL].[2022-06-10].https://arxiv.org/abs/2110.09005.
[23] REVACH G,SHLEZINGER N,NI X,et al.KalmanNet:neural network aided Kalman filtering for partially known dynamics[J].IEEE Transactions on Signal Processing,2022,70(1):1532-1547.