
Graduate Student: DENNIS ATYUGRASIWI KUNARSITO (DENNIS KUNARSITO)
Thesis Title: Residual Stacked Gated Recurrent Unit with Encoder-Decoder Architecture and an Attention Mechanism for Temporal Traffic Prediction
Advisor: Ren-Jieh Kuo (郭人介)
Committee Members: Chao Ou-Yang (歐陽超), Kung-Jeng Wang (王孔政), Ren-Jieh Kuo (郭人介)
Degree: Master
Department: Department of Industrial Management, College of Management
Year of Publication: 2020
Academic Year of Graduation: 108 (2019-2020)
Language: English
Number of Pages: 73
Keywords: Recurrent neural network, Attention mechanism, Traffic prediction
    Owing to the rapid development of information and communication technology, deep learning has been widely applied in many fields. With the continuous growth in the number of vehicles, traffic has become a major problem in modern cities, and realizing intelligent transportation requires tracking congestion across the entire road network. The challenge of predicting traffic flow is that traffic constantly transitions among congestion, recovery, incident-induced blockage, and free flow, and these transitions are highly nonlinear. Although various neural networks have been proposed for traffic prediction, improving their accuracy remains necessary.
    Therefore, this study proposes a model that applies an attention mechanism to a recurrent neural network (RNN). The attention mechanism addresses the difficulty an ordinary RNN has in modeling long-term dependencies and in using memory efficiently for computation. The modified RNN also adopts an encoder-decoder structure and combines a residual method with deeply stacked RNN layers, improving prediction performance by reducing the likelihood of vanishing gradients and strengthening the ability to capture longer dependencies.
    This study demonstrates the effectiveness of the method on two open-source datasets collected from sensors on real roads: the PeMS San Jose Bay Area and the Northbound Interstate I405N area. Traffic in both cases changes very rapidly, and this study shows how deep learning with an attention mechanism provides accurate short-term and long-term traffic prediction. The research framework consists of five parts: data collection, data preprocessing, forecasting methodology, algorithm verification, and application. The results show that the proposed method outperforms the other algorithms.


    Due to the rapid growth of information and communication technology, deep learning has been widely applied in many areas. Traffic has become one of the major problems of modern urban life because of the steady growth in the number of vehicles, and tracking congestion throughout the road network is necessary for achieving intelligent transportation systems. The challenge of predicting traffic flow lies in the sharp nonlinearities caused by transitions between free flow, breakdown, recovery, and congestion. Although different neural networks have been proposed and extensively applied in the field of traffic prediction, it is still necessary to improve prediction accuracy.
    Thus, this study proposes a model that applies an attention mechanism to recurrent neural networks (RNNs). The attention mechanism addresses the limitations of an ordinary RNN in modeling long-term dependencies and in using memory efficiently for computation. The modified RNN is also trained with an encoder-decoder architecture that combines a residual module with deeply stacked RNN layers, which improves prediction performance by reducing the potential for vanishing gradients and enhancing the ability to capture longer dependencies.
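    To make the architecture described above concrete, the following is a minimal sketch of a residual stacked GRU encoder paired with an additive-attention decoder. It is written in PyTorch purely for illustration; the hidden size, the number of stacked layers, the single traffic feature, and the step-by-step decoding loop are assumptions here, not the thesis's exact implementation.

import torch
import torch.nn as nn

class ResidualStackedGRUEncoder(nn.Module):
    """Stack of GRU layers with a skip (residual) connection around each layer."""
    def __init__(self, input_size, hidden_size, num_layers=3):
        super().__init__()
        self.input_proj = nn.Linear(input_size, hidden_size)
        self.layers = nn.ModuleList(
            [nn.GRU(hidden_size, hidden_size, batch_first=True)
             for _ in range(num_layers)])

    def forward(self, x):                        # x: (batch, time, input_size)
        h = self.input_proj(x)
        final_state = None
        for gru in self.layers:
            out, final_state = gru(h)
            h = h + out                          # residual connection eases gradient flow
        return h, final_state                    # all encoder states, last layer's final state

class AttentionDecoder(nn.Module):
    """Additive attention over the encoder states, decoded one step at a time."""
    def __init__(self, hidden_size, output_size=1):
        super().__init__()
        self.attn_enc = nn.Linear(hidden_size, hidden_size, bias=False)
        self.attn_dec = nn.Linear(hidden_size, hidden_size, bias=False)
        self.attn_v = nn.Linear(hidden_size, 1, bias=False)
        self.gru = nn.GRUCell(hidden_size, hidden_size)
        self.out = nn.Linear(hidden_size, output_size)

    def forward(self, enc_states, dec_hidden, steps):
        preds = []
        for _ in range(steps):
            # Score every encoder state against the current decoder state.
            scores = self.attn_v(torch.tanh(
                self.attn_enc(enc_states) + self.attn_dec(dec_hidden).unsqueeze(1)))
            weights = torch.softmax(scores, dim=1)       # (batch, time, 1)
            context = (weights * enc_states).sum(dim=1)  # attention-weighted summary
            dec_hidden = self.gru(context, dec_hidden)
            preds.append(self.out(dec_hidden))
        return torch.stack(preds, dim=1)                 # (batch, steps, output_size)

# Example: forecast the next 3 readings from the previous 12 (sizes are illustrative).
encoder = ResidualStackedGRUEncoder(input_size=1, hidden_size=64)
decoder = AttentionDecoder(hidden_size=64)
history = torch.randn(8, 12, 1)                          # (batch, time, features)
enc_states, final_state = encoder(history)
forecast = decoder(enc_states, final_state.squeeze(0), steps=3)
print(forecast.shape)                                     # torch.Size([8, 3, 1])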
    This study illustrates the methodology on two real-world road sensor datasets from open-access databases: the PeMS San Jose Bay Area and the Northbound Interstate I405N area. In both cases, sharp changes in the traffic flow regime occur very suddenly, and this study shows how deep learning with an attention mechanism provides precise short-term and long-term traffic prediction. The research framework consists of five parts: data collection, data preprocessing, forecasting methodology, algorithm verification, and application. The results indicate that the proposed method also performs better than the other models compared.
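    As a rough illustration of the data preprocessing and evaluation steps in such a framework, the sketch below slices a sensor's flow series into supervised input/target windows and scores a naive persistence baseline. The 12-step history, 3-step horizon, and RMSE/MAE metrics are common choices for this kind of data but are assumptions here, not the thesis's exact settings.

import numpy as np

def make_windows(series, history=12, horizon=3):
    """Slice a 1-D flow series into (input window, forecast horizon) pairs."""
    X, Y = [], []
    for t in range(len(series) - history - horizon + 1):
        X.append(series[t:t + history])
        Y.append(series[t + history:t + history + horizon])
    return np.asarray(X), np.asarray(Y)

def rmse(pred, true):
    return float(np.sqrt(np.mean((pred - true) ** 2)))

def mae(pred, true):
    return float(np.mean(np.abs(pred - true)))

# Synthetic flow values standing in for 5-minute sensor readings.
flow = np.sin(np.linspace(0, 20, 500)) * 100 + 300
X, Y = make_windows(flow)
print(X.shape, Y.shape)                       # (486, 12) (486, 3)

# Persistence baseline: repeat the last observed value across the horizon.
naive = np.repeat(X[:, -1:], Y.shape[1], axis=1)
print(rmse(naive, Y), mae(naive, Y))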

    Chinese Abstract iv
    ABSTRACT v
    ACKNOWLEDGMENT vi
    TABLE OF CONTENTS vii
    LIST OF TABLES ix
    LIST OF FIGURES x
    LIST OF APPENDIX xi
    CHAPTER 1 INTRODUCTION 1
        1.1 Background and Motivation 1
        1.2 Research Objectives 6
        1.3 Research Scope, Constraints, and Assumptions 7
        1.4 Thesis Organization 7
    CHAPTER 2 LITERATURE REVIEW 9
        2.1 Urban Traffic Congestion Forecasting 9
        2.2 Deep Learning 11
            2.2.1 Recurrent Neural Network (RNN) 13
                2.2.1.1 Gated Recurrent Unit (GRU) 14
            2.2.2 Stacked Model 15
            2.2.3 Residual Model 17
            2.2.4 Encoder-Decoder Model 19
            2.2.5 Encoder-Decoder Architecture with an Attention Mechanism 20
    CHAPTER 3 METHODOLOGY 22
        3.1 Methodology Framework 22
        3.2 Objective Functions 23
        3.3 Data Collection 23
        3.4 Proposed Model 24
        3.5 Pseudocode for the Proposed Algorithm 32
    CHAPTER 4 EXPERIMENTAL RESULTS 35
        4.1 Dataset Description 35
        4.2 Experimental Parameter Setting 37
        4.3 Finding the Best Hyperparameters 38
        4.4 Traffic Forecasting Performance Results 39
        4.5 Sensitivity Analysis 40
        4.6 Time Complexity Analysis 42
    CHAPTER 5 CASE STUDY RESULTS 43
        5.1 Profile of Case Study 43
        5.2 Case Study Parameter Setting 47
        5.3 Traffic Forecasting Performance Results 47
        5.4 Statistical Hypothesis 49
        5.5 Sensitivity Analysis 52
    CHAPTER 6 CONCLUSIONS AND FUTURE RESEARCH 54
        6.1 Conclusions 54
        6.2 Research Limitations 54
        6.3 Contributions 55
        6.4 Suggestions for Future Research 55
    REFERENCES 56
    APPENDIX 63


    Full text not available for download. Full text release date: 2025/07/05 (campus network)
    Full text release date: 2025/07/05 (off-campus network)
    Full text release date: 2025/07/05 (National Central Library: Taiwan Doctoral and Master's Theses System)