Simple Search / Detailed Record Display

Author: 歐陽志暉 (OU YANG, CHIN HUI)
Thesis title: 運用決策樹和長短期記憶遞迴神經網路協助固晶打線機台參數調整與品質預測
(Using Decision Trees and Long Short-Term Memory Recurrent Neural Networks to Assist Die/Wire Bonding Parameter Adjustment and Quality Prediction)
Advisor: 歐陽超 (Chao Ou-Yang)
Committee members: 楊朝龍 (Chao-Lung Yang), 郭人介 (Ren-Jieh Kuo)
Degree: Master
Department: Department of Industrial Management, School of Management
Publication year: 2018
Graduation academic year: 106 (2017-2018)
Language: Chinese
Pages: 64
Chinese keywords (translated): die bonding, wire bonding, machine parameters, long short-term memory recurrent neural network, decision tree, feature construction
English keywords: die bonding, wire bonding, machine parameters, long short-term memory, decision trees, feature construction
In the LED component assembly process, die bonding and wire bonding are the key factors affecting the defect rate. While the die and wire bonders run, engineers adjust parameters based on past experience, and product quality is inspected manually; until quality is confirmed, the machine sits idle, which makes both parameter tuning and quality inspection inefficient. Using machine parameter data provided by a manufacturer, this study develops a long short-term memory (LSTM) recurrent neural network prediction model based on the relationships between data before and after each adjustment, and draws on the engineers' expertise to mine features suited to the prediction model, improving its overall prediction accuracy.
The die bonding dataset records the die bonding condition and silver epoxy height produced by each machine adjustment, and a correspondence between silver epoxy height and die bonding condition was found. This information is integrated through a decision tree, whose branching results serve as a basis for engineers to adjust the parameters.
Beyond the LED process, the prediction approach developed here can also be applied to production processes in related industries to improve line efficiency.


During the assembly of LED components, die bonding and wire bonding are the key factors affecting the defect rate. While the die and wire bonders run, engineers adjust the parameters based on their experience, and the equipment remains idle until the products pass quality inspection, which makes both parameter tuning and quality inspection inefficient. This research therefore uses machine parameter data provided by a manufacturer and, based on the relationships in those data, constructs a long short-term memory (LSTM) recurrent neural network prediction model. It also combines information supplied by the engineers to construct new features for the prediction model, which improved the model's overall prediction rate.
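This record does not include the thesis's code. Purely as an illustration of the "converting a time series problem into a supervised learning problem" step the thesis describes, here is a minimal Python sketch of the sliding-window framing that turns a machine-parameter series into (input, target) pairs an LSTM could be trained on; the function name and the silver epoxy height readings are invented for the example:

```python
import numpy as np

def series_to_supervised(values, n_lags=3):
    """Frame a univariate series as (lagged inputs, next value) pairs.

    Each row of X holds the previous n_lags observations, and y holds the
    value that follows them -- the target an LSTM would learn to predict.
    """
    X, y = [], []
    for i in range(len(values) - n_lags):
        X.append(values[i:i + n_lags])
        y.append(values[i + n_lags])
    return np.array(X), np.array(y)

# Hypothetical silver epoxy height readings after successive adjustments.
heights = [1.10, 1.12, 1.15, 1.13, 1.18, 1.20]
X, y = series_to_supervised(heights, n_lags=3)
print(X.shape, y.shape)  # (3, 3) (3,)
```

For an actual LSTM, X would additionally be reshaped to (samples, timesteps, features) before being fed to the network; the thesis's own preprocessing steps (normalization, missing-value handling) are omitted from this sketch.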
In the die bonding data, the die bonding condition and the silver epoxy height were recorded for each machine parameter setting, and the relationship between the silver epoxy height and the die bonding status was observed. We integrated this information into a decision tree and used its branches as well; the result assists the engineers in adjusting the parameters.
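As a sketch of the decision-tree idea in the paragraph above (a single Gini-scored split rather than a full tree; the height values, labels, and function name are all invented for illustration), the following pure-Python snippet finds the silver epoxy height threshold that best separates good from bad bonding outcomes:

```python
from collections import Counter

def best_threshold_split(heights, labels):
    """Find the height cut that best separates the bonding outcomes.

    A single split like this is what each internal node of a decision tree
    performs; the resulting branch conditions are what engineers can read
    off as parameter-adjustment guidance.
    """
    def gini(group):
        # Gini impurity: 0 for a pure group, higher for mixed groups.
        n = len(group)
        if n == 0:
            return 0.0
        counts = Counter(group)
        return 1.0 - sum((c / n) ** 2 for c in counts.values())

    best_t, best_score = None, float("inf")
    for t in sorted(set(heights)):
        left = [l for h, l in zip(heights, labels) if h <= t]
        right = [l for h, l in zip(heights, labels) if h > t]
        # Weighted impurity of the two branches.
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# Hypothetical readings: low epoxy heights tend to bond poorly.
heights = [0.8, 0.9, 1.0, 1.2, 1.3, 1.4]
labels = ["bad", "bad", "bad", "good", "good", "good"]
threshold, impurity = best_threshold_split(heights, labels)
print(threshold, impurity)  # 1.0 0.0 -> rule: height <= 1.0 predicts "bad"
```

A real decision tree repeats this search recursively on each branch and over all candidate features; libraries such as scikit-learn implement the full algorithm.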
The approach of this study applies not only to the LED manufacturing process but also, on the same concept, to similar areas, and can improve the efficiency of a production line.

Chapter 1 Introduction 1
  1.1 Research Background 1
  1.2 Research Objectives 1
  1.3 Research Issues 5
  1.4 Significance 6
Chapter 2 Literature Review 7
  2.1 LED Packaging 7
    2.1.1 Lamp-Type Packaging 7
    2.1.2 SMD-Type Packaging 8
  2.2 Introduction to Recurrent Neural Networks (RNN) 8
  2.3 Applications of Recurrent Neural Networks (RNN) 9
  2.4 Long Short-Term Memory 10
  2.5 Time Series 12
  2.6 Feature Engineering 13
Chapter 3 Research Method 15
  3.1 Raw Data Description 15
  3.2 Data Sequence Relationships 16
  3.3 Research Framework and Process 16
  3.4 Data Preprocessing 17
    3.4.1 Missing-Value Handling 18
    3.4.2 Normalization 18
    3.4.3 Converting the Time Series Problem into a Supervised Learning Problem 18
  3.5 Long Short-Term Memory Recurrent Neural Network 19
    3.5.1 LSTM Architecture 20
    3.5.2 LSTM Process 22
  3.6 Feature Construction 25
    3.6.1 Correlation Analysis 25
  3.7 Decision Trees 26
Chapter 4 Implementation Results 29
  4.1 Case Data 29
    4.1.1 Machine Parameters 29
    4.1.2 Die Bonding Parameters 30
    4.1.3 Wire Bonding Parameters 31
    4.1.4 Parameter Sequence Line Charts 32
    4.1.5 Quality Evaluation 33
  4.2 Data Preprocessing 35
    4.2.1 Data Cleaning 35
    4.2.2 Converting the Time Series Problem into a Supervised Learning Problem 37
  4.3 Long Short-Term Memory Recurrent Neural Network 39
    4.3.1 Silver Epoxy Height Prediction 39
    4.3.2 Wire Bonding Quality Prediction 40
  4.4 Feature Construction 42
    4.4.1 Correlation Analysis 43
    4.4.2 Feature Construction Results 44
  4.5 Decision Tree Classification Results 46
    4.5.1 Confusion Matrix and Classification Report 47
  4.6 Method Comparison 48
Chapter 5 Conclusions and Suggestions 50
  5.1 Conclusions 50
  5.2 Research Limitations and Future Suggestions 51
References 53

    Amari, S.-I. (1998). "Natural gradient works efficiently in learning." Neural Computation 10(2): 251-276.

Becker, S., and Y. Le Cun (1988). Improving the convergence of back-propagation learning with second-order methods. Proceedings of the 1988 Connectionist Models Summer School, San Mateo, CA: Morgan Kaufmann.

    Bengio, Y., et al. (1994). "Learning long-term dependencies with gradient descent is difficult." IEEE transactions on neural networks 5(2): 157-166.

    Deng, L., et al. (2013). Recent advances in deep learning for speech research at Microsoft. Acoustics, Speech and Signal Processing (ICASSP), 2013 IEEE International Conference on, IEEE.

    Ding, H., et al. (2008). "Querying and mining of time series data: experimental comparison of representations and distance measures." Proceedings of the VLDB Endowment 1(2): 1542-1552.

    Fu, T.-c. (2011). "A review on time series data mining." Engineering Applications of Artificial Intelligence 24(1): 164-181.

    Gers, F. A., et al. (1999). "Learning to forget: Continual prediction with LSTM."

Graves, A., et al. (2013). Speech recognition with deep recurrent neural networks. Acoustics, Speech and Signal Processing (ICASSP), 2013 IEEE International Conference on, IEEE.

    Graves, A. and J. Schmidhuber (2005). "Framewise phoneme classification with bidirectional LSTM and other neural network architectures." Neural Networks 18(5-6): 602-610.

    Haq, A. A. U., et al. (2016). "Feature Construction for Dense Inline Data in Semiconductor Manufacturing Processes." IFAC-PapersOnLine 49(28): 274-279.

    Hochreiter, S. and J. Schmidhuber (1997). "Long short-term memory." Neural Computation 9(8): 1735-1780.

    Kingma, D. P. and J. Ba (2014). "Adam: A method for stochastic optimization." arXiv preprint arXiv:1412.6980.

    Nurunnahar, S., et al. (2017). A short-term wind speed forecasting using SVR and BP-ANN: A comparative analysis. Computer and Information Technology (ICCIT), 2017 20th International Conference of, IEEE.

    Petridis, S. and M. Pantic (2016). Deep complementary bottleneck features for visual speech recognition. Acoustics, Speech and Signal Processing (ICASSP), 2016 IEEE International Conference on, IEEE.

    Qing, X. and Y. Niu (2018). "Hourly day-ahead solar irradiance prediction using weather forecasts by LSTM." Energy 148: 461-468.

    Rani, S. and G. Sikka (2012). "Recent techniques of clustering of time series data: a survey." International Journal of Computer Applications 52(15).

    Rush, A. M., et al. (2015). "A neural attention model for abstractive sentence summarization." arXiv preprint arXiv:1509.00685.

    Sundermeyer, M., et al. (2012). LSTM neural networks for language modeling. Thirteenth Annual Conference of the International Speech Communication Association.

    Tieleman, T. and G. Hinton (2012). "Lecture 6.5-RMSProp, COURSERA: Neural networks for machine learning." University of Toronto, Technical Report.

    Ting, J., et al. (2006). "Mining of stock data: intra-and inter-stock pattern associative classification." Threshold 5(100): 5-99.

    Viegas, R., et al. (2017). "Daily prediction of ICU readmissions using feature engineering and ensemble fuzzy modeling." Expert Systems with Applications 79: 244-253.

    Wei, L. and E. Keogh (2006). Semi-supervised time series classification. Proceedings of the 12th ACM SIGKDD international conference on Knowledge discovery and data mining, ACM.

    Williams, R. J. and D. Zipser (1989). "A learning algorithm for continually running fully recurrent neural networks." Neural Computation 1(2): 270-280.

Shi, X., et al. (2015). Convolutional LSTM network: A machine learning approach for precipitation nowcasting. Advances in Neural Information Processing Systems.

    Yang, K. and C. Shahabi (2005). On the stationarity of multivariate time series for correlation-based data analysis. Data Mining, Fifth IEEE International Conference on, IEEE.

    Yang, Y., et al. (2018). "A CFCC-LSTM model for sea surface temperature prediction." IEEE Geoscience and Remote Sensing Letters 15(2): 207-211.

    Zdravevski, E., et al. (2015). Robust histogram-based feature engineering of time series data. Computer Science and Information Systems (FedCSIS), 2015 Federated Conference on, IEEE.

    Zhao, Y., et al. (2018). "Applying deep bidirectional LSTM and mixture density network for basketball trajectory prediction." Optik-International Journal for Light and Electron Optics 158: 266-272.

張祐銜 and 劉正毓 (2009). Packaging technology of light-emitting diodes (發光二極體的封裝技術). 科學發展 (Science Development).

Full text release date: 2023/07/10 (campus network)
Full text not authorized for public release (off-campus network)
Full text not authorized for public release (National Central Library: Taiwan NDLTD system)