
Student: Muhammad Fadli
Title: Automatic Beatmatching and Equalizer Mixing for Electronic Dance Music
Advisor: Ray-Guang Cheng
Committee: Chin-Ya Huang, Shiann-Tsong Sheu, Chen-Kuo Chiang
Degree: Master
Department: Department of Electronic and Computer Engineering
Year of publication: 2022
Academic year of graduation: 110
Language: English
Pages: 52
Keywords: Electronic dance music, Automatic beatmatching, Automatic equalizer mixing


    Electronic dance music (EDM) is one of the most popular music genres in the world. The disc jockey (DJ), the primary performer of EDM, is known for blending songs into a seamless mix, ensuring that EDM songs flow together without breaks or silences and remain structurally coherent in tempo, beat, and downbeat. Additionally, a DJ controls the frequencies of EDM songs to prevent clashes between different instruments. The practice of aligning the tempo, beats, and downbeats of EDM songs while managing their frequencies is called beatmatching and equalizer mixing, and it is essential to keeping the listening experience uninterrupted. In the long run, hiring a DJ can be quite expensive, and DJs, as human beings, are limited in how long they can work continuously. Moreover, manual beatmatching and equalizer adjustment are difficult for general music consumers, who typically lack knowledge of DJing techniques. Advances in audio processing and music information retrieval have enabled the development of automatic beatmatching and equalizer mixing systems. We design an open-source automatic beatmatching and equalizer mixing system that integrates the best available beat tracking, tempo estimation, and downbeat tracking methods to improve the listening experience during song transitions. Based on an evaluation by 15 professional DJs in Taiwan with 400 EDM songs, the proposed system improves the listening experience of song mixes compared to mixes created without it.
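The mechanics the abstract describes can be sketched in a few lines: beatmatching reduces to time-stretching one track so its tempo matches the other and starting the transition on a downbeat, while equalizer mixing hands the low band from one track to the other so kick drums do not clash. The sketch below is illustrative only and assumes tempos and downbeat times have already been estimated by a beat/downbeat tracker; the function names are not from the thesis.

```python
def stretch_ratio(source_bpm: float, target_bpm: float) -> float:
    """Time-stretch ratio that aligns the source track's tempo to the target's.

    A ratio > 1 speeds the source up; a ratio < 1 slows it down.
    """
    if source_bpm <= 0 or target_bpm <= 0:
        raise ValueError("tempos must be positive")
    return target_bpm / source_bpm


def next_downbeat_offset(position_s: float, downbeats_s: list[float]) -> float:
    """Seconds until the next downbeat at or after `position_s`.

    Starting the transition on a downbeat keeps the mix structurally coherent.
    """
    for t in downbeats_s:
        if t >= position_s:
            return t - position_s
    raise ValueError("no downbeat at or after the given position")


def eq_crossfade_gains(progress: float) -> tuple[float, float]:
    """Low-band gains for the (outgoing, incoming) tracks during a transition.

    Swapping the bass at the midpoint, instead of overlapping it, avoids
    clashing kick drums; `progress` runs from 0.0 (start) to 1.0 (end).
    """
    if not 0.0 <= progress <= 1.0:
        raise ValueError("progress must be in [0, 1]")
    return (1.0, 0.0) if progress < 0.5 else (0.0, 1.0)


# Example: match a 126 BPM track to a 128 BPM set tempo, starting the
# transition on the next downbeat at or after the 60-second mark.
ratio = stretch_ratio(126.0, 128.0)
offset = next_downbeat_offset(60.0, [58.1, 60.0, 61.9, 63.8])
```

In practice the ratio would drive a time-stretching library (the thesis cites Rubber Band) and the gains would drive a parametric low-shelf filter, but the scheduling logic is as simple as shown.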

    Letter of Authority ii
    Abstract in Chinese iii
    Abstract in English iv
    Acknowledgements v
    Contents vi
    List of Figures viii
    List of Tables x
    1 Introduction 1
    2 Background Knowledge 4
    2.1 Beat 4
    2.2 Bar, Time Signature, and Downbeat 4
    2.3 Tempo 6
    2.4 Equalizer 9
    2.5 Song Structure 10
    3 Related Works 12
    4 System Architecture 16
    4.1 Music Library 17
    4.1.1 Beat Tracking 18
    4.1.2 Tempo Estimation 20
    4.1.3 Downbeat Tracking 20
    4.2 Beatmatching and Equalizer Mixing 21
    5 Experimental Result 28
    6 Conclusions 36
    6.1 Future Work 36
    References 38
    Letter of Authority 41

