Natural language understanding (NLU) has been a hot topic in recent years, and BERT (Bidirectional Encoder Representations from Transformers) is an open-source natural language pre-training technique from Google. This talk covers the changes BERT's word vectors and sentence vectors have brought to natural language understanding, as well as the latest trends in BERT's development, including knowledge graphs, cross-modal models, language generation, and cross-lingual models. It also shares practical differences between chatbots built with keyword-based techniques and those built with BERT, to help the audience understand how to apply AI methods to chatbot development.
1. What is natural language understanding?
2. What is BERT?
3. The difference between keyword approaches and BERT
4. Trends in BERT's development
About Jerry Wu
Jerry Wu is a Data Scientist and currently a Google Developers Expert in Machine Learning. He is also the Founder & Chief Technology Officer (CTO) of the Asia Pacific Machine Intelligence Company (APMIC OpenTalk). Jerry Wu's teaching and research interests include Machine Intelligence and Natural Language Understanding (NLU). He is eager to do research related to data technology, share it with people, and solve problems.