News

2020-03-20
【Invited Talk】2020/3/26 (Thu.) 15:30-17:00, Research Fellow Yuh-Jye Lee (李育杰)

Abstract

Nowadays, machine learning performs astonishingly well in many different fields, and the more data we have, the better our methods tend to perform. However, in some cases the data owners may not want to share the information they hold because of privacy concerns. In other cases, the dataset is so large that it is difficult to store on a single machine. To deal with these two problems, we propose the Distributed Consensus Reduced Support Vector Machine (DCRSVM) for binary classification. Imagine that we have many local working units and a central master, and each working unit owns its own data. The DCRSVM has two merits. First, it preserves data privacy: local data are never disclosed to the central master. Second, when a dataset is too large to store on a single server, the central master can still derive a good machine learning model even though the data are stored only on the local devices. Our method solves both problems mentioned above and produces competitive results.
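The abstract describes the setup only at a high level. As a rough illustration of the consensus idea (not the speaker's actual DCRSVM algorithm, whose details are not given here), the Python sketch below has each working unit fit a linear SVM on its own private data and return only its weight vector to a master, which averages the local models into a new consensus. All function names, the subgradient solver, and the simple averaging step are assumptions made for illustration.

```python
import numpy as np

def local_svm_update(w_consensus, X, y, lam=0.1, rho=1.0, lr=0.01, steps=50):
    """One round of local training on a single working unit's private data.

    Fits a linear SVM (hinge loss + L2 regularizer) by subgradient descent,
    with a penalty pulling the local weights toward the current consensus.
    Only the weight vector w is returned to the master, never X or y.
    """
    w = w_consensus.copy()
    for _ in range(steps):
        margins = y * (X @ w)
        violated = margins < 1                      # points inside the margin
        if violated.any():
            hinge_grad = -(y[violated, None] * X[violated]).mean(axis=0)
        else:
            hinge_grad = np.zeros_like(w)
        grad = lam * w + hinge_grad + rho * (w - w_consensus)
        w -= lr * grad
    return w

def master_aggregate(local_weights):
    """Master step: combine local models into a new consensus (plain averaging here)."""
    return np.mean(local_weights, axis=0)

# Toy demo: three working units, each holding data the master never sees.
rng = np.random.default_rng(0)
w_true = np.array([1.5, -2.0])
units = []
for _ in range(3):
    X = rng.normal(size=(200, 2))
    y = np.where(X @ w_true + 0.1 * rng.normal(size=200) > 0, 1.0, -1.0)
    units.append((X, y))

w_consensus = np.zeros(2)
for _ in range(10):                                 # communication rounds
    local_weights = [local_svm_update(w_consensus, X, y) for X, y in units]
    w_consensus = master_aggregate(local_weights)

print("consensus weights:", w_consensus)
```

In this sketch the raw data never leave the working units; only model parameters cross the network, which is the privacy property the abstract emphasizes.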
