FLightNER: A Federated Learning Approach to Lightweight Named-Entity Recognition

Creator: 

Abadeer, Macarious Philip Aziz

Date: 

2022

Abstract: 

We introduce FLightNER, a Federated Learning model that extends LightNER, a state-of-the-art prompt-tuning-based Named-Entity Recognition model. FLightNER aggregates only the trainable parameters of LightNER without degrading model accuracy, saving 10 GB per client and enabling more clients to join a federation without expanding the central server's memory. We evaluate our approach against two baselines on three diverse datasets with different distributions across up to seven clients in a federation. We empirically show that FLightNER outperforms the centrally trained LightNER model by 19% on an unbalanced medical dataset and matches it on two balanced datasets, CoNLL and I2B2. Furthermore, we use and evaluate two memory-saving techniques: the AdaFactor optimizer and Automatic Mixed Precision. Our findings enable owners of sensitive data, such as healthcare practitioners, to train a NER model collaboratively, with low memory requirements, while keeping their data on-premise.
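The abstract describes clients training LightNER locally while the server aggregates only the trainable (prompt-tuning) parameters. As a rough illustration only, the sketch below shows a FedAvg-style weighted average restricted to the trainable parameters of a model whose backbone is frozen; it assumes PyTorch, and the helper names (trainable_state, federated_average, load_trainable) are hypothetical, not the thesis's actual implementation.

```python
# Minimal sketch (assumed PyTorch; not the thesis implementation): FedAvg-style
# aggregation that exchanges only the trainable parameters, e.g. prompt weights,
# of a model whose backbone parameters have requires_grad=False.
from typing import Dict, List
import torch


def trainable_state(model: torch.nn.Module) -> Dict[str, torch.Tensor]:
    """Collect only the parameters that are actually being trained."""
    return {name: p.detach().clone()
            for name, p in model.named_parameters() if p.requires_grad}


def federated_average(client_states: List[Dict[str, torch.Tensor]],
                      client_sizes: List[int]) -> Dict[str, torch.Tensor]:
    """Weighted (FedAvg-style) average of the clients' trainable parameters."""
    total = float(sum(client_sizes))
    return {
        name: sum(state[name] * (n / total)
                  for state, n in zip(client_states, client_sizes))
        for name in client_states[0]
    }


def load_trainable(model: torch.nn.Module, state: Dict[str, torch.Tensor]) -> None:
    """Push the aggregated trainable parameters back into a client's model."""
    model.load_state_dict(state, strict=False)
```

On each client, memory use could additionally be reduced with the AdaFactor optimizer (e.g. transformers.optimization.Adafactor) and automatic mixed precision (torch.cuda.amp), the two memory-saving techniques the abstract evaluates; these library choices are assumptions, not details stated in the record.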

Subject: 

Computer Science

Language: 

English

Publisher: 

Carleton University

Thesis Degree Name: 

Master of Computer Science (M.C.S.)

Thesis Degree Level: 

Master's

Thesis Degree Discipline: 

Computer Science

Parent Collection: 

Theses and Dissertations
