Title | Investigation of knowledge distillation techniques on fMRI-image cross-modality GANs |
Author | Nopphon Rattanamanee |
Call Number | AIT Thesis no.CS-22-04 |
Subject(s) | Knowledge management--Data processing; Artificial intelligence; Information technology--Technological innovations |
Note | A thesis submitted in partial fulfillment of the requirements for the degree of Master of Engineering in Computer Science |
Publisher | Asian Institute of Technology |
Abstract | Cross-modal retrieval has shown promising results in retrieving data from another modality, especially when combined with generative adversarial networks (GANs). However, such large frameworks can be challenging to deploy on low-memory devices. In this work, we investigate knowledge distillation (KD) algorithms, attempting to find one suitable for cross-modal GANs. We use an fMRI and handwritten-character dataset with the CFID score as the metric to evaluate each KD algorithm's performance. The results show that the most compressed generator (the student generator) performs worse than students trained without KD. We conclude that KD is not a straightforward method; many factors need to be considered. For example, the student model's architecture must be carefully designed, and multi-generation student training (born-again distillation) may be required to truly improve student performance. |
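The abstract describes response-based knowledge distillation, in which a compact student generator is trained to match a larger teacher generator's output alongside its own task objective. A minimal sketch of such a combined loss is shown below; the function names, the MSE choice, and the `alpha` weighting are illustrative assumptions, not the thesis's actual formulation.

```python
# Hypothetical sketch of response-based KD for a generator.
# The student is penalized both for deviating from the ground truth
# (task loss) and from the teacher's output (distillation loss).
# All names and weights here are illustrative, not from the thesis.

def mse(a, b):
    """Mean squared error between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def distillation_loss(student_out, teacher_out, target, alpha=0.5):
    """Weighted sum of the task loss (student vs. ground truth)
    and the distillation loss (student vs. teacher output)."""
    task = mse(student_out, target)
    distill = mse(student_out, teacher_out)
    return alpha * task + (1 - alpha) * distill

# Toy example: 2-dimensional "generated images" as flat vectors.
loss = distillation_loss([0.2, 0.4], [0.1, 0.5], [0.0, 0.6], alpha=0.5)
```

In practice the distillation term is computed over generated images (or intermediate feature maps), and the weighting between the two terms is one of the factors the abstract notes must be tuned carefully.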
Year | 2022 |
Type | Thesis |
School | School of Engineering and Technology |
Department | Department of Information and Communications Technologies (DICT) |
Academic Program/FoS | Computer Science (CS) |
Chairperson(s) | Chaklam Silpasuwanchai |
Examination Committee(s) | Matthew N. Dailey; Attaphongse Taparugssanagorn |
Scholarship Donor(s) | Royal Thai Government Fellowship |
Degree | Thesis (M.Eng.), Asian Institute of Technology, 2022 |