AIT Asian Institute of Technology

Investigation of knowledge distillation technique on fMRI-Image cross-modality GANs

Author: Nopphon Rattanamanee
Call Number: AIT Thesis no. CS-22-04
Subject(s): Knowledge management--Data processing; Artificial intelligence; Information technology--Technological innovations
Note: A thesis submitted in partial fulfillment of the requirements for the degree of Master of Engineering in Computer Science
Publisher: Asian Institute of Technology
Abstract: Cross-modal retrieval has shown promising results in retrieving data from another modality, especially when combined with generative adversarial networks (GANs). However, such large frameworks can be challenging to deploy on low-memory devices. In this work, we investigate knowledge distillation (KD) algorithms, attempting to find one suitable for cross-modal GANs. We use an fMRI and handwritten-character dataset and the CFID score metric to evaluate each knowledge distillation algorithm's performance. The results show that the most compressed generator (the student generator) performs worse than the non-KD students. We conclude that KD is not a straightforward method; many factors must be considered. For example, the student model's architecture must be carefully designed, and multiple generations of student training (born-again distillation) may be required to truly improve student performance.
Year: 2022
Type: Thesis
School: School of Engineering and Technology
Department: Department of Information and Communications Technologies (DICT)
Academic Program/FoS: Computer Science (CS)
Chairperson(s): Chaklam Silpasuwanchai
Examination Committee(s): Dailey, Matthew N.; Attaphongse Taparugssanagorn
Scholarship Donor(s): Royal Thai Government Fellowship
Degree: Thesis (M.Eng.), Asian Institute of Technology, 2022

