This repository contains the code for GaitSADA: Self-Aligned Domain Adaptation for mmWave Gait Recognition, accepted to IEEE MASS 2023.
Please visit our webpage for more details.
📢 [2023/09/25] Presentation at IEEE MASS 2023 (Toronto, Canada) in Session 1A: AI/ML based Smart Design 1 (9:55 - 11:05).
📢 [2023/06/26] Paper accepted to IEEE MASS 2023.
pip install tensorflow scikit-learn matplotlib h5py PyYAML numpy tqdm
Set the path to the preprocessed data:
export MMWAVE_PATH=~/mmwave-data/exit/preprocessed/256_resized/
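The training scripts are expected to pick the data location up from this variable, so a quick sanity check before launching a run can save time (the listed contents are simply whatever your preprocessing produced):

```bash
# Confirm the variable is set and points at the preprocessed data
echo "MMWAVE_PATH=$MMWAVE_PATH"
ls "$MMWAVE_PATH"
```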
- GaitSADA:
python3 GaitSADA.py --train_src_days=3 --train_trg_days=3 --src_aug=1 --trgt_aug=1 --epochs=10000 --epochs_2stage=10000 --log_dir=logs/example/GaitSADA/ --notes=temporal_3_day --notes_2stage=v1
- Supervised Learning
python3 models/supervised.py --train_src_days=3 --epochs=5000 --log_dir=logs/example/vanilla/
- GAN
python3 GAN.py --train_src_days=3 --train_trg_days=3 --log_dir=logs/example/GAN
- GRL
python3 GRL.py --train_src_days=3 --train_con_days=3 --log_dir=logs/example/GRL --note=GRL_vanilla
- ADDA
python3 ADDA.py --train_src_days=3 --train_off_days=3
- CDAN
python3 CDAN.py --train_src_days=3 --train_trg_days=3 --log_dir=logs/example/CDAN
- FixMatch
python3 FixMatch.py --train_src_days=3 --train_trg_days=3 --log_dir=logs/example/FixMatch
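Each run writes its logs under the directory passed via --log_dir. Assuming these are TensorBoard summaries (the repository may log in a different format), training progress can be monitored with:

```bash
# Inspect training curves written under logs/example/ (assumes TensorBoard-format logs)
tensorboard --logdir logs/example/
```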
The number of days of source-domain data is specified with:
--train_src_days=3
The number of days of target-domain data is specified with the flag matching the target domain (a sweep over these flags is sketched below):
- --train_trg_days=3 (temporal target: same laboratory location, different days)
- --train_ser_days=3 (server location)
- --train_con_days=3 (conference location)
- --train_off_days=3 (office location)
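For the 1-to-3-day experiments below, the day flags can be swept with a small shell loop; this is only a sketch, and the log-directory and note naming are illustrative rather than taken from the repository:

```bash
# Illustrative sweep over 1-3 days of source and temporal-target data for GaitSADA
for days in 1 2 3; do
  python3 GaitSADA.py \
    --train_src_days="$days" --train_trg_days="$days" \
    --src_aug=1 --trgt_aug=1 \
    --epochs=10000 --epochs_2stage=10000 \
    --log_dir="logs/sweep/GaitSADA_${days}day/" \
    --notes="temporal_${days}_day" --notes_2stage=v1
done
```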
Results of training on 1 to 3 days of data from the laboratory location (source domain) while adapting to 1 to 3 days of data from the same location collected on different days (temporal target domain), or to 1 to 3 days of data from different locations (server, conference, and office target domains).
GaitSADA itself is released under the MIT License.
If you find our work useful in your research, please consider citing:
@article{pinyoanuntapong2023gaitsada,
  title={GaitSADA: Self-Aligned Domain Adaptation for mmWave Gait Recognition},
  author={Ekkasit Pinyoanuntapong and Ayman Ali and Kalvik Jakkala and Pu Wang and Minwoo Lee and Qucheng Peng and Chen Chen and Zhi Sun},
  year={2023},
  eprint={2301.13384},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}