Generative Adversarial Nets were recently introduced as a novel way to train generative models. In this work we introduce the conditional version of generative adversarial nets, which can be constructed by simply feeding the data, y, we wish to condition on to both the generator and discriminator. We show that this model can generate MNIST digits conditioned on class labels. We also illustrate how this model could be used to learn a multi-modal model, and provide preliminary examples of an application to image tagging in which we demonstrate how this approach can generate descriptive tags which are not part of training labels.
Paper: Conditional Generative Adversarial Nets.
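The key idea from the paper is that conditioning is achieved simply by feeding the label y to both networks alongside their usual inputs. A minimal numpy sketch of that input construction (illustrative only; the repo's `src/model.py` defines the actual MindSpore networks):

```python
import numpy as np

def one_hot(labels, num_classes=10):
    """Convert integer class labels to one-hot vectors y."""
    out = np.zeros((labels.shape[0], num_classes), dtype=np.float32)
    out[np.arange(labels.shape[0]), labels] = 1.0
    return out

def conditioned_input(noise, labels, num_classes=10):
    """Concatenate noise z with the one-hot label y, as fed to the generator.
    The discriminator receives the sample concatenated with the same y."""
    return np.concatenate([noise, one_hot(labels, num_classes)], axis=1)

z = np.random.randn(4, 100).astype(np.float32)  # batch of 4 noise vectors
y = np.array([3, 1, 4, 1])                      # digit classes to condition on
g_in = conditioned_input(z, y)
print(g_in.shape)  # (4, 110): 100 noise dims + 10 label dims
```

Because the label is part of the input, sampling with a fixed y steers the generator toward that digit class.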
Architecture guidelines for Conditional GANs
Train CGAN
Dataset used: MNIST
└─data
└─MNIST_Data
└─train
.
└─CGAN
  ├─README.md                         # README
  ├─requirements.txt                  # required modules
  ├─scripts                           # shell scripts
  │ ├─run_standalone_train.sh         # training in standalone mode (1 pc)
  │ ├─run_distributed_train_ascend.sh # training in parallel mode (8 pcs)
  │ └─run_eval_ascend.sh              # evaluation
  ├─src
  │ ├─dataset.py                      # dataset creation
  │ ├─cell.py                         # network definition
  │ ├─ckpt_util.py                    # checkpoint utilities
  │ └─model.py                        # discriminator & generator structure
  ├─train.py                          # train CGAN
  ├─eval.py                           # evaluate CGAN
  └─export.py                         # export MindIR
# distributed training
bash run_distributed_train_ascend.sh /path/to/MNIST_Data/train /path/to/hccl_8p_01234567_127.0.0.1.json 8
# standalone training
bash run_standalone_train.sh /path/MNIST_Data/train 0
# evaluating
bash run_eval_ascend.sh /path/to/script/train_parallel/0/ckpt/G_50.ckpt 0
run_standalone_train_ascend.sh for non-distributed training of the CGAN model.
# standalone training
bash run_standalone_train_ascend.sh /path/MNIST_Data/train 0
run_distributed_train_ascend.sh for distributed training of the CGAN model.
# distributed training
bash run_distributed_train_ascend.sh /path/to/MNIST_Data/train /path/to/hccl_8p_01234567_127.0.0.1.json 8
Training results will be stored in img_eval.
run_eval_ascend.sh for evaluation.
# eval
bash run_eval_ascend.sh /path/to/script/train_parallel/0/ckpt/G_50.ckpt 0
Evaluation results will be stored in the img_eval path, where the generator output can be found in result.png.
Export the trained generator to MindIR:
python export.py --ckpt_dir /path/to/train/ckpt/G_50.ckpt
| Parameters | Ascend |
| --- | --- |
| Model Version | V1 |
| Resource | CentOS 8.2; Ascend 910; CPU 2.60GHz, 192 cores; Memory 755 GB |
| Uploaded Date | 07/04/2021 (month/day/year) |
| MindSpore Version | 1.2.0 |
| Dataset | MNIST |
| Training Parameters | epoch=50, batch_size=128 |
| Optimizer | Adam |
| Loss Function | BCELoss |
| Output | predict class |
| Loss | g_loss: 4.9693, d_loss: 0.1540 |
| Total time | 7.5 mins (8 pcs) |
| Checkpoint for Fine-tuning | 26.2M (.ckpt file) |
| Scripts | cgan script |
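The BCELoss listed in the table is standard binary cross-entropy applied to the discriminator's real/fake outputs. A minimal numpy sketch of the quantity being minimized (illustrative; the actual training uses the framework's BCELoss op):

```python
import numpy as np

def bce_loss(pred, target, eps=1e-7):
    """Binary cross-entropy over sigmoid outputs pred in (0, 1)."""
    pred = np.clip(pred, eps, 1.0 - eps)  # avoid log(0)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

# discriminator on real samples: targets are 1, confident predictions give low loss
print(bce_loss(np.array([0.9, 0.8]), np.array([1.0, 1.0])))
```

The generator is trained to drive the discriminator's output on fake samples toward the "real" target under this same loss.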
We use a fixed random seed in train.py and cell.py for weight initialization.
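Fixing the seed makes weight initialization reproducible across runs. A small illustration of the effect (numpy shown as a stand-in; the scripts use the framework's own seed API and initializers):

```python
import numpy as np

# Run 1: seed, then draw "initial weights"
np.random.seed(1)
w_first = np.random.randn(4, 4)

# Run 2: same seed, same draws -> identical initialization
np.random.seed(1)
w_second = np.random.randn(4, 4)

print(np.allclose(w_first, w_second))  # True
```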
Please check the official homepage.
CGAN can generate digit images from noise.