9/2/2023

Peek a boo who!

The pruning scripts take the following arguments:

- --prune-ratio - A float, the ratio of non-zero parameters remaining after (the last) pruning. Check the SynFlow paper for what this means.
- --prune-iters - An integer, the number of pruning iterations in one run of pruning.

Bop has several hyperparameters that are essential to its successful optimization, as shown below. More details can be found in the original Bop paper.

- --optimizer - A string that specifies the optimizer. You can pass 'SGD' to this argument for standard SGD training, or 'BOP' for the Bop optimizer.
- --ar - A float, the adaptivity rate used to compute the gradient moving average.
- --tau - A float, the threshold that decides whether a binary weight needs to be flipped.
- --ar-decay-freq - An integer, the interval in epochs between decays of the adaptivity rate.
- --ar-decay-ratio - A float, the decay ratio applied when the adaptivity rate decays.

PSG stands for Predictive Sign Gradient, which was originally proposed in the E2-Train paper. PSG uses low-precision computation during backward passes to save computational cost.

- --msb-bits, --msb-bits-weight, --msb-bits-grad - Three floats, the bit-widths for the inputs, weights, and output errors during back-propagation.
- --psg-threshold - A float, the threshold that filters out coarse gradients with small magnitudes to reduce gradient variance.
- --psg-no-take-sign - A boolean that bypasses the "taking-the-sign" step in the original PSG method. When it is true, the filtered small gradients are set to zero.

For PaB experiments on ImageNet, we run the pruning and Bop training in a two-stage manner, implemented in main_imagenet_prune.py and main_imagenet_train.py, respectively.

To prune a ResNet-50 network at its initialization, we first run the following command to perform SynFlow, which follows a similar manner for the arguments as in the CIFAR experiments:

```
--arch psg_resnet50 --init-method kaiming_normal \
--optimizer BOP --ar $bop_ar --tau $bop_tau \
--psg-sparsify --psg-threshold " $ " --psg-no-take-sign \
--resume /path/to/the/pruning/output/file \
--dist-backend 'nccl' --multiprocessing-distributed \
```
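To make the roles of --ar and --tau concrete, here is a minimal NumPy sketch of one Bop update as described in the Bop paper: an exponential moving average of the gradient is maintained with adaptivity rate ar, and a binary weight is flipped when that average is both large (above tau) and aligned with the weight's current sign. The function name and formulation are illustrative, not the repo's implementation.

```python
import numpy as np

def bop_step(w, grad, m, ar=1e-4, tau=1e-6):
    """One Bop update on binary weights w in {-1, +1}.

    m is the exponential moving average of gradients; a weight is
    flipped when the averaged gradient is both large (|m| > tau) and
    aligned with the weight's current sign.
    """
    m = (1 - ar) * m + ar * grad          # gradient moving average, adaptivity rate ar
    flip = (np.abs(m) > tau) & (np.sign(m) == np.sign(w))
    w = np.where(flip, -w, w)             # flip the selected binary weights
    return w, m
```

This is why --ar-decay-freq and --ar-decay-ratio matter: shrinking ar over training makes the moving average smoother, so fewer weights cross the flip threshold late in training.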
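Similarly, the interaction between --psg-threshold and --psg-no-take-sign can be sketched as follows. This is an assumption-laden illustration, not the repo's code: coarse (low-precision) gradients below the threshold are zeroed, and the survivors are either replaced by their sign (original PSG) or kept as-is (--psg-no-take-sign).

```python
import numpy as np

def psg_filter(coarse_grad, threshold, take_sign=True):
    """Illustrative sketch of PSG's gradient filtering.

    Coarse gradients with magnitude below `threshold` are zeroed to
    reduce gradient variance. With take_sign=True (original PSG),
    surviving gradients are replaced by their sign; take_sign=False
    models --psg-no-take-sign and keeps their raw values.
    """
    keep = np.abs(coarse_grad) >= threshold   # filter out small-magnitude gradients
    g = np.sign(coarse_grad) if take_sign else coarse_grad
    return np.where(keep, g, 0.0)
```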