I'm implementing both sensitivity pruning and sparsity-level pruning for 3D networks.
Sensitivity analysis works as expected, as can be seen in the figure below, so I assume the problem is not with using a 3D model for element-wise pruning methods.
However, when pruning during training, the network weights are not going to zero.
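For reference, the sensitivity sweep was produced with distiller's standard helper, along these lines (a minimal sketch; the evaluate function and parameter list are placeholders, not my exact code):

import numpy as np
import distiller

# Element-wise sensitivity sweep over all weight tensors.
# evaluate(model) is assumed to return (top1, top5, loss) on the validation set.
which_params = [name for name, _ in model.named_parameters()]
sensitivity = distiller.perform_sensitivity_analysis(
    model,
    net_params=which_params,
    sparsities=np.arange(0.0, 0.95, 0.05),
    test_func=evaluate,
    group='element')
distiller.sensitivities_to_png(sensitivity, 'sensitivity.png')
distiller.sensitivities_to_csv(sensitivity, 'sensitivity.csv')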
My implementation can be abbreviated as follows:
# for each epoch
for i in range(opt.begin_epoch, opt.begin_epoch + opt.n_epochs):
    compression_scheduler.on_epoch_begin(i)
    model.train()

    # for each mini-batch
    for j, (inputs, targets) in enumerate(train_loader):
        compression_scheduler.on_minibatch_begin(i, minibatch_id=j, minibatches_per_epoch=len(train_loader))

        targets = targets.cuda()
        inputs = Variable(inputs)
        targets = Variable(targets)

        outputs = model(inputs)
        loss = criterion(outputs, targets)

        # before backwards pass - update loss to include regularization
        compression_scheduler.before_backward_pass(i, minibatch_id=j, minibatches_per_epoch=len(train_loader), loss=loss)

        optimizer.zero_grad()
        loss.backward()

        compression_scheduler.before_parameter_optimization(i, minibatch_id=j, minibatches_per_epoch=len(train_loader), optimizer=optimizer)
        optimizer.step()

        compression_scheduler.on_minibatch_end(i, minibatch_id=j, minibatches_per_epoch=len(train_loader))
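To check whether the masks take effect, I print the fraction of exactly-zero elements in each weight tensor at the end of every epoch, with something like this (a minimal sketch; not tied to a specific model):

def weight_sparsity(model):
    # Fraction of exactly-zero elements per weight tensor (conv/linear only).
    stats = {}
    for name, param in model.named_parameters():
        if name.endswith('weight') and param.dim() > 1:
            zeros = int((param.data == 0).sum())
            stats[name] = zeros / float(param.data.numel())
    return stats

# at the end of each epoch:
for name, sparsity in weight_sparsity(model).items():
    print('{}: {:.2%} zeros'.format(name, sparsity))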
And an example compression file I'm using would be:
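For illustration, a Distiller-style level-pruning schedule generally has this shape (the layer names and sparsity levels below are placeholders, not my actual values):

version: 1

pruners:
  my_pruner:
    class: SparsityLevelParameterPruner
    levels:
      'module.conv1.weight': 0.30
      'module.conv2.weight': 0.50

policies:
  - pruner:
      instance_name: my_pruner
    starting_epoch: 0
    ending_epoch: 20
    frequency: 2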
Any help would be appreciated!