
Conversation

@mrcslws (Contributor) commented Nov 4, 2020

I'm working on adding an interface that allows the `VaryBatchSize` mixin to adjust the `self.total_steps` property so that it is compatible with `OneCycleLR`. That change will be easier with this simplified `VaryBatchSize` code.

Note that it's better to do `isinstance(v, Sequence)` instead of `isinstance(v, list)`, because tuples should also be allowed.

```diff
 batch_sizes = config.get("batch_sizes", None)
-assert isinstance(batch_sizes, list), "Must specify list of batch sizes"
 self.batch_sizes = batch_sizes
+assert isinstance(batch_sizes, Sequence), "Must specify list of batch sizes"
```
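To illustrate the point of the change (a standalone sketch, not the project's code): `collections.abc.Sequence` matches lists, tuples, and ranges alike, while a bare `list` check rejects tuples.

```python
from collections.abc import Sequence

def is_valid_schedule(batch_sizes):
    # Sequence matches list, tuple, and range alike;
    # a bare `list` check would reject a tuple of batch sizes.
    return isinstance(batch_sizes, Sequence)

print(is_valid_schedule([32, 64, 128]))   # list  -> True
print(is_valid_schedule((32, 64, 128)))   # tuple -> True
print(isinstance((32, 64, 128), list))    # -> False
```

One caveat: `str` is also a `Sequence`, so a stray string would pass this check; a stricter guard could exclude `str` explicitly.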
Contributor:

The comment should now read "Must specify a sequence of batch sizes"

Contributor Author:

Ok, I can change it. (In code comments I often use "list" in the generic English sense. I usually rely on duck typing rather than type checking: as long as the caller passes in something that supports indexing and `len()`, the code will work.)
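To make the duck-typing point concrete (a hypothetical class, not from the PR): an object only needs `__len__` and `__getitem__` for the downstream code to work, yet it is not a `list` and not even a `collections.abc.Sequence`, since `Sequence` does no structural checking.

```python
from collections.abc import Sequence

class BatchSizeSchedule:
    """Hypothetical sequence-like schedule: supports len() and indexing,
    so duck-typed code accepts it, but it would fail both the `list`
    and the `Sequence` isinstance checks."""
    def __init__(self, sizes):
        self._sizes = list(sizes)

    def __len__(self):
        return len(self._sizes)

    def __getitem__(self, i):
        return self._sizes[i]

schedule = BatchSizeSchedule([16, 32, 64])
print(len(schedule), schedule[1])      # indexing and len() work fine
print(isinstance(schedule, Sequence))  # False: Sequence isn't structural
```

This is the trade-off being discussed: the `Sequence` assert is more permissive than `list`, but strict type checking of any kind still rejects objects that would work via duck typing.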

Contributor:

I see. That's fair. Either way.

```python
if 0 < self.current_epoch < len(self.batch_sizes):
    batch_size = self.batch_sizes[self.current_epoch]
    self.train_loader = self.create_train_dataloader(
        {**self.config, "batch_size": batch_size}
    )
```
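A runnable sketch of how that snippet behaves across epochs. The `VaryBatchSize` name and `create_train_dataloader` call are from the PR; the `Base` stub, the `pre_epoch` hook name, and the string stand-in for a real dataloader are assumptions made so the example is self-contained.

```python
class Base:
    def pre_epoch(self):
        pass

    def create_train_dataloader(self, config):
        # Stand-in for the real dataloader factory: just record the size.
        return f"loader(batch_size={config['batch_size']})"

class VaryBatchSize(Base):
    """Sketch of the mixin logic: at the start of each epoch, if a new
    batch size is scheduled, rebuild the train dataloader with it."""
    def __init__(self, config):
        self.config = config
        self.batch_sizes = config["batch_sizes"]
        self.current_epoch = 0
        # Epoch 0 uses the dataloader built here during setup.
        self.train_loader = self.create_train_dataloader(
            {**config, "batch_size": self.batch_sizes[0]}
        )

    def pre_epoch(self):
        super().pre_epoch()
        # Later epochs switch only while the schedule has entries left;
        # past the end of the schedule, the last loader is kept.
        if 0 < self.current_epoch < len(self.batch_sizes):
            batch_size = self.batch_sizes[self.current_epoch]
            self.train_loader = self.create_train_dataloader(
                {**self.config, "batch_size": batch_size}
            )

exp = VaryBatchSize({"batch_sizes": [16, 32, 64]})
for epoch in range(4):
    exp.current_epoch = epoch
    exp.pre_epoch()
    print(epoch, exp.train_loader)
```

The chained comparison `0 < self.current_epoch < len(self.batch_sizes)` is the "trick" noted below: it guards both bounds in one expression, skipping epoch 0 (already configured at setup) and epochs beyond the schedule.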
Contributor:

Cool trick to know.

@mrcslws mrcslws merged commit 9d392a7 into numenta:master Nov 13, 2020