Practical implementation of an astoundingly simple method for self-supervised learning that achieves a new state of the art (surpassing SimCLR) without contrastive learning or having to designate negative pairs. This repository offers a module with which one can wrap any image-based neural network (residual network, discriminator, policy network) to immediately start benefiting from unlabelled image data.

Earlier evidence suggested that batch normalization is key to making this technique work well, but a newer paper has successfully replaced batch norm with group norm plus weight standardization, refuting the claim that batch statistics are needed for BYOL to work.

Simply plug in your neural network, specifying (1) the image dimensions and (2) the name (or index) of the hidden layer whose output is used as the latent representation for self-supervised training.
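
A rough sketch of what that looks like (the BYOL wrapper, image_size, hidden_layer, and update_moving_average names follow the byol-pytorch package's documented interface; treat them as assumptions if your version differs):

    import torch
    from torchvision import models
    from byol_pytorch import BYOL

    # Wrap any image network; here a ResNet-50, using the output of its
    # 'avgpool' layer as the latent representation for self-supervised training.
    resnet = models.resnet50(pretrained=True)

    learner = BYOL(
        resnet,
        image_size = 256,          # (1) the image dimensions
        hidden_layer = 'avgpool'   # (2) name (or index) of the hidden layer
    )

    opt = torch.optim.Adam(learner.parameters(), lr=3e-4)

    def sample_unlabelled_images():
        # stand-in for a real loader of unlabelled images
        return torch.randn(20, 3, 256, 256)

    for _ in range(100):
        images = sample_unlabelled_images()
        loss = learner(images)           # BYOL loss between online and target networks
        opt.zero_grad()
        loss.backward()
        opt.step()
        learner.update_moving_average()  # update the target encoder's moving average

    # the wrapped network now carries the improved representations
    torch.save(resnet.state_dict(), './improved-net.pt')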

Features

  • Practical implementation of an astoundingly simple method
  • Group norm + weight standardization as a replacement for batch norm (see the first sketch below)
  • Simply plug in your neural network
  • BYOL does not even need the target encoder to be an exponential moving average of the online encoder (see the second sketch below)
  • Fetch the embeddings or the projections (also covered in the second sketch below)
  • Without contrastive learning
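
As an illustration of the group norm + weight standardization combination referenced above (generic PyTorch, not this library's API; WSConv2d is a hypothetical name used only for this sketch):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class WSConv2d(nn.Conv2d):
        # Conv2d that standardizes its weights (zero mean, unit variance per
        # output filter) before every forward pass
        def forward(self, x):
            w = self.weight
            mean = w.mean(dim=(1, 2, 3), keepdim=True)
            std = w.std(dim=(1, 2, 3), keepdim=True) + 1e-5
            return F.conv2d(x, (w - mean) / std, self.bias,
                            self.stride, self.padding, self.dilation, self.groups)

    # GroupNorm after a weight-standardized conv: no batch statistics anywhere
    block = nn.Sequential(
        WSConv2d(3, 64, kernel_size=3, padding=1),
        nn.GroupNorm(num_groups=32, num_channels=64),
        nn.ReLU(),
    )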
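
And a second sketch for the last two features, again assuming the byol-pytorch interface (the use_momentum and return_embedding flags are taken from its documentation; names may differ across versions):

    import torch
    from torchvision import models
    from byol_pytorch import BYOL

    resnet = models.resnet50(pretrained=True)

    # Variant without an EMA target encoder: with use_momentum = False the
    # target network is the online network itself, so there is no moving
    # average to update during training.
    learner = BYOL(
        resnet,
        image_size = 256,
        hidden_layer = 'avgpool',
        use_momentum = False
    )

    # Fetch both the projections and the underlying embeddings:
    imgs = torch.randn(2, 3, 256, 256)
    projection, embedding = learner(imgs, return_embedding = True)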

Categories

Machine Learning

License

MIT License

Additional Project Details

Programming Language

Python

Related Categories

Python Machine Learning Software

Registered

2022-08-17