y0ngjaenius/CVPR2024_FLOWERFormer

FlowerFormer: Empowering Neural Architecture Encoding using a Flow-aware Graph Transformer

Official implementation of our paper, FlowerFormer: Empowering Neural Architecture Encoding using a Flow-aware Graph Transformer (CVPR 2024)

Prerequisites

  • Python 3.10
  • PyTorch 1.13.1
  • PyTorch Geometric 2.2.0
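The prerequisites above can be installed as follows. This is a sketch: the CUDA 11.7 wheel index is an assumption, so pick the index matching your CUDA version (or drop it for a CPU-only build).

```shell
# Versions follow the prerequisite list above.
# The cu117 index URL is an assumption -- match it to your CUDA version.
pip install torch==1.13.1 --extra-index-url https://download.pytorch.org/whl/cu117
pip install torch_geometric==2.2.0
```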

Our model is implemented with the GraphGPS framework as the backbone.

Please install GraphGPS from the link.

Dataset

In our paper, we used five datasets: NAS-Bench-101, NAS-Bench-201, NAS-Bench-301, NAS-Bench-Graph, and NAS-Bench-ASR.

We provide the preprocessed datasets (in PyG format) here.

Please place the data in the ./data folder.
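A quick sanity check that the data landed where expected. The on-disk file names below are assumptions mirroring the five benchmarks listed above; adjust them to the actual downloaded files.

```python
from pathlib import Path

# Benchmark names mirror the datasets listed above; the exact on-disk
# file/folder names are assumptions -- adjust to the downloaded files.
EXPECTED = [
    "nasbench101",
    "nasbench201",
    "nasbench301",
    "nasbench_graph",
    "nasbench_asr",
]

def missing_datasets(data_dir="./data", expected=EXPECTED):
    """Return the benchmark names with no matching entry under data_dir."""
    root = Path(data_dir)
    if not root.is_dir():
        return list(expected)
    present = {p.stem.lower() for p in root.iterdir()}
    return [name for name in expected if name not in present]

print(missing_datasets())
```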

To run experiments

Run python experiments.py [config_path] with the corresponding config path.
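For example, a single run might look like the following. The config file name is an assumption for illustration; use an actual config path from the repository.

```shell
# Hypothetical config path -- replace with a real config file from the repo.
python experiments.py configs/nasbench201.yaml
```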
