Flux (machine-learning framework)

Flux
Original author(s): Michael J. Innes,[1] Dhairya Gandhi,[2] and contributors[3]
Stable release: v0.13
Repository: github.com/FluxML/Flux.jl (https://github.com/FluxML/Flux.jl)
Written in: Julia
Type: Machine-learning library
License: MIT[4]
Website: https://fluxml.ai

Flux is an open-source machine-learning software library and ecosystem written in Julia.[1][5] Its current stable release is v0.12.8.[6] It has a layer-stacking-based interface for simpler models, and strong support for interoperability with other Julia packages instead of a monolithic design.[7] For example, GPU support is implemented transparently by CuArrays.jl.[8] This is in contrast to some other machine-learning frameworks, which are implemented in other languages with Julia bindings, such as TensorFlow.jl, and are thus more limited by the functionality present in the underlying implementation, which is often in C or C++.[9] Flux joined NumFOCUS as an affiliated project in December 2021.[10]

Flux's focus on interoperability has enabled, for example, support for neural differential equations, by fusing Flux.jl and DifferentialEquations.jl into DiffEqFlux.jl.[11][12]

Flux supports recurrent and convolutional networks. It is also capable of differentiable programming[13][14][15] through its source-to-source automatic differentiation package, Zygote.jl.[16]

Julia is a popular language in machine learning,[17] and Flux.jl is its most highly regarded machine-learning repository.[17] A demonstration[18] compiling Julia code to run on Google's tensor processing unit (TPU) received praise from Google Brain AI lead Jeff Dean.[19]
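The layer-stacking interface and Zygote-based differentiation described above can be sketched as follows; the layer sizes, loss, and data are illustrative assumptions, not taken from the article:

```julia
using Flux  # gradients are computed by Zygote under the hood

# Chain stacks layers into a feed-forward model
model = Chain(
    Dense(4 => 8, relu),   # 4 inputs -> 8 hidden units
    Dense(8 => 1))         # 8 hidden units -> 1 output

x = rand(Float32, 4)       # a single illustrative input
y = Float32[1.0]           # its target value

# Source-to-source AD differentiates the loss with
# respect to every parameter inside the model
loss(m) = Flux.mse(m(x), y)
grads = Flux.gradient(loss, model)
```

The gradients returned here can then be fed to one of Flux's optimisers in a training loop; the point of the sketch is only that models compose by stacking layers and are differentiated as ordinary Julia code.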

Flux has been used as a framework to build neural networks that work on homomorphically encrypted data without ever decrypting it.[20][21] This kind of application is envisioned to be central to privacy in future APIs that use machine-learning models.[22]

Flux.jl is an intermediate representation for running high-level programs on CUDA hardware.[23][24] It was the predecessor to CUDAnative.jl, which is also a GPU programming language.[25]
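The transparent GPU support mentioned earlier can be used by moving a model and its data to the GPU with Flux's `gpu` utility, which falls back to a no-op when no functional GPU back end is loaded. A minimal sketch, assuming an illustrative model:

```julia
using Flux, CUDA  # CUDA.jl is the successor to the CuArrays.jl back end

model = Dense(4 => 2) |> gpu   # move the layer's parameters to GPU memory
x = rand(Float32, 4) |> gpu    # move the input data as well
y = model(x)                   # runs on the GPU when one is available
```

Because the GPU arrays are ordinary Julia arrays from the user's point of view, the rest of a Flux program is unchanged, which is the "transparent" aspect the article refers to.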

See also
Differentiable programming
Comparison of deep-learning software

References
1. Innes, Michael (2018-05-03). "Flux: Elegant machine learning with Julia" (https://doi.org/10.2
1105%2Fjoss.00602). Journal of Open Source Software. 3 (25): 602.
Bibcode:2018JOSS....3..602I (https://ui.adsabs.harvard.edu/abs/2018JOSS....3..602I).
doi:10.21105/joss.00602 (https://doi.org/10.21105%2Fjoss.00602).
2. Dhairya Gandhi (https://github.com/DhairyaLGandhi), GitHub, 2021-06-27, retrieved
2021-06-27
3. Flux Contributors (https://github.com/FluxML/Flux.jl/graphs/contributors), GitHub, 2021-06-
27, retrieved 2021-06-27
4. "github.com/FluxML/Flux.jl/blob/master/LICENSE.md" (https://github.com/FluxML/Flux.jl/blo
b/master/LICENSE.md). GitHub. 6 November 2021.
5. Innes, Mike; Bradbury, James; Fischer, Keno; Gandhi, Dhairya; Mariya Joy, Neethu; Karmali,
Tejan; Kelley, Matt; Pal, Avik; Concetto Rudilosso, Marco; Saba, Elliot; Shah, Viral; Yuret,
Deniz. "Building a Language and Compiler for Machine Learning" (https://julialang.org/blog/
2018/12/ml-language-compiler). julialang.org. Retrieved 2019-06-02.
6. FluxML/Flux.jl v0.12.8 (https://github.com/FluxML/Flux.jl/releases/tag/v0.12.8), Flux, 2021-
12-01, retrieved 2021-12-01
7. "Machine Learning and Artificial Intelligence" (https://juliacomputing.com/domains/ml-and-ai.
html). juliacomputing.com. Retrieved 2019-06-02.
8. Gandhi, Dhairya (2018-11-15). "Julia at NeurIPS and the Future of Machine Learning Tools"
(https://juliacomputing.com/blog/2018/11/15/julia-ml-three-papers.html). juliacomputing.com.
Retrieved 2019-06-02.
9. Malmaud, Jonathan; White, Lyndon (2018-11-01). "TensorFlow.jl: An Idiomatic Julia Front
End for TensorFlow" (https://doi.org/10.21105%2Fjoss.01002). Journal of Open Source
Software. 3 (31): 1002. Bibcode:2018JOSS....3.1002M (https://ui.adsabs.harvard.edu/abs/20
18JOSS....3.1002M). doi:10.21105/joss.01002 (https://doi.org/10.21105%2Fjoss.01002).
10. "Flux <3 NumFOCUS" (https://fluxml.ai/blog/2021/12/01/flux-numfocus.html). fluxml.ai.
Retrieved 2021-01-12.
11. Rackauckas, Chris; Innes, Mike; Ma, Yingbo; Bettencourt, Jesse; White, Lyndon; Dixit,
Vaibhav (2019-02-06). "DiffEqFlux.jl - A Julia Library for Neural Differential Equations".
arXiv:1902.02376 (https://arxiv.org/abs/1902.02376) [cs.LG (https://arxiv.org/archive/cs.LG)].
12. Schlothauer, Sarah (2019-01-25). "Machine learning meets math: Solve differential
equations with new Julia library" (https://jaxenter.com/julia-machine-learning-library-154880.
html). JAXenter. Retrieved 2019-10-21.
13. "Flux – Reinforcement Learning vs. Differentiable Programming" (https://fluxml.ai/2019/03/0
5/dp-vs-rl.html). fluxml.ai. Retrieved 2019-06-02.
14. "Flux – What Is Differentiable Programming?" (https://fluxml.ai/2019/02/07/what-is-differentia
ble-programming.html). fluxml.ai. Retrieved 2019-06-02.
15. Heath, Nick (December 6, 2018). "Julia vs Python: Which programming language will rule
machine learning in 2019?" (https://www.techrepublic.com/article/julia-vs-python-which-prog
ramming-language-will-rule-machine-learning-in-2019/). TechRepublic. Retrieved
2019-06-03.
16. Innes, Michael (2018-10-18). "Don't Unroll Adjoint: Differentiating SSA-Form Programs".
arXiv:1810.07951 (https://arxiv.org/abs/1810.07951) [cs.PL (https://arxiv.org/archive/cs.PL)].
17. Heath, Nick (January 25, 2019). "GitHub: The top 10 programming languages for machine
learning" (https://www.techrepublic.com/article/github-the-top-10-programming-languages-fo
r-machine-learning/). TechRepublic. Retrieved 2019-06-03.
18. Saba, Elliot; Fischer, Keno (2018-10-23). "Automatic Full Compilation of Julia Programs and
ML Models to Cloud TPUs". arXiv:1810.09868 (https://arxiv.org/abs/1810.09868) [cs.PL (http
s://arxiv.org/archive/cs.PL)].
19. Dean, Jeff [@JeffDean] (2018-10-23). "Julia + TPUs = fast and easily expressible ML
computations" (https://twitter.com/JeffDean/status/1054951415339192321) (Tweet).
Retrieved 2019-06-02 – via Twitter.
20. Patrawala, Fatema (2019-11-28). "Julia Computing research team runs machine learning
model on encrypted data without decrypting it" (https://hub.packtpub.com/julia-computing-res
earch-team-runs-machine-learning-model-on-encrypted-data-without-decrypting-it/). Packt
Hub. Retrieved 2019-12-11.
21. "Machine Learning on Encrypted Data Without Decrypting It" (https://juliacomputing.com/blo
g/2019/11/22/encrypted-machine-learning.html). juliacomputing.com. 2019-11-22. Retrieved
2019-12-11.
22. Yadav, Rohit (2019-12-02). "Julia Computing Uses Homomorphic Encryption For ML. Is It
The Way Forward?" (https://analyticsindiamag.com/julia-computing-uses-homomorphic-encr
yption-for-ml-is-it-the-way-forward/). Analytics India Magazine. Retrieved 2019-12-11.
23. Roesch, Jared and Lyubomirsky, Steven and Kirisame, Marisa and Pollock, Josh and
Weber, Logan and Jiang, Ziheng and Chen, Tianqi and Moreau, Thierry and Tatlock,
Zachary (2019). "Relay: A High-Level IR for Deep Learning". arXiv:1904.08368 (https://arxiv.
org/abs/1904.08368) [cs.LG (https://arxiv.org/archive/cs.LG)].
24. Tim Besard and Christophe Foket and Bjorn De Sutter (2019). "Effective Extensible
Programming: Unleashing Julia on GPUs". IEEE Transactions on Parallel and Distributed
Systems. Institute of Electrical and Electronics Engineers (IEEE). 30 (4): 827–841.
arXiv:1712.03112 (https://arxiv.org/abs/1712.03112). doi:10.1109/tpds.2018.2872064 (http
s://doi.org/10.1109%2Ftpds.2018.2872064). S2CID 11827394 (https://api.semanticscholar.o
rg/CorpusID:11827394).
25. Besard, Tim (2018). Abstractions for Programming Graphics Processors in High-Level
Programming Languages (PhD). Ghent University.

Retrieved from "https://en.wikipedia.org/w/index.php?title=Flux_(machine-learning_framework)&oldid=1153227099"