Sean Moriarity, creator of the Axon deep learning framework, co-creator of the Nx library, and author of Machine Learning in Elixir and Genetic Algorithms in Elixir, published by the Pragmatic Bookshelf, speaks with SE Radio host Gavin Henry about what deep learning (neural networks) means today. Using a practical fraud-detection example, they explore what Axon is and why it was created. Moriarity describes why the BEAM is ideal for machine learning and why he dislikes the term "neural network." They discuss the need for deep learning, its history, how it fits many of today's complex problems, where it shines, and when not to use it. Moriarity goes into depth on a range of topics, including how to get datasets into shape, supervised and unsupervised learning, feed-forward neural networks, Nx.Serving, decision trees, gradient descent, linear regression, logistic regression, support vector machines, and random forests. The episode considers what a model looks like, what training is, labeling, classification, regression tasks, the hardware resources needed, EXGBoost, Jax, PyIgnite, and Explorer. Finally, they look at what's involved in the ongoing lifecycle or operational side of Axon once a workflow is put into production, so you can safely back it all up and feed in new data.
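The episode itself contains no code, but a minimal Axon sketch gives a feel for the fraud-detection workflow discussed. The 30-feature input, layer sizes, and the train_data stream are illustrative assumptions, not anything specified in the episode:

    # Minimal sketch of a feed-forward binary classifier in Axon,
    # shaped like a fraud-detection task. Feature count and layer
    # sizes are assumptions for illustration.

    # Each transaction is a row of 30 numeric features; `nil` leaves
    # the batch dimension flexible.
    model =
      Axon.input("transaction", shape: {nil, 30})
      |> Axon.dense(64, activation: :relu)
      |> Axon.dropout(rate: 0.25)
      |> Axon.dense(1, activation: :sigmoid)

    # `train_data` is assumed to be an Enumerable of {features, labels}
    # tensor pairs, e.g. prepared with Explorer and Stream.
    trained_params =
      model
      |> Axon.Loop.trainer(:binary_cross_entropy, :adam)
      |> Axon.Loop.metric(:accuracy)
      |> Axon.Loop.run(train_data, %{}, epochs: 10)

Since Nx.Serving comes up as the production-facing piece, here is a sketch of wrapping the trained parameters for inference (again, the input values are placeholders):

    # Compile the model into {init_fn, predict_fn} and wrap prediction
    # in an Nx.Serving so incoming requests can be batched in production.
    {_init_fn, predict_fn} = Axon.build(model)

    serving = Nx.Serving.jit(fn batch -> predict_fn.(trained_params, batch) end)

    # Score one (hypothetical) transaction; 0.5s are placeholder features.
    batch = Nx.Batch.stack([Nx.tensor(List.duplicate(0.5, 30))])
    fraud_probability = Nx.Serving.run(serving, batch)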
This episode is sponsored by Miro.
Show Notes
Related Links
Tags: axon, beam, bumblebee, Decision Trees, deep learning, elixir, erlang, EXGBoost, Gradient Descent, Jax, labeling, Linear Regression, Logistic Regression, machine learning, neural networks, nx, oban, PyIgnite, Random Forests, tensors