# Rust Neural Network


This project implements a basic feedforward neural network in Rust. It's designed to be a clear and understandable example of how neural networks work, incorporating fundamental concepts like activation functions and backpropagation for training.

### features

- Activation Functions — Supports ReLU, Sigmoid, Tanh, and Linear activation functions (a minimal sketch follows this list).
- Feedforward Propagation — Calculates the output of the network given an input.
- Backpropagation Algorithm — Implements backpropagation for training the network.
- Weight Initialization — Uses He or Xavier scaling for initializing weights.
- Serialization — Network structure can be serialized and deserialized using `serde`.
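As a rough illustration of two of these building blocks, here is a minimal, self-contained Rust sketch of the four activation functions and He-style weight scaling. The names (`Activation`, `he_init`) and the use of the `rand` crate (0.8) are assumptions made for this sketch, not this repository's actual API.

```rust
// Illustrative only: names and structure are assumed, not taken from this repo.
// Assumes rand = "0.8" in Cargo.toml.
use rand::Rng;

/// The four activations listed above.
enum Activation {
    ReLU,
    Sigmoid,
    Tanh,
    Linear,
}

impl Activation {
    fn apply(&self, x: f64) -> f64 {
        match self {
            Activation::ReLU => x.max(0.0),
            Activation::Sigmoid => 1.0 / (1.0 + (-x).exp()),
            Activation::Tanh => x.tanh(),
            Activation::Linear => x,
        }
    }
}

/// He scaling: draw weights in a range proportional to sqrt(2 / fan_in),
/// typically paired with ReLU. Xavier scaling uses sqrt(1 / fan_in) instead.
fn he_init(fan_in: usize, fan_out: usize) -> Vec<Vec<f64>> {
    let scale = (2.0 / fan_in as f64).sqrt();
    let mut rng = rand::thread_rng();
    (0..fan_out)
        .map(|_| (0..fan_in).map(|_| rng.gen_range(-scale..scale)).collect())
        .collect()
}

fn main() {
    let act = Activation::ReLU;
    println!("relu(-0.3) = {}", act.apply(-0.3));
    let weights = he_init(4, 3); // 3 neurons, each taking 4 inputs
    println!("He-scaled first row: {:?}", weights[0]);
}
```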

### example usage

The `main.rs` example demonstrates how to create, train, and use the neural network for a simple task.
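For readers who want to see those mechanics end to end without cloning the repo, below is a self-contained sketch of the same idea: a tiny 2-3-1 sigmoid network trained on XOR with plain backpropagation. It deliberately avoids assuming this project's API; everything is spelled out inline, with the `rand` crate (0.8) as the only dependency.

```rust
// Standalone sketch of feedforward + backprop on XOR; not this repository's code.
// Assumes rand = "0.8" in Cargo.toml.
use rand::Rng;

const HIDDEN: usize = 3;

fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn main() {
    let data = [
        ([0.0, 0.0], 0.0),
        ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0),
        ([1.0, 1.0], 0.0),
    ];

    let mut rng = rand::thread_rng();
    // Hidden layer: HIDDEN neurons, each with 2 weights + 1 bias.
    let mut w1 = [[0.0f64; 3]; HIDDEN];
    // Output neuron: HIDDEN weights + 1 bias.
    let mut w2 = [0.0f64; HIDDEN + 1];
    for row in w1.iter_mut() {
        for w in row.iter_mut() {
            *w = rng.gen_range(-1.0..1.0);
        }
    }
    for w in w2.iter_mut() {
        *w = rng.gen_range(-1.0..1.0);
    }

    // Forward pass: returns hidden activations and the network output.
    let forward = |w1: &[[f64; 3]; HIDDEN], w2: &[f64; HIDDEN + 1], x: [f64; 2]| {
        let mut h = [0.0f64; HIDDEN];
        for j in 0..HIDDEN {
            h[j] = sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + w1[j][2]);
        }
        let mut sum = w2[HIDDEN]; // output bias
        for j in 0..HIDDEN {
            sum += w2[j] * h[j];
        }
        (h, sigmoid(sum))
    };

    // Train with per-example gradient descent on squared error.
    let lr = 0.5;
    for _ in 0..20_000 {
        for &(x, target) in &data {
            let (h, out) = forward(&w1, &w2, x);
            // Output delta: error times sigmoid derivative.
            let d_out = (out - target) * out * (1.0 - out);
            for j in 0..HIDDEN {
                // Hidden delta: output delta propagated back through w2[j].
                let d_h = d_out * w2[j] * h[j] * (1.0 - h[j]);
                w1[j][0] -= lr * d_h * x[0];
                w1[j][1] -= lr * d_h * x[1];
                w1[j][2] -= lr * d_h;
            }
            for j in 0..HIDDEN {
                w2[j] -= lr * d_out * h[j];
            }
            w2[HIDDEN] -= lr * d_out;
        }
    }

    // Use the trained network.
    for &(x, target) in &data {
        let (_, out) = forward(&w1, &w2, x);
        println!("{:?} -> {:.3} (target {})", x, out, target);
    }
}
```

The real `main.rs` presumably works at a higher level (constructing a network, calling its training and prediction methods), but the forward and backward passes it performs follow the same pattern as the loops above.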

It's a foundational project for anyone looking to understand the mechanics of neural networks from a low-level perspective using Rust.