Welcome to SilvaNet, a lightweight Python library designed to make deep learning concepts easy to grasp and apply. Whether you’re diving into neural networks for the first time or looking to streamline your educational tools, SilvaNet offers a simplified framework for constructing, training, and evaluating models.
- Autograd Support: Enjoy seamless gradient computation with our autograd-enabled tensor class, simplifying backpropagation and model training (see the sketch after this list).
- Intuitive API: Build neural networks effortlessly with our user-friendly API. Create models, apply operations, and handle data with minimal code.
- Element-wise Operations: Perform a variety of element-wise operations such as addition, subtraction, and multiplication on tensors.
- Activation Functions: Utilize essential activation functions like sigmoid, tanh, and ReLU to introduce non-linearity into your models.
- Loss Functions: Implement popular loss functions such as softmax cross-entropy for classification tasks and optimize your models effectively.
- Flexible and Extensible: Customize your neural network with different layers, activation functions, and optimization algorithms. SilvaNet encourages experimentation.
- Model Management: Save and load your trained models easily, enabling efficient reuse and sharing (a hypothetical save/load sketch follows the quickstart below).
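To give a feel for the autograd tensor and its element-wise operations, here is a minimal sketch. The `autograd=True` flag, the `backward()` signature, the `.grad` attribute, and the `sigmoid()` method are assumptions based on common educational autograd designs, not confirmed SilvaNet API; check the source if they differ.

from autograd import Tensor

# NOTE: the autograd flag, backward() signature, .grad attribute, and
# sigmoid() method are assumed here; consult the SilvaNet source for
# the exact Tensor API.
a = Tensor([1.0, 2.0, 3.0], autograd=True)
b = Tensor([4.0, 5.0, 6.0], autograd=True)

c = (a * b) + a                        # element-wise multiply and add
d = c.sigmoid()                        # element-wise activation

d.backward(Tensor([1.0, 1.0, 1.0]))    # seed the backward pass with ones
print(a.grad)                          # gradient of d with respect to a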
Here’s a quick example to get you started with SilvaNet:
from autograd import Tensor
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from nn.Layers import Sequential, Dense
from nn.losses import CrossEntropyLoss
from nn.optimizer import SGD
from Network import NeuralNetwork
# Load Breast Cancer dataset
data = load_breast_cancer()
X = data.data
y = data.target
# Split the data into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Normalize features
max_vals = np.max(X_train, axis=0)  # use training-set maxima for both splits to avoid test-set leakage
X_train = X_train / max_vals
X_test = X_test / max_vals
# Define the neural network architecture
model = Sequential()
model.add(Dense(n_inputs=X_train.shape[1], n_units=64, activation='relu'))
model.add(Dense(n_inputs=64, n_units=32, activation='relu'))
model.add(Dense(n_inputs=32, n_units=2))  # Output layer emits raw logits; the softmax cross-entropy loss applies softmax internally
# Define loss function and optimizer
loss_fn = CrossEntropyLoss()
optimizer = SGD(parameters=model.get_parameters(), alpha=0.01)  # alpha is the learning rate
# Create a neural network instance
nn = NeuralNetwork(model, loss_fn, optimizer)
# Compile the model
nn.compile(loss_fn, optimizer)
# Print model summary
nn.summary()
# Train the model
nn.fit(X_train, y_train, epochs=100, batch_size=16)
# Evaluate the model
nn.evaluate(X_test, y_test)
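The quickstart stops at evaluation. To round out the model-management feature listed above, here is a hypothetical sketch of persisting and reloading a trained network; the `save`, `load`, and `predict` method names are assumptions, not confirmed SilvaNet API.

# NOTE: save(), load(), and predict() are assumed method names; check the
# SilvaNet source for the actual model-management API.
nn.save('breast_cancer.model')            # persist the trained weights
restored = NeuralNetwork(model, loss_fn, optimizer)
restored.load('breast_cancer.model')      # restore weights into a fresh instance
predictions = restored.predict(X_test)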
Install SilvaNet by cloning the repository:
git clone https://github.com/silvaxxx1/SilavaNet
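The quickstart above also relies on NumPy and scikit-learn (the latter only for the demo dataset and the train/test split). Assuming you work from the cloned repository root, they can be installed with pip:

cd SilavaNet
pip install numpy scikit-learn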
For detailed guides and API references, visit our documentation.
We welcome contributions! If you have suggestions, bug reports, or want to contribute code, please review our contributing guidelines.
SilvaNet is licensed under the MIT License. See the LICENSE file for more details.