Implementing and Training a Neural Network with PyTorch
Building and Training a Feed-Forward Neural Network on the MNIST Dataset
By KAHSAY, August 28, 2024
Introduction
Overview

• In this presentation, we will go through the process of building and training a feed-forward neural network using PyTorch.
• Our objective is to classify handwritten digits using the MNIST dataset.
• We'll explain each step, method, and function in the implementation.
Step 1: Downloading the Dataset
Downloading the Dataset

• The download_mnist_datasets function is responsible for downloading the MNIST dataset.
• The MNIST dataset contains images of handwritten digits (0-9) and is commonly used for training image processing systems.

    from torchvision import datasets
    from torchvision.transforms import ToTensor

    def download_mnist_datasets():
        # training split: 60,000 labelled digit images
        train_data = datasets.MNIST(
            root="data",
            train=True,
            download=True,
            transform=ToTensor(),
        )
        # test split, used here for validation: 10,000 images
        validation_data = datasets.MNIST(
            root="data",
            train=False,
            download=True,
            transform=ToTensor(),
        )
        return train_data, validation_data
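Assuming the completed function above (which returns both splits), a quick sanity check might look like this; the printed counts are the standard MNIST split sizes:

    # hypothetical usage: fetch both splits and confirm their sizes
    train_data, validation_data = download_mnist_datasets()
    print(len(train_data))       # 60000
    print(len(validation_data))  # 10000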
Step 2: Creating the Data Loader
Creating the Data Loader

• The create_data_loader function creates a data loader for the training data.
• Data loaders are used to batch (and optionally shuffle) the data, making it easier to manage during training.

    from torch.utils.data import DataLoader

    def create_data_loader(train_data, batch_size):
        # wrap the dataset so samples are served in mini-batches
        train_dataloader = DataLoader(train_data, batch_size=batch_size)
        return train_dataloader
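To make the batching concrete, here is a minimal sketch that pulls a single batch from the loader and inspects its shape; the batch size of 128 is an illustrative assumption:

    # each MNIST sample is a 1x28x28 tensor after ToTensor()
    train_dataloader = create_data_loader(train_data, batch_size=128)
    images, labels = next(iter(train_dataloader))
    print(images.shape)  # torch.Size([128, 1, 28, 28])
    print(labels.shape)  # torch.Size([128])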
Step 3: Building the Model
Building the Model

• The FeedForwardNet class defines the architecture of our feed-forward neural network.
• The network consists of a flattening layer, two dense (fully connected) layers, and a softmax layer for output.

    import torch.nn as nn

    class FeedForwardNet(nn.Module):

        def __init__(self):
            super().__init__()
            self.flatten = nn.Flatten()
            self.dense_layers = nn.Sequential(
                nn.Linear(28 * 28, 256),
                nn.ReLU(),
                nn.Linear(256, 10),
            )
            self.softmax = nn.Softmax(dim=1)

        def forward(self, input_data):
            # flatten each 28x28 image into a 784-dimensional vector
            x = self.flatten(input_data)
            logits = self.dense_layers(x)
            # convert logits to class probabilities; note that
            # nn.CrossEntropyLoss applies log-softmax internally,
            # so when training with that loss the explicit softmax
            # here is redundant
            predictions = self.softmax(logits)
            return predictions
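Before training, the network must be instantiated and moved to the target device. A minimal sketch, assuming we fall back to the CPU when no GPU is available:

    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"
    feed_forward_net = FeedForwardNet().to(device)
    print(feed_forward_net)  # prints the layer structure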
Step 4: Training the Model
Training the Model

• The train_single_epoch function trains the model for a single epoch.
• It performs a forward pass, computes the loss, backpropagates the error, and updates the model weights.

    def train_single_epoch(model, data_loader, loss_fn, optimiser, device):
        for input, target in data_loader:
            input, target = input.to(device), target.to(device)

            # calculate loss
            prediction = model(input)
            loss = loss_fn(prediction, target)

            # backpropagate error and update weights
            optimiser.zero_grad()
            loss.backward()
            optimiser.step()

        # report the loss of the last batch in the epoch
        print(f"loss: {loss.item()}")
Training the Model (Continued)

• The train function controls the training process over multiple epochs.
• It repeatedly calls the train_single_epoch function and prints the training progress; a sketch of the surrounding driver code follows the listing.

    def train(model, data_loader, loss_fn, optimiser, device, epochs):
        for i in range(epochs):
            print(f"Epoch {i+1}")
            train_single_epoch(model, data_loader, loss_fn, optimiser, device)
            print("---------------------------")
        print("Finished training")
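The presentation does not show the driver code that wires these pieces together, so the following is only a sketch: the batch size (128), learning rate (0.001), epoch count (10), and the choice of the Adam optimiser are illustrative assumptions. nn.CrossEntropyLoss matches the 10-class digit labels:

    import torch
    import torch.nn as nn

    # hypothetical hyperparameters, chosen for illustration
    BATCH_SIZE = 128
    EPOCHS = 10
    LEARNING_RATE = 0.001

    train_data, _ = download_mnist_datasets()
    train_dataloader = create_data_loader(train_data, BATCH_SIZE)

    device = "cuda" if torch.cuda.is_available() else "cpu"
    feed_forward_net = FeedForwardNet().to(device)

    loss_fn = nn.CrossEntropyLoss()
    optimiser = torch.optim.Adam(feed_forward_net.parameters(), lr=LEARNING_RATE)

    train(feed_forward_net, train_dataloader, loss_fn, optimiser, device, EPOCHS)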
Step 5: Saving the Model
Saving the Model

• After training, we save the trained model's parameters using torch.save.
• The model can be loaded later for inference or further training.

    torch.save(feed_forward_net.state_dict(), "feedforwardnet.pth")
    print("Trained feed forward net saved at feedforwardnet.pth")
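To illustrate the second bullet, here is a minimal sketch of loading the saved parameters back for inference; the sample image drawn from validation_data and the surrounding eval/no_grad usage are assumptions for the example:

    import torch

    # rebuild the architecture, then load the trained parameters into it
    loaded_net = FeedForwardNet()
    loaded_net.load_state_dict(torch.load("feedforwardnet.pth"))
    loaded_net.eval()  # switch to inference mode

    # classify one validation image (hypothetical usage)
    image, label = validation_data[0]
    with torch.no_grad():
        prediction = loaded_net(image.unsqueeze(0))  # add a batch dimension
    predicted_digit = prediction.argmax(dim=1).item()
    print(f"predicted: {predicted_digit}, expected: {label}")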
Conclusion
Conclusion

• In this presentation, we covered how to implement and train a neural network using PyTorch.
• We used the MNIST dataset to train a simple feed-forward neural network.
• Each method and function was explained in detail, showing how it contributes to the overall process.
