Task: Implement a Simple RNN with Backpropagation Through Time (BPTT)
Your task is to implement a simple Recurrent Neural Network (RNN) and backpropagation through time (BPTT) to learn from sequential data. The RNN will process input sequences, update hidden states, and perform backpropagation to adjust weights based on the error gradient.
Write a class SimpleRNN with the following methods:
__init__(self, input_size, hidden_size, output_size): Initializes the RNN with random weights and zero biases.
forward(self, x): Processes a sequence of inputs and returns the hidden states and output.
backward(self, x, y, learning_rate): Performs backpropagation through time (BPTT) to adjust the weights based on the loss.
In this task, the RNN will be trained on sequence prediction, where the network will learn to predict the next item in a sequence. You should use 1/2 * Mean Squared Error (MSE) as the loss function and make sure to aggregate the losses at each time step by summing.
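As a concrete reading of the loss requirement, the summed 1/2 * MSE over a sequence could be computed as sketched below (the helper name sequence_loss is hypothetical and not part of the required interface):

```python
import numpy as np

def sequence_loss(predictions, targets):
    # Total loss = sum over time steps of 1/2 * squared error,
    # i.e. per-step losses are aggregated by summing, as required.
    return sum(0.5 * np.sum((p - t) ** 2)
               for p, t in zip(predictions, targets))
```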
Examples
Example 1:
Input:
import numpy as np
input_sequence = np.array([[1.0], [2.0], [3.0], [4.0]])
expected_output = np.array([[2.0], [3.0], [4.0], [5.0]])
# Initialize RNN
rnn = SimpleRNN(input_size=1, hidden_size=5, output_size=1)
# Forward pass
output = rnn.forward(input_sequence)
# Backward pass
rnn.backward(input_sequence, expected_output, learning_rate=0.01)
print(output)
# The output should show the RNN predictions for each step of the input sequence.Output:
[[x1], [x2], [x3], [x4]]Explanation: The RNN processes the input sequence [1.0, 2.0, 3.0, 4.0] and predicts the next item in the sequence at each step.
Starter Code
import numpy as np

class SimpleRNN:
    def __init__(self, input_size, hidden_size, output_size):
        """
        Initializes the RNN with random weights and zero biases.
        """
        self.hidden_size = hidden_size
        self.W_xh = np.random.randn(hidden_size, input_size) * 0.01
        self.W_hh = np.random.randn(hidden_size, hidden_size) * 0.01
        self.W_hy = np.random.randn(output_size, hidden_size) * 0.01
        self.b_h = np.zeros((hidden_size, 1))
        self.b_y = np.zeros((output_size, 1))

    def forward(self, x):
        """
        Forward pass through the RNN for a given sequence of inputs.
        """
        pass

    def backward(self, x, y, learning_rate):
        """
        Backpropagation through time to adjust weights based on error gradient.
        """
        pass
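For reference, one possible way the two stubs could be filled in is sketched below. The tanh hidden activation and the linear output layer are assumptions (the prompt does not fix an activation), and the class is named SimpleRNNSketch to keep it distinct from the starter code:

```python
import numpy as np

class SimpleRNNSketch:
    """A solution sketch, not the only valid one: tanh hidden
    activation and a linear readout are assumed choices."""

    def __init__(self, input_size, hidden_size, output_size):
        self.hidden_size = hidden_size
        self.W_xh = np.random.randn(hidden_size, input_size) * 0.01
        self.W_hh = np.random.randn(hidden_size, hidden_size) * 0.01
        self.W_hy = np.random.randn(output_size, hidden_size) * 0.01
        self.b_h = np.zeros((hidden_size, 1))
        self.b_y = np.zeros((output_size, 1))

    def forward(self, x):
        # h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h); y_t = W_hy h_t + b_y
        h = np.zeros((self.hidden_size, 1))
        self.hs, self.ys = [h], []  # hs[0] is the initial zero state
        for x_t in x:
            x_t = x_t.reshape(-1, 1)
            h = np.tanh(self.W_xh @ x_t + self.W_hh @ h + self.b_h)
            self.hs.append(h)
            self.ys.append(self.W_hy @ h + self.b_y)
        return np.array([y_t.flatten() for y_t in self.ys])

    def backward(self, x, y, learning_rate):
        dW_xh = np.zeros_like(self.W_xh)
        dW_hh = np.zeros_like(self.W_hh)
        dW_hy = np.zeros_like(self.W_hy)
        db_h = np.zeros_like(self.b_h)
        db_y = np.zeros_like(self.b_y)
        dh_next = np.zeros((self.hidden_size, 1))
        # Walk the sequence backwards, accumulating gradients of the
        # summed 1/2 * MSE loss over all time steps.
        for t in reversed(range(len(x))):
            dy = self.ys[t] - y[t].reshape(-1, 1)  # d(1/2 * SE)/dy_t
            dW_hy += dy @ self.hs[t + 1].T
            db_y += dy
            dh = self.W_hy.T @ dy + dh_next        # local + carried gradient
            dz = (1 - self.hs[t + 1] ** 2) * dh    # tanh derivative
            dW_xh += dz @ x[t].reshape(-1, 1).T
            dW_hh += dz @ self.hs[t].T
            db_h += dz
            dh_next = self.W_hh.T @ dz             # pass gradient to t-1
        for param, grad in ((self.W_xh, dW_xh), (self.W_hh, dW_hh),
                            (self.W_hy, dW_hy), (self.b_h, db_h),
                            (self.b_y, db_y)):
            param -= learning_rate * grad          # plain gradient descent
```

With this sketch, repeated forward/backward calls on the example sequence from above should reduce the summed loss, since each backward step moves the weights along the negative gradient.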