Regression with a Neural Network using Keras
This code demonstrates how to build a simple neural network for a regression task using Keras. It generates sample data, defines a neural network model, trains the model on the data, and then makes predictions on new data. This is a foundational example for understanding regression problems with neural networks.
Import Libraries
This imports the necessary libraries: numpy for numerical operations, and tensorflow and keras for building the neural network.
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
Generate Sample Data
This creates sample data for a linear regression problem. X represents the input features, and y represents the target variable with some added noise. The data is then split into training and testing sets.
X = np.linspace(-5, 5, 100)
y = 0.5 * X + np.random.normal(0, 1, 100)
X_train = X[:80]
y_train = y[:80]
X_test = X[80:]
y_test = y[80:]
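Note that the split above is not shuffled, so the test points all come from one end of the range. If you prefer a shuffled split, here is a minimal sketch (the seeded random generator and the 80/20 ratio are assumptions, reusing X and y from above):
rng = np.random.default_rng(42)  # seeded generator so the shuffled split is reproducible (assumed seed)
shuffled = rng.permutation(len(X))  # random ordering of the 100 sample indices
train_idx, test_idx = shuffled[:80], shuffled[80:]
X_train, y_train = X[train_idx], y[train_idx]
X_test, y_test = X[test_idx], y[test_idx]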
Define the Model
This defines a simple neural network model with one hidden layer. The input shape is set to 1 because we have one input feature. The hidden layer has 10 neurons with a ReLU activation function. The output layer has one neuron because it's a regression problem.
model = keras.Sequential([
layers.Dense(10, activation='relu', input_shape=[1]),
layers.Dense(1)
])
Compile the Model
This compiles the model, specifying the optimizer and loss function. The Adam optimizer is a popular choice, and Mean Squared Error (MSE) is commonly used for regression problems.
model.compile(optimizer='adam', loss='mse')
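The string shortcuts above use the optimizer's default settings. If you want to control the learning rate or report an extra metric, an equivalent variant looks like this (the learning rate of 0.01 and the MAE metric are illustrative choices, not part of the original snippet):
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.01),  # explicit optimizer object with a chosen learning rate
    loss='mse',
    metrics=['mae']  # mean absolute error, reported alongside the loss during training
)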
Train the Model
This trains the model on the training data for a specified number of epochs (full passes over the training set). The verbose=0 argument suppresses the per-epoch training output.
model.fit(X_train, y_train, epochs=100, verbose=0)
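model.fit also returns a History object whose history dictionary records the loss for each epoch. A small sketch of using it (the variable names are illustrative):
history = model.fit(X_train, y_train, epochs=100, verbose=0)  # keep the returned History object
losses = history.history['loss']  # one loss value per epoch
print("Initial loss:", losses[0])
print("Final loss:", losses[-1])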
Make Predictions
This uses the trained model to make predictions on the test data and prints the predictions.
predictions = model.predict(X_test)
print("Predictions:", predictions)
Concepts Behind the Snippet
This code demonstrates the key concepts behind neural network regression:
- Defining a model as a stack of layers (keras.Sequential with Dense layers).
- Using a ReLU activation in the hidden layer and a single linear output neuron for a continuous target.
- Compiling the model with an optimizer (Adam) and a regression loss (MSE).
- Training the model for a number of epochs with model.fit.
- Making predictions on unseen data with model.predict.
Real-Life Use Case
Neural network regression can be used for a wide range of real-world applications, such as:
- Predicting house prices from features like size, age, and location.
- Forecasting demand or sales figures.
- Estimating energy consumption of a building or device.
- Predicting continuous physical or medical measurements from sensor data.
A minimal multi-feature sketch follows this list.
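As a hedged illustration of the multi-feature case, the sketch below fits the same kind of model to synthetic data with two input features; the data, layer sizes, and epoch count are arbitrary assumptions for demonstration only.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic "house price"-style data: 200 samples with 2 features each.
rng = np.random.default_rng(0)
X_multi = rng.uniform(0, 1, size=(200, 2))
y_multi = 3.0 * X_multi[:, 0] - 1.5 * X_multi[:, 1] + rng.normal(0, 0.1, size=200)

# Same pattern as the main snippet, but input_shape=[2] for two input features.
multi_model = keras.Sequential([
    layers.Dense(10, activation='relu', input_shape=[2]),
    layers.Dense(1)
])
multi_model.compile(optimizer='adam', loss='mse')
multi_model.fit(X_multi, y_multi, epochs=50, verbose=0)

print(multi_model.predict(X_multi[:3]))  # predictions for the first three samples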
Best Practices
- Normalize or standardize the input features before training.
- Hold out a validation set (for example with validation_split) to monitor generalization.
- Watch for overfitting and consider early stopping.
- Shuffle the data before splitting it into training and test sets.
- Tune hyperparameters such as the number of neurons, learning rate, and number of epochs.
A sketch applying several of these practices follows this list.
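Here is a minimal sketch combining input normalization, a validation split, and early stopping on the data from this snippet; the layer sizes, patience, and epoch budget are assumptions chosen for illustration.
from tensorflow import keras
from tensorflow.keras import layers

X_train_2d = X_train.reshape(-1, 1)  # explicit (n_samples, 1) feature matrix

# The Normalization layer learns the training-set mean and variance via adapt().
normalizer = layers.Normalization(input_shape=[1], axis=-1)
normalizer.adapt(X_train_2d)

model_bp = keras.Sequential([
    normalizer,
    layers.Dense(10, activation='relu'),
    layers.Dense(1)
])
model_bp.compile(optimizer='adam', loss='mse')

# Hold out 20% of the training data for validation and stop when val_loss stops improving.
early_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
                                           restore_best_weights=True)
model_bp.fit(X_train_2d, y_train, epochs=500, verbose=0,
             validation_split=0.2, callbacks=[early_stop])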
Interview Tip
Be prepared to discuss:
- Why MSE is a suitable loss function for regression and how it differs from MAE.
- Why the output layer has a single neuron with no activation function.
- The role of the ReLU activation in the hidden layer.
- How you would detect and reduce overfitting (validation data, early stopping, regularization).
- How the Adam optimizer updates the weights during training.
When to Use Them
Use neural network regression when you have a continuous target variable and complex, non-linear relationships between the input features and the target. Neural networks are particularly useful when linear models are not sufficient; the sketch below contrasts the two on non-linear data.
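As a hedged illustration, this sketch compares a purely linear model (no hidden layer) with a hidden-layer model on data generated from a sine curve; the data, layer width, and epoch count are assumptions for demonstration.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Non-linear synthetic data: y = sin(x) plus noise.
X_nl = np.linspace(-3, 3, 200)
y_nl = np.sin(X_nl) + np.random.normal(0, 0.1, 200)

linear_model = keras.Sequential([layers.Dense(1, input_shape=[1])])  # no hidden layer
nn_model = keras.Sequential([
    layers.Dense(32, activation='relu', input_shape=[1]),
    layers.Dense(1)
])

for m in (linear_model, nn_model):
    m.compile(optimizer='adam', loss='mse')
    m.fit(X_nl, y_nl, epochs=500, verbose=0)

# The hidden-layer model can bend to follow the curve, so it typically reaches a lower MSE.
print("Linear-only MSE:", linear_model.evaluate(X_nl, y_nl, verbose=0))
print("Hidden-layer MSE:", nn_model.evaluate(X_nl, y_nl, verbose=0))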
Memory Footprint
The memory footprint depends on the size of the model (the number of layers and neurons, which determines the number of trainable parameters) and the size of the training data. Smaller models and smaller datasets require less memory.
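One quick way to gauge the model side of this is to count its parameters. A small sketch using the model from this snippet:
model.summary()  # prints per-layer output shapes and parameter counts
print("Total parameters:", model.count_params())
# For this snippet: Dense(10) has 1*10 + 10 = 20 parameters and Dense(1) has 10*1 + 1 = 11,
# so the model has 31 parameters in total.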
Alternatives
- Linear or polynomial regression.
- Decision trees and random forests.
- Gradient boosting methods (for example XGBoost or LightGBM).
- Support vector regression (SVR).
A linear-regression baseline sketch follows this list.
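The sketch below uses scikit-learn's LinearRegression as such a baseline (assuming scikit-learn is installed; it is not otherwise used in this snippet):
from sklearn.linear_model import LinearRegression

lin_reg = LinearRegression()
lin_reg.fit(X_train.reshape(-1, 1), y_train)  # scikit-learn expects a 2-D feature matrix

print("Learned slope:", lin_reg.coef_[0], "intercept:", lin_reg.intercept_)
print("Baseline predictions:", lin_reg.predict(X_test.reshape(-1, 1)))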
Pros
- Can model complex, non-linear relationships between features and the target.
- Scales well to large datasets and many input features.
- Flexible architecture that can be extended with more layers or different activations.
Cons
- Typically needs more data and more tuning than simpler models.
- Harder to interpret than linear regression or decision trees.
- Prone to overfitting on small datasets.
- Training is more computationally expensive.
FAQ
- What does MSE stand for?
MSE stands for Mean Squared Error, a common loss function used in regression tasks. It calculates the average squared difference between the predicted and actual values; a short sketch computing it by hand appears after this FAQ.
- Why is the input_shape set to [1]?
The input_shape is set to [1] because we have one input feature (the X value) for each data point.
- How do I choose the number of neurons in the hidden layer?
The number of neurons in the hidden layer is a hyperparameter that needs to be tuned. You can experiment with different values and evaluate the model's performance on a validation set to find the best value.
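To make the MSE answer above concrete, here is a small sketch computing it by hand with NumPy, reusing the predictions and y_test from this snippet:
# Mean Squared Error: the average of the squared differences between
# the predicted and the actual target values.
mse = np.mean((predictions.flatten() - y_test) ** 2)
print("Manually computed test MSE:", mse)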