Andrew Ng deeplearning.ai course assignment: Class 1 Week 3 assignment
My own answers to the Andrew Ng deeplearning.ai course assignments.
Notes:
1. People keep asking in the comments why these notebooks fail when copied directly. Please don't copy-paste; they cannot run on their own. What's posted here is only the part of the notebook that we are asked to write ourselves; running it also requires the other .py files, so please download the complete assignment from GitHub. This is for reference only: I recommend writing the code yourself following the hints, and only looking at the answers if you are truly stuck. In my view that is the right way to learn, and the assignments aren't that hard anyway.
2. To those commenting that this is plagiarism, that my comments are less detailed than others', or that the code doesn't run when copied: before demanding ready-made code, please understand what this assignment is. Everyone downloads the same original assignment from GitHub and fills in code following the hints before each block (which usually specify the function and the formula), and there is an expected output to compare against, so correct solutions generally look alike. Please don't mindlessly accuse me of copying someone else's answers. In the end, all we do is read the text, follow the hints, and add a small amount of our own code.
3. Because I really dislike mindless trolls, I have disabled comments below; my apologies. If you have questions, send me a private message and I will help where I can.
Planar data classification with one hidden layer
Welcome to your week 3 programming assignment. It's time to build your first neural network, which will have a hidden layer. You will see a big difference between this model and the one you implemented using logistic regression.
You will learn how to:
- Implement a 2-class classification neural network with a single hidden layer
- Use units with a non-linear activation function, such as tanh
- Compute the cross entropy loss
- Implement forward and backward propagation
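For example, tanh squashes its input into (-1, 1) and is zero-centered, which tends to make hidden-layer optimization easier than the sigmoid's (0, 1) range. A quick numpy illustration (not part of the graded notebook):
import numpy as np
z = np.array([-2.0, 0.0, 2.0])
print(np.tanh(z))            # [-0.96402758  0.          0.96402758] -- zero-centered
print(1 / (1 + np.exp(-z)))  # [ 0.11920292  0.5         0.88079708] -- sigmoid, not zero-centered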
1 - Packages
Let’s first import all the packages that you will need during this assignment.
- numpy is the fundamental package for scientific computing with Python.
- sklearn provides simple and efficient tools for data mining and data analysis.
- matplotlib is a library for plotting graphs in Python.
- testCases provides some test examples to assess the correctness of your functions
- planar_utils provides various useful functions used in this assignment
# Package imports
import numpy as np
import matplotlib.pyplot as plt
from testCases import *
import sklearn
import sklearn.datasets
import sklearn.linear_model
from planar_utils import plot_decision_boundary, sigmoid, load_planar_dataset, load_extra_datasets
%matplotlib inline
np.random.seed(1) # set a seed so that the results are consistent
2 - Dataset
First, let's get the dataset you will work on. The following code will load a "flower" 2-class dataset into variables X and Y.
X, Y = load_planar_dataset()
Visualize the dataset using matplotlib. The data looks like a "flower" with some red (label y=0) and some blue (y=1) points. Your goal is to build a model to fit this data.
# Visualize the data:
plt.scatter(X[0, :], X[1, :], c=Y.ravel(), s=40, cmap=plt.cm.Spectral);  # ravel() flattens Y so matplotlib accepts it as a color array
You have:
- a numpy-array (matrix) X that contains your features (x1, x2)
- a numpy-array (vector) Y that contains your labels (red:0, blue:1).
Let's first get a better sense of what our data is like.
Exercise: How many training examples do you have? In addition, what are the shapes of the variables X and Y? Hint: How do you get the shape of a numpy array?
### START CODE HERE ### (≈ 3 lines of code)
shape_X = X.shape
shape_Y = Y.shape
m = shape_X[1] # training set size
### END CODE HERE ###
print ('The shape of X is: ' + str(shape_X))
print ('The shape of Y is: ' + str(shape_Y))
print ('I have m = %d training examples!' % (m))
The shape of X is: (2, 400)
The shape of Y is: (1, 400)
I have m = 400 training examples!
Expected Output:
shape of X: (2, 400)
shape of Y: (1, 400)
m: 400
3 - Simple Logistic Regression
Before building a full neural network, let's first see how logistic regression performs on this problem. You can use sklearn's built-in functions to do that. Run the code below to train a logistic regression classifier on the dataset.
# Train the logistic regression classifier
clf = sklearn.linear_model.LogisticRegressionCV();
# clf.fit(X.T, Y.T);  # the original notebook line; newer sklearn expects a 1-D label array
clf.fit(X.T, Y.T.ravel());
You can now plot the decision boundary of the model. Run the code below.
# Plot the decision boundary for logistic regression
plot_decision_boundary(lambda x: clf.predict(x), X, Y)
plt.title("Logistic Regression")
# Print accuracy
LR_predictions = clf.predict(X.T)
print ('Accuracy of logistic regression: %d ' % float((np.dot(Y,LR_predictions) + np.dot(1-Y,1-LR_predictions))/float(Y.size)*100) + '% ' + "(percentage of correctly labelled datapoints)")
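A note on the accuracy arithmetic above: np.dot(Y, LR_predictions) counts the correctly predicted 1s (true positives) and np.dot(1-Y, 1-LR_predictions) counts the correctly predicted 0s (true negatives), so their sum divided by Y.size is the fraction of correct labels. An equivalent, arguably more readable check (a sketch, not part of the graded notebook):
print('Accuracy: %.0f%%' % (100 * np.mean(LR_predictions == Y.ravel())))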
Accuracy of logistic regression: 47 % (percentage of correctly labelled datapoints)
Expected Output:
Accuracy: 47%
Interpretation: The dataset is not linearly separable, so logistic regression doesn't perform well. Hopefully a neural network will do better. Let's try this now!
4 - Neural Network model
Logistic regression did not work well on the “flower datat”. You are going to train a Neural Network with a single hidden layer.
Here is our model: a neural network with a single hidden layer (tanh activation) followed by a sigmoid output unit.
Mathematically, for one example $x^{(i)}$:
$$z^{[1](i)} = W^{[1]} x^{(i)} + b^{[1]}$$
$$a^{[1](i)} = \tanh(z^{[1](i)})$$
$$z^{[2](i)} = W^{[2]} a^{[1](i)} + b^{[2]}$$
$$\hat{y}^{(i)} = a^{[2](i)} = \sigma(z^{[2](i)})$$
$$y^{(i)}_{prediction} = \begin{cases} 1 & \text{if } a^{[2](i)} > 0.5 \\ 0 & \text{otherwise} \end{cases}$$
Given the predictions on all the examples, you can also compute the cost $J$ as follows:
$$J = -\frac{1}{m} \sum_{i=1}^{m} \left( y^{(i)} \log(a^{[2](i)}) + (1 - y^{(i)}) \log(1 - a^{[2](i)}) \right)$$
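For reference, a minimal numpy sketch of this cross-entropy cost, assuming A2 holds the sigmoid outputs for all examples with shape (1, m) and reusing the np import from Section 1:
def cross_entropy_cost(A2, Y):
    m = Y.shape[1]                                        # number of examples
    logprobs = Y * np.log(A2) + (1 - Y) * np.log(1 - A2)  # per-example log-likelihood
    cost = -np.sum(logprobs) / m                          # average negative log-likelihood
    return float(np.squeeze(cost))                        # make sure cost is a plain scalar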
Reminder: The general methodology to build a Neural Network is to:
1. Define the neural network structure (# of input units, # of hidden units, etc).
2. Initialize the model’s parameters
3. Loop:
- Implement forward propagation
- Compute loss
- Implement backward propagation to get the gradients
- Update parameters (gradient descent)
You often build helper functions to compute steps 1-3 and then merge them into one function we call nn_model(). Once you’ve built nn_model() and learnt the right parameters, you can make predictions on new data.
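As a preview, here is a minimal sketch of how those helpers fit together inside nn_model(). The helper names (forward_propagation, compute_cost, backward_propagation, update_parameters) follow this notebook; their bodies are implemented in the later sections:
def nn_model_sketch(X, Y, n_h, num_iterations=10000, learning_rate=1.2):
    n_x, _, n_y = layer_sizes(X, Y)                      # 1. define the structure
    parameters = initialize_parameters(n_x, n_h, n_y)    # 2. initialize parameters
    for i in range(num_iterations):                      # 3. gradient-descent loop
        A2, cache = forward_propagation(X, parameters)           # forward pass
        cost = compute_cost(A2, Y, parameters)                    # cross-entropy loss
        grads = backward_propagation(parameters, cache, X, Y)     # gradients
        parameters = update_parameters(parameters, grads, learning_rate)  # update step
    return parameters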
4.1 - Defining the neural network structure
Exercise: Define three variables:
- n_x: the size of the input layer
- n_h: the size of the hidden layer (set this to 4)
- n_y: the size of the output layer
Hint: Use the shapes of X and Y to find n_x and n_y. Also, hard-code the hidden layer size to be 4.
# GRADED FUNCTION: layer_sizes
def layer_sizes(X, Y):
"""
Arguments:
X -- input datat of shape (input size, number of examples)
Y -- labels of shape (output size, number of examples)
Returns:
n_x -- the size of the input layer
n_h -- the size of the hidden layer
n_y -- the size of the output layer
"""
### START CODE HERE ### (≈ 3 lines of code)
n_x = X.shape[0] # size of input layer
n_h = 4
n_y = Y.shape[0] # size of output layer
### END CODE HERE ###
return (n_x, n_h, n_y)
X_assess, Y_assess = layer_sizes_test_case()
(n_x, n_h, n_y) = layer_sizes(X_assess, Y_assess)
print("The size of the input layer is: n_x = " + str(n_x))
print("The size of the hidden layer is: n_h = " + str(n_h))
print("The size of the output layer is: n_y = " + str(n_y))
The size of the input layer is: n_x = 5
The size of the hidden layer is: n_h = 4
The size of the output layer is: n_y = 2
Expected Output (these are not the sizes you will use for your network, they are just used to assess the function you've just coded).
n_x: 5
n_h: 4
n_y: 2
4.2 - Initialize the model’s parameters
Exercise: Implement the function initialize_parameters().
Instructions:
- Make sure your parameters’ sizes are right. Refer to the neural network figure above if needed.
- You will initialize the weight matrices with random values.
- Use: np.random.randn(a,b) * 0.01 to randomly initialize a matrix of shape (a,b).
- You will initialize the bias vectors as zeros.
- Use: np.zeros((a,b)) to initialize a matrix of shape (a,b) with zeros.
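Why random values rather than zeros for the weights? If all weights start equal, every hidden unit computes the same function and receives the same gradient, so they never differentiate: the symmetry is never broken. (Zero biases are fine because the randomness in W already breaks symmetry, and the small 0.01 factor keeps tanh inputs near its high-gradient linear region.) A tiny demonstration of the symmetry problem, illustrative only and not part of the notebook:
X_toy = np.random.randn(2, 5)         # toy input: 2 features, 5 examples
W1_sym = np.full((4, 2), 0.5)         # symmetric initialization: all rows identical
A1 = np.tanh(np.dot(W1_sym, X_toy))   # every hidden unit produces the same activations
print(np.allclose(A1[0], A1[1]))      # True -- the units are indistinguishable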
# GRADED FUNCTION: initialize_parameters
def initialize_parameters(n_x, n_h, n_y):
"""
Argument:
n_x -- size of the input layer
n_h -- size of the hidden layer
n_y -- size of the output layer
Returns:
params -- python dictionary containing your parameters:
W1 -- weight matrix of shape (n_h, n_x)
b1 -- bias vector of shape (n_h, 1)
W2 -- weight matrix of shape (n_y, n_h)
b2 -- bias vector of shape (n_y, 1)
"""
np.random.seed(2) # we set up a seed so that your output matches ours although the initialization is random.
### START CODE HERE ### (≈ 4 lines of code)
W1 = np.random.randn(n_h, n_x) * 0.01
b1 = np.zeros((n_h, 1))
W2 = np.random.randn(n_y, n_h) * 0.01
b2 = np.zeros((n_y, 1))
### END CODE HERE ###
assert (W1.shape == (n_h, n_x))
assert (b1.shape == (n_h, 1))
assert (W2.shape == (n_y, n_h))
assert (b2.shape == (n_y, 1))
parameters = {"W1": W1,
"b1": b1,
"W2": W2,
"b2": b2}
return parameters
n_x, n_h, n_y = initialize_parameters_test_case()
parameters = initialize_parameters(n_x, n_h, n_y)
print("W1 = " + str(parameters["W1"]))
print("b1 = " + str(parameters["b1"]))
print("W2 = " + str(parameters["W2"]))
print("b2 = " + str(parameters["b2"]))
W1 = [[-0.00416758 -0.00056267]
[-0.02136196 0.01640271]
[-0.01793436 -0.00841747]
[ 0.00502881 -0.01245288]]
b1 = [[ 0.]
[ 0.]
[ 0.]
[ 0.]]
W2 = [[-0.01057952 -0.00909008 0.00551454 0.02292208]]
b2 = [[ 0.]]
Expected Output:
W1 = [[-0.00416758 -0.00056267]
 [-0.02136196  0.01640271]
 [-0.01793436 -0.00841747]
 [ 0.00502881 -0.01245288]]
b1 = [[ 0.]
 [ 0.]
 [ 0.]
 [ 0.]]
W2 = [[-0.01057952 -0.00909008  0.00551454  0.02292208]]
b2 = [[ 0.]]