In Planar data classification with one hidden layer and Logistic Regression with a Neural Network mindset, we built shallow neural networks. To reach higher accuracy, in this post we implement a deep neural network, and we will see that the deeper network performs noticeably better.
initialize_parameters_deep
We set the random seed to 1 so that the results are reproducible, initialize each weight matrix with small random values from np.random.randn, and initialize each bias vector with np.zeros.
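The initialization can be sketched as follows; the layer_dims argument and the 0.01 scaling factor are common choices rather than requirements:

```python
import numpy as np

def initialize_parameters_deep(layer_dims):
    """Initialize W and b for an L-layer network.

    layer_dims -- list with the size of each layer, including the input layer.
    """
    np.random.seed(1)  # fix the seed so results are reproducible
    parameters = {}
    L = len(layer_dims)
    for l in range(1, L):
        # small random weights break symmetry; zero biases are fine
        parameters["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters

parameters = initialize_parameters_deep([5, 4, 3])
```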
linear_forward
Let's implement the linear part of forward propagation first: a function linear_forward(A, W, b) that computes Z = WA + b.
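A minimal sketch of linear_forward; caching the inputs for backpropagation follows the usual convention:

```python
import numpy as np

def linear_forward(A, W, b):
    # Z = W A + b, broadcasting b across the m examples (columns of A)
    Z = np.dot(W, A) + b
    cache = (A, W, b)  # kept for backpropagation
    return Z, cache

A = np.ones((3, 2))          # toy input: 3 features, 2 examples
W = np.ones((4, 3)) * 0.5
b = np.zeros((4, 1))
Z, cache = linear_forward(A, W, b)
```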
linear_activation_forward
Then we apply an activation function to the output of the linear step, in linear_activation_forward(A_prev, W, b, activation).
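A sketch assuming sigmoid and relu as the two supported activations (the inline helper functions are illustrative):

```python
import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z)), Z  # return activation and cache of Z

def relu(Z):
    return np.maximum(0, Z), Z

def linear_activation_forward(A_prev, W, b, activation):
    # linear step followed by the chosen non-linearity
    Z = np.dot(W, A_prev) + b
    linear_cache = (A_prev, W, b)
    if activation == "sigmoid":
        A, activation_cache = sigmoid(Z)
    elif activation == "relu":
        A, activation_cache = relu(Z)
    cache = (linear_cache, activation_cache)
    return A, cache

A_prev = np.array([[1.0, -1.0]])
W = np.array([[2.0]])
b = np.array([[0.0]])
A_relu, _ = linear_activation_forward(A_prev, W, b, "relu")
A_sig, _ = linear_activation_forward(A_prev, W, b, "sigmoid")
```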
L_model_forward
After implementing forward propagation for a single layer, we build the forward pass of the full multilayer network on top of it.
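A sketch of the multilayer forward pass, assuming ReLU for the hidden layers and sigmoid for the output layer, which is the usual setup for binary classification (the single-layer helper is repeated in compact form so the snippet runs on its own):

```python
import numpy as np

def linear_activation_forward(A_prev, W, b, activation):
    # one layer: linear step followed by the chosen activation
    Z = np.dot(W, A_prev) + b
    A = np.maximum(0, Z) if activation == "relu" else 1 / (1 + np.exp(-Z))
    return A, ((A_prev, W, b), Z)

def L_model_forward(X, parameters):
    """ReLU for the first L-1 layers, sigmoid for the output layer."""
    caches = []
    A = X
    L = len(parameters) // 2  # two entries (W, b) per layer
    for l in range(1, L):
        A, cache = linear_activation_forward(
            A, parameters["W" + str(l)], parameters["b" + str(l)], "relu")
        caches.append(cache)
    AL, cache = linear_activation_forward(
        A, parameters["W" + str(L)], parameters["b" + str(L)], "sigmoid")
    caches.append(cache)
    return AL, caches

np.random.seed(1)
parameters = {
    "W1": np.random.randn(4, 3) * 0.01, "b1": np.zeros((4, 1)),
    "W2": np.random.randn(1, 4) * 0.01, "b2": np.zeros((1, 1)),
}
AL, caches = L_model_forward(np.random.randn(3, 5), parameters)
```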
compute_cost
After forward propagation, we compute the cost from the output of the forward pass and the labels.
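Assuming the standard cross-entropy cost for binary classification, compute_cost can be sketched as:

```python
import numpy as np

def compute_cost(AL, Y):
    """Cross-entropy cost between predictions AL and labels Y (both 1 x m)."""
    m = Y.shape[1]
    cost = -np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m
    return float(np.squeeze(cost))

AL = np.array([[0.8, 0.9, 0.4]])
Y = np.array([[1, 1, 0]])
cost = compute_cost(AL, Y)
```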
linear_backward
With forward propagation complete, we need to implement backpropagation; this is the step that lets the neural network learn the characteristics of the input data.
First, as with forward propagation, we implement backpropagation for a single linear layer.
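A sketch of linear_backward, using the standard gradients of Z = WA_prev + b averaged over the m examples:

```python
import numpy as np

def linear_backward(dZ, cache):
    """Given dZ, compute the gradients of the linear step Z = W A_prev + b."""
    A_prev, W, b = cache
    m = A_prev.shape[1]
    dW = np.dot(dZ, A_prev.T) / m                 # gradient w.r.t. weights
    db = np.sum(dZ, axis=1, keepdims=True) / m    # gradient w.r.t. biases
    dA_prev = np.dot(W.T, dZ)                     # gradient passed to the previous layer
    return dA_prev, dW, db

A_prev = np.ones((3, 2))
W = np.ones((4, 3))
b = np.zeros((4, 1))
dZ = np.ones((4, 2))
dA_prev, dW, db = linear_backward(dZ, (A_prev, W, b))
```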
linear_activation_backward
Combining this with the derivative of each activation function, we can compute the gradients flowing backward through a layer.
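A sketch assuming ReLU and sigmoid activations; the backward helpers compute dZ from dA and the cached Z:

```python
import numpy as np

def relu_backward(dA, Z):
    # derivative of ReLU: 1 where Z > 0, else 0
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0
    return dZ

def sigmoid_backward(dA, Z):
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)  # derivative of sigmoid is s(1-s)

def linear_activation_backward(dA, cache, activation):
    linear_cache, Z = cache
    dZ = relu_backward(dA, Z) if activation == "relu" else sigmoid_backward(dA, Z)
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]
    dW = np.dot(dZ, A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = np.dot(W.T, dZ)
    return dA_prev, dW, db

Z = np.array([[1.0, -1.0]])
cache = ((np.ones((1, 2)), np.ones((1, 1)), np.zeros((1, 1))), Z)
dA = np.ones((1, 2))
dA_prev, dW, db = linear_activation_backward(dA, cache, "relu")
```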
L_model_backward
After implementing single-layer backpropagation, we build backpropagation for the multilayer network on top of it.
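A sketch of the multilayer backward pass; the starting gradient dAL is the derivative of the cross-entropy cost, and the single-layer helper is repeated in compact form so the snippet runs on its own:

```python
import numpy as np

def linear_activation_backward(dA, cache, activation):
    # gradients of one (linear + activation) layer, as in the previous section
    (A_prev, W, b), Z = cache
    if activation == "relu":
        dZ = dA * (Z > 0)
    else:  # sigmoid
        s = 1 / (1 + np.exp(-Z))
        dZ = dA * s * (1 - s)
    m = A_prev.shape[1]
    return (np.dot(W.T, dZ),
            np.dot(dZ, A_prev.T) / m,
            np.sum(dZ, axis=1, keepdims=True) / m)

def L_model_backward(AL, Y, caches):
    grads = {}
    L = len(caches)
    # derivative of the cross-entropy cost with respect to AL
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))
    dA, grads["dW" + str(L)], grads["db" + str(L)] = \
        linear_activation_backward(dAL, caches[L - 1], "sigmoid")
    for l in reversed(range(1, L)):
        dA, grads["dW" + str(l)], grads["db" + str(l)] = \
            linear_activation_backward(dA, caches[l - 1], "relu")
    return grads

# build the caches of a tiny 2-layer network by hand
np.random.seed(1)
X = np.random.randn(2, 3)
W1, b1 = np.random.randn(3, 2), np.zeros((3, 1))
W2, b2 = np.random.randn(1, 3), np.zeros((1, 1))
Z1 = np.dot(W1, X) + b1
A1 = np.maximum(0, Z1)
Z2 = np.dot(W2, A1) + b2
AL = 1 / (1 + np.exp(-Z2))
Y = np.array([[1.0, 0.0, 1.0]])
caches = [((X, W1, b1), Z1), ((A1, W2, b2), Z2)]
grads = L_model_backward(AL, Y, caches)
```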
update_parameters
After backpropagation is complete, we can update the parameters by gradient descent.
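Updating the parameters is one step of gradient descent; a sketch:

```python
import numpy as np

def update_parameters(parameters, grads, learning_rate):
    """One gradient-descent step: theta = theta - learning_rate * d_theta."""
    L = len(parameters) // 2  # number of layers
    for l in range(1, L + 1):
        parameters["W" + str(l)] -= learning_rate * grads["dW" + str(l)]
        parameters["b" + str(l)] -= learning_rate * grads["db" + str(l)]
    return parameters

parameters = {"W1": np.array([[1.0, 2.0]]), "b1": np.array([[0.5]])}
grads = {"dW1": np.array([[0.1, 0.2]]), "db1": np.array([[0.1]])}
parameters = update_parameters(parameters, grads, learning_rate=0.1)
```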
L_layer_model
After implementing the full structure of the deep neural network, we can use it to classify images.
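Since the image dataset is not included here, the following sketch ties the pieces together in compact form and trains on synthetic data; the layer sizes, learning rate, and iteration count are illustrative:

```python
import numpy as np

def forward(X, parameters):
    # multilayer forward pass: ReLU hidden layers, sigmoid output
    caches, A = [], X
    L = len(parameters) // 2
    for l in range(1, L + 1):
        W, b = parameters["W" + str(l)], parameters["b" + str(l)]
        Z = np.dot(W, A) + b
        A_new = 1 / (1 + np.exp(-Z)) if l == L else np.maximum(0, Z)
        caches.append((A, W, b, Z))
        A = A_new
    return A, caches

def backward(AL, Y, caches):
    grads, L, m = {}, len(caches), Y.shape[1]
    dZ = AL - Y  # sigmoid + cross-entropy simplification for the output layer
    for l in range(L, 0, -1):
        A_prev, W, b, Z = caches[l - 1]
        grads["dW" + str(l)] = np.dot(dZ, A_prev.T) / m
        grads["db" + str(l)] = np.sum(dZ, axis=1, keepdims=True) / m
        if l > 1:
            dZ = np.dot(W.T, dZ) * (caches[l - 2][3] > 0)  # back through ReLU
    return grads

def L_layer_model(X, Y, layer_dims, learning_rate=0.0075, num_iterations=2500):
    np.random.seed(1)
    parameters = {}
    for l in range(1, len(layer_dims)):
        parameters["W" + str(l)] = (np.random.randn(layer_dims[l], layer_dims[l - 1])
                                    / np.sqrt(layer_dims[l - 1]))
        parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))
    costs = []
    for i in range(num_iterations):
        AL, caches = forward(X, parameters)
        cost = -np.mean(Y * np.log(AL + 1e-8) + (1 - Y) * np.log(1 - AL + 1e-8))
        grads = backward(AL, Y, caches)
        for l in range(1, len(layer_dims)):
            parameters["W" + str(l)] -= learning_rate * grads["dW" + str(l)]
            parameters["b" + str(l)] -= learning_rate * grads["db" + str(l)]
        costs.append(cost)
    return parameters, costs

# synthetic stand-in for the image data: label is 1 when the first feature is positive
np.random.seed(0)
X = np.random.randn(4, 200)
Y = (X[0:1, :] > 0).astype(float)
parameters, costs = L_layer_model(X, Y, layer_dims=[4, 5, 3, 1],
                                  learning_rate=0.1, num_iterations=1000)
AL, _ = forward(X, parameters)
accuracy = np.mean((AL > 0.5) == Y)
```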
After running the above code, we can plot the cost against the number of iterations for the chosen learning rate.
Links
The dataset and source code are available on GitHub.