aPaperADay
Kaiming Discussion and Key Takeaways
The “Kaiming” initialization method has proven to be a mathematically well-founded way to initialize the weights of deep neural networks (sketched below).
2019-12-05
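A minimal sketch of the rule, assuming a fully connected layer with `fan_in` inputs feeding a ReLU (the function name and shapes here are illustrative, not from the paper):

```python
import numpy as np

def kaiming_normal(fan_in, fan_out, rng=np.random.default_rng()):
    # Draw weights from a zero-mean Gaussian with variance 2 / fan_in,
    # which keeps activation variance roughly constant through ReLU layers.
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = kaiming_normal(fan_in=512, fan_out=256)
```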
Kaiming PReLU
The Rectified Linear Unit (ReLU) is one of several keys to the recent success of deep networks, and studying it has yielded important results (see the PReLU sketch below).
2019-12-04
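A minimal NumPy sketch of the Parametric ReLU the paper introduces, where `a` is the learned negative-slope coefficient (the helper name is illustrative):

```python
import numpy as np

def prelu(y, a):
    # f(y) = max(0, y) + a * min(0, y): identity for positive inputs,
    # learned slope `a` for negative ones. a = 0 recovers plain ReLU;
    # a small fixed a recovers Leaky ReLU.
    return np.maximum(0.0, y) + a * np.minimum(0.0, y)

print(prelu(np.array([-2.0, 0.0, 1.5]), a=0.25))  # [-0.5  0.   1.5]
```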
Kaiming Implementation Details
The architecture used by He et al. is similar to VGG-19, with a few minor differences.
2019-12-03
Kaiming Comparison with “Xavier” Initialization
The “Xavier” initialization method was an earlier approach designed to keep signal variance stable across deep networks. The “Kaiming” method is contrasted with it here (a brief comparison follows).
2019-12-02
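As a rough illustration of the contrast (a sketch, not code from the post): Xavier scales weight variance by both fan-in and fan-out under a linear-activation assumption, while Kaiming uses a factor of 2 over fan-in to compensate for ReLU zeroing half its inputs:

```python
import numpy as np

fan_in, fan_out = 512, 256

# Xavier/Glorot: Var(w) = 2 / (fan_in + fan_out), derived for
# linear or symmetric saturating activations.
xavier_std = np.sqrt(2.0 / (fan_in + fan_out))

# Kaiming/He: Var(w) = 2 / fan_in; the factor of 2 accounts for
# ReLU discarding the negative half of each pre-activation.
kaiming_std = np.sqrt(2.0 / fan_in)

print(xavier_std, kaiming_std)  # the Kaiming std is larger, which
                                # matters more as depth grows
```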