1. Department of Automation and BNRist, Tsinghua University, Beijing, 100084, China.
Researchers generalized the Rectified Linear Unit (ReLU) activation function to a Generalized Multivariate projection Unit (GeMU). ReLU can be viewed as the Euclidean projection of its input onto the nonnegative half-line, a one-dimensional convex cone; GeMU extends this view by projecting onto higher-dimensional convex cones, enhancing neural network expressiveness and improving performance across a variety of tasks.
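To make the projection view concrete, the minimal sketch below shows ReLU as a projection onto the nonnegative half-line, together with the closed-form Euclidean projection onto a second-order cone as one illustrative multivariate convex cone. The second-order cone is an assumption for illustration only; the abstract does not specify which cones GeMU uses, and the function name `project_second_order_cone` is hypothetical.

```python
import numpy as np


def relu(x):
    # ReLU(x) = max(x, 0) is exactly the Euclidean projection of x
    # onto the nonnegative half-line [0, inf), a 1-D convex cone.
    return np.maximum(x, 0.0)


def project_second_order_cone(x, t):
    """Euclidean projection of the point (x, t) onto the second-order
    cone K = {(x, t) : ||x||_2 <= t}.

    Illustrative only: one example of a multivariate convex cone
    projection; the paper's actual GeMU cone may differ.
    """
    norm_x = np.linalg.norm(x)
    if norm_x <= t:
        # Already inside the cone: the projection is the point itself.
        return x, t
    if norm_x <= -t:
        # Inside the polar (dual negative) cone: project to the apex.
        return np.zeros_like(x), 0.0
    # Otherwise, project onto the boundary of the cone.
    scale = (norm_x + t) / 2.0
    return scale * x / norm_x, scale
```

In one dimension the second case collapses to ReLU's behavior (negative inputs map to zero), which is why a multivariate cone projection is a natural generalization of the activation.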