Author
Listed:
- Xu Chen
(College of Design and Innovation, Tongji University, Shanghai 200092, China
These authors contributed equally to this work.)
- Yinlei Cheng
(School of Artificial Intelligence and Innovative Design, Beijing Institute of Fashion Technology, Beijing 100029, China
These authors contributed equally to this work.)
- Siqin Wang
(International Design Trend Center, Hongik University, Seoul 04068, Republic of Korea)
- Guangliang Sang
(International Design Trend Center, Hongik University, Seoul 04068, Republic of Korea
School of Engineering, Korea National University of Transportation, Chungju 27469, Republic of Korea)
- Ken Nah
(International Design Trend Center, Hongik University, Seoul 04068, Republic of Korea)
- Jianmin Wang
(College of Design and Innovation, Tongji University, Shanghai 200092, China)
Abstract
Activation functions play a crucial role in training stability, convergence speed, and overall performance in both convolutional and attention-based networks. In this study, we introduce two novel activation functions, each combining a sine component with a constraint term. To assess their effectiveness, we replace the activation functions in four representative architectures (VGG16, ResNet50, DenseNet121, and the Vision Transformer), covering a spectrum from lightweight to high-capacity models. We conduct extensive evaluations on four benchmark datasets (CIFAR-10, CIFAR-100, MNIST, and Fashion-MNIST), comparing our methods against seven widely used activation functions. The results consistently show that the proposed functions outperform the baselines across all tested models and datasets. From a design application perspective, the periodic structure of the proposed functions also yields rich and structurally stable activation visualizations, enabling designers to trace model attention, detect surface biases early, and make informed aesthetic or accessibility decisions during interface prototyping.
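The abstract does not state the closed form of the proposed functions, so the PyTorch sketch below is purely illustrative: it assumes a plausible shape f(x) = x + alpha * sin(x), uses a clamp as a stand-in for the constraint term, and the names PeriodicActivation and replace_relu are hypothetical. It shows how such an activation could be dropped into one of the evaluated architectures (VGG16) in the way the abstract describes.

    import torch
    import torch.nn as nn
    from torchvision.models import vgg16

    class PeriodicActivation(nn.Module):
        # Hypothetical sine-based activation: f(x) = x + alpha * sin(x),
        # clamped to [-bound, bound] as a stand-in for the paper's
        # constraint term (the exact formula is not given in this listing).
        def __init__(self, alpha: float = 1.0, bound: float = 10.0):
            super().__init__()
            self.alpha = alpha  # weight of the periodic (sine) component
            self.bound = bound  # range of the assumed constraint term

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            y = x + self.alpha * torch.sin(x)  # identity path plus periodic term
            return torch.clamp(y, -self.bound, self.bound)  # bounded output

    def replace_relu(module: nn.Module) -> None:
        # Recursively swap every nn.ReLU for the sketch above, mirroring
        # the drop-in replacement protocol the abstract describes.
        for name, child in module.named_children():
            if isinstance(child, nn.ReLU):
                setattr(module, name, PeriodicActivation())
            else:
                replace_relu(child)

    model = vgg16(weights=None)  # one of the four evaluated architectures
    replace_relu(model)

The same recursive swap applies unchanged to ResNet50 and DenseNet121; attention-based models such as the Vision Transformer use GELU rather than ReLU, so the isinstance check would need to match nn.GELU there.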
Suggested Citation
Xu Chen & Yinlei Cheng & Siqin Wang & Guangliang Sang & Ken Nah & Jianmin Wang, 2025.
"A Periodic Mapping Activation Function: Mathematical Properties and Application in Convolutional Neural Networks,"
Mathematics, MDPI, vol. 13(17), pages 1-22, September.
Handle:
RePEc:gam:jmathe:v:13:y:2025:i:17:p:2843-:d:1741541
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:13:y:2025:i:17:p:2843-:d:1741541. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.