
Event Details

Universal Activation Function for Machine Learning

Presenter: Brosnan Yuen
Supervisor:

Date: Thu, August 1, 2024
Time: 13:00
Place: Zoom (Meeting ID: 850 8650 1062, Password: 284068)

ABSTRACT

This research proposes a universal activation function (UAF) that achieves near-optimal performance in quantification, classification, and reinforcement learning (RL) problems. For any given problem, gradient descent algorithms can evolve the UAF into a suitable activation function by tuning the UAF's parameters. For CIFAR-10 classification with the VGG-8 neural network, the UAF converges to a Mish-like activation function with near-optimal performance (F1 = 0.902 ± 0.004) compared to other activation functions. In a graph convolutional neural network on the CORA dataset, the UAF evolves to the identity function and obtains F1 = 0.835 ± 0.008. For the quantification of simulated 9-gas mixtures in 30 dB signal-to-noise ratio (SNR) environments, the UAF converges to the identity function, achieving a near-optimal root-mean-square error (RMSE) of 0.489 ± 0.003 μM. In ZINC molecular solubility quantification with graph neural networks, the UAF morphs into a LeakyReLU/Sigmoid hybrid and achieves an RMSE of 0.47 ± 0.04. For the BipedalWalker-v2 RL environment, the UAF reaches the 250 reward in 961 ± 193 epochs with a brand-new activation function, giving the fastest convergence rate among the activation functions compared.
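
The abstract does not reproduce the UAF's functional form, but its core idea, an activation whose shape is itself learned by gradient descent alongside the network weights, can be sketched compactly. The PyTorch snippet below is a minimal illustration only: the five scalar parameters (A, B, C, D, E), their initial values, and the softplus-based form are assumptions made for demonstration, not details taken from this announcement.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ParametricActivation(nn.Module):
    """Sketch of a trainable parametric activation in the spirit of the UAF.

    Five scalar parameters are optimized by gradient descent together with
    the rest of the network, letting the activation's shape drift toward
    ReLU-, sigmoid-, Mish-, or identity-like curves depending on the task.
    """

    def __init__(self):
        super().__init__()
        # Initial values are arbitrary placeholders for this sketch.
        self.A = nn.Parameter(torch.tensor(1.0))
        self.B = nn.Parameter(torch.tensor(0.0))
        self.C = nn.Parameter(torch.tensor(0.0))
        self.D = nn.Parameter(torch.tensor(-1.0))
        self.E = nn.Parameter(torch.tensor(0.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # softplus(z) = ln(1 + e^z); a difference of two softplus terms
        # plus an offset gives a smooth, highly flexible family of shapes.
        return (F.softplus(self.A * (x + self.B) + self.C * x.pow(2))
                - F.softplus(self.D * (x - self.B))
                + self.E)


if __name__ == "__main__":
    act = ParametricActivation()
    x = torch.linspace(-3.0, 3.0, 7)
    print(act(x))  # parameters update via backprop when used inside a model
```

Because the parameters are registered as nn.Parameter objects, any optimizer that updates the network weights also reshapes the activation, which is how the "evolving" behavior described in the abstract (toward Mish-like, identity, or LeakyReLU/Sigmoid-hybrid forms) would arise during training.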