Activation functions and Iverson brackets

Neural network activation functions transform the output of one layer of the neural net into the input for another layer. These functions must be nonlinear because the universal approximation theorem, the theorem that roughly says a two-layer neural net can approximate any function, requires a nonlinear activation function.

Heaviside function plot

Activation functions often have two-part definitions, defined one way for negative inputs and another way for positive inputs, and so they’re ideal for Iverson notation. For example, the Heaviside function plotted above is defined to be

f(x) = \left\{ \begin{array}{ll} 1 & \mbox{if } x > 0 \\ 0 & \mbox{if } x \leq 0 \end{array} \right.

Kenneth Iverson’s bracket notation, first developed for the APL programming language but adopted more widely, uses brackets around a Boolean expression to indicate the function that is 1 when the expression is true and 0 otherwise. With this notation, the Heaviside function can be written simply as

f(x) = [x > 0]

Iverson notation is fairly common, but not quite so common that I feel I can use it without explanation. I find it very handy and would like to popularize it. The rest of the post gives more examples.
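
To make the bracket concrete, here is a minimal Python sketch of my own (not from the original post). Python already treats True as 1 and False as 0, so converting a comparison to an int behaves exactly like an Iverson bracket.

def heaviside(x):
    # Iverson bracket [x > 0]: the comparison yields True or False,
    # which int() maps to 1 or 0.
    return int(x > 0)

print(heaviside(2.5))   # 1
print(heaviside(-1.0))  # 0
print(heaviside(0.0))   # 0, since the definition uses a strict inequality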

ReLU

The popular ReLU (rectified linear unit) function is defined as

f(x) = \left\{ \begin{array}{ll} x & \mbox{if } x > 0 \\ 0 & \mbox{if } x \leq 0 \end{array} \right.

and with Iverson bracket notation as

f(x) = x[x > 0]

The ReLU activation function is the identity function multiplied by the Heaviside function. It’s not the best example of the value of bracket notation since it could be written simply as max(0, x). The next example is better.
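
As an aside, the bracket expression carries over directly to vectorized code: in NumPy, multiplying by a boolean comparison supplies the 0/1 factor elementwise. A small sketch of my own, assuming NumPy (not code from the original post):

import numpy as np

def relu(x):
    # x * [x > 0]: the boolean array (x > 0) is promoted to 0/1
    # when multiplied by x, mirroring the Iverson bracket.
    return x * (x > 0)

print(relu(np.array([-2.0, 0.0, 3.0])))  # nonpositive entries map to 0, 3.0 passes through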

ELU

ELU plot

The ELU (exponential linear unit) is a variation on the ReLU that, unlike the ReLU, is differentiable at 0.

f(x) = \left\{ \begin{array}{ll} x & \mbox{if } x > 0 \\ e^x - 1 & \mbox{if } x \leq 0 \end{array} \right.

The ELU can be described succinctly in bracket notation.

f(x) = (e^x - 1)[x \leq 0] + x[x > 0]
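
The bracket form reads off as code almost verbatim. Another NumPy sketch of my own (not the author's code); expm1 computes e^x - 1 more accurately than exp(x) - 1 for small x.

import numpy as np

def elu(x):
    # Direct transcription of (e^x - 1)[x <= 0] + x[x > 0].
    # Note: expm1 is still evaluated at large positive x before the
    # bracket zeroes it out, so a production version would mask first.
    return np.expm1(x) * (x <= 0) + x * (x > 0)

print(elu(np.array([-1.0, 0.0, 2.0])))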

PReLU

PReLU (parametric rectified linear unit) plot

The PReLU (parametric rectified linear unit) depends on a small positive parameter a. This parameter must not equal 1, because then the function would be linear and the universal approximation theorem would not apply.

f(x) = \left\{ \begin{array}{ll} x & \mbox{if } x > 0 \\ ax & \mbox{if } x \leq 0 \end{array} \right.

In Iverson notation:

f(x) = ax[x \leq 0] + x[x > 0]
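
The same translation works for the PReLU, with the slope passed in as a parameter. A final NumPy sketch of my own; the value a = 0.1 below is just an arbitrary example.

import numpy as np

def prelu(x, a=0.1):
    # ax[x <= 0] + x[x > 0], with a an arbitrary small positive slope.
    return a * x * (x <= 0) + x * (x > 0)

print(prelu(np.array([-3.0, 2.0])))  # slope 0.1 on the left, identity on the right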

Related posts

  • Smoothed step function
  • How to compute softmax
  • Floor, ceiling, bracket

The post Activation functions and Iverson brackets first appeared on John D. Cook.
