Daily Dose of Data Science

A Highly Overlooked Point In The Implementation of Sigmoid Function

Some subtle stuff that is often ignored.

Avi Chawla
May 22, 2023

There are two variations of the sigmoid function:

  • Standard: σ(x) = 1 / (1 + e^(-x)), with the exponential term e^(-x) in the denominator only.

  • Rearranged: σ(x) = e^x / (1 + e^x), with the exponential term e^x in both the numerator and denominator.

The standard sigmoid function can be computed safely for positive inputs. For large negative inputs, however, it raises overflow errors.

This is because, for large negative inputs, e^(-x) grows exponentially and quickly exceeds the largest value a float can represent.
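
To make the failure concrete, here is a minimal NumPy sketch (the post's original code isn't reproduced here; this is just an illustration of the standard form breaking down):

```python
import numpy as np

def naive_sigmoid(x):
    # Standard form only: fine for positive x, but e^(-x) overflows
    # float64 (max ~1.8e308) once -x exceeds roughly 709.
    return 1.0 / (1.0 + np.exp(-x))

naive_sigmoid(np.array([-1000.0]))
# RuntimeWarning: overflow encountered in exp
```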

To avoid this, use both variations of the sigmoid, choosing between them based on the sign of the input:

  • Standard variation for positive inputs. Here, e^(-x) ≤ 1, so the overflow it suffers for large negative inputs never occurs.

  • Rearranged variation for negative inputs. Here, e^x ≤ 1, so the overflow it would suffer for large positive inputs never occurs.

A plot of e^x and e^(-x)

This way, you can maintain numerical stability by preventing overflow errors in your ML pipeline.
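
A minimal scalar sketch of this trick (not the exact implementation shared in the original post, and intentionally left unvectorized):

```python
import math

def stable_sigmoid(x: float) -> float:
    """Numerically stable sigmoid for a single scalar input."""
    if x >= 0:
        # Standard form: e^(-x) <= 1 here, so it cannot overflow.
        return 1.0 / (1.0 + math.exp(-x))
    # Rearranged form: e^x <= 1 here, so it cannot overflow.
    exp_x = math.exp(x)
    return exp_x / (1.0 + exp_x)

print(stable_sigmoid(1000.0))   # 1.0, no overflow
print(stable_sigmoid(-1000.0))  # 0.0, no overflow
```

Both branches compute the same mathematical function; the only difference is which exponential gets evaluated, and in each branch that exponential is bounded by 1.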

Having said that, if you are using an existing framework like PyTorch, you luckily don’t need to worry about this.

These implementations offer numerical stability by default. However, if you have a custom implementation, do give it a thought.

Over to you:

  • The sigmoid’s two-variation implementation I have shared above isn’t vectorized. How would you vectorize it?

  • What are some other ways in which numerical instability may arise in an ML pipeline? How would you handle them?

    Thanks for reading Daily Dose of Data Science! Subscribe for free to learn something new and insightful about Python and Data Science every day. Also, get a Free Data Science PDF (250+ pages) with 200+ tips.


👉 Read what others are saying about this post on LinkedIn and Twitter.

👉 Tell the world what makes this newsletter special for you by leaving a review here :)


👉 If you liked this post, don’t forget to leave a like ❤️. It helps more people discover this newsletter on Substack and tells me that you appreciate reading these daily insights. The button is located towards the bottom of this email.

👉 If you love reading this newsletter, feel free to share it with friends!



Find the code for my tips here: GitHub.

I like to explore, experiment and write about data science concepts and tools. You can read my articles on Medium. Also, you can connect with me on LinkedIn and Twitter.
