Resolving Module ‘keras.optimizers’ has no attribute ‘adam’

Encountering the “Module ‘keras.optimizers’ has no attribute ‘adam’” error when working with neural networks and deep learning in Python using the Keras library is not uncommon. This error can be perplexing, especially for those new to Keras.

In this article, we’ll delve into the potential causes of this error and the ways to resolve it.

What is the ‘Module ‘keras.optimizers’ has no attribute ‘adam’’ Error?

The error statement “Module ‘keras.optimizers’ has no attribute ‘adam'” is a common issue in Python programming, specifically when working with the Keras library for deep learning.

This error is triggered when attempting to access the Adam optimizer within Keras, but no attribute with that name exists in the ‘keras.optimizers’ module.

In essence, the system cannot locate the specified optimizer, leading to this particular error message.

Let’s break down the error message: AttributeError: Module ‘keras.optimizers’ has no attribute ‘adam’:

  • AttributeError: This built-in Python exception is raised when an object does not have a specific attribute or method. In this case, the ‘adam’ attribute is not found in the keras.optimizers module.
  • Module ‘keras.optimizers’: This part of the error message indicates that the issue lies within the keras.optimizers module, which is where Keras stores its various optimizer classes.
  • No attribute ‘adam’: This is the specific problem. The error states that it cannot find an attribute named ‘adam’ within the keras.optimizers module. This suggests that there might be a typo, a version mismatch, or some other issue causing this problem.
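The same kind of AttributeError can be reproduced with any module, which makes the mechanics easy to see even without Keras installed. Here math.pi is deliberately misspelled as math.Pi:

```python
# AttributeError is Python's generic "attribute not found" exception.
# Attribute lookups are case-sensitive, so math.Pi fails even though
# math.pi exists -- the same mechanics behind 'adam' vs 'Adam'.
import math

try:
    math.Pi
except AttributeError as e:
    print(e)  # e.g. "module 'math' has no attribute 'Pi'"
```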

Potential Causes of the Module ‘keras.optimizers’ has no attribute ‘adam’

Some common causes of the ‘Module ‘keras.optimizers’ has no attribute ‘adam’’ error are as follows:

Typographical Error

The most common reason for this error is a typographical mistake. You might have accidentally misspelled ‘adam’ when trying to specify the optimizer. Python is case-sensitive, so ‘adam’ and ‘Adam’ are treated as different attributes, which can raise the error.

Keras Version Mismatch

The availability of certain optimizers can vary depending on the version of Keras you are using. If you are using an older version of Keras, the ‘adam’ optimizer might not be available, and you may need to update to a newer version.

Custom Optimizer

Another possibility is that you are using a custom optimizer or have defined your own optimizer class within your code. If that optimizer is mistakenly named ‘adam’, it can shadow the built-in class and lead to this error. Double-check your code for any custom optimizer definitions.

Resolving the Module ‘keras.optimizers’ has no attribute ‘adam’

Now that we understand the potential causes, let’s explore the steps to resolve the “Module ‘keras.optimizers’ has no attribute ‘adam'” error.

Check for Typos

Carefully inspect your code to ensure you have spelled ‘adam’ correctly when specifying the optimizer. Python is case-sensitive, so ‘adam’ and ‘Adam’ are considered different attributes.

# Incorrect -- lowercase 'adam' is not an attribute of keras.optimizers
optimizer = keras.optimizers.adam(learning_rate=0.001)

# Correct -- the optimizer class is named 'Adam'
optimizer = keras.optimizers.Adam(learning_rate=0.001)

(Note that recent Keras versions also prefer the learning_rate argument over the deprecated lr alias.)

Verify Keras Version

If you suspect a version mismatch, make sure you are using a Keras version that provides the Adam optimizer. You can update Keras using pip by running:

pip install --upgrade keras
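After upgrading, it can be worth confirming from Python which version was actually picked up. A small sketch, guarded so it also runs cleanly where Keras is absent:

```python
# Confirm the installed Keras version; recent releases expose the
# optimizer as the class Adam (capitalized), not a lowercase 'adam'.
import importlib.util

if importlib.util.find_spec("keras") is not None:
    import keras
    print("Keras version:", keras.__version__)
else:
    print("Keras is not installed in this environment")
```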

Look for Custom Optimizer Definitions

If you have defined a custom optimizer within your code, ensure it doesn’t inadvertently use the name ‘adam’. This could override the built-in optimizer and lead to the error.

Here’s an example of a custom optimizer named ‘adam’ causing the error:

class adam:  # reusing the built-in optimizer's name shadows it
    def __init__(self, learning_rate):
        self.learning_rate = learning_rate

    def get_updates(self, loss, params, grads):
        # Plain gradient-descent step, for illustration only
        updates = []
        for p, g in zip(params, grads):
            new_p = p - self.learning_rate * g
            updates.append((p, new_p))
        return updates

Use an Alternate Optimizer

If the usual methods do not resolve the issue, consider temporarily using an alternate optimizer such as ‘SGD’ or ‘RMSprop’ to proceed with your work. These are reliable optimizers that can serve as substitutes.

optimizer = keras.optimizers.SGD(learning_rate=0.001)
# or
optimizer = keras.optimizers.RMSprop(learning_rate=0.001)
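Putting it together, here is a minimal compile sketch with SGD as the stand-in optimizer. The model and layer sizes are arbitrary placeholders, and the snippet is guarded so it skips cleanly where TensorFlow/Keras is unavailable:

```python
# A hedged sketch: compile a tiny model with SGD while the Adam
# attribute issue is being investigated. Layer sizes are arbitrary.
import importlib.util

if importlib.util.find_spec("tensorflow") is not None:
    from tensorflow import keras

    model = keras.Sequential([
        keras.Input(shape=(4,)),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.001),
                  loss="mse")
    print("compiled with", type(model.optimizer).__name__)
else:
    print("TensorFlow is not installed; skipping the demo")
```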

Following these steps and using the provided code examples, you can effectively troubleshoot and resolve the ‘Module ‘keras.optimizers’ has no attribute ‘adam’’ error.


What should I do if the error persists even after checking for typos and updating Keras?

If the error persists, it may be worth examining the environment in which you are running your code. Ensure that there are no conflicting installations or custom configurations that might be interfering with the Keras library. Additionally, consulting relevant forums or communities for assistance with specific environments or setups can be beneficial.
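One quick environment check is to ask Python where the keras it imports actually lives; a stale or shadowed copy on the import path (for example, a local file or folder named keras) can keep an old error alive:

```python
# Locate the Keras installation Python actually resolves; a shadowing
# file or directory named 'keras' can mask the real library.
import importlib.util

spec = importlib.util.find_spec("keras")
if spec is None:
    print("keras is not importable in this environment")
else:
    print("keras resolves to:", spec.origin)
```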

Can I use a different optimizer if I encounter the “Module ‘keras.optimizers’ has no attribute ‘adam'” error?

You can use an alternative optimizer like Stochastic Gradient Descent (SGD) or RMSprop as a temporary workaround. For example, you can use keras.optimizers.SGD() or keras.optimizers.RMSprop() with the desired learning rate. This allows you to continue your work while investigating and resolving the ‘adam’ attribute error.


The ‘Module ‘keras.optimizers’ has no attribute ‘adam’’ error is a common stumbling block for those working with neural networks in Keras. By understanding the nature of the error and considering the potential causes outlined in this article, you can effectively troubleshoot and resolve this issue, allowing you to continue building and training your deep learning models confidently.

Remember to double-check your code for typos, verify your Keras version, and inspect for any custom optimizer definitions that might be causing the problem. With these steps, you’ll be well-equipped to tackle this error and continue your deep-learning endeavors.
