How to solve the MySQL "max_user_connections" error

This error occurs when you hit the limit set in the MySQL configuration by the system variable max_user_connections.

Solutions

Killing queries that are stuck on the backend is a solution I would suggest only for SELECT queries. Queries that change data (UPDATE/DELETE/INSERT) should not be killed.

Second, you can run the command mysqladmin processlist to check what is going on inside MySQL.
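As a sketch, both steps can also be done from the mysql client; the thread ID below is made up and should come from your own processlist output:

```sql
-- See every connection and what it is currently running
SHOW FULL PROCESSLIST;

-- Kill a stuck query by the Id column from the output above.
-- Per the advice above, only do this for SELECTs, never for
-- statements that change data (UPDATE/DELETE/INSERT).
KILL 12345;
```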

If locking is causing your problem, check which storage engine you are using and consider changing it to another one. IBM's SolidDB documentation on table locks might help you. There may be another reason, though (for example, your queries are taking too long because they are unoptimized, the table is too big, or your database has been spammed).


Your best bet is to increase max_connections. For a MySQL instance serving several different web apps (raw PHP, WordPress, phpBB), you probably want a value of at least 60.

Issue this command and you'll find out how many global connections you have available:

show global variables like '%connections%';

You can find out how many connections are in use at any given moment like this:

show status like '%connected%';

You can find out what each connection is doing like this:

show full processlist;

I would aim for a global value of at least 100 connections. If you don't have access to change this, your service provider should be able to help. It needs to be set in MySQL's my.cnf configuration file. Don't set it too high, or you run the risk of the MySQL server process gobbling up all your RAM.

A second approach is to allocate those overall connections among your different MySQL users. If you use a separate MySQL username for each of your web apps, this approach will work for you. It is written up here: https://www.percona.com/blog/2014/07/29/prevent-mysql-downtime-set-max_user_connections/
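For illustration, per-user caps can be set either globally or per account; the 'wordpress' account name here is an assumption, so substitute your own:

```sql
-- Default per-user cap for accounts that have no limit of their own
-- (0, the default, means "no per-user limit")
SET GLOBAL max_user_connections = 30;

-- Per-account cap (MySQL 5.7+); older versions use
-- GRANT USAGE ON *.* TO ... WITH MAX_USER_CONNECTIONS n;
ALTER USER 'wordpress'@'localhost' WITH MAX_USER_CONNECTIONS 20;
```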

The final approach to controlling this problem is more subtle. You're probably using the Apache web server as the underlying tech. You can reduce the number of Apache tasks running at the same time to, paradoxically, increase throughput. That's because Apache queues up requests. If it has a few tasks efficiently working through the queue, that is often faster than many tasks, because there's less contention. It also requires fewer MySQL connections, which will solve your immediate problem. That's explained here: Restart Mysql automatically when ubuntu on EC2 micro instance kills it when running out of memory

By the way, web apps like WordPress may hold connections persistently: they establish connections to the MySQL database, hold them open, and reuse them. If your apps are busy, each connection's lifetime can be several minutes.


Read the max_connections documentation to solve your problem

If clients encounter "Too many connections" errors when attempting to connect to the mysqld server, all available connections are in use by other clients.

The permitted number of connections is controlled by the max_connections system variable. The default value is 151 to improve performance when MySQL is used with the Apache Web server. To support more connections, set max_connections to a larger value.

First: check your database's current max_connections variable

SHOW VARIABLES LIKE 'max_connections';
+-----------------+-------+
| Variable_name   | Value |
+-----------------+-------+
| max_connections | 151   |
+-----------------+-------+

Then try to increase the max_connections parameter, either by running a command like:

 SET GLOBAL max_connections = 300;

Or set this parameter in my.cnf, which is usually located at /etc/my.cnf:

vi /etc/my.cnf

[mysqld]
max_connections = 300

Finally: Restart MySQL service

FYI: you can also check max_user_connections. The two are related like this:

max_connections sets the total connection limit
max_user_connections sets the limit per user
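You can inspect both limits the same way as above:

```sql
SHOW VARIABLES LIKE 'max_connections';       -- total limit
SHOW VARIABLES LIKE 'max_user_connections';  -- per-user limit (0 = unlimited)
```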

====

As Sushilzzz asked: can this be caused by low RAM?
Short answer: no.
Long answer: yes. If RAM is low and MySQL can't respond as fast as needed, connections stay open longer, pile up, and you can easily hit the max connection limit.
The estimated number of max connections per 1 GB of RAM is 100 (if no other process is using RAM at the same time). I usually use ~75 for max_connections per 1 GB of RAM:

RAM      max_connection
1GB      70      
2GB      150      
4GB      300      
8GB      500  

First off, this is a hack, but it works, especially on a shared host.
We all have bad "neighbors" sometimes, right?

If you have access to /etc/, increase the limit (say, from 30 to 50) in your my.cnf or through the information schema.

  1. To suppress the error message the visitor might see, use @mysql_connect().
  2. If all max_user_connections are in use (more than 30 in this case), use an "or die()" statement to stop the query.
  3. Replace the "or die" message with die(header(location: THIS PAGE)) so the page retries, and be sure to call mysql_close();
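Putting the three steps together, a minimal sketch might look like the following. It assumes the legacy mysql_* API that the steps above refer to (removed in PHP 7; with mysqli the idea is the same), and the credentials and redirect target are placeholders:

```php
<?php
// Step 1: @ suppresses the "max_user_connections" warning the visitor would see
$link = @mysql_connect('localhost', 'db_user', 'db_pass');

// Steps 2 and 3: if no connection slot was free, send the visitor back to
// this same page so it retries instead of showing a raw error
if (!$link) {
    die(header('Location: ' . $_SERVER['REQUEST_URI']));
}

// ... run your queries here ...

mysql_close($link); // free the connection slot as soon as possible
?>
```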

Yes, it will cause a delay in page loading. But better a slow load than a white screen of death, or worse, error messages that visitors have no way to understand.
