A very common practice is to use an auto-incrementing (IDENTITY) column of type integer for the primary key on a table. This works great in most cases, but there is a limitation to keep in mind: the largest value an integer column can hold is 2,147,483,647 (about 2.1 billion). So, if your database will contain the names of everyone living in Wyoming and Montana, you'll be fine. However, if it will contain the names of everyone living in China and India, you will definitely have an overflow issue.
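Here is a quick illustration of what that overflow looks like in practice. This is a sketch using a hypothetical table (dbo.Person is a placeholder name); seeding the identity just below the maximum lets you see the failure without inserting billions of rows:

```sql
-- Hypothetical table seeded two values below the INT maximum.
CREATE TABLE dbo.Person (
    PersonId INT IDENTITY(2147483646, 1) PRIMARY KEY,
    FullName NVARCHAR(100) NOT NULL
);

INSERT INTO dbo.Person (FullName) VALUES (N'Ada');   -- gets 2,147,483,646
INSERT INTO dbo.Person (FullName) VALUES (N'Grace'); -- gets 2,147,483,647, the maximum

-- The next insert fails:
-- Msg 8115: Arithmetic overflow error converting IDENTITY to data type int.
INSERT INTO dbo.Person (FullName) VALUES (N'Alan');
```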
So, how can you tell what your risk level is? You could go through each table and select the MAX of the ID column, but that would take too long unless your database is very small. Yes, I started out doing it that way myself, but quickly realized it was not going to be much fun. So, I came up with a query that returns the table name, the last value used in the identity column, the data type of the column, and the percentage of valid integers that have been used (assuming the INT data type).
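For reference, the manual approach looks like this, repeated once per table (dbo.Person and PersonId are placeholder names):

```sql
-- Check one table by hand; repeat for every identity column in the database.
SELECT MAX(PersonId) AS [Last Value] FROM dbo.Person;

-- Or ask for the identity's current value directly:
SELECT IDENT_CURRENT('dbo.Person') AS [Last Value];
```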
Running this on my database, I was able to see that we do have one table whose identity column has reached 50 million. That is only about 2.3% of the way to the largest integer, so we have some time to deal with it. Still, it is nice to know that we are safe for a while.
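When a table does get close to the limit, the usual fix is to widen the column to BIGINT. A minimal sketch, again using the hypothetical dbo.Person table:

```sql
-- Widen the identity column to BIGINT (maximum 9,223,372,036,854,775,807).
-- Caveats: if the column is the primary key, the PK constraint (and any
-- foreign keys referencing it) must be dropped and re-created around this,
-- and on a large table the rewrite can take significant time and log space.
ALTER TABLE dbo.Person ALTER COLUMN PersonId BIGINT NOT NULL;
```

Test this on a restored copy of the database first; the outage window depends heavily on table size and indexing.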
SELECT t.name AS [Table Name],
       ic.last_value AS [Last Value],
       CASE ic.max_length
           WHEN 1 THEN 'TINYINT'
           WHEN 2 THEN 'SMALLINT'
           WHEN 4 THEN 'INT'
           WHEN 8 THEN 'BIGINT'
           ELSE 'UNKNOWN'
       END AS [Data Type],
       -- Cast to BIGINT, not INT, so BIGINT identity columns that have
       -- already passed 2,147,483,647 don't make the query itself overflow.
       CAST(CAST(ic.last_value AS BIGINT) / 2147483647.0 * 100.0
            AS DECIMAL(5,2)) AS [Percentage of IDs Used]
FROM sys.identity_columns AS ic
INNER JOIN sys.tables AS t
    ON ic.object_id = t.object_id
ORDER BY ic.last_value DESC;
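One caveat: the percentage above is measured against the INT maximum for every row, so TINYINT, SMALLINT, and BIGINT identities are reported against the wrong ceiling. A variant of the same query, sketched here, divides by each type's own maximum instead:

```sql
-- Percentage used, measured against each column's own data type maximum.
SELECT t.name AS [Table Name],
       ic.last_value AS [Last Value],
       CAST(CAST(ic.last_value AS BIGINT) * 100.0 /
            CASE ic.max_length
                WHEN 1 THEN 255.0                   -- TINYINT max
                WHEN 2 THEN 32767.0                 -- SMALLINT max
                WHEN 4 THEN 2147483647.0            -- INT max
                WHEN 8 THEN 9223372036854775807.0   -- BIGINT max
            END AS DECIMAL(5,2)) AS [Percentage of IDs Used]
FROM sys.identity_columns AS ic
INNER JOIN sys.tables AS t
    ON ic.object_id = t.object_id
ORDER BY ic.last_value DESC;
```

Note that last_value is NULL for identity columns that have never been used, so those tables sort to the bottom with a NULL percentage.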
For more tips on managing a SQL Server database, see Managing a SQL Server Database: Tips for Beginners.