What is causing the arithmetic overflow in the query below?
I am struggling to find the reason for the arithmetic overflow. Why is it happening?
Most likely the metadata is returning some unexpected values that your code cannot handle. For example:
-- Example values returned from sysfiles and FILEPROPERTY
DECLARE
    @size integer = 1,
    @spaceused integer = 10000;

-- The essence of the code in the question
SELECT
    CAST
    (
        100 *
        CAST
        (
            (@size/128.0 - @spaceused/128.0) / (@size/128.0)
            AS decimal(5,2)
        )
        AS varchar(8)
    ) + '' AS FreeSpacePct;
...returns the error mentioned in the question, because the computed (negative!) value will not fit in decimal(5,2). Here the inner expression evaluates to -9999, well outside the ±999.99 range that decimal(5,2) can hold.
There are reasons why size might be reported as much lower than space used, including tempdb file growths, filestream files, bugs in older versions of SQL Server...too many to list. You could/should code defensively against this possibility (and also for offline/defunct files...and so on).
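As an illustration, one defensive pattern might look like this. This is a sketch only, not the asker's code; the formatting choices are assumptions, and it simply returns NULL when the metadata looks wrong:

```sql
-- Defensive sketch: only compute the percentage when size is positive
-- and space used does not exceed size; otherwise return NULL.
SELECT
    [FileName] = DF.name,
    FreeSpacePct =
        CASE
            WHEN DF.size > 0 AND FP.SpaceUsed <= DF.size
                THEN STR(100e0 * (DF.size - FP.SpaceUsed) / DF.size, 7, 2)
            ELSE NULL  -- offline/defunct files, or SpaceUsed > size
        END
FROM sys.database_files AS DF
CROSS APPLY (SELECT FILEPROPERTY(DF.name, 'SpaceUsed')) AS FP (SpaceUsed);
```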
The question is tagged SQL Server 2014, so there's no need to use the deprecated sys.sysfiles view (which exists only for backward compatibility with SQL Server 2000).
I might write this query as:
SELECT
DatabaseName = DB_NAME(),
[FileName] = DF.name,
FileType = DF.type_desc,
SizeMB = STR(DF.size * Factor.PagesToMB, 10, 2),
SpaceUsedMB = STR(FP.SpaceUsed * Factor.PagesToMB, 10, 2),
FreeSpaceMB = STR(FS.FreeSpace * Factor.PagesToMB, 10, 2),
FreeSpacePct = STR(Factor.ToPct * FS.FreeSpace / DF.size, 7, 4)
FROM sys.database_files AS DF
CROSS APPLY (SELECT FILEPROPERTY(DF.name, 'SpaceUsed')) AS FP (SpaceUsed)
CROSS APPLY (SELECT DF.size - FP.SpaceUsed) AS FS (FreeSpace)
CROSS JOIN (SELECT 8e0 / 1024e0, 1e2) AS Factor (PagesToMB, ToPct);
Main advantages:
- It separates out the calculation steps
- It uses float arithmetic to avoid overflows
- STR formats the result and does not raise an error on overflow, so the error in the question is never thrown
A dynamic SQL version (to collect information for all databases):
DECLARE @SQL nvarchar(2000);
SET @SQL = N'
USE ?;
SELECT
DatabaseName = DB_NAME(),
[FileName] = DF.name,
FileType = DF.type_desc,
SizeMB = STR(DF.size * Factor.PagesToMB, 10, 2),
SpaceUsedMB = STR(FP.SpaceUsed * Factor.PagesToMB, 10, 2),
FreeSpaceMB = STR(FS.FreeSpace * Factor.PagesToMB, 10, 2),
FreeSpacePct = STR(Factor.ToPct * FS.FreeSpace / DF.size, 7, 4)
FROM sys.database_files AS DF
CROSS APPLY (SELECT FILEPROPERTY(DF.name, ''SpaceUsed'')) AS FP (SpaceUsed)
CROSS APPLY (SELECT DF.size - FP.SpaceUsed) AS FS (FreeSpace)
CROSS JOIN (SELECT 8e0 / 1024e0, 1e2) AS Factor (PagesToMB, ToPct);
';
DECLARE @Results AS table
(
DatabaseName sysname NOT NULL,
[FileName] sysname NOT NULL,
FileType nvarchar(60) NOT NULL,
SizeMB char(10) NULL,
SpaceUsedMB char(10) NULL,
FreeSpaceMB char(10) NULL,
FreeSpacePct char(7) NULL
);
INSERT @Results
EXECUTE sys.sp_MSforeachdb
@command1 = @SQL;
SELECT R.*
FROM @Results AS R
ORDER BY R.DatabaseName; -- Or whatever
The usual caveats about using the undocumented sp_MSforeachdb procedure apply.
Maybe you have forgotten that a file can have 100% free space?
In that case you need DECIMAL(5,2), not DECIMAL(4,2), because 100.00 requires five digits of precision.
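You can see the difference directly: decimal(4,2) tops out at 99.99, so converting 100 to it overflows, while decimal(5,2) holds it comfortably.

```sql
-- decimal(4,2) holds at most 99.99, so this raises an arithmetic overflow error:
SELECT CAST(100 AS decimal(4,2));

-- decimal(5,2) has room for the extra digit:
SELECT CAST(100 AS decimal(5,2));  -- 100.00
```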
Run the query in SSMS and look at the actual execution plan. Highlight the Compute Scalar operator and open its properties window (F4), then look at the Defined Values property. On my system I got this:
[Expr1008] = Scalar Operator(db_name()), [Expr1009] =
Scalar Operator(CONVERT(int,[Expr1013]/(128.0)CONVERT_IMPLICIT(numeric(10,0),
fileproperty([ReadReceipt].[sys].[sysprufiles].[lname],'SpaceUsed'),0)
/(128.0),0)), [Expr1010] = Scalar Operator(CONVERT(varchar(8),(100.)
*CONVERT(decimal(5,2),([Expr1013]/(128.0)-CONVERT_IMPLICIT(numeric(10,0),
fileproperty([ReadReceipt].[sys].[sysprufiles].[lname],'SpaceUsed'),0)
/(128.0))/([Expr1013]/(128.0)),0),0)+'')
There's a lot of CONVERT_IMPLICIT going on in there. Chances are one of your DBs is overflowing one of these intermediate calculations. I don't see an error on my small development box.
To debug, I would comment out each of your calculated values in turn to see which one is throwing the error. Then I'd filter out large databases using a WHERE clause; if the query works for small databases, that would be a clue. Next, strip the calculation down to its minimum and run it for just the largest database. Add the CASTs back one at a time and compare the Defined Values for arrangements where it works and where it fails.
My feeling is a single CAST to INT around the whole calculation is likely to be your best option.
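A sketch of that idea follows. The column names here come from sys.database_files rather than the question's query, so treat them as assumptions:

```sql
-- Sketch: one CAST to int around the entire expression. The 100e0 factor
-- forces float intermediate arithmetic (no decimal overflow), and the final
-- int cast just truncates the percentage.
SELECT
    name,
    CAST(100e0 * (size - FILEPROPERTY(name, 'SpaceUsed')) / size AS int) AS FreeSpacePct
FROM sys.database_files
WHERE size > 0;  -- guard against division by zero
```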
This article states that the ARITHABORT setting has a bearing, too.