How to standardize raster output from 0 to 100 using raster algebra?
The language of data transformation can be confusing. Standardization refers to transforming your data so that it has a mean of 0 and a standard deviation of 1, and is only appropriate for normally (Gaussian) distributed data. Normalization, on the other hand, transforms your data so that the minimum value is 0 and the maximum is 1 while preserving the shape of the original distribution. What you want is a stretch, or normalization.
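To make the distinction concrete, here is a minimal NumPy sketch (the array values are made up) showing both transformations applied to the same set of cell values:

    import numpy as np

    values = np.array([2.0, 5.0, 7.0, 11.0, 15.0])

    # Standardization (z-score): mean 0, standard deviation 1
    standardized = (values - values.mean()) / values.std()

    # Normalization (min-max): rescaled to 0-1, distribution shape preserved
    normalized = (values - values.min()) / (values.max() - values.min())

    print(standardized)  # mean ~0, std ~1
    print(normalized)    # ranges from 0.0 to 1.0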
Here is the raster algebra syntax for a data stretch. The "+ 0" term is redundant in this case but is left in for cases where the desired minimum value is not zero. The min("raster") and max("raster") terms refer to the global minimum and maximum values of the raster. I provide this form first because it lets you specify any desired output minimum and maximum.
("raster" - min("raster")) * 100 / (max("raster") - min("raster")) + 0
In your case you could simply normalize and multiply by 100. The reason for this transformation is that the index needs the summed variables to be in the same variable space.
("raster" - min("raster")) / (max("raster") - min("raster")) * 100
If you are using ArcGIS, this toolbox has a tool for statistical transformations, including normalization. For this model it is not necessary to have the inputs in a 0-100 range; you can use 0-1 and then multiply the output by 100 to get the desired data range for the index.
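For illustration, a sketch of that workflow in NumPy, assuming an equally weighted average as the index formula and made-up input layers (your model may combine and weight inputs differently):

    import numpy as np

    def normalize(arr):
        """Min-max normalize an array to the 0-1 range."""
        return (arr - np.nanmin(arr)) / (np.nanmax(arr) - np.nanmin(arr))

    # Hypothetical input layers (random values stand in for real rasters)
    slope = np.random.rand(50, 50) * 45
    rainfall = np.random.rand(50, 50) * 2000

    # Combine the 0-1 inputs, then multiply the output by 100
    # to put the final index on a 0-100 scale
    index = (normalize(slope) + normalize(rainfall)) / 2 * 100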