Signed & unsigned integer multiplication
This post talks about what happens when you multiply signed and unsigned integers. The short answer: as long as the two operands have the same rank (size), the signed operand is implicitly typecast to unsigned.
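Here is a minimal sketch in C (assuming a 32-bit int) of what that implicit conversion does to a multiplication:

```c
#include <stdio.h>

int main(void)
{
    int a = -2;          /* signed operand */
    unsigned int b = 3;  /* unsigned operand of the same rank */

    /* Under C's usual arithmetic conversions, `a` is converted to
     * unsigned int before the multiply, so the result is computed
     * modulo 2^32 rather than coming out as -6. */
    unsigned int product = a * b;

    printf("%u\n", product);  /* prints 4294967290 (i.e. -6 mod 2^32) on a 32-bit int platform */
    return 0;
}
```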
As long as you understand the typecasting rules of whatever language you are programming in (or use explicit typecasts), and you also understand the implications of typecasting from signed to unsigned (a negative number will produce what may look like gibberish once typecast to an unsigned value), there should be no issue mixing signed and unsigned types.
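A quick illustration of that "gibberish", again assuming a 32-bit int; an explicit cast at least makes the intent obvious:

```c
#include <stdio.h>

int main(void)
{
    int n = -1;

    /* Converting -1 to unsigned int wraps around to UINT_MAX, which
     * looks like nonsense if you were expecting a small number. */
    unsigned int u = (unsigned int)n;

    printf("%u\n", u);  /* prints 4294967295 with a 32-bit int */
    return 0;
}
```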