How to type a TypeScript array to accept only a specific set of values?
So, your problem seems to be this:
type Fruit = "Apple" | "Pear";
interface FruitFilter {
fruits: Fruit[];
}
declare function Fruits(filter: FruitFilter): boolean;
Fruits({ fruits: ["Apple", "Apple", "Pear"] }); // okay
Fruits({ fruits: ["Apple", "App1e", "Pear"] }); // error
// actual error: ~~~~~~~ ~~~~~~~ ~~~~~~ <-- string not assignable to Fruit
// expected error: ~~~~~~~ <-- "App1e" not assignable to Fruit
It's not that you have an error, but that the error isn't properly constrained to the "bad" elements of the array.
My guess about why this is happening is that the compiler tends to widen string literals to string and tuple types to arrays unless you give it hints not to do that. Therefore, when it can't verify that fruits is of type Fruit[], it backs up and looks at what you gave it. It widens ["Apple", "App1e", "Pear"] to string[] (forgetting both about the string literals and the fact that it is a three-element tuple), realizes that string[] is not assignable to Fruit[], and then proceeds to warn you about this by flagging each element. I did a brief search of GitHub issues to see if this has ever been reported, but I haven't seen it. It may be worth filing something.
Anyway, to test my guess, I decided to alter the declaration of Fruits() to hint that we want a tuple of string literals if at all possible. Note that there is currently no convenient way to do this; the ways to do hinting right now are, uh, alchemical:
// ⚗️❓
declare function Fruits2<S extends string, T extends S[] | [S]>(arr: {
fruits: T & { [K in keyof T]: Fruit };
}): boolean;
Fruits2({ fruits: ["Apple", "Apple", "Pear"] }); // okay
Fruits2({ fruits: ["Apple", "App1e", "Pear"] }); // error
// ~~~~~~~ <-- string is not assignable to never
Well, the placement of that error is where you want it, although the message is possibly still confusing. That's what happens when the compiler tries to assign "App1e" to the intersection Fruit & "App1e", which doesn't exist. The compiler reduces Fruit & "App1e" to never... correctly, but possibly a bit too soon for the error message to be useful.
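You can see the same reduction in isolation (a tiny sketch; the alias names are made up for illustration):
type Impossible = Fruit & "App1e"; // reduces to never: no string is both ("Apple" | "Pear") and "App1e"
type StillFruit = Fruit & "Apple"; // "Apple": intersecting a union with one of its own members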
Anyway, I don't recommend this "solution" since it's much more complicated and only gives you a somewhat better error experience in error situations. But at least this is something like an answer as to why it's happening, along with a possible direction for how to address it (e.g., find or file an issue about it). Okay, good luck!
This works as well, if you don't want a separate type alias:
export interface MyInterface {
fruits: Array<'apple' | 'pear' | 'strawberry'>
}
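A quick check of how that behaves (variable names here are just for illustration; depending on your TypeScript version the error may land on the offending element or, as described in the question, on the widened array):
const ok: MyInterface = { fruits: ['apple', 'pear'] };
const bad: MyInterface = { fruits: ['apple', 'banana'] }; // error: 'banana' is not in the union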
You might also use Enums for that:
enum Fruits {
Apple,
Pear,
}
interface FruitFilter {
fruits: Array<Fruits>;
}
These members will be compiled to the numbers 0 and 1 in plain JavaScript.
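For example, the numeric enum exists as a plain object at runtime, including a reverse mapping from number back to name (a small sketch):
console.log(Fruits.Apple); // 0
console.log(Fruits[0]);    // "Apple" (reverse mapping, numeric enums only)
const numericFilter: FruitFilter = { fruits: [Fruits.Apple, Fruits.Pear] }; // okay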
If you need to, you can also use strings instead of numbers. In that case, you have to define the enum like this:
enum Fruits {
Apple = 'Apple',
Pear = 'Pear',
}
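One thing to keep in mind with a string enum (a small sketch, reusing the FruitFilter interface from above): plain string literals are not assignable to the enum type, so you have to use the enum members themselves.
const stringFilter: FruitFilter = { fruits: [Fruits.Apple, Fruits.Pear] }; // okay
const alsoBad: FruitFilter = { fruits: ['Apple'] }; // error: 'Apple' is not assignable to Fruits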
The TypeScript doc has some more examples and how this is being used at runtime:
https://www.typescriptlang.org/docs/handbook/enums.html#enums-at-runtime