Issue
In the following example:

```typescript
function fn(_: number | { a: number }) {}

declare const x: number & { a: string };
//                              ^ the property is type string instead of number
fn(x); // no error

declare const y: string & { a: number };
//               ^ the primitive part is string instead of number
fn(y); // no error
```
Is there a way to prevent such intersection types from being accepted, or to prevent the intersection type from being partly incorrect?

Thank you.
Edit:

You could get the intersection type by doing `Object.assign(1, { a: 9 })`.
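As a quick illustration of what that expression produces (a sketch; note that at runtime `Object.assign` boxes the primitive, so the result is a Number object carrying the extra property, even though TypeScript types it as the intersection):

```typescript
// TypeScript infers number & { a: number } for this expression,
// though at runtime the primitive 1 is boxed into a Number object.
const v = Object.assign(1, { a: 9 });

console.log(typeof v);  // "object" (the primitive was boxed)
console.log(Number(v)); // 1
console.log(v.a);       // 9
```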
Since real overloads don't exist in JavaScript, the purpose of such a function would be to do one thing when the argument is a number (namely, use it as an index) and another when the argument is a record with properties of a particular type.

To be able to operate on both kinds of argument with a single function (like a JavaScript "overload"), I would need to be able to restrict the intersection type, which is why I'm asking whether that's possible.
Thank you.
Solution
TypeScript doesn't want to help enforce your constraint because it goes against the rules of the type system, such as: every member of a union is assignable to the union (`T` is assignable to `T | U` for all `T` and `U`) and an intersection is assignable to every member of the intersection (`T & U` is assignable to `T` for all `T` and `U`). That means a normal function that accepts `number | { a: number }` should also accept `number`, and therefore `number & { a: string }`. And similarly it should accept `{ a: number }`, and therefore `string & { a: number }`. If you find yourself wanting to prevent such assignments, you might want to spend some time examining your use cases to make sure that you've got a very good reason to do so, since you'll be working against the TypeScript type system. I will assume from now on that we do want to proceed.
Intersections of primitive types with object types in TypeScript are a convenient fiction, allowed only because they help simulate nominal types; see the FAQ entry for "Can I make type aliases nominal?". These are known as "branded primitives". Technically such types cannot exist, and they "should" therefore reduce to the `never` type. The fact that this does not happen means that we can expect to see some weird exceptional behavior.
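As a minimal sketch of the branded-primitive pattern (the `UserId` name and `__brand` property are hypothetical, chosen just for illustration):

```typescript
// A branded primitive: a string the type system treats as distinct.
type UserId = string & { __brand: "UserId" };

function makeUserId(raw: string): UserId {
  // The brand exists only at the type level; at runtime this is a plain string.
  return raw as UserId;
}

const id: UserId = makeUserId("u-123");
const s: string = id;            // okay: the intersection is assignable to string
// const bad: UserId = "u-456"; // error: a plain string lacks the brand
console.log(typeof id);          // "string": nothing object-like exists at runtime
```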
The particular weird thing is that `string & { a: number }` is seen as assignable both to `string` and to `object`, even though `string & object` is `never`. If you try to express "a `number` that is not an `object`" or "a `{ a: number }` that is an `object`", those end up looking like just `number` and `{ a: number }` for any reasonable implementation, and nothing changes.
The only way I can think of to express this is via generics and conditional types. Instead of checking against, say, `string & object` (which is `never`), we check something like `T extends string ? T extends object ? ⋯ : ⋯ : ⋯`.
Here's a possible implementation:
```typescript
type Primitive = string | number | bigint | null | undefined | boolean | symbol;

function fn<T extends number | { a: number }>(
  _: T extends object ? T extends Primitive ? number & { a: number } : T : T
) { }
```
I've defined `Primitive` as the union of all primitive types, so that we prevent all (non-`number`) primitives and not just `string`. The check `T extends object ? T extends Primitive ? ⋯ : T : T` will essentially only catch branded primitives, which are assignable both to `Primitive` and to `object`. Then, once we know we have a branded primitive, we can make sure we only accept the one you want to accept: `number & { a: number }`.
Let's test it out:
```typescript
fn(1); // okay
fn("abc"); // error
fn({ a: 9 }); // okay
fn({ a: "abc" }); // error
fn(Object.assign(1, { a: 9 })); // okay
fn(Object.assign("abc", { a: 9 })); // error
fn(Object.assign(1, { a: "abc" })); // error
```
Looks good.
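Finally, as a sketch of the "JavaScript overload" use case from the question (the function body here is hypothetical; note that a branded value built with `Object.assign` is boxed, so `typeof` reports "object" and it would take the record branch at runtime):

```typescript
// Act on a number as an index, and on a record via its property.
function pick(arg: number | { a: number }, list: number[]): number {
  if (typeof arg === "number") {
    return list[arg]; // number: use it as an index
  }
  return arg.a;       // record: use the property
}

console.log(pick(1, [10, 20, 30])); // 20
console.log(pick({ a: 9 }, []));    // 9
```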
Answered By - jcalz