
Unergonomic type for size of GPUTextureDescriptor #147

Open
pietrovismara opened this issue Feb 29, 2024 · 2 comments

@pietrovismara

Accessing the properties of size in a GPUTextureDescriptor requires first casting it to GPUExtent3DDictStrict:

const texture: GPUTextureDescriptor = {
  size: { width: 1, height: 2, depthOrArrayLayers: 2 }, // No TS error here
  usage: GPUTextureUsage.COPY_DST,
  format: navigator.gpu.getPreferredCanvasFormat()
};

texture.size.width // Property 'width' does not exist on type 'GPUExtent3DStrict'.
texture.size.depthOrArrayLayers // Property 'depthOrArrayLayers' does not exist on type 'GPUExtent3DStrict'.

const size = texture.size as GPUExtent3DDictStrict;

size.width // OK
size.depthOrArrayLayers // OK

I understand this is probably because size can be either an Iterable or an object, but it isn't very ergonomic.
Perhaps GPUTextureDescriptor could take an optional generic parameter to determine the type of size?
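
For illustration, one hypothetical shape of that suggestion (nothing like this exists in @webgpu/types today; the alias name and default are made up) would be a wrapper type whose generic parameter fixes the shape of size:

// Hypothetical sketch: a descriptor type parameterized over the shape of `size`,
// defaulting to the current union so existing code would keep compiling.
type GPUTextureDescriptorWithSize<S extends GPUExtent3DStrict = GPUExtent3DStrict> =
  Omit<GPUTextureDescriptor, "size"> & { size: S };

const descriptor: GPUTextureDescriptorWithSize<GPUExtent3DDictStrict> = {
  size: { width: 1, height: 2, depthOrArrayLayers: 2 },
  usage: GPUTextureUsage.COPY_DST,
  format: navigator.gpu.getPreferredCanvasFormat(),
};

descriptor.size.width; // OK, no cast needed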

Either way, allowing size to be a union type makes dynamically handling GPUTextureDescriptors more complicated than it should be, since it forces you to check the type first.

I realize this is more of a spec issue than a types issue, but ideally size should have just one type. The drawbacks of having to check at runtime whether it's an object or an Iterable far outweigh the convenience of defining it more concisely as an array.
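
To make that runtime check concrete, here is the kind of helper (our own, not anything from WebGPU or @webgpu/types) that consumers end up writing to read a size regardless of which form it was declared in:

// Normalize a GPUExtent3DStrict, which may be an Iterable or a dictionary,
// into dictionary form so its members can be read directly.
function toExtent3DDict(size: GPUExtent3DStrict): GPUExtent3DDictStrict {
  if (Symbol.iterator in size) {
    // Iterable form: [width, height = 1, depthOrArrayLayers = 1]
    const [width, height = 1, depthOrArrayLayers = 1] = size as Iterable<number>;
    return { width, height, depthOrArrayLayers };
  }
  return size as GPUExtent3DDictStrict;
}

const { width, depthOrArrayLayers = 1 } = toExtent3DDict(texture.size);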

pietrovismara changed the title from "Incorrect type for size of GPUTextureDescriptor" to "Unergonomic type for size of GPUTextureDescriptor" on Feb 29, 2024
@kainino0x
Collaborator

kainino0x commented Feb 29, 2024

I'm aware of a few other problems like this as well; see in particular "Array vs Iterable":
#14
darionco/bikeshed-to-ts#6 (linked from #99)

Right now, the best way to deal with this is to simply not declare your own variables using the WebGPU types, but instead to use inferred types:

const texture = {
  size: { width: 1, height: 2, depthOrArrayLayers: 2 },
  usage: GPUTextureUsage.COPY_DST,
  format: navigator.gpu.getPreferredCanvasFormat()
};

or custom types when you really need them, like GPUTextureDescriptor & { size: GPUExtent3DDictStrict }.
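
For reference, a quick sketch of that intersection-type approach (the alias name here is just for illustration):

// Local alias that pins `size` to the dictionary form while staying
// assignable wherever a plain GPUTextureDescriptor is expected.
type GPUTextureDescriptorDictSize = GPUTextureDescriptor & { size: GPUExtent3DDictStrict };

const descriptor: GPUTextureDescriptorDictSize = {
  size: { width: 1, height: 2, depthOrArrayLayers: 2 },
  usage: GPUTextureUsage.COPY_DST,
  format: navigator.gpu.getPreferredCanvasFormat(),
};

descriptor.size.width; // OK, reads without a cast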

@kainino0x
Collaborator

Oh, and #29 is similar too.
