GPUTexture

class wgpu.GPUTexture

Bases: GPUObjectBase

Represents a 1D, 2D or 3D color image object.

A texture can also have mipmaps (levels of varying detail) and array layers. The texture represents the “raw” data; a GPUTextureView is used to define how the texture data should be interpreted.

Create a texture using GPUDevice.create_texture().
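A minimal sketch of creating a texture, assuming a device obtained via the wgpu.utils.get_default_device() convenience helper; the size, format, and usage flags below are illustrative:

    import wgpu
    from wgpu.utils import get_default_device

    device = get_default_device()  # requests an adapter and device with default options

    # A 256x256 RGBA texture that can be sampled in shaders and written to by copies
    texture = device.create_texture(
        label="example-texture",
        size=(256, 256, 1),  # width, height, depth_or_array_layers
        format=wgpu.TextureFormat.rgba8unorm,
        usage=wgpu.TextureUsage.TEXTURE_BINDING | wgpu.TextureUsage.COPY_DST,
    )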

create_view(**parameters)

Create a GPUTextureView object.

If no arguments are given, a default view is created, with the same format and dimension as the texture; see the example below the parameter list.

Parameters:
  • label (str) – A human-readable label. Optional.

  • format (enums.TextureFormat) – What channels it stores and how.

  • dimension (enums.TextureViewDimension) – The dimensionality of the texture view.

  • aspect (enums.TextureAspect) – Whether this view is used for depth, stencil, or all. Default all.

  • base_mip_level (int) – The starting mip level. Default 0.

  • mip_level_count (int) – The number of mip levels. Default None (all levels from base_mip_level onward).

  • base_array_layer (int) – The starting array layer. Default 0.

  • array_layer_count (int) – The number of array layers. Default None (all layers from base_array_layer onward).
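A short sketch of both the default and an explicit view, continuing from the texture created above (the chosen ranges are illustrative):

    # Default view: same format and dimension as the texture
    default_view = texture.create_view()

    # Explicit view, restricted to the first mip level and array layer
    view = texture.create_view(
        label="example-view",
        format=wgpu.TextureFormat.rgba8unorm,
        dimension="2d",  # enum values can also be passed as plain strings
        aspect="all",
        base_mip_level=0,
        mip_level_count=1,
        base_array_layer=0,
        array_layer_count=1,
    )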

property depth_or_array_layers

The texture’s depth or number of layers. Also see .size.

destroy()

An application that no longer requires a texture can choose to destroy it. Note that this is automatically called when the Python object is cleaned up by the garbage collector.
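For example, continuing with the texture created above:

    # Free the GPU memory now rather than waiting for Python's garbage collector;
    # afterwards the texture (and views created from it) must not be used in GPU commands.
    texture.destroy()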

property dimension

The dimension of the texture.

property format

The format of the texture.

property height

The texture’s height. Also see .size.

property mip_level_count

The total number of mipmap levels of the texture.

property sample_count

The number of samples per texel of the texture.

property size

The size of the texture in mipmap level 0, as a 3-tuple of ints.

property usage

The allowed usages for this texture.

property width

The texture’s width. Also see .size.
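A small sketch showing how these properties relate, using the texture created earlier (the printed values assume that example's parameters):

    assert texture.size == (texture.width, texture.height, texture.depth_or_array_layers)

    print(texture.dimension)        # "2d"
    print(texture.format)           # "rgba8unorm"
    print(texture.mip_level_count)  # 1
    print(texture.sample_count)     # 1
    print(texture.usage)            # int bitmask combining wgpu.TextureUsage flags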