Kale Transformer Artifact¶
`kale.Transformer` is an MLMD `ArtifactType` which allows the logging of transformer components in MLMD.
Import¶
The object lives in the `kale.common.artifacts` module. Import it as follows:
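A minimal sketch, assuming the artifact is exported as `Transformer` from the `kale.common.artifacts` module mentioned above:

```python
# Assumption: the artifact class is named Transformer in kale.common.artifacts.
from kale.common.artifacts import Transformer
```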
Attributes¶
Name | Type | Default | Description |
---|---|---|---|
`name` | `str` | `None` | The name of the transformer |
`transformer_dir` | `str` | `None` | The path in which Kale stores the transformer package (assets and functions) |
`module_name` | `str` | `None` | The name of the module in which you define the transformer class |
`class_name` | `str` | `None` | The name of the transformer class |
`preprocess` | `function` | `None` | The preprocess function the transformer will use |
`postprocess` | `function` | `None` | The postprocess function the transformer will use |
`transformer_assets` | `Dict[str, variable]` | `None` | Any global variables that the transformer may depend on |
`is_statefull` | `boolean` | `False` | `True` if the transformer depends on stateful global variables written for a specific dataset, for example a word vectorizer; `False` otherwise |
Initialization¶
There are two APIs that you can use to create a `kale.Transformer` artifact:
- The Subclassing API
- The Functional API
Important
The two APIs are mutually exclusive. You can use one or the other, but not both.
Choose one of the following options, based on the API you want to use.
To use the Subclassing API, create a Python module, that is, a `.py` file, that defines the transformer object. The transformer object extends the `kserve.Model` class, which KServe provides, and overrides its `preprocess` and `postprocess` methods. For example:
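A minimal sketch of such a module. The module name (`transformer.py`), the scaling logic, and the method signatures are illustrative assumptions; check the KServe version you use for the exact `kserve.Model` interface.

```python
# transformer.py -- a hypothetical transformer module (sketch).
import kserve


class Transformer(kserve.Model):
    def preprocess(self, payload, headers=None):
        # Transform the raw request before it reaches the predictor.
        # Example transformation (assumption): scale pixel values to [0, 1].
        instances = payload["instances"]
        scaled = [[x / 255.0 for x in row] for row in instances]
        return {"instances": scaled}

    def postprocess(self, response, headers=None):
        # Transform the predictor's response before returning it to the client.
        return response
```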
Define a `kale.Transformer` artifact and pass the name of the folder containing the transformer module you defined previously to the `transformer_dir` attribute. Also, pass the name of the module, for example `transformer`, as well as the name of the class, for example `Transformer`.
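A sketch of this step, assuming the artifact is importable from `kale.common.artifacts` and accepts these keyword arguments; the exact constructor signature may differ.

```python
# Hypothetical example: the attribute names follow the table above,
# but the exact constructor signature is an assumption.
from kale.common.artifacts import Transformer

transformer = Transformer(
    name="my-transformer",
    transformer_dir="/home/user/my-transformer",  # always an absolute path
    module_name="transformer",   # the module defined previously (transformer.py)
    class_name="Transformer",    # the class inside that module
)
```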
Important
Always use absolute paths when specifying the path to the transformer folder.
See also
- For a complete example, check out the Scikit Learn models user guide.
To use the Functional API, provide the `preprocess` and `postprocess` functions that the transformer will use. For example:
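A minimal sketch of the Functional API. The scaling and thresholding logic is illustrative, and the `Transformer` call at the end is an assumption about the constructor, shown as a comment:

```python
def preprocess(inputs):
    # Standalone function: any modules it depends on must be imported
    # inside its body. This example uses only builtins.
    # Example transformation (assumption): scale pixel values to [0, 1].
    instances = inputs["instances"]
    return {"instances": [[x / 255.0 for x in row] for row in instances]}


def postprocess(response):
    # Example transformation (assumption): threshold probabilities at 0.5.
    predictions = response["predictions"]
    return {"predictions": [int(p > 0.5) for p in predictions]}


# Pass the functions to the artifact (hypothetical constructor call):
# from kale.common.artifacts import Transformer
# transformer = Transformer(name="my-transformer",
#                           preprocess=preprocess,
#                           postprocess=postprocess)
```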
Important
The `preprocess` and `postprocess` functions must be standalone functions. Thus, import all the Python modules each function depends on within its body.