
Why pyfields?

During the few years I spent exploring the python world, I tried several times to find a "good" way to create classes where fields could be

  • declared explicitly in a compact way
  • with optional validation and conversion
  • with as little call overhead as possible
  • without messing with the __init__ and __setattr__ methods

I discovered:

  • @property, which is a good start but adds a python call cost on every access and does not offer a compact way to declare validation and conversion. It relies on the generic python descriptor mechanism (see the first sketch after this list).

  • attrs, a great way to define classes with many out-of-the-box features (representation, hashing, constructor, ...). Its philosophy is that objects should be immutable (they can be mutable, and actually are by default, but validators are not executed on value modification as of 0.19; see the second sketch after this list). It works by generating a "smart" __init__ method that contains all the logic (see here), and possibly a __setattr__ if you ask for immutable objects with frozen=True.

  • autoclass was one of my first open-source projects in python: I tried to create a less optimized alternative to attrs that would at least support basic use cases. The main difference with attrs is that fields are defined using the __init__ signature instead of class attributes, and that it is possible to define custom setters to perform validation, which are effectively called on value modification. At the time I also developed a validation library, valid8, that works with autoclass and attrs. The result has been used in industrial projects, but it is still not satisfying: relying on the __init__ signature to define the fields is neither very elegant nor flexible, in particular in case of multiple inheritance.

  • PEP557 dataclasses was largely inspired by and is roughly equivalent to attrs, although a few design choices differ and its scope seems more limited.
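To make the first two points concrete, here are two minimal sketches. The first shows what a single validated field looks like with plain @property (the Wall/height names are made up for illustration): each field needs its own getter/setter pair, and every read pays a python-level call.

```python
class Wall:
    def __init__(self, height):
        self.height = height  # goes through the setter below

    @property
    def height(self):
        # every read now goes through an extra python-level call
        return self._height

    @height.setter
    def height(self, value):
        # validation (and any conversion) must be hand-written per field
        if not isinstance(value, int) or value <= 0:
            raise ValueError("height should be a positive int, got %r" % value)
        self._height = value
```

The second sketch illustrates the attrs remark above, assuming the classic @attr.s / attr.ib API (the Wall example is again hypothetical): the validator runs in the generated __init__, but not on later attribute assignment.

```python
import attr

@attr.s
class Wall:
    height = attr.ib(validator=attr.validators.instance_of(int))

w = Wall(height=1)        # validator runs inside the generated __init__
w.height = "not an int"   # ...but is not executed on plain assignment
```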

In parallel I discovered a few libraries oriented towards data modelling and serialization:

  • marshmallow, an ORM / ODM / framework-agnostic library for converting complex datatypes, such as objects, to and from native Python datatypes.

  • related is another library oriented towards converting data models from/to json/yaml/python.

  • colander

  • django forms

This topic was left aside for a while, until mid-2019, when I thought I had accumulated enough python expertise (with makefun, decopatch and many pytest libraries) to take a fresh look at it. In the meantime I had discovered:

  • traitlets, which provides a quite elegant way to define typed fields and their validation, but requires classes to inherit from HasTraits and does not allow users to define converters.

  • traits

  • werkzeug's @cached_property and sagemath's @lazy_attribute, which both rely on the descriptor protocol to define class fields, but lack compactness.

  • zopeinterface, targeting the definition of strict interfaces (but including attributes in their definition). It also defines the concept of "invariants".

  • pydantic embraces python 3.6+ type hints (which can be defined on class attributes). It is quite elegant, is compliant with dataclasses, and supports validators that can act on single or multiple fields. It requires classes to inherit from BaseModel. It does not seem to support user-defined converters as of version 0.32; rather, some type conversion happens behind the scenes (see for example this issue, and the sketch after this list). But it definitely looks promising.

  • trellis, which provides an event-driven framework for class attributes with linked effects.
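To illustrate the behind-the-scenes conversion mentioned in the pydantic item, here is a minimal sketch (hypothetical Point model; behaviour as observed on pydantic v1-era releases, so take it as an assumption for other versions):

```python
from pydantic import BaseModel

class Point(BaseModel):
    x: int
    y: int

# compatible inputs are silently coerced to the declared types,
# rather than going through a user-defined converter
p = Point(x="1", y=2.0)
print(p.x, type(p.x))   # 1 <class 'int'>
print(p.y, type(p.y))   # 2 <class 'int'>
```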

I was still not satisfied by the landscape :(. So I wrote this alternative; maybe it can fit some use cases! Do not hesitate to provide feedback on the issues page.