Add a new Op#

Don’t define new Ops unless you have to#

It is usually not worthwhile to define a new Op for something that can be easily expressed with existing Ops. For example, instead of writing a “sum_square_difference” Op, you should probably just write a simple function:

from aesara import tensor as at

def sum_square_difference(a, b):
    # Compose existing Ops (subtraction, power, sum) into the new operation.
    return at.sum((a - b)**2)

Even without taking Aesara’s rewrites into account, this is likely to perform just as well as a custom implementation. It also supports all data types and tensors of any number of dimensions, as well as broadcasting, whereas a custom implementation would probably only bother to support contiguous vectors or matrices of doubles.
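For instance, the composed function can be compiled and evaluated like any other Aesara graph. A minimal sketch (the variable names are just for illustration):

from aesara import tensor as at
import aesara
import numpy as np

a = at.matrix("a")
b = at.matrix("b")
f = aesara.function([a, b], sum_square_difference(a, b))

# Broadcasting and support for other shapes and dtypes come for free
# from the existing Ops used in the graph.
print(f(np.ones((2, 3), dtype=a.dtype), np.zeros((2, 3), dtype=a.dtype)))  # 6.0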

Use Aesara’s higher-order Ops when applicable#

Aesara provides some generic Op classes that let you create many Ops with much less effort. For instance, Elemwise can be used to build elementwise operations easily, while DimShuffle can be used to express transpose-like transformations. These higher-order Ops are mostly tensor-related, as that is Aesara’s specialty.
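As a rough sketch of how these higher-order Ops are used (assuming the aesara.scalar and aesara.tensor.elemwise module paths of recent Aesara versions), Elemwise lifts a scalar Op so that it applies elementwise to tensors, while DimShuffle is usually reached through the dimshuffle method on tensor variables:

import aesara.scalar as aes
from aesara import tensor as at
from aesara.tensor.elemwise import Elemwise

# Elemwise turns a scalar Op into one that works elementwise on tensors,
# with broadcasting; at.add, at.mul, etc. are constructed this way.
tensor_add = Elemwise(aes.add)

x = at.matrix("x")
y = at.matrix("y")
z = tensor_add(x, y)            # equivalent graph to at.add(x, y)

# DimShuffle expresses transpose-like reorderings and the insertion of
# broadcastable dimensions; it is normally used via dimshuffle().
xt = x.dimshuffle(1, 0)         # transpose
xb = x.dimshuffle(0, 1, "x")    # append a broadcastable axis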