This page describes theano.OpFromGraph, an Op that allows you to encapsulate a Theano graph inside a single Op.
This can be used to bundle some functionality into one block. It is useful for scaling Theano compilation of large graphs when the encapsulated functionality is reused many times with different inputs: because the inner graph is compiled only once, the compilation phase can be faster for graphs with many nodes.
Using this for small graphs is not recommended, as it disables optimizations between what is inside the encapsulation and what is outside of it.
This creates an Op from inputs and outputs lists of variables.
The signature is similar to theano.function(), and the resulting Op's perform() will do the same operation as:
orig_function(inputs, outputs, **kwargs)
Example 1:
from theano import function, OpFromGraph, tensor
x, y, z = tensor.scalars('xyz')
e = x + y * z
op = OpFromGraph([x, y, z], [e])
# op behaves like a normal theano op
e2 = op(x, y, z) + op(z, y, x)
fn = function([x, y, z], [e2])
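As a sanity check, the value that fn computes for concrete inputs can be reproduced by hand. The sketch below (plain Python, independent of Theano; the function names are chosen for illustration) mirrors the encapsulated graph and its double use in e2:

```python
def e(x, y, z):
    # mirrors the encapsulated graph: e = x + y * z
    return x + y * z

def e2(x, y, z):
    # mirrors e2 = op(x, y, z) + op(z, y, x)
    return e(x, y, z) + e(z, y, x)

# For x=1, y=2, z=3: (1 + 2*3) + (3 + 2*1) = 7 + 5 = 12
result = e2(1.0, 2.0, 3.0)
```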
Example 2, with a shared variable:
import numpy
import theano
from theano import config, function, OpFromGraph, tensor
x, y, z = tensor.scalars('xyz')
s = theano.shared(numpy.random.rand(2, 2).astype(config.floatX))
e = x + y * z + s
op = OpFromGraph([x, y, z], [e])
# op behaves like a normal theano op
e2 = op(x, y, z) + op(z, y, x)
fn = function([x, y, z], [e2])
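Note that the shared variable s is a 2 x 2 matrix, so the scalar expression x + y * z is broadcast against it and each call to op yields a 2 x 2 result, with s contributing to both calls. A NumPy-only sketch (illustrative names, not the Theano API) of the value fn would return:

```python
import numpy

numpy.random.seed(0)  # fixed seed, only so this sketch is reproducible
s = numpy.random.rand(2, 2)

def e(x, y, z):
    # mirrors the encapsulated graph: e = x + y * z + s (broadcast over s)
    return x + y * z + s

def e2(x, y, z):
    # mirrors e2 = op(x, y, z) + op(z, y, x); s is added in each call
    return e(x, y, z) + e(z, y, x)

# (1 + 2*3 + s) + (3 + 2*1 + s) == 12 + 2*s, a (2, 2) array
result = e2(1.0, 2.0, 3.0)
```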