by: Kevin Broløs & Chris Cave
(Feyn version 2.0 or newer)
Models can be saved to a file for use at a later time or place. Here is an example:
```python
import feyn
from feyn.datasets import make_classification

# Generate a dataset and put it into train and test DataFrames
train, test = make_classification()

# Connect to a QLattice and run a classification simulation
ql = feyn.connect_qlattice()
models = ql.auto_run(
    data=train,
    output_name='y',
    kind='classification'
)

# auto_run returns a list of the best models; save the best one to a file
best = models[0]
best.save('my_model.json')
```
You can load the Model using `feyn.Model.load`.
```python
from feyn import Model

# Load the model from file and use it for predictions
model = Model.load('my_model.json')
prediction = model.predict(test)
```
Once a Model is saved and selected, you can load it into any Python environment to make predictions. You don't need access to a QLattice for this.

A loaded Model is no different from a Model sampled from the QLattice. For example, you can resume fitting a Model after loading it.