Saving and loading models

by: Kevin Broløs & Chris Cave
(Feyn version 3.0 or newer)

Models can be saved to a file so they can be used at a later time or in a different environment.

Note: Due to incompatibilities between QLattice versions, models saved with Feyn versions prior to 2.1 cannot be loaded. If you need access to older models, we recommend using an earlier version of feyn.

Here is an example of how to save a model:

import feyn
from feyn.datasets import make_classification

# Generate a synthetic dataset as train and test DataFrames
train, test = make_classification()

# Instantiate a QLattice and run a classification simulation
ql = feyn.QLattice()
models = ql.auto_run(
    data=train,
    output_name='y',
    kind='classification'
)

# Save a model to a file
models[0].save('my_model.json')

You can load the Model using feyn.Model.load.

from feyn import Model

# Load the model from the saved file
model = Model.load('my_model.json')

# Use the loaded model to make predictions on new data
prediction = model.predict(test)

Once a Model has been saved, you can load it into any Python environment to make predictions.

A loaded Model is no different from a Model sampled from the QLattice. For example, you can resume fitting a Model after loading it.
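Below is a minimal sketch of resuming fitting on a loaded model. It assumes the feyn.fit_models primitive (see Fitting models under Primitive Operations) accepts a list of models together with a training DataFrame and returns the fitted models, and it reuses the train and test DataFrames generated earlier.

import feyn
from feyn import Model

# Load the previously saved model
model = Model.load('my_model.json')

# Resume fitting the loaded model on the training data.
# fit_models takes a list of models and returns them after fitting.
fitted_models = feyn.fit_models([model], train)

# The refitted model predicts just like before
prediction = fitted_models[0].predict(test)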
