3. pydelfi package

3.1. Submodules

3.2. pydelfi.delfi module
class pydelfi.delfi.Delfi(data, prior, nde, Finv=None, theta_fiducial=None, param_limits=None, param_names=None, nwalkers=100, posterior_chain_length=1000, proposal_chain_length=100, rank=0, n_procs=1, comm=None, red_op=None, show_plot=True, results_dir='', progress_bar=True, input_normalization=None, graph_restore_filename='graph_checkpoint', restore_filename='restore.pkl', restore=False, save=True)

    Bases: object
    bayesian_optimization_training(simulator, compressor, n_batch, n_populations, n_optimizations=10, simulator_args=None, compressor_args=None, plot=False, batch_size=100, validation_split=0.1, epochs=300, patience=20, seed_generator=None, save_intermediate_posteriors=False, sub_batch=1)
    fisher_pretraining(n_batch=5000, plot=True, batch_size=100, validation_split=0.1, epochs=1000, patience=20, mode='regression')
    run_simulation_batch(n_batch, ps, simulator, compressor, simulator_args, compressor_args, seed_generator=None, sub_batch=1)
    sequential_training(simulator, compressor, n_initial, n_batch, n_populations, proposal=None, simulator_args=None, compressor_args=None, safety=5, plot=True, batch_size=100, validation_split=0.1, epochs=300, patience=20, seed_generator=None, save_intermediate_posteriors=True, sub_batch=1)
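Taken together, these methods drive a simulate–compress–train loop over successive populations of parameter points. As orientation only, here is a toy numpy sketch of that loop; the simulator, compressor, and proposal below are illustrative stand-ins, not pydelfi's API, and the NDE retraining step is left as a comment:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, seed):
    # toy stand-in: the data is the parameter vector plus Gaussian noise
    r = np.random.default_rng(seed)
    return theta + r.normal(0.0, 0.1, size=theta.shape)

def compressor(data):
    # identity compression, for illustration only
    return data

def proposal(n):
    # stand-in for the prior (population 0) or current posterior (later populations)
    return rng.uniform(-1.0, 1.0, size=(n, 2))

thetas, summaries = [], []
n_populations, n_batch = 3, 10
for population in range(n_populations):
    ps = proposal(n_batch)
    for p in ps:
        d = simulator(p, seed=int(rng.integers(1 << 31)))
        thetas.append(p)
        summaries.append(compressor(d))
    # here pydelfi would retrain its NDEs on (thetas, summaries)
    # and redraw the next population from the updated posterior

thetas = np.asarray(thetas)
summaries = np.asarray(summaries)
```

Each population adds `n_batch` (parameter, summary) pairs to the training set, so later populations are concentrated where the current posterior has mass.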
3.3. pydelfi.ndes module
class pydelfi.ndes.ConditionalGaussianMade(n_parameters, n_data, n_hiddens, act_fun, output_order='sequential', mode='sequential', input_parameters=None, input_data=None, logpdf=None)

    Bases: object

    Implements a MADE (Masked Autoencoder for Distribution Estimation) in which each conditional probability is modelled by a single Gaussian component.
    create_degrees(input_order)
        Generates a degree for each hidden and input unit. A unit with degree d can only receive input from units with degree less than d.

        :param n_hiddens: a list with the number of hidden units
        :param input_order: the order of the inputs; can be 'random', 'sequential', or an array giving an explicit order
        :param mode: the strategy for assigning degrees to hidden nodes; can be 'random' or 'sequential'
        :return: list of degrees
    create_masks(degrees)
        Creates the binary masks that make the connectivity autoregressive.

        :param degrees: a list of degrees for every layer
        :return: list of all masks, as tensorflow variables
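The two docstrings above describe the standard MADE degree-and-mask construction. Here is a minimal numpy sketch of the 'sequential' strategy under the stated rules (a unit of degree d receives only from units of degree at most d, and the output mask uses a strict inequality); the function names mirror the docs, but this is an illustration, not pydelfi's implementation:

```python
import numpy as np

def create_degrees(n_inputs, n_hiddens):
    # 'sequential' mode: input unit i gets degree i+1; hidden-unit degrees
    # cycle through 1 .. n_inputs-1 so every hidden unit can feed some output
    degrees = [np.arange(1, n_inputs + 1)]
    for n in n_hiddens:
        degrees.append(np.arange(n) % max(1, n_inputs - 1) + 1)
    return degrees

def create_masks(degrees):
    # hidden masks: a unit of degree d receives from units of degree <= d
    Ms = [(d1[None, :] >= d0[:, None]).astype(float)
          for d0, d1 in zip(degrees[:-1], degrees[1:])]
    # the output mask uses a strict inequality, so output i never depends on input i
    Mmp = (degrees[0][None, :] > degrees[-1][:, None]).astype(float)
    return Ms, Mmp
```

With 3 inputs and one hidden layer of 4 units, `Ms[0]` has shape (3, 4) and the first output (degree 1) receives no connections at all, which is exactly what the autoregressive factorisation requires.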
    create_weights_conditional(n_comps)
        Creates all learnable weight matrices and bias vectors.

        :param n_comps: number of gaussian components
        :return: weights and biases, as tensorflow variables
    eval(xy, sess, log=True)
        Evaluates log probabilities for given input-output pairs.

        :param xy: a pair (x, y) where the rows of x are inputs and the rows of y are outputs
        :param sess: tensorflow session in which the graph is run
        :param log: whether to return probabilities in the log domain
        :return: log probabilities log p(y|x)
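Because each conditional in this model is a single Gaussian, the quantity eval returns reduces to log p(y|x) = Σ_i log N(y_i | μ_i, σ_i²). A numpy sketch of that reduction, assuming the network outputs `mu` and `logsigma` are already in hand (this mirrors the math, not pydelfi's graph):

```python
import numpy as np

def gaussian_conditional_logpdf(y, mu, logsigma):
    # sum over dimensions of log N(y_i | mu_i, exp(logsigma_i)^2)
    u = (y - mu) / np.exp(logsigma)
    return np.sum(-0.5 * np.log(2.0 * np.pi) - logsigma - 0.5 * u**2, axis=-1)
```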
class pydelfi.ndes.ConditionalMaskedAutoregressiveFlow(n_parameters, n_data, n_hiddens, act_fun, n_mades, output_order='sequential', mode='sequential', input_parameters=None, input_data=None, logpdf=None, index=1)

    Bases: object

    Conditional Masked Autoregressive Flow.
    eval(xy, sess, log=True)
        Evaluates log probabilities for given input-output pairs.

        :param xy: a pair (x, y) where the rows of x are inputs and the rows of y are outputs
        :param sess: tensorflow session in which the graph is run
        :param log: whether to return probabilities in the log domain
        :return: log probabilities log p(y|x)
class pydelfi.ndes.MixtureDensityNetwork(n_parameters, n_data, n_components=3, n_hidden=[50, 50], activations=[<tensorflow activation>, <tensorflow activation>], input_parameters=None, input_data=None, logpdf=None, index=1)

    Bases: object

    Implements a Mixture Density Network for modelling p(y|x).
    eval(xy, sess, log=True)
        Evaluates log probabilities for given input-output pairs.

        :param xy: a pair (x, y) where the rows of x are inputs and the rows of y are outputs
        :param sess: tensorflow session in which the graph is run
        :param log: whether to return probabilities in the log domain
        :return: log probabilities log p(y|x)
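For a Mixture Density Network, eval instead sums over components: log p(y|x) = logsumexp_k [log α_k + log N(y | μ_k, σ_k²)]. A numpy sketch with diagonal Gaussian components, where the mixture weights α, means μ, and standard deviations σ are assumed given (in the class above they are produced by the network); this mirrors the math, not pydelfi's graph:

```python
import numpy as np

def mdn_logpdf(y, alpha, mu, sigma):
    # y: (d,), alpha: (K,), mu: (K, d), sigma: (K, d)
    u = (y[None, :] - mu) / sigma
    # per-component log density, summed over dimensions
    comp = np.sum(-0.5 * np.log(2.0 * np.pi) - np.log(sigma) - 0.5 * u**2, axis=1)
    # numerically stable logsumexp over components
    a = np.log(alpha) + comp
    m = a.max()
    return m + np.log(np.exp(a - m).sum())
```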
3.4. pydelfi.priors module

3.5. pydelfi.score module
class pydelfi.score.Gaussian(ndata, theta_fiducial, mu=None, Cinv=None, dmudt=None, dCdt=None, F=None, prior_mean=None, prior_covariance=None, rank=0, n_procs=1, comm=None, red_op=None)

    Bases: object
    compute_derivatives(simulator, nsims, h, simulator_args=None, seed_generator=None, progress_bar=True, sub_batch=1)
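compute_derivatives estimates parameter derivatives of the model by running batches of simulations at points perturbed around theta_fiducial with step h. A central-difference numpy sketch of the idea (the averaging scheme and function names here are illustrative assumptions, not pydelfi's exact implementation):

```python
import numpy as np

def numerical_dmudt(simulator, theta_fiducial, h, nsims, seed=0):
    # dmu/dtheta_j ~ (mean_sim(theta + h e_j) - mean_sim(theta - h e_j)) / (2h)
    rng = np.random.default_rng(seed)
    npar = len(theta_fiducial)

    def mean_sim(theta):
        # average nsims realisations to beat down simulation noise
        return np.mean([simulator(theta, rng) for _ in range(nsims)], axis=0)

    dmudt = []
    for j in range(npar):
        step = np.zeros(npar)
        step[j] = h
        dmudt.append((mean_sim(theta_fiducial + step)
                      - mean_sim(theta_fiducial - step)) / (2.0 * h))
    return np.array(dmudt)
```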
3.6. pydelfi.train module
class pydelfi.train.ConditionalTrainer(model, optimizer=<tensorflow optimizer>, optimizer_arguments={})

    Bases: object
    train(sess, train_data, validation_split=0.1, epochs=1000, batch_size=100, patience=20, saver_name='tmp_model', progress_bar=True, mode='samples')
        Training function to be called, with the desired parameters, within a tensorflow session.

        :param sess: tensorflow session in which the graph is run
        :param train_data: a tuple/list (X, Y) of training data, where Y is conditioned on X
        :param validation_split: fraction of the training data randomly selected for validation
        :param epochs: maximum number of epochs for training
        :param batch_size: batch size of each batch within an epoch
        :param patience: number of epochs without validation improvement allowed before early stopping
        :param saver_name: name (with or without folder) under which the model is saved; if none is given, a temporary model is used to save and restore the best model, and is removed afterwards
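The patience-based early stopping described above (stop once the validation loss has failed to improve for `patience` epochs, checkpointing at each new best) can be sketched framework-agnostically; `fit_with_patience` is a hypothetical helper that consumes one validation loss per epoch, not pydelfi's API:

```python
def fit_with_patience(losses_per_epoch, patience=20):
    # losses_per_epoch: iterable of validation losses, one per epoch.
    # Stop after `patience` epochs without improvement; return (best_loss, best_epoch).
    best_loss, best_epoch, bad_epochs = float("inf"), -1, 0
    for epoch, loss in enumerate(losses_per_epoch):
        if loss < best_loss:
            best_loss, best_epoch, bad_epochs = loss, epoch, 0
            # here the trainer would checkpoint the model under saver_name
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break
    return best_loss, best_epoch
```

On exit, the trainer restores the checkpointed best model rather than the final-epoch weights, so the returned model corresponds to `best_epoch`.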