CoMETS v1.1.0 release notes

Introduction


CoMETS (the Cosmo Model Experimentation Toolbox) is a Python library that aims to simplify and automate experimentation with our Cosmo Tech models.

For a non-expert end user, the standard experimentation procedure is to construct What-If scenarios by manually changing lever values in a Cosmo Tech web application. Each change triggers a new scenario, in which the full run template is executed. The modification is handled by the Parameter Handler, which sets the lever to the user’s new input value.

An alternative procedure, better suited to advanced experimentation, is to define an ensemble of simulation runs directly in the run template engine. An example is the uncertainty analysis experiment implemented in our Supply Chain pre-packaged solution, where a large number of simulations are launched together within a single execution of the run template.

CoMETS aims at reproducing and extending this second experimentation procedure. It provides unified, easy-to-use and model-agnostic tools for configuring advanced experiments in the run template engine, using the Python language.

CoMETS has to be installed on the same machine as the simulator. An interface to communicate with the simulator can be built in a few lines of code. CoMETS then provides a single, user-friendly syntax for calling algorithms that perform Optimization, Uncertainty Analysis and many other types of experimentation on the simulator, as sketched below.
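As a rough sketch of this workflow, the snippet below builds an interface to a simulator and runs an uncertainty analysis through the unified experiment syntax. The class names (CosmoInterface, ModelTask, UncertaintyAnalysis) follow the patterns shown in the CoMETS documentation, while the simulator path, datapaths and sampling specification are placeholder assumptions.

```python
import comets

# Interface to a simulator installed on the same machine; the simulator path
# below is a placeholder standing in for the project's actual simulation file.
simulator = comets.CosmoInterface(simulator_path="MyProjectSimulation")

def get_outcomes(modelinterface):
    # Collect the quantity of interest after each simulation run
    # (the datapath and accessor used here are illustrative assumptions).
    datapath = "Model::{Entity}MyEntity::@MyOutput"
    return {"MyOutput": modelinterface.get_outputs([datapath])[datapath]}

task = comets.ModelTask(modelinterface=simulator, get_outcomes=get_outcomes)

# Same unified syntax for every experiment type: here, an uncertainty analysis
# that samples an input attribute and aggregates statistics of the output.
experiment = comets.UncertaintyAnalysis(
    task=task,
    sampling=[{
        "name": "Model::{Entity}MyEntity::@MyInput",
        "sampling": "uniform",
        "parameters": {"loc": 0, "scale": 10},
    }],
    n_jobs=4,
)
experiment.run()
print(experiment.results)
```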

CoMETS targets two usage modes depending on the user profile:

Using CoMETS for a solution developer

Release notes

Version 1.1.0

Changes:

Experiments

Implementation of a new optimization algorithm and creation of a new experiment.

Optimization

Bayesian optimization is a black-box optimization algorithm that uses an acquisition function, combined with a surrogate model of the objective function, to decide which point to evaluate next. It is particularly useful for optimizing expensive-to-evaluate functions. For more information on the use of this algorithm, see the CoMETS documentation.
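As a standalone illustration of the principle (using scikit-optimize rather than the CoMETS API, with a cheap analytic function standing in for an expensive simulation):

```python
from skopt import gp_minimize

def expensive_objective(x):
    # Stand-in for a costly simulation run: a quadratic with its minimum at x = 2.
    return (x[0] - 2.0) ** 2 + 1.0

result = gp_minimize(
    expensive_objective,        # black-box objective function
    dimensions=[(-5.0, 5.0)],   # search space of the single input
    acq_func="EI",              # acquisition function (expected improvement)
    n_calls=20,                 # total budget of objective evaluations
    random_state=0,
)
print(result.x, result.fun)     # best input found and its objective value
```

The surrogate model (here a Gaussian process) is refitted after each evaluation, and the acquisition function trades off exploring uncertain regions against exploiting promising ones, which is what keeps the number of expensive evaluations low.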

Sensitivity analysis

A sensitivity analysis is the study of how variations in the output of a Task can be divided and allocated to variations in its inputs. This experiment comes in two types: local and global. For more information on the differences between these two types, please see the CoMETS documentation.
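To make the distinction concrete, here is a toy NumPy sketch (independent of the CoMETS API) of both flavours on a simple analytic function: local sensitivity as finite-difference derivatives at a nominal point, and global sensitivity as the share of output variance attributable to each input.

```python
import numpy as np

def model(x1, x2):
    # Simple analytic stand-in for a Task: the output depends more strongly on x2.
    return x1 + 3.0 * x2

# Local sensitivity: finite-difference derivatives around a nominal point.
x1_0, x2_0, h = 1.0, 1.0, 1e-6
d_x1 = (model(x1_0 + h, x2_0) - model(x1_0, x2_0)) / h
d_x2 = (model(x1_0, x2_0 + h) - model(x1_0, x2_0)) / h
print("local sensitivities:", d_x1, d_x2)        # ~1.0 and ~3.0

# Global sensitivity: share of the output variance explained by each input
# when the inputs vary over their whole range.
rng = np.random.default_rng(0)
x1 = rng.uniform(0.0, 1.0, 10_000)
x2 = rng.uniform(0.0, 1.0, 10_000)
var_y = model(x1, x2).var()
# For this additive model, freezing one input at its mean removes exactly
# that input's contribution to the output variance.
s1 = model(x1, x2.mean()).var() / var_y          # ~0.1
s2 = model(x1.mean(), x2).var() / var_y          # ~0.9
print("global variance shares:", s1, s2)
```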

Model Interface

Datapaths provide access to entities and attributes in all CoSML models.
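For illustration only, the sketch below shows how datapath strings could be used to name the attributes an experiment reads and writes; the entity and attribute names, and the get_outputs accessor, are assumptions modelled on the examples in the CoMETS documentation.

```python
# Hypothetical datapaths: each string names one attribute of one entity
# in a CoSML model (entity and attribute names are placeholders).
waiters_path = "Model::{Entity}MyBar::@NbWaiters"                 # an input lever
satisfaction_path = "Model::{Entity}MyBar::@AverageSatisfaction"  # an output

# A ParameterSet mapping datapaths to values, as it could be passed to a
# model interface to overwrite attributes before a simulation run.
parameter_set = {waiters_path: 5}

def get_outcomes(modelinterface):
    # Read an output attribute back through its datapath after the run.
    outputs = modelinterface.get_outputs([satisfaction_path])
    return {"AverageSatisfaction": outputs[satisfaction_path]}
```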