lib.sedna.core.joint_inference.joint_inference

Module Contents

Classes

BigModelService

Large model service providing RESTful interfaces for inference

JointInference

Sedna provides a framework that, under limited resources on the edge, offloads difficult inference tasks to the cloud

class lib.sedna.core.joint_inference.joint_inference.BigModelService(estimator=None)[source]

Bases: sedna.core.base.JobBase

Large model service implementation. Provides RESTful interfaces for large-model inference.

Parameters:

estimator (Instance, big model) – An instance with the high-level API that greatly simplifies machine learning programming. Estimators encapsulate training, evaluation, prediction, and exporting for your model.

Examples

>>> Estimator = xgboost.XGBClassifier()
>>> BigModelService(estimator=Estimator).start()
start()[source]

Start the inference REST server.

train(train_data, valid_data=None, post_process=None, **kwargs)[source]

TODO: not supported yet.

inference(data=None, post_process=None, **kwargs)[source]

Inference task for JointInference

Parameters:
  • data (BaseDataSource) – datasource used for inference; see sedna.datasources.BaseDataSource for more detail.

  • post_process (function or a registered method) – post-processing applied to the estimator's inference output.

  • kwargs (Dict) – parameters passed to the estimator's inference, e.g. ntree_limit in xgboost.XGBClassifier.

Return type:

inference result
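The estimator passed to BigModelService is assumed to expose the usual high-level API, most importantly a predict-style method that the service calls for each request. The stand-in below is a hypothetical toy estimator (not part of sedna) that shows this minimal shape; real deployments use e.g. xgboost.XGBClassifier or a Keras model.

```python
# Hypothetical minimal estimator shape assumed by BigModelService.
# This is an illustrative sketch, not the sedna API itself.

class EchoEstimator:
    """Toy estimator exposing a predict() method."""

    def predict(self, data, **kwargs):
        # A real model would run big-model inference here.
        return [f"label-for-{x}" for x in data]

est = EchoEstimator()
print(est.predict(["img0", "img1"]))  # ['label-for-img0', 'label-for-img1']
```

In a real deployment this object would be wrapped as BigModelService(estimator=est).start(), as in the example above.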

class lib.sedna.core.joint_inference.joint_inference.JointInference(estimator=None, hard_example_mining: dict = None)[source]

Bases: sedna.core.base.JobBase

Sedna provides a framework that, under the condition of limited resources on the edge, offloads difficult inference tasks to the cloud to improve overall performance while keeping throughput.

Parameters:
  • estimator (Instance) – An instance with the high-level API that greatly simplifies machine learning programming. Estimators encapsulate training, evaluation, prediction, and exporting for your model.

  • hard_example_mining (Dict) – HEM algorithm, with parameters, which has been registered to ClassFactory; see sedna.algorithms.hard_example_mining for more detail.

Examples

>>> Estimator = keras.models.Sequential()
>>> ji_service = JointInference(
        estimator=Estimator,
        hard_example_mining={
            "method": "IBT",
            "param": {
                "threshold_img": 0.9
            }
        }
    )
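To illustrate the role of the hard_example_mining configuration, the sketch below shows the general idea behind a threshold-based miner such as IBT: samples whose inference confidence falls below threshold_img are treated as hard and offloaded to the cloud. The function name and logic here are illustrative assumptions, not the actual sedna.algorithms.hard_example_mining implementation.

```python
# Hypothetical sketch of a threshold-based hard example mining rule.
# Not the actual sedna IBT algorithm; shown only to convey the idea.

def is_hard_example(scores, threshold_img=0.9):
    """Treat a sample as hard when any inference score is below the threshold."""
    return any(s < threshold_img for s in scores)

print(is_hard_example([0.95, 0.97]))  # False: edge model is confident
print(is_hard_example([0.95, 0.42]))  # True: offload to the cloud
```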

Notes

Sedna provides an interface, get_hem_algorithm_from_config, to build the hard_example_mining parameter from the CRD definition.

classmethod get_hem_algorithm_from_config(**param)[source]

Get the algorithm name and parameters of hard_example_mining from the CRD definition.

Parameters:

param (Dict) – values used to update the parameters of hard_example_mining

Returns:

e.g.: {"method": "IBT", "param": {"threshold_img": 0.5}}

Return type:

dict

Examples

>>> JointInference.get_hem_algorithm_from_config(
        threshold_img=0.9
    )
{"method": "IBT", "param": {"threshold_img": 0.9}}
inference(data=None, post_process=None, **kwargs)[source]

Inference task with JointInference

Parameters:
  • data (BaseDataSource) – datasource used for inference; see sedna.datasources.BaseDataSource for more detail.

  • post_process (function or a registered method) – post-processing applied to the estimator's inference output.

  • kwargs (Dict) – parameters passed to the estimator's inference, e.g. ntree_limit in xgboost.XGBClassifier.

Returns:

  • whether the sample is a hard example (bool)

  • inference result (object)

  • result from the little model (object)

  • result from the big model (object)
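An edge application typically unpacks this 4-tuple and keeps the cloud result only for hard examples. The helper and the simulated tuples below are hypothetical, standing in for what a real call with a BaseDataSource would return; they only demonstrate the unpacking pattern.

```python
# Illustrative sketch: consuming the 4-tuple returned by
# JointInference.inference. Values are simulated, not real model output.

def handle_joint_result(result):
    """Pick the final prediction from a JointInference-style 4-tuple."""
    is_hard_example, final_result, edge_result, cloud_result = result
    # For hard examples the big-model (cloud) result is authoritative;
    # otherwise the little-model (edge) result is used.
    return cloud_result if is_hard_example else edge_result

easy = (False, "cat", "cat", None)    # edge model was confident
hard = (True, "dog", "cat?", "dog")   # sample was offloaded to the cloud

print(handle_joint_result(easy))  # cat
print(handle_joint_result(hard))  # dog
```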