lib.sedna.service.server.inference

Module Contents

Classes

InferenceServer

REST API server for inference.

class lib.sedna.service.server.inference.InferenceServer(model, servername, host: str = '127.0.0.1', http_port: int = 8080, max_buffer_size: int = 104857600, workers: int = 1)[source]

Bases: lib.sedna.service.server.base.BaseServer

REST API server for inference. Wraps a model object and exposes it over HTTP on the configured host and port.

start()[source]

Start the HTTP server and begin accepting inference requests.

model_info()[source]

Return metadata describing the served model.

predict(data: InferenceItem)[source]

Run inference on the submitted data and return the model's prediction.
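A minimal usage sketch, assuming only what the constructor signature above shows: any object exposing a predict() method (and optionally model_info()) can be served. ExampleModel below is a hypothetical stand-in, not part of the Sedna library; the real server start is shown commented out because start() is a blocking call.

```python
# Hypothetical model wrapper: any object with predict() works here.
class ExampleModel:
    """Stand-in model: returns the length of its input as the result."""

    def predict(self, data):
        # The server's predict() endpoint ultimately delegates to this method.
        return {"result": len(data)}

    def model_info(self):
        # Served by the model_info() endpoint.
        return {"name": "example", "version": "0.1"}


model = ExampleModel()

# With the real library installed, the server would be started like this
# (start() blocks, so it is commented out in this sketch):
#
# from lib.sedna.service.server.inference import InferenceServer
# server = InferenceServer(model, servername="demo",
#                          host="0.0.0.0", http_port=8080)
# server.start()

# The delegation the predict() endpoint performs:
print(model.predict([1, 2, 3]))
```

Defaults from the signature: the server binds to 127.0.0.1:8080 with one worker and a 100 MiB (104857600-byte) request buffer unless overridden.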