Federated Learning API: API List

This document describes the Federated Learning API.

Most APIs are executed via JSON-RPC v2.0; a few use a different request format, as noted in the individual API descriptions.

In a Python runtime environment, the API Client can also be used, which makes the Federated Learning APIs easier to work with.

JSON-RPC v2.0

Request Example:

When using JSON-RPC v2.0, construct a request like the following, referring to the specification of each API.

  • method: str: Specify the API to call.
  • params: dict: Specify API parameters. If parameters are not needed, they can be omitted.
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "fl.initialize_model_store",
  "params": {
    "model_store_ddc": "ddc:my_first_model_store"
  }
}

Response Example:

If the request is successful, the following response is returned.

  • result: Response to a successful request
{"jsonrpc": "2.0", "result": {"id": "xxx", ...}, "id": 1}

If the request fails on the server side, the following response is returned.

  • error: Error information of the failed request
{"jsonrpc": "2.0", "error": {"code": -32601, "message": "Method not found"}, "id": "1"}

API Client (for Python)

Request Example:

When using the API Client, create a request like the following, referring to the specifications of each API.

from xdata_fl.client import Api

api = Api()
result = api.initialize_model_store(model_store_ddc="ddc:my_first_model_store")

# {"id": "xx", ...}

About Logging

When using the API Client, you can output API execution information to any logger.

The logging levels are as follows.

  • INFO: Heavy operations that result in model transfers
  • DEBUG: All other operations

An example of logger configuration is as follows.

import logging
from xdata_fl.client import Api, set_logger

# Configure logger
handler = logging.StreamHandler()
formatter = logging.Formatter(
  " ".join([
    "%(asctime)s",
    "%(levelname)s",
    "%(filename)s",
    "%(funcName)s",
    "%(lineno)s",
    "%(message)s"
  ]),
  datefmt="%Y/%m/%d %H:%M:%S"
)
handler.setFormatter(formatter)

logger = logging.getLogger("your_logger")
logger.setLevel(logging.INFO)
logger.addHandler(handler)


# Integrate logger
set_logger(logger)

  • The default logging level in Python is WARNING, so set the logger level to INFO.
  • Timestamps are not included in the output by default, so use logging.Formatter to add them.

For detailed logging usage, please refer to the official documentation.

fl.get_server_version

Returns the server version of the Federated Learning API.

Response:

  • result: str: Server Version of Federated Learning API

Request Example:

result = api.get_server_version()

fl.initialize_model_store

Creates a model store.

Parameters:

  • model_store_ddc: str: Name of model store to create (DDC)

Response:

  • result: dict: Information about the created model store (ddc_info)

Request Example:

result = api.initialize_model_store("ddc:my_first_model_store")

fl.exists_store

Checks for the existence of a model store.

Parameters:

  • model_store_ddc: str: Name of model store (DDC) to check.

Response:

  • result: bool: Whether model store exists or not.

Request Example:

result = api.exists_store("ddc:my_first_model_store")
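
A common pattern is to combine fl.exists_store with fl.initialize_model_store so that the store is created only when it does not already exist. A minimal sketch using only the API Client calls documented here:

# Create the model store only if it does not exist yet
if not api.exists_store("ddc:my_first_model_store"):
  result = api.initialize_model_store("ddc:my_first_model_store")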

fl.delete_store

Deletes a model store.

Parameters:

  • model_store_ddc: str: Model store name (DDC) to be deleted.
  • raise_if_no_exists: bool = True: If True, raises an exception if the specified model store does not exist.

Response:

  • result: int: 1 if the model store was deleted, 0 otherwise (if raise_if_no_exists = False)

Request Example:

result = api.delete_store("ddc:my_first_model_store")
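
To delete a store that may not exist without raising an exception, pass raise_if_no_exists=False and check the returned count, as described above. A minimal sketch:

# Returns 1 if the store was deleted, 0 if it did not exist
deleted = api.delete_store("ddc:my_first_model_store", raise_if_no_exists=False)
if deleted == 0:
  print("model store did not exist")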

fl.put_model

Stores a model in the specified model store. Usually executed by a Party.

Note:

This method sends the request as multipart/form-data, not JSON-RPC v2.0.

Also, the endpoint is ../jsonrpc/upload/fl.put_model.

Parameters:

  • model_store_ddc: str: Model store name to store the model in (DDC)
  • file: Union[str, io.BufferedReader]: File path or file object
  • model_kind: str: Information such as the extension of the model (currently unused)
  • model_description: str: Arbitrary description to give to the model
  • model_meta: dict = {}: Arbitrary metadata for the model
  • model_state: int = FlEvensts.CREATED: (normally, do not specify) Model state in federated learning

Response:

  • result: dict: Returns model_info

Request Example:

# When passing a file path
result = api.put_model(
  "ddc:my_first_model_store",
  "./my_models/model_1",
  model_description="test model",
  model_meta={"additional": "additional data"}
)

# When passing a file object
with open("./my_models/model_1", "rb") as f:
  result = api.put_model(
    "ddc:my_first_model_store",
    f,
    model_description="test model",
    model_meta={"additional": "additional data"}
  )

fl.put_aggregated_model

Stores a model in the specified model store. Usually executed by an Aggregator.

Note:

This method sends the request as multipart/form-data, not JSON-RPC v2.0.

Also, the endpoint is ../jsonrpc/upload/fl.put_aggregated_model.

Parameters:

  • model_store_ddc: str: Model store name to store the model in (DDC)
  • file: Union[str, io.BufferedReader]: File path or file object
  • model_kind: str: Information such as the extension of the model (currently unused)
  • model_description: str: Arbitrary description to give to the model
  • model_meta: dict = {}: Arbitrary metadata for the model
  • model_state: int = FlEvensts.AGGREGATED: (normally, do not specify) Model state in federated learning

Response:

  • result: dict: Returns model_info

Request Example:

# When passing a file path
result = api.put_aggregated_model(
  "ddc:my_first_model_store",
  "./my_models/aggregated_model_1",
  model_description="test model",
  model_meta={"additional": "additional data"}
)

# When passing a file object
with open("./my_models/aggregated_model_1", "rb") as f:
  result = api.put_aggregated_model(
    "ddc:my_first_model_store",
    f,
    model_description="test model",
    model_meta={"additional": "additional data"}
  )

fl.feedback

Feeds back the specified model in the specified model store to the upstream model store. The upstream model store is determined by the configuration information.

The model is retrieved internally by the server and forwarded to the upstream.

Parameters:

  • model_store_ddc: str: Model store name (DDC)
  • model_id: str: model id

Response:

  • result: dict: Returns the model_info returned from the upstream

Request Example:

result = api.feedback("ddc:my_first_model_store", "model_id")
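
A typical Party-side flow is to store a locally trained model with fl.put_model and then feed it back upstream. The sketch below assumes that the model_info dictionary returned by fl.put_model exposes the model identifier under an "id" key; adjust the key name to the actual model_info schema.

# Store the locally trained model, then feed it back to the upstream store
model_info = api.put_model(
  "ddc:my_first_model_store",
  "./my_models/model_1",
  model_description="round 1 local model"
)
# "id" is assumed to be the model identifier field in model_info
result = api.feedback("ddc:my_first_model_store", model_info["id"])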

fl.load_model

Downloads the specified model from the specified model store.

Note:

The request for this method follows JSON-RPC v2.0, but the response does not follow JSON-RPC v2.0.

Also, the endpoint is ../jsonrpc/download/fl.load_model.

Parameters:

  • model_store_ddc: str: Model store name (DDC)
  • model_id: str: model id

Response:

  • result: Iterable[bytes]: model body

Request Example:

_result = api.load_model("ddc:my_first_model_store", "model_id")
result = b"".join(_result)

fl.request_transfer

Retrieves the latest aggregated model from the upstream model store and stores it in the specified model store.

The upstream model store is determined by configuration information.

Parameters:

  • model_store_ddc: str: Model store name (DDC)

Response:

  • result: dict: Returns model_info

Request Example:

result = api.request_transfer("ddc:my_first_model_store")

fl.get_model_info

Retrieves information on the specified model in the specified model store. The data of the model itself is not included.

Parameters:

  • model_store_ddc: str: Model store name (DDC)
  • model_id: str: Model id

Response:

  • result: dict: Returns model_info

Request Example:

result = api.get_model_info("ddc:my_first_model_store", "model_id")

fl.exists_model

Checks for the existence of the specified model in the specified model store.

Parameters:

  • model_store_ddc: str: Model store name (DDC)
  • model_id: str: model id

Response:

  • result: bool: True if the model exists, otherwise False

Request Example:

result = api.exists_model("ddc:my_first_model_store", "model_id")

fl.delete_model

Deletes the specified model in the specified model store.

Parameters:

  • model_store_ddc: str: Model store name (DDC)
  • model_id: str: model id to delete

Response:

  • result: int: 1 if the model was deleted, 0 otherwise (when raise_if_no_exists = False)

Request Example:

result = api.delete_model("ddc:my_first_model_store", "model_id")

fl.delete_all_model

Deletes all models in the specified model store.

Parameters:

  • model_store_ddc: str: Model store name (DDC)

Response:

  • result: int: Number of models deleted

Request Example:

result = api.delete_all_model("ddc:my_first_model_store")

fl.get_models

Retrieves all model information for the specified model store.

Parameters:

  • model_store_ddc: str: Model store name (DDC)

Response:

  • result: list[dict]: List of model_info

Request Example:

result = api.get_models("ddc:my_first_model_store")

fl.get_model_info_latest

Retrieves the latest model information for the specified model store.

Parameters:

  • model_store_ddc: str: Model store name (DDC)

Response:

  • result: dict: model_info

Request Example:

result = api.get_model_info_latest("ddc:my_first_model_store")

fl.get_latest_feedbacked_models

Retrieves a list of the latest feedbacked model information from the specified model store.

Parameters:

  • model_store_ddc: str: Model store name (DDC)
  • round: int: Target round (not yet implemented)

Response:

  • result: list[dict]: List of model_info

Request Example:

result = api.get_latest_feedbacked_models("ddc:my_first_model_store")
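
On the Aggregator side, this API is typically combined with fl.put_aggregated_model: collect the latest feedbacked models, aggregate them, and store the result. In the sketch below, aggregate() is a hypothetical aggregation step and the "id" key of model_info is an assumption; adjust both to your environment.

import os

os.makedirs("./work", exist_ok=True)

# Download each feedbacked model to a local working directory
feedbacked = api.get_latest_feedbacked_models("ddc:my_first_model_store")
local_files = []
for info in feedbacked:
  # "id" is assumed to be the model identifier field in model_info
  path = os.path.join("./work", info["id"])
  with open(path, "wb") as f:
    for chunk in api.load_model("ddc:my_first_model_store", info["id"]):
      f.write(chunk)
  local_files.append(path)

# Hypothetical aggregation step that produces a new model file
aggregated_path = aggregate(local_files)

result = api.put_aggregated_model(
  "ddc:my_first_model_store",
  aggregated_path,
  model_description="aggregated model"
)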

fl.get_latest_aggregated_model

Retrieves the latest aggregated model information from the specified model store.

Parameters:

  • model_store_ddc: str: Model store name (DDC)

Response:

  • result: dict: model_info

Request Example:

result = api.get_latest_aggregated_model("ddc:my_first_model_store")

fl.get_latest_transferred_model

Retrieves the latest transferred model information from the specified model store.

Parameters:

  • model_store_ddc: str: Model store name (DDC)

Response:

  • result: dict: model_info

Request Example:

result = api.get_latest_transferred_model("ddc:my_first_model_store")
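
On the Party side, fl.request_transfer and fl.get_latest_transferred_model are typically used together to pull the newest aggregated model from upstream and then download it locally. The "id" key of model_info is an assumption in this sketch.

# Pull the latest aggregated model from upstream, then download it locally
api.request_transfer("ddc:my_first_model_store")
info = api.get_latest_transferred_model("ddc:my_first_model_store")

# "id" is assumed to be the model identifier field in model_info
with open("./my_models/latest_global_model", "wb") as f:
  for chunk in api.load_model("ddc:my_first_model_store", info["id"]):
    f.write(chunk)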

fl.get_parties

Retrieves the parties registered upstream.

Response:

  • result: list[dict]: Returns a list of party_info

Request Example:

result = api.get_parties()

Response Example:

[
  {
    "id": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "tenant_id: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "tenant_name: "my tenant name",
    "description": "my description",
    "hop_count": 1,
    "created_at": datetime.datetime.now(tz=LOCAL_TIMEZONE),
    "updated_at": datetime.datetime.now(tz=LOCAL_TIMEZONE),
  }
]

fl.create_party

Registers a party to the upstream.

The calling Edge retrieves the TENANT_ID and TENANT_NAME from the configuration information and forwards them to the upstream.

Parameters:

  • description: str = "" : Any description for the party

Response:

  • result: dict: party_info

Request Example:

result = api.create_party("my experiment 1")

fl.delete_party

Removes a party from the upstream.

Parameters:

  • id: str: ID identifying the party

Response:

  • result: int: Returns the number of deletions

Request Example:

result = api.delete_party("party_id")
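
Combined with fl.get_parties, parties can be cleaned up by matching a field of party_info shown above. A minimal sketch:

# Remove all registered parties whose description matches a given label
for party in api.get_parties():
  if party["description"] == "my experiment 1":
    api.delete_party(party["id"])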