CLI#
Available gordo-client CLIs:
gordo-client#
Entry sub-command for client related activities.
gordo-client [OPTIONS] COMMAND [ARGS]...
Options
- --version#
Show the version and exit.
- --project <project>#
The project to target
- --host <host>#
The host the server is running on
- --port <port>#
Port the server is running on
- --scheme <scheme>#
tcp/http/https
- --batch-size <batch_size>#
How many samples to send
- --parallelism <parallelism>#
Maximum asynchronous jobs to run
- --metadata <metadata>#
Key-value pair to be entered as a metadata label; this option may be used multiple times. Key and value are separated by a comma, e.g.: --metadata key,val --metadata some key,some value
- --session-config <session_config>#
Config json/yaml to set on the requests.Session object. Useful when needing to supply authentication parameters such as header keys, e.g.: --session-config "{'headers': {'API-KEY': 'foo-bar'}}"
- --log-level <log_level>#
Run client with custom log-level.
- --all-columns#
Return all columns for the prediction, including the 'smooth-..' columns
Environment variables
- GORDO_LOG_LEVEL
Provide a default for
--log-level
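These global options are combined with one of the sub-commands below. A minimal sketch, where the project name and host are placeholders:

```shell
# Target a hypothetical project on a hypothetical host and fetch its metadata.
gordo-client \
  --project my-project \
  --host gordo.example.com \
  --scheme https \
  --port 443 \
  --log-level DEBUG \
  metadata
```

Setting the GORDO_LOG_LEVEL environment variable has the same effect as passing --log-level explicitly.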
download-model#
Download the actual model from the target and write to an output directory.
gordo-client download-model [OPTIONS] OUTPUT_DIR
Options
- --target <target>#
A list of machines to target. If not provided, all machines in the project are targeted
Arguments
- OUTPUT_DIR#
Required argument
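A sketch of downloading models, assuming (as "a list of machines" suggests) that --target may be repeated; project, host, and machine names are placeholders:

```shell
# Download all models in the project into ./models
gordo-client --project my-project --host gordo.example.com \
  download-model ./models

# Restrict the download to specific machines
gordo-client --project my-project --host gordo.example.com \
  download-model --target machine-a --target machine-b ./models
```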
metadata#
Get metadata from a given endpoint.
gordo-client metadata [OPTIONS]
Options
- --output-file <output_file>#
Optional output file to save metadata
- --target <target>#
A list of machines to target. If not provided, all machines in the project are targeted
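A sketch of fetching metadata and saving it to a file; the project, host, and output filename are placeholders:

```shell
# Fetch metadata for all machines in the project and write it to a file
gordo-client --project my-project --host gordo.example.com \
  metadata --output-file metadata.json
```

Without --output-file, the metadata is written to stdout instead.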
predict#
Run some predictions against the target.
gordo-client predict [OPTIONS] START END
Options
- --target <target>#
A list of machines to target. If not provided, all machines in the project are targeted
- --data-provider <data_provider>#
DataProvider dict encoded as json. Must contain a 'type' key with the name of a DataProvider as value.
- --output-dir <output_dir>#
Save output prediction dataframes in a directory
- --influx-uri <influx_uri>#
Format: <username>:<password>@<host>:<port>/<optional-path>/<db_name>
- --influx-api-key <influx_api_key>#
Key to provide to the destination influx
- --influx-recreate-db#
Recreate the destination DB before writing
- --forward-resampled-sensors#
Forward the resampled sensor values
- --n-retries <n_retries>#
Number of times the client should retry failed predictions
- --parquet, --no-parquet#
Use parquet serialization when sending and receiving data from server
Arguments
- START#
Required argument
- END#
Required argument
Environment variables
- DATA_PROVIDER
Provide a default for
--data-provider
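A sketch of running predictions over a time range. The project, host, machine name, and output directory are placeholders, and the ISO-8601 form of START and END is an assumption:

```shell
# Predict for one machine over a time range, writing dataframes as parquet
gordo-client --project my-project --host gordo.example.com \
  predict \
  --target machine-a \
  --output-dir ./predictions \
  --parquet \
  '2023-01-01T00:00:00Z' '2023-01-02T00:00:00Z'
```

A default data provider can also be supplied via the DATA_PROVIDER environment variable instead of passing --data-provider on every invocation.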