API reference¶
thkit
¶

THKIT: Python package for general utilities.
Developed and maintained by C.Thang Nguyen
Modules:
- config
- io – Input/output utilities.
- jobman
- log
- markup – Module for display-related classes and functions.
- path
- pkg
- range
- stuff
Attributes:
- THKIT_ROOT
- __author__
- __contact__
THKIT_ROOT = Path(__file__).parent
module-attribute
¶
__author__ = 'C.Thang Nguyen'
module-attribute
¶
__contact__ = 'http://thangckt.github.io/email'
module-attribute
¶
config
¶
Classes:
- Config – Base class for configuration files.
Functions:
- get_default_args – Get a dict of a function's argument default values.
- argdict_to_schemadict – Convert a function's type-annotated arguments into a cerberus schema dict.
Config
¶
Base class for configuration files.
Methods:
- validate – Validate the config file with the schema file.
- loadconfig – Load data from a JSON or YAML file.
validate(config_dict: dict | None = None, config_file: str | None = None, schema_dict: dict | None = None, schema_file: str | None = None, allow_unknown: bool = False, require_all: bool = False)
staticmethod
¶
Validate the config file with the schema file.
Parameters:
- config_dict (dict, default: None) – config dictionary. Defaults to None.
- config_file (str, default: None) – path to the YAML config file; overrides config_dict. Defaults to None.
- schema_dict (dict, default: None) – schema dictionary. Defaults to None.
- schema_file (str, default: None) – path to the YAML schema file; overrides schema_dict. Defaults to None.
- allow_unknown (bool, default: False) – whether to allow unknown fields in the config file. Defaults to False.
- require_all (bool, default: False) – whether all fields in the schema file must be present in the config file. Defaults to False.
Raises:
- ValueError – if the config file does not match the schema.
loadconfig(filename: str | Path) -> dict
staticmethod
¶
Load data from a JSON or YAML file.
Parameters:
- filename (str | Path) – the file to load data from; its suffix should be .json, .jsonc, .yaml, or .yml.
Returns:
- jdata (dict) – the data loaded from the file.
Notes:
- The YAML file can contain variable interpolation, which is processed by OmegaConf. Example input YAML file:
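An illustrative YAML file of the kind OmegaConf can resolve (all keys and values here are made up for demonstration; `${...}` is OmegaConf's interpolation syntax):

```yaml
# Hypothetical config: "data" reuses "root" via interpolation
project: thkit
paths:
  root: /home/user/${project}
  data: ${paths.root}/data   # resolves to /home/user/thkit/data
```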
get_default_args(func: Callable) -> dict
¶
Get a dict of a function's argument default values.
Parameters:
- func (Callable) – the function to inspect.
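A minimal sketch of how such an inspection can be done with the standard library (this is an illustrative re-implementation, not necessarily the thkit source):

```python
import inspect
from typing import Callable

def get_default_args_sketch(func: Callable) -> dict:
    """Collect the parameters of `func` that declare a default value."""
    sig = inspect.signature(func)
    return {
        name: param.default
        for name, param in sig.parameters.items()
        if param.default is not inspect.Parameter.empty
    }

def example(a, b=2, c="x"):
    pass

print(get_default_args_sketch(example))  # {'b': 2, 'c': 'x'}
```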
argdict_to_schemadict(func: Callable) -> dict
¶
Convert a function's type-annotated arguments into a cerberus schema dict.
Handles:
- Single types
- Union types (as a list of types)
- Nullable types (None in a Union)
- Only top-level types are checked (no recursion into list[int], dict[str, float], etc.)
- Multiple types in cerberus (e.g. {"type": ["integer", "string"]}) are supported when a Union is given.
Parameters:
- func (Callable) – the function to inspect.
Returns:
- schemadict (dict) – cerberus schema dictionary.
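The behavior described above (single types, Union as a list, None making a field nullable, no recursion below the top level) can be sketched as follows; the function name and type map are illustrative assumptions, not the actual thkit code:

```python
import typing
from typing import Callable, Optional, Union

# Hypothetical mapping from Python types to cerberus type names
_TYPE_MAP = {int: "integer", float: "float", str: "string", bool: "boolean",
             list: "list", dict: "dict"}

def argdict_to_schemadict_sketch(func: Callable) -> dict:
    """Build a cerberus-style schema from a function's top-level annotations."""
    hints = typing.get_type_hints(func)
    hints.pop("return", None)
    schema = {}
    for name, ann in hints.items():
        origin = typing.get_origin(ann)
        if origin is Union:  # Union[...] types, possibly including None
            args = typing.get_args(ann)
            nullable = type(None) in args
            # Only the top-level origin is inspected (list[int] -> "list")
            types = [_TYPE_MAP[typing.get_origin(a) or a]
                     for a in args if a is not type(None)]
            rule = {"type": types[0] if len(types) == 1 else types}
            if nullable:
                rule["nullable"] = True
        else:
            rule = {"type": _TYPE_MAP[origin or ann]}
        schema[name] = rule
    return schema

def f(a: int, b: Union[int, str], c: Optional[float] = None): ...

print(argdict_to_schemadict_sketch(f))
# {'a': {'type': 'integer'}, 'b': {'type': ['integer', 'string']},
#  'c': {'type': 'float', 'nullable': True}}
```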
io
¶
Input/output utilities.
Classes:
- DotDict – Dictionary supporting dot notation (attribute access) as well as standard dictionary access.
Functions:
- read_yaml – Read data from a YAML file.
- write_yaml – Write data to a YAML file.
- combine_text_files – Combine text files into a single file in a memory-efficient way.
- unpack_dict – Unpack one level of nested dictionary.
- download_rawtext – Download raw text from a URL.
- txt2str – Convert a text file to a string.
- str2txt – Convert a string to a text file.
- txt2list – Convert a text file to a list of lines (without newline characters).
- list2txt – Convert a list of lines to a text file.
- float2str – Convert a float number to a str.
DotDict(dct=None)
¶
Bases: dict
Dictionary supporting dot notation (attribute access) as well as standard dictionary access. Nested dicts and sequences (list/tuple/set) are converted recursively.
Parameters:
- dct (dict, default: None) – initial dictionary to populate the DotDict. Defaults to an empty dict.
Usage:
d = DotDict({'a': 1, 'b': {'c': 2, 'd': [3, {'e': 4}]}})
print(d.b.c)  # 2
print(d['b']['c'])  # 2
d.b.d[1].e = 42
print(d.b.d[1].e)  # 42
print(d.to_dict())  # plain dict
Methods:
- __setitem__ – Set item using dot notation or standard dict syntax.
- __setattr__
- to_dict – Recursively convert DotDict back to plain dict.
Attributes:
__getattr__ = dict.__getitem__
class-attribute
instance-attribute
¶
__delattr__ = dict.__delitem__
class-attribute
instance-attribute
¶
__setitem__(key, value)
¶
Set item using dot notation or standard dict syntax.
__setattr__(key, value)
¶
to_dict()
¶
Recursively convert DotDict back to plain dict.
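The recursive conversion in both directions can be sketched as below; this is a minimal illustrative re-implementation (class name and details are assumptions, not the thkit source), wiring `__getattr__` to `dict.__getitem__` as shown in the attributes above:

```python
class DotDictSketch(dict):
    """Dict with attribute access; nested dicts/sequences converted recursively."""
    __getattr__ = dict.__getitem__
    __delattr__ = dict.__delitem__

    def __init__(self, dct=None):
        super().__init__()
        for key, value in (dct or {}).items():
            self[key] = value

    def __setitem__(self, key, value):
        # Every write goes through _convert, so nesting stays dot-accessible
        super().__setitem__(key, self._convert(value))

    __setattr__ = __setitem__

    @classmethod
    def _convert(cls, value):
        if isinstance(value, dict):
            return cls(value)
        if isinstance(value, (list, tuple, set)):
            return type(value)(cls._convert(v) for v in value)
        return value

    def to_dict(self):
        """Recursively convert back to plain dicts."""
        out = {}
        for key, value in self.items():
            if isinstance(value, DotDictSketch):
                value = value.to_dict()
            elif isinstance(value, (list, tuple, set)):
                value = type(value)(v.to_dict() if isinstance(v, DotDictSketch) else v
                                    for v in value)
            out[key] = value
        return out

d = DotDictSketch({'a': 1, 'b': {'c': 2, 'd': [3, {'e': 4}]}})
print(d.b.c)  # 2
```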
read_yaml(filename: str | Path) -> Any
¶
Read data from a YAML file.
write_yaml(jdata: dict[Any, Any] | list[Any], filename: str | Path)
¶
Write data to a YAML file.
combine_text_files(files: list[str], output_file: str, chunk_size: int = 1024)
¶
Combine text files into a single file in a memory-efficient way.
Read and write in chunks to avoid loading large files into memory.
Parameters:
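The chunked approach can be sketched like this (an illustrative re-implementation mirroring the signature above; the demo files are made up):

```python
from pathlib import Path
import tempfile

def combine_text_files_sketch(files, output_file, chunk_size=1024):
    """Concatenate files chunk by chunk so large files are never fully loaded."""
    with open(output_file, "w") as out:
        for file in files:
            with open(file) as src:
                while True:
                    chunk = src.read(chunk_size)
                    if not chunk:
                        break
                    out.write(chunk)

# Hypothetical demo files
tmp = tempfile.mkdtemp()
a, b = Path(tmp, "a.txt"), Path(tmp, "b.txt")
a.write_text("hello\n")
b.write_text("world\n")
combine_text_files_sketch([a, b], Path(tmp, "out.txt"))
print(Path(tmp, "out.txt").read_text())  # hello\nworld\n
```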
unpack_dict(nested_dict: dict) -> dict
¶
Unpack one level of nested dictionary.
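A minimal sketch of unpacking one level of nesting (the exact merge policy for key collisions is an assumption here, not taken from the thkit source):

```python
def unpack_dict_sketch(nested_dict: dict) -> dict:
    """Lift one level of nesting: inner dicts are merged into the result."""
    flat = {}
    for key, value in nested_dict.items():
        if isinstance(value, dict):
            flat.update(value)  # keys of the inner dict replace the outer key
        else:
            flat[key] = value
    return flat

print(unpack_dict_sketch({"a": {"x": 1, "y": 2}, "b": 3}))
# {'x': 1, 'y': 2, 'b': 3}
```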
download_rawtext(url: str, outfile: str | None = None) -> str
¶
Download raw text from a URL.
txt2str(file_path: str | Path) -> str
¶
Convert a text file to a string.
str2txt(text: str, file_path: str | Path) -> None
¶
Convert a string to a text file.
txt2list(file_path: str | Path) -> list[str]
¶
Convert a text file to a list of lines (without newline characters).
list2txt(lines: list[str], file_path: str | Path) -> None
¶
Convert a list of lines to a text file.
jobman
¶
Modules:
- #retired_code
- helper
- submit
#retired_code
¶
Functions:
- submit_job_chunk – Submit jobs to the remote machine.
- async_submit_job_chunk – Convert submit_job_chunk() into an async function that only waits for the completion of the entire for loop (without worrying about the specifics of each operation inside the loop).
- async_submit_job_chunk_tqdm – Revised version of async_submit_job_chunk() with a tqdm progress bar.
Attributes:
- runvar
- global_lock
runvar = {}
module-attribute
¶
global_lock = asyncio.Lock()
module-attribute
¶
submit_job_chunk(mdict: dict, work_dir: str, task_list: list[Task], forward_common_files: list[str] = [], backward_common_files: list[str] = [], machine_index: int = 0, logger: object = None)
¶
Submit jobs to the remote machine. The function will:
- Prepare the task list
- Submit the jobs to remote machines
- Wait for the jobs to finish and download the results to the local machine
Parameters:
- mdict (dict) – a dictionary containing settings of the remote machine, with the parameters described in the remote machine schema. This dictionary defines the login information, resources, execution command, etc. on the remote machine.
- work_dir (str) – the base working directory on the local machine. All task directories are relative to this directory.
- task_list (list[Task]) – a list of Task objects. Each Task object contains the command to be executed on the remote machine, and the files to be copied to and from the remote machine. The dirs of each task must be relative to the work_dir.
- forward_common_files (list[str], default: []) – common files used for all tasks. These files are in the work_dir.
- backward_common_files (list[str], default: []) – common files to download from the remote machine when the jobs are finished.
- machine_index (int, default: 0) – index of the machine in the list of machines.
- logger (object, default: None) – the logger object to be used for logging.
Note:
- Split the task_list into chunks to control the number of jobs submitted at once.
- Do not use the Local context: it interferes with the current shell environment, which leads to unexpected behavior on the local machine. Instead, use another account to connect to the local machine with the SSH context.
async_submit_job_chunk(mdict: dict, work_dir: str, task_list: list[Task], forward_common_files: list[str] = [], backward_common_files: list[str] = [], machine_index: int = 0, logger: object = None)
async
¶
Convert submit_job_chunk() into an async function that only needs to wait for the completion of the entire for loop (without worrying about the specifics of each operation inside the loop).
Note:
- An async function normally contains an await ... statement to be awaited (yielding control to the event loop).
- If the event loop is blocked by a synchronous function (one that does not yield control to the event loop), the async function will wait for the synchronous function to complete, so it will not run asynchronously. Use await asyncio.to_thread() to run the synchronous function in a separate thread so that the event loop is not blocked.
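The asyncio.to_thread() pattern described in the note can be illustrated with a toy stand-in for the blocking submission call (the names here are hypothetical, not thkit functions):

```python
import asyncio
import time

def blocking_submit(chunk_id: int) -> str:
    """Stand-in for a synchronous job-submission call."""
    time.sleep(0.1)  # blocking work that would freeze the event loop
    return f"chunk-{chunk_id} done"

async def submit_async(chunk_id: int) -> str:
    # Run the blocking call in a worker thread so the event loop stays free
    return await asyncio.to_thread(blocking_submit, chunk_id)

async def main():
    # The three chunks overlap instead of running one after another
    results = await asyncio.gather(*(submit_async(i) for i in range(3)))
    print(results)  # ['chunk-0 done', 'chunk-1 done', 'chunk-2 done']

asyncio.run(main())
```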
async_submit_job_chunk_tqdm(mdict: dict, work_dir: str, task_list: list[Task], forward_common_files: list[str] = [], backward_common_files: list[str] = [], machine_index: int = 0, logger: object = None)
async
¶
Revised version of async_submit_job_chunk() with a tqdm progress bar.
helper
¶
Classes:
- ConfigRemoteMachines – Class for remote machine configuration files.
Functions:
- change_logfile_dispatcher – Change the logfile of dpdispatcher.
- init_jobman_logger – Initialize the default logger under log/ if not provided.
- log_machine_info – Log remote machine information.
ConfigRemoteMachines(machines_file: str)
¶
Bases: Config
Class for remote machine configuration files.
Parameters:
- machines_file (str) – path to the YAML file containing multiple machine configs.
Methods:
- validate_machine_config – Validate multiple machine configs.
- select_machines – Select machine dicts based on the prefix.
- check_connection – Check whether the connections to all remote machines are valid.
- check_resource_settings – Check whether the resource settings on all remote machines are valid.
- validate – Validate the config file with the schema file.
- loadconfig – Load data from a JSON or YAML file.
Attributes:
- machines_file (str)
- multi_mdicts
machines_file: str = machines_file
instance-attribute
¶
multi_mdicts = self.loadconfig(machines_file)
instance-attribute
¶
validate_machine_config(schema_file: str | None = None)
¶
Validate multiple machine configs.
select_machines(mdict_prefix: str = '') -> list[dict]
¶
Select machine dicts based on the prefix.
To specify multiple remote machines for the same purpose, the top-level keys in the machines_file should start with the same prefix. Example:
- train_1, train_2,... for training jobs
- lammps_1, lammps_2,... for lammps jobs
- gpaw_1, gpaw_2,... for gpaw jobs
Parameters:
- mdict_prefix (str, default: '') – the prefix to select remote machines for the same purpose. Example: 'dft_', 'md_', 'train_'.
Returns:
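The prefix-based selection described above can be sketched as follows (a hypothetical free-function re-implementation; the real method reads self.multi_mdicts, and the machine entries here are made up):

```python
def select_machines_sketch(multi_mdicts: dict, mdict_prefix: str = "") -> list[dict]:
    """Pick machine configs whose top-level key starts with the prefix."""
    return [mdict for name, mdict in multi_mdicts.items()
            if name.startswith(mdict_prefix)]

machines = {
    "train_1": {"host": "gpu-a"},
    "train_2": {"host": "gpu-b"},
    "lammps_1": {"host": "cpu-a"},
}
print(select_machines_sketch(machines, "train_"))
# [{'host': 'gpu-a'}, {'host': 'gpu-b'}]
```

An empty prefix (the default) selects every machine, since every key starts with "".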
check_connection(mdict_prefix: str = '')
¶
Check whether the connections to all remote machines are valid.
Parameters:
- mdict_prefix (str, default: '') – only check the remote machines with this prefix.
check_resource_settings(mdict_prefix: str = '')
¶
Check whether the resource settings in all remote machines are valid.
validate(config_dict: dict | None = None, config_file: str | None = None, schema_dict: dict | None = None, schema_file: str | None = None, allow_unknown: bool = False, require_all: bool = False)
staticmethod
¶
Validate the config file with the schema file.
Parameters:
- config_dict (dict, default: None) – config dictionary. Defaults to None.
- config_file (str, default: None) – path to the YAML config file; overrides config_dict. Defaults to None.
- schema_dict (dict, default: None) – schema dictionary. Defaults to None.
- schema_file (str, default: None) – path to the YAML schema file; overrides schema_dict. Defaults to None.
- allow_unknown (bool, default: False) – whether to allow unknown fields in the config file. Defaults to False.
- require_all (bool, default: False) – whether all fields in the schema file must be present in the config file. Defaults to False.
Raises:
- ValueError – if the config file does not match the schema.
loadconfig(filename: str | Path) -> dict
staticmethod
¶
Load data from a JSON or YAML file.
Parameters:
- filename (str | Path) – the file to load data from; its suffix should be .json, .jsonc, .yaml, or .yml.
Returns:
- jdata (dict) – the data loaded from the file.
Notes:
- The YAML file can contain variable interpolation, which is processed by OmegaConf. Example input YAML file:
change_logfile_dispatcher(newlogfile: str)
¶
Change the logfile of dpdispatcher.
init_jobman_logger(logfile: str | None = None) -> ColorLogger
¶
Initialize the default logger under log/ if not provided.
log_machine_info(num_jobs: int, mdict: dict, machine_index: int, logger: logging.Logger)
¶
Log remote machine information.
submit
¶
Functions:
- submit_job_chunk – Submit jobs to the remote machine.
- async_submit_job_chunk – Convert submit_job_chunk() into an async function.
- alff_submit_job_multi_remotes – Submit jobs to multiple machines asynchronously.
Attributes:
- sync_dict
- global_lock
sync_dict = {}
module-attribute
¶
global_lock = asyncio.Lock()
module-attribute
¶
submit_job_chunk(mdict: dict, work_dir: str, task_list: list[Task], forward_common_files: list[str] = [], backward_common_files: list[str] = [], machine_index: int = 0, logger: ColorLogger | None = None)
¶
Submit jobs to the remote machine.
Includes:
- Prepare the task list
- Submit the jobs to remote machines
- Wait for the jobs to finish and download the results to the local machine
Parameters:
- mdict (dict) – a dictionary containing settings of the remote machine, with the parameters described in the remote machine schema. This dictionary defines the login information, resources, execution command, etc. on the remote machine.
- work_dir (str) – the base working directory on the local machine. All task directories are relative to this directory.
- task_list (list[Task]) – a list of Task objects. Each Task object contains the command to be executed on the remote machine, and the files to be copied to and from the remote machine. The dirs of each task must be relative to the work_dir.
- forward_common_files (list[str], default: []) – common files used for all tasks. These files are in the work_dir.
- backward_common_files (list[str], default: []) – common files to download from the remote machine when the jobs are finished.
- machine_index (int, default: 0) – index of the machine in the list of machines.
- logger (object, default: None) – the logger object to be used for logging.
Note:
- Split the task_list into chunks to control the number of jobs submitted at once.
- Do not use the Local context: it interferes with the current shell environment, which leads to unexpected behavior on the local machine. Instead, use another account to connect to the local machine with the SSH context.
async_submit_job_chunk(mdict: dict, work_dir: str, task_list: list[Task], forward_common_files: list[str] = [], backward_common_files: list[str] = [], machine_index: int = 0, logger: ColorLogger | None = None)
async
¶
Convert submit_job_chunk() into an async function.
The approach in this function only needs to wait for the completion of the entire for loop (without worrying about the specifics of each operation inside the loop).
Note:
- An async function normally contains an await ... statement to be awaited (yielding control to the event loop).
- If the event loop is blocked by a synchronous function (one that does not yield control to the event loop), the async function will wait for the synchronous function to complete, so it will not run asynchronously. Use await asyncio.to_thread() to run the synchronous function in a separate thread so that the event loop is not blocked.
- This version uses rich instead of tqdm for better handling of progress bars (see retired code). Multiple tqdm bars work well if there are no errors during job submission; however, if jobs raise errors, the tqdm bars get messed up. rich's remaining-time column does not work well with multiple progress bars, so a customized time-remaining column is implemented.
alff_submit_job_multi_remotes(mdict_list: list[dict], commandlist_list: list[list[str]], work_dir: str, task_dirs: list[str], forward_files: list[str], backward_files: list[str], forward_common_files: list[str] = [], backward_common_files: list[str] = [], logger: ColorLogger | None = None)
async
¶
Submit jobs to multiple machines asynchronously.
Parameters:
log
¶
Classes:
- ColorLogger – Logger subclass that supports a color argument for console output.
Functions:
- create_logger – Create a logger that supports a color argument per message, to colorize console output alongside a plain-text logfile.
- write_to_logfile – Retrieve the logfile name from a logger and write text to it. Useful for writing unformatted text to the same logfile used by the logger.
ColorLogger
¶
Logger subclass that supports a color argument for console output.
create_logger(name: str | None = None, logfile: str | None = None, level: str = 'INFO', level_logfile: str | None = None) -> ColorLogger
¶
Create a logger that supports a color argument per message, to colorize console output alongside a plain-text logfile.
write_to_logfile(logger: logging.Logger, text: str)
¶
Retrieve the logfile name from the logger and write text to it. Useful for writing unformatted text to the same logfile used by the logger.
markup
¶
Module for display-related classes and functions.
Classes:
- ThangBar – A class that extends rich's progress bar.
- DynamicBarColumn – Extends BarColumn to read per-task fields 'complete_color', 'finished_color', ... to customize colors.
- TextDecor – A collection of text decoration utilities.
ThangBar(*args, **kwargs)
¶
Bases: Progress
A class that extends rich's progress bar.
The same as rich.progress.Progress, with additional methods:
- hide_bar(): hide the progress bar.
- show_bar(): show the progress bar.
- compute_eta(): static method to compute the estimated time of arrival (ETA) given the number of iterations and the time taken.
Methods:
- hide_bar – Hide all progress bars in the given Progress object.
- show_bar – Show all progress bars in the given Progress object.
- align_etatext – Align the ETA text to the given width for all tasks.
- compute_eta – Estimate remaining time.
hide_bar()
¶
Hide all progress bars in the given Progress object.
show_bar()
¶
Show all progress bars in the given Progress object.
align_etatext(width: int)
¶
Align the ETA text to the given width for all tasks.
compute_eta(num_iters: int, iter_index: int, old_time: float | None = None, new_time: float | None = None) -> str
staticmethod
¶
Estimate remaining time.
DynamicBarColumn(**kwargs)
¶
Bases: BarColumn
Extends BarColumn to read per-task fields 'complete_color', 'finished_color', ... to customize colors.
Parameters:
- The same as rich.progress.BarColumn, plus the following additional arguments:
- complete_color (str) – the color for the completed part of the bar.
- finished_color (str) – the color for a finished bar.
- pulse_color (str) – the color for a pulsing bar.
Methods:
- render – Gets a progress bar widget for a task.
render(task) -> ProgressBar
¶
Gets a progress bar widget for a task.
TextDecor(text: str = 'example')
¶
A collection of text decoration utilities.
Methods:
- fill_center – Return the text centered within a line of the given length, filled with fill.
- fill_left – Return the text left-aligned within a line of the given length, with left and right fills.
- fill_box – Return a string centered in a box with side delimiters.
- repeat – Repeat the input string to a specified length.
- mkcolor – Return ANSI-colored text that works in terminal or Jupyter.
Attributes:
- text
text = text
instance-attribute
¶
fill_center(fill: str = '-', length: int = 60) -> str
¶
Return the text centered within a line of the given length, filled with fill.
fill_left(margin: int = 15, fill_left: str = '-', fill_right: str = ' ', length: int = 60) -> str
¶
Return the text left-aligned within a line of the given length, with left and right fills.
fill_box(fill: str = ' ', sp: str = 'ǁ', length: int = 60) -> str
¶
Return a string centered in a box with side delimiters.
Example:
Notes:
- To input unicode characters, use the unicode escape sequence (e.g., "\u01C1" for "ǁ"). See unicode-table for more details.
- ║ (Double vertical bar, u2551)
- ‖ (Double vertical line, u2016)
- ǁ (Latin letter lateral click, u01C1)
repeat(length: int) -> str
¶
Repeat the input string to a specified length.
path
¶
Functions:
- make_dir – Create a directory with a backup option.
- make_dir_ask_backup – Make a directory and ask for backup if the directory already exists.
- ask_yesnoback – Ask user for a yes/no/backup response.
- ask_yesno – Ask user a yes/no question and return the normalized choice.
- list_paths – List all files/folders in given directories and their subdirectories that match the given patterns.
- collect_files – Collect files from a list of paths (files/folders). Will search files in folders and their subdirectories.
- change_pathname – Change path names.
- remove_files – Remove files from a given list of file paths.
- remove_dirs – Remove a list of directories.
- remove_files_in_paths – Remove files in the files list in the paths list.
- remove_dirs_in_paths – Remove directories in the dirs list in the paths list.
- copy_file – Copy a file/folder from the source path to the destination path. It will create the destination directory if it does not exist.
- move_file – Move a file/folder from the source path to the destination path.
- filter_dirs – Return directories containing has_files and none of no_files.
make_dir(path: str | Path, backup: bool = True)
¶
Create a directory with a backup option.
make_dir_ask_backup(dir_path: str, logger: logging.Logger | None = None)
¶
Make a directory and ask for backup if the directory already exists.
ask_yesnoback(prompt: str) -> str
¶
Ask user for a yes/no/backup response.
ask_yesno(prompt: str) -> str
¶
Ask user a yes/no question and return the normalized choice.
list_paths(paths: list[str] | str, patterns: list[str], recursive=True) -> list[str]
¶
List all files/folders in given directories and their subdirectories that match the given patterns.
Parameters:
- paths (list[str]) – the list of paths to search for files/folders.
- patterns (list[str]) – the list of patterns to apply to the files. Each filter can be a file extension or a pattern.
Returns:
- list[str] – a list of matching paths.
Example:
folders = ["path1", "path2", "path3"]
patterns = ["*.ext1", "*.ext2", "something*.ext3", "*folder/"]
files = list_paths(folders, patterns)
Note:
- glob() does not list hidden files by default. To include hidden files, use glob(".*", recursive=True).
- When using recursive=True, you must include ** in the pattern to search subdirectories.
- glob("*", recursive=True) will search all FILES & FOLDERS in the CURRENT directory.
- glob("*/", recursive=True) will search all FOLDERS in the CURRENT directory.
- glob("**", recursive=True) will search all FILES & FOLDERS in the CURRENT directory & SUBDIRECTORIES.
- glob("**/", recursive=True) will search all FOLDERS in the CURRENT directory & SUBDIRECTORIES.
- "**/*" is equivalent to "**".
- "**/*/" is equivalent to "**/".
- IMPORTANT: "**/" will replicate the behavior of "**" and then give unexpected results.
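A small self-contained experiment showing the difference between "*" and a recursive "**" pattern (temporary files are created purely for illustration):

```python
import glob
import os
import tempfile

# Build a tiny tree: root/top.txt and root/sub/deep.txt
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
for name in ("top.txt", os.path.join("sub", "deep.txt")):
    open(os.path.join(root, name), "w").close()

# "*" stays in the current directory: matches top.txt and the sub/ folder
top = glob.glob(os.path.join(root, "*"))
# "**" with recursive=True also descends into subdirectories
deep = glob.glob(os.path.join(root, "**", "*.txt"), recursive=True)

print(len(top))  # 2 -> top.txt and sub/
print(sorted(os.path.basename(p) for p in deep))  # ['deep.txt', 'top.txt']
```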
collect_files(paths: list[str] | str, patterns: list[str]) -> list[str]
¶
Collect files from a list of paths (files/folders). Will search files in folders and their subdirectories.
Parameters:
- paths (list[str]) – the list of paths to collect files from.
- patterns (list[str]) – the list of patterns to apply to the files. Each filter can be a file extension or a pattern.
Returns:
- list[str] – a list of matching file paths.
change_pathname(paths: list[str], old_string: str, new_string: str, replace: bool = False) -> None
¶
Change path names.
remove_files(files: list[str]) -> None
¶
Remove files from a given list of file paths.
remove_dirs(dirs: list[str]) -> None
¶
Remove a list of directories.
remove_files_in_paths(files: list, paths: list) -> None
¶
Remove files in the files list in the paths list.
remove_dirs_in_paths(dirs: list, paths: list) -> None
¶
Remove directories in the dirs list in the paths list.
copy_file(src_path: str, dest_path: str)
¶
Copy a file/folder from the source path to the destination path. It will create the destination directory if it does not exist.
move_file(src_path: str, dest_path: str)
¶
Move a file/folder from the source path to the destination path.
filter_dirs(dirs: list[str], has_files: list[str] | None = None, no_files: list[str] | None = None) -> list[str]
¶
Return directories containing has_files and none of no_files.
Parameters:
- dirs (list[str]) – list of directory paths to scan.
- has_files (list[str] | None, default: None) – files that must exist in the directory. Defaults to [].
- no_files (list[str] | None, default: None) – files that must not exist in the directory. Defaults to [].
Returns:
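The filtering described above can be sketched as follows (an illustrative re-implementation; the demo directory layout and file names such as "in.lmp" are made up):

```python
from pathlib import Path
import tempfile

def filter_dirs_sketch(dirs, has_files=None, no_files=None):
    """Keep dirs that contain every `has_files` entry and none of `no_files`."""
    has_files, no_files = has_files or [], no_files or []
    return [d for d in dirs
            if all(Path(d, f).exists() for f in has_files)
            and not any(Path(d, f).exists() for f in no_files)]

# Hypothetical demo layout: run1 is ready, run2 is already done, run3 is empty
root = Path(tempfile.mkdtemp())
for name, files in {"run1": ["in.lmp"], "run2": ["in.lmp", "done"], "run3": []}.items():
    d = root / name
    d.mkdir()
    for f in files:
        (d / f).touch()

kept = filter_dirs_sketch([root / n for n in ("run1", "run2", "run3")],
                          has_files=["in.lmp"], no_files=["done"])
print([p.name for p in kept])  # ['run1']
```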
pkg
¶
Functions:
- check_package – Check if the required packages are installed.
- install_package – Install the required package.
- dependency_info – Get the dependency information.
check_package(package_name: str, auto_install: bool = False, git_repo: str | None = None, conda_channel: str | None = None)
¶
Check if the required packages are installed.
install_package(package_name: str, git_repo: str | None = None, conda_channel: str | None = None) -> None
¶
Install the required package.
Parameters:
- package_name (str) – package name.
- git_repo (str, default: None) – git URL for the package. Default: None. E.g., http://something.git
- conda_channel (str, default: None) – conda channel for the package. Default: None. E.g., conda-forge
Notes:
- Default: pip install -U {package_name}
- If git_repo is provided: pip install -U git+{git_repo}
- If conda_channel is provided: conda install -c {conda_channel} {package_name}
range
¶
Functions:
- range_inclusive – Generate evenly spaced points including the endpoint (within tolerance).
- composite_range – A custom parser that allows defining composite ranges. This is needed for defining input parameters in YAML files.
- composite_index – Allows defining composite index ranges.
- composite_strain_points – Generate composite spacing points from multiple ranges with tolerance-based uniqueness.
- chunk_list – Yield successive n-sized chunks from input_list.
range_inclusive(start: float, end: float, step: float, tol: float = 1e-06) -> list[float]
¶
Generate evenly spaced points including the endpoint (within tolerance).
composite_range(list_inputs: list[float | str], tol=1e-06) -> list[float]
¶
A custom parser that allows defining composite ranges. This is needed for defining input parameters in YAML files.
Parameters:
- list_inputs (list[int | float | str]) – accepts numbers or strings with the special form 'start:end[:step]' (inclusive).
- tol (float, default: 1e-06) – tolerance for including the endpoint.
Examples: ["-3.1:-1", 0.1, 2, "3.1:5.2", "6.0:10.1:0.5"]
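The 'start:end[:step]' parsing can be sketched as follows (an illustrative re-implementation of the described behavior, not the thkit source; a missing step is assumed to default to 1.0):

```python
def range_inclusive_sketch(start, end, step, tol=1e-6):
    """Evenly spaced points, endpoint included within tolerance."""
    points, value = [], start
    while value <= end + tol:
        points.append(round(value, 10))  # suppress float accumulation noise
        value += step
    return points

def composite_range_sketch(list_inputs, tol=1e-6):
    """Numbers pass through; 'start:end[:step]' strings expand inclusively."""
    points = []
    for item in list_inputs:
        if isinstance(item, str):
            parts = [float(p) for p in item.split(":")]
            start, end = parts[0], parts[1]
            step = parts[2] if len(parts) > 2 else 1.0  # assumed default step
            points.extend(range_inclusive_sketch(start, end, step, tol))
        else:
            points.append(float(item))
    return points

print(composite_range_sketch([0.1, 2, "3.0:5.0"]))  # [0.1, 2.0, 3.0, 4.0, 5.0]
```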
composite_index(list_inputs: list[float | str]) -> list[int]
¶
Allows defining composite index ranges.
composite_strain_points(list_inputs: list[int | float | str], tol=1e-06) -> list[float]
¶
Generate composite spacing points from multiple ranges with tolerance-based uniqueness.
Notes:
- np.round(np.array(all_points) / tol).astype(int) is a trick to avoid floating-point issues when comparing points within a certain tolerance.
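The same trick, shown here without NumPy for brevity: dividing by the tolerance and rounding to an integer buckets nearby floats together, so points closer than tol collapse to one key (the function name is illustrative):

```python
def unique_within_tol(points, tol=1e-6):
    """Deduplicate floats whose difference is below tol."""
    seen, unique = set(), []
    for p in points:
        key = round(p / tol)  # integer bucket; mirrors np.round(arr / tol).astype(int)
        if key not in seen:
            seen.add(key)
            unique.append(p)
    return unique

print(unique_within_tol([0.1, 0.1 + 1e-9, 0.2]))  # [0.1, 0.2]
```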
stuff
¶
Functions:
- time_uuid
- simple_uuid – Generate a simple random UUID of 4 digits.