Implementation of the nn.Modules for the Temporal Fusion Transformer, from PyTorch-Forecasting: https://github.com/jdb78/pytorch-forecasting

PyTorch Forecasting v0.9.1 License from https://github.com/jdb78/pytorch-forecasting/blob/master/LICENSE, accessed on Wed, November 3, 2021: ‘THE MIT License

Copyright 2020 Jan Beitner

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.’

darts.models.forecasting.tft_submodels.get_embedding_size(n, max_size=100)

Determine empirically good embedding sizes (formula taken from fastai).

Parameters

n (int) – number of classes

max_size (int, optional) – maximum embedding size. Defaults to 100.

Returns

embedding size

Return type

int
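
As an illustration, the sketch below shows how such a helper might be used to size one embedding layer per categorical covariate. The variable names and cardinalities are hypothetical, and the formula mentioned in the comment (roughly min(round(1.6 * n ** 0.56), max_size), following the fastai rule of thumb) is an assumption about the heuristic, not a quotation of the implementation.

```python
import torch.nn as nn

from darts.models.forecasting.tft_submodels import get_embedding_size

# Hypothetical categorical variables and their cardinalities (number of classes).
cardinalities = {"weekday": 7, "month": 12, "store_id": 500}

# Build one nn.Embedding per categorical variable, letting get_embedding_size
# pick an empirically reasonable embedding dimension for each cardinality
# (assumed to follow a fastai-style rule of thumb, capped at max_size).
embeddings = nn.ModuleDict(
    {
        name: nn.Embedding(num_embeddings=n, embedding_dim=get_embedding_size(n))
        for name, n in cardinalities.items()
    }
)

# Inspect the chosen embedding dimensions.
for name, emb in embeddings.items():
    print(name, emb)
```

The cap at max_size=100 keeps embeddings for high-cardinality variables (such as the hypothetical store_id above) from growing without bound.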