Add arguments and doc string
JQGoh committed Dec 22, 2024
1 parent 3bab7c0 commit c5f20a1
Showing 64 changed files with 330 additions and 12 deletions.
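Every file below gains the same `config_optimizers` hook. As a minimal sketch of the contract described in the new docstring: the callable receives the model instance (a subclass of Neuralforecast's `BaseModel`) and returns its optimizers following Lightning's configure_optimizers convention. The function name, optimizer, scheduler, and hyperparameters here are illustrative assumptions, not part of the commit.

import torch

def my_config_optimizers(model):
    # `model` is the NeuralForecast model instance (a subclass of BaseModel);
    # its parameters() are handed to the optimizer, as the docstring requires.
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)
    # The return value follows Lightning's configure_optimizers contract.
    return {"optimizer": optimizer, "lr_scheduler": {"scheduler": scheduler}}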
7 changes: 6 additions & 1 deletion nbs/models.autoformer.ipynb
@@ -458,7 +458,10 @@
" `drop_last_loader`: bool=False, if True `TimeSeriesDataLoader` drops last non-full batch.<br>\n",
" `alias`: str, optional, Custom name of the model.<br>\n",
" `dataloader_kwargs`: dict, optional, list of parameters passed into the PyTorch Lightning dataloader by the `TimeSeriesDataLoader`. <br>\n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer).<br>\n",
" `config_optimizers`: <class 'function'>, optional, A callable function that implements the optimization behavior as detailed in <br>\n",
" https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers <br>\n",
" Note that the function must accept an argument which is the subclass of Neuralforecast's `BaseModel` to speficy the model's parameters() for the optimizer. <br>\n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer). <br>\n",
"\n",
"\t*References*<br>\n",
"\t- [Wu, Haixu, Jiehui Xu, Jianmin Wang, and Mingsheng Long. \"Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting\"](https://proceedings.neurips.cc/paper/2021/hash/bcc0d400288793e8bdcd7c19a8ac0c2b-Abstract.html)<br>\n",
@@ -503,6 +506,7 @@
" random_seed: int = 1,\n",
" drop_last_loader: bool = False,\n",
" dataloader_kwargs=None,\n",
" config_optimizers=None,\n",
" **trainer_kwargs):\n",
" super(Autoformer, self).__init__(h=h,\n",
" input_size=input_size,\n",
@@ -527,6 +531,7 @@
" drop_last_loader=drop_last_loader,\n",
" random_seed=random_seed,\n",
" dataloader_kwargs=dataloader_kwargs,\n",
" config_optimizers=config_optimizers,\n",
" **trainer_kwargs)\n",
"\n",
" # Architecture\n",
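Given the Autoformer signature above, a hypothetical call passing the sketch from the top of the diff; the h and input_size values are illustrative only.

from neuralforecast.models import Autoformer

# The callable is forwarded through __init__ to the BaseModel superclass.
model = Autoformer(h=12, input_size=24, config_optimizers=my_config_optimizers)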
5 changes: 5 additions & 0 deletions nbs/models.bitcn.ipynb
@@ -178,6 +178,9 @@
" `drop_last_loader`: bool=False, if True `TimeSeriesDataLoader` drops last non-full batch.<br>\n",
" `alias`: str, optional, Custom name of the model.<br>\n",
" `dataloader_kwargs`: dict, optional, list of parameters passed into the PyTorch Lightning dataloader by the `TimeSeriesDataLoader`. <br>\n",
" `config_optimizers`: <class 'function'>, optional, A callable function that implements the optimization behavior as detailed in <br>\n",
" https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers <br>\n",
" Note that the function must accept an argument which is the subclass of Neuralforecast's `BaseModel` to speficy the model's parameters() for the optimizer. <br>\n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer).<br> \n",
"\n",
" **References**<br> \n",
@@ -216,6 +219,7 @@
" random_seed: int = 1,\n",
" drop_last_loader: bool = False,\n",
" dataloader_kwargs=None,\n",
" config_optimizers=None,\n",
" **trainer_kwargs):\n",
" super(BiTCN, self).__init__(\n",
" h=h,\n",
@@ -241,6 +245,7 @@
" random_seed=random_seed,\n",
" drop_last_loader=drop_last_loader,\n",
" dataloader_kwargs=dataloader_kwargs,\n",
" config_optimizers=config_optimizers,\n",
" **trainer_kwargs\n",
" )\n",
"\n",
5 changes: 5 additions & 0 deletions nbs/models.deepar.ipynb
@@ -183,6 +183,9 @@
" `drop_last_loader`: bool=False, if True `TimeSeriesDataLoader` drops last non-full batch.<br>\n",
" `alias`: str, optional, Custom name of the model.<br>\n",
" `dataloader_kwargs`: dict, optional, list of parameters passed into the PyTorch Lightning dataloader by the `TimeSeriesDataLoader`. <br>\n",
" `config_optimizers`: <class 'function'>, optional, A callable function that implements the optimization behavior as detailed in <br>\n",
" https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers <br>\n",
" Note that the function must accept an argument which is the subclass of Neuralforecast's `BaseModel` to speficy the model's parameters() for the optimizer. <br>\n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer).<br> \n",
"\n",
" **References**<br>\n",
@@ -226,6 +229,7 @@
" random_seed: int = 1,\n",
" drop_last_loader = False,\n",
" dataloader_kwargs = None,\n",
" config_optimizers=None,\n",
" **trainer_kwargs):\n",
"\n",
" if exclude_insample_y:\n",
@@ -264,6 +268,7 @@
" drop_last_loader=drop_last_loader,\n",
" random_seed=random_seed,\n",
" dataloader_kwargs=dataloader_kwargs,\n",
" config_optimizers=config_optimizers,\n",
" **trainer_kwargs)\n",
"\n",
" self.horizon_backup = self.h # Used because h=0 during training\n",
5 changes: 5 additions & 0 deletions nbs/models.deepnpts.ipynb
@@ -121,6 +121,9 @@
" `drop_last_loader`: bool=False, if True `TimeSeriesDataLoader` drops last non-full batch.<br>\n",
" `alias`: str, optional, Custom name of the model.<br>\n",
" `dataloader_kwargs`: dict, optional, list of parameters passed into the PyTorch Lightning dataloader by the `TimeSeriesDataLoader`. <br>\n",
" `config_optimizers`: <class 'function'>, optional, A callable function that implements the optimization behavior as detailed in <br>\n",
" https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers <br>\n",
" Note that the function must accept an argument which is the subclass of Neuralforecast's `BaseModel` to speficy the model's parameters() for the optimizer. <br>\n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer).<br> \n",
"\n",
" **References**<br>\n",
@@ -161,6 +164,7 @@
" random_seed: int = 1,\n",
" drop_last_loader = False,\n",
" dataloader_kwargs = None,\n",
" config_optimizers = None,\n",
" **trainer_kwargs):\n",
"\n",
" if exclude_insample_y:\n",
@@ -196,6 +200,7 @@
" drop_last_loader=drop_last_loader,\n",
" random_seed=random_seed,\n",
" dataloader_kwargs=dataloader_kwargs,\n",
" config_optimizers=config_optimizers,\n",
" **trainer_kwargs)\n",
"\n",
" self.h = h\n",
5 changes: 5 additions & 0 deletions nbs/models.dilated_rnn.ipynb
@@ -390,6 +390,9 @@
" `drop_last_loader`: bool=False, if True `TimeSeriesDataLoader` drops last non-full batch.<br>\n",
" `alias`: str, optional, Custom name of the model.<br>\n",
" `dataloader_kwargs`: dict, optional, list of parameters passed into the PyTorch Lightning dataloader by the `TimeSeriesDataLoader`. <br>\n",
" `config_optimizers`: <class 'function'>, optional, A callable function that implements the optimization behavior as detailed in <br>\n",
" https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers <br>\n",
" Note that the function must accept an argument which is the subclass of Neuralforecast's `BaseModel` to speficy the model's parameters() for the optimizer. <br>\n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer).<br> \n",
" \"\"\"\n",
" # Class attributes\n",
@@ -425,6 +428,7 @@
" random_seed: int = 1,\n",
" drop_last_loader: bool = False,\n",
" dataloader_kwargs = None,\n",
" config_optimizers=None,\n",
" **trainer_kwargs):\n",
" super(DilatedRNN, self).__init__(\n",
" h=h,\n",
@@ -446,6 +450,7 @@
" drop_last_loader=drop_last_loader,\n",
" random_seed=random_seed,\n",
" dataloader_kwargs=dataloader_kwargs,\n",
" config_optimizers=config_optimizers,\n",
" **trainer_kwargs\n",
" )\n",
"\n",
5 changes: 5 additions & 0 deletions nbs/models.dlinear.ipynb
@@ -162,6 +162,9 @@
" `drop_last_loader`: bool=False, if True `TimeSeriesDataLoader` drops last non-full batch.<br>\n",
" `alias`: str, optional, Custom name of the model.<br>\n",
" `dataloader_kwargs`: dict, optional, list of parameters passed into the PyTorch Lightning dataloader by the `TimeSeriesDataLoader`. <br>\n",
" `config_optimizers`: <class 'function'>, optional, A callable function that implements the optimization behavior as detailed in <br>\n",
" https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers <br>\n",
" Note that the function must accept an argument which is the subclass of Neuralforecast's `BaseModel` to speficy the model's parameters() for the optimizer. <br>\n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer).<br>\n",
"\n",
"\t*References*<br>\n",
@@ -198,6 +201,7 @@
" random_seed: int = 1,\n",
" drop_last_loader: bool = False,\n",
" dataloader_kwargs=None,\n",
" config_optimizers=None,\n",
" **trainer_kwargs):\n",
" super(DLinear, self).__init__(h=h,\n",
" input_size=input_size,\n",
@@ -222,6 +226,7 @@
" drop_last_loader=drop_last_loader,\n",
" random_seed=random_seed,\n",
" dataloader_kwargs=dataloader_kwargs,\n",
" config_optimizers=config_optimizers,\n",
" **trainer_kwargs)\n",
" \n",
" # Architecture\n",
7 changes: 6 additions & 1 deletion nbs/models.fedformer.ipynb
@@ -451,6 +451,9 @@
" `drop_last_loader`: bool=False, if True `TimeSeriesDataLoader` drops last non-full batch.<br>\n",
" `alias`: str, optional, Custom name of the model.<br>\n",
" `dataloader_kwargs`: dict, optional, list of parameters passed into the PyTorch Lightning dataloader by the `TimeSeriesDataLoader`. <br>\n",
" `config_optimizers`: <class 'function'>, optional, A callable function that implements the optimization behavior as detailed in <br>\n",
" https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers <br>\n",
" Note that the function must accept an argument which is the subclass of Neuralforecast's `BaseModel` to speficy the model's parameters() for the optimizer. <br>\n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer).<br>\n",
"\n",
" \"\"\"\n",
@@ -495,6 +498,7 @@
" random_seed: int = 1,\n",
" drop_last_loader: bool = False,\n",
" dataloader_kwargs = None,\n",
" config_optimizers=None,\n",
" **trainer_kwargs):\n",
" super(FEDformer, self).__init__(h=h,\n",
" input_size=input_size,\n",
@@ -517,7 +521,8 @@
" scaler_type=scaler_type,\n",
" drop_last_loader=drop_last_loader,\n",
" random_seed=random_seed,\n",
" dataloader_kwargs=dataloader_kwargs, \n",
" dataloader_kwargs=dataloader_kwargs,\n",
" config_optimizers=config_optimizers,\n",
" **trainer_kwargs)\n",
" # Architecture\n",
" self.label_len = int(np.ceil(input_size * decoder_input_size_multiplier))\n",
5 changes: 5 additions & 0 deletions nbs/models.gru.ipynb
@@ -134,6 +134,9 @@
" `drop_last_loader`: bool=False, if True `TimeSeriesDataLoader` drops last non-full batch.<br>\n",
" `alias`: str, optional, Custom name of the model.<br>\n",
" `dataloader_kwargs`: dict, optional, list of parameters passed into the PyTorch Lightning dataloader by the `TimeSeriesDataLoader`. <br>\n",
" `config_optimizers`: <class 'function'>, optional, A callable function that implements the optimization behavior as detailed in <br>\n",
" https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers <br>\n",
" Note that the function must accept an argument which is the subclass of Neuralforecast's `BaseModel` to speficy the model's parameters() for the optimizer. <br>\n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer).<br> \n",
" \"\"\"\n",
" # Class attributes\n",
@@ -170,6 +173,7 @@
" random_seed=1,\n",
" drop_last_loader = False,\n",
" dataloader_kwargs = None,\n",
" config_optimizers=None,\n",
" **trainer_kwargs):\n",
" super(GRU, self).__init__(\n",
" h=h,\n",
@@ -191,6 +195,7 @@
" drop_last_loader=drop_last_loader,\n",
" random_seed=random_seed,\n",
" dataloader_kwargs=dataloader_kwargs,\n",
" config_optimizers=config_optimizers,\n",
" **trainer_kwargs\n",
" )\n",
"\n",
5 changes: 5 additions & 0 deletions nbs/models.informer.ipynb
@@ -306,6 +306,9 @@
" `drop_last_loader`: bool=False, if True `TimeSeriesDataLoader` drops last non-full batch.<br>\n",
" `alias`: str, optional, Custom name of the model.<br>\n",
" `dataloader_kwargs`: dict, optional, list of parameters passed into the PyTorch Lightning dataloader by the `TimeSeriesDataLoader`. <br>\n",
" `config_optimizers`: <class 'function'>, optional, A callable function that implements the optimization behavior as detailed in <br>\n",
" https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers <br>\n",
" Note that the function must accept an argument which is the subclass of Neuralforecast's `BaseModel` to speficy the model's parameters() for the optimizer. <br>\n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer).<br>\n",
"\n",
"\t*References*<br>\n",
@@ -351,6 +354,7 @@
" random_seed: int = 1,\n",
" drop_last_loader: bool = False,\n",
" dataloader_kwargs = None,\n",
" config_optimizers=None,\n",
" **trainer_kwargs):\n",
" super(Informer, self).__init__(h=h,\n",
" input_size=input_size,\n",
@@ -375,6 +379,7 @@
" drop_last_loader=drop_last_loader,\n",
" random_seed=random_seed,\n",
" dataloader_kwargs=dataloader_kwargs,\n",
" config_optimizers=config_optimizers,\n",
" **trainer_kwargs)\n",
"\n",
" # Architecture\n",
7 changes: 6 additions & 1 deletion nbs/models.itransformer.ipynb
@@ -228,6 +228,9 @@
" `drop_last_loader`: bool=False, if True `TimeSeriesDataLoader` drops last non-full batch.<br>\n",
" `alias`: str, optional, Custom name of the model.<br>\n",
" `dataloader_kwargs`: dict, optional, list of parameters passed into the PyTorch Lightning dataloader by the `TimeSeriesDataLoader`. <br>\n",
" `config_optimizers`: <class 'function'>, optional, A callable function that implements the optimization behavior as detailed in <br>\n",
" https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers <br>\n",
" Note that the function must accept an argument which is the subclass of Neuralforecast's `BaseModel` to speficy the model's parameters() for the optimizer. <br> \n",
" `**trainer_kwargs`: int, keyword trainer arguments inherited from [PyTorch Lighning's trainer](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.trainer.trainer.Trainer.html?highlight=trainer).<br>\n",
" \n",
" **References**<br>\n",
@@ -267,7 +270,8 @@
" scaler_type: str = 'identity',\n",
" random_seed: int = 1,\n",
" drop_last_loader: bool = False,\n",
" dataloader_kwargs = None, \n",
" dataloader_kwargs = None,\n",
" config_optimizers=None, \n",
" **trainer_kwargs):\n",
" \n",
" super(iTransformer, self).__init__(h=h,\n",
@@ -289,6 +293,7 @@
" random_seed=random_seed,\n",
" drop_last_loader=drop_last_loader,\n",
" dataloader_kwargs=dataloader_kwargs,\n",
" config_optimizers=config_optimizers,\n",
" **trainer_kwargs)\n",
" \n",
" self.enc_in = n_series\n",