diff --git a/README.md b/README.md
index 79c162624a81..47cd446aeb14 100644
--- a/README.md
+++ b/README.md
@@ -31,12 +31,11 @@
NautilusTrader is an open-source, high-performance, production-grade algorithmic trading platform,
providing quantitative traders with the ability to backtest portfolios of automated trading strategies
-on historical data with an event-driven engine, and also deploy those same strategies live.
+on historical data with an event-driven engine, and also deploy those same strategies live, with no code changes.
-The platform is 'AI-first', designed to deploy models for algorithmic trading strategies developed
-using the Python ecosystem - within a highly performant and robust Python native environment.
-This helps to address the challenge of keeping the research/backtest environment consistent with the production
-live trading environment.
+The platform is 'AI-first', designed to develop and deploy algorithmic trading strategies within a highly performant
+and robust Python native environment. This helps to address the parity challenge of keeping the Python research/backtest
+environment consistent with the production live trading environment.
NautilusTraders design, architecture and implementation philosophy holds software correctness and safety at the
highest level, with the aim of supporting Python native, mission-critical, trading system backtesting
@@ -48,15 +47,16 @@ including FX, Equities, Futures, Options, CFDs, Crypto and Betting - across mult
## Features
-- **Fast:** C-level speed through Rust and Cython. Asynchronous networking with [uvloop](https://github.com/MagicStack/uvloop)
-- **Reliable:** Type safety through Rust and Cython. Redis backed performant state persistence
-- **Flexible:** OS independent, runs on Linux, macOS, Windows. Deploy using Docker
-- **Integrated:** Modular adapters mean any REST, WebSocket, or FIX API can be integrated
-- **Advanced:** Time in force `IOC`, `FOK`, `GTD`, `AT_THE_OPEN`, `AT_THE_CLOSE`, advanced order types and conditional triggers. Execution instructions `post-only`, `reduce-only`, and icebergs. Contingency order lists including `OCO`, `OTO`
-- **Backtesting:** Run with multiple venues, instruments and strategies simultaneously using historical quote tick, trade tick, bar, order book and custom data with nanosecond resolution
-- **Live:** Use identical strategy implementations between backtesting and live deployments
-- **Multi-venue:** Multiple venue capabilities facilitate market making and statistical arbitrage strategies
-- **AI Agent Training:** Backtest engine fast enough to be used to train AI trading agents (RL/ES)
+- **Fast** - C-level speed through Rust and Cython. Asynchronous networking with [uvloop](https://github.com/MagicStack/uvloop)
+- **Reliable** - Type safety through Rust and Cython. Redis backed performant state persistence
+- **Portable** - OS independent, runs on Linux, macOS, Windows. Deploy using Docker
+- **Flexible** - Modular adapters mean any REST, WebSocket, or FIX API can be integrated
+- **Advanced** - Time in force `IOC`, `FOK`, `GTD`, `AT_THE_OPEN`, `AT_THE_CLOSE`, advanced order types and conditional triggers. Execution instructions `post-only`, `reduce-only`, and icebergs. Contingency order lists including `OCO`, `OTO`
+- **Customizable** - Add user defined custom components, or assemble entire systems from scratch leveraging the cache and message bus
+- **Backtesting** - Run with multiple venues, instruments and strategies simultaneously using historical quote tick, trade tick, bar, order book and custom data with nanosecond resolution
+- **Live** - Use identical strategy implementations between backtesting and live deployments
+- **Multi-venue** - Multiple venue capabilities facilitate market making and statistical arbitrage strategies
+- **AI Agent Training** - Backtest engine fast enough to be used to train AI trading agents (RL/ES)
![Alt text](https://github.com/nautechsystems/nautilus_trader/blob/develop/docs/_images/nautilus-art.png?raw=true "nautilus")
> *nautilus - from ancient Greek 'sailor' and naus 'ship'.*
@@ -66,15 +66,20 @@ including FX, Equities, Futures, Options, CFDs, Crypto and Betting - across mult
## Why NautilusTrader?
-Traditionally, trading strategy research and backtesting might be conducted in Python (or other suitable language), with
-the models and/or strategies then needing to be reimplemented in C, C++, C#, Java or other statically
-typed language(s). The reasoning here is to utilize the performance and type safety a compiled language can offer,
-which has historically made these languages more suitable for large trading systems.
+- **Highly performant event-driven Python** - native binary core components
+- **Parity between backtesting and live trading** - identical strategy code
+- **Reduced operational risk** - risk management functionality, logical correctness and type safety
+- **Highly extendable** - message bus, custom components and actors, custom data, custom adapters
-The value of NautilusTrader here is that this reimplementation step is circumvented - as the critical core components of the platform
-have all been written entirely in Cython. Because Cython can generate efficient C code (which then compiles to C extension modules as native binaries),
-Python can effectively be used as a high-performance systems programming language - with the benefit being that a Python native environment can be offered which is suitable for
-professional quantitative traders and trading firms.
+Traditionally, trading strategy research and backtesting might be conducted in Python (or other suitable language)
+using vectorized methods, with the strategy then needing to be reimplemented in a more event-driven way
+using C++, C#, Java or other statically typed language(s). The reasoning here is that vectorized backtesting code cannot
+express the granular time and event dependent complexity of real-time trading, where compiled languages have
+proven to be more suitable due to their inherently higher performance and type safety.
+
+One of the key advantages of NautilusTrader here is that this reimplementation step is now circumvented - as the critical core components of the platform
+have all been written entirely in Rust or Cython. This means we're using the right tools for the job, where systems programming languages compile performant binaries,
+with CPython C extension modules then able to offer a Python native environment, suitable for professional quantitative traders and trading firms.
## Why Python?
diff --git a/RELEASES.md b/RELEASES.md
index b13977dd497c..b2c331f23978 100644
--- a/RELEASES.md
+++ b/RELEASES.md
@@ -1,3 +1,27 @@
+# NautilusTrader 1.148.0 Beta
+
+Released on 30th June 2022 (UTC).
+
+### Breaking Changes
+None
+
+### Enhancements
+- Ported core bar objects to Rust thanks @ghill2
+- Improved core `unix_nanos_to_iso8601` performance by 30% thanks @ghill2
+- Added `DataCatalog` interface for `ParquetDataCatalog` thanks @jordanparker6
+- Added `AroonOscillator` indicator thanks @graceyangfan
+- Added `ArcherMovingAveragesTrends` indicator thanks @graceyangfan
+- Added `DoubleExponentialMovingAverage` indicator thanks @graceyangfan
+- Added `WilderMovingAverage` indicator thanks @graceyangfan
+- Added `ChandeMomentumOscillator` indicator thanks @graceyangfan
+- Added `VerticalHorizontalFilter` indicator thanks @graceyangfan
+- Added `Bias` indicator thanks @graceyangfan
+
+### Fixes
+None
+
+---
+
# NautilusTrader 1.147.1 Beta
Released on 6th June 2022.
@@ -16,7 +40,7 @@ None
# NautilusTrader 1.147.0 Beta
-Released on 4th June 2022.
+Released on 4th June 2022 (UTC).
### Breaking Changes
None
@@ -551,7 +575,7 @@ Released on 12th September 2021.
- Added `ContingencyType` enum (for contingency orders in an `OrderList`)
- All order types can now be `reduce_only` (#437)
- Refined backtest configuration options
-- Improved efficiency of `UUID4` using the `fastuuid` Rust bindings
+- Improved efficiency of `UUID4` using the Rust `fastuuid` Python bindings
### Fixes
- Fixed Redis loss of precision for `int64_t` nanosecond timestamps (#363)
diff --git a/docs/_templates/hero.html b/docs/_templates/hero.html
index 964e5aaa1ac3..d2bcf82d9e59 100644
--- a/docs/_templates/hero.html
+++ b/docs/_templates/hero.html
@@ -1,5 +1,4 @@
+
\ No newline at end of file
diff --git a/docs/api_reference/common.md b/docs/api_reference/common.md
index 09ff4dd5c6b8..0579f14f737b 100644
--- a/docs/api_reference/common.md
+++ b/docs/api_reference/common.md
@@ -114,12 +114,3 @@
:member-order: bysource
```
-## UUID
-
-```{eval-rst}
-.. automodule:: nautilus_trader.common.uuid
- :show-inheritance:
- :inherited-members:
- :members:
- :member-order: bysource
-```
diff --git a/docs/api_reference/index.md b/docs/api_reference/index.md
index eae5285fd8fd..2c37fcb00c00 100644
--- a/docs/api_reference/index.md
+++ b/docs/api_reference/index.md
@@ -11,59 +11,6 @@ a future time we may separate the documentation between the `develop` branch, an
more stable releases on `master`.
```
-## Type Safety
-The design of the platform holds software correctness and safety at the highest level.
-Given most of the core production code is written in Cython, type safety is often provided
-at the C level.
-
-```{note}
-If a function or methods parameter is not explicitly typed as allowing
-``None``, then you can assume you will receive a `ValueError` when passing ``None``
-as an argument (this is not explicitly documented).
-```
-
-## Framework Organization
-The codebase is organized around both layering of abstraction levels, and generally
-grouped into logical subpackages of cohesive concepts. You can navigate to the documentation
-for each of these subpackages from the left menu.
-
-### Core / Low-Level
-- `core`: constants, functions and low-level components used throughout the framework
-- `common`: common parts for assembling the frameworks various components
-- `network`: low-level base components for networking clients
-- `serialization`: serialization base components and serializer implementations
-- `model`: defines a rich trading domain model
-
-### System Components
-- `accounting`: different account types and account management machinery
-- `adapters`: integration adapters for the platform including brokers and exchanges
-- `analysis`: components relating to trading performance statistics and analysis
-- `cache`: provides common caching infrastructure
-- `data`: the data stack and data tooling for the platform
-- `execution`: the execution stack for the platform
-- `indicators`: a set of efficient indicators and analyzers
-- `infrastructure`: technology specific infrastructure implementations
-- `msgbus`: a universal message bus for connecting system components
-- `persistence`: data storage, cataloging and retrieval, mainly to support backtesting
-- `portfolio`: portfolio management functionality
-- `risk`: risk specific components and tooling
-- `trading`: trading domain specific components and tooling
-
-### System Implementations
-- `backtest`: backtesting componentry as well as a backtest engine implementation
-- `live`: live engine and client implementations as well as a node for live trading
-- `system`: the core system kernel common between backtest, sandbox and live contexts
-
-## Errors and Exceptions
-Every attempt has been made to accurately document the possible exceptions which
-can be raised from NautilusTrader code, and the conditions which will trigger them.
-
-```{warning}
-There may be other undocumented exceptions which can be raised by Pythons standard
-library, or from third party library dependencies.
-```
-
-
```{eval-rst}
.. toctree::
:maxdepth: 1
diff --git a/docs/conf.py b/docs/conf.py
index 9105ca783df9..86eb9f3a9275 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -112,19 +112,11 @@
"title": "nautilustrader.io ⬀",
},
],
- "heroes": {
- "index": "Documentation",
- "getting_started/index": "Documentation",
- "user_guide/index": "Documentation",
- "api_reference/index": "Documentation",
- "integrations/index": "Documentation",
- "developer_guide/index": "Documentation",
- },
"version_dropdown": True,
"version_json": "_static/version.json",
"version_info": {
- "Latest (1.146.0)": "https://docs.nautilustrader.io",
- "Develop (1.147.0)": "https://docs.nautilustrader.io/develop",
+ "1.148.0 (develop)": "https://docs.nautilustrader.io",
+ "1.148.0 (latest)": "https://docs.nautilustrader.io/latest",
},
"table_classes": ["plain"],
}
diff --git a/docs/developer_guide/cython.md b/docs/developer_guide/cython.md
index de9f2910cbdc..26901d9e0113 100644
--- a/docs/developer_guide/cython.md
+++ b/docs/developer_guide/cython.md
@@ -6,7 +6,7 @@ More information on Cython syntax and conventions can be found by reading the [C
## Function and method signatures
Ensure that all functions and methods returning `void` or a primitive C type (such as `bint`, `int`, `double`) include the `except *` keyword in the signature.
-This will ensure Python exceptions are not ignored, but instead are “bubbled up” to the caller as expected.
+This will ensure Python exceptions are not ignored, and instead are “bubbled up” to the caller as expected.
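+
+As a minimal sketch (using a hypothetical helper, not taken from the codebase), a `cpdef` function returning a primitive C type would be declared as follows:
+
+```cython
+cpdef bint is_positive(double value) except *:
+    # `except *` ensures an exception raised here propagates to the Python caller,
+    # rather than being silently swallowed due to the primitive `bint` return type
+    if value != value:  # NaN check
+        raise ValueError("value was NaN")
+    return value > 0.0
+```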
## Debugging
diff --git a/docs/developer_guide/rust.md b/docs/developer_guide/rust.md
index 14aeb1f658d3..82a2e17ccbc1 100644
--- a/docs/developer_guide/rust.md
+++ b/docs/developer_guide/rust.md
@@ -15,7 +15,6 @@ Cython. This approach is to aid a smooth transition to greater amounts
of Rust in the codebase, and reducing amounts of Cython (which will eventually be eliminated).
We want to avoid a need for Rust to call Python using the FFI. In the future [PyO3](https://github.com/PyO3/PyO3) will be used.
-
## Unsafe Rust
It will be necessary to write `unsafe` Rust code to be able to achieve the value
of interoperating between Python and Rust. The ability to step outside the boundaries of safe Rust is what makes it possible to
diff --git a/docs/developer_guide/testing.md b/docs/developer_guide/testing.md
index e23db8959af5..3454bcc83e65 100644
--- a/docs/developer_guide/testing.md
+++ b/docs/developer_guide/testing.md
@@ -6,7 +6,7 @@ The test suite is divided into broad categories of tests including:
- Acceptance tests
- Performance tests
-The performance tests are not run as part of the CI pipeline, but exist to aid development of performance-critical components.
+The performance tests exist to aid development of performance-critical components.
Tests can be run using either [Pytest](https://docs.pytest.org) or the [Nox](https://nox.thea.codes/en/stable/) tool.
diff --git a/docs/getting_started/quick_start.md b/docs/getting_started/quick_start.md
index f30f5e839f73..106ace5d1e82 100644
--- a/docs/getting_started/quick_start.md
+++ b/docs/getting_started/quick_start.md
@@ -25,18 +25,18 @@ deleted when the container is deleted.
To save time, we have prepared a script to load sample data into the Nautilus format for use with this example.
First, download and load the data by running the next cell (this should take ~ 1-2 mins):
-```python
+```bash
!curl https://raw.githubusercontent.com/nautechsystems/nautilus_data/main/scripts/hist_data_to_catalog.py | python -
```
-## Connecting to the DataCatalog
+## Connecting to the ParquetDataCatalog
If everything worked correctly, you should be able to see a single EUR/USD instrument in the catalog:
```python
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog import ParquetDataCatalog
-catalog = DataCatalog("./")
+catalog = ParquetDataCatalog("./")
catalog.instruments()
```
@@ -173,7 +173,7 @@ venue = BacktestVenueConfig(
## Instruments
-Second, we need to know about the instruments that we would like to load data for, we can use the `DataCatalog` for this:
+Second, we need to know about the instruments that we would like to load data for, we can use the `ParquetDataCatalog` for this:
```python
instruments = catalog.instruments(as_nautilus=True)
diff --git a/docs/index.md b/docs/index.md
index ad58e1f938e3..d412ab9c6f0f 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -1,15 +1,14 @@
-# Introduction
+# NautilusTrader Documentation
-Welcome to the documentation for NautilusTrader!
+Welcome to the official documentation for NautilusTrader!
NautilusTrader is an open-source, high-performance, production-grade algorithmic trading platform,
providing quantitative traders with the ability to backtest portfolios of automated trading strategies
-on historical data with an event-driven engine, and also deploy those same strategies live.
+on historical data with an event-driven engine, and also deploy those same strategies live, with no code changes.
-The platform is 'AI-first', designed to deploy models for algorithmic trading strategies developed
-using the Python ecosystem - within a highly performant and robust Python native environment.
-This helps to address the challenge of keeping the research/backtest environment consistent with the production
-live trading environment.
+The platform is 'AI-first', designed to develop and deploy algorithmic trading strategies within a highly performant
+and robust Python native environment. This helps to address the parity challenge of keeping the Python research/backtest
+environment consistent with the production live trading environment.
NautilusTraders design, architecture and implementation philosophy holds software correctness and safety at the
highest level, with the aim of supporting Python native, mission-critical, trading system backtesting
@@ -21,15 +20,15 @@ including FX, Equities, Futures, Options, CFDs, Crypto and Betting - across mult
## Features
-- **Fast:** C-level speed through Cython. Asynchronous networking with [uvloop](https://github.com/MagicStack/uvloop).
-- **Reliable:** Type safety through Cython. Redis backed performant state persistence.
-- **Flexible:** OS independent, runs on Linux, macOS, Windows. Deploy using Docker.
-- **Integrated:** Modular adapters mean any REST, WebSocket, or FIX API can be integrated.
-- **Advanced:** Time in force `IOC`, `FOK`, `GTD`, `AT_THE_OPEN`, `AT_THE_CLOSE`, advanced order types and conditional triggers. Execution instructions `post-only`, `reduce-only`, and icebergs. Contingency order lists including `OCO`, `OTO`.
-- **Backtesting:** Run with multiple venues, instruments and strategies simultaneously using historical quote tick, trade tick, bar, order book and custom data with nanosecond resolution.
-- **Live:** Use identical strategy implementations between backtesting and live deployments.
-- **Multi-venue:** Multiple venue capabilities facilitate market making and statistical arbitrage strategies.
-- **AI Agent Training:** Backtest engine fast enough to be used to train AI trading agents (RL/ES).
+- **Fast:** C-level speed through Rust and Cython. Asynchronous networking with [uvloop](https://github.com/MagicStack/uvloop)
+- **Reliable:** Type safety through Rust and Cython. Redis backed performant state persistence
+- **Flexible:** OS independent, runs on Linux, macOS, Windows. Deploy using Docker
+- **Integrated:** Modular adapters mean any REST, WebSocket, or FIX API can be integrated
+- **Advanced:** Time in force `IOC`, `FOK`, `GTD`, `AT_THE_OPEN`, `AT_THE_CLOSE`, advanced order types and conditional triggers. Execution instructions `post-only`, `reduce-only`, and icebergs. Contingency order lists including `OCO`, `OTO`
+- **Backtesting:** Run with multiple venues, instruments and strategies simultaneously using historical quote tick, trade tick, bar, order book and custom data with nanosecond resolution
+- **Live:** Use identical strategy implementations between backtesting and live deployments
+- **Multi-venue:** Multiple venue capabilities facilitate market making and statistical arbitrage strategies
+- **AI Agent Training:** Backtest engine fast enough to be used to train AI trading agents (RL/ES)
![Nautilus](https://github.com/nautechsystems/nautilus_trader/blob/develop/docs/_images/nautilus-art.png?raw=true "nautilus")
> *nautilus - from ancient Greek 'sailor' and naus 'ship'.*
@@ -39,15 +38,20 @@ including FX, Equities, Futures, Options, CFDs, Crypto and Betting - across mult
## Why NautilusTrader?
-Traditionally, trading strategy research and backtesting might be conducted in Python (or other suitable language), with
-the models and/or strategies then needing to be reimplemented in C, C++, C#, Java or other statically
-typed language(s). The reasoning here is to utilize the performance and type safety a compiled language can offer,
-which has historically made these languages more suitable for large trading systems.
+- **Highly performant event-driven Python** - native binary core components
+- **Parity between backtesting and live trading** - identical strategy code
+- **Reduced operational risk** - risk management functionality, logical correctness and type safety
+- **Highly extendable** - message bus, custom components and actors, custom data, custom adapters
-The value of NautilusTrader here is that this reimplementation step is circumvented - as the critical core components of the platform
-have all been written entirely in Cython. Because Cython can generate efficient C code (which then compiles to C extension modules as native binaries),
-Python can effectively be used as a high-performance systems programming language - with the benefit being that a Python native environment can be offered which is suitable for
-professional quantitative traders and trading firms.
+Traditionally, trading strategy research and backtesting might be conducted in Python (or other suitable language)
+using vectorized methods, with the strategy then needing to be reimplemented in a more event-driven way
+using C++, C#, Java or other statically typed language(s). The reasoning here is that vectorized backtesting code cannot
+express the granular time and event dependent complexity of real-time trading, where compiled languages have
+proven to be more suitable due to their inherently higher performance and type safety.
+
+One of the key advantages of NautilusTrader here is that this reimplementation step is now circumvented - as the critical core components of the platform
+have all been written entirely in Rust or Cython. This means we're using the right tools for the job, where systems programming languages compile performant binaries,
+with CPython C extension modules then able to offer a Python native environment, suitable for professional quantitative traders and trading firms.
## Why Python?
diff --git a/docs/integrations/index.md b/docs/integrations/index.md
index 262882443d42..56b32f4f9137 100644
--- a/docs/integrations/index.md
+++ b/docs/integrations/index.md
@@ -46,7 +46,7 @@ a warning or error when a user attempts to perform said action
All integrations must be compatible with the NautilusTrader API at the system boundary,
this means there is some normalization and standardization needed.
-- All symbols will match the native/local symbol for the exchange, unless there are conflicts (such as [Binance](../integrations/binance.md#symbology) using the same symbol for both spot and perpetual futures).
+- All symbols will match the native/local symbol for the exchange, unless there are conflicts (such as Binance using the same symbol for both Spot and Perpetual Futures markets).
- All timestamps will be either normalized to UNIX nanoseconds, or clearly marked as UNIX milliseconds by appending `_ms` to param and property names.
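+
+For example (a sketch reusing the `dt_to_unix_nanos` helper shown in the user guide), a timezone-aware timestamp converts to UNIX nanoseconds as follows:
+
+```python
+import pandas as pd
+
+from nautilus_trader.core.datetime import dt_to_unix_nanos
+
+
+# UNIX nanoseconds since the epoch (the platform's normalized representation)
+unix_ns = dt_to_unix_nanos(pd.Timestamp("2020-01-01", tz="UTC"))
+```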
```{eval-rst}
diff --git a/docs/user_guide/adapters.md b/docs/user_guide/adapters.md
index 02a0c844594e..6ce705c98621 100644
--- a/docs/user_guide/adapters.md
+++ b/docs/user_guide/adapters.md
@@ -1,9 +1,9 @@
# Adapters
The NautilusTrader design allows for integrating data publishers and/or trading venues
-via adapters, these can be found in the top level `adapters` subpackage.
+through adapter implementations, which can be found in the top level `adapters` subpackage.
-A full integration adapter (to say a Crypto exchange) is typically comprised of the following main components:
+A full integration adapter is typically comprised of the following main components:
- `InstrumentProvider`
- `DataClient`
@@ -11,16 +11,16 @@ A full integration adapter (to say a Crypto exchange) is typically comprised of
## Instrument Providers
-Instrument providers do as their name suggests by parsing the publisher or venues raw API
-and instantiating Nautilus `Instrument` objects.
+Instrument providers do as their name suggests - instantiating Nautilus
+`Instrument` objects by parsing the publisher's or venue's raw API.
The use cases for the instruments available from an `InstrumentProvider` are either:
- Used standalone to discover the instruments available for an integration, using these for research or backtesting purposes
-- Used in a sandbox or live trading context for consumption by actors/strategies
+- Used in a sandbox or live trading environment context for consumption by actors/strategies
### Research/Backtesting
-Here is an example of discovering the current instruments for the Binance Futures testnet
+Here is an example of discovering the current instruments for the Binance Futures testnet:
```python
from nautilus_trader.adapters.binance.common.enums import BinanceAccountType
from nautilus_trader.adapters.binance.factories import get_cached_binance_http_client
@@ -74,20 +74,21 @@ InstrumentProviderConfig(load_ids=["BTCUSDT-PERP", "ETHUSDT-PERP"])
### Requests
An `Actor` or `Strategy` can request custom data from a `DataClient` by sending a `DataRequest`. If the client that receives the
-`DataRequest` implements a handler for the request, data will be returned to the `Strategy`.
+`DataRequest` implements a handler for the request, data will be returned to the `Actor` or `Strategy`.
#### Example
-An example of this is a `DataRequest` for an `Instrument`, which the Actor class implements (copied below). Any `Actor` or
+An example of this is a `DataRequest` for an `Instrument`, which the `Actor` class implements (copied below). Any `Actor` or
`Strategy` can call a `request_instrument` method with an `InstrumentId` to request the instrument from a `DataClient`.
-In this particular case, the `Actor` implements a separate method, `request_instrument`, but a similar type of
-`DataRequest` could be instantiated and called from anywhere and anytime in the Actor/Strategy code.
+In this particular case, the `Actor` implements a separate method `request_instrument`. A similar type of
+`DataRequest` could be instantiated and called from anywhere, and at any time, in the actor/strategy code.
-On the Actor/Strategy:
+On the actor/strategy:
```cython
# nautilus_trader/common/actor.pyx
+
cpdef void request_instrument(self, InstrumentId instrument_id, ClientId client_id=None) except *:
"""
Request `Instrument` data for the given instrument ID.
diff --git a/docs/user_guide/architecture.md b/docs/user_guide/architecture.md
new file mode 100644
index 000000000000..bc6b1b09e18e
--- /dev/null
+++ b/docs/user_guide/architecture.md
@@ -0,0 +1,166 @@
+# Architecture
+
+This guide describes the architecture of NautilusTrader from highest to lowest level, including:
+- Design philosophy
+- System architecture
+- Framework organization
+- Code structure
+- Component organization and interaction
+- Implementation techniques
+
+## Design philosophy
+The major architectural techniques and design patterns employed by NautilusTrader are:
+- [Domain driven design (DDD)](https://en.wikipedia.org/wiki/Domain-driven_design)
+- [Event-driven architecture](https://en.wikipedia.org/wiki/Event-driven_programming)
+- [Messaging patterns](https://en.wikipedia.org/wiki/Messaging_pattern) (Pub/Sub, Req/Rep, point-to-point)
+- [Ports and adapters](https://en.wikipedia.org/wiki/Hexagonal_architecture_(software))
+- [Crash-only design](https://en.wikipedia.org/wiki/Crash-only_software)
+
+These techniques have been utilized to assist in achieving certain architectural quality attributes.
+
+### Quality attributes
+Architectural decisions are often a trade-off between competing priorities. The
+below is a list of some of the most important quality attributes which are considered
+when making design and architectural decisions, roughly in order of 'weighting'.
+
+- Reliability
+- Performance
+- Modularity
+- Testability
+- Maintainability
+- Deployability
+
+## System architecture
+
+The NautilusTrader codebase is both a framework for composing trading
+systems, and a set of default system applications which can operate in various
+environment contexts.
+
+### Environment contexts
+- `Backtest` - Historical data with simulated venues
+- `Sandbox` - Real-time data with simulated venues
+- `Live` - Real-time data with live venues (paper trading or real accounts)
+
+### Common core
+The platform has been designed to share as much common code between backtest, sandbox and live trading systems as possible.
+This is formalized in the `system` subpackage, where you will find the `NautilusKernel` class,
+providing a common core system kernel.
+
+A _ports and adapters_ architectural style allows modular components to be 'plugged into' the
+core system, providing many hooks for user defined / custom component implementations.
+
+### Messaging
+To facilitate modularity and loose coupling, an extremely efficient `MessageBus` passes messages (data, commands and events) between components.
+
+From a high level architectural view, it's important to understand that the platform has been designed to run efficiently
+on a single thread, for both backtesting and live trading. Much research and testing
+resulted in arriving at this design, as it was found the overhead of context switching between threads
+didn't actually result in improved performance.
+
+When considering the logic of how your trading will work within the system boundary, you can expect each component to consume messages
+in a predictable synchronous way (_similar_ to the [actor model](https://en.wikipedia.org/wiki/Actor_model)).
+
+```{note}
+Of interest is the LMAX exchange architecture, which achieves award winning performance running on
+a single thread. You can read about their _disruptor_ pattern based architecture in [this interesting article](https://martinfowler.com/articles/lmax.html) by Martin Fowler.
+```
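+
+The following is a toy illustration of the pub/sub messaging pattern only - it is **not** the actual `MessageBus` API (refer to the API Reference for that), but it shows the synchronous, single-threaded delivery described above:
+
+```python
+from collections import defaultdict
+
+
+class ToyMessageBus:
+    """A minimal pub/sub sketch for illustration purposes only."""
+
+    def __init__(self):
+        self._subscribers = defaultdict(list)  # topic -> list of handlers
+
+    def subscribe(self, topic, handler):
+        self._subscribers[topic].append(handler)
+
+    def publish(self, topic, message):
+        # Handlers are invoked synchronously, in a predictable order,
+        # on the same thread which published the message
+        for handler in self._subscribers[topic]:
+            handler(message)
+
+
+bus = ToyMessageBus()
+bus.subscribe("data.quotes.EUR/USD", lambda msg: print(f"received: {msg}"))
+bus.publish("data.quotes.EUR/USD", {"bid": 1.1000, "ask": 1.1001})
+```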
+
+## Framework organization
+The codebase is organized with a layering of abstraction levels, and generally
+grouped into logical subpackages of cohesive concepts. You can navigate to the documentation
+for each of these subpackages from the left nav menu.
+
+### Core / low-level
+- `core` - constants, functions and low-level components used throughout the framework
+- `common` - common parts for assembling the framework's various components
+- `network` - low-level base components for networking clients
+- `serialization` - serialization base components and serializer implementations
+- `model` - defines a rich trading domain model
+
+### Components
+- `accounting` - different account types and account management machinery
+- `adapters` - integration adapters for the platform including brokers and exchanges
+- `analysis` - components relating to trading performance statistics and analysis
+- `cache` - provides common caching infrastructure
+- `data` - the data stack and data tooling for the platform
+- `execution` - the execution stack for the platform
+- `indicators` - a set of efficient indicators and analyzers
+- `infrastructure` - technology specific infrastructure implementations
+- `msgbus` - a universal message bus for connecting system components
+- `persistence` - data storage, cataloging and retrieval, mainly to support backtesting
+- `portfolio` - portfolio management functionality
+- `risk` - risk specific components and tooling
+- `trading` - trading domain specific components and tooling
+
+### System implementations
+- `backtest` - backtesting componentry as well as a backtest engine and node implementations
+- `live` - live engine and client implementations as well as a node for live trading
+- `system` - the core system kernel common between backtest, sandbox and live contexts
+
+## Code structure
+The foundation of the codebase is the `nautilus_core` directory, containing a collection of core Rust libraries including a C API interface generated by `cbindgen`.
+
+The bulk of the production code resides in the `nautilus_trader` directory, which contains a collection of pure Python and Cython modules.
+
+Python bindings for the Rust core are achieved by statically linking the Rust libraries to the C extension modules generated by Cython at compile time (effectively extending the CPython API).
+
+```{note}
+Both Rust and Cython are build dependencies. The binary wheels produced from a build do not themselves require
+Rust or Cython to be installed at runtime.
+```
+### Dependency flow
+```
+┌─────────────────────────┐
+│ │
+│ │
+│ nautilus_trader │
+│ │
+│ Python / Cython │
+│ │
+│ │
+└────────────┬────────────┘
+ C API │
+ │
+ │
+ │
+ C API ▼
+┌─────────────────────────┐
+│ │
+│ │
+│ nautilus_core │
+│ │
+│ Rust │
+│ │
+│ │
+└─────────────────────────┘
+```
+
+### Type safety
+The design of the platform holds software correctness and safety at the highest level.
+
+The Rust codebase in `nautilus_core` is always type safe and memory safe as guaranteed by the `rustc` compiler,
+and so is _correct by construction_ (unless explicitly marked `unsafe`, see the Rust section of the [Developer Guide](../developer_guide/rust.md)).
+
+Cython provides type safety at the C level at both compile time and runtime:
+
+```{note}
+If you pass an argument with an invalid type to a Cython implemented module with typed parameters,
+then you will receive a ``TypeError`` at runtime.
+
+If a function or method's parameter is not explicitly typed as allowing
+``None``, then you can assume you will receive a `ValueError` when passing ``None``
+as an argument at runtime.
+```
+
+```{warning}
+The above exceptions are not explicitly documented, as this would bloat the docstrings significantly.
+```
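+
+As a hypothetical illustration (the exact behaviour depends on how the particular parameter is typed), passing an argument of the wrong type to a typed value object such as `Price` raises at runtime rather than failing silently:
+
+```python
+from nautilus_trader.model.objects import Price
+
+
+try:
+    # `value` is typed as a C double, so a string is rejected with a TypeError
+    Price("not-a-number", precision=5)
+except TypeError as e:
+    print(e)
+```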
+
+### Errors and exceptions
+Every attempt has been made to accurately document the possible exceptions which
+can be raised from NautilusTrader code, and the conditions which will trigger them.
+
+```{warning}
+There may be other undocumented exceptions which can be raised by Python's standard
+library, or from third party library dependencies.
+```
diff --git a/docs/user_guide/backtest_example.md b/docs/user_guide/backtest_example.md
index ab179ef9a812..f7e692c7cfec 100644
--- a/docs/user_guide/backtest_example.md
+++ b/docs/user_guide/backtest_example.md
@@ -2,11 +2,9 @@
This notebook runs through a complete backtest example using raw data (external to Nautilus) to a single backtest run.
-
-
## Imports
-We'll start with all of our imports for the remainder of this guide.
+We'll start with all of our imports for the remainder of this guide:
```python
import datetime
@@ -22,16 +20,16 @@ from nautilus_trader.model.objects import Price, Quantity
from nautilus_trader.backtest.data.providers import TestInstrumentProvider
from nautilus_trader.backtest.node import BacktestNode
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog import ParquetDataCatalog
from nautilus_trader.persistence.external.core import process_files, write_objects
from nautilus_trader.persistence.external.readers import TextReader
```
## Getting some raw data
-Before we start the notebook - as a once off we need to download some sample data for backtesting.
+As a one-off before we start the notebook, we need to download some sample data for backtesting.
-For this notebook we will use FX data from `histdata.com`, simply go to https://www.histdata.com/download-free-forex-historical-data/?/ascii/tick-data-quotes/ and select an FX pair, and one or more months of data to download.
+For this example we will use FX data from `histdata.com`. Simply go to https://www.histdata.com/download-free-forex-historical-data/?/ascii/tick-data-quotes/ and select an FX pair, then select one or more months of data to download.
Once you have downloaded the data, set the variable `DATA_DIR` below to the directory containing the data. By default, it will use the users `Downloads` directory.
@@ -49,16 +47,14 @@ assert raw_files, f"Unable to find any histdata files in directory {DATA_DIR}"
raw_files
```
-
## The Data Catalog
Next we will load this raw data into the data catalog. The data catalog is a central store for Nautilus data, persisted in the [Parquet](https://parquet.apache.org) file format.
We have chosen parquet as the storage format for the following reasons:
-- It performs much better than CSV/JSON/HDF5/etc in terms of compression (storage size) and read performance
+- It performs much better than CSV/JSON/HDF5/etc in terms of compression ratio (storage size) and read performance
- It does not require any separate running components (for example a database)
-- It is quick and simple for someone to get up and running with
-
+- It is quick and simple to get up and running with
## Loading data into the catalog
@@ -66,7 +62,7 @@ We can load data from various sources into the data catalog using helper methods
The FX data from `histdata` is stored in CSV/text format, with fields `timestamp, bid_price, ask_price`. To load the data into the catalog, we simply write a function that converts each row into a Nautilus object (in this case, a `QuoteTick`). For this example, we will use the `TextReader` helper, which allows reading and applying a parsing function line by line.
-Then, we simply instantiate a `DataCatalog` (passing in a directory where to store the data, by default we will just use the current directory) and pass our parsing function wrapping in the Reader class to `process_files`. We also need to know about which instrument this data is for; in this example, we will simply use one of the Nautilus test helpers to create a FX instrument.
+Then, we simply instantiate a `ParquetDataCatalog` (passing in a directory in which to store the data - by default we will just use the current directory) and pass our parsing function wrapped in the Reader class to `process_files`. We also need to know about which instrument this data is for; in this example, we will simply use one of the Nautilus test helpers to create an FX instrument.
It should only take a couple of minutes to load the data (depending on how many months).
@@ -100,7 +96,7 @@ os.mkdir(CATALOG_PATH)
```python
AUDUSD = TestInstrumentProvider.default_fx_ccy("AUD/USD")
-catalog = DataCatalog(CATALOG_PATH)
+catalog = ParquetDataCatalog(CATALOG_PATH)
process_files(
glob_path=f"{DATA_DIR}/HISTDATA*.zip",
@@ -121,6 +117,10 @@ catalog.instruments()
```
```python
+import pandas as pd
+from nautilus_trader.core.datetime import dt_to_unix_nanos
+
+
start = dt_to_unix_nanos(pd.Timestamp('2020-01-01', tz='UTC'))
end = dt_to_unix_nanos(pd.Timestamp('2020-01-02', tz='UTC'))
@@ -129,8 +129,7 @@ catalog.quote_ticks(start=start, end=end)
## Configuring backtests
-Nautilus has a top-level object `BacktestRunConfig` that allows configuring a backtest in one place. It is a `Partialable` object (which means it can be configured in stages); the benefits of which are reduced boilerplate code when creating multiple backtest runs (for example when doing some sort of grid search over parameters).
-
+Nautilus uses a `BacktestRunConfig` object, which allows configuring a backtest in one place. It is a `Partialable` object (which means it can be configured in stages); the benefits of which are reduced boilerplate code when creating multiple backtest runs (for example when doing some sort of grid search over parameters).
### Adding data and venues
@@ -139,7 +138,7 @@ instrument = catalog.instruments(as_nautilus=True)[0]
data_config=[
BacktestDataConfig(
- catalog_path=str(DataCatalog.from_env().path),
+ catalog_path=str(ParquetDataCatalog.from_env().path),
data_cls=QuoteTick,
instrument_id=instrument.id.value,
start_time=1580398089820000000,
diff --git a/docs/user_guide/core_concepts.md b/docs/user_guide/core_concepts.md
index e17bb95fc42d..a2551e2dd089 100644
--- a/docs/user_guide/core_concepts.md
+++ b/docs/user_guide/core_concepts.md
@@ -1,48 +1,44 @@
# Core Concepts
-NautilusTrader has been built from the ground up to deliver optimal
-performance with a high quality user experience, within the bounds of a robust Python native environment. There are two main use cases for this software package:
+There are two main use cases for this software package:
-- Backtesting trading strategies
-- Deploying trading strategies live
+- Backtesting trading systems on historical data
+- Deploying trading systems live in real-time
The projects codebase provides a framework for implementing systems to achieve the above. You will find
the default `backtest` and `live` system implementations in their respectively named subpackages. All examples
will also either utilize the default backtest or live system implementations.
-## System Architecture
-
-### Common core
-NautilusTrader has been designed to share as much common code between backtest and live systems as possible. This
-is formalized in the `system` subpackage, where you will find the `NautilusKernel` class, providing a common core system kernel.
-
-A _ports and adapters_ architectural style allows modular components to be 'plugged into' the
-core system, providing many hook points for user defined / custom implementations.
-
-### Messaging
-To facilitate this modularity and loose coupling, an extremely efficient `MessageBus` passes data, commands and events as messages between components.
+```{note}
+We consider trading strategies to be subcomponents of end-to-end trading systems, which
+include the application and infrastructure layers.
+```
-From a high level architectural view, it's important to understand that the platform has been designed to run efficiently
-on a single thread, for both backtesting and live trading. A lot of research and testing
-resulted in arriving at this design, as it was found the overhead of context switching between threads
-didn't pay off in better performance.
+## Distributed
+The platform is also able to become part of an even larger distributed system, and so you will find that
+nearly every configuration and domain object can be serialized over the wire using either JSON, MessagePack, or Apache Arrow (Feather).
-When considering the logic of how your trading will work within the system boundary, you can expect each component to consume messages
-in a predictable synchronous way (_similar_ to the [actor model](https://en.wikipedia.org/wiki/Actor_model)).
+## Common core
+Backtest, sandbox and live trading nodes all use a common system core. User defined `Actor` and `Strategy`
+components are then registered and managed in the same way across these environment contexts.
-```{note}
-Of interest is the LMAX exchange architecture, which achieves award winning performance running on
-a single thread. You can read about their _disruptor_ pattern based architecture in [this interesting article](https://martinfowler.com/articles/lmax.html) by Martin Fowler.
-```
+## Backtesting
+Backtesting can be achieved by first making data available to a `BacktestEngine` either directly or via
+a higher level `BacktestNode` and `ParquetDataCatalog`, and then running the system across this data with nanosecond resolution.
-## Trading Live
-A `TradingNode` can host a fleet of trading strategies, with data able to be ingested from multiple data clients, and order execution handled through multiple execution clients.
+## Live trading
+A `TradingNode` can ingest data and events from multiple data and execution clients.
Live deployments can use both demo/paper trading accounts, or real accounts.
For live trading, extremely high performance (benchmarks pending) can be achieved running asynchronously on a single [event loop](https://docs.python.org/3/library/asyncio-eventloop.html),
especially leveraging the [uvloop](https://github.com/MagicStack/uvloop) implementation (available for Linux and macOS only).
-## Data Types
+## Domain model
+A rich trading domain model has been defined, which expresses value types such as
+`Price` and `Quantity`, up to more complex entities such as `Order` objects - which aggregate
+many events to determine state.
+
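+The following sketch shows working with these value types (assuming the constructors and string parsers available in `nautilus_trader.model.objects`):
+
+```python
+from nautilus_trader.model.objects import Price
+from nautilus_trader.model.objects import Quantity
+
+
+price = Price.from_str("1.00050")  # precision inferred from the decimal places (5)
+qty = Quantity.from_int(100_000)
+
+print(price.precision)  # 5
+```
+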
+### Data Types
The following market data types can be requested historically, and also subscribed to as live streams when available from a data publisher, and implemented in an integrations adapter.
- `OrderBookDelta`
- `OrderBookDeltas` (L1/L2/L3)
@@ -79,7 +75,7 @@ The following BarAggregation options are possible;
The price types and bar aggregations can be combined with step sizes >= 1 in any way through a `BarSpecification`.
This enables maximum flexibility and now allows alternative bars to be aggregated for live trading.
-## Account Types
+### Account Types
The following account types are available for both live and backtest environments;
- `Cash` single-currency (base currency)
@@ -87,7 +83,7 @@ The following account types are available for both live and backtest environment
- `Margin` single-currency (base currency)
- `Margin` multi-currency
-## Order Types
+### Order Types
The following order types are available (when possible on an exchange);
- `MARKET`
diff --git a/docs/user_guide/index.md b/docs/user_guide/index.md
index 115e74247879..aa69333e50a1 100644
--- a/docs/user_guide/index.md
+++ b/docs/user_guide/index.md
@@ -3,9 +3,9 @@
Welcome to the user guide for NautilusTrader!
Here you will find detailed documentation and examples explaining the different
-use cases of NautilusTrader.
+use cases for the platform.
-You can choose different subjects on the left, which are generally ordered from
+You can choose different subjects from the left nav menu, which are generally ordered from
highest to lowest level (although they are self-contained and can be read in any order).
Since this is a companion guide to the full [API Reference](../api_reference/index.md)
@@ -14,6 +14,9 @@ the source of truth if code or information in the user guide differs. We will ai
user guide in line with the API Reference on a best effort basis, and intend to introduce some doc tests
in the near future to assist with this.
+```{note}
+The terms 'NautilusTrader', 'Nautilus' and 'platform' are used interchangeably throughout the documentation.
+```
```{eval-rst}
.. toctree::
@@ -23,6 +26,7 @@ in the near future to assist with this.
:hidden:
core_concepts.md
+ architecture.md
backtest_example.md
loading_external_data.md
strategies.md
diff --git a/docs/user_guide/instruments.md b/docs/user_guide/instruments.md
index 1c96ea07ef5c..600ec88b0a4d 100644
--- a/docs/user_guide/instruments.md
+++ b/docs/user_guide/instruments.md
@@ -8,17 +8,17 @@ currently a number of subclasses representing a range of _asset classes_ and _as
- `CurrencyPair` (represents a Fiat FX or Cryptocurrency pair in a spot/cash market)
- `CryptoPerpetual` (Perpetual Futures Contract a.k.a. Perpetual Swap)
- `CryptoFuture` (Deliverable Futures Contract with Crypto assets as underlying, and for price quotes and settlement)
-- `BettingInstrument`
+- `BettingInstrument` (Sports, gaming, or other betting)
## Symbology
-All instruments should have a unique `InstrumentId`, which is made up of both the native symbol and venue ID, separated by a period.
+All instruments should have a unique `InstrumentId`, which is made up of the native symbol and the venue ID, separated by a period.
For example, on the FTX crypto exchange, the Ethereum Perpetual Futures Contract has the instrument ID `ETH-PERP.FTX`.
All native symbols _should_ be unique for a venue (this is not always the case e.g. Binance share native symbols between spot and futures markets),
and the `{symbol.venue}` combination _must_ be unique for a Nautilus system.
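+
+As a sketch (assuming the identifier types in `nautilus_trader.model.identifiers`), the instrument ID above could be constructed or parsed as follows:
+
+```python
+from nautilus_trader.model.identifiers import InstrumentId
+from nautilus_trader.model.identifiers import Symbol
+from nautilus_trader.model.identifiers import Venue
+
+
+# Construct from the native symbol and venue ID
+instrument_id = InstrumentId(Symbol("ETH-PERP"), Venue("FTX"))
+
+# Or parse from the combined string form
+assert instrument_id == InstrumentId.from_str("ETH-PERP.FTX")
+```
+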
```{warning}
-The correct instrument must be matched to a market dataset such as ticks or orderbook data for logically sound operation.
+The correct instrument must be matched to a market dataset such as ticks or order book data for logically sound operation.
An incorrectly specified instrument may truncate data or otherwise produce surprising results.
```
@@ -50,12 +50,12 @@ instrument = Instrument(...) # <-- provide all necessary parameters
See the full instrument [API Reference](../api_reference/model/instruments.md).
## Live trading
-All the live integration adapters have defined `InstrumentProvider` classes which work in an automated way
-under the hood to cache the latest instrument definitions from the exchange. Refer to a particular `Instrument`
-object by pass the matching `InstrumentId` to data and execution related methods and classes which require one.
+Live integration adapters have defined `InstrumentProvider` classes which work in an automated way to cache the
+latest instrument definitions for the exchange. Refer to a particular `Instrument`
+object by passing the matching `InstrumentId` to data and execution related methods, and classes which require one.
## Finding instruments
-Since the same strategy/actor classes can be used for both backtests and live trading, you can
+Since the same actor/strategy classes can be used for both backtest and live trading, you can
get instruments in exactly the same way through the central cache:
```python
@@ -79,7 +79,7 @@ self.subscribe_instruments(ftx)
```
When an update to the instrument(s) is received by the `DataEngine`, the object(s) will
-be passed to the strategy/actors `on_instrument()` method. A user can override this method with actions
+be passed to the actor's/strategy's `on_instrument()` method. A user can override this method with actions
to take upon receiving an instrument update:
```python
@@ -105,8 +105,8 @@ dependent and can include:
- `min_quantity` (minimum quantity for a single order)
- `max_notional` (maximum value of a single order)
- `min_notional` (minimum value of a single order)
-- `max_price` (maximum valid order price)
-- `min_price` (minimum valid order price)
+- `max_price` (maximum valid quote or order price)
+- `min_price` (minimum valid quote or order price)
```{note}
Most of these limits are checked by the Nautilus `RiskEngine`, otherwise exceeding
@@ -125,13 +125,13 @@ quantity = instrument.make_qty(150)
```
```{tip}
-This is the recommended method for creating valid prices and quantities, e.g. before
-passing them to the order factory to create an order.
+The above is the recommended method for creating valid prices and quantities,
+such as when passing them to the order factory to create an order.
```
## Margins and Fees
The current initial and maintenance margin requirements, as well as any trading
-fees are available from an instrument:
+fees are also available from an instrument:
- `margin_init` (initial/order margin rate)
- `margin_maint` (maintenance/position margin rate)
- `maker_fee` (the fee percentage applied to notional order values when providing liquidity)
@@ -139,6 +139,6 @@ fees are available from an instrument:
## Additional Info
The raw instrument definition as provided by the exchange (typically from JSON serialized data) is also
-included as a generic Python dictionary. This is to provide all possible information
+included as a generic Python dictionary. This is to retain all information
which is not necessarily part of the unified Nautilus API, and is available to the user
at runtime by calling the `.info` property.
diff --git a/docs/user_guide/loading_external_data.md b/docs/user_guide/loading_external_data.md
index cea4db3960ec..70c11120456a 100644
--- a/docs/user_guide/loading_external_data.md
+++ b/docs/user_guide/loading_external_data.md
@@ -1,32 +1,16 @@
----
-jupyter:
- jupytext:
- formats: ipynb,md
- text_representation:
- extension: .md
- format_name: markdown
- format_version: '1.3'
- jupytext_version: 1.13.5
- kernelspec:
- display_name: Python (nautilus_trader)
- language: python
- name: nautilus_trader
----
-
# Loading External Data
-This notebook runs through a example loading raw data (external to Nautilus) into the NautilusTrader `DataCatalog`, for use in backtesting.
+This notebook runs through an example of loading raw data (external to Nautilus) into the NautilusTrader `ParquetDataCatalog`, for use in backtesting.
## The DataCatalog
The data catalog is a central store for Nautilus data, persisted in the [Parquet](https://parquet.apache.org) file format.
We have chosen parquet as the storage format for the following reasons:
-- It performs much better than CSV/JSON/HDF5/etc in terms of compression (storage size) and read performance.
-- It does not require any separate running components (for example a database).
-- It is quick and simple for someone to get up and running with.
+- It performs much better than CSV/JSON/HDF5/etc in terms of compression ratio (storage size) and read performance
+- It does not require any separately running components (for example a database)
+- It is quick and simple to get up and running with
-
### Getting some sample raw data
Before we start the notebook - as a once off we need to download some sample data for loading.
@@ -35,11 +19,10 @@ For this notebook we will use FX data from `histdata.com`, simply go to https://
Once you have downloaded the data, set the variable `input_files` below to the path containing the
data. You can also use a glob to select multiple files, for example `"~/Downloads/HISTDATA_COM_ASCII_AUDUSD_*.zip"`.
-
```python
import fsspec
-fs = fsspec.filesystem('file')
+fs = fsspec.filesystem("file")
input_files = "~/Downloads/HISTDATA_COM_ASCII_AUDUSD_T202001.zip"
```
@@ -55,27 +38,25 @@ assert len(fs.glob(input_files)), f"Could not find files with {input_files=}"
We can load data from various sources into the data catalog using helper methods in the
`nautilus_trader.persistence.external.readers` module. The module contains methods for reading
-various data formats (csv, json, txt), minimising the amount of code required to get data loaded
+various data formats (CSV, JSON, text), minimising the amount of code required to get data loaded
correctly into the data catalog.
There are a handful of readers available, some notes on when to use which:
- `CSVReader` - use when your data is CSV (comma separated values) and has a header row. Each row of the data typically is one "entry" and is linked to the header.
-- `TextReader` - similar to CSVReader, but used when data may container multiple "entries" per line, for example JSON data with multiple orderbook or trade ticks in a single line. Typically does not have a header row and field names come from some definition elsewhere.
+- `TextReader` - similar to CSVReader, however used when data may contain multiple 'entries' per line. For example, JSON data with multiple order book or trade ticks in a single line. This data typically does not have a header row, and field names come from some external definition.
- `ParquetReader` - for parquet files, will read chunks of the data and process similar to `CSVReader`.
-Each of the `Reader` classes takes a `line_parser` or `block_parser` function, a user defined function to convert a line or block (chunk / multiple rows) of data into nautilus object(s) (for example `QuoteTick` or `TradeTick`).
-
+Each of the `Reader` classes takes a `line_parser` or `block_parser` function, a user defined function to convert a line or block (chunk / multiple rows) of data into Nautilus object(s) (for example `QuoteTick` or `TradeTick`).
### Writing the parser function
-The FX data from `histdata` is stored in csv/text format, with fields `timestamp, bid_price, ask_price`.
+The FX data from `histdata` is stored in CSV (plain text) format, with fields `timestamp, bid_price, ask_price`.
-For this example, we will use the `CSVReader` class, but we need to manually pass a header as the files do not contain one. The `CSVReader` has a couple of options, we'll be setting and `chunked=False` to process the data line-by-line and `as_dataframe=False` to process the data as a string rather than DataFrame. See the [API Reference](../api_reference/persistence.md) for more details.
+For this example, we will use the `CSVReader` class, where we need to manually pass a header (as the files do not contain one). The `CSVReader` has a couple of options: we'll be setting `chunked=False` to process the data line-by-line, and `as_dataframe=False` to process the data as a string rather than a DataFrame. See the [API Reference](../api_reference/persistence.md) for more details.
```python
import datetime
import pandas as pd
-from nautilus_trader.persistence.external.readers import CSVReader
from nautilus_trader.model.data.tick import QuoteTick
from nautilus_trader.model.objects import Price, Quantity
from nautilus_trader.core.datetime import dt_to_unix_nanos
@@ -94,9 +75,10 @@ def parser(data, instrument_id):
)
```
-### Creating a DataCatalog if one does not exist
+### Creating a new DataCatalog
-Now that we have our parser function, we instantiate a `DataCatalog` (passing in a directory where to store the data, by default we will just use the current directory):
+If a `ParquetDataCatalog` does not already exist, we can easily create one.
+Now that we have our parser function, we instantiate a `ParquetDataCatalog` (passing in a directory where the data will be stored; by default we will just use the current directory):
```python
import os, shutil
@@ -109,21 +91,21 @@ os.mkdir(CATALOG_PATH)
```
```python
-# Create an instance of the DataCatalog
-from nautilus_trader.persistence.catalog import DataCatalog
-catalog = DataCatalog(CATALOG_PATH)
+# Create an instance of the ParquetDataCatalog
+from nautilus_trader.persistence.catalog import ParquetDataCatalog
+catalog = ParquetDataCatalog(CATALOG_PATH)
```
### Instruments
-Nautilus needs to link market data to an `instrument_id`, and an `instrument_id` to an `Instrument`
-definition. This can be done at any time, but typically it makes sense to do it when you are loading
+Nautilus needs to link market data to an instrument ID, and an instrument ID to an `Instrument`
+definition. This can be done at any time, although typically it makes sense when you are loading
market data into the catalog.
-For our example, Nautilus contains some helpers for creating FX pairs, which we will use. If,
-however, you were adding data for financial or crypto markets, you could need to create (and add to
-the catalog) an instrument corresponding to that instrument_id. Definitions for various other
-instruments can be found in `nautilus_trader.model.instruments`.
+For our example, Nautilus contains some helpers for creating FX pairs, which we will use. If,
+however, you were adding data for financial or crypto markets, you would need to create (and add to
+the catalog) an instrument corresponding to that instrument ID. Definitions for other
+instruments (of various asset classes) can be found in `nautilus_trader.model.instruments`.
See [Instruments](./instruments.md) for more details on creating other instruments.
@@ -135,7 +117,7 @@ from nautilus_trader.backtest.data.providers import TestInstrumentProvider
instrument = TestInstrumentProvider.default_fx_ccy("EUR/USD")
```
-We can now add our new instrument to the `DataCatalog`:
+We can now add our new instrument to the `ParquetDataCatalog`:
```python
from nautilus_trader.persistence.external.core import write_objects
@@ -152,17 +134,16 @@ catalog.instruments()
### Loading the files
-One final note, our parsing function takes an `instrument_id` argument, as in our case with
-hist_data, the actual file does not contain information about the instrument, only the file name
+One final note: our parsing function takes an `instrument_id` argument, because in our case with
+hist_data the actual file does not contain information about the instrument; only the file name
does. In our instance, we would likely need to split our loading per FX pair, so we can determine
-which instrument we are loading. We will use a simple lambda function to pass our `instrument_id` to
+which instrument we are loading. We will use a simple lambda function to pass our instrument ID to
the parsing function.
We can now use the `process_files` function to load one or more files using our `Reader` class and
`parsing` function as shown below. This function will loop over many files, breaking up
large files into chunks (protecting us from out-of-memory errors when reading large files) and saving
-the results to the `DataCatalog`.
-
+the results to the `ParquetDataCatalog`.
For the hist_data, it should take less than a minute or two to load each FX file (a progress bar
will appear below):
@@ -170,6 +151,8 @@ will appear below):
```python
from nautilus_trader.persistence.external.core import process_files
+from nautilus_trader.persistence.external.readers import CSVReader
+
process_files(
glob_path=input_files,
@@ -183,15 +166,18 @@ process_files(
)
```
-## Using the DataCatalog
+## Using the ParquetDataCatalog
Once data has been loaded into the catalog, the `catalog` instance can be used for loading data into
the backtest engine, or simply for research purposes. It contains various methods to pull data from
-the catalog, like `quote_ticks` (show below):
+the catalog, such as `quote_ticks`, for example:
```python
-start = dt_to_unix_nanos(pd.Timestamp('2020-01-01', tz='UTC'))
-end = dt_to_unix_nanos(pd.Timestamp('2020-01-02', tz='UTC'))
+import pandas as pd
+from nautilus_trader.core.datetime import dt_to_unix_nanos
+
+start = dt_to_unix_nanos(pd.Timestamp("2020-01-01", tz="UTC"))
+end = dt_to_unix_nanos(pd.Timestamp("2020-01-02", tz="UTC"))
catalog.quote_ticks(start=start, end=end)
```
diff --git a/docs/user_guide/orders.md b/docs/user_guide/orders.md
index 7d45d1960a42..a378ac511769 100644
--- a/docs/user_guide/orders.md
+++ b/docs/user_guide/orders.md
@@ -1,7 +1,7 @@
# Orders
This guide provides more details about the available order types for the platform, along with
-the execution instructions available for each.
+the execution instructions supported for each.
Orders are one of the fundamental building blocks of any algorithmic trading strategy.
NautilusTrader has unified a large set of order types and execution instructions
@@ -13,7 +13,7 @@ order execution and management, which allows essentially any type of trading str
The two main types of orders are _Market_ orders and _Limit_ orders. All the other order
types are built from these two fundamental types; in terms of liquidity provision they
are exact opposites. _Market_ orders demand liquidity and require immediate trading at the best
-price available. Conversely, _Limit_ orders provide liquidity, they act as standing orders in a limit order book
+price available. Conversely, _Limit_ orders provide liquidity; they act as standing orders in a public limit order book
at a specified limit price.
The core order types available for the platform are (using the enum values):
@@ -41,62 +41,62 @@ how an order will be processed and executed. The following is a brief
summary of the different execution instructions available.
### Time In Force
-The orders time in force is an instruction to indicate how long the order will remain open
-or active before being filled or the remaining quantity canceled.
+The order's time in force is an instruction to specify how long the order will remain open
+or active before any remaining quantity is canceled.
-- `GTC` (Good 'til Canceled): The order remains in force until canceled by the trader or the exchange.
-- `IOC` (Immediate or Cancel / Fill **and** Kill): The order will execute immediately with any portion of the order quantity which cannot be executed being canceled.
-- `FOK` (Fill **or** Kill): The order will execute immediately, and in full, or not at all.
-- `GTD` (Good 'til Date): The order remains in force until reaching the specified expiration date and time.
-- `DAY` (Good for session/day): The order remains in force until the end of the current trading session.
-- `AT_THE_OPEN` (OPG): The order is only in force at the trading session open.
-- `AT_THE_CLOSE`: The order is only in force at the trading session close.
+- `GTC` (Good 'til Canceled) - The order remains in force until canceled by the trader or the exchange
+- `IOC` (Immediate or Cancel / Fill **and** Kill) - The order will execute immediately with any portion of the order quantity which cannot be executed being canceled
+- `FOK` (Fill **or** Kill) - The order will execute immediately, and in full, or not at all
+- `GTD` (Good 'til Date) - The order remains in force until reaching the specified expiration date and time
+- `DAY` (Good for session/day) - The order remains in force until the end of the current trading session
+- `AT_THE_OPEN` (OPG) - The order is only in force at the trading session open
+- `AT_THE_CLOSE` - The order is only in force at the trading session close
### Expire Time
This instruction is to be used in conjunction with the `GTD` time in force to specify the time
-at which the order will expire and be removed from the exchanges order book or order management system.
+at which the order will expire and be removed from the exchange's order book (or order management system).
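+
+As a rough sketch (the parameter names shown, such as `expire_time`, are assumptions for
+illustration and may differ slightly from the current API), a `GTD` order with an expire time
+could be created from within a strategy as follows:
+
+```python
+import pandas as pd
+
+from nautilus_trader.model.enums import OrderSide, TimeInForce
+from nautilus_trader.model.identifiers import InstrumentId
+from nautilus_trader.model.objects import Price, Quantity
+
+# Inside a `Strategy` method, where `self.order_factory` is available:
+order = self.order_factory.limit(
+    instrument_id=InstrumentId.from_str("ETH-PERP.FTX"),
+    order_side=OrderSide.BUY,
+    quantity=Quantity.from_int(10),
+    price=Price.from_str("4000.00"),
+    time_in_force=TimeInForce.GTD,
+    expire_time=pd.Timestamp("2022-06-30 16:00:00", tz="UTC"),  # assumed parameter name
+)
+```
+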
### Post Only
-An order which is marked as `post_only` will only ever participate in providing liquidity to the central
+An order which is marked as `post_only` will only ever participate in providing liquidity to the
limit order book, and never initiating a trade which takes liquidity as an aggressor. This option is
-important for market makers or traders seeking to restrict the order to a liquidity _maker_ fee tier.
+important for market makers, or traders seeking to restrict the order to a liquidity _maker_ fee tier.
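+
+For example, a maker-only limit order could be created as below (a minimal sketch assuming
+the same order factory parameters used in the _Limit_ example later in this guide):
+
+```python
+from nautilus_trader.model.enums import OrderSide
+from nautilus_trader.model.identifiers import InstrumentId
+from nautilus_trader.model.objects import Price, Quantity
+
+# Inside a `Strategy` method: this order should only ever provide liquidity
+order = self.order_factory.limit(
+    instrument_id=InstrumentId.from_str("ETH-PERP.FTX"),
+    order_side=OrderSide.SELL,
+    quantity=Quantity.from_int(20),
+    price=Price.from_str("5000.00"),
+    post_only=True,
+)
+```
+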
### Reduce Only
-An order which is marked as `reduce_only` will only ever reduce an existing position on an instrument, and
-never open a new position if already flat. The exact behaviour of this instruction can vary between
+An order which is set as `reduce_only` will only ever reduce an existing position on an instrument, and
+never open a new position (if already flat). The exact behaviour of this instruction can vary between
exchanges; however, the behaviour of the Nautilus `SimulatedExchange` is typical of a live exchange.
-- Order will be cancelled if the associated position is closed / becomes flat.
-- Order quantity will be reduced as the associated positions size reduces.
+- Order will be cancelled if the associated position is closed (becomes flat)
+- Order quantity will be reduced as the associated position's size reduces
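+
+As an illustrative sketch (assuming the factory `market` method accepts a `reduce_only`
+parameter), an order intended only to flatten or reduce an existing LONG position might be:
+
+```python
+from nautilus_trader.model.enums import OrderSide
+from nautilus_trader.model.identifiers import InstrumentId
+from nautilus_trader.model.objects import Quantity
+
+# Inside a `Strategy` method: will only ever reduce the position, never flip it SHORT
+order = self.order_factory.market(
+    instrument_id=InstrumentId.from_str("ETH-PERP.FTX"),
+    order_side=OrderSide.SELL,
+    quantity=Quantity.from_int(20),
+    reduce_only=True,  # assumed parameter name
+)
+```
+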
### Display Quantity
The `display_qty` specifies the portion of a _Limit_ order which is displayed on the public limit order book.
These are also known as iceberg orders as there is a visible portion to be displayed, with more quantity which is hidden.
-Specifying a display quantity of zero is also equivalent to marking an order as `hidden`.
+Specifying a display quantity of zero is also equivalent to setting an order as `hidden`.
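+
+A brief sketch of an iceberg order follows (assuming a `display_qty` parameter on the limit
+factory method, shown here for illustration only):
+
+```python
+from nautilus_trader.model.enums import OrderSide
+from nautilus_trader.model.identifiers import InstrumentId
+from nautilus_trader.model.objects import Price, Quantity
+
+# Inside a `Strategy` method: only 10 of the 100 total quantity is displayed on the book
+order = self.order_factory.limit(
+    instrument_id=InstrumentId.from_str("ETH-PERP.FTX"),
+    order_side=OrderSide.SELL,
+    quantity=Quantity.from_int(100),
+    price=Price.from_str("5000.00"),
+    display_qty=Quantity.from_int(10),  # assumed parameter name
+)
+```
+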
### Trigger Type
Also known as [trigger method](https://guides.interactivebrokers.com/tws/usersguidebook/configuretws/modify_the_stop_trigger_method.htm)
which is applicable to conditional trigger orders, specifying the method of triggering the stop price.
-- `DEFAULT`: The default trigger type for the exchange (typically `LAST` or `BID_ASK`).
-- `LAST`: The trigger price will be based on the last traded price.
-- `BID_ASK`: The trigger price will be based on the `BID` for buy orders and `ASK` for sell orders.
-- `DOUBLE_LAST`: The trigger price will be based on the last two consecutive `LAST` prices.
-- `DOUBLE_BID_ASK`: The trigger price will be based on the last two consecutive `BID` or `ASK` prices as applicable.
-- `LAST_OR_BID_ASK`: The trigger price will be based on the `LAST` or `BID`/`ASK`.
-- `MID_POINT`: The trigger price will be based on the mid-point between the `BID` and `ASK`.
-- `MARK`: The trigger price will be based on the exchanges mark price for the instrument.
-- `INDEX`: The trigger price will be based on the exchanges index price for the instrument.
+- `DEFAULT` - The default trigger type for the exchange (typically `LAST` or `BID_ASK`)
+- `LAST` - The trigger price will be based on the last traded price
+- `BID_ASK` - The trigger price will be based on the `BID` for buy orders and `ASK` for sell orders
+- `DOUBLE_LAST` - The trigger price will be based on the last two consecutive `LAST` prices
+- `DOUBLE_BID_ASK` - The trigger price will be based on the last two consecutive `BID` or `ASK` prices as applicable
+- `LAST_OR_BID_ASK` - The trigger price will be based on the `LAST` or `BID`/`ASK`
+- `MID_POINT` - The trigger price will be based on the mid-point between the `BID` and `ASK`
+- `MARK` - The trigger price will be based on the exchange's mark price for the instrument
+- `INDEX` - The trigger price will be based on the exchange's index price for the instrument
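+
+For example, a stop order triggered on the last traded price could be sketched as follows
+(the `trigger_price` and `trigger_type` parameter names are assumptions for illustration):
+
+```python
+from nautilus_trader.model.enums import OrderSide, TriggerType
+from nautilus_trader.model.identifiers import InstrumentId
+from nautilus_trader.model.objects import Price, Quantity
+
+# Inside a `Strategy` method: a stop-loss for a LONG position, triggered on `LAST` prices
+order = self.order_factory.stop_market(
+    instrument_id=InstrumentId.from_str("ETH-PERP.FTX"),
+    order_side=OrderSide.SELL,
+    quantity=Quantity.from_int(20),
+    trigger_price=Price.from_str("4500.00"),
+    trigger_type=TriggerType.LAST,
+)
+```
+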
### Trigger Offset Type
Applicable to conditional trailing-stop trigger orders, specifies the method of triggering modification
of the stop price based on the offset from the 'market' (bid, ask or last price as applicable).
-- `DEFAULT`: The default offset type for the exchange (typically `PRICE`).
-- `PRICE`: The offset is based on a price difference.
-- `BASIS_POINTS`: The offset is based on a price percentage difference expressed in basis points (100bp = 1%).
-- `TICKS`: The offset is based on a number of ticks.
-- `PRICE_TIER`: The offset is based on an exchange specific price tier.
+- `DEFAULT` - The default offset type for the exchange (typically `PRICE`)
+- `PRICE` - The offset is based on a price difference
+- `BASIS_POINTS` - The offset is based on a price percentage difference expressed in basis points (100bp = 1%)
+- `TICKS` - The offset is based on a number of ticks
+- `PRICE_TIER` - The offset is based on an exchange-specific price tier
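+
+A minimal sketch of a trailing stop using a basis-point offset is shown below (assuming a
+`trailing_stop_market` factory method with `trailing_offset` and `trailing_offset_type`
+parameters; these names are illustrative only):
+
+```python
+from decimal import Decimal
+
+from nautilus_trader.model.enums import OrderSide, TrailingOffsetType, TriggerType
+from nautilus_trader.model.identifiers import InstrumentId
+from nautilus_trader.model.objects import Quantity
+
+# Inside a `Strategy` method: trail 100bp (1%) behind the last traded price
+order = self.order_factory.trailing_stop_market(  # assumed factory method
+    instrument_id=InstrumentId.from_str("ETH-PERP.FTX"),
+    order_side=OrderSide.SELL,
+    quantity=Quantity.from_int(20),
+    trailing_offset=Decimal("100"),
+    trailing_offset_type=TrailingOffsetType.BASIS_POINTS,
+    trigger_type=TriggerType.LAST,
+)
+```
+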
### Contingency Orders
More advanced relationships can be specified between orders such as assigning child order(s) which will only
@@ -148,6 +148,7 @@ execute at that price (or better).
In the following example we create a _Limit_ order on the FTX Crypto exchange to SELL 20 ETH-PERP Perpetual Futures
contracts at a limit price of 5000 USD, as a market maker.
+
```python
order: LimitOrder = self.order_factory.limit(
instrument_id=InstrumentId.from_str("ETH-PERP.FTX"),
@@ -165,7 +166,7 @@ order: LimitOrder = self.order_factory.limit(
[API Reference](https://docs.nautilustrader.io/api_reference/model/orders.html#module-nautilus_trader.model.orders.limit)
### Stop-Market
-A _Stop-Market_ order is a conditional order which once triggered will immediately
+A _Stop-Market_ order is a conditional order which, once triggered, will immediately
place a _Market_ order. This order type is often used as a stop-loss to limit losses, either
as a SELL order against LONG positions, or as a BUY order against SHORT positions.
@@ -237,7 +238,7 @@ order: MarketToLimitOrder = self.order_factory.market_to_limit(
### Market-If-Touched
A _Market-If-Touched_ order is a conditional order which once triggered will immediately
place a _Market_ order. This order type is often used to enter a new position on a stop price in the market orders direction,
-or to take profits from an existing position, either as a SELL order against LONG positions,
+or to take profits for an existing position, either as a SELL order against LONG positions,
or as a BUY order against SHORT positions.
In the following example we create a _Market-If-Touched_ order on the Binance Futures exchange
diff --git a/docs/user_guide/strategies.md b/docs/user_guide/strategies.md
index 8775d8b4a49e..2f936044094b 100644
--- a/docs/user_guide/strategies.md
+++ b/docs/user_guide/strategies.md
@@ -45,6 +45,8 @@ Here is an example configuration:
```python
from decimal import Decimal
from nautilus_trader.config import StrategyConfig
+from nautilus_trader.model.identifiers import InstrumentId
+from nautilus_trader.trading.strategy import Strategy
class MyStrategyConfig(StrategyConfig):
@@ -55,24 +57,29 @@ class MyStrategyConfig(StrategyConfig):
trade_size: Decimal
order_id_tag: str
-config = MyStrategy(
- instrument_id="ETH-PERP.FTX",
- bar_type="ETH-PERP.FTX-1000-TICK[LAST]-INTERNAL",
- trade_size=Decimal(1),
- order_id_tag="001",
-)
-```
-
-Once a configuration is defined and instantiated, we can pass this to our trading strategy. Here we simply add an instrument ID
-as a string, to parameterize the instrument the strategy will trade.
+# Here we simply add an instrument ID as a string, to
+# parameterize the instrument the strategy will trade.
-```python
class MyStrategy(Strategy):
def __init__(self, config: MyStrategyConfig):
super().__init__(config)
# Configuration
self.instrument_id = InstrumentId.from_str(config.instrument_id)
+
+
+# Once a configuration is defined and instantiated, we can pass this to our
+# trading strategy to initialize.
+
+config = MyStrategyConfig(
+ instrument_id="ETH-PERP.FTX",
+ bar_type="ETH-PERP.FTX-1000-TICK[LAST]-INTERNAL",
+ trade_size=Decimal(1),
+ order_id_tag="001",
+)
+
+strategy = MyStrategy(config=config)
+
```
```{note}
diff --git a/examples/live/betfair.py b/examples/live/betfair.py
index 9020c83b1990..4c94254a67ce 100644
--- a/examples/live/betfair.py
+++ b/examples/live/betfair.py
@@ -112,6 +112,7 @@ async def main(market_id: str):
try:
node.start()
+ await asyncio.gather(*asyncio.all_tasks())
except Exception as ex:
print(ex)
print(traceback.format_exc())
@@ -123,4 +124,4 @@ async def main(market_id: str):
# Update the market ID with something coming up in `Next Races` from
# https://www.betfair.com.au/exchange/plus/
# The market ID will appear in the browser query string.
- asyncio.run(main(market_id="1.199513161"))
+ asyncio.run(main(market_id="1.200150918"))
diff --git a/examples/live/interactive_brokers_example.py b/examples/live/interactive_brokers_example.py
index 605ee37de6ef..7d23e6b11f99 100644
--- a/examples/live/interactive_brokers_example.py
+++ b/examples/live/interactive_brokers_example.py
@@ -43,9 +43,9 @@
filters=tuple(
{
"secType": "STK",
- "symbol": "AAC",
- "exchange": "NYSE",
- "currency": "USD",
+ "symbol": "SEC0",
+ "exchange": "BVME.ETF",
+ # "currency": "USD",
}.items()
),
),
diff --git a/examples/notebooks/backtest_example.ipynb b/examples/notebooks/backtest_example.ipynb
index 2c56eeaa907f..7e9b40c79877 100644
--- a/examples/notebooks/backtest_example.ipynb
+++ b/examples/notebooks/backtest_example.ipynb
@@ -17,7 +17,7 @@
"from nautilus_trader.config import ImportableStrategyConfig\n",
"from nautilus_trader.examples.strategies.ema_cross import EMACross, EMACrossConfig\n",
"from nautilus_trader.model.data.tick import QuoteTick\n",
- "from nautilus_trader.persistence.catalog import DataCatalog"
+ "from nautilus_trader.persistence.catalog import ParquetDataCatalog"
]
},
{
@@ -27,7 +27,7 @@
"metadata": {},
"outputs": [],
"source": [
- "catalog = DataCatalog.from_env()"
+ "catalog = ParquetDataCatalog.from_env()"
]
},
{
@@ -55,7 +55,7 @@
"\n",
"data_config=[\n",
" BacktestDataConfig(\n",
- " catalog_path=str(DataCatalog.from_env().path),\n",
+ " catalog_path=str(ParquetDataCatalog.from_env().path),\n",
" data_cls=QuoteTick,\n",
" instrument_id=instrument.id.value,\n",
" start_time=1580398089820000000,\n",
diff --git a/nautilus_core/Cargo.lock b/nautilus_core/Cargo.lock
index b5553e2d7f2c..66821b3d6029 100644
--- a/nautilus_core/Cargo.lock
+++ b/nautilus_core/Cargo.lock
@@ -152,9 +152,9 @@ dependencies = [
[[package]]
name = "crossbeam-channel"
-version = "0.5.4"
+version = "0.5.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "5aaa7bd5fb665c6864b5f963dd9097905c54125909c7aa94c9e18507cdbe6c53"
+checksum = "4c02a4d71819009c192cf4872265391563fd6a84c81ff2c0f2a7026ca4c1d85c"
dependencies = [
"cfg-if",
"crossbeam-utils",
@@ -173,26 +173,26 @@ dependencies = [
[[package]]
name = "crossbeam-epoch"
-version = "0.9.8"
+version = "0.9.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "1145cf131a2c6ba0615079ab6a638f7e1973ac9c2634fcbeaaad6114246efe8c"
+checksum = "07db9d94cbd326813772c968ccd25999e5f8ae22f4f8d1b11effa37ef6ce281d"
dependencies = [
"autocfg",
"cfg-if",
"crossbeam-utils",
- "lazy_static",
"memoffset",
+ "once_cell",
"scopeguard",
]
[[package]]
name = "crossbeam-utils"
-version = "0.8.8"
+version = "0.8.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "0bf124c720b7686e3c2663cf54062ab0f68a88af2fb6a030e87e30bf721fcb38"
+checksum = "7d82ee10ce34d7bc12c2122495e7593a9c41347ecdd64185af4ecf72cb1a7f83"
dependencies = [
"cfg-if",
- "lazy_static",
+ "once_cell",
]
[[package]]
@@ -219,9 +219,9 @@ dependencies = [
[[package]]
name = "either"
-version = "1.6.1"
+version = "1.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "e78d4f1cc4ae33bbfc157ed5d5a5ef3bc29227303d595861deb238fcec4e9457"
+checksum = "3f107b87b6afc2a64fd13cac55fe06d6c8859f12d4b14cbcdd2c67d0976781be"
[[package]]
name = "fastrand"
@@ -234,13 +234,13 @@ dependencies = [
[[package]]
name = "getrandom"
-version = "0.2.6"
+version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "9be70c98951c83b8d2f8f60d7065fa6d5146873094452a1008da8c2f1e4205ad"
+checksum = "4eb1a864a501629691edf6c15a593b7a51eebaa1e8468e9ddc623de7c9b58ec6"
dependencies = [
"cfg-if",
"libc",
- "wasi",
+ "wasi 0.11.0+wasi-snapshot-preview1",
]
[[package]]
@@ -251,9 +251,9 @@ checksum = "eabb4a44450da02c90444cf74558da904edde8fb4e9035a9a6a4e15445af0bd7"
[[package]]
name = "hashbrown"
-version = "0.11.2"
+version = "0.12.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "ab5ef0d4909ef3724cc8cce6ccc8572c5c817592e9285f5464f8e86f8bd3726e"
+checksum = "db0d4cf898abf0081f964436dc980e96670a0f36863e4b83aaacdb65c9d7ccc3"
[[package]]
name = "heck"
@@ -281,9 +281,9 @@ checksum = "71a816c97c42258aa5834d07590b718b4c9a598944cd39a52dc25b351185d678"
[[package]]
name = "indexmap"
-version = "1.8.2"
+version = "1.9.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "e6012d540c5baa3589337a98ce73408de9b5a25ec9fc2c6fd6be8f0d39e0ca5a"
+checksum = "10a35a97730320ffe8e2d410b5d3b69279b98d2c14bdb8b70ea89ecf7888d41e"
dependencies = [
"autocfg",
"hashbrown",
@@ -327,9 +327,9 @@ checksum = "112c678d4050afce233f4f2852bb2eb519230b3cf12f33585275537d7e41578d"
[[package]]
name = "js-sys"
-version = "0.3.57"
+version = "0.3.58"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "671a26f820db17c2a2750743f1dd03bafd15b98c9f30c7c2628c024c05d73397"
+checksum = "c3fac17f7123a73ca62df411b1bf727ccc805daa070338fda671c86dac1bdc27"
dependencies = [
"wasm-bindgen",
]
@@ -396,6 +396,7 @@ name = "nautilus_core"
version = "0.1.0"
dependencies = [
"cbindgen",
+ "chrono",
"lazy_static",
"pyo3",
"uuid",
@@ -507,9 +508,9 @@ dependencies = [
[[package]]
name = "proc-macro2"
-version = "1.0.39"
+version = "1.0.40"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "c54b25569025b7fc9651de43004ae593a75ad88543b17178aa5e1b9c4f15f56f"
+checksum = "dd96a1e8ed2596c337f8eae5f24924ec83f5ad5ab21ea8e455d3566c69fbcaf7"
dependencies = [
"unicode-ident",
]
@@ -575,9 +576,9 @@ dependencies = [
[[package]]
name = "quote"
-version = "1.0.18"
+version = "1.0.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a1feb54ed693b93a84e14094943b84b7c4eae204c512b7ccb95ab0c66d278ad1"
+checksum = "3bcdf212e9776fbcb2d23ab029360416bb1706b1aea2d1a5ba002727cbcab804"
dependencies = [
"proc-macro2",
]
@@ -690,9 +691,9 @@ checksum = "d29ab0c6d3fc0ee92fe66e2d99f700eab17a8d57d1c1d3b748380fb20baa78cd"
[[package]]
name = "semver"
-version = "1.0.9"
+version = "1.0.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "8cb243bdfdb5936c8dc3c45762a19d12ab4550cdc753bc247637d4ec35a040fd"
+checksum = "3d92beeab217753479be2f74e54187a6aed4c125ff0703a866c3147a02f0c6dd"
[[package]]
name = "serde"
@@ -726,9 +727,9 @@ dependencies = [
[[package]]
name = "serde_json"
-version = "1.0.81"
+version = "1.0.82"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "9b7ce2b32a1aed03c558dc61a5cd328f15aff2dbc17daad8fb8af04d2100e15c"
+checksum = "82c2c1fdcd807d1098552c5b9a36e425e42e9fbd7c6a37a8425f390f781f7fa7"
dependencies = [
"itoa 1.0.2",
"ryu",
@@ -737,9 +738,9 @@ dependencies = [
[[package]]
name = "smallvec"
-version = "1.8.0"
+version = "1.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "f2dd574626839106c320a323308629dcb1acfc96e32a8cba364ddc61ac23ee83"
+checksum = "2fd0db749597d91ff862fd1d55ea87f7855a744a8425a64695b6fca237d1dad1"
[[package]]
name = "strsim"
@@ -749,9 +750,9 @@ checksum = "8ea5119cdb4c55b55d432abb513a0429384878c15dde60cc77b1c99de1a95a6a"
[[package]]
name = "syn"
-version = "1.0.96"
+version = "1.0.98"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "0748dd251e24453cb8717f0354206b91557e4ec8703673a4b30208f2abaf1ebf"
+checksum = "c50aef8a904de4c23c788f104b7dddc7d6f79c647c7c8ce4cc8f73eb0ca773dd"
dependencies = [
"proc-macro2",
"quote",
@@ -794,7 +795,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6db9e6914ab8b1ae1c260a4ae7a49b6c5611b40328a735b21862567685e73255"
dependencies = [
"libc",
- "wasi",
+ "wasi 0.10.0+wasi-snapshot-preview1",
"winapi",
]
@@ -819,9 +820,9 @@ dependencies = [
[[package]]
name = "unicode-ident"
-version = "1.0.0"
+version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d22af068fba1eb5edcb4aea19d382b2a3deb4c8f9d475c589b6ada9e0fd493ee"
+checksum = "5bd2fe26506023ed7b5e1e315add59d6f584c621d037f9368fea9cfb988f368c"
[[package]]
name = "unicode-segmentation"
@@ -873,11 +874,17 @@ version = "0.10.0+wasi-snapshot-preview1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1a143597ca7c7793eff794def352d41792a93c481eb1042423ff7ff72ba2c31f"
+[[package]]
+name = "wasi"
+version = "0.11.0+wasi-snapshot-preview1"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "9c8d87e72b64a3b4db28d11ce29237c246188f4f51057d65a7eab63b7987e423"
+
[[package]]
name = "wasm-bindgen"
-version = "0.2.80"
+version = "0.2.81"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "27370197c907c55e3f1a9fbe26f44e937fe6451368324e009cba39e139dc08ad"
+checksum = "7c53b543413a17a202f4be280a7e5c62a1c69345f5de525ee64f8cfdbc954994"
dependencies = [
"cfg-if",
"wasm-bindgen-macro",
@@ -885,9 +892,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen-backend"
-version = "0.2.80"
+version = "0.2.81"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "53e04185bfa3a779273da532f5025e33398409573f348985af9a1cbf3774d3f4"
+checksum = "5491a68ab4500fa6b4d726bd67408630c3dbe9c4fe7bda16d5c82a1fd8c7340a"
dependencies = [
"bumpalo",
"lazy_static",
@@ -900,9 +907,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen-macro"
-version = "0.2.80"
+version = "0.2.81"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "17cae7ff784d7e83a2fe7611cfe766ecf034111b49deb850a3dc7699c08251f5"
+checksum = "c441e177922bc58f1e12c022624b6216378e5febc2f0533e41ba443d505b80aa"
dependencies = [
"quote",
"wasm-bindgen-macro-support",
@@ -910,9 +917,9 @@ dependencies = [
[[package]]
name = "wasm-bindgen-macro-support"
-version = "0.2.80"
+version = "0.2.81"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "99ec0dc7a4756fffc231aab1b9f2f578d23cd391390ab27f952ae0c9b3ece20b"
+checksum = "7d94ac45fcf608c1f45ef53e748d35660f168490c10b23704c7779ab8f5c3048"
dependencies = [
"proc-macro2",
"quote",
@@ -923,15 +930,15 @@ dependencies = [
[[package]]
name = "wasm-bindgen-shared"
-version = "0.2.80"
+version = "0.2.81"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d554b7f530dee5964d9a9468d95c1f8b8acae4f282807e7d27d4b03099a46744"
+checksum = "6a89911bd99e5f3659ec4acf9c4d93b0a90fe4a2a11f15328472058edc5261be"
[[package]]
name = "web-sys"
-version = "0.3.57"
+version = "0.3.58"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7b17e741662c70c8bd24ac5c5b18de314a2c26c32bf8346ee1e6f53de919c283"
+checksum = "2fed94beee57daf8dd7d51f2b15dc2bcde92d7a72304cdf662a4371008b71b90"
dependencies = [
"js-sys",
"wasm-bindgen",
diff --git a/nautilus_core/common/cbindgen.toml b/nautilus_core/common/cbindgen.toml
index 85c83ef94bab..b60dd81694ac 100644
--- a/nautilus_core/common/cbindgen.toml
+++ b/nautilus_core/common/cbindgen.toml
@@ -8,6 +8,7 @@ tab_width = 4
[export.rename]
"Timestamp" = "uint64_t"
+"Timedelta" = "int64_t"
"UUID4" = "UUID4_t"
"Logger" = "Logger_t"
-"TraderId" = "TraderId_t"
\ No newline at end of file
+"TraderId" = "TraderId_t"
diff --git a/nautilus_core/common/cbindgen_cython.toml b/nautilus_core/common/cbindgen_cython.toml
index 8cdb8365303a..a92c5d86be7a 100644
--- a/nautilus_core/common/cbindgen_cython.toml
+++ b/nautilus_core/common/cbindgen_cython.toml
@@ -12,6 +12,7 @@ header = '"../includes/common.h"'
"libc.stdint" = [
"uint8_t",
"uint64_t",
+ "int64_t",
]
"cpython.object" = [
@@ -24,5 +25,6 @@ header = '"../includes/common.h"'
[export.rename]
"Timestamp" = "uint64_t"
+"Timedelta" = "int64_t"
"UUID4" = "UUID4_t"
-"Logger" = "Logger_t"
\ No newline at end of file
+"Logger" = "Logger_t"
diff --git a/nautilus_core/common/src/clock.rs b/nautilus_core/common/src/clock.rs
index d4b559232b66..f13c7e98ab41 100644
--- a/nautilus_core/common/src/clock.rs
+++ b/nautilus_core/common/src/clock.rs
@@ -13,66 +13,119 @@
// limitations under the License.
// -------------------------------------------------------------------------------------------------
-use crate::timer::TimeEvent;
+use crate::timer::TimeEventHandler;
use std::collections::HashMap;
+use std::ops::{Deref, DerefMut};
-use super::timer::{NameID, TestTimer};
+use super::timer::TestTimer;
use nautilus_core::datetime::{nanos_to_millis, nanos_to_secs};
+use nautilus_core::string::pystr_to_string;
use nautilus_core::time::{Timedelta, Timestamp};
use pyo3::prelude::*;
+use pyo3::types::PyList;
+use pyo3::AsPyPointer;
-#[allow(dead_code)]
-struct TestClock {
- time_ns: Timestamp,
- next_time_ns: Timestamp,
- timers: HashMap<NameID, TestTimer>,
- handlers: HashMap<NameID, PyObject>,
- default_handler: PyObject,
+trait Clock {
+ /// If the clock is a test clock.
+ fn is_test_clock(&self) -> bool;
+
+ /// If the clock has a default handler registered.
+ fn is_default_handler_registered(&self) -> bool;
+
+ /// Register a default event handler for the clock. If a [Timer]
+ /// does not have an event handler, then this handler is used.
+ fn register_default_handler(&mut self, handler: PyObject);
+
+ /// Set a [Timer] to alert at a particular time. Optional
+ /// callback gets used to handle generated events.
+ fn set_time_alert_ns(
+ &mut self,
+ // Both representations of name
+ name: (String, PyObject),
+ alert_time_ns: Timestamp,
+ callback: Option<PyObject>,
+ );
+
+ /// Set a [Timer] to start alerting at every interval
+ /// between start and stop time. Optional callback gets
+ /// used to handle generated events.
+ fn set_timer_ns(
+ &mut self,
+ // Both representations of name
+ name: (String, PyObject),
+ interval_ns: Timedelta,
+ start_time_ns: Timestamp,
+ stop_time_ns: Timestamp,
+ callback: Option<PyObject>,
+ );
+}
+
+#[pyclass]
+pub struct TestClock {
+ pub time_ns: Timestamp,
+ pub next_time_ns: Timestamp,
+ pub timers: HashMap<String, TestTimer>,
+ pub handlers: HashMap<String, PyObject>,
+ pub default_handler: Option<PyObject>,
}
-#[allow(dead_code)]
impl TestClock {
- fn new(initial_ns: Timestamp, default_handler: PyObject) -> TestClock {
+ #[inline]
+ fn new() -> TestClock {
TestClock {
- time_ns: initial_ns,
+ time_ns: 0,
next_time_ns: 0,
timers: HashMap::new(),
handlers: HashMap::new(),
- default_handler,
+ default_handler: None,
}
}
+ #[allow(dead_code)] // Temporary
+ #[inline]
fn timestamp(&self) -> f64 {
nanos_to_secs(self.time_ns as f64)
}
+ #[allow(dead_code)] // Temporary
+ #[inline]
fn timestamp_ms(&self) -> u64 {
nanos_to_millis(self.time_ns)
}
+ #[allow(dead_code)] // Temporary
fn timestamp_ns(&self) -> u64 {
self.time_ns
}
+ #[allow(dead_code)] // Temporary
fn set_time(&mut self, to_time_ns: Timestamp) {
self.time_ns = to_time_ns
}
- fn advance_time(&mut self, to_time_ns: Timestamp) -> Vec<(Vec<TimeEvent>, &PyObject)> {
+ pub fn timer_names(self) -> PyObject {
+ let timer_names = self.timers.keys().clone();
+ Python::with_gil(|py| PyList::new(py, timer_names).into())
+ }
+
+ #[inline]
+ pub fn advance_time(&mut self, to_time_ns: Timestamp) -> Vec<TimeEventHandler> {
// Time should increase monotonically
assert!(
to_time_ns >= self.time_ns,
- "Time to advance to should be greater than current clock time"
+ "`to_time_ns` was < `self._time_ns`"
);
let events = self
.timers
.iter_mut()
.filter(|(_, timer)| !timer.is_expired)
- .map(|(name_id, timer)| {
- let handler = self.handlers.get(name_id).unwrap_or(&self.default_handler);
- let events: Vec<TimeEvent> = timer.advance(to_time_ns).collect();
- (events, handler)
+ .flat_map(|(name_id, timer)| {
+ let handler = &self.handlers[name_id];
+ timer.advance(to_time_ns).map(|event| TimeEventHandler {
+ event,
+ handler: handler.clone(),
+ })
})
.collect();
@@ -90,57 +143,133 @@ impl TestClock {
}
}
-trait Clock {
- fn register_default_handler(&mut self, handler: PyObject);
- fn set_time_alert_ns(
- &mut self,
- name: NameID,
- alert_time_ns: Timestamp,
- callback: Option<PyObject>,
- );
- fn set_timer_ns(
- &mut self,
- name: NameID,
- interval_ns: Timedelta,
- start_time_ns: Timestamp,
- stop_time_ns: Timestamp,
- callback: Option<PyObject>,
- );
-}
-
impl Clock for TestClock {
+ fn is_test_clock(&self) -> bool {
+ true
+ }
+
+ fn is_default_handler_registered(&self) -> bool {
+ self.default_handler.is_some()
+ }
+
+ #[inline]
fn register_default_handler(&mut self, handler: PyObject) {
- self.default_handler = handler
+ let _ = self.default_handler.insert(handler);
}
+ #[inline]
fn set_time_alert_ns(
&mut self,
- name: NameID,
+ name: (String, PyObject),
alert_time_ns: Timestamp,
callback: Option<PyObject>,
) {
- let callback = callback.unwrap_or_else(|| self.default_handler.clone());
+ assert!(
+ callback.is_some() | self.default_handler.is_some(),
+ "`callback` and `default_handler` were none"
+ );
+ let callback = match callback {
+ None => self.default_handler.clone().unwrap(),
+ Some(callback) => callback,
+ };
+
let timer = TestTimer::new(
- name,
+ name.1,
(alert_time_ns - self.time_ns) as Timedelta,
self.time_ns,
Some(alert_time_ns),
);
- self.timers.insert(name, timer);
- self.handlers.insert(name, callback);
+ self.timers.insert(name.0.clone(), timer);
+ self.handlers.insert(name.0, callback);
}
+ #[inline]
fn set_timer_ns(
&mut self,
- name: NameID,
+ name: (String, PyObject),
interval_ns: Timedelta,
start_time_ns: Timestamp,
stop_time_ns: Timestamp,
callback: Option<PyObject>,
) {
- let callback = callback.unwrap_or_else(|| self.default_handler.clone());
- let timer = TestTimer::new(name, interval_ns, start_time_ns, Some(stop_time_ns));
- self.timers.insert(name, timer);
- self.handlers.insert(name, callback);
+ assert!(
+ callback.is_some() | self.default_handler.is_some(),
+ "`callback` and `default_handler` were none"
+ );
+ let callback = match callback {
+ None => self.default_handler.clone().unwrap(),
+ Some(callback) => callback,
+ };
+
+ let timer = TestTimer::new(name.1, interval_ns, start_time_ns, Some(stop_time_ns));
+ self.timers.insert(name.0.clone(), timer);
+ self.handlers.insert(name.0, callback);
}
}
+
+////////////////////////////////////////////////////////////////////////////////
+// C API
+////////////////////////////////////////////////////////////////////////////////
+
+#[repr(C)]
+pub struct CTestClock(Box<TestClock>);
+
+impl Deref for CTestClock {
+ type Target = TestClock;
+
+ fn deref(&self) -> &Self::Target {
+ &self.0
+ }
+}
+
+impl DerefMut for CTestClock {
+ fn deref_mut(&mut self) -> &mut Self::Target {
+ &mut self.0
+ }
+}
+
+#[no_mangle]
+pub extern "C" fn test_clock_new() -> CTestClock {
+ CTestClock(Box::new(TestClock::new()))
+}
+
+#[no_mangle]
+pub extern "C" fn test_clock_register_default_handler(clock: &mut CTestClock, handler: PyObject) {
+ clock.register_default_handler(handler);
+}
+
+/// # Safety
+/// - `name` must be borrowed from a valid Python UTF-8 `str`.
+#[no_mangle]
+pub unsafe extern "C" fn test_clock_set_time_alert_ns(
+ clock: &mut CTestClock,
+ name: PyObject,
+ alert_time_ns: Timestamp,
+ callback: Option<PyObject>,
+) {
+ let name = (pystr_to_string(name.as_ptr()), name);
+ clock.set_time_alert_ns(name, alert_time_ns, callback);
+}
+
+/// # Safety
+/// - `name` must be borrowed from a valid Python UTF-8 `str`.
+#[no_mangle]
+pub unsafe extern "C" fn test_clock_set_timer_ns(
+ clock: &mut CTestClock,
+ name: PyObject,
+ interval_ns: Timedelta,
+ start_time_ns: Timestamp,
+ stop_time_ns: Timestamp,
+ callback: Option<PyObject>,
+) {
+ let name = (pystr_to_string(name.as_ptr()), name);
+ clock.set_timer_ns(name, interval_ns, start_time_ns, stop_time_ns, callback);
+}
+
+#[no_mangle]
+pub extern "C" fn test_clock_advance_time(clock: &mut CTestClock, to_time_ns: u64) -> PyObject {
+ let events = clock.advance_time(to_time_ns);
+ Python::with_gil(|py| {
+ PyList::new(py, events.into_iter().map(|v| Py::new(py, v).unwrap())).into()
+ })
+}
diff --git a/nautilus_core/common/src/enums.rs b/nautilus_core/common/src/enums.rs
new file mode 100644
index 000000000000..e21457bfbb06
--- /dev/null
+++ b/nautilus_core/common/src/enums.rs
@@ -0,0 +1,24 @@
+// -------------------------------------------------------------------------------------------------
+// Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+// https://nautechsystems.io
+//
+// Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+// You may not use this file except in compliance with the License.
+// You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+// -------------------------------------------------------------------------------------------------
+
+#[derive(Clone, Hash, Debug)]
+#[repr(C)]
+pub enum MessageCategory {
+ COMMAND,
+ DOCUMENT,
+ EVENT,
+ REQUEST,
+ RESPONSE,
+}
diff --git a/nautilus_core/common/src/lib.rs b/nautilus_core/common/src/lib.rs
index ed1d0a10b216..1c7e6feebddb 100644
--- a/nautilus_core/common/src/lib.rs
+++ b/nautilus_core/common/src/lib.rs
@@ -14,5 +14,6 @@
// -------------------------------------------------------------------------------------------------
pub mod clock;
+pub mod enums;
pub mod logging;
pub mod timer;
diff --git a/nautilus_core/common/src/logging.rs b/nautilus_core/common/src/logging.rs
index 76c9836c6135..42018d6f5d1a 100644
--- a/nautilus_core/common/src/logging.rs
+++ b/nautilus_core/common/src/logging.rs
@@ -13,13 +13,13 @@
// limitations under the License.
// -------------------------------------------------------------------------------------------------
-use chrono::{DateTime, NaiveDateTime, SecondsFormat, Utc};
use std::{
fmt::Display,
io::{self, BufWriter, Stderr, Stdout, Write},
ops::{Deref, DerefMut},
};
+use nautilus_core::datetime::unix_nanos_to_iso8601;
use nautilus_core::string::{pystr_to_string, string_to_pystr};
use nautilus_core::uuid::UUID4;
use nautilus_model::identifiers::trader_id::TraderId;
@@ -134,14 +134,10 @@ impl Logger {
component: &str,
msg: &str,
) -> Result<(), io::Error> {
- let secs = (timestamp_ns / 1_000_000_000) as i64;
- let nsecs = (timestamp_ns as i64 - (secs * 1_000_000_000)) as u32;
- let datetime = NaiveDateTime::from_timestamp(secs, nsecs);
let fmt_line = format!(
- "{bold}{utc}{startc} {color}[{level}] {trader_id}.{component}: {msg}{endc}\n",
+ "{bold}{ts}{startc} {color}[{level}] {trader_id}.{component}: {msg}{endc}\n",
bold = LogFormat::BOLD,
- utc = DateTime::<Utc>::from_utc(datetime, Utc)
- .to_rfc3339_opts(SecondsFormat::Nanos, true),
+ ts = unix_nanos_to_iso8601(timestamp_ns),
startc = LogFormat::ENDC,
color = color,
level = level,
diff --git a/nautilus_core/common/src/timer.rs b/nautilus_core/common/src/timer.rs
index b0f777f4b800..760dde18481f 100644
--- a/nautilus_core/common/src/timer.rs
+++ b/nautilus_core/common/src/timer.rs
@@ -13,26 +13,64 @@
// limitations under the License.
// -------------------------------------------------------------------------------------------------
+use crate::enums::MessageCategory;
use nautilus_core::time::{Timedelta, Timestamp};
use nautilus_core::uuid::UUID4;
+use pyo3::prelude::*;
-/// Index of name string in global string store
-pub type NameID = u64;
-
-#[derive(Clone, Hash, Debug)]
+#[derive(Clone, Debug)]
+#[repr(C)]
+#[pyclass]
/// Represents a time event occurring at the event timestamp.
pub struct TimeEvent {
/// The event name.
- pub name: NameID,
+ pub name: PyObject,
+ /// The message category
+ pub category: MessageCategory, // Only applicable to generic messages in the future
/// The event ID.
- pub id: UUID4,
+ pub event_id: UUID4,
/// The UNIX timestamp (nanoseconds) when the time event occurred.
pub ts_event: Timestamp,
/// The UNIX timestamp (nanoseconds) when the object was initialized.
pub ts_init: Timestamp,
}
+/// Represents a bundled event and its handler
+#[repr(C)]
+#[pyclass]
+#[derive(Clone)]
+pub struct TimeEventHandler {
+ /// A [TimeEvent] generated by a timer.
+ pub event: TimeEvent,
+ /// A callable handler for this time event.
+ pub handler: PyObject,
+}
+
+impl TimeEventHandler {
+ #[inline]
+ pub fn handle_py(self) {
+ Python::with_gil(|py| {
+ self.handler.call0(py).expect("Failed calling handler");
+ });
+ }
+
+ #[inline]
+ pub fn handle(self) {
+ Python::with_gil(|py| {
+ self.handler
+ .call1(py, (self.event,))
+ .expect("Failed calling handler");
+ });
+ }
+}
+
pub trait Timer {
+ fn new(
+ name: PyObject,
+ interval_ns: Timedelta,
+ start_time_ns: Timestamp,
+ stop_time_ns: Option<Timestamp>,
+ ) -> Self;
fn pop_event(&self, event_id: UUID4, ts_init: Timestamp) -> TimeEvent;
fn iterate_next_time(&mut self, ts_now: Timestamp);
fn cancel(&mut self);
@@ -40,7 +78,10 @@ pub trait Timer {
#[allow(dead_code)]
pub struct TestTimer {
- name: NameID,
+ /// A string name stored as an opaque Python object.
+ /// It is passed from Python when a timer is created
+ /// and cloned to the [TimeEvent]s generated by this timer.
+ name: PyObject,
interval_ns: Timedelta,
start_time_ns: Timestamp,
stop_time_ns: Option<Timestamp>,
@@ -50,7 +91,7 @@ pub struct TestTimer {
impl TestTimer {
pub fn new(
- name: NameID,
+ name: PyObject,
interval_ns: Timedelta,
start_time_ns: Timestamp,
stop_time_ns: Option<Timestamp>,
@@ -64,12 +105,29 @@ impl TestTimer {
is_expired: false,
}
}
+
+ /// Advance the test timer forward to the given time, generating a sequence
+ /// of events. A [TimeEvent] is appended for each time a next event is
+ /// <= the given `to_time_ns`.
pub fn advance(&mut self, to_time_ns: Timestamp) -> impl Iterator<Item = TimeEvent> + '_ {
self.take_while(move |(_, next_time_ns)| to_time_ns >= *next_time_ns)
.map(|(event, _)| event)
}
- pub fn pop_next_event(&mut self) -> TimeEvent {
- self.next().unwrap().0
+
+ // TODO(cs): Potentially now redundant with the iterator
+ /// Iterates the timers next time, and checks if the timer is now expired.
+ pub fn iterate_next_time(&mut self, ts_now: Timestamp) {
+ self.next_time_ns += self.interval_ns as u64;
+ if let Some(stop_time_ns) = self.stop_time_ns {
+ if ts_now >= stop_time_ns {
+ self.is_expired = true
+ }
+ }
+ }
+
+ /// Cancels the timer (the timer will not generate an event).
+ pub fn cancel(&mut self) {
+ self.is_expired = true;
}
}
@@ -82,16 +140,16 @@ impl Iterator for TestTimer {
} else {
let item = (
TimeEvent {
- name: self.name,
- id: UUID4::new(),
+ name: self.name.clone(), // clone increments ref count on PyObject
+ category: MessageCategory::EVENT,
+ event_id: UUID4::new(),
ts_event: self.next_time_ns,
ts_init: self.next_time_ns,
},
self.next_time_ns,
);
- // if current next event time has exceeded
- // stop time expire timer
+ // If current next event time has exceeded stop time, then expire timer
if let Some(stop_time_ns) = self.stop_time_ns {
if self.next_time_ns >= stop_time_ns {
self.is_expired = true;
@@ -111,10 +169,15 @@ impl Iterator for TestTimer {
#[cfg(test)]
mod tests {
use super::{TestTimer, TimeEvent};
+ use pyo3::prelude::*;
+ use pyo3::types::PyString;
#[test]
fn test_pop_event() {
- let mut timer = TestTimer::new(0, 0, 1, None);
+ pyo3::prepare_freethreaded_python();
+ let name: PyObject = Python::with_gil(|py| PyString::new(py, "G'day mate").into());
+ let mut timer = TestTimer::new(name, 0, 1, None);
+
assert!(timer.next().is_some());
assert!(timer.next().is_some());
timer.is_expired = true;
@@ -123,15 +186,21 @@ mod tests {
#[test]
fn test_advance() {
- let mut timer = TestTimer::new(0, 1, 0, None);
+ pyo3::prepare_freethreaded_python();
+ let name: PyObject = Python::with_gil(|py| PyString::new(py, "G'day mate").into());
+ let mut timer = TestTimer::new(name, 1, 0, None);
let events: Vec<TimeEvent> = timer.advance(5).collect();
+
assert_eq!(events.len(), 5);
}
#[test]
fn test_advance_stop() {
- let mut timer = TestTimer::new(0, 1, 0, Some(5));
+ pyo3::prepare_freethreaded_python();
+ let name: PyObject = Python::with_gil(|py| PyString::new(py, "G'day mate").into());
+ let mut timer = TestTimer::new(name, 1, 0, Some(5));
let events: Vec<TimeEvent> = timer.advance(10).collect();
+
assert_eq!(events.len(), 5);
}
}
diff --git a/nautilus_core/common/tests/callback.py b/nautilus_core/common/tests/callback.py
new file mode 100644
index 000000000000..094a7c3ef132
--- /dev/null
+++ b/nautilus_core/common/tests/callback.py
@@ -0,0 +1,29 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+count = 0
+events = []
+
+
+def increment(event):
+ global count
+ global events
+ count += 1
+ events.append(event)
+
+
+def display():
+ global count
+ print(count)
diff --git a/nautilus_core/common/tests/test_clock.rs b/nautilus_core/common/tests/test_clock.rs
new file mode 100644
index 000000000000..025e54874635
--- /dev/null
+++ b/nautilus_core/common/tests/test_clock.rs
@@ -0,0 +1,82 @@
+// -------------------------------------------------------------------------------------------------
+// Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+// https://nautechsystems.io
+//
+// Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+// You may not use this file except in compliance with the License.
+// You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+// -------------------------------------------------------------------------------------------------
+
+use nautilus_common::clock::{test_clock_new, test_clock_set_time_alert_ns};
+use pyo3::{prelude::*, types::*};
+use std::str::FromStr;
+
+#[test]
+fn test_clock_advance() {
+ pyo3::prepare_freethreaded_python();
+
+ let mut test_clock = Python::with_gil(|_py| test_clock_new());
+
+ assert_eq!(test_clock.time_ns, 0);
+ let timer_name = "tringtring";
+
+ let (name, callback) = Python::with_gil(|py| {
+ let name = PyString::new(py, timer_name).into();
+ let dummy = Some(PyDict::new(py).into());
+ (name, dummy)
+ });
+
+ unsafe {
+ test_clock_set_time_alert_ns(&mut test_clock, name, 2_000, callback);
+ }
+
+ assert_eq!(test_clock.timers.len(), 1);
+ assert_eq!(
+ test_clock.timers.keys().next().unwrap().as_str(),
+ timer_name
+ );
+
+ let events = test_clock.advance_time(3_000);
+
+ assert_eq!(test_clock.timers.values().next().unwrap().is_expired, true);
+ assert_eq!(events.len(), 1);
+ assert_eq!(
+ events.iter().next().unwrap().event.name.to_string(),
+ String::from_str(timer_name).unwrap()
+ );
+}
+
+#[test]
+fn test_clock_even_callback() {
+ pyo3::prepare_freethreaded_python();
+
+ let mut test_clock = Python::with_gil(|_py| test_clock_new());
+
+ let (name, callback, pymod): (PyObject, PyObject, PyObject) = Python::with_gil(|py| {
+ let code = include_str!("callback.py");
+ let pymod = PyModule::from_code(py, &code, "humpty", "dumpty").unwrap();
+ let name = PyString::new(py, "brrrringbrrring");
+ let callback = pymod.getattr("increment").unwrap();
+ (name.into(), callback.into(), pymod.into())
+ });
+
+ unsafe {
+ test_clock_set_time_alert_ns(&mut test_clock, name, 2_000, Some(callback));
+ }
+
+ let events = test_clock.advance_time(3_000);
+ events
+ .into_iter()
+ .for_each(|time_event_handler| time_event_handler.handle());
+
+ let count: u64 =
+ Python::with_gil(|py| pymod.getattr(py, "count").unwrap().extract(py).unwrap());
+
+ assert_eq!(count, 1);
+}
diff --git a/nautilus_core/core/Cargo.toml b/nautilus_core/core/Cargo.toml
index ba10c6c4e737..c67b0178f99a 100644
--- a/nautilus_core/core/Cargo.toml
+++ b/nautilus_core/core/Cargo.toml
@@ -9,7 +9,7 @@ name = "nautilus_core"
crate-type = ["rlib", "staticlib"]
[dependencies]
-cbindgen = "^0.20.0"
+chrono = "^0.4.19"
pyo3 = "^0.16.5"
uuid = { version = "^0.8.2", features = ["v4"] }
lazy_static = "1.4.0"
diff --git a/nautilus_core/core/cbindgen.toml b/nautilus_core/core/cbindgen.toml
index 85f3b9a60be2..76c4ed789b0e 100644
--- a/nautilus_core/core/cbindgen.toml
+++ b/nautilus_core/core/cbindgen.toml
@@ -8,4 +8,5 @@ tab_width = 4
[export.rename]
"Timestamp" = "uint64_t"
+"Timedelta" = "int64_t"
"UUID4" = "UUID4_t"
diff --git a/nautilus_core/core/cbindgen_cython.toml b/nautilus_core/core/cbindgen_cython.toml
index 8bde51d12d41..b02df8f50b4f 100644
--- a/nautilus_core/core/cbindgen_cython.toml
+++ b/nautilus_core/core/cbindgen_cython.toml
@@ -12,6 +12,7 @@ header = '"../includes/core.h"'
"libc.stdint" = [
"uint8_t",
"uint64_t",
+ "int64_t",
]
"cpython.object" = [
@@ -20,4 +21,5 @@ header = '"../includes/core.h"'
[export.rename]
"Timestamp" = "uint64_t"
+"Timedelta" = "int64_t"
"UUID4" = "UUID4_t"
diff --git a/nautilus_core/core/src/datetime.rs b/nautilus_core/core/src/datetime.rs
index 5663bfdb01f1..e118ff8d823b 100644
--- a/nautilus_core/core/src/datetime.rs
+++ b/nautilus_core/core/src/datetime.rs
@@ -13,6 +13,10 @@
// limitations under the License.
// -------------------------------------------------------------------------------------------------
+use chrono::prelude::{DateTime, Utc};
+use chrono::{Datelike, Timelike};
+use std::time::{UNIX_EPOCH, Duration};
+
const NANOSECONDS_IN_SECOND: u64 = 1_000_000_000;
const NANOSECONDS_IN_MILLISECOND: u64 = 1_000_000;
const NANOSECONDS_IN_MICROSECOND: u64 = 1_000;
@@ -31,3 +35,19 @@ pub fn nanos_to_millis(nanos: u64) -> u64 {
pub fn nanos_to_micros(nanos: u64) -> u64 {
nanos / NANOSECONDS_IN_MICROSECOND
}
+
+#[inline]
+pub fn unix_nanos_to_iso8601(timestamp_ns: u64) -> String {
+ let dt = DateTime::<Utc>::from(UNIX_EPOCH + Duration::from_nanos(timestamp_ns));
+ let date = dt.date();
+ let time = dt.time();
+ format!("{}-{:02}-{:02}T{:02}:{:02}:{:02}.{:09}Z",
+ date.year(),
+ date.month(),
+ date.day(),
+ time.hour(),
+ time.minute(),
+ time.second(),
+ time.nanosecond()
+ )
+}
diff --git a/nautilus_core/core/src/time.rs b/nautilus_core/core/src/time.rs
index 85c35cd82489..4c19fd0410b1 100644
--- a/nautilus_core/core/src/time.rs
+++ b/nautilus_core/core/src/time.rs
@@ -23,11 +23,6 @@ pub type Timestamp = u64;
/// Represents a timedelta in nanoseconds.
pub type Timedelta = i64;
-// A static reference to an instant of system time
-lazy_static! {
- pub static ref INSTANT: Instant = Instant::now();
-}
-
// A static reference to duration since UNIX epoch
lazy_static! {
pub static ref INIT_SINCE_EPOCH: Duration = SystemTime::now()
@@ -35,6 +30,11 @@ lazy_static! {
.expect("Invalid system time");
}
+// A static reference to an instant of system time
+lazy_static! {
+ pub static ref INSTANT: Instant = Instant::now();
+}
+
////////////////////////////////////////////////////////////////////////////////
// C API
////////////////////////////////////////////////////////////////////////////////
diff --git a/nautilus_core/model/Cargo.toml b/nautilus_core/model/Cargo.toml
index 63bc3f5d2c53..72d0be66a052 100644
--- a/nautilus_core/model/Cargo.toml
+++ b/nautilus_core/model/Cargo.toml
@@ -9,7 +9,6 @@ name = "nautilus_model"
crate-type = ["rlib", "staticlib"]
[dependencies]
-cbindgen = "^0.20.0"
pyo3 = "^0.16.5"
nautilus_core = { path = "../core" }
diff --git a/nautilus_core/model/cbindgen.toml b/nautilus_core/model/cbindgen.toml
index ce52c32ae0f3..5f08fb552478 100644
--- a/nautilus_core/model/cbindgen.toml
+++ b/nautilus_core/model/cbindgen.toml
@@ -8,6 +8,7 @@ tab_width = 4
[export.rename]
"Timestamp" = "uint64_t"
+"Timedelta" = "int64_t"
"Currency" = "Currency_t"
"Money" = "Money_t"
"Price" = "Price_t"
@@ -27,3 +28,6 @@ tab_width = 4
"TraderId" = "TraderId_t"
"Venue" = "Venue_t"
"VenueOrderId" = "VenueOrderId_t"
+"BarSpecification" = "BarSpecification_t"
+"BarType" = "BarType_t"
+"Bar" = "Bar_t"
diff --git a/nautilus_core/model/cbindgen_cython.toml b/nautilus_core/model/cbindgen_cython.toml
index 9ec3b350da2c..3ba49bada616 100644
--- a/nautilus_core/model/cbindgen_cython.toml
+++ b/nautilus_core/model/cbindgen_cython.toml
@@ -22,6 +22,7 @@ header = '"../includes/model.h"'
[export.rename]
"Timestamp" = "uint64_t"
+"Timedelta" = "int64_t"
"Currency" = "Currency_t"
"Money" = "Money_t"
"Price" = "Price_t"
@@ -41,3 +42,6 @@ header = '"../includes/model.h"'
"TraderId" = "TraderId_t"
"Venue" = "Venue_t"
"VenueOrderId" = "VenueOrderId_t"
+"BarSpecification" = "BarSpecification_t"
+"BarType" = "BarType_t"
+"Bar" = "Bar_t"
diff --git a/nautilus_core/model/src/data/bar.rs b/nautilus_core/model/src/data/bar.rs
new file mode 100644
index 000000000000..fffc2d7960c1
--- /dev/null
+++ b/nautilus_core/model/src/data/bar.rs
@@ -0,0 +1,548 @@
+// -------------------------------------------------------------------------------------------------
+// Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+// https://nautechsystems.io
+//
+// Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+// You may not use this file except in compliance with the License.
+// You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+// -------------------------------------------------------------------------------------------------
+
+use crate::enums::AggregationSource;
+use crate::enums::BarAggregation;
+use crate::enums::PriceType;
+use crate::identifiers::instrument_id::InstrumentId;
+use crate::types::price::Price;
+use crate::types::quantity::Quantity;
+use nautilus_core::string::string_to_pystr;
+use nautilus_core::time::Timestamp;
+use pyo3::ffi;
+use std::cmp::Ordering;
+use std::collections::hash_map::DefaultHasher;
+use std::fmt::{Debug, Display, Formatter, Result};
+use std::hash::{Hash, Hasher};
+
+#[repr(C)]
+#[derive(Clone, Hash, PartialEq, Debug)]
+pub struct BarSpecification {
+ pub step: u64,
+ pub aggregation: BarAggregation,
+ pub price_type: PriceType,
+}
+
+impl Display for BarSpecification {
+ fn fmt(&self, f: &mut Formatter<'_>) -> Result {
+ write!(f, "{}-{}-{}", self.step, self.aggregation, self.price_type)
+ }
+}
+
+impl PartialOrd for BarSpecification {
+ fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
+ self.to_string().partial_cmp(&other.to_string())
+ }
+
+ fn lt(&self, other: &Self) -> bool {
+ self.to_string().lt(&other.to_string())
+ }
+
+ fn le(&self, other: &Self) -> bool {
+ self.to_string().le(&other.to_string())
+ }
+
+ fn gt(&self, other: &Self) -> bool {
+ self.to_string().gt(&other.to_string())
+ }
+
+ fn ge(&self, other: &Self) -> bool {
+ self.to_string().ge(&other.to_string())
+ }
+}
+
+/// Returns a [BarSpecification] as a Python str.
+///
+/// # Safety
+/// Returns a pointer to a valid Python UTF-8 string.
+/// - Assumes that since the data is originating from Rust, the GIL does not need
+/// to be acquired.
+/// - Assumes you are immediately returning this pointer to Python.
+#[no_mangle]
+pub unsafe extern "C" fn bar_specification_to_pystr(
+ bar_spec: &BarSpecification,
+) -> *mut ffi::PyObject {
+ string_to_pystr(bar_spec.to_string().as_str())
+}
+#[no_mangle]
+pub extern "C" fn bar_specification_free(bar_spec: BarSpecification) {
+ drop(bar_spec); // Memory freed here
+}
+
+#[no_mangle]
+pub extern "C" fn bar_specification_hash(bar_spec: &BarSpecification) -> u64 {
+ let mut h = DefaultHasher::new();
+ bar_spec.hash(&mut h);
+ h.finish()
+}
+
+#[no_mangle]
+pub extern "C" fn bar_specification_new(
+ step: u64,
+ aggregation: u8,
+ price_type: u8,
+) -> BarSpecification {
+ let aggregation = BarAggregation::from(aggregation);
+ let price_type = PriceType::from(price_type);
+ BarSpecification {
+ step,
+ aggregation,
+ price_type,
+ }
+}
+
+#[no_mangle]
+pub extern "C" fn bar_specification_eq(lhs: &BarSpecification, rhs: &BarSpecification) -> u8 {
+ (lhs == rhs) as u8
+}
+
+#[no_mangle]
+pub extern "C" fn bar_specification_lt(lhs: &BarSpecification, rhs: &BarSpecification) -> u8 {
+ (lhs < rhs) as u8
+}
+
+#[no_mangle]
+pub extern "C" fn bar_specification_le(lhs: &BarSpecification, rhs: &BarSpecification) -> u8 {
+ (lhs <= rhs) as u8
+}
+
+#[no_mangle]
+pub extern "C" fn bar_specification_gt(lhs: &BarSpecification, rhs: &BarSpecification) -> u8 {
+ (lhs > rhs) as u8
+}
+
+#[no_mangle]
+pub extern "C" fn bar_specification_ge(lhs: &BarSpecification, rhs: &BarSpecification) -> u8 {
+ (lhs >= rhs) as u8
+}
+
+#[repr(C)]
+#[derive(Clone, Debug)]
+pub struct BarType {
+ pub instrument_id: InstrumentId,
+ pub spec: BarSpecification,
+ pub aggregation_source: AggregationSource,
+}
+
+impl PartialEq for BarType {
+ fn eq(&self, other: &Self) -> bool {
+ self.instrument_id == other.instrument_id
+ && self.spec == other.spec
+ && self.aggregation_source == other.aggregation_source
+ }
+}
+
+impl Hash for BarType {
+    fn hash<H: Hasher>(&self, state: &mut H) {
+ self.spec.hash(state);
+ self.instrument_id.hash(state);
+ }
+}
+
+impl PartialOrd for BarType {
+    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
+ self.to_string().partial_cmp(&other.to_string())
+ }
+
+ fn lt(&self, other: &Self) -> bool {
+ self.to_string().lt(&other.to_string())
+ }
+
+ fn le(&self, other: &Self) -> bool {
+ self.to_string().le(&other.to_string())
+ }
+
+ fn gt(&self, other: &Self) -> bool {
+ self.to_string().gt(&other.to_string())
+ }
+
+ fn ge(&self, other: &Self) -> bool {
+ self.to_string().ge(&other.to_string())
+ }
+}
+
+impl Display for BarType {
+ fn fmt(&self, f: &mut Formatter<'_>) -> Result {
+ write!(
+ f,
+ "{}-{}-{}",
+ self.instrument_id, self.spec, self.aggregation_source
+ )
+ }
+}
+
+#[no_mangle]
+pub extern "C" fn bar_type_new(
+ instrument_id: InstrumentId,
+ spec: BarSpecification,
+ aggregation_source: u8,
+) -> BarType {
+ let aggregation_source = AggregationSource::from(aggregation_source);
+ BarType {
+ instrument_id,
+ spec,
+ aggregation_source,
+ }
+}
+
+#[no_mangle]
+pub extern "C" fn bar_type_eq(lhs: &BarType, rhs: &BarType) -> u8 {
+ (lhs == rhs) as u8
+}
+
+#[no_mangle]
+pub extern "C" fn bar_type_lt(lhs: &BarType, rhs: &BarType) -> u8 {
+ (lhs < rhs) as u8
+}
+
+#[no_mangle]
+pub extern "C" fn bar_type_le(lhs: &BarType, rhs: &BarType) -> u8 {
+ (lhs <= rhs) as u8
+}
+
+#[no_mangle]
+pub extern "C" fn bar_type_gt(lhs: &BarType, rhs: &BarType) -> u8 {
+ (lhs > rhs) as u8
+}
+
+#[no_mangle]
+pub extern "C" fn bar_type_ge(lhs: &BarType, rhs: &BarType) -> u8 {
+ (lhs >= rhs) as u8
+}
+
+#[no_mangle]
+pub extern "C" fn bar_type_hash(bar_type: &BarType) -> u64 {
+ let mut h = DefaultHasher::new();
+ bar_type.hash(&mut h);
+ h.finish()
+}
+
+/// Returns a [BarType] as a Python str.
+///
+/// # Safety
+/// Returns a pointer to a valid Python UTF-8 string.
+/// - Assumes that since the data is originating from Rust, the GIL does not need
+/// to be acquired.
+/// - Assumes you are immediately returning this pointer to Python.
+#[no_mangle]
+pub unsafe extern "C" fn bar_type_to_pystr(bar_type: &BarType) -> *mut ffi::PyObject {
+ string_to_pystr(bar_type.to_string().as_str())
+}
+
+#[no_mangle]
+pub extern "C" fn bar_type_free(bar_type: BarType) {
+ drop(bar_type); // Memory freed here
+}
+
+#[repr(C)]
+#[derive(Clone, Hash, PartialEq, Debug)]
+pub struct Bar {
+ pub bar_type: BarType,
+ pub open: Price,
+ pub high: Price,
+ pub low: Price,
+ pub close: Price,
+ pub volume: Quantity,
+ pub ts_event: Timestamp,
+ pub ts_init: Timestamp,
+}
+
+impl Display for Bar {
+ fn fmt(&self, f: &mut Formatter<'_>) -> Result {
+ write!(
+ f,
+ "{},{},{},{},{},{},{}",
+ self.bar_type, self.open, self.high, self.low, self.close, self.volume, self.ts_event
+ )
+ }
+}
+
+#[no_mangle]
+pub extern "C" fn bar_new(
+ bar_type: BarType,
+ open: Price,
+ high: Price,
+ low: Price,
+ close: Price,
+ volume: Quantity,
+ ts_event: Timestamp,
+ ts_init: Timestamp,
+) -> Bar {
+ Bar {
+ bar_type,
+ open,
+ high,
+ low,
+ close,
+ volume,
+ ts_event,
+ ts_init,
+ }
+}
+
+#[no_mangle]
+pub extern "C" fn bar_new_from_raw(
+ bar_type: BarType,
+ open: i64,
+ high: i64,
+ low: i64,
+ close: i64,
+ price_prec: u8,
+ volume: u64,
+ size_prec: u8,
+ ts_event: Timestamp,
+ ts_init: Timestamp,
+) -> Bar {
+ Bar {
+ bar_type,
+ open: Price::from_raw(open, price_prec),
+ high: Price::from_raw(high, price_prec),
+ low: Price::from_raw(low, price_prec),
+ close: Price::from_raw(close, price_prec),
+ volume: Quantity::from_raw(volume, size_prec),
+ ts_event,
+ ts_init,
+ }
+}
+
+/// Returns a [Bar] as a Python str.
+///
+/// # Safety
+/// Returns a pointer to a valid Python UTF-8 string.
+/// - Assumes that since the data is originating from Rust, the GIL does not need
+/// to be acquired.
+/// - Assumes you are immediately returning this pointer to Python.
+#[no_mangle]
+pub unsafe extern "C" fn bar_to_pystr(bar: &Bar) -> *mut ffi::PyObject {
+ string_to_pystr(bar.to_string().as_str())
+}
+
+#[no_mangle]
+pub extern "C" fn bar_free(bar: Bar) {
+ drop(bar); // Memory freed here
+}
+
+#[no_mangle]
+pub extern "C" fn bar_eq(lhs: &Bar, rhs: &Bar) -> u8 {
+ (lhs == rhs) as u8
+}
+
+#[no_mangle]
+pub extern "C" fn bar_hash(bar: &Bar) -> u64 {
+ let mut h = DefaultHasher::new();
+ bar.hash(&mut h);
+ h.finish()
+}
+////////////////////////////////////////////////////////////////////////////////
+// Tests
+////////////////////////////////////////////////////////////////////////////////
+#[cfg(test)]
+mod tests {
+ use crate::data::bar::Bar;
+ use crate::data::bar::BarSpecification;
+ use crate::data::bar::BarType;
+ use crate::enums::AggregationSource;
+ use crate::enums::BarAggregation;
+ use crate::enums::PriceType;
+ use crate::identifiers::instrument_id::InstrumentId;
+ use crate::identifiers::symbol::Symbol;
+ use crate::identifiers::venue::Venue;
+ use crate::types::price::Price;
+ use crate::types::quantity::Quantity;
+ // use std::hash::Hash;
+
+ #[test]
+ fn test_bar_spec_equality() {
+ // Arrange
+ let bar_spec1 = BarSpecification {
+ step: 1,
+ aggregation: BarAggregation::Minute,
+ price_type: PriceType::Bid,
+ };
+ let bar_spec2 = BarSpecification {
+ step: 1,
+ aggregation: BarAggregation::Minute,
+ price_type: PriceType::Bid,
+ };
+ let bar_spec3 = BarSpecification {
+ step: 1,
+ aggregation: BarAggregation::Minute,
+ price_type: PriceType::Ask,
+ };
+
+ // Act, Assert
+ assert_eq!(bar_spec1, bar_spec1);
+ assert_eq!(bar_spec1, bar_spec2);
+ assert_ne!(bar_spec1, bar_spec3);
+ }
+
+ #[test]
+ fn test_bar_spec_comparison() {
+ // # Arrange
+ let bar_spec1 = BarSpecification {
+ step: 1,
+ aggregation: BarAggregation::Minute,
+ price_type: PriceType::Bid,
+ };
+ let bar_spec2 = BarSpecification {
+ step: 1,
+ aggregation: BarAggregation::Minute,
+ price_type: PriceType::Bid,
+ };
+ let bar_spec3 = BarSpecification {
+ step: 1,
+ aggregation: BarAggregation::Minute,
+ price_type: PriceType::Ask,
+ };
+
+ // # Act, Assert
+ assert!(bar_spec1 <= bar_spec2);
+ assert!(bar_spec3 < bar_spec1);
+ assert!(bar_spec1 > bar_spec3);
+ assert!(bar_spec1 >= bar_spec3);
+ }
+
+ #[test]
+ fn test_bar_spec_string_reprs() {
+ let bar_spec = BarSpecification {
+ step: 1,
+ aggregation: BarAggregation::Minute,
+ price_type: PriceType::Bid,
+ };
+ assert_eq!(bar_spec.to_string(), "1-MINUTE-BID");
+ assert_eq!(format!("{bar_spec}"), "1-MINUTE-BID");
+ }
+
+ #[test]
+ fn test_bar_type_equality() {
+ // # Arrange
+ let instrument_id1 = InstrumentId {
+ symbol: Symbol::from("AUD/USD"),
+ venue: Venue::from("SIM"),
+ };
+ let instrument_id2 = InstrumentId {
+ symbol: Symbol::from("GBP/USD"),
+ venue: Venue::from("SIM"),
+ };
+ let bar_spec = BarSpecification {
+ step: 1,
+ aggregation: BarAggregation::Minute,
+ price_type: PriceType::Bid,
+ };
+ let bar_type1 = BarType {
+ instrument_id: instrument_id1.clone(),
+ spec: bar_spec.clone(),
+ aggregation_source: AggregationSource::External,
+ };
+ let bar_type2 = BarType {
+ instrument_id: instrument_id1.clone(),
+ spec: bar_spec.clone(),
+ aggregation_source: AggregationSource::External,
+ };
+ let bar_type3 = BarType {
+ instrument_id: instrument_id2,
+ spec: bar_spec.clone(),
+ aggregation_source: AggregationSource::External,
+ };
+
+ // # Act, Assert
+ assert_eq!(bar_type1, bar_type1);
+ assert_eq!(bar_type1, bar_type2);
+ assert_ne!(bar_type1, bar_type3);
+ }
+
+ #[test]
+ fn test_bar_type_comparison() {
+ // # Arrange
+ let instrument_id1 = InstrumentId {
+ symbol: Symbol::from("AUD/USD"),
+ venue: Venue::from("SIM"),
+ };
+
+ let instrument_id2 = InstrumentId {
+ symbol: Symbol::from("GBP/USD"),
+ venue: Venue::from("SIM"),
+ };
+ let bar_spec = BarSpecification {
+ step: 1,
+ aggregation: BarAggregation::Minute,
+ price_type: PriceType::Bid,
+ };
+ let bar_type1 = BarType {
+ instrument_id: instrument_id1.clone(),
+ spec: bar_spec.clone(),
+ aggregation_source: AggregationSource::External,
+ };
+ let bar_type2 = BarType {
+ instrument_id: instrument_id1.clone(),
+ spec: bar_spec.clone(),
+ aggregation_source: AggregationSource::External,
+ };
+ let bar_type3 = BarType {
+ instrument_id: instrument_id2.clone(),
+ spec: bar_spec.clone(),
+ aggregation_source: AggregationSource::External,
+ };
+
+ // # Act, Assert
+ assert!(bar_type1 <= bar_type2);
+ assert!(bar_type1 < bar_type3);
+ assert!(bar_type3 > bar_type1);
+ assert!(bar_type3 >= bar_type1);
+ }
+ #[test]
+ fn test_bar_equality() {
+ let instrument_id = InstrumentId {
+ symbol: Symbol::from("AUDUSD"),
+ venue: Venue::from("SIM"),
+ };
+ let bar_spec = BarSpecification {
+ step: 1,
+ aggregation: BarAggregation::Minute,
+ price_type: PriceType::Bid,
+ };
+ let bar_type = BarType {
+ instrument_id: instrument_id,
+ spec: bar_spec,
+ aggregation_source: AggregationSource::External,
+ };
+ // # Arrange
+ let bar1 = Bar {
+ bar_type: bar_type.clone(),
+ open: Price::from("1.00001"),
+ high: Price::from("1.00004"),
+ low: Price::from("1.00002"),
+ close: Price::from("1.00003"),
+ volume: Quantity::from("100000"),
+ ts_event: 0,
+ ts_init: 0,
+ };
+
+ let bar2 = Bar {
+ bar_type: bar_type.clone(),
+ open: Price::from("1.00000"),
+ high: Price::from("1.00004"),
+ low: Price::from("1.00002"),
+ close: Price::from("1.00003"),
+ volume: Quantity::from("100000"),
+ ts_event: 0,
+ ts_init: 0,
+ };
+
+ // # Act, Assert
+ assert_eq!(bar1, bar1);
+ assert_ne!(bar1, bar2);
+ }
+}
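The new `bar.rs` module defines `BarSpecification`, `BarType`, and `Bar` as `#[repr(C)]` structs with `Display` implementations, string-based ordering, and `extern "C"` constructors and comparators. Below is a minimal Rust sketch of how these types compose, mirroring the unit tests above; the `nautilus_model::...` crate paths are assumed from the module layout and are not part of the diff.

```rust
use nautilus_model::data::bar::{bar_specification_eq, Bar, BarSpecification, BarType};
use nautilus_model::enums::{AggregationSource, BarAggregation, PriceType};
use nautilus_model::identifiers::instrument_id::InstrumentId;
use nautilus_model::identifiers::symbol::Symbol;
use nautilus_model::identifiers::venue::Venue;
use nautilus_model::types::{price::Price, quantity::Quantity};

fn main() {
    // 1-minute bars built from bid prices
    let spec = BarSpecification {
        step: 1,
        aggregation: BarAggregation::Minute,
        price_type: PriceType::Bid,
    };
    let bar_type = BarType {
        instrument_id: InstrumentId {
            symbol: Symbol::from("AUD/USD"),
            venue: Venue::from("SIM"),
        },
        spec: spec.clone(),
        aggregation_source: AggregationSource::External,
    };
    let bar = Bar {
        bar_type,
        open: Price::from("1.00001"),
        high: Price::from("1.00004"),
        low: Price::from("1.00000"),
        close: Price::from("1.00003"),
        volume: Quantity::from("100000"),
        ts_event: 0,
        ts_init: 0,
    };

    assert_eq!(spec.to_string(), "1-MINUTE-BID"); // Display: "{step}-{aggregation}-{price_type}"
    assert_eq!(bar_specification_eq(&spec, &spec), 1); // FFI comparators return u8 (1 = true)
    println!("{bar}"); // BarType, then OHLCV and ts_event, comma separated
}
```

Note the design choice: ordering for both `BarSpecification` and `BarType` is delegated to their `Display` strings, so comparisons are lexicographic on the formatted representation rather than on the numeric fields.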
diff --git a/nautilus_core/model/src/data/mod.rs b/nautilus_core/model/src/data/mod.rs
index 1ac633205f4b..b13da0b8985b 100644
--- a/nautilus_core/model/src/data/mod.rs
+++ b/nautilus_core/model/src/data/mod.rs
@@ -13,4 +13,5 @@
// limitations under the License.
// -------------------------------------------------------------------------------------------------
+pub mod bar;
pub mod tick;
diff --git a/nautilus_core/model/src/data/tick.rs b/nautilus_core/model/src/data/tick.rs
index 2140bc198aa5..2120eb0b3033 100644
--- a/nautilus_core/model/src/data/tick.rs
+++ b/nautilus_core/model/src/data/tick.rs
@@ -89,8 +89,8 @@ pub extern "C" fn quote_tick_new(
ask: Price,
bid_size: Quantity,
ask_size: Quantity,
- ts_event: u64,
- ts_init: u64,
+ ts_event: Timestamp,
+ ts_init: Timestamp,
) -> QuoteTick {
QuoteTick {
instrument_id,
@@ -112,8 +112,8 @@ pub extern "C" fn quote_tick_from_raw(
bid_size: u64,
ask_size: u64,
size_prec: u8,
- ts_event: u64,
- ts_init: u64,
+ ts_event: Timestamp,
+ ts_init: Timestamp,
) -> QuoteTick {
QuoteTick {
instrument_id,
diff --git a/nautilus_core/model/src/enums.rs b/nautilus_core/model/src/enums.rs
index bd26f6f608a1..67821eae50c7 100644
--- a/nautilus_core/model/src/enums.rs
+++ b/nautilus_core/model/src/enums.rs
@@ -83,6 +83,44 @@ pub enum PriceType {
Mid = 3,
Last = 4,
}
+impl PriceType {
+ pub fn as_str(&self) -> &'static str {
+ match self {
+ PriceType::Bid => "BID",
+ PriceType::Ask => "ASK",
+ PriceType::Mid => "MID",
+ PriceType::Last => "LAST",
+ }
+ }
+}
+impl Display for PriceType {
+ fn fmt(&self, f: &mut Formatter<'_>) -> Result {
+ write!(f, "{}", self.as_str())
+ }
+}
+
+impl From<&str> for PriceType {
+ fn from(s: &str) -> Self {
+ match s.to_uppercase().as_str() {
+ "BID" => PriceType::Bid,
+ "ASK" => PriceType::Ask,
+ "MID" => PriceType::Mid,
+ "LAST" => PriceType::Last,
+ _ => panic!("Invalid `PriceType` value, was {s}"),
+ }
+ }
+}
+impl From<u8> for PriceType {
+ fn from(i: u8) -> Self {
+ match i {
+ 1 => PriceType::Bid,
+ 2 => PriceType::Ask,
+ 3 => PriceType::Mid,
+ 4 => PriceType::Last,
+ _ => panic!("Invalid `Price` value, was {i}"),
+ }
+ }
+}
#[repr(C)]
#[derive(Copy, Clone, Debug, Hash, PartialEq, Eq)]
@@ -110,3 +148,143 @@ pub enum DepthType {
Volume = 1,
Exposure = 2,
}
+
+#[repr(C)]
+#[derive(Copy, Clone, Debug, Hash, PartialEq, Eq, PartialOrd)]
+#[allow(non_camel_case_types)]
+pub enum BarAggregation {
+ Tick = 1,
+ TickImbalance = 2,
+ TickRuns = 3,
+ Volume = 4,
+ VolumeImbalance = 5,
+ VolumeRuns = 6,
+ Value = 7,
+ ValueImbalance = 8,
+ ValueRuns = 9,
+ Millisecond = 10,
+ Second = 11,
+ Minute = 12,
+ Hour = 13,
+ Day = 14,
+ Week = 15,
+ Month = 16,
+}
+
+impl From<&str> for BarAggregation {
+ fn from(s: &str) -> Self {
+ match s.to_uppercase().as_str() {
+ "TICK" => BarAggregation::Tick,
+ "TICK_IMBALANCE" => BarAggregation::TickImbalance,
+ "TICK_RUNS" => BarAggregation::TickRuns,
+ "VOLUME" => BarAggregation::Volume,
+ "VOLUME_IMBALANCE" => BarAggregation::VolumeImbalance,
+ "VOLUME_RUNS" => BarAggregation::VolumeRuns,
+ "VALUE" => BarAggregation::Value,
+ "VALUE_IMBALANCE" => BarAggregation::ValueImbalance,
+ "VALUE_RUNS" => BarAggregation::ValueRuns,
+ "MILLISECOND" => BarAggregation::Millisecond,
+ "SECOND" => BarAggregation::Second,
+ "MINUTE" => BarAggregation::Minute,
+ "HOUR" => BarAggregation::Hour,
+ "DAY" => BarAggregation::Day,
+ "WEEK" => BarAggregation::Week,
+ "MONTH" => BarAggregation::Month,
+ _ => panic!("Invalid `BarAggregation` value, was {s}"),
+ }
+ }
+}
+impl From<u8> for BarAggregation {
+ fn from(i: u8) -> Self {
+ match i {
+ 1 => BarAggregation::Tick,
+ 2 => BarAggregation::TickImbalance,
+ 3 => BarAggregation::TickRuns,
+ 4 => BarAggregation::Volume,
+ 5 => BarAggregation::VolumeImbalance,
+ 6 => BarAggregation::VolumeRuns,
+ 7 => BarAggregation::Value,
+ 8 => BarAggregation::ValueImbalance,
+ 9 => BarAggregation::ValueRuns,
+ 10 => BarAggregation::Millisecond,
+ 11 => BarAggregation::Second,
+ 12 => BarAggregation::Minute,
+ 13 => BarAggregation::Hour,
+ 14 => BarAggregation::Day,
+ 15 => BarAggregation::Week,
+ 16 => BarAggregation::Month,
+ _ => panic!("Invalid `BarAggregation` value, was {i}"),
+ }
+ }
+}
+
+impl BarAggregation {
+ pub fn as_str(&self) -> &'static str {
+ match self {
+ BarAggregation::Tick => "TICK",
+ BarAggregation::TickImbalance => "TICK_IMBALANCE",
+ BarAggregation::TickRuns => "TICK_RUNS",
+ BarAggregation::Volume => "VOLUME",
+ BarAggregation::VolumeImbalance => "VOLUME_IMBALANCE",
+ BarAggregation::VolumeRuns => "VOLUME_RUNS",
+ BarAggregation::Value => "VALUE",
+ BarAggregation::ValueImbalance => "VALUE_IMBALANCE",
+ BarAggregation::ValueRuns => "VALUE_RUNS",
+ BarAggregation::Millisecond => "MILLISECOND",
+ BarAggregation::Second => "SECOND",
+ BarAggregation::Minute => "MINUTE",
+ BarAggregation::Hour => "HOUR",
+ BarAggregation::Day => "DAY",
+ BarAggregation::Week => "WEEK",
+ BarAggregation::Month => "MONTH",
+ }
+ }
+}
+
+impl Display for BarAggregation {
+ fn fmt(&self, f: &mut Formatter<'_>) -> Result {
+ write!(f, "{}", self.as_str())
+ }
+}
+
+#[repr(C)]
+#[derive(Copy, Clone, Debug, Hash, PartialEq, Eq)]
+#[allow(non_camel_case_types)]
+pub enum AggregationSource {
+ External = 1,
+ Internal = 2,
+}
+
+impl AggregationSource {
+ pub fn as_str(&self) -> &'static str {
+ match self {
+ AggregationSource::External => "EXTERNAL",
+ AggregationSource::Internal => "INTERNAL",
+ }
+ }
+}
+
+impl From<&str> for AggregationSource {
+ fn from(s: &str) -> Self {
+ match s.to_uppercase().as_str() {
+ "EXTERNAL" => AggregationSource::External,
+ "INTERNAL" => AggregationSource::Internal,
+ _ => panic!("Invalid `AggregationSource` value, was {s}"),
+ }
+ }
+}
+impl From<u8> for AggregationSource {
+ fn from(i: u8) -> Self {
+ match i {
+ 1 => AggregationSource::External,
+ 2 => AggregationSource::Internal,
+ _ => panic!("Invalid `AggregationSource` value, was {i}"),
+ }
+ }
+}
+
+impl Display for AggregationSource {
+ fn fmt(&self, f: &mut Formatter<'_>) -> Result {
+ write!(f, "{}", self.as_str())
+ }
+}
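The enum additions give each new enum `u8` and string conversions plus `Display`, which is the surface the C API constructors (`bar_specification_new`, `bar_type_new`) build on. A short sketch of the round-trips, assuming the same derives (`PartialEq`, `Debug`) shown for the other enums:

```rust
use nautilus_model::enums::{AggregationSource, BarAggregation, PriceType};

fn main() {
    // Numeric and string values map to the same variants
    assert_eq!(BarAggregation::from(12), BarAggregation::from("MINUTE"));
    assert_eq!(BarAggregation::Minute.as_str(), "MINUTE");
    assert_eq!(PriceType::from(1), PriceType::Bid);
    assert_eq!(PriceType::from("bid"), PriceType::Bid); // parsing upper-cases first
    assert_eq!(AggregationSource::from(2).to_string(), "INTERNAL");

    // Out-of-range inputs panic rather than returning a Result, e.g.
    // BarAggregation::from(0) panics with "Invalid `BarAggregation` value, was 0"
}
```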
diff --git a/nautilus_trader/adapters/betfair/factories.py b/nautilus_trader/adapters/betfair/factories.py
index b84a8d40dc46..ed1c0cf48452 100644
--- a/nautilus_trader/adapters/betfair/factories.py
+++ b/nautilus_trader/adapters/betfair/factories.py
@@ -36,6 +36,7 @@
CLIENTS: Dict[str, BetfairClient] = {}
+INSTRUMENT_PROVIDER = None
@lru_cache(1)
@@ -126,10 +127,15 @@ def get_cached_betfair_instrument_provider(
BinanceInstrumentProvider
"""
- LoggerAdapter("BetfairFactory", logger).warning(
- "Creating new instance of BetfairInstrumentProvider"
- )
- return BetfairInstrumentProvider(client=client, logger=logger, filters=dict(market_filter))
+ global INSTRUMENT_PROVIDER
+ if INSTRUMENT_PROVIDER is None:
+ LoggerAdapter("BetfairFactory", logger).warning(
+ "Creating new instance of BetfairInstrumentProvider"
+ )
+ INSTRUMENT_PROVIDER = BetfairInstrumentProvider(
+ client=client, logger=logger, filters=dict(market_filter)
+ )
+ return INSTRUMENT_PROVIDER
class BetfairLiveDataClientFactory(LiveDataClientFactory):
diff --git a/nautilus_trader/adapters/binance/common/types.py b/nautilus_trader/adapters/binance/common/types.py
index ed2da2c007f6..aa259e81f56a 100644
--- a/nautilus_trader/adapters/binance/common/types.py
+++ b/nautilus_trader/adapters/binance/common/types.py
@@ -98,6 +98,30 @@ def __init__(
self.taker_sell_base_volume = Quantity.from_str(str(taker_sell_base_volume))
self.taker_sell_quote_volume = Quantity.from_str(str(taker_sell_quote_volume))
+ def __del__(self) -> None:
+ pass # Avoid double free (segmentation fault)
+
+ def __getstate__(self):
+ return (
+ *super().__getstate__(),
+ self.quote_volume.__getstate__()[0],
+ self.count,
+ self.taker_buy_base_volume.__getstate__()[0],
+ self.taker_buy_quote_volume.__getstate__()[0],
+ self.taker_sell_base_volume.__getstate__()[0],
+ self.taker_sell_quote_volume.__getstate__()[0],
+ )
+
+ def __setstate__(self, state):
+
+ super().__setstate__(state[:15])
+ self.quote_volume = Quantity.from_raw(state[15], state[12])
+ self.count = state[16]
+ self.taker_buy_base_volume = Quantity.from_raw(state[17], state[12])
+ self.taker_buy_quote_volume = Quantity.from_raw(state[18], state[12])
+ self.taker_sell_base_volume = Quantity.from_raw(state[19], state[12])
+ self.taker_sell_quote_volume = Quantity.from_raw(state[20], state[12])
+
def __repr__(self) -> str:
return (
f"{type(self).__name__}("
diff --git a/nautilus_trader/adapters/interactive_brokers/execution.py b/nautilus_trader/adapters/interactive_brokers/execution.py
index 2fc06f02ca44..1122f661e8af 100644
--- a/nautilus_trader/adapters/interactive_brokers/execution.py
+++ b/nautilus_trader/adapters/interactive_brokers/execution.py
@@ -165,7 +165,7 @@ async def _check_task(self, coro):
def submit_order(self, command: SubmitOrder) -> None:
PyCondition.not_none(command, "command")
- contract_details = self._instrument_provider.contract_details[command.instrument_id]
+ contract_details = self._instrument_provider.contract_details[command.instrument_id.value]
order: IBOrder = nautilus_order_to_ib_order(order=command.order)
trade: IBTrade = self._client.placeOrder(contract=contract_details.contract, order=order)
self._venue_order_id_to_client_order_id[trade.order.orderId] = command.order.client_order_id
diff --git a/nautilus_trader/adapters/interactive_brokers/factories.py b/nautilus_trader/adapters/interactive_brokers/factories.py
index c73a072e34d0..5c150ac43aea 100644
--- a/nautilus_trader/adapters/interactive_brokers/factories.py
+++ b/nautilus_trader/adapters/interactive_brokers/factories.py
@@ -49,7 +49,7 @@ def get_cached_ib_client(
host: str = "127.0.0.1",
port: int = 4001,
connect=True,
- timeout=180,
+ timeout=300,
client_id: int = 1,
start_gateway: bool = True,
) -> ib_insync.IB:
@@ -88,7 +88,7 @@ def get_cached_ib_client(
# Start gateway
if GATEWAY is None:
GATEWAY = InteractiveBrokersGateway(username=username, password=password)
- GATEWAY.safe_start()
+ GATEWAY.safe_start(wait=timeout)
client_key: tuple = (host, port)
diff --git a/nautilus_trader/adapters/interactive_brokers/gateway.py b/nautilus_trader/adapters/interactive_brokers/gateway.py
index 86a6af13672e..94c8521263e1 100644
--- a/nautilus_trader/adapters/interactive_brokers/gateway.py
+++ b/nautilus_trader/adapters/interactive_brokers/gateway.py
@@ -45,7 +45,7 @@ class InteractiveBrokersGateway:
A class to manage starting an Interactive Brokers Gateway docker container
"""
- IMAGE = "mgvazquez/ibgateway"
+ IMAGE = "ghcr.io/unusualalpha/ib-gateway:1012.2m"
CONTAINER_NAME = "nautilus-ib-gateway"
def __init__(
@@ -165,9 +165,9 @@ def start(self, wait: Optional[int] = 90):
self.log.info("Gateway ready")
- def safe_start(self):
+ def safe_start(self, wait: int = 90):
try:
- self.start()
+ self.start(wait=wait)
except ContainerExists:
return
diff --git a/nautilus_trader/adapters/interactive_brokers/historic.py b/nautilus_trader/adapters/interactive_brokers/historic.py
index 5992a7e11af5..2ac9909dbba9 100644
--- a/nautilus_trader/adapters/interactive_brokers/historic.py
+++ b/nautilus_trader/adapters/interactive_brokers/historic.py
@@ -42,7 +42,7 @@
from nautilus_trader.model.instruments.base import Instrument
from nautilus_trader.model.objects import Price
from nautilus_trader.model.objects import Quantity
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.persistence.external.core import write_objects
@@ -50,7 +50,7 @@
def generate_filename(
- catalog: DataCatalog,
+ catalog: ParquetDataCatalog,
instrument_id: InstrumentId,
kind: Literal["BID_ASK", "TRADES"],
date: datetime.date,
@@ -61,7 +61,7 @@ def generate_filename(
def back_fill_catalog(
ib: IB,
- catalog: DataCatalog,
+ catalog: ParquetDataCatalog,
contracts: List[Contract],
start_date: datetime.date,
end_date: datetime.date,
@@ -75,10 +75,10 @@ def back_fill_catalog(
----------
ib : IB
The ib_insync client.
- catalog : DataCatalog
- The DataCatalog to write the data to
+ catalog : ParquetDataCatalog
+ The data catalog to write the data to.
contracts : List[Contract]
- The list of IB Contracts to collect data for
+ The list of IB Contracts to collect data for.
start_date : datetime.date
The start_date for the back fill.
end_date : datetime.date
diff --git a/nautilus_trader/adapters/interactive_brokers/providers.py b/nautilus_trader/adapters/interactive_brokers/providers.py
index 5ae324d8046a..e5b5aa78b632 100644
--- a/nautilus_trader/adapters/interactive_brokers/providers.py
+++ b/nautilus_trader/adapters/interactive_brokers/providers.py
@@ -78,11 +78,13 @@ def __init__(
self._port = port
self._client_id = client_id
self.config = config
- self.contract_details: Dict[InstrumentId, ContractDetails] = {}
+ self.contract_details: Dict[str, ContractDetails] = {}
self.contract_id_to_instrument_id: Dict[int, InstrumentId] = {}
async def load_all_async(self, filters: Optional[Dict] = None) -> None:
- await self.load(**filters)
+ for f in self._parse_filters(filters=filters):
+ filt = dict(f)
+ await self.load(**filt)
@staticmethod
def _one_not_both(a, b):
@@ -93,13 +95,22 @@ def _parse_contract(**kwargs) -> Contract:
sec_type = kwargs.pop("secType", None)
return Contract(secType=sec_type, **kwargs)
+ @staticmethod
+ def _parse_filters(filters):
+ if "filters" in filters:
+ return filters["filters"]
+ elif filters is None:
+ return []
+ return tuple(filters.items())
+
async def load_ids_async(
self,
instrument_ids: List[InstrumentId],
filters: Optional[Dict] = None,
) -> None:
assert self._one_not_both(instrument_ids, filters)
- await self.load(**dict(filters or {}))
+ for filt in self._parse_filters(filters):
+ await self.load(**dict(filt or {}))
async def load_async(self, instrument_id: InstrumentId, filters: Optional[Dict] = None):
raise NotImplementedError("method must be implemented in the subclass") # pragma: no cover
@@ -221,5 +232,5 @@ async def load(self, build_options_chain=False, option_kwargs=None, **kwargs):
)
self._log.info(f"Adding {instrument=} from IB instrument provider")
self.add(instrument)
- self.contract_details[instrument.id] = details
+ self.contract_details[instrument.id.value] = details
self.contract_id_to_instrument_id[details.contract.conId] = instrument.id
diff --git a/nautilus_trader/backtest/data/wranglers.pyx b/nautilus_trader/backtest/data/wranglers.pyx
index 65e5082b99f9..903ce07669a6 100644
--- a/nautilus_trader/backtest/data/wranglers.pyx
+++ b/nautilus_trader/backtest/data/wranglers.pyx
@@ -13,17 +13,15 @@
# limitations under the License.
# -------------------------------------------------------------------------------------------------
+import random
from copy import copy
import numpy as np
+import pandas as pd
from libc.stdint cimport int64_t
from libc.stdint cimport uint64_t
-import random
-
-import pandas as pd
-
from nautilus_trader.core.correctness cimport Condition
from nautilus_trader.core.datetime cimport as_utc_index
from nautilus_trader.core.datetime cimport secs_to_nanos
diff --git a/nautilus_trader/backtest/engine.pxd b/nautilus_trader/backtest/engine.pxd
index bd076521e13c..98f67ba31c44 100644
--- a/nautilus_trader/backtest/engine.pxd
+++ b/nautilus_trader/backtest/engine.pxd
@@ -54,4 +54,4 @@ cdef class BacktestEngine:
"""The last backtest run time range end (if run).\n\n:returns: `datetime` or ``None``"""
cdef Data _next(self)
- cdef void _advance_time(self, uint64_t now_ns) except *
+ cdef list _advance_time(self, uint64_t now_ns)
diff --git a/nautilus_trader/backtest/engine.pyx b/nautilus_trader/backtest/engine.pyx
index 5bacc475acf7..1ecd0766415e 100644
--- a/nautilus_trader/backtest/engine.pyx
+++ b/nautilus_trader/backtest/engine.pyx
@@ -766,11 +766,12 @@ cdef class BacktestEngine:
break
# -- MAIN BACKTEST LOOP -----------------------------------------------#
+ cdef list now_events
cdef Data data = self._next()
while data is not None:
if data.ts_init > end_ns:
break
- self._advance_time(data.ts_init)
+ now_events = self._advance_time(data.ts_init)
if isinstance(data, OrderBookData):
self._exchanges[data.instrument_id.venue].process_order_book(data)
elif isinstance(data, QuoteTick):
@@ -780,6 +781,8 @@ cdef class BacktestEngine:
elif isinstance(data, Bar):
self._exchanges[data.type.instrument_id.venue].process_bar(data)
self.kernel.data_engine.process(data)
+ for event_handler in now_events:
+ event_handler.handle()
for exchange in self._exchanges.values():
exchange.process(data.ts_init)
self.iteration += 1
@@ -807,20 +810,29 @@ cdef class BacktestEngine:
if cursor < self._data_len:
return self._data[cursor]
- cdef void _advance_time(self, uint64_t now_ns) except *:
- cdef list time_events = [] # type: list[TimeEventHandler]
+ cdef list _advance_time(self, uint64_t now_ns):
+ cdef list all_events = [] # type: list[TimeEventHandler]
+ cdef list now_events = [] # type: list[TimeEventHandler]
cdef:
Actor actor
Strategy strategy
- cdef TimeEventHandler event_handler
for actor in self.kernel.trader.actors_c():
- time_events += actor.clock.advance_time(now_ns)
+ all_events += actor.clock.advance_time(now_ns)
for strategy in self.kernel.trader.strategies_c():
- time_events += strategy.clock.advance_time(now_ns)
- for event_handler in sorted(time_events):
- self.kernel.clock.set_time(event_handler.event.ts_event)
+ all_events += strategy.clock.advance_time(now_ns)
+
+ all_events += self.kernel.clock.advance_time(now_ns)
+
+        # Handle all events prior to `now_ns`
+ cdef TimeEventHandler event_handler
+ for event_handler in sorted(all_events):
+ if event_handler.event.ts_event == now_ns:
+ now_events.append(event_handler)
+ continue
event_handler.handle()
- self.kernel.clock.set_time(now_ns)
+
+        # Return the remaining events to be handled
+        return now_events
def _log_pre_run(self):
log_memory(self._log)
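The reworked `_advance_time` now splits time events into those strictly before the incoming data timestamp (handled immediately, in sorted order) and those landing exactly at `now_ns` (returned to the main loop and handled only after the data point has been processed). The following is an illustrative sketch of that partitioning idea, written in Rust to match the other examples here, with plain `u64` timestamps standing in for `TimeEventHandler` objects:

```rust
/// Partition sorted event timestamps: events before `now_ns` are handled
/// immediately; events at exactly `now_ns` are deferred to the caller.
fn advance_time(mut events: Vec<u64>, now_ns: u64) -> Vec<u64> {
    events.sort_unstable();
    let mut deferred = Vec::new();
    for ts_event in events {
        if ts_event == now_ns {
            deferred.push(ts_event); // handled after the data is processed
        } else {
            handle(ts_event); // handled now, in time order
        }
    }
    deferred // the main loop handles these after the data engine processes the data
}

fn handle(ts_event: u64) {
    println!("handling time event at {ts_event}");
}

fn main() {
    let remaining = advance_time(vec![5, 3, 7, 7], 7);
    assert_eq!(remaining, vec![7, 7]);
}
```

The effect is that timer callbacks firing exactly on a data timestamp now run after that data point has been applied to the exchanges and data engine, rather than before.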
diff --git a/nautilus_trader/backtest/exchange.pyx b/nautilus_trader/backtest/exchange.pyx
index 8132dc025deb..6679d65d5935 100644
--- a/nautilus_trader/backtest/exchange.pyx
+++ b/nautilus_trader/backtest/exchange.pyx
@@ -656,7 +656,7 @@ cdef class SimulatedExchange:
self._log.debug(f"Processed {bar}")
cdef void _process_trade_ticks_from_bar(self, OrderBook book, Bar bar) except *:
- cdef Quantity size = Quantity(bar.volume.as_f64_c() / 4.0, bar.volume._mem.precision)
+ cdef Quantity size = Quantity(bar.volume.as_double() / 4.0, bar._mem.volume.precision)
cdef Price last = self._last.get(book.instrument_id)
# Create reusable tick
@@ -664,14 +664,14 @@ cdef class SimulatedExchange:
bar.type.instrument_id,
bar.open,
size,
- AggressorSide.BUY if last is None or bar.open._mem.raw > last._mem.raw else AggressorSide.SELL,
+ AggressorSide.BUY if last is None or bar._mem.open.raw > last._mem.raw else AggressorSide.SELL,
self._generate_trade_id(),
bar.ts_event,
bar.ts_event,
)
# Open
- if last is None or bar.open._mem.raw != last._mem.raw: # Direct memory comparison
+ if last is None or bar._mem.open.raw != last._mem.raw: # Direct memory comparison
book.update_trade_tick(tick)
self._iterate_matching_engine(
tick.instrument_id,
@@ -680,8 +680,8 @@ cdef class SimulatedExchange:
last = bar.open
# High
- if bar.high._mem.raw > last._mem.raw: # Direct memory comparison
- tick._mem.price = bar.high._mem # Direct memory assignment
+ if bar._mem.high.raw > last._mem.raw: # Direct memory comparison
+ tick._mem.price = bar._mem.high # Direct memory assignment
tick._mem.aggressor_side = AggressorSide.BUY # Direct memory assignment
tick._mem.trade_id = self._generate_trade_id()._mem
book.update_trade_tick(tick)
@@ -692,8 +692,8 @@ cdef class SimulatedExchange:
last = bar.high
# Low
- if bar.low._mem.raw < last._mem.raw: # Direct memory comparison
- tick._mem.price = bar.low._mem # Direct memory assignment
+ if bar._mem.low.raw < last._mem.raw: # Direct memory comparison
+ tick._mem.price = bar._mem.low # Direct memory assignment
tick._mem.aggressor_side = AggressorSide.SELL
tick._mem.trade_id = self._generate_trade_id()._mem
book.update_trade_tick(tick)
@@ -704,9 +704,9 @@ cdef class SimulatedExchange:
last = bar.low
# Close
- if bar.close._mem.raw != last._mem.raw: # Direct memory comparison
- tick._mem.price = bar.close._mem # Direct memory assignment
- tick._mem.aggressor_side = AggressorSide.BUY if bar.close._mem.raw > last._mem.raw else AggressorSide.SELL
+ if bar._mem.close.raw != last._mem.raw: # Direct memory comparison
+ tick._mem.price = bar._mem.close # Direct memory assignment
+ tick._mem.aggressor_side = AggressorSide.BUY if bar._mem.close.raw > last._mem.raw else AggressorSide.SELL
tick._mem.trade_id = self._generate_trade_id()._mem
book.update_trade_tick(tick)
self._iterate_matching_engine(
@@ -727,8 +727,8 @@ cdef class SimulatedExchange:
if last_bid_bar.ts_event != last_ask_bar.ts_event:
return # Wait for next bar
- cdef Quantity bid_size = Quantity(last_bid_bar.volume.as_f64_c() / 4.0, last_bid_bar.volume._mem.precision)
- cdef Quantity ask_size = Quantity(last_ask_bar.volume.as_f64_c() / 4.0, last_ask_bar.volume._mem.precision)
+ cdef Quantity bid_size = Quantity(last_bid_bar.volume.as_double() / 4.0, last_bid_bar._mem.volume.precision)
+ cdef Quantity ask_size = Quantity(last_ask_bar.volume.as_double() / 4.0, last_ask_bar._mem.volume.precision)
# Create reusable tick
cdef QuoteTick tick = QuoteTick(
@@ -749,8 +749,8 @@ cdef class SimulatedExchange:
)
# High
- tick._mem.bid = last_bid_bar.high._mem # Direct memory assignment
- tick._mem.ask = last_ask_bar.high._mem # Direct memory assignment
+ tick._mem.bid = last_bid_bar._mem.high # Direct memory assignment
+ tick._mem.ask = last_ask_bar._mem.high # Direct memory assignment
book.update_quote_tick(tick)
self._iterate_matching_engine(
tick.instrument_id,
@@ -758,8 +758,8 @@ cdef class SimulatedExchange:
)
# Low
- tick._mem.bid = last_bid_bar.low._mem # Assigning memory directly
- tick._mem.ask = last_ask_bar.low._mem # Assigning memory directly
+ tick._mem.bid = last_bid_bar._mem.low # Assigning memory directly
+ tick._mem.ask = last_ask_bar._mem.low # Assigning memory directly
book.update_quote_tick(tick)
self._iterate_matching_engine(
tick.instrument_id,
@@ -767,8 +767,8 @@ cdef class SimulatedExchange:
)
# Close
- tick._mem.bid = last_bid_bar.close._mem # Assigning memory directly
- tick._mem.ask = last_ask_bar.close._mem # Assigning memory directly
+ tick._mem.bid = last_bid_bar._mem.close # Assigning memory directly
+ tick._mem.ask = last_ask_bar._mem.close # Assigning memory directly
book.update_quote_tick(tick)
self._iterate_matching_engine(
tick.instrument_id,
diff --git a/nautilus_trader/backtest/node.py b/nautilus_trader/backtest/node.py
index e5834fa2d306..7add6e7bcf60 100644
--- a/nautilus_trader/backtest/node.py
+++ b/nautilus_trader/backtest/node.py
@@ -38,7 +38,7 @@
from nautilus_trader.persistence.batching import batch_files
from nautilus_trader.persistence.batching import extract_generic_data_client_ids
from nautilus_trader.persistence.batching import groupby_datatype
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
class BacktestNode:
@@ -235,7 +235,7 @@ def _run_streaming(
batch_size_bytes: int,
) -> None:
config = data_configs[0]
- catalog: DataCatalog = config.catalog()
+ catalog: ParquetDataCatalog = config.catalog()
data_client_ids = extract_generic_data_client_ids(data_configs=data_configs)
diff --git a/nautilus_trader/cache/base.pxd b/nautilus_trader/cache/base.pxd
index f7420b8762ab..f2012e652849 100644
--- a/nautilus_trader/cache/base.pxd
+++ b/nautilus_trader/cache/base.pxd
@@ -49,6 +49,7 @@ cdef class CacheFacade:
cpdef QuoteTick quote_tick(self, InstrumentId instrument_id, int index=*)
cpdef TradeTick trade_tick(self, InstrumentId instrument_id, int index=*)
cpdef Bar bar(self, BarType bar_type, int index=*)
+ cpdef int book_update_count(self, InstrumentId instrument_id) except *
cpdef int ticker_count(self, InstrumentId instrument_id) except *
cpdef int quote_tick_count(self, InstrumentId instrument_id) except *
cpdef int trade_tick_count(self, InstrumentId instrument_id) except *
diff --git a/nautilus_trader/cache/base.pyx b/nautilus_trader/cache/base.pyx
index 84f6b762bbe4..0424e27d8b07 100644
--- a/nautilus_trader/cache/base.pyx
+++ b/nautilus_trader/cache/base.pyx
@@ -72,6 +72,10 @@ cdef class CacheFacade:
"""Abstract method (implement in subclass)."""
raise NotImplementedError("method must be implemented in the subclass") # pragma: no cover
+ cpdef int book_update_count(self, InstrumentId instrument_id) except *:
+ """Abstract method (implement in subclass)."""
+ raise NotImplementedError("method must be implemented in the subclass") # pragma: no cover
+
cpdef int ticker_count(self, InstrumentId instrument_id) except *:
"""Abstract method (implement in subclass)."""
raise NotImplementedError("method must be implemented in the subclass") # pragma: no cover
diff --git a/nautilus_trader/cache/cache.pyx b/nautilus_trader/cache/cache.pyx
index 6384f564381c..2fd404c581cd 100644
--- a/nautilus_trader/cache/cache.pyx
+++ b/nautilus_trader/cache/cache.pyx
@@ -1673,6 +1673,30 @@ cdef class Cache(CacheFacade):
except IndexError:
return None
+ cpdef int book_update_count(self, InstrumentId instrument_id) except *:
+ """
+ The count of order book updates for the given instrument ID.
+
+ Will return zero if there is no book for the instrument ID.
+
+ Parameters
+ ----------
+ instrument_id : InstrumentId
+ The instrument ID for the book.
+
+ Returns
+ -------
+ int
+
+ """
+ Condition.not_none(instrument_id, "instrument_id")
+
+ cdef OrderBook book = self._order_books.get(instrument_id)
+ if book is None:
+ return 0
+ else:
+ return book.count
+
cpdef int ticker_count(self, InstrumentId instrument_id) except *:
"""
The count of tickers for the given instrument ID.
diff --git a/nautilus_trader/common/clock.pxd b/nautilus_trader/common/clock.pxd
index 87414e578709..df4ae23ae944 100644
--- a/nautilus_trader/common/clock.pxd
+++ b/nautilus_trader/common/clock.pxd
@@ -49,7 +49,6 @@ cdef class Clock:
cpdef uint64_t timestamp_ns(self) except *
cpdef datetime utc_now(self)
cpdef datetime local_now(self, tzinfo tz=*)
- cpdef timedelta delta(self, datetime time)
cpdef list timer_names(self)
cpdef Timer timer(self, str name)
cpdef void register_default_handler(self, handler: Callable[[TimeEvent], None]) except *
diff --git a/nautilus_trader/common/clock.pyx b/nautilus_trader/common/clock.pyx
index 5f38788a1ec1..9b2434643f7b 100644
--- a/nautilus_trader/common/clock.pyx
+++ b/nautilus_trader/common/clock.pyx
@@ -18,7 +18,6 @@ from typing import Callable
import cython
import numpy as np
import pandas as pd
-import pytz
from cpython.datetime cimport datetime
from cpython.datetime cimport timedelta
@@ -135,25 +134,6 @@ cdef class Clock:
"""
return self.utc_now().astimezone(tz)
- cpdef timedelta delta(self, datetime time):
- """
- Return the timedelta from the current time to the given time.
-
- Parameters
- ----------
- time : datetime
- The datum time.
-
- Returns
- -------
- timedelta
- The time difference.
-
- """
- Condition.not_none(time, "time")
-
- return self.utc_now() - time
-
cpdef list timer_names(self):
"""
The timer names held by the clock.
@@ -544,17 +524,13 @@ cdef class TestClock(Clock):
"""
Provides a monotonic clock for backtesting and unit testing.
- Parameters
- ----------
- initial_ns : uint64_t
- The initial UNIX time (nanoseconds) for the clock.
"""
__test__ = False
- def __init__(self, uint64_t initial_ns=0):
+ def __init__(self):
super().__init__()
- self._time_ns = initial_ns
+ self._time_ns = 0
self.is_test_clock = True
cpdef datetime utc_now(self):
@@ -567,7 +543,7 @@ cdef class TestClock(Clock):
The current tz-aware UTC time of the clock.
"""
- return pd.Timestamp(self._time_ns, tz=pytz.utc)
+ return pd.Timestamp(self._time_ns, tz="UTC")
cpdef double timestamp(self) except *:
"""
@@ -690,7 +666,7 @@ cdef class TestClock(Clock):
cdef class LiveClock(Clock):
"""
- Provides a clock for live trading. All times are timezone aware UTC.
+ Provides a monotonic clock for live trading. All times are timezone aware UTC.
Parameters
----------
diff --git a/nautilus_trader/common/logging.pyx b/nautilus_trader/common/logging.pyx
index 943fd3427006..b675703fa58d 100644
--- a/nautilus_trader/common/logging.pyx
+++ b/nautilus_trader/common/logging.pyx
@@ -54,7 +54,6 @@ from nautilus_trader.core.rust.common cimport logger_get_trader_id
from nautilus_trader.core.rust.common cimport logger_is_bypassed
from nautilus_trader.core.rust.common cimport logger_log
from nautilus_trader.core.rust.common cimport logger_new
-from nautilus_trader.core.rust.core cimport unix_timestamp_ns
from nautilus_trader.core.uuid cimport UUID4
from nautilus_trader.model.identifiers cimport TraderId
@@ -167,8 +166,8 @@ cdef class Logger:
)
self._sinks = []
- def __del__(self):
- logger_free(self._logger)
+ def __del__(self) -> None:
+ logger_free(self._logger) # `self._logger` moved to Rust (then dropped)
@property
def trader_id(self) -> TraderId:
diff --git a/nautilus_trader/common/timer.pxd b/nautilus_trader/common/timer.pxd
index 27fdc406f615..3b265a375f8f 100644
--- a/nautilus_trader/common/timer.pxd
+++ b/nautilus_trader/common/timer.pxd
@@ -55,7 +55,6 @@ cdef class Timer:
cdef class TestTimer(Timer):
- cpdef Event pop_next_event(self)
cpdef list advance(self, uint64_t to_time_ns)
diff --git a/nautilus_trader/common/timer.pyx b/nautilus_trader/common/timer.pyx
index 2cf6fa90b651..f26e9acd9dc2 100644
--- a/nautilus_trader/common/timer.pyx
+++ b/nautilus_trader/common/timer.pyx
@@ -147,13 +147,10 @@ cdef class Timer:
uint64_t stop_time_ns=0,
):
Condition.valid_string(name, "name")
- Condition.callable(callback, "function")
+ Condition.callable(callback, "callback")
self.name = name
self.callback = callback
-
- # Note that for very large time intervals (greater than 270 years on
- # most platforms) the below will lose microsecond accuracy.
self.interval_ns = interval_ns
self.start_time_ns = start_time_ns
self.next_time_ns = start_time_ns + interval_ns
@@ -283,23 +280,6 @@ cdef class TestTimer(Timer):
return events
- cpdef Event pop_next_event(self):
- """
- Return the next time event for this timer.
-
- Returns
- -------
- TimeEvent
-
- """
- cdef TimeEvent event = self.pop_event(
- event_id=UUID4(),
- ts_init=self.next_time_ns,
- )
- self.iterate_next_time(to_time_ns=self.next_time_ns)
-
- return event
-
cpdef void cancel(self) except *:
"""
Cancels the timer (the timer will not generate an event).
diff --git a/nautilus_trader/config/backtest.py b/nautilus_trader/config/backtest.py
index 82b51bb8a39e..8812849df95c 100644
--- a/nautilus_trader/config/backtest.py
+++ b/nautilus_trader/config/backtest.py
@@ -174,9 +174,9 @@ def end_time_nanos(self) -> int:
return maybe_dt_to_unix_nanos(pd.Timestamp(self.end_time))
def catalog(self):
- from nautilus_trader.persistence.catalog import DataCatalog
+ from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
- return DataCatalog(
+ return ParquetDataCatalog(
path=self.catalog_path,
fs_protocol=self.catalog_fs_protocol,
fs_storage_options=self.catalog_fs_storage_options,
diff --git a/nautilus_trader/config/common.py b/nautilus_trader/config/common.py
index c09849985901..bebd36beaffa 100644
--- a/nautilus_trader/config/common.py
+++ b/nautilus_trader/config/common.py
@@ -27,7 +27,7 @@
from nautilus_trader.common import Environment
from nautilus_trader.core.correctness import PyCondition
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
def resolve_path(path: str):
@@ -220,11 +220,11 @@ def fs(self):
return fsspec.filesystem(protocol=self.fs_protocol, **(self.fs_storage_options or {}))
@classmethod
- def from_catalog(cls, catalog: DataCatalog, **kwargs):
+ def from_catalog(cls, catalog: ParquetDataCatalog, **kwargs):
return cls(catalog_path=str(catalog.path), fs_protocol=catalog.fs.protocol, **kwargs)
- def as_catalog(self) -> DataCatalog:
- return DataCatalog(
+ def as_catalog(self) -> ParquetDataCatalog:
+ return ParquetDataCatalog(
path=self.catalog_path,
fs_protocol=self.fs_protocol,
fs_storage_options=self.fs_storage_options,
diff --git a/nautilus_trader/core/includes/common.h b/nautilus_trader/core/includes/common.h
index ab5656724ada..43dea0497b59 100644
--- a/nautilus_trader/core/includes/common.h
+++ b/nautilus_trader/core/includes/common.h
@@ -25,6 +25,14 @@ typedef enum LogLevel {
typedef struct Logger_t Logger_t;
+typedef struct Option_PyObject Option_PyObject;
+
+typedef struct TestClock TestClock;
+
+typedef struct CTestClock {
+ struct TestClock *_0;
+} CTestClock;
+
/**
* Logger is not C FFI safe, so we box and pass it as an opaque pointer.
* This works because Logger fields don't need to be accessed, only functions
@@ -34,6 +42,32 @@ typedef struct CLogger {
struct Logger_t *_0;
} CLogger;
+struct CTestClock test_clock_new(void);
+
+void test_clock_register_default_handler(struct CTestClock *clock, PyObject handler);
+
+/**
+ * # Safety
+ * - `name` must be borrowed from a valid Python UTF-8 `str`.
+ */
+void test_clock_set_time_alert_ns(struct CTestClock *clock,
+ PyObject name,
+ uint64_t alert_time_ns,
+ struct Option_PyObject callback);
+
+/**
+ * # Safety
+ * - `name` must be borrowed from a valid Python UTF-8 `str`.
+ */
+void test_clock_set_timer_ns(struct CTestClock *clock,
+ PyObject name,
+ int64_t interval_ns,
+ uint64_t start_time_ns,
+ uint64_t stop_time_ns,
+ struct Option_PyObject callback);
+
+PyObject test_clock_advance_time(struct CTestClock *clock, uint64_t to_time_ns);
+
/**
* Creates a logger from a valid Python object pointer and a defined logging level.
*
diff --git a/nautilus_trader/core/includes/model.h b/nautilus_trader/core/includes/model.h
index ede09c350c46..0618482c6f59 100644
--- a/nautilus_trader/core/includes/model.h
+++ b/nautilus_trader/core/includes/model.h
@@ -9,6 +9,30 @@
#define FIXED_SCALAR 1000000000.0
+typedef enum AggregationSource {
+ External = 1,
+ Internal = 2,
+} AggregationSource;
+
+typedef enum BarAggregation {
+ Tick = 1,
+ TickImbalance = 2,
+ TickRuns = 3,
+ Volume = 4,
+ VolumeImbalance = 5,
+ VolumeRuns = 6,
+ Value = 7,
+ ValueImbalance = 8,
+ ValueRuns = 9,
+ Millisecond = 10,
+ Second = 11,
+ Minute = 12,
+ Hour = 13,
+ Day = 14,
+ Week = 15,
+ Month = 16,
+} BarAggregation;
+
typedef enum BookLevel {
L1_TBBO = 1,
L2_MBP = 2,
@@ -25,12 +49,25 @@ typedef enum OrderSide {
Sell = 2,
} OrderSide;
+typedef enum PriceType {
+ Bid = 1,
+ Ask = 2,
+ Mid = 3,
+ Last = 4,
+} PriceType;
+
typedef struct BTreeMap_BookPrice__Level BTreeMap_BookPrice__Level;
typedef struct HashMap_u64__BookPrice HashMap_u64__BookPrice;
typedef struct String String;
+typedef struct BarSpecification_t {
+ uint64_t step;
+ enum BarAggregation aggregation;
+ enum PriceType price_type;
+} BarSpecification_t;
+
typedef struct Symbol_t {
struct String *value;
} Symbol_t;
@@ -44,6 +81,12 @@ typedef struct InstrumentId_t {
struct Venue_t venue;
} InstrumentId_t;
+typedef struct BarType_t {
+ struct InstrumentId_t instrument_id;
+ struct BarSpecification_t spec;
+ enum AggregationSource aggregation_source;
+} BarType_t;
+
typedef struct Price_t {
int64_t raw;
uint8_t precision;
@@ -54,6 +97,17 @@ typedef struct Quantity_t {
uint8_t precision;
} Quantity_t;
+typedef struct Bar_t {
+ struct BarType_t bar_type;
+ struct Price_t open;
+ struct Price_t high;
+ struct Price_t low;
+ struct Price_t close;
+ struct Quantity_t volume;
+ uint64_t ts_event;
+ uint64_t ts_init;
+} Bar_t;
+
/**
* Represents a single quote tick in a financial market.
*/
@@ -148,6 +202,106 @@ typedef struct Money_t {
struct Currency_t currency;
} Money_t;
+/**
+ * Returns a [BarSpecification] as a Python str.
+ *
+ * # Safety
+ * Returns a pointer to a valid Python UTF-8 string.
+ * - Assumes that since the data is originating from Rust, the GIL does not need
+ * to be acquired.
+ * - Assumes you are immediately returning this pointer to Python.
+ */
+PyObject *bar_specification_to_pystr(const struct BarSpecification_t *bar_spec);
+
+void bar_specification_free(struct BarSpecification_t bar_spec);
+
+uint64_t bar_specification_hash(const struct BarSpecification_t *bar_spec);
+
+struct BarSpecification_t bar_specification_new(uint64_t step,
+ uint8_t aggregation,
+ uint8_t price_type);
+
+uint8_t bar_specification_eq(const struct BarSpecification_t *lhs,
+ const struct BarSpecification_t *rhs);
+
+uint8_t bar_specification_lt(const struct BarSpecification_t *lhs,
+ const struct BarSpecification_t *rhs);
+
+uint8_t bar_specification_le(const struct BarSpecification_t *lhs,
+ const struct BarSpecification_t *rhs);
+
+uint8_t bar_specification_gt(const struct BarSpecification_t *lhs,
+ const struct BarSpecification_t *rhs);
+
+uint8_t bar_specification_ge(const struct BarSpecification_t *lhs,
+ const struct BarSpecification_t *rhs);
+
+struct BarType_t bar_type_new(struct InstrumentId_t instrument_id,
+ struct BarSpecification_t spec,
+ uint8_t aggregation_source);
+
+uint8_t bar_type_eq(const struct BarType_t *lhs, const struct BarType_t *rhs);
+
+uint8_t bar_type_lt(const struct BarType_t *lhs, const struct BarType_t *rhs);
+
+uint8_t bar_type_le(const struct BarType_t *lhs, const struct BarType_t *rhs);
+
+uint8_t bar_type_gt(const struct BarType_t *lhs, const struct BarType_t *rhs);
+
+uint8_t bar_type_ge(const struct BarType_t *lhs, const struct BarType_t *rhs);
+
+uint64_t bar_type_hash(const struct BarType_t *bar_type);
+
+/**
+ * Returns a [BarType] as a Python str.
+ *
+ * # Safety
+ * Returns a pointer to a valid Python UTF-8 string.
+ * - Assumes that since the data is originating from Rust, the GIL does not need
+ * to be acquired.
+ * - Assumes you are immediately returning this pointer to Python.
+ */
+PyObject *bar_type_to_pystr(const struct BarType_t *bar_type);
+
+void bar_type_free(struct BarType_t bar_type);
+
+struct Bar_t bar_new(struct BarType_t bar_type,
+ struct Price_t open,
+ struct Price_t high,
+ struct Price_t low,
+ struct Price_t close,
+ struct Quantity_t volume,
+ uint64_t ts_event,
+ uint64_t ts_init);
+
+struct Bar_t bar_new_from_raw(struct BarType_t bar_type,
+ int64_t open,
+ int64_t high,
+ int64_t low,
+ int64_t close,
+ uint8_t price_prec,
+ uint64_t volume,
+ uint8_t size_prec,
+ uint64_t ts_event,
+ uint64_t ts_init);
+
+/**
+ * Returns a [Bar] as a Python str.
+ *
+ * # Safety
+ * Returns a pointer to a valid Python UTF-8 string.
+ * - Assumes that since the data is originating from Rust, the GIL does not need
+ * to be acquired.
+ * - Assumes you are immediately returning this pointer to Python.
+ */
+PyObject *bar_to_pystr(const struct Bar_t *bar);
+
+void bar_free(struct Bar_t bar);
+
+uint8_t bar_eq(const struct Bar_t *lhs, const struct Bar_t *rhs);
+
+uint64_t bar_hash(const struct Bar_t *bar);
+
void quote_tick_free(struct QuoteTick_t tick);
struct QuoteTick_t quote_tick_new(struct InstrumentId_t instrument_id,
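The header now exposes the full bar construction surface, including `bar_new_from_raw`, which takes fixed-point integer prices and sizes plus explicit precisions. Below is a hedged sketch of the calling convention, exercised from Rust where the same `extern "C"` functions are defined; the `nautilus_model::...` paths and the assumption that raw values are scaled by `FIXED_SCALAR` (1e9) are not part of the diff.

```rust
use nautilus_model::data::bar::{bar_new_from_raw, bar_specification_new, bar_type_new};
use nautilus_model::identifiers::instrument_id::InstrumentId;
use nautilus_model::identifiers::symbol::Symbol;
use nautilus_model::identifiers::venue::Venue;
use nautilus_model::types::price::Price;

fn main() {
    let instrument_id = InstrumentId {
        symbol: Symbol::from("AUD/USD"),
        venue: Venue::from("SIM"),
    };
    // step=1, aggregation=12 (MINUTE), price_type=1 (BID), aggregation_source=1 (EXTERNAL)
    let bar_type = bar_type_new(instrument_id, bar_specification_new(1, 12, 1), 1);

    let bar = bar_new_from_raw(
        bar_type,
        1_000_010_000,       // open:  1.00001, assuming 1e9 fixed-point scaling
        1_000_040_000,       // high:  1.00004
        1_000_000_000,       // low:   1.00000
        1_000_030_000,       // close: 1.00003
        5,                   // price precision (decimal places)
        100_000_000_000_000, // volume: 100_000, same scaling assumption
        0,                   // size precision
        0,                   // ts_event (UNIX nanoseconds)
        0,                   // ts_init (UNIX nanoseconds)
    );
    assert_eq!(bar.open, Price::from_raw(1_000_010_000, 5));
    println!("{bar}");
}
```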
diff --git a/nautilus_trader/core/rust/common.pxd b/nautilus_trader/core/rust/common.pxd
index 6b530ed25a94..a6048827df70 100644
--- a/nautilus_trader/core/rust/common.pxd
+++ b/nautilus_trader/core/rust/common.pxd
@@ -1,7 +1,7 @@
# Warning, this file is autogenerated by cbindgen. Don't modify this manually. */
from cpython.object cimport PyObject
-from libc.stdint cimport uint8_t, uint64_t
+from libc.stdint cimport uint8_t, uint64_t, int64_t
from nautilus_trader.core.rust.core cimport UUID4_t
cdef extern from "../includes/common.h":
@@ -25,12 +25,43 @@ cdef extern from "../includes/common.h":
cdef struct Logger_t:
pass
+ cdef struct Option_PyObject:
+ pass
+
+ cdef struct TestClock:
+ pass
+
+ cdef struct CTestClock:
+ TestClock *_0;
+
# Logger is not C FFI safe, so we box and pass it as an opaque pointer.
# This works because Logger fields don't need to be accessed, only functions
# are called.
cdef struct CLogger:
Logger_t *_0;
+ CTestClock test_clock_new();
+
+ void test_clock_register_default_handler(CTestClock *clock, PyObject handler);
+
+ # # Safety
+ # - `name` must be borrowed from a valid Python UTF-8 `str`.
+ void test_clock_set_time_alert_ns(CTestClock *clock,
+ PyObject name,
+ uint64_t alert_time_ns,
+ Option_PyObject callback);
+
+ # # Safety
+ # - `name` must be borrowed from a valid Python UTF-8 `str`.
+ void test_clock_set_timer_ns(CTestClock *clock,
+ PyObject name,
+ int64_t interval_ns,
+ uint64_t start_time_ns,
+ uint64_t stop_time_ns,
+ Option_PyObject callback);
+
+ PyObject test_clock_advance_time(CTestClock *clock, uint64_t to_time_ns);
+
# Creates a logger from a valid Python object pointer and a defined logging level.
#
# # Safety
diff --git a/nautilus_trader/core/rust/core.pxd b/nautilus_trader/core/rust/core.pxd
index 276f06c5753e..82ecae93211e 100644
--- a/nautilus_trader/core/rust/core.pxd
+++ b/nautilus_trader/core/rust/core.pxd
@@ -1,7 +1,7 @@
# Warning, this file is autogenerated by cbindgen. Don't modify this manually. */
from cpython.object cimport PyObject
-from libc.stdint cimport uint8_t, uint64_t
+from libc.stdint cimport uint8_t, uint64_t, int64_t
cdef extern from "../includes/core.h":
diff --git a/nautilus_trader/core/rust/model.pxd b/nautilus_trader/core/rust/model.pxd
index 45b368780153..5eeeb8fbf278 100644
--- a/nautilus_trader/core/rust/model.pxd
+++ b/nautilus_trader/core/rust/model.pxd
@@ -9,6 +9,28 @@ cdef extern from "../includes/model.h":
const double FIXED_SCALAR # = 1000000000.0
+ cdef enum AggregationSource:
+ External # = 1,
+ Internal # = 2,
+
+ cdef enum BarAggregation:
+ Tick # = 1,
+ TickImbalance # = 2,
+ TickRuns # = 3,
+ Volume # = 4,
+ VolumeImbalance # = 5,
+ VolumeRuns # = 6,
+ Value # = 7,
+ ValueImbalance # = 8,
+ ValueRuns # = 9,
+ Millisecond # = 10,
+ Second # = 11,
+ Minute # = 12,
+ Hour # = 13,
+ Day # = 14,
+ Week # = 15,
+ Month # = 16,
+
cdef enum BookLevel:
L1_TBBO # = 1,
L2_MBP # = 2,
@@ -22,6 +44,12 @@ cdef extern from "../includes/model.h":
Buy # = 1,
Sell # = 2,
+ cdef enum PriceType:
+ Bid # = 1,
+ Ask # = 2,
+ Mid # = 3,
+ Last # = 4,
+
cdef struct BTreeMap_BookPrice__Level:
pass
@@ -31,6 +59,11 @@ cdef extern from "../includes/model.h":
cdef struct String:
pass
+ cdef struct BarSpecification_t:
+ uint64_t step;
+ BarAggregation aggregation;
+ PriceType price_type;
+
cdef struct Symbol_t:
String *value;
@@ -41,6 +74,11 @@ cdef extern from "../includes/model.h":
Symbol_t symbol;
Venue_t venue;
+ cdef struct BarType_t:
+ InstrumentId_t instrument_id;
+ BarSpecification_t spec;
+ AggregationSource aggregation_source;
+
cdef struct Price_t:
int64_t raw;
uint8_t precision;
@@ -49,6 +87,16 @@ cdef extern from "../includes/model.h":
uint64_t raw;
uint8_t precision;
+ cdef struct Bar_t:
+ BarType_t bar_type;
+ Price_t open;
+ Price_t high;
+ Price_t low;
+ Price_t close;
+ Quantity_t volume;
+ uint64_t ts_event;
+ uint64_t ts_init;
+
# Represents a single quote tick in a financial market.
cdef struct QuoteTick_t:
InstrumentId_t instrument_id;
@@ -123,6 +171,95 @@ cdef extern from "../includes/model.h":
int64_t raw;
Currency_t currency;
+ # Returns a [BarSpecification] as a Python str.
+ #
+ # # Safety
+ # Returns a pointer to a valid Python UTF-8 string.
+ # - Assumes that since the data is originating from Rust, the GIL does not need
+ # to be acquired.
+ # - Assumes you are immediately returning this pointer to Python.
+ PyObject *bar_specification_to_pystr(const BarSpecification_t *bar_spec);
+
+ void bar_specification_free(BarSpecification_t bar_spec);
+
+ uint64_t bar_specification_hash(const BarSpecification_t *bar_spec);
+
+ BarSpecification_t bar_specification_new(uint64_t step,
+ uint8_t aggregation,
+ uint8_t price_type);
+
+ uint8_t bar_specification_eq(const BarSpecification_t *lhs, const BarSpecification_t *rhs);
+
+ uint8_t bar_specification_lt(const BarSpecification_t *lhs, const BarSpecification_t *rhs);
+
+ uint8_t bar_specification_le(const BarSpecification_t *lhs, const BarSpecification_t *rhs);
+
+ uint8_t bar_specification_gt(const BarSpecification_t *lhs, const BarSpecification_t *rhs);
+
+ uint8_t bar_specification_ge(const BarSpecification_t *lhs, const BarSpecification_t *rhs);
+
+ BarType_t bar_type_new(InstrumentId_t instrument_id,
+ BarSpecification_t spec,
+ uint8_t aggregation_source);
+
+ uint8_t bar_type_eq(const BarType_t *lhs, const BarType_t *rhs);
+
+ uint8_t bar_type_lt(const BarType_t *lhs, const BarType_t *rhs);
+
+ uint8_t bar_type_le(const BarType_t *lhs, const BarType_t *rhs);
+
+ uint8_t bar_type_gt(const BarType_t *lhs, const BarType_t *rhs);
+
+ uint8_t bar_type_ge(const BarType_t *lhs, const BarType_t *rhs);
+
+ uint64_t bar_type_hash(const BarType_t *bar_type);
+
+ # Returns a [BarType] as a Python str.
+ #
+ # # Safety
+ # Returns a pointer to a valid Python UTF-8 string.
+ # - Assumes that since the data is originating from Rust, the GIL does not need
+ # to be acquired.
+ # - Assumes you are immediately returning this pointer to Python.
+ PyObject *bar_type_to_pystr(const BarType_t *bar_type);
+
+ void bar_type_free(BarType_t bar_type);
+
+ Bar_t bar_new(BarType_t bar_type,
+ Price_t open,
+ Price_t high,
+ Price_t low,
+ Price_t close,
+ Quantity_t volume,
+ uint64_t ts_event,
+ uint64_t ts_init);
+
+ Bar_t bar_new_from_raw(BarType_t bar_type,
+ int64_t open,
+ int64_t high,
+ int64_t low,
+ int64_t close,
+ uint8_t price_prec,
+ uint64_t volume,
+ uint8_t size_prec,
+ uint64_t ts_event,
+ uint64_t ts_init);
+
+ # Returns a [Bar] as a Python str.
+ #
+ # # Safety
+ # Returns a pointer to a valid Python UTF-8 string.
+ # - Assumes that since the data is originating from Rust, the GIL does not need
+ # to be acquired.
+ # - Assumes you are immediately returning this pointer to Python.
+ PyObject *bar_to_pystr(const Bar_t *bar);
+
+ void bar_free(Bar_t bar);
+
+ uint8_t bar_eq(const Bar_t *lhs, const Bar_t *rhs);
+
+ uint64_t bar_hash(const Bar_t *bar);
+
void quote_tick_free(QuoteTick_t tick);
QuoteTick_t quote_tick_new(InstrumentId_t instrument_id,
diff --git a/nautilus_trader/data/aggregation.pxd b/nautilus_trader/data/aggregation.pxd
index 1d65db910d14..7a7a3e41aa2c 100644
--- a/nautilus_trader/data/aggregation.pxd
+++ b/nautilus_trader/data/aggregation.pxd
@@ -21,6 +21,7 @@ from libc.stdint cimport uint64_t
from nautilus_trader.common.clock cimport Clock
from nautilus_trader.common.logging cimport LoggerAdapter
from nautilus_trader.common.timer cimport TimeEvent
+from nautilus_trader.common.timer cimport Timer
from nautilus_trader.model.data.bar cimport Bar
from nautilus_trader.model.data.bar cimport BarType
from nautilus_trader.model.data.tick cimport QuoteTick
@@ -89,8 +90,10 @@ cdef class ValueBarAggregator(BarAggregator):
cdef class TimeBarAggregator(BarAggregator):
cdef Clock _clock
+ cdef Timer _timer
cdef bint _build_on_next_tick
cdef uint64_t _stored_close_ns
+ cdef tuple _cached_update
cdef readonly timedelta interval
"""The aggregators time interval.\n\n:returns: `timedelta`"""
@@ -105,5 +108,4 @@ cdef class TimeBarAggregator(BarAggregator):
cdef timedelta _get_interval(self)
cdef uint64_t _get_interval_ns(self)
cpdef void _set_build_timer(self) except *
- cpdef void _build_bar(self, uint64_t ts_event) except *
- cpdef void _build_event(self, TimeEvent event) except *
+ cpdef void _build_bar(self, TimeEvent event) except *
diff --git a/nautilus_trader/data/aggregation.pyx b/nautilus_trader/data/aggregation.pyx
index d42728b4249a..2f423ad9400e 100644
--- a/nautilus_trader/data/aggregation.pyx
+++ b/nautilus_trader/data/aggregation.pyx
@@ -110,13 +110,13 @@ cdef class BarBuilder:
if self._high is None or partial_bar.high._mem.raw > self._high._mem.raw:
self._high = partial_bar.high
- if self._low is None or partial_bar.low._mem.raw < self._low._mem.raw:
+ if self._low is None or partial_bar._mem.low.raw < self._low._mem.raw:
self._low = partial_bar.low
if self._close is None:
self._close = partial_bar.close
- self.volume._mem.raw += partial_bar.volume._mem.raw
+ self.volume._mem.raw += partial_bar._mem.volume.raw
if self.ts_last == 0:
self.ts_last = partial_bar.ts_init
@@ -212,7 +212,7 @@ cdef class BarBuilder:
low=self._low,
close=self._close,
volume=Quantity(self.volume, self.size_precision),
- ts_event=ts_event, # TODO: Hardcoded identical for now...
+ ts_event=ts_event,
ts_init=ts_event,
)
@@ -549,12 +549,14 @@ cdef class TimeBarAggregator(BarAggregator):
)
self._clock = clock
+ self._timer = None
self.interval = self._get_interval()
self.interval_ns = self._get_interval_ns()
self._set_build_timer()
- self.next_close_ns = self._clock.timer(str(self.bar_type)).next_time_ns
+ self.next_close_ns = self._timer.next_time_ns
self._build_on_next_tick = False
self._stored_close_ns = 0
+ self._cached_update = None
cpdef datetime get_start_time(self):
"""
@@ -627,6 +629,7 @@ cdef class TimeBarAggregator(BarAggregator):
Stop the bar aggregator.
"""
self._clock.cancel_timer(str(self.bar_type))
+ self._timer = None
cdef timedelta _get_interval(self):
cdef BarAggregation aggregation = self.bar_type.spec.aggregation
@@ -644,8 +647,9 @@ cdef class TimeBarAggregator(BarAggregator):
return timedelta(days=(1 * step))
else:
# Design time error
- raise ValueError(f"Aggregation not time range, "
- f"was {BarAggregationParser.to_str(aggregation)}")
+ raise ValueError(
+ f"Aggregation not time based, was {BarAggregationParser.to_str(aggregation)}",
+ )
cdef uint64_t _get_interval_ns(self):
cdef BarAggregation aggregation = self.bar_type.spec.aggregation
@@ -663,8 +667,9 @@ cdef class TimeBarAggregator(BarAggregator):
return secs_to_nanos(step) * 60 * 60 * 24
else:
# Design time error
- raise ValueError(f"Aggregation not time range, "
- f"was {BarAggregationParser.to_str(aggregation)}")
+ raise ValueError(
+ f"Aggregation not time based, was {BarAggregationParser.to_str(aggregation)}",
+ )
cpdef void _set_build_timer(self) except *:
cdef str timer_name = str(self.bar_type)
@@ -674,24 +679,14 @@ cdef class TimeBarAggregator(BarAggregator):
interval=self.interval,
start_time=self.get_start_time(),
stop_time=None,
- callback=self._build_event,
+ callback=self._build_bar,
)
+ self._timer = self._clock.timer(timer_name)
+
self._log.debug(f"Started timer {timer_name}.")
cdef void _apply_update(self, Price price, Quantity size, uint64_t ts_event) except *:
- if self._clock.is_test_clock:
- if self.next_close_ns < ts_event:
- # Build bar first, then update
- self._build_bar(self.next_close_ns)
- self._builder.update(price, size, ts_event)
- return
- elif self.next_close_ns == ts_event:
- # Update first, then build bar
- self._builder.update(price, size, ts_event)
- self._build_bar(self.next_close_ns)
- return
-
self._builder.update(price, size, ts_event)
if self._build_on_next_tick: # (fast C-level check)
self._build_and_send(self._stored_close_ns)
@@ -699,13 +694,7 @@ cdef class TimeBarAggregator(BarAggregator):
self._build_on_next_tick = False
self._stored_close_ns = 0
- cpdef void _build_bar(self, uint64_t ts_event) except *:
- cdef TestTimer timer = self._clock.timer(str(self.bar_type))
- cdef TimeEvent event = timer.pop_next_event()
- self._build_event(event)
- self.next_close_ns = timer.next_time_ns
-
- cpdef void _build_event(self, TimeEvent event) except *:
+ cpdef void _build_bar(self, TimeEvent event) except *:
if not self._builder.initialized:
# Set flag to build on next close with the stored close time
self._build_on_next_tick = True
@@ -713,3 +702,6 @@ cdef class TimeBarAggregator(BarAggregator):
return
self._build_and_send(ts_event=event.ts_event)
+
+ # On receiving this event, timer would now have a new `next_time_ns`
+ self.next_close_ns = self._timer.next_time_ns
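The aggregator now caches its `Timer` once in `_set_build_timer` and reads `next_time_ns` directly after each `TimeEvent`, instead of re-resolving the timer by name (or popping events from a test clock). A rough illustration of the pattern, using hypothetical stand-in classes rather than the real `Timer`/aggregator API:

    class StubTimer:
        # Stand-in for the cached nautilus_trader Timer
        def __init__(self, start_ns: int, interval_ns: int):
            self.next_time_ns = start_ns + interval_ns
            self._interval_ns = interval_ns

        def fire(self) -> int:
            ts_event = self.next_time_ns
            self.next_time_ns += self._interval_ns  # timer rolls forward on each event
            return ts_event

    class StubAggregator:
        def __init__(self, timer: StubTimer):
            self._timer = timer                     # cached once, no per-event name lookup
            self.next_close_ns = timer.next_time_ns

        def _build_bar(self, ts_event: int):
            # ... build and send the bar for ts_event ...
            self.next_close_ns = self._timer.next_time_ns  # timer has already advanced
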
diff --git a/nautilus_trader/data/client.pyx b/nautilus_trader/data/client.pyx
index 56b927dde514..db98617e8ad9 100644
--- a/nautilus_trader/data/client.pyx
+++ b/nautilus_trader/data/client.pyx
@@ -370,6 +370,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot subscribe to all `Instrument` data: not implemented. "
f"You can implement by overriding the `subscribe_instruments` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void subscribe_instrument(self, InstrumentId instrument_id) except *:
"""
@@ -380,6 +381,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot subscribe to `Instrument` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `subscribe_instrument` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void subscribe_order_book_deltas(self, InstrumentId instrument_id, BookType book_type, int depth=0, dict kwargs=None) except *:
"""
@@ -401,6 +403,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot subscribe to `OrderBookDeltas` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `subscribe_order_book_deltas` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void subscribe_order_book_snapshots(self, InstrumentId instrument_id, BookType book_type, int depth=0, dict kwargs=None) except *:
"""
@@ -422,6 +425,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot subscribe to `OrderBookSnapshot` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `subscribe_order_book_snapshots` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void subscribe_ticker(self, InstrumentId instrument_id) except *:
"""
@@ -437,6 +441,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot subscribe to `Ticker` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `subscribe_ticker` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void subscribe_quote_ticks(self, InstrumentId instrument_id) except *:
"""
@@ -452,6 +457,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot subscribe to `QuoteTick` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `subscribe_quote_ticks` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void subscribe_trade_ticks(self, InstrumentId instrument_id) except *:
"""
@@ -467,6 +473,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot subscribe to `TradeTick` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `subscribe_trade_ticks` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void subscribe_instrument_status_updates(self, InstrumentId instrument_id) except *:
"""
@@ -482,6 +489,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot subscribe to `InstrumentStatusUpdates` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `subscribe_instrument_status_updates` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void subscribe_instrument_close_prices(self, InstrumentId instrument_id) except *:
"""
@@ -497,6 +505,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot subscribe to `InstrumentClosePrice` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `subscribe_instrument_close_prices` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void subscribe_bars(self, BarType bar_type) except *:
"""
@@ -512,6 +521,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot subscribe to `Bar` data for {bar_type}: not implemented. "
f"You can implement by overriding the `subscribe_bars` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void unsubscribe_instruments(self) except *:
"""
@@ -522,6 +532,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot unsubscribe from all `Instrument` data: not implemented. "
f"You can implement by overriding the `unsubscribe_instruments` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void unsubscribe_instrument(self, InstrumentId instrument_id) except *:
"""
@@ -537,6 +548,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot unsubscribe from `Instrument` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `unsubscribe_instrument` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void unsubscribe_order_book_deltas(self, InstrumentId instrument_id) except *:
"""
@@ -552,6 +564,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot unsubscribe from `OrderBookDeltas` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `unsubscribe_order_book_deltas` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void unsubscribe_order_book_snapshots(self, InstrumentId instrument_id) except *:
"""
@@ -567,6 +580,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot unsubscribe from `OrderBookSnapshot` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `unsubscribe_order_book_snapshots` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void unsubscribe_ticker(self, InstrumentId instrument_id) except *:
"""
@@ -582,6 +596,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot unsubscribe from `Ticker` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `unsubscribe_ticker` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void unsubscribe_quote_ticks(self, InstrumentId instrument_id) except *:
"""
@@ -597,6 +612,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot unsubscribe from `QuoteTick` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `unsubscribe_quote_ticks` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void unsubscribe_trade_ticks(self, InstrumentId instrument_id) except *:
"""
@@ -612,6 +628,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot unsubscribe from `TradeTick` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `unsubscribe_trade_ticks` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void unsubscribe_bars(self, BarType bar_type) except *:
"""
@@ -627,6 +644,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot unsubscribe from `Bar` data for {bar_type}: not implemented. "
f"You can implement by overriding the `unsubscribe_bars` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void unsubscribe_instrument_status_updates(self, InstrumentId instrument_id) except *:
"""
@@ -642,6 +660,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot unsubscribe from `InstrumentStatusUpdates` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `unsubscribe_instrument_status_updates` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void unsubscribe_instrument_close_prices(self, InstrumentId instrument_id) except *:
"""
@@ -657,6 +676,7 @@ cdef class MarketDataClient(DataClient):
f"Cannot unsubscribe from `InstrumentClosePrice` data for {instrument_id}: not implemented. "
f"You can implement by overriding the `unsubscribe_instrument_close_prices` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void _add_subscription_instrument(self, InstrumentId instrument_id) except *:
Condition.not_none(instrument_id, "instrument_id")
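With each base-class subscription handler now raising `NotImplementedError` after logging, an adapter is expected to override only the handlers it actually supports. A hypothetical override sketch (the class name and logging call are illustrative, not part of this diff):

    from nautilus_trader.data.client import MarketDataClient

    class ExampleMarketDataClient(MarketDataClient):
        """Hypothetical adapter overriding only the subscriptions it supports."""

        def subscribe_bars(self, bar_type):
            # A real adapter would forward this subscription to its venue feed
            self._log.info(f"Subscribing to bars: {bar_type}")
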
diff --git a/nautilus_trader/examples/strategies/ema_cross.py b/nautilus_trader/examples/strategies/ema_cross.py
index 5262498f658b..d10084d84f2d 100644
--- a/nautilus_trader/examples/strategies/ema_cross.py
+++ b/nautilus_trader/examples/strategies/ema_cross.py
@@ -227,6 +227,10 @@ def on_bar(self, bar: Bar):
)
return # Wait for indicators to warm up...
+ if bar.is_single_price():
+ # Implies no market information for this bar
+ return
+
# BUY LOGIC
if self.fast_ema.value >= self.slow_ema.value:
if self.portfolio.is_flat(self.instrument_id):
@@ -301,7 +305,7 @@ def on_stop(self):
# Unsubscribe from data
self.unsubscribe_bars(self.bar_type)
- self.unsubscribe_quote_ticks(self.instrument_id)
+ # self.unsubscribe_quote_ticks(self.instrument_id)
# self.unsubscribe_trade_ticks(self.instrument_id)
# self.unsubscribe_ticker(self.instrument_id)
# self.unsubscribe_order_book_deltas(self.instrument_id)
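The new `bar.is_single_price()` guard lets the strategy skip bars that carry no price movement. A minimal sketch of the same check, assuming the method returns `True` when open, high, low and close are all equal:

    def has_market_information(bar) -> bool:
        # A single-price bar (open == high == low == close) carries no usable signal
        return not bar.is_single_price()
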
diff --git a/nautilus_trader/examples/strategies/ema_cross_bracket.py b/nautilus_trader/examples/strategies/ema_cross_bracket.py
index 5ef98122c845..023d5119815c 100644
--- a/nautilus_trader/examples/strategies/ema_cross_bracket.py
+++ b/nautilus_trader/examples/strategies/ema_cross_bracket.py
@@ -24,6 +24,7 @@
from nautilus_trader.indicators.average.ema import ExponentialMovingAverage
from nautilus_trader.model.data.bar import Bar
from nautilus_trader.model.data.bar import BarType
+from nautilus_trader.model.data.tick import QuoteTick
from nautilus_trader.model.enums import OrderSide
from nautilus_trader.model.identifiers import InstrumentId
from nautilus_trader.model.instruments.base import Instrument
@@ -119,6 +120,21 @@ def on_start(self):
# Subscribe to live data
self.subscribe_bars(self.bar_type)
+ # self.subscribe_quote_ticks(self.instrument_id)
+
+ def on_quote_tick(self, tick: QuoteTick):
+ """
+ Actions to be performed when the strategy is running and receives a quote tick.
+
+ Parameters
+ ----------
+ tick : QuoteTick
+ The quote tick received.
+
+ """
+ # For debugging (must add a subscription)
+ # self.log.info(repr(tick), LogColor.CYAN)
+ pass
def on_bar(self, bar: Bar):
"""
@@ -130,7 +146,7 @@ def on_bar(self, bar: Bar):
The bar received.
"""
- self.log.info(f"Received {repr(bar)}")
+ self.log.info(repr(bar), LogColor.CYAN)
# Check if indicators ready
if not self.indicators_initialized():
@@ -140,6 +156,10 @@ def on_bar(self, bar: Bar):
)
return # Wait for indicators to warm up...
+ if bar.is_single_price():
+ # Implies no market information for this bar
+ return
+
# BUY LOGIC
if self.fast_ema.value >= self.slow_ema.value:
if self.portfolio.is_flat(self.instrument_id):
@@ -220,6 +240,7 @@ def on_stop(self):
# Unsubscribe from data
self.unsubscribe_bars(self.bar_type)
+ # self.unsubscribe_quote_ticks(self.instrument_id)
def on_reset(self):
"""
diff --git a/nautilus_trader/execution/client.pyx b/nautilus_trader/execution/client.pyx
index 084d875b5205..a3252ce745f3 100644
--- a/nautilus_trader/execution/client.pyx
+++ b/nautilus_trader/execution/client.pyx
@@ -176,6 +176,7 @@ cdef class ExecutionClient(Component):
f"Cannot execute command {command}: not implemented. "
f"You can implement by overriding the `submit_order` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void submit_order_list(self, SubmitOrderList command) except *:
"""
@@ -191,6 +192,7 @@ cdef class ExecutionClient(Component):
f"Cannot execute command {command}: not implemented. "
f"You can implement by overriding the `submit_order_list` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void modify_order(self, ModifyOrder command) except *:
"""
@@ -206,6 +208,7 @@ cdef class ExecutionClient(Component):
f"Cannot execute command {command}: not implemented. "
f"You can implement by overriding the `modify_order` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void cancel_order(self, CancelOrder command) except *:
"""
@@ -221,6 +224,7 @@ cdef class ExecutionClient(Component):
f"Cannot execute command {command}: not implemented. "
f"You can implement by overriding the `cancel_order` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void cancel_all_orders(self, CancelAllOrders command) except *:
"""
@@ -236,6 +240,7 @@ cdef class ExecutionClient(Component):
f"Cannot execute command {command}: not implemented. "
f"You can implement by overriding the `cancel_all_orders` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
cpdef void sync_order_status(self, QueryOrder command) except *:
"""
@@ -245,6 +250,7 @@ cdef class ExecutionClient(Component):
f"Cannot execute command {command}: not implemented. "
f"You can implement by overriding the `sync_order_status` method for this client.",
)
+ raise NotImplementedError("method must be implemented in the subclass")
# -- EVENT HANDLERS -------------------------------------------------------------------------------
diff --git a/nautilus_trader/indicators/amat.pxd b/nautilus_trader/indicators/amat.pxd
new file mode 100644
index 000000000000..1519f1bdd0cf
--- /dev/null
+++ b/nautilus_trader/indicators/amat.pxd
@@ -0,0 +1,37 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.indicators.average.moving_average cimport MovingAverage
+from nautilus_trader.indicators.base.indicator cimport Indicator
+
+
+cdef class ArcherMovingAveragesTrends(Indicator):
+ cdef MovingAverage _fast_ma
+ cdef MovingAverage _slow_ma
+ cdef object _fast_ma_price
+ cdef object _slow_ma_price
+
+ cdef readonly int fast_period
+ """The fast moving average window period.\n\n:returns: `int`"""
+ cdef readonly int slow_period
+ """The slow moving average window period.\n\n:returns: `int`"""
+ cdef readonly int signal_period
+ """The period for lookback price array.\n\n:returns: `int`"""
+ cdef readonly int long_run
+ """The current long run value.\n\n:returns: `int`"""
+ cdef readonly int short_run
+ """The current short run value.\n\n:returns: `int`"""
+
+ cpdef void update_raw(self, double close) except *
diff --git a/nautilus_trader/indicators/amat.pyx b/nautilus_trader/indicators/amat.pyx
new file mode 100644
index 000000000000..23c0f8566655
--- /dev/null
+++ b/nautilus_trader/indicators/amat.pyx
@@ -0,0 +1,129 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from collections import deque
+
+from nautilus_trader.indicators.average.ma_factory import MovingAverageFactory
+from nautilus_trader.indicators.average.moving_average import MovingAverageType
+
+from nautilus_trader.core.correctness cimport Condition
+from nautilus_trader.indicators.base.indicator cimport Indicator
+from nautilus_trader.model.data.bar cimport Bar
+
+
+cdef class ArcherMovingAveragesTrends(Indicator):
+ """
+ Archer Moving Averages Trends indicator.
+
+ Parameters
+ ----------
+ fast_period : int
+ The period for the fast moving average (> 0).
+ slow_period : int
+        The period for the slow moving average (> 0 and > fast_period).
+    signal_period : int
+        The lookback period for the price array (> 0).
+ ma_type : MovingAverageType
+ The moving average type for the calculations.
+
+ References
+ ----------
+ https://github.com/twopirllc/pandas-ta/blob/bc3b292bf1cc1d5f2aba50bb750a75209d655b37/pandas_ta/trend/amat.py
+ """
+
+ def __init__(
+ self,
+ int fast_period,
+ int slow_period,
+ int signal_period,
+ ma_type not None: MovingAverageType=MovingAverageType.EXPONENTIAL,
+ ):
+ Condition.positive_int(fast_period, "fast_period")
+ Condition.positive_int(slow_period, "slow_period")
+ Condition.true(slow_period > fast_period, "fast_period was >= slow_period")
+ Condition.positive_int(signal_period, "signal_period")
+ params = [
+ fast_period,
+ slow_period,
+ signal_period,
+ ma_type.name,
+ ]
+ super().__init__(params=params)
+
+ self.fast_period = fast_period
+ self.slow_period = slow_period
+ self.signal_period = signal_period
+ self._fast_ma = MovingAverageFactory.create(fast_period, ma_type)
+ self._slow_ma = MovingAverageFactory.create(slow_period, ma_type)
+        self._fast_ma_price = deque(maxlen=signal_period + 1)
+        self._slow_ma_price = deque(maxlen=signal_period + 1)
+ self.long_run = 0
+ self.short_run = 0
+
+ cpdef void handle_bar(self, Bar bar) except *:
+ """
+ Update the indicator with the given bar.
+
+ Parameters
+ ----------
+ bar : Bar
+ The update bar.
+
+ """
+ Condition.not_none(bar, "bar")
+
+ self.update_raw(
+ bar.close.as_double(),
+ )
+
+ cpdef void update_raw(self, double close) except *:
+ """
+ Update the indicator with the given close price value.
+
+ Parameters
+ ----------
+ close : double
+ The close price.
+
+ """
+ self._fast_ma.update_raw(close)
+ self._slow_ma.update_raw(close)
+ if self._slow_ma.initialized:
+ self._fast_ma_price.append(self._fast_ma.value)
+ self._slow_ma_price.append(self._slow_ma.value)
+
+            self.long_run = (self._fast_ma_price[-1] - self._fast_ma_price[0] > 0 and
+                             self._slow_ma_price[-1] - self._slow_ma_price[0] < 0)
+            self.long_run = (self._fast_ma_price[-1] - self._fast_ma_price[0] > 0 and
+                             self._slow_ma_price[-1] - self._slow_ma_price[0] > 0) or self.long_run
+
+            self.short_run = (self._fast_ma_price[-1] - self._fast_ma_price[0] < 0 and
+                              self._slow_ma_price[-1] - self._slow_ma_price[0] > 0)
+            self.short_run = (self._fast_ma_price[-1] - self._fast_ma_price[0] < 0 and
+                              self._slow_ma_price[-1] - self._slow_ma_price[0] < 0) or self.short_run
+
+ # Initialization logic
+ if not self.initialized:
+ self._set_has_inputs(True)
+ if len(self._slow_ma_price) >= self.signal_period + 1 and self._slow_ma.initialized:
+ self._set_initialized(True)
+
+ cpdef void _reset(self) except *:
+ self._fast_ma.reset()
+ self._slow_ma.reset()
+ self._fast_ma_price.clear()
+ self._slow_ma_price.clear()
+ self.long_run = 0
+ self.short_run = 0
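A minimal usage sketch for the new AMAT indicator, assuming the module is importable at the path added above:

    from nautilus_trader.indicators.amat import ArcherMovingAveragesTrends

    amat = ArcherMovingAveragesTrends(fast_period=8, slow_period=21, signal_period=3)
    for close in [100.0, 100.5, 101.0, 101.5, 102.0, 102.5]:
        amat.update_raw(close)
    print(amat.long_run, amat.short_run)  # 1/0 trend flags once enough inputs arrive
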
diff --git a/nautilus_trader/indicators/aroon.pxd b/nautilus_trader/indicators/aroon.pxd
new file mode 100644
index 000000000000..a0816a6722d3
--- /dev/null
+++ b/nautilus_trader/indicators/aroon.pxd
@@ -0,0 +1,33 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.indicators.base.indicator cimport Indicator
+
+
+cdef class AroonOscillator(Indicator):
+ cdef object _high_inputs
+ cdef object _low_inputs
+
+ cdef readonly int period
+ """The window period.\n\n:returns: `int`"""
+ cdef readonly double aroon_up
+ """The current aroon up value.\n\n:returns: `double`"""
+ cdef readonly double aroon_down
+ """The current aroon down value.\n\n:returns: `double`"""
+ cdef readonly double value
+ """The current value.\n\n:returns: `double`"""
+
+ cpdef void update_raw(self, double high, double low) except *
+ cdef void _check_initialized(self) except *
diff --git a/nautilus_trader/indicators/aroon.pyx b/nautilus_trader/indicators/aroon.pyx
new file mode 100644
index 000000000000..f43735e53b14
--- /dev/null
+++ b/nautilus_trader/indicators/aroon.pyx
@@ -0,0 +1,109 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from collections import deque
+
+import numpy as np
+
+from nautilus_trader.core.correctness cimport Condition
+from nautilus_trader.indicators.base.indicator cimport Indicator
+from nautilus_trader.model.data.bar cimport Bar
+
+
+cdef class AroonOscillator(Indicator):
+ """
+ The Aroon (AR) indicator developed by Tushar Chande attempts to
+ determine whether an instrument is trending, and how strong the trend is.
+    The AroonUp and AroonDown lines together make up the indicator.
+
+ Parameters
+ ----------
+ period : int
+ The rolling window period for the indicator (> 0).
+ """
+
+ def __init__(self, int period):
+ Condition.positive_int(period, "period")
+ params = [
+ period,
+ ]
+        super().__init__(params=params)
+
+ self.period = period
+        self._high_inputs = deque(maxlen=self.period + 1)
+        self._low_inputs = deque(maxlen=self.period + 1)
+ self.aroon_up = 0
+ self.aroon_down = 0
+ self.value = 0
+
+ cpdef void handle_bar(self, Bar bar) except *:
+ """
+ Update the indicator with the given bar.
+
+ Parameters
+ ----------
+ bar : Bar
+ The update bar.
+
+ """
+ Condition.not_none(bar, "bar")
+
+ self.update_raw(
+ bar.high.as_double(),
+ bar.low.as_double(),
+ )
+
+ cpdef void update_raw(
+ self,
+ double high,
+ double low,
+ ) except *:
+ """
+ Update the indicator with the given raw values.
+
+ Parameters
+ ----------
+ high : double
+ The high price.
+ low : double
+ The low price.
+ """
+ # Update inputs
+ self._high_inputs.appendleft(high)
+ self._low_inputs.appendleft(low)
+
+ # Convert to double to compute values
+ cdef double periods_from_hh = np.argmax(self._high_inputs)
+ cdef double periods_from_ll = np.argmin(self._low_inputs)
+
+ self.aroon_up = 100.0 * (1.0 - periods_from_hh / self.period)
+ self.aroon_down = 100.0 * (1.0 - periods_from_ll / self.period)
+ self.value = self.aroon_up - self.aroon_down
+
+ self._check_initialized()
+
+ cdef void _check_initialized(self) except *:
+ # Initialization logic
+ if not self.initialized:
+ self._set_has_inputs(True)
+ if len(self._high_inputs) >= self.period + 1:
+ self._set_initialized(True)
+
+ cpdef void _reset(self) except *:
+ self._high_inputs.clear()
+ self._low_inputs.clear()
+ self.aroon_up = 0
+ self.aroon_down = 0
+ self.value = 0
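A minimal usage sketch for the new Aroon oscillator, assuming the module is importable at the path added above:

    from nautilus_trader.indicators.aroon import AroonOscillator

    aroon = AroonOscillator(period=25)
    for high, low in [(10.2, 9.8), (10.4, 10.0), (10.6, 10.1), (10.8, 10.3)]:
        aroon.update_raw(high, low)
    print(aroon.aroon_up, aroon.aroon_down, aroon.value)
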
diff --git a/nautilus_trader/indicators/atr.pyx b/nautilus_trader/indicators/atr.pyx
index ed4c21144d38..d849d11c7709 100644
--- a/nautilus_trader/indicators/atr.pyx
+++ b/nautilus_trader/indicators/atr.pyx
@@ -119,9 +119,6 @@ cdef class AverageTrueRange(Indicator):
self.value = self._value_floor
cdef void _check_initialized(self) except *:
- """
- Initialization logic.
- """
if not self.initialized:
self._set_has_inputs(True)
if self._ma.initialized:
diff --git a/nautilus_trader/indicators/average/dema.pxd b/nautilus_trader/indicators/average/dema.pxd
new file mode 100644
index 000000000000..e5e346d4ca71
--- /dev/null
+++ b/nautilus_trader/indicators/average/dema.pxd
@@ -0,0 +1,21 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.indicators.average.moving_average cimport MovingAverage
+
+
+cdef class DoubleExponentialMovingAverage(MovingAverage):
+ cdef MovingAverage _ma1
+ cdef MovingAverage _ma2
diff --git a/nautilus_trader/indicators/average/dema.pyx b/nautilus_trader/indicators/average/dema.pyx
new file mode 100644
index 000000000000..963d8c791c6d
--- /dev/null
+++ b/nautilus_trader/indicators/average/dema.pyx
@@ -0,0 +1,119 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.core.correctness cimport Condition
+from nautilus_trader.indicators.average.ema cimport ExponentialMovingAverage
+from nautilus_trader.indicators.average.moving_average cimport MovingAverage
+from nautilus_trader.model.c_enums.price_type cimport PriceType
+from nautilus_trader.model.data.bar cimport Bar
+from nautilus_trader.model.data.tick cimport QuoteTick
+from nautilus_trader.model.data.tick cimport TradeTick
+from nautilus_trader.model.objects cimport Price
+
+
+cdef class DoubleExponentialMovingAverage(MovingAverage):
+ """
+    The Double Exponential Moving Average attempts to produce a smoother average with less
+ lag than the normal Exponential Moving Average (EMA).
+
+ Parameters
+ ----------
+ period : int
+ The rolling window period for the indicator (> 0).
+ price_type : PriceType
+ The specified price type for extracting values from quote ticks.
+
+ Raises
+ ------
+ ValueError
+ If `period` is not positive (> 0).
+ """
+
+ def __init__(self, int period, PriceType price_type=PriceType.LAST):
+ Condition.positive_int(period, "period")
+ super().__init__(period, params=[period], price_type=price_type)
+
+ self._ma1 = ExponentialMovingAverage(period)
+ self._ma2 = ExponentialMovingAverage(period)
+
+ self.value = 0
+
+ cpdef void handle_quote_tick(self, QuoteTick tick) except *:
+ """
+ Update the indicator with the given quote tick.
+
+ Parameters
+ ----------
+ tick : QuoteTick
+ The update tick to handle.
+
+ """
+ Condition.not_none(tick, "tick")
+
+ cdef Price price = tick.extract_price(self.price_type)
+ self.update_raw(Price.raw_to_f64_c(price._mem.raw))
+
+ cpdef void handle_trade_tick(self, TradeTick tick) except *:
+ """
+ Update the indicator with the given trade tick.
+
+ Parameters
+ ----------
+ tick : TradeTick
+ The update tick to handle.
+
+ """
+ Condition.not_none(tick, "tick")
+
+ self.update_raw(Price.raw_to_f64_c(tick._mem.price.raw))
+
+ cpdef void handle_bar(self, Bar bar) except *:
+ """
+ Update the indicator with the given bar.
+
+ Parameters
+ ----------
+ bar : Bar
+ The update bar to handle.
+
+ """
+ Condition.not_none(bar, "bar")
+
+ self.update_raw(bar.close.as_double())
+
+ cpdef void update_raw(self, double value) except *:
+ """
+ Update the indicator with the given raw value.
+
+ Parameters
+ ----------
+ value : double
+ The update value.
+
+ """
+ self._ma1.update_raw(value)
+ self._ma2.update_raw(self._ma1.value)
+
+ self.value = 2.0 * self._ma1.value - self._ma2.value
+
+ if not self.initialized:
+ self._set_has_inputs(True)
+ if self._ma2.initialized:
+ self._set_initialized(True)
+
+ cpdef void _reset_ma(self) except *:
+ self._ma1.reset()
+ self._ma2.reset()
+ self.value = 0
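A minimal usage sketch for the new DEMA, assuming the module is importable at the path added above:

    from nautilus_trader.indicators.average.dema import DoubleExponentialMovingAverage

    dema = DoubleExponentialMovingAverage(10)
    for close in [1.00, 1.10, 1.20, 1.30, 1.40]:
        dema.update_raw(close)
    print(dema.value)  # 2 * EMA(price) - EMA(EMA(price))
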
diff --git a/nautilus_trader/indicators/average/ema.pyx b/nautilus_trader/indicators/average/ema.pyx
index 0c2232ab6b44..3a74833d35e1 100644
--- a/nautilus_trader/indicators/average/ema.pyx
+++ b/nautilus_trader/indicators/average/ema.pyx
@@ -88,7 +88,7 @@ cdef class ExponentialMovingAverage(MovingAverage):
"""
Condition.not_none(bar, "bar")
- self.update_raw(bar.close.as_f64_c())
+ self.update_raw(bar.close.as_double())
cpdef void update_raw(self, double value) except *:
"""
diff --git a/nautilus_trader/indicators/average/ma_factory.pyx b/nautilus_trader/indicators/average/ma_factory.pyx
index b479cb059d26..10076c7e311e 100644
--- a/nautilus_trader/indicators/average/ma_factory.pyx
+++ b/nautilus_trader/indicators/average/ma_factory.pyx
@@ -15,10 +15,12 @@
from nautilus_trader.core.correctness cimport Condition
+from nautilus_trader.indicators.average.dema import DoubleExponentialMovingAverage
from nautilus_trader.indicators.average.ema import ExponentialMovingAverage
from nautilus_trader.indicators.average.hma import HullMovingAverage
from nautilus_trader.indicators.average.moving_average import MovingAverage
from nautilus_trader.indicators.average.moving_average import MovingAverageType
+from nautilus_trader.indicators.average.rma import WilderMovingAverage
from nautilus_trader.indicators.average.sma import SimpleMovingAverage
from nautilus_trader.indicators.average.wma import WeightedMovingAverage
@@ -67,3 +69,9 @@ cdef class MovingAverageFactory:
elif ma_type == MovingAverageType.HULL:
return HullMovingAverage(period)
+
+ elif ma_type == MovingAverageType.WILDER:
+ return WilderMovingAverage(period)
+
+ elif ma_type == MovingAverageType.DOUBLEEXPONENTIAL:
+ return DoubleExponentialMovingAverage(period)
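The factory now covers the two new moving average types. A minimal sketch, assuming the import paths shown above:

    from nautilus_trader.indicators.average.ma_factory import MovingAverageFactory
    from nautilus_trader.indicators.average.moving_average import MovingAverageType

    rma = MovingAverageFactory.create(14, MovingAverageType.WILDER)
    dema = MovingAverageFactory.create(14, MovingAverageType.DOUBLEEXPONENTIAL)
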
diff --git a/nautilus_trader/indicators/average/moving_average.pyx b/nautilus_trader/indicators/average/moving_average.pyx
index 0bab4191ec24..cedc15b44674 100644
--- a/nautilus_trader/indicators/average/moving_average.pyx
+++ b/nautilus_trader/indicators/average/moving_average.pyx
@@ -31,6 +31,8 @@ class MovingAverageType(Enum):
WEIGHTED = 2
HULL = 3
ADAPTIVE = 4
+ WILDER = 5
+ DOUBLEEXPONENTIAL = 6
cdef class MovingAverage(Indicator):
diff --git a/nautilus_trader/indicators/average/rma.pxd b/nautilus_trader/indicators/average/rma.pxd
new file mode 100644
index 000000000000..709a3e8f4adf
--- /dev/null
+++ b/nautilus_trader/indicators/average/rma.pxd
@@ -0,0 +1,21 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.indicators.average.ema cimport MovingAverage
+
+
+cdef class WilderMovingAverage(MovingAverage):
+ cdef readonly double alpha
+ """The moving average alpha value.\n\n:returns: `double`"""
diff --git a/nautilus_trader/indicators/average/rma.pyx b/nautilus_trader/indicators/average/rma.pyx
new file mode 100644
index 000000000000..52a33c8e7f08
--- /dev/null
+++ b/nautilus_trader/indicators/average/rma.pyx
@@ -0,0 +1,108 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.core.correctness cimport Condition
+from nautilus_trader.indicators.average.moving_average cimport MovingAverage
+from nautilus_trader.model.c_enums.price_type cimport PriceType
+from nautilus_trader.model.data.bar cimport Bar
+from nautilus_trader.model.data.tick cimport QuoteTick
+from nautilus_trader.model.data.tick cimport TradeTick
+from nautilus_trader.model.objects cimport Price
+
+
+cdef class WilderMovingAverage(MovingAverage):
+ """
+ The Wilder's Moving Average is simply an Exponential Moving Average (EMA) with
+ a modified alpha = 1 / period.
+
+ Parameters
+ ----------
+ period : int
+ The rolling window period for the indicator (> 0).
+ price_type : PriceType
+ The specified price type for extracting values from quote ticks.
+
+ Raises
+ ------
+ ValueError
+ If `period` is not positive (> 0).
+ """
+
+ def __init__(self, int period, PriceType price_type=PriceType.LAST):
+ Condition.positive_int(period, "period")
+ super().__init__(period, params=[period], price_type=price_type)
+
+ self.alpha = 1.0 / period
+ self.value = 0
+
+ cpdef void handle_quote_tick(self, QuoteTick tick) except *:
+ """
+ Update the indicator with the given quote tick.
+
+ Parameters
+ ----------
+ tick : QuoteTick
+ The update tick to handle.
+
+ """
+ Condition.not_none(tick, "tick")
+
+ cdef Price price = tick.extract_price(self.price_type)
+ self.update_raw(Price.raw_to_f64_c(price._mem.raw))
+
+ cpdef void handle_trade_tick(self, TradeTick tick) except *:
+ """
+ Update the indicator with the given trade tick.
+
+ Parameters
+ ----------
+ tick : TradeTick
+ The update tick to handle.
+
+ """
+ Condition.not_none(tick, "tick")
+
+ self.update_raw(Price.raw_to_f64_c(tick._mem.price.raw))
+
+ cpdef void handle_bar(self, Bar bar) except *:
+ """
+ Update the indicator with the given bar.
+
+ Parameters
+ ----------
+ bar : Bar
+ The update bar to handle.
+
+ """
+ Condition.not_none(bar, "bar")
+
+ self.update_raw(bar.close.as_double())
+
+ cpdef void update_raw(self, double value) except *:
+ """
+ Update the indicator with the given raw value.
+
+ Parameters
+ ----------
+ value : double
+ The update value.
+
+ """
+ # Check if this is the initial input
+ if not self.has_inputs:
+ self.value = value
+
+ self.value = self.alpha * value + ((1.0 - self.alpha) * self.value)
+ self._increment_count()
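A minimal usage sketch for the new Wilder moving average, assuming the module is importable at the path added above:

    from nautilus_trader.indicators.average.rma import WilderMovingAverage

    rma = WilderMovingAverage(14)  # alpha = 1 / 14
    rma.update_raw(10.0)           # first input seeds the value
    rma.update_raw(11.0)           # value = alpha * 11.0 + (1 - alpha) * value
    print(rma.alpha, rma.value)
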
diff --git a/nautilus_trader/indicators/bias.pxd b/nautilus_trader/indicators/bias.pxd
new file mode 100644
index 000000000000..77d0c894c34b
--- /dev/null
+++ b/nautilus_trader/indicators/bias.pxd
@@ -0,0 +1,29 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.indicators.average.moving_average cimport MovingAverage
+from nautilus_trader.indicators.base.indicator cimport Indicator
+
+
+cdef class Bias(Indicator):
+ cdef MovingAverage _ma
+
+ cdef readonly int period
+ """The window period.\n\n:returns: `int`"""
+ cdef readonly double value
+ """The current value.\n\n:returns: `double`"""
+
+ cpdef void update_raw(self, double close) except *
+ cdef void _check_initialized(self) except *
diff --git a/nautilus_trader/indicators/bias.pyx b/nautilus_trader/indicators/bias.pyx
new file mode 100644
index 000000000000..fb772cc81797
--- /dev/null
+++ b/nautilus_trader/indicators/bias.pyx
@@ -0,0 +1,91 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.indicators.average.ma_factory import MovingAverageFactory
+from nautilus_trader.indicators.average.ma_factory import MovingAverageType
+
+from nautilus_trader.core.correctness cimport Condition
+from nautilus_trader.indicators.base.indicator cimport Indicator
+from nautilus_trader.model.data.bar cimport Bar
+
+
+cdef class Bias(Indicator):
+ """
+ Rate of change between the source and a moving average.
+
+ Parameters
+ ----------
+ period : int
+ The rolling window period for the indicator (> 0).
+ ma_type : MovingAverageType
+ The moving average type for the indicator (cannot be None).
+ """
+
+ def __init__(
+ self,
+ int period,
+ ma_type not None: MovingAverageType=MovingAverageType.SIMPLE,
+ ):
+ Condition.positive_int(period, "period")
+ params = [
+ period,
+ ma_type.name,
+ ]
+ super().__init__(params=params)
+
+ self.period = period
+ self._ma = MovingAverageFactory.create(period, ma_type)
+ self.value = 0
+
+ cpdef void handle_bar(self, Bar bar) except *:
+ """
+ Update the indicator with the given bar.
+
+ Parameters
+ ----------
+ bar : Bar
+ The update bar.
+
+ """
+ Condition.not_none(bar, "bar")
+
+ self.update_raw(
+ bar.close.as_double(),
+ )
+
+ cpdef void update_raw(self, double close) except *:
+ """
+ Update the indicator with the given raw values.
+
+ Parameters
+ ----------
+ close : double
+ The close price.
+
+ """
+ # Calculate average
+ self._ma.update_raw(close)
+ self.value = (close / self._ma.value) - 1.0
+ self._check_initialized()
+
+ cdef void _check_initialized(self) except *:
+ if not self.initialized:
+ self._set_has_inputs(True)
+ if self._ma.initialized:
+ self._set_initialized(True)
+
+ cpdef void _reset(self) except *:
+ self._ma.reset()
+ self.value = 0
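A minimal usage sketch for the new Bias indicator, assuming the module is importable at the path added above:

    from nautilus_trader.indicators.bias import Bias

    bias = Bias(period=20)
    for close in [100.0, 101.0, 102.0, 101.5]:
        bias.update_raw(close)
    print(bias.value)  # close / MA(close) - 1
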
diff --git a/nautilus_trader/indicators/bollinger_bands.pyx b/nautilus_trader/indicators/bollinger_bands.pyx
index 410fd3977b00..8fd9c14c89fa 100644
--- a/nautilus_trader/indicators/bollinger_bands.pyx
+++ b/nautilus_trader/indicators/bollinger_bands.pyx
@@ -33,7 +33,7 @@ cdef class BollingerBands(Indicator):
"""
A Bollinger Band® is a technical analysis tool defined by a set of
trend lines plotted two standard deviations (positively and negatively) away
- from a simple moving average (SMA) of a instrument_id's price, but which can be
+    from a simple moving average (SMA) of an instrument's price, which can be
adjusted to user preferences.
Parameters
diff --git a/nautilus_trader/indicators/cci.pyx b/nautilus_trader/indicators/cci.pyx
index d671e554b96f..40d24abed201 100644
--- a/nautilus_trader/indicators/cci.pyx
+++ b/nautilus_trader/indicators/cci.pyx
@@ -17,12 +17,11 @@ from collections import deque
import numpy as np
-from nautilus_trader.core.stats cimport fast_mad_with_mean
-
from nautilus_trader.indicators.average.ma_factory import MovingAverageFactory
from nautilus_trader.indicators.average.ma_factory import MovingAverageType
from nautilus_trader.core.correctness cimport Condition
+from nautilus_trader.core.stats cimport fast_mad_with_mean
from nautilus_trader.indicators.base.indicator cimport Indicator
from nautilus_trader.model.data.bar cimport Bar
diff --git a/nautilus_trader/indicators/cmo.pxd b/nautilus_trader/indicators/cmo.pxd
new file mode 100644
index 000000000000..de55ca6b60cf
--- /dev/null
+++ b/nautilus_trader/indicators/cmo.pxd
@@ -0,0 +1,30 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.indicators.average.moving_average cimport MovingAverage
+from nautilus_trader.indicators.base.indicator cimport Indicator
+
+
+cdef class ChandeMomentumOscillator(Indicator):
+ cdef MovingAverage _average_gain
+ cdef MovingAverage _average_loss
+ cdef double _previous_close
+
+ cdef readonly int period
+ """The window period.\n\n:returns: `int`"""
+ cdef readonly double value
+ """The current value.\n\n:returns: `double`"""
+
+ cpdef void update_raw(self, double close) except *
diff --git a/nautilus_trader/indicators/cmo.pyx b/nautilus_trader/indicators/cmo.pyx
new file mode 100644
index 000000000000..ce7bc82b8392
--- /dev/null
+++ b/nautilus_trader/indicators/cmo.pyx
@@ -0,0 +1,109 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.indicators.average.ma_factory import MovingAverageFactory
+from nautilus_trader.indicators.average.moving_average import MovingAverageType
+
+from nautilus_trader.core.correctness cimport Condition
+from nautilus_trader.indicators.base.indicator cimport Indicator
+from nautilus_trader.model.data.bar cimport Bar
+
+
+cdef class ChandeMomentumOscillator(Indicator):
+ """
+ Attempts to capture the momentum of an asset with overbought at 50 and
+ oversold at -50.
+
+ Parameters
+ ----------
+    period : int
+        The rolling window period for the indicator.
+    ma_type : MovingAverageType
+        The moving average type for average gain/loss.
+ """
+
+ def __init__(
+ self,
+ int period,
+ ma_type not None: MovingAverageType=MovingAverageType.WILDER,
+ ):
+ params = [
+ period,
+ ma_type.name,
+ ]
+        super().__init__(params=params)
+
+ self.period = period
+ self._average_gain = MovingAverageFactory.create(period, ma_type)
+ self._average_loss = MovingAverageFactory.create(period, ma_type)
+ self._previous_close = 0
+ self.value = 0
+
+ cpdef void handle_bar(self, Bar bar) except *:
+ """
+ Update the indicator with the given bar.
+
+ Parameters
+ ----------
+ bar : Bar
+ The update bar.
+
+ """
+ Condition.not_none(bar, "bar")
+
+ self.update_raw(bar.close.as_double())
+
+ cpdef void update_raw(self, double close) except *:
+ """
+        Update the indicator with the given close price value.
+
+        Parameters
+        ----------
+        close : double
+            The close price.
+
+ """
+ # Check if first input
+ if not self.has_inputs:
+ self._set_has_inputs(True)
+ self._previous_close = close
+
+ cdef double gain = close - self._previous_close
+
+ if gain > 0:
+ self._average_gain.update_raw(gain)
+ self._average_loss.update_raw(0)
+ elif gain < 0:
+ self._average_gain.update_raw(0)
+ self._average_loss.update_raw(-gain)
+ else:
+ self._average_gain.update_raw(0)
+ self._average_loss.update_raw(0)
+ # Initialization logic
+ if not self.initialized:
+ if self._average_gain.initialized and self._average_loss.initialized:
+ self._set_initialized(True)
+
+ if self.initialized:
+ self.value = 100.0 * (self._average_gain.value - self._average_loss.value)
+ self.value = self.value / (self._average_gain.value + self._average_loss.value)
+
+ self._previous_close = close
+
+ cpdef void _reset(self) except *:
+ self._average_gain.reset()
+ self._average_loss.reset()
+ self._previous_close = 0
+ self.value = 0
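A minimal usage sketch for the new CMO, assuming the module is importable at the path added above:

    from nautilus_trader.indicators.cmo import ChandeMomentumOscillator

    cmo = ChandeMomentumOscillator(period=14)
    for close in [100.0, 101.0, 100.5, 102.0, 101.2]:
        cmo.update_raw(close)
    print(cmo.value)  # 100 * (avg_gain - avg_loss) / (avg_gain + avg_loss)
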
diff --git a/nautilus_trader/indicators/keltner_position.pyx b/nautilus_trader/indicators/keltner_position.pyx
index 5d3019b0b636..2d4ecad3649a 100644
--- a/nautilus_trader/indicators/keltner_position.pyx
+++ b/nautilus_trader/indicators/keltner_position.pyx
@@ -13,7 +13,6 @@
# limitations under the License.
# -------------------------------------------------------------------------------------------------
-
from nautilus_trader.indicators.average.moving_average import MovingAverageType
from nautilus_trader.core.correctness cimport Condition
diff --git a/nautilus_trader/indicators/linear_regression.pyx b/nautilus_trader/indicators/linear_regression.pyx
index b4d4b961e756..0470a904af5d 100644
--- a/nautilus_trader/indicators/linear_regression.pyx
+++ b/nautilus_trader/indicators/linear_regression.pyx
@@ -17,8 +17,6 @@ from collections import deque
from statistics import mean
import numpy as np
-from numpy import arctan as npAtan
-from numpy import pi as npPi
cimport numpy as np
@@ -109,7 +107,7 @@ cdef class LinearRegression(Indicator):
residuals[i] = self.slope * x_arr[i] + self.intercept - y_arr[i]
self.value = residuals[-1] + y_arr[-1]
- self.degree = 180.0 / npPi * npAtan(self.slope)
+ self.degree = 180.0 / np.pi * np.arctan(self.slope)
self.cfo = 100.0 * residuals[-1] / y_arr[-1]
self.R2 = 1.0 - sum(residuals * residuals) / sum((y_arr - mean(y_arr)) * (y_arr - mean(y_arr)))
diff --git a/nautilus_trader/indicators/vhf.pxd b/nautilus_trader/indicators/vhf.pxd
new file mode 100644
index 000000000000..d8902b436e7a
--- /dev/null
+++ b/nautilus_trader/indicators/vhf.pxd
@@ -0,0 +1,32 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.indicators.average.moving_average cimport MovingAverage
+from nautilus_trader.indicators.base.indicator cimport Indicator
+
+
+cdef class VerticalHorizontalFilter(Indicator):
+ cdef MovingAverage _ma
+ cdef object _prices
+
+ cdef readonly int period
+ """The window period.\n\n:returns: `int`"""
+ cdef readonly double _previous_close
+ """The previous close price.\n\n:returns: `double`"""
+ cdef readonly double value
+ """The current value.\n\n:returns: `double`"""
+
+ cpdef void update_raw(self, double close) except *
+ cdef void _check_initialized(self) except *
diff --git a/nautilus_trader/indicators/vhf.pyx b/nautilus_trader/indicators/vhf.pyx
new file mode 100644
index 000000000000..0c36ea4af727
--- /dev/null
+++ b/nautilus_trader/indicators/vhf.pyx
@@ -0,0 +1,111 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from collections import deque
+
+from libc.math cimport fabs
+
+from nautilus_trader.indicators.average.ma_factory import MovingAverageFactory
+from nautilus_trader.indicators.average.ma_factory import MovingAverageType
+
+from nautilus_trader.core.correctness cimport Condition
+from nautilus_trader.indicators.base.indicator cimport Indicator
+from nautilus_trader.model.data.bar cimport Bar
+
+
+cdef class VerticalHorizontalFilter(Indicator):
+ """
+ The Vertical Horizontal Filter (VHF) was created by Adam White to identify
+ trending and ranging markets.
+
+ Parameters
+ ----------
+ period : int
+ The rolling window period for the indicator (> 0).
+ ma_type : MovingAverageType
+ The moving average type for the indicator (cannot be None).
+ """
+
+ def __init__(
+ self,
+ int period,
+ ma_type not None: MovingAverageType=MovingAverageType.SIMPLE,
+ ):
+ Condition.positive_int(period, "period")
+ params = [
+ period,
+ ma_type.name,
+ ]
+ super().__init__(params=params)
+
+ self.period = period
+ self._prices = deque(maxlen=self.period)
+ self._ma = MovingAverageFactory.create(period, ma_type)
+ self._previous_close = 0
+ self.value = 0
+
+ cpdef void handle_bar(self, Bar bar) except *:
+ """
+ Update the indicator with the given bar.
+
+ Parameters
+ ----------
+ bar : Bar
+ The update bar.
+
+ """
+ Condition.not_none(bar, "bar")
+
+ self.update_raw(
+ bar.close.as_double(),
+ )
+
+ cpdef void update_raw(self, double close) except *:
+ """
+ Update the indicator with the given raw value.
+
+ Parameters
+ ----------
+ close : double
+ The close price.
+
+ """
+ # Update inputs
+ if not self.has_inputs:
+ self._previous_close = close
+
+ self._prices.append(close)
+
+ cdef double max_price = max(self._prices)
+ cdef double min_price = min(self._prices)
+
+ self._ma.update_raw(fabs(close - self._previous_close))
+ if self.initialized:
+ self.value = fabs(max_price - min_price) / self.period / self._ma.value
+ self._previous_close = close
+
+ self._check_initialized()
+
+ cdef void _check_initialized(self) except *:
+ if not self.initialized:
+ self._set_has_inputs(True)
+ if self._ma.initialized and len(self._prices) >= self.period:
+ self._set_initialized(True)
+
+ cpdef void _reset(self) except *:
+ self._prices.clear()
+ self._ma.reset()
+ self._previous_close = 0
+ self.value = 0
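The `VerticalHorizontalFilter` above divides the high-low range of the rolling window by the period and then by the moving average of absolute close-to-close changes, which is equivalent to dividing by the rolling sum of those changes. A pure-Python sketch of the same logic, for illustration only (the `vhf` function below is not part of the code base and assumes a simple moving average):

```python
from collections import deque


def vhf(closes, period: int = 10):
    """Illustrative sketch of the Vertical Horizontal Filter logic in update_raw above."""
    prices = deque(maxlen=period)       # rolling window of closes
    abs_changes = deque(maxlen=period)  # rolling |close - previous_close|
    previous_close = closes[0]
    values = []

    for close in closes:
        prices.append(close)
        abs_changes.append(abs(close - previous_close))
        previous_close = close

        if len(prices) >= period:
            price_range = max(prices) - min(prices)
            sum_changes = sum(abs_changes)  # == SMA(|diff|) * period
            values.append(price_range / sum_changes if sum_changes else 0.0)
        else:
            values.append(0.0)  # not yet initialized

    return values


if __name__ == "__main__":
    trending = [100.0 + 0.5 * i for i in range(20)]
    print(vhf(trending, period=10)[-1])  # a trending series gives a higher VHF reading
```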
diff --git a/nautilus_trader/model/data/bar.pxd b/nautilus_trader/model/data/bar.pxd
index 8d9f15a493d6..bbd3c45df173 100644
--- a/nautilus_trader/model/data/bar.pxd
+++ b/nautilus_trader/model/data/bar.pxd
@@ -14,24 +14,24 @@
# -------------------------------------------------------------------------------------------------
from nautilus_trader.core.data cimport Data
-from nautilus_trader.model.c_enums.aggregation_source cimport AggregationSource
+from nautilus_trader.core.rust.model cimport Bar_t
+from nautilus_trader.core.rust.model cimport BarSpecification_t
+from nautilus_trader.core.rust.model cimport BarType_t
from nautilus_trader.model.c_enums.bar_aggregation cimport BarAggregation
-from nautilus_trader.model.c_enums.price_type cimport PriceType
-from nautilus_trader.model.identifiers cimport InstrumentId
-from nautilus_trader.model.objects cimport Price
-from nautilus_trader.model.objects cimport Quantity
cdef class BarSpecification:
- cdef readonly int step
- """The specified step size for bar aggregation.\n\n:returns: `int`"""
- cdef readonly BarAggregation aggregation
- """The specified aggregation method for bars.\n\n:returns: `BarAggregation`"""
- cdef readonly PriceType price_type
- """The specified price type for bar aggregation.\n\n:returns: `PriceType`"""
+ cdef BarSpecification_t _mem
+ cdef str to_str(self)
cdef str aggregation_string_c(self)
+ @staticmethod
+ cdef BarSpecification from_raw_c(BarSpecification_t raw)
+
+ @staticmethod
+ cdef BarSpecification from_str_c(str value)
+
@staticmethod
cdef bint check_time_aggregated_c(BarAggregation aggregation)
@@ -41,21 +41,18 @@ cdef class BarSpecification:
@staticmethod
cdef bint check_information_aggregated_c(BarAggregation aggregation)
- @staticmethod
- cdef BarSpecification from_str_c(str value)
-
cpdef bint is_time_aggregated(self) except *
cpdef bint is_threshold_aggregated(self) except *
cpdef bint is_information_aggregated(self) except *
cdef class BarType:
- cdef readonly InstrumentId instrument_id
- """The bar type instrument ID.\n\n:returns: `InstrumentId`"""
- cdef readonly BarSpecification spec
- """The bar type specification.\n\n:returns: `BarSpecification`"""
- cdef readonly AggregationSource aggregation_source
- """The bar aggregation source.\n\n:returns: `bool`"""
+ cdef BarType_t _mem
+
+ cdef str to_str(self)
+
+ @staticmethod
+ cdef BarType from_raw_c(BarType_t raw)
@staticmethod
cdef BarType from_str_c(str value)
@@ -65,23 +65,14 @@ cdef class BarType:
cdef class Bar(Data):
- cdef readonly BarType type
- """The type of the bar.\n\n:returns: `BarType`"""
- cdef readonly Price open
- """The open price of the bar.\n\n:returns: `Price`"""
- cdef readonly Price high
- """The high price of the bar.\n\n:returns: `Price`"""
- cdef readonly Price low
- """The low price of the bar.\n\n:returns: `Price`"""
- cdef readonly Price close
- """The close price of the bar.\n\n:returns: `Price`"""
- cdef readonly Quantity volume
- """The volume of the bar.\n\n:returns: `Quantity`"""
- cdef readonly bint checked
- """If the input values were integrity checked.\n\n:returns: `bool`"""
+ cdef Bar_t _mem
+
+ cdef str to_str(self)
@staticmethod
cdef Bar from_dict_c(dict values)
@staticmethod
cdef dict to_dict_c(Bar obj)
+
+ cpdef bint is_single_price(self)
diff --git a/nautilus_trader/model/data/bar.pyx b/nautilus_trader/model/data/bar.pyx
index a85d62f2dea5..9c94cd788bd6 100644
--- a/nautilus_trader/model/data/bar.pyx
+++ b/nautilus_trader/model/data/bar.pyx
@@ -13,10 +13,42 @@
# limitations under the License.
# -------------------------------------------------------------------------------------------------
+from nautilus_trader.model.c_enums.bar_aggregation import BarAggregationParser
+from nautilus_trader.model.c_enums.price_type import PriceTypeParser
+
+from cpython.object cimport PyObject
from libc.stdint cimport uint64_t
from nautilus_trader.core.correctness cimport Condition
from nautilus_trader.core.data cimport Data
+from nautilus_trader.core.rust.model cimport BarSpecification_t
+from nautilus_trader.core.rust.model cimport BarType_t
+from nautilus_trader.core.rust.model cimport bar_eq
+from nautilus_trader.core.rust.model cimport bar_free
+from nautilus_trader.core.rust.model cimport bar_hash
+from nautilus_trader.core.rust.model cimport bar_new
+from nautilus_trader.core.rust.model cimport bar_new_from_raw
+from nautilus_trader.core.rust.model cimport bar_specification_eq
+from nautilus_trader.core.rust.model cimport bar_specification_free
+from nautilus_trader.core.rust.model cimport bar_specification_ge
+from nautilus_trader.core.rust.model cimport bar_specification_gt
+from nautilus_trader.core.rust.model cimport bar_specification_hash
+from nautilus_trader.core.rust.model cimport bar_specification_le
+from nautilus_trader.core.rust.model cimport bar_specification_lt
+from nautilus_trader.core.rust.model cimport bar_specification_new
+from nautilus_trader.core.rust.model cimport bar_specification_to_pystr
+from nautilus_trader.core.rust.model cimport bar_to_pystr
+from nautilus_trader.core.rust.model cimport bar_type_eq
+from nautilus_trader.core.rust.model cimport bar_type_free
+from nautilus_trader.core.rust.model cimport bar_type_ge
+from nautilus_trader.core.rust.model cimport bar_type_gt
+from nautilus_trader.core.rust.model cimport bar_type_hash
+from nautilus_trader.core.rust.model cimport bar_type_le
+from nautilus_trader.core.rust.model cimport bar_type_lt
+from nautilus_trader.core.rust.model cimport bar_type_new
+from nautilus_trader.core.rust.model cimport bar_type_to_pystr
+from nautilus_trader.core.rust.model cimport instrument_id_from_pystrs
+from nautilus_trader.model.c_enums.aggregation_source cimport AggregationSource
from nautilus_trader.model.c_enums.aggregation_source cimport AggregationSourceParser
from nautilus_trader.model.c_enums.bar_aggregation cimport BarAggregation
from nautilus_trader.model.c_enums.bar_aggregation cimport BarAggregationParser
@@ -55,34 +87,52 @@ cdef class BarSpecification:
):
Condition.positive_int(step, 'step')
- self.step = step
- self.aggregation = aggregation
- self.price_type = price_type
+ self._mem = bar_specification_new(
+ step,
+ aggregation,
+ price_type
+ )
- def __eq__(self, BarSpecification other) -> bool:
+ def __getstate__(self):
return (
- self.step == other.step
- and self.aggregation == other.aggregation
- and self.price_type == other.price_type
+ self._mem.step,
+ self._mem.aggregation,
+ self._mem.price_type,
+ )
+
+ def __setstate__(self, state):
+ self._mem = bar_specification_new(
+ state[0],
+ state[1],
+ state[2]
)
+ def __del__(self) -> None:
+ bar_specification_free(self._mem) # `self._mem` moved to Rust (then dropped)
+
+ cdef str to_str(self):
+ return bar_specification_to_pystr(&self._mem)
+
+ def __eq__(self, BarSpecification other) -> bool:
+ return bar_specification_eq(&self._mem, &other._mem)
+
def __lt__(self, BarSpecification other) -> bool:
- return str(self) < str(other)
+ return bar_specification_lt(&self._mem, &other._mem)
def __le__(self, BarSpecification other) -> bool:
- return str(self) <= str(other)
+ return bar_specification_le(&self._mem, &other._mem)
def __gt__(self, BarSpecification other) -> bool:
- return str(self) > str(other)
+ return bar_specification_gt(&self._mem, &other._mem)
def __ge__(self, BarSpecification other) -> bool:
- return str(self) >= str(other)
+ return bar_specification_ge(&self._mem, &other._mem)
def __hash__(self) -> int:
- return hash((self.step, self.aggregation, self.price_type))
+ return bar_specification_hash(&self._mem)
def __str__(self) -> str:
- return f"{self.step}-{BarAggregationParser.to_str(self.aggregation)}-{PriceTypeParser.to_str(self.price_type)}"
+ return self.to_str()
def __repr__(self) -> str:
return f"{type(self).__name__}({self})"
@@ -90,6 +140,29 @@ cdef class BarSpecification:
cdef str aggregation_string_c(self):
return BarAggregationParser.to_str(self.aggregation)
+ @staticmethod
+ cdef BarSpecification from_raw_c(BarSpecification_t raw):
+ cdef BarSpecification spec = BarSpecification.__new__(BarSpecification)
+ spec._mem = raw
+ return spec
+
+ @staticmethod
+ cdef BarSpecification from_str_c(str value):
+ Condition.valid_string(value, 'value')
+
+ cdef list pieces = value.rsplit('-', maxsplit=2)
+
+ if len(pieces) != 3:
+ raise ValueError(
+ f"The BarSpecification string value was malformed, was {value}",
+ )
+
+ return BarSpecification(
+ int(pieces[0]),
+ BarAggregationParser.from_str(pieces[1]),
+ PriceTypeParser.from_str(pieces[2]),
+ )
+
@staticmethod
cdef bint check_time_aggregated_c(BarAggregation aggregation):
if (
@@ -130,22 +203,41 @@ cdef class BarSpecification:
else:
return False
- @staticmethod
- cdef BarSpecification from_str_c(str value):
- Condition.valid_string(value, 'value')
+ @property
+ def step(self) -> int:
+ """
+ The step size for the specification.
- cdef list pieces = value.rsplit('-', maxsplit=2)
+ Returns
+ -------
+ int
- if len(pieces) != 3:
- raise ValueError(
- f"The BarSpecification string value was malformed, was {value}",
- )
+ """
+ return self._mem.step
- return BarSpecification(
- int(pieces[0]),
- BarAggregationParser.from_str(pieces[1]),
- PriceTypeParser.from_str(pieces[2]),
- )
+ @property
+ def aggregation(self) -> BarAggregation:
+ """
+ The aggregation for the specification.
+
+ Returns
+ -------
+ BarAggregation
+
+ """
+ return self._mem.aggregation
+
+ @property
+ def price_type(self) -> PriceType:
+ """
+ The price type for the specification.
+
+ Returns
+ -------
+ PriceType
+
+ """
+ return self._mem.price_type
@staticmethod
def from_str(str value) -> BarSpecification:
@@ -310,38 +402,72 @@ cdef class BarType:
BarSpecification bar_spec not None,
AggregationSource aggregation_source=AggregationSource.EXTERNAL,
):
- self.instrument_id = instrument_id
- self.spec = bar_spec
- self.aggregation_source = aggregation_source
+ self._mem = bar_type_new(
+ instrument_id._mem,
+ bar_spec._mem,
+ aggregation_source
+ )
- def __eq__(self, BarType other) -> bool:
+ def __getstate__(self):
return (
- self.instrument_id == other.instrument_id
- and self.spec == other.spec
- and self.aggregation_source == other.aggregation_source
+ self.instrument_id.symbol.value,
+ self.instrument_id.venue.value,
+ self._mem.spec.step,
+ self._mem.spec.aggregation,
+ self._mem.spec.price_type,
+ self._mem.aggregation_source
+ )
+
+ def __setstate__(self, state):
+ self._mem = bar_type_new(
+ instrument_id_from_pystrs(
+ <PyObject *>state[0],
+ <PyObject *>state[1]
+ ),
+ bar_specification_new(
+ state[2],
+ state[3],
+ state[4]
+ ),
+ state[5],
)
+ def __del__(self) -> None:
+ bar_type_free(self._mem) # `self._mem` moved to Rust (then dropped)
+
+ cdef str to_str(self):
+ return bar_type_to_pystr(&self._mem)
+
+ def __eq__(self, BarType other) -> bool:
+ return bar_type_eq(&self._mem, &other._mem)
+
def __lt__(self, BarType other) -> bool:
- return str(self) < str(other)
+ return bar_type_lt(&self._mem, &other._mem)
def __le__(self, BarType other) -> bool:
- return str(self) <= str(other)
+ return bar_type_le(&self._mem, &other._mem)
def __gt__(self, BarType other) -> bool:
- return str(self) > str(other)
+ return bar_type_gt(&self._mem, &other._mem)
def __ge__(self, BarType other) -> bool:
- return str(self) >= str(other)
+ return bar_type_ge(&self._mem, &other._mem)
def __hash__(self) -> int:
- return hash((self.instrument_id, self.spec))
+ return bar_type_hash(&self._mem)
def __str__(self) -> str:
- return f"{self.instrument_id}-{self.spec}-{AggregationSourceParser.to_str(self.aggregation_source)}"
+ return self.to_str()
def __repr__(self) -> str:
return f"{type(self).__name__}({self})"
+ @staticmethod
+ cdef BarType from_raw_c(BarType_t raw):
+ cdef BarType bar_type = BarType.__new__(BarType)
+ bar_type._mem = raw
+ return bar_type
+
@staticmethod
cdef BarType from_str_c(str value):
Condition.valid_string(value, 'value')
@@ -365,6 +491,42 @@ cdef class BarType:
aggregation_source=aggregation_source,
)
+ @property
+ def instrument_id(self) -> InstrumentId:
+ """
+ The instrument ID for the bar type.
+
+ Returns
+ -------
+ InstrumentId
+
+ """
+ return InstrumentId.from_raw_c(self._mem.instrument_id)
+
+ @property
+ def spec(self) -> BarSpecification:
+ """
+ The specification for the bar type.
+
+ Returns
+ -------
+ BarSpecification
+
+ """
+ return BarSpecification.from_raw_c(self._mem.spec)
+
+ @property
+ def aggregation_source(self) -> AggregationSource:
+ """
+ The aggregation source for the bar type.
+
+ Returns
+ -------
+ AggregationSource
+
+ """
+ return self._mem.aggregation_source
+
@staticmethod
def from_str(str value) -> BarType:
"""
@@ -463,22 +625,77 @@ cdef class Bar(Data):
Condition.true(low <= close, 'low was > close')
super().__init__(ts_event, ts_init)
- self.type = bar_type
- self.open = open
- self.high = high
- self.low = low
- self.close = close
- self.volume = volume
- self.checked = check
+ self._mem = bar_new(
+ bar_type._mem,
+ open._mem,
+ high._mem,
+ low._mem,
+ close._mem,
+ volume._mem,
+ ts_event,
+ ts_init,
+ )
+
+ def __getstate__(self):
+ return (
+ self.type.instrument_id.symbol.value,
+ self.type.instrument_id.venue.value,
+ self._mem.bar_type.spec.step,
+ self._mem.bar_type.spec.aggregation,
+ self._mem.bar_type.spec.price_type,
+ self._mem.bar_type.aggregation_source,
+ self._mem.open.raw,
+ self._mem.high.raw,
+ self._mem.low.raw,
+ self._mem.close.raw,
+ self._mem.close.precision,
+ self._mem.volume.raw,
+ self._mem.volume.precision,
+ self.ts_event,
+ self.ts_init,
+ )
+
+ def __setstate__(self, state):
+ self._mem = bar_new_from_raw(
+ bar_type_new(
+ instrument_id_from_pystrs(
+ <PyObject *>state[0],
+ <PyObject *>state[1]
+ ),
+ bar_specification_new(
+ state[2],
+ state[3],
+ state[4]
+ ),
+ state[5],
+ ),
+ state[6],
+ state[7],
+ state[8],
+ state[9],
+ state[10],
+ state[11],
+ state[12],
+ state[13],
+ state[14],
+ )
+ self.ts_event = state[13]
+ self.ts_init = state[14]
+
+ def __del__(self) -> None:
+ bar_free(self._mem) # `self._mem` moved to Rust (then dropped)
def __eq__(self, Bar other) -> bool:
- return Bar.to_dict_c(self) == Bar.to_dict_c(other)
+ return bar_eq(&self._mem, &other._mem)
def __hash__(self) -> int:
- return hash(frozenset(Bar.to_dict_c(self)))
+ return bar_hash(&self._mem)
+
+ cdef str to_str(self):
+ return bar_to_pystr(&self._mem)
def __str__(self) -> str:
- return f"{self.type},{self.open},{self.high},{self.low},{self.close},{self.volume},{self.ts_event}"
+ return self.to_str()
def __repr__(self) -> str:
return f"{type(self).__name__}({self})"
@@ -508,10 +725,82 @@ cdef class Bar(Data):
"low": str(obj.low),
"close": str(obj.close),
"volume": str(obj.volume),
- "ts_event": obj.ts_event,
- "ts_init": obj.ts_init,
+ "ts_event": obj._mem.ts_event,
+ "ts_init": obj._mem.ts_init,
}
+ @property
+ def type(self) -> BarType:
+ """
+ The type of the bar.
+
+ Returns
+ -------
+ BarType
+
+ """
+ return BarType.from_raw_c(self._mem.bar_type)
+
+ @property
+ def open(self) -> Price:
+ """
+ The open price of the bar.
+
+ Returns
+ -------
+ Price
+
+ """
+ return Price.from_raw_c(self._mem.open.raw, self._mem.open.precision)
+
+ @property
+ def high(self) -> Price:
+ """
+ The high price of the bar.
+
+ Returns
+ -------
+ Price
+
+ """
+ return Price.from_raw_c(self._mem.high.raw, self._mem.high.precision)
+
+ @property
+ def low(self) -> Price:
+ """
+ The low price of the bar.
+
+ Returns
+ -------
+ Price
+
+ """
+ return Price.from_raw_c(self._mem.low.raw, self._mem.low.precision)
+
+ @property
+ def close(self) -> Price:
+ """
+ The close price of the bar.
+
+ Returns
+ -------
+ Price
+
+ """
+ return Price.from_raw_c(self._mem.close.raw, self._mem.close.precision)
+
+ @property
+ def volume(self) -> Quantity:
+ """
+ The volume of the bar.
+
+ Returns
+ -------
+ Quantity
+
+ """
+ return Quantity.from_raw_c(self._mem.volume.raw, self._mem.volume.precision)
+
@staticmethod
def from_dict(dict values) -> Bar:
"""
@@ -540,3 +829,14 @@ cdef class Bar(Data):
"""
return Bar.to_dict_c(obj)
+
+ cpdef bint is_single_price(self):
+ """
+ If the OHLC are all equal to a single price.
+
+ Returns
+ -------
+ bool
+
+ """
+ return self._mem.open.raw == self._mem.high.raw == self._mem.low.raw == self._mem.close.raw
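Because `BarSpecification`, `BarType`, and `Bar` now wrap Rust-owned structs in `_mem`, the explicit `__getstate__`/`__setstate__` methods above are what keep them picklable, and the former readonly attributes are re-exposed as properties. A rough usage sketch, assuming this revision of nautilus_trader is installed and that the enum and identifier APIs referenced below behave as elsewhere in the code base (the instrument and values are made up):

```python
import pickle

from nautilus_trader.model.data.bar import Bar, BarSpecification, BarType
from nautilus_trader.model.enums import AggregationSource, BarAggregation, PriceType
from nautilus_trader.model.identifiers import InstrumentId
from nautilus_trader.model.objects import Price, Quantity

# A 1-MINUTE-LAST specification, externally aggregated
spec = BarSpecification(1, BarAggregation.MINUTE, PriceType.LAST)
bar_type = BarType(InstrumentId.from_str("AUD/USD.SIM"), spec, AggregationSource.EXTERNAL)

bar = Bar(
    bar_type=bar_type,
    open=Price.from_str("1.00000"),
    high=Price.from_str("1.00010"),
    low=Price.from_str("0.99990"),
    close=Price.from_str("1.00005"),
    volume=Quantity.from_int(100_000),
    ts_event=0,
    ts_init=0,
)

# Round-trips go through __getstate__/__setstate__ rather than the removed readonly attributes
assert pickle.loads(pickle.dumps(spec)) == spec
assert pickle.loads(pickle.dumps(bar_type)) == bar_type
assert not bar.is_single_price()  # OHLC raw values differ
```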
diff --git a/nautilus_trader/model/objects.pyx b/nautilus_trader/model/objects.pyx
index e0b28297b38c..14f08fc665c9 100644
--- a/nautilus_trader/model/objects.pyx
+++ b/nautilus_trader/model/objects.pyx
@@ -263,6 +263,10 @@ cdef class Quantity:
cdef double raw_to_f64_c(uint64_t raw) except *:
return raw / FIXED_SCALAR
+ @staticmethod
+ def raw_to_f64(raw) -> float:
+ return Quantity.raw_to_f64_c(raw)
+
@staticmethod
cdef Quantity from_raw_c(uint64_t raw, uint8_t precision):
cdef Quantity quantity = Quantity.__new__(Quantity)
@@ -657,6 +661,10 @@ cdef class Price:
cdef double raw_to_f64_c(uint64_t raw) except *:
return raw / FIXED_SCALAR
+ @staticmethod
+ def raw_to_f64(raw) -> float:
+ return Price.raw_to_f64_c(raw)
+
@staticmethod
cdef Price from_str_c(str value):
cdef uint8_t precision = precision_from_str(value)
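The new `raw_to_f64` static methods expose the internal fixed-point conversion (`raw / FIXED_SCALAR`). A standalone sketch of that conversion follows; the 9-decimal scaling (`FIXED_SCALAR = 1e9`) is an assumption here, not something stated in this diff:

```python
FIXED_SCALAR = 1_000_000_000.0  # assumed 9-decimal fixed-point scaling


def raw_to_f64(raw: int) -> float:
    """Mirror Quantity.raw_to_f64 / Price.raw_to_f64: raw fixed-point integer to float."""
    return raw / FIXED_SCALAR


def f64_to_raw(value: float) -> int:
    """Inverse conversion (not part of this diff), handy for round-trip checks."""
    return round(value * FIXED_SCALAR)


assert raw_to_f64(1_000_050_000) == 1.00005
assert f64_to_raw(raw_to_f64(123_456_789)) == 123_456_789
```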
diff --git a/nautilus_trader/model/orderbook/book.pxd b/nautilus_trader/model/orderbook/book.pxd
index 396b4ed29c6b..0f3d48598e74 100644
--- a/nautilus_trader/model/orderbook/book.pxd
+++ b/nautilus_trader/model/orderbook/book.pxd
@@ -44,6 +44,8 @@ cdef class OrderBook:
"""The order books asks.\n\n:returns: `Ladder`"""
cdef readonly int last_update_id
"""The last update ID.\n\n:returns: `int`"""
+ cdef readonly int count
+ """The update count for the book.\n\n:returns: `int`"""
cdef readonly uint64_t ts_last
"""The UNIX timestamp (nanoseconds) when the order book was last updated.\n\n:returns: `uint64_t`"""
diff --git a/nautilus_trader/model/orderbook/book.pyx b/nautilus_trader/model/orderbook/book.pyx
index c27a2c5d0e72..f7d1898179c1 100644
--- a/nautilus_trader/model/orderbook/book.pyx
+++ b/nautilus_trader/model/orderbook/book.pyx
@@ -73,6 +73,7 @@ cdef class OrderBook:
size_precision=size_precision,
)
self.last_update_id = 0
+ self.count = 0
self.ts_last = 0
@staticmethod
@@ -373,6 +374,8 @@ cdef class OrderBook:
else:
self.last_update_id = update_id
+ self.count += 1
+
cdef void _check_integrity(self) except *:
cdef Level top_bid_level = self.bids.top()
cdef Level top_ask_level = self.asks.top()
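The new `count` attribute is a monotonically increasing counter bumped on every applied book update. One way a consumer might use it, purely as an illustration (the handler and threshold below are hypothetical):

```python
THROTTLE_EVERY = 100  # hypothetical: only act on every 100th book update


def on_order_book(book) -> None:
    """Sketch of throttling expensive work by update count; `book` is assumed to expose `.count`."""
    if book.count % THROTTLE_EVERY != 0:
        return
    # ...recompute quotes, book-imbalance metrics, etc. here...
```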
diff --git a/nautilus_trader/persistence/batching.py b/nautilus_trader/persistence/batching.py
index d578ad6b7359..63603e83d063 100644
--- a/nautilus_trader/persistence/batching.py
+++ b/nautilus_trader/persistence/batching.py
@@ -26,7 +26,7 @@
from pyarrow.lib import ArrowInvalid
from nautilus_trader.config import BacktestDataConfig
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.persistence.funcs import parse_bytes
from nautilus_trader.serialization.arrow.serializer import ParquetSerializer
from nautilus_trader.serialization.arrow.util import clean_key
@@ -59,7 +59,7 @@ def dataset_batches(
def build_filenames(
- catalog: DataCatalog,
+ catalog: ParquetDataCatalog,
data_configs: List[BacktestDataConfig],
) -> List[FileMeta]:
files = []
@@ -87,7 +87,7 @@ def frame_to_nautilus(df: pd.DataFrame, cls: type):
def batch_files( # noqa: C901
- catalog: DataCatalog,
+ catalog: ParquetDataCatalog,
data_configs: List[BacktestDataConfig],
read_num_rows: int = 10000,
target_batch_size_bytes: int = parse_bytes("100mb"), # noqa: B008,
diff --git a/nautilus_trader/persistence/catalog/__init__.py b/nautilus_trader/persistence/catalog/__init__.py
new file mode 100644
index 000000000000..e7b319b1fa31
--- /dev/null
+++ b/nautilus_trader/persistence/catalog/__init__.py
@@ -0,0 +1,25 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from .base import BaseDataCatalog
+from .parquet import ParquetDataCatalog
+from .parquet import resolve_path
+
+
+__all__ = (
+ "BaseDataCatalog",
+ "ParquetDataCatalog",
+ "resolve_path",
+)
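With the catalog now a package, downstream code swaps `DataCatalog` for `ParquetDataCatalog`, which can be imported either from the package root (via the re-exports above) or from the concrete module. A before/after sketch, assuming this revision is installed:

```python
# Old import (no longer available after this change):
# from nautilus_trader.persistence.catalog import DataCatalog

# New imports - both names resolve to the same class via the package __init__ re-exports
from nautilus_trader.persistence.catalog import ParquetDataCatalog
from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog as ParquetDataCatalogDirect

assert ParquetDataCatalog is ParquetDataCatalogDirect
```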
diff --git a/nautilus_trader/persistence/catalog/base.py b/nautilus_trader/persistence/catalog/base.py
new file mode 100644
index 000000000000..b1e1f2d96018
--- /dev/null
+++ b/nautilus_trader/persistence/catalog/base.py
@@ -0,0 +1,330 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from abc import ABC
+from abc import ABCMeta
+from abc import abstractclassmethod
+from abc import abstractmethod
+from typing import Callable, Dict, List, Optional, Union
+
+import pandas as pd
+import pyarrow as pa
+
+from nautilus_trader.core.inspect import is_nautilus_class
+from nautilus_trader.model.data.bar import Bar
+from nautilus_trader.model.data.base import DataType
+from nautilus_trader.model.data.base import GenericData
+from nautilus_trader.model.data.tick import QuoteTick
+from nautilus_trader.model.data.tick import TradeTick
+from nautilus_trader.model.data.ticker import Ticker
+from nautilus_trader.model.data.venue import InstrumentStatusUpdate
+from nautilus_trader.model.instruments.base import Instrument
+from nautilus_trader.model.orderbook.data import OrderBookData
+from nautilus_trader.persistence.base import Singleton
+from nautilus_trader.persistence.external.metadata import load_mappings
+from nautilus_trader.serialization.arrow.serializer import ParquetSerializer
+from nautilus_trader.serialization.arrow.util import GENERIC_DATA_PREFIX
+from nautilus_trader.serialization.arrow.util import dict_of_lists_to_list_of_dicts
+
+
+class _CombinedMeta(Singleton, ABCMeta):
+ pass
+
+
+class BaseDataCatalog(ABC, metaclass=_CombinedMeta):
+ """
+ Provides an abstract base class for a queryable data catalog.
+ """
+
+ @abstractclassmethod
+ def from_env(cls):
+ raise NotImplementedError
+
+ @abstractclassmethod
+ def from_uri(cls, uri):
+ raise NotImplementedError
+
+ # -- QUERIES -----------------------------------------------------------------------------------
+
+ @abstractmethod
+ def _query(
+ self,
+ cls: type,
+ filter_expr: Optional[Callable] = None,
+ instrument_ids=None,
+ start=None,
+ end=None,
+ ts_column="ts_init",
+ raise_on_empty: bool = True,
+ instrument_id_column="instrument_id",
+ table_kwargs: Optional[Dict] = None,
+ clean_instrument_keys: bool = True,
+ as_dataframe: bool = True,
+ projections: Optional[Dict] = None,
+ **kwargs,
+ ):
+ raise NotImplementedError
+
+ def load_inverse_mappings(self, path):
+ mappings = load_mappings(fs=self.fs, path=path)
+ for key in mappings:
+ mappings[key] = {v: k for k, v in mappings[key].items()}
+ return mappings
+
+ @staticmethod
+ def _handle_table_dataframe(
+ table: pa.Table,
+ mappings: Optional[Dict],
+ raise_on_empty: bool = True,
+ sort_columns: Optional[List] = None,
+ as_type: Optional[Dict] = None,
+ ):
+ df = table.to_pandas().drop_duplicates()
+ for col in mappings:
+ df.loc[:, col] = df[col].map(mappings[col])
+
+ if df.empty and raise_on_empty:
+ local_vars = dict(locals())
+ kw = [f"{k}={local_vars[k]}" for k in ("filter_expr", "instrument_ids", "start", "end") if k in local_vars]
+ raise ValueError(f"Data empty for {kw}")
+ if sort_columns:
+ df = df.sort_values(sort_columns)
+ if as_type:
+ df = df.astype(as_type)
+ return df
+
+ @staticmethod
+ def _handle_table_nautilus(
+ table: Union[pa.Table, pd.DataFrame], cls: type, mappings: Optional[Dict]
+ ):
+ if isinstance(table, pa.Table):
+ dicts = dict_of_lists_to_list_of_dicts(table.to_pydict())
+ elif isinstance(table, pd.DataFrame):
+ dicts = table.to_dict("records")
+ else:
+ raise TypeError(
+ f"`table` was {type(table)}, expected `pyarrow.Table` or `pandas.DataFrame`"
+ )
+ if not dicts:
+ return []
+ for key, maps in mappings.items():
+ for d in dicts:
+ if d[key] in maps:
+ d[key] = maps[d[key]]
+ data = ParquetSerializer.deserialize(cls=cls, chunk=dicts)
+ return data
+
+ def query(
+ self,
+ cls: type,
+ filter_expr: Optional[Callable] = None,
+ instrument_ids=None,
+ as_nautilus: bool = False,
+ sort_columns: Optional[List[str]] = None,
+ as_type: Optional[Dict] = None,
+ **kwargs,
+ ):
+ if not is_nautilus_class(cls):
+ # Special handling for generic data
+ return self.generic_data(
+ cls=cls,
+ filter_expr=filter_expr,
+ instrument_ids=instrument_ids,
+ as_nautilus=as_nautilus,
+ **kwargs,
+ )
+ return self._query(
+ cls=cls,
+ filter_expr=filter_expr,
+ instrument_ids=instrument_ids,
+ sort_columns=sort_columns,
+ as_type=as_type,
+ as_dataframe=not as_nautilus,
+ **kwargs,
+ )
+
+ @abstractmethod
+ def _query_subclasses(
+ self,
+ base_cls: type,
+ filter_expr: Optional[Callable] = None,
+ instrument_ids=None,
+ as_nautilus: bool = False,
+ **kwargs,
+ ):
+ raise NotImplementedError
+
+ def instruments(
+ self,
+ instrument_type: Optional[type] = None,
+ instrument_ids=None,
+ filter_expr: Optional[Callable] = None,
+ as_nautilus: bool = False,
+ **kwargs,
+ ):
+ if instrument_type is not None:
+ assert isinstance(instrument_type, type)
+ base_cls = instrument_type
+ else:
+ base_cls = Instrument
+
+ return self._query_subclasses(
+ base_cls=base_cls,
+ instrument_ids=instrument_ids,
+ filter_expr=filter_expr,
+ as_nautilus=as_nautilus,
+ instrument_id_column="id",
+ clean_instrument_keys=False,
+ **kwargs,
+ )
+
+ def instrument_status_updates(
+ self,
+ instrument_ids=None,
+ filter_expr: Optional[Callable] = None,
+ as_nautilus: bool = False,
+ **kwargs,
+ ):
+ return self.query(
+ cls=InstrumentStatusUpdate,
+ instrument_ids=instrument_ids,
+ filter_expr=filter_expr,
+ as_nautilus=as_nautilus,
+ sort_columns=["instrument_id", "ts_init"],
+ **kwargs,
+ )
+
+ def trade_ticks(
+ self,
+ instrument_ids=None,
+ filter_expr: Optional[Callable] = None,
+ as_nautilus: bool = False,
+ **kwargs,
+ ):
+ return self.query(
+ cls=TradeTick,
+ filter_expr=filter_expr,
+ instrument_ids=instrument_ids,
+ as_nautilus=as_nautilus,
+ as_type={"price": float, "size": float},
+ **kwargs,
+ )
+
+ def quote_ticks(
+ self,
+ instrument_ids=None,
+ filter_expr: Optional[Callable] = None,
+ as_nautilus: bool = False,
+ **kwargs,
+ ):
+ return self.query(
+ cls=QuoteTick,
+ filter_expr=filter_expr,
+ instrument_ids=instrument_ids,
+ as_nautilus=as_nautilus,
+ **kwargs,
+ )
+
+ def tickers(
+ self,
+ instrument_ids=None,
+ filter_expr: Optional[Callable] = None,
+ as_nautilus: bool = False,
+ **kwargs,
+ ):
+ return self._query_subclasses(
+ base_cls=Ticker,
+ filter_expr=filter_expr,
+ instrument_ids=instrument_ids,
+ as_nautilus=as_nautilus,
+ **kwargs,
+ )
+
+ def bars(
+ self,
+ instrument_ids=None,
+ filter_expr: Optional[Callable] = None,
+ as_nautilus: bool = False,
+ **kwargs,
+ ):
+ return self._query_subclasses(
+ base_cls=Bar,
+ filter_expr=filter_expr,
+ instrument_ids=instrument_ids,
+ as_nautilus=as_nautilus,
+ **kwargs,
+ )
+
+ def order_book_deltas(
+ self,
+ instrument_ids=None,
+ filter_expr: Optional[Callable] = None,
+ as_nautilus: bool = False,
+ **kwargs,
+ ):
+ return self.query(
+ cls=OrderBookData,
+ filter_expr=filter_expr,
+ instrument_ids=instrument_ids,
+ as_nautilus=as_nautilus,
+ **kwargs,
+ )
+
+ def generic_data(
+ self,
+ cls: type,
+ filter_expr: Optional[Callable] = None,
+ as_nautilus: bool = False,
+ **kwargs,
+ ):
+ data = self._query(
+ cls=cls,
+ filter_expr=filter_expr,
+ as_dataframe=not as_nautilus,
+ **kwargs,
+ )
+ if as_nautilus:
+ if data is None:
+ return []
+ return [GenericData(data_type=DataType(cls), data=d) for d in data]
+ return data
+
+ @abstractmethod
+ def list_data_types(self):
+ raise NotImplementedError
+
+ def list_generic_data_types(self):
+ data_types = self.list_data_types()
+ return [
+ n.replace(GENERIC_DATA_PREFIX, "")
+ for n in data_types
+ if n.startswith(GENERIC_DATA_PREFIX)
+ ]
+
+ @abstractmethod
+ def list_backtests(self) -> List[str]:
+ raise NotImplementedError
+
+ @abstractmethod
+ def list_live_runs(self) -> List[str]:
+ raise NotImplementedError
+
+ @abstractmethod
+ def read_live_run(self, live_run_id: str, **kwargs):
+ raise NotImplementedError
+
+ @abstractmethod
+ def read_backtest(self, backtest_run_id: str, **kwargs):
+ raise NotImplementedError
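`BaseDataCatalog` keeps the shared query surface (`query`, `instruments`, `bars`, `generic_data`, ...) and defers storage to the abstract methods. The sketch below shows the overrides a concrete backend would need to provide; `InMemoryDataCatalog` is hypothetical and purely illustrative, not something shipped by the platform:

```python
from typing import Dict, List

from nautilus_trader.persistence.catalog.base import BaseDataCatalog


class InMemoryDataCatalog(BaseDataCatalog):
    """Hypothetical catalog backed by a dict of lists keyed by data type."""

    def __init__(self):
        self._store: Dict[type, list] = {}

    @classmethod
    def from_env(cls):
        return cls()

    @classmethod
    def from_uri(cls, uri):
        return cls()

    def _query(self, cls: type, filter_expr=None, instrument_ids=None, as_dataframe=True, **kwargs):
        # A real backend would apply the filters and optionally return a DataFrame
        return self._store.get(cls, [])

    def _query_subclasses(self, base_cls: type, filter_expr=None, instrument_ids=None, as_nautilus=False, **kwargs):
        return [o for c, objs in self._store.items() if issubclass(c, base_cls) for o in objs]

    def list_data_types(self):
        return [c.__name__ for c in self._store]

    def list_backtests(self) -> List[str]:
        return []

    def list_live_runs(self) -> List[str]:
        return []

    def read_live_run(self, live_run_id: str, **kwargs):
        raise NotImplementedError

    def read_backtest(self, backtest_run_id: str, **kwargs):
        raise NotImplementedError
```

Queries such as `catalog.bars(...)` or `catalog.generic_data(...)` then route through the shared helpers defined in the base class.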
diff --git a/nautilus_trader/persistence/catalog.py b/nautilus_trader/persistence/catalog/parquet.py
similarity index 69%
rename from nautilus_trader/persistence/catalog.py
rename to nautilus_trader/persistence/catalog/parquet.py
index 08b80125624d..da67b006d961 100644
--- a/nautilus_trader/persistence/catalog.py
+++ b/nautilus_trader/persistence/catalog/parquet.py
@@ -26,30 +26,19 @@
from fsspec.utils import infer_storage_options
from pyarrow import ArrowInvalid
-from nautilus_trader.core.inspect import is_nautilus_class
-from nautilus_trader.model.data.bar import Bar
-from nautilus_trader.model.data.base import DataType
-from nautilus_trader.model.data.base import GenericData
-from nautilus_trader.model.data.tick import QuoteTick
-from nautilus_trader.model.data.tick import TradeTick
-from nautilus_trader.model.data.ticker import Ticker
-from nautilus_trader.model.data.venue import InstrumentStatusUpdate
-from nautilus_trader.model.instruments.base import Instrument
-from nautilus_trader.model.orderbook.data import OrderBookData
-from nautilus_trader.persistence.base import Singleton
+from nautilus_trader.persistence.catalog.base import BaseDataCatalog
from nautilus_trader.persistence.external.metadata import load_mappings
from nautilus_trader.serialization.arrow.serializer import ParquetSerializer
from nautilus_trader.serialization.arrow.serializer import list_schemas
-from nautilus_trader.serialization.arrow.util import GENERIC_DATA_PREFIX
from nautilus_trader.serialization.arrow.util import camel_to_snake_case
from nautilus_trader.serialization.arrow.util import class_to_filename
from nautilus_trader.serialization.arrow.util import clean_key
from nautilus_trader.serialization.arrow.util import dict_of_lists_to_list_of_dicts
-class DataCatalog(metaclass=Singleton):
+class ParquetDataCatalog(BaseDataCatalog):
"""
- Provides a queryable data catalog.
+ Provides a queryable data catalog persisted to files in Parquet format.
Parameters
----------
@@ -193,35 +182,6 @@ def _make_path(self, cls: type) -> str:
path: pathlib.Path = self.path / "data" / f"{class_to_filename(cls=cls)}.parquet"
return str(resolve_path(path=path, fs=self.fs))
- def query(
- self,
- cls: type,
- filter_expr: Optional[Callable] = None,
- instrument_ids=None,
- as_nautilus: bool = False,
- sort_columns: Optional[List[str]] = None,
- as_type: Optional[Dict] = None,
- **kwargs,
- ):
- if not is_nautilus_class(cls):
- # Special handling for generic data
- return self.generic_data(
- cls=cls,
- filter_expr=filter_expr,
- instrument_ids=instrument_ids,
- as_nautilus=as_nautilus,
- **kwargs,
- )
- return self._query(
- cls=cls,
- filter_expr=filter_expr,
- instrument_ids=instrument_ids,
- sort_columns=sort_columns,
- as_type=as_type,
- as_dataframe=not as_nautilus,
- **kwargs,
- )
-
def _query_subclasses(
self,
base_cls: type,
@@ -259,153 +219,10 @@ def _query_subclasses(
objects = [o for objs in filter(None, dfs) for o in objs]
return objects
- def instruments(
- self,
- instrument_type: Optional[type] = None,
- instrument_ids=None,
- filter_expr: Optional[Callable] = None,
- as_nautilus: bool = False,
- **kwargs,
- ):
- if instrument_type is not None:
- assert isinstance(instrument_type, type)
- base_cls = instrument_type
- else:
- base_cls = Instrument
-
- return self._query_subclasses(
- base_cls=base_cls,
- instrument_ids=instrument_ids,
- filter_expr=filter_expr,
- as_nautilus=as_nautilus,
- instrument_id_column="id",
- clean_instrument_keys=False,
- **kwargs,
- )
-
- def instrument_status_updates(
- self,
- instrument_ids=None,
- filter_expr: Optional[Callable] = None,
- as_nautilus: bool = False,
- **kwargs,
- ):
- return self.query(
- cls=InstrumentStatusUpdate,
- instrument_ids=instrument_ids,
- filter_expr=filter_expr,
- as_nautilus=as_nautilus,
- sort_columns=["instrument_id", "ts_init"],
- **kwargs,
- )
-
- def trade_ticks(
- self,
- instrument_ids=None,
- filter_expr: Optional[Callable] = None,
- as_nautilus: bool = False,
- **kwargs,
- ):
- return self.query(
- cls=TradeTick,
- filter_expr=filter_expr,
- instrument_ids=instrument_ids,
- as_nautilus=as_nautilus,
- as_type={"price": float, "size": float},
- **kwargs,
- )
-
- def quote_ticks(
- self,
- instrument_ids=None,
- filter_expr: Optional[Callable] = None,
- as_nautilus: bool = False,
- **kwargs,
- ):
- return self.query(
- cls=QuoteTick,
- filter_expr=filter_expr,
- instrument_ids=instrument_ids,
- as_nautilus=as_nautilus,
- **kwargs,
- )
-
- def tickers(
- self,
- instrument_ids=None,
- filter_expr: Optional[Callable] = None,
- as_nautilus: bool = False,
- **kwargs,
- ):
- return self._query_subclasses(
- base_cls=Ticker,
- filter_expr=filter_expr,
- instrument_ids=instrument_ids,
- as_nautilus=as_nautilus,
- **kwargs,
- )
-
- def bars(
- self,
- instrument_ids=None,
- filter_expr: Optional[Callable] = None,
- as_nautilus: bool = False,
- **kwargs,
- ):
- return self._query_subclasses(
- base_cls=Bar,
- filter_expr=filter_expr,
- instrument_ids=instrument_ids,
- as_nautilus=as_nautilus,
- **kwargs,
- )
-
- def order_book_deltas(
- self,
- instrument_ids=None,
- filter_expr: Optional[Callable] = None,
- as_nautilus: bool = False,
- **kwargs,
- ):
- return self.query(
- cls=OrderBookData,
- filter_expr=filter_expr,
- instrument_ids=instrument_ids,
- as_nautilus=as_nautilus,
- **kwargs,
- )
-
- def generic_data(
- self,
- cls: type,
- filter_expr: Optional[Callable] = None,
- as_nautilus: bool = False,
- **kwargs,
- ):
- data = self._query(
- cls=cls,
- filter_expr=filter_expr,
- as_dataframe=not as_nautilus,
- **kwargs,
- )
- if as_nautilus:
- if data is None:
- return []
- return [GenericData(data_type=DataType(cls), data=d) for d in data]
- return data
-
def list_data_types(self):
glob_path = resolve_path(self.path / "data" / "*.parquet", fs=self.fs)
return [pathlib.Path(p).stem for p in self.fs.glob(glob_path)]
- def list_generic_data_types(self):
- data_types = self.list_data_types()
- return [
- n.replace(GENERIC_DATA_PREFIX, "")
- for n in data_types
- if n.startswith(GENERIC_DATA_PREFIX)
- ]
-
def list_partitions(self, cls_type: type):
assert isinstance(cls_type, type), "`cls_type` should be type, i.e. TradeTick"
name = class_to_filename(cls_type)
diff --git a/nautilus_trader/persistence/external/core.py b/nautilus_trader/persistence/external/core.py
index cb0425c1d64a..94d49cc7913c 100644
--- a/nautilus_trader/persistence/external/core.py
+++ b/nautilus_trader/persistence/external/core.py
@@ -34,8 +34,9 @@
from nautilus_trader.core.correctness import PyCondition
from nautilus_trader.model.data.base import GenericData
from nautilus_trader.model.instruments.base import Instrument
-from nautilus_trader.persistence.catalog import DataCatalog
-from nautilus_trader.persistence.catalog import resolve_path
+from nautilus_trader.persistence.catalog.base import BaseDataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
+from nautilus_trader.persistence.catalog.parquet import resolve_path
from nautilus_trader.persistence.external.metadata import load_mappings
from nautilus_trader.persistence.external.metadata import write_partition_column_mappings
from nautilus_trader.persistence.external.readers import Reader
@@ -94,7 +95,7 @@ def iter(self):
yield raw
-def process_raw_file(catalog: DataCatalog, raw_file: RawFile, reader: Reader):
+def process_raw_file(catalog: ParquetDataCatalog, raw_file: RawFile, reader: Reader):
n_rows = 0
for block in raw_file.iter():
objs = [x for x in reader.parse(block) if x is not None]
@@ -108,7 +109,7 @@ def process_raw_file(catalog: DataCatalog, raw_file: RawFile, reader: Reader):
def process_files(
glob_path,
reader: Reader,
- catalog: DataCatalog,
+ catalog: ParquetDataCatalog,
block_size: str = "128mb",
compression: str = "infer",
executor: Optional[Executor] = None,
@@ -202,7 +203,7 @@ def determine_partition_cols(cls: type, instrument_id: str = None) -> Union[List
return None
-def merge_existing_data(catalog: DataCatalog, cls: type, df: pd.DataFrame) -> pd.DataFrame:
+def merge_existing_data(catalog: BaseDataCatalog, cls: type, df: pd.DataFrame) -> pd.DataFrame:
"""
Handle existing data for instrument subclasses.
@@ -219,7 +220,9 @@ def merge_existing_data(catalog: DataCatalog, cls: type, df: pd.DataFrame) -> pd
return df
-def write_tables(catalog: DataCatalog, tables: Dict[type, Dict[str, pd.DataFrame]], **kwargs):
+def write_tables(
+ catalog: ParquetDataCatalog, tables: Dict[type, Dict[str, pd.DataFrame]], **kwargs
+):
"""
Write tables to catalog.
"""
@@ -334,7 +337,7 @@ def write_parquet(
write_partition_column_mappings(fs=fs, path=path, mappings=mappings)
-def write_objects(catalog: DataCatalog, chunk: List, **kwargs):
+def write_objects(catalog: ParquetDataCatalog, chunk: List, **kwargs):
serialized = split_and_serialize(objs=chunk)
tables = dicts_to_dataframes(serialized)
write_tables(catalog=catalog, tables=tables, **kwargs)
@@ -378,7 +381,7 @@ def _parse_file_start(fn: str) -> Optional[Tuple[str, pd.Timestamp]]:
return None
-def _validate_dataset(catalog: DataCatalog, path: str, new_partition_format="%Y%m%d"):
+def _validate_dataset(catalog: ParquetDataCatalog, path: str, new_partition_format="%Y%m%d"):
"""
Repartition dataset into sorted time chunks (default dates) and drop duplicates.
"""
@@ -408,7 +411,7 @@ def _validate_dataset(catalog: DataCatalog, path: str, new_partition_format="%Y%
fs.rm(fn)
-def validate_data_catalog(catalog: DataCatalog, **kwargs):
+def validate_data_catalog(catalog: ParquetDataCatalog, **kwargs):
for cls in catalog.list_data_types():
path = resolve_path(catalog.path / "data" / f"{cls}.parquet", fs=catalog.fs)
_validate_dataset(catalog=catalog, path=path, **kwargs)
diff --git a/nautilus_trader/persistence/external/readers.py b/nautilus_trader/persistence/external/readers.py
index a35147b157d9..460a329278b9 100644
--- a/nautilus_trader/persistence/external/readers.py
+++ b/nautilus_trader/persistence/external/readers.py
@@ -38,9 +38,9 @@ class LinePreprocessor:
2021-06-29T06:03:14.528000 - {"op":"mcm","pt":1624946594395,"mc":[{"id":"1.179082386","rc":[{"atb":[[1.93,0]]}]}
- The raw JSON data is contained after the logging timestamp, but we would
- also want to use this timestamp as the `ts_init` value in Nautilus. In
- this instance, you could use something along the lines of:
+ The raw JSON data is contained after the logging timestamp; additionally, we
+ want to use this timestamp as the Nautilus `ts_init` value. In
+ this instance, you could use something like:
>>> class LoggingLinePreprocessor(LinePreprocessor):
>>> @staticmethod
diff --git a/nautilus_trader/persistence/migrate.py b/nautilus_trader/persistence/migrate.py
index 9dff81ebf52a..c4c3c01ab51f 100644
--- a/nautilus_trader/persistence/migrate.py
+++ b/nautilus_trader/persistence/migrate.py
@@ -13,7 +13,7 @@
# limitations under the License.
# -------------------------------------------------------------------------------------------------
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.base import BaseDataCatalog
from nautilus_trader.persistence.external.core import write_objects
@@ -36,6 +36,6 @@ def inner(*args, **kwargs):
write_objects = create_temp_table(write_objects)
-def migrate(catalog: DataCatalog, version_from: str, version_to: str):
+def migrate(catalog: BaseDataCatalog, version_from: str, version_to: str):
"""Migrate the `catalog` between versions `version_from` and `version_to`"""
pass
diff --git a/nautilus_trader/persistence/migrations/1.135.0.py b/nautilus_trader/persistence/migrations/1.135.0.py
index 1fed228a553a..8b243a904c63 100644
--- a/nautilus_trader/persistence/migrations/1.135.0.py
+++ b/nautilus_trader/persistence/migrations/1.135.0.py
@@ -20,7 +20,7 @@
from tqdm import tqdm
from nautilus_trader.model.data.tick import TradeTick
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.persistence.external.core import write_objects
@@ -30,7 +30,7 @@
# EXAMPLE ONLY - not working
-def main(catalog: DataCatalog):
+def main(catalog: ParquetDataCatalog):
"""Rename match_id to trade_id in TradeTick"""
fs: fsspec.AbstractFileSystem = catalog.fs
@@ -39,7 +39,7 @@ def main(catalog: DataCatalog):
"instrument_id"
].unique()
- tmp_catalog = DataCatalog(str(catalog.path) + "_tmp")
+ tmp_catalog = ParquetDataCatalog(str(catalog.path) + "_tmp")
tmp_catalog.fs = catalog.fs
for ins_id in tqdm(instrument_ids):
diff --git a/nautilus_trader/persistence/migrations/1.137.0.py b/nautilus_trader/persistence/migrations/1.137.0.py
index 90bcd64d3846..5cf717a6cd50 100644
--- a/nautilus_trader/persistence/migrations/1.137.0.py
+++ b/nautilus_trader/persistence/migrations/1.137.0.py
@@ -19,7 +19,7 @@
import pyarrow.dataset as ds
from nautilus_trader.model.instruments.base import Instrument
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.persistence.external.core import write_objects
from nautilus_trader.serialization.arrow.util import class_to_filename
@@ -28,7 +28,7 @@
TO = "1.137.0"
-def main(catalog: DataCatalog):
+def main(catalog: ParquetDataCatalog):
"""Rename local_symbol to native_symbol in Instruments"""
fs: fsspec.AbstractFileSystem = catalog.fs
for cls in Instrument.__subclasses__():
diff --git a/nautilus_trader/persistence/streaming.py b/nautilus_trader/persistence/streaming.py
index 11da33265156..3518d81d9cf3 100644
--- a/nautilus_trader/persistence/streaming.py
+++ b/nautilus_trader/persistence/streaming.py
@@ -30,7 +30,7 @@
from nautilus_trader.model.orderbook.data import OrderBookDelta
from nautilus_trader.model.orderbook.data import OrderBookDeltas
from nautilus_trader.model.orderbook.data import OrderBookSnapshot
-from nautilus_trader.persistence.catalog import resolve_path
+from nautilus_trader.persistence.catalog.parquet import resolve_path
from nautilus_trader.serialization.arrow.serializer import ParquetSerializer
from nautilus_trader.serialization.arrow.serializer import get_cls_table
from nautilus_trader.serialization.arrow.serializer import list_schemas
diff --git a/poetry.lock b/poetry.lock
index 4bee5360b5cb..563f06dd088f 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -91,7 +91,7 @@ tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>
[[package]]
name = "babel"
-version = "2.10.1"
+version = "2.10.3"
description = "Internationalization utilities"
category = "dev"
optional = false
@@ -117,7 +117,7 @@ lxml = ["lxml"]
[[package]]
name = "certifi"
-version = "2022.5.18.1"
+version = "2022.6.15"
description = "Python package for providing Mozilla's CA Bundle."
category = "dev"
optional = false
@@ -144,11 +144,11 @@ python-versions = ">=3.6.1"
[[package]]
name = "charset-normalizer"
-version = "2.0.12"
+version = "2.1.0"
description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
category = "main"
optional = false
-python-versions = ">=3.5.0"
+python-versions = ">=3.6.0"
[package.extras]
unicode_backport = ["unicodedata2"]
@@ -174,7 +174,7 @@ python-versions = ">=3.6"
[[package]]
name = "colorama"
-version = "0.4.4"
+version = "0.4.5"
description = "Cross-platform colored terminal text."
category = "main"
optional = false
@@ -410,7 +410,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
[[package]]
name = "importlib-metadata"
-version = "4.11.4"
+version = "4.12.0"
description = "Read metadata from Python packages"
category = "dev"
optional = false
@@ -422,7 +422,7 @@ zipp = ">=0.5"
[package.extras]
docs = ["sphinx", "jaraco.packaging (>=9)", "rst.linker (>=1.9)"]
perf = ["ipython"]
-testing = ["pytest (>=6)", "pytest-checkdocs (>=2.4)", "pytest-flake8", "pytest-cov", "pytest-enabler (>=1.0.1)", "packaging", "pyfakefs", "flufl.flake8", "pytest-perf (>=0.9.2)", "pytest-black (>=0.3.7)", "pytest-mypy (>=0.9.1)", "importlib-resources (>=1.3)"]
+testing = ["pytest (>=6)", "pytest-checkdocs (>=2.4)", "pytest-flake8", "pytest-cov", "pytest-enabler (>=1.3)", "packaging", "pyfakefs", "flufl.flake8", "pytest-perf (>=0.9.2)", "pytest-black (>=0.3.7)", "pytest-mypy (>=0.9.1)", "importlib-resources (>=1.3)"]
[[package]]
name = "iniconfig"
@@ -532,7 +532,7 @@ python-versions = ">=3.7"
[[package]]
name = "msgspec"
-version = "0.6.0"
+version = "0.7.1"
description = "A fast and friendly JSON/MessagePack library, with optional schema validation"
category = "main"
optional = false
@@ -548,26 +548,26 @@ python-versions = ">=3.7"
[[package]]
name = "myst-parser"
-version = "0.17.2"
+version = "0.18.0"
description = "An extended commonmark compliant parser, with bridges to docutils & sphinx."
category = "dev"
optional = false
python-versions = ">=3.7"
[package.dependencies]
-docutils = ">=0.15,<0.18"
+docutils = ">=0.15,<0.19"
jinja2 = "*"
markdown-it-py = ">=1.0.0,<3.0.0"
mdit-py-plugins = ">=0.3.0,<0.4.0"
pyyaml = "*"
-sphinx = ">=3.1,<5"
+sphinx = ">=4,<6"
typing-extensions = "*"
[package.extras]
code_style = ["pre-commit (>=2.12,<3.0)"]
linkify = ["linkify-it-py (>=1.0,<2.0)"]
-rtd = ["ipython", "sphinx-book-theme", "sphinx-panels", "sphinxcontrib-bibtex (>=2.4,<3.0)", "sphinxext-rediraffe (>=0.2.7,<0.3.0)", "sphinxcontrib.mermaid (>=0.7.1,<0.8.0)", "sphinxext-opengraph (>=0.6.3,<0.7.0)"]
-testing = ["beautifulsoup4", "coverage", "docutils (>=0.17.0,<0.18.0)", "pytest (>=6,<7)", "pytest-cov", "pytest-regressions", "pytest-param-files (>=0.3.4,<0.4.0)"]
+rtd = ["ipython", "sphinx-book-theme", "sphinx-design", "sphinxext-rediraffe (>=0.2.7,<0.3.0)", "sphinxcontrib.mermaid (>=0.7.1,<0.8.0)", "sphinxext-opengraph (>=0.6.3,<0.7.0)"]
+testing = ["beautifulsoup4", "coverage", "pytest (>=6,<7)", "pytest-cov", "pytest-regressions", "pytest-param-files (>=0.3.4,<0.4.0)", "sphinx-pytest"]
[[package]]
name = "nest-asyncio"
@@ -579,7 +579,7 @@ python-versions = ">=3.5"
[[package]]
name = "networkx"
-version = "2.8.3"
+version = "2.8.4"
description = "Python package for creating and manipulating graphs and networks"
category = "main"
optional = true
@@ -588,17 +588,17 @@ python-versions = ">=3.8"
[package.extras]
default = ["numpy (>=1.19)", "scipy (>=1.8)", "matplotlib (>=3.4)", "pandas (>=1.3)"]
developer = ["pre-commit (>=2.19)", "mypy (>=0.960)"]
-doc = ["sphinx (>=4.5)", "pydata-sphinx-theme (>=0.8.1)", "sphinx-gallery (>=0.10)", "numpydoc (>=1.3)", "pillow (>=9.1)", "nb2plots (>=0.6)", "texext (>=0.6.6)"]
+doc = ["sphinx (>=5)", "pydata-sphinx-theme (>=0.9)", "sphinx-gallery (>=0.10)", "numpydoc (>=1.4)", "pillow (>=9.1)", "nb2plots (>=0.6)", "texext (>=0.6.6)"]
extra = ["lxml (>=4.6)", "pygraphviz (>=1.9)", "pydot (>=1.4.2)", "sympy (>=1.10)"]
test = ["pytest (>=7.1)", "pytest-cov (>=3.0)", "codecov (>=2.1)"]
[[package]]
name = "nodeenv"
-version = "1.6.0"
+version = "1.7.0"
description = "Node.js virtual environment builder"
category = "dev"
optional = false
-python-versions = "*"
+python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*"
[[package]]
name = "nox"
@@ -620,7 +620,7 @@ tox_to_nox = ["jinja2", "tox"]
[[package]]
name = "numpy"
-version = "1.22.4"
+version = "1.23.0"
description = "NumPy is the fundamental package for array computing with Python."
category = "main"
optional = false
@@ -628,7 +628,7 @@ python-versions = ">=3.8"
[[package]]
name = "numpydoc"
-version = "1.3.1"
+version = "1.4.0"
description = "Sphinx extension to support docstrings in Numpy format"
category = "dev"
optional = false
@@ -643,7 +643,7 @@ testing = ["pytest", "pytest-cov", "matplotlib"]
[[package]]
name = "orjson"
-version = "3.7.1"
+version = "3.7.5"
description = "Fast, correct Python JSON library supporting dataclasses, datetimes, and numpy"
category = "main"
optional = false
@@ -662,7 +662,7 @@ pyparsing = ">=2.0.2,<3.0.5 || >3.0.5"
[[package]]
name = "pandas"
-version = "1.4.2"
+version = "1.4.3"
description = "Powerful data structures for data analysis, time series, and statistics"
category = "main"
optional = false
@@ -769,7 +769,7 @@ numpy = ">=1.16.6"
[[package]]
name = "pycares"
-version = "4.1.2"
+version = "4.2.1"
description = "Python interface for c-ares"
category = "main"
optional = false
@@ -904,7 +904,7 @@ pytest = ">=3.10"
[[package]]
name = "pytest-mock"
-version = "3.7.0"
+version = "3.8.1"
description = "Thin-wrapper around the mock package for easier use with pytest"
category = "dev"
optional = false
@@ -979,7 +979,7 @@ python-versions = ">=3.6"
[[package]]
name = "redis"
-version = "4.3.3"
+version = "4.3.4"
description = "Python client for Redis database and key-value store"
category = "main"
optional = true
@@ -996,21 +996,21 @@ ocsp = ["cryptography (>=36.0.1)", "pyopenssl (==20.0.1)", "requests (>=2.26.0)"
[[package]]
name = "requests"
-version = "2.27.1"
+version = "2.28.1"
description = "Python HTTP for Humans."
category = "dev"
optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
+python-versions = ">=3.7, <4"
[package.dependencies]
certifi = ">=2017.4.17"
-charset-normalizer = {version = ">=2.0.0,<2.1.0", markers = "python_version >= \"3\""}
-idna = {version = ">=2.5,<4", markers = "python_version >= \"3\""}
+charset-normalizer = ">=2,<3"
+idna = ">=2.5,<4"
urllib3 = ">=1.21.1,<1.27"
[package.extras]
-socks = ["PySocks (>=1.5.6,!=1.5.7)", "win-inet-pton"]
-use_chardet_on_py3 = ["chardet (>=3.0.2,<5)"]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use_chardet_on_py3 = ["chardet (>=3.0.2,<6)"]
[[package]]
name = "scipy"
@@ -1234,11 +1234,11 @@ test = ["pytest"]
[[package]]
name = "tabulate"
-version = "0.8.9"
+version = "0.8.10"
description = "Pretty-print tabular data"
category = "main"
optional = false
-python-versions = "*"
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
[package.extras]
widechars = ["wcwidth"]
@@ -1339,7 +1339,7 @@ test = ["aiohttp", "flake8 (>=3.9.2,<3.10.0)", "psutil", "pycodestyle (>=2.7.0,<
[[package]]
name = "virtualenv"
-version = "20.14.1"
+version = "20.15.1"
description = "Virtual Python Environment builder"
category = "dev"
optional = false
@@ -1395,7 +1395,7 @@ redis = ["hiredis", "redis"]
[metadata]
lock-version = "1.1"
python-versions = ">=3.8,<3.11"
-content-hash = "c081bfbc624034c01edae6ca1614c76dd1ebf1f1fd543a3d0713412dc0bd2e6c"
+content-hash = "75b7b30fd42843b8f4c1d9ecc711ec8a0699479bfbaaf5723c9b52cb36f3d807"
[metadata.files]
aiodns = [
@@ -1501,16 +1501,16 @@ attrs = [
{file = "attrs-21.4.0.tar.gz", hash = "sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd"},
]
babel = [
- {file = "Babel-2.10.1-py3-none-any.whl", hash = "sha256:3f349e85ad3154559ac4930c3918247d319f21910d5ce4b25d439ed8693b98d2"},
- {file = "Babel-2.10.1.tar.gz", hash = "sha256:98aeaca086133efb3e1e2aad0396987490c8425929ddbcfe0550184fdc54cd13"},
+ {file = "Babel-2.10.3-py3-none-any.whl", hash = "sha256:ff56f4892c1c4bf0d814575ea23471c230d544203c7748e8c68f0089478d48eb"},
+ {file = "Babel-2.10.3.tar.gz", hash = "sha256:7614553711ee97490f732126dc077f8d0ae084ebc6a96e23db1482afabdb2c51"},
]
beautifulsoup4 = [
{file = "beautifulsoup4-4.11.1-py3-none-any.whl", hash = "sha256:58d5c3d29f5a36ffeb94f02f0d786cd53014cf9b3b3951d42e0080d8a9498d30"},
{file = "beautifulsoup4-4.11.1.tar.gz", hash = "sha256:ad9aa55b65ef2808eb405f46cf74df7fcb7044d5cbc26487f96eb2ef2e436693"},
]
certifi = [
- {file = "certifi-2022.5.18.1-py3-none-any.whl", hash = "sha256:f1d53542ee8cbedbe2118b5686372fb33c297fcd6379b050cca0ef13a597382a"},
- {file = "certifi-2022.5.18.1.tar.gz", hash = "sha256:9c5705e395cd70084351dd8ad5c41e65655e08ce46f2ec9cf6c2c08390f71eb7"},
+ {file = "certifi-2022.6.15-py3-none-any.whl", hash = "sha256:fe86415d55e84719d75f8b69414f6438ac3547d2078ab91b67e779ef69378412"},
+ {file = "certifi-2022.6.15.tar.gz", hash = "sha256:84c85a9078b11105f04f3036a9482ae10e4621616db313fe045dd24743a0820d"},
]
cffi = [
{file = "cffi-1.15.0-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:c2502a1a03b6312837279c8c1bd3ebedf6c12c4228ddbad40912d671ccc8a962"},
@@ -1569,8 +1569,8 @@ cfgv = [
{file = "cfgv-3.3.1.tar.gz", hash = "sha256:f5a830efb9ce7a445376bb66ec94c638a9787422f96264c98edc6bdeed8ab736"},
]
charset-normalizer = [
- {file = "charset-normalizer-2.0.12.tar.gz", hash = "sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597"},
- {file = "charset_normalizer-2.0.12-py3-none-any.whl", hash = "sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df"},
+ {file = "charset-normalizer-2.1.0.tar.gz", hash = "sha256:575e708016ff3a5e3681541cb9d79312c416835686d054a23accb873b254f413"},
+ {file = "charset_normalizer-2.1.0-py3-none-any.whl", hash = "sha256:5189b6f22b01957427f35b6a08d9a0bc45b46d3788ef5a92e978433c7a35f8a5"},
]
click = [
{file = "click-8.1.3-py3-none-any.whl", hash = "sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48"},
@@ -1581,8 +1581,8 @@ cloudpickle = [
{file = "cloudpickle-2.1.0.tar.gz", hash = "sha256:bb233e876a58491d9590a676f93c7a5473a08f747d5ab9df7f9ce564b3e7938e"},
]
colorama = [
- {file = "colorama-0.4.4-py2.py3-none-any.whl", hash = "sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2"},
- {file = "colorama-0.4.4.tar.gz", hash = "sha256:5941b2b48a20143d2267e95b1c2a7603ce057ee39fd88e7329b0c292aa16869b"},
+ {file = "colorama-0.4.5-py2.py3-none-any.whl", hash = "sha256:854bf444933e37f5824ae7bfc1e98d5bce2ebe4160d46b5edf346a89358e99da"},
+ {file = "colorama-0.4.5.tar.gz", hash = "sha256:e6c6b4334fc50988a639d9b98aa429a0b57da6e17b9a44f0451f930b6967b7a4"},
]
colorlog = [
{file = "colorlog-6.6.0-py2.py3-none-any.whl", hash = "sha256:351c51e866c86c3217f08e4b067a7974a678be78f07f85fc2d55b8babde6d94e"},
@@ -1843,8 +1843,8 @@ imagesize = [
{file = "imagesize-1.3.0.tar.gz", hash = "sha256:cd1750d452385ca327479d45b64d9c7729ecf0b3969a58148298c77092261f9d"},
]
importlib-metadata = [
- {file = "importlib_metadata-4.11.4-py3-none-any.whl", hash = "sha256:c58c8eb8a762858f49e18436ff552e83914778e50e9d2f1660535ffb364552ec"},
- {file = "importlib_metadata-4.11.4.tar.gz", hash = "sha256:5d26852efe48c0a32b0509ffbc583fda1a2266545a78d104a6f4aff3db17d700"},
+ {file = "importlib_metadata-4.12.0-py3-none-any.whl", hash = "sha256:7401a975809ea1fdc658c3aa4f78cc2195a0e019c5cbc4c06122884e9ae80c23"},
+ {file = "importlib_metadata-4.12.0.tar.gz", hash = "sha256:637245b8bab2b6502fcbc752cc4b7a6f6243bb02b31c5c26156ad103d3d45670"},
]
iniconfig = [
{file = "iniconfig-1.1.1-py2.py3-none-any.whl", hash = "sha256:011e24c64b7f47f6ebd835bb12a743f2fbe9a26d4cecaa7f53bc4f35ee9da8b3"},
@@ -1978,19 +1978,28 @@ mdurl = [
{file = "mdurl-0.1.1.tar.gz", hash = "sha256:f79c9709944df218a4cdb0fcc0b0c7ead2f44594e3e84dc566606f04ad749c20"},
]
msgspec = [
- {file = "msgspec-0.6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b12e831cc54b26d19b8f22935c85b475bc8d1fc03f7333da54c9f9ced950b6d8"},
- {file = "msgspec-0.6.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:388cd4036b79cd718a635999f9482f8a95b7ec8481ece848079b55f84aa61395"},
- {file = "msgspec-0.6.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d86573b780e43c0d05f65bcffd683ee1c43b79cace3d0528f73f7c808800fabb"},
- {file = "msgspec-0.6.0-cp310-cp310-win_amd64.whl", hash = "sha256:141240ba69556eba7adc085dd653178c5679af80979695434174b9b7df46ad10"},
- {file = "msgspec-0.6.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:411e594c1388cc978d6bba5f636d0f7f6c18b8813f0c9199c95bd17db3b4d63c"},
- {file = "msgspec-0.6.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a3d8dcbc3ad69f741637fb6c69b09d7c4b44a95ca388da91220446525988c5da"},
- {file = "msgspec-0.6.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:25e8359c7725826deb8fad3340fc206f017fd041245d0633fb257fd47fc99199"},
- {file = "msgspec-0.6.0-cp38-cp38-win_amd64.whl", hash = "sha256:7b2fbbb71ae7420dd3b81c217434fe89065bdcb14e8c885b3e1d2bf401b0b7a0"},
- {file = "msgspec-0.6.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:559f749bbbf691c708bed998be472eea994cad1e846bbc9a68f19af8d7179041"},
- {file = "msgspec-0.6.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5ef7ef54494183ddef8ac4e21eb3fc6d7fbcdc7b56d6a19414e51a672bdaa3e6"},
- {file = "msgspec-0.6.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:3b8ff0237bb81b48ed680749043a969fa56ff06c13120944d2f4432fb3af4888"},
- {file = "msgspec-0.6.0-cp39-cp39-win_amd64.whl", hash = "sha256:ea0f848255520cc3695ba9c174645c64233a6e0195240f00399d4c0ea5888b34"},
- {file = "msgspec-0.6.0.tar.gz", hash = "sha256:27c9615be6a32898d95ba0c7b17251ee4c2d9f5fe51b61b4e1086cf6e017bec1"},
+ {file = "msgspec-0.7.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:4c76fff9dba14477d9be5085c5925f54ae358d15505004b7bbdfc6da7bc7ecc1"},
+ {file = "msgspec-0.7.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:555b5311a8f9908896dbd02361140cdaf4dcc8c73149c0c1162e13b0558e6dbd"},
+ {file = "msgspec-0.7.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c77d5b557b3f92d686e671f396b46832117a15b23f3060a809c64a757844c505"},
+ {file = "msgspec-0.7.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87a23e0cbc946c8542e7936556351245a14ed301aa13a61aa62fcca07ad890aa"},
+ {file = "msgspec-0.7.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:2be49a102142570d5b878d47e1e9d4897a64697c928cca8be57db491e15847cb"},
+ {file = "msgspec-0.7.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:053a300b43028baa138edee85b0a40a4bba1661b8f349d0294a7ca67ad2adda4"},
+ {file = "msgspec-0.7.1-cp310-cp310-win_amd64.whl", hash = "sha256:79190031a61fc98ec87be64eed0b5ffcb88e390beae2454a27b67c3a68d3fd23"},
+ {file = "msgspec-0.7.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b584ba5762ee218f646790678811f7c80f8a565929fb09e03e9b241239424577"},
+ {file = "msgspec-0.7.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:853ff2f4d8508d18f07c14af8149b50dac781262abe1c3cac2a0c0bbeca08883"},
+ {file = "msgspec-0.7.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:550d007775592f65b55e6c1326b483f6e186769df57fc793112f179657450ec5"},
+ {file = "msgspec-0.7.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f492de69eb4098363759af1868f99804da50525d323df370126207bf857b63b6"},
+ {file = "msgspec-0.7.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:405ccd250e139f970610ba02fca113eb1459b4b27d4cd331f89e2e86ccd7feb6"},
+ {file = "msgspec-0.7.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f0f19c2ad47bd2ccc5d7ad3ed01939380f42ed99160b3a8112ce0e9dc8c1fad0"},
+ {file = "msgspec-0.7.1-cp38-cp38-win_amd64.whl", hash = "sha256:9518207d65ba52ea342e9c85aa5a5bdb42b6d9903fc816c09c2b8d16f8759176"},
+ {file = "msgspec-0.7.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:669fae2dd87cff6515af943f8db5974310d3f87dd7741d2904372c2cfe39b2fc"},
+ {file = "msgspec-0.7.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:996bc9363ad1b7e128087f24a522e1bc36cb6c9d25c5a643a3901f85a9afd2f8"},
+ {file = "msgspec-0.7.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:72e042415aba55ddbea5d361ddec2c92acc09f839d7a653c54e1256392717436"},
+ {file = "msgspec-0.7.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dacfa497d25f7deb29f4d8969d331586fe77599a0000f20431095bd38db65ea7"},
+ {file = "msgspec-0.7.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:eda4acf1945ee507bd29f12f4f0c6cb13c39d9020954fa256f2e7b667210ab64"},
+ {file = "msgspec-0.7.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:436220596186559f07d9db3e249e9e02cd745d0fb8e546183580b6f9f45881fd"},
+ {file = "msgspec-0.7.1-cp39-cp39-win_amd64.whl", hash = "sha256:596ecb68c75d78d7c86bf3f7ecd742b345bb3ef448483a520aa10be782b1f3b1"},
+ {file = "msgspec-0.7.1.tar.gz", hash = "sha256:af3210976e499f94d06ae3a7d3a6e7c94874b51f3012405f39fee3a849edaad7"},
]
multidict = [
{file = "multidict-6.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0b9e95a740109c6047602f4db4da9949e6c5945cefbad34a1299775ddc9a62e2"},
@@ -2054,113 +2063,115 @@ multidict = [
{file = "multidict-6.0.2.tar.gz", hash = "sha256:5ff3bd75f38e4c43f1f470f2df7a4d430b821c4ce22be384e1459cb57d6bb013"},
]
myst-parser = [
- {file = "myst-parser-0.17.2.tar.gz", hash = "sha256:4c076d649e066f9f5c7c661bae2658be1ca06e76b002bb97f02a09398707686c"},
- {file = "myst_parser-0.17.2-py3-none-any.whl", hash = "sha256:1635ce3c18965a528d6de980f989ff64d6a1effb482e1f611b1bfb79e38f3d98"},
+ {file = "myst-parser-0.18.0.tar.gz", hash = "sha256:739a4d96773a8e55a2cacd3941ce46a446ee23dcd6b37e06f73f551ad7821d86"},
+ {file = "myst_parser-0.18.0-py3-none-any.whl", hash = "sha256:4965e51918837c13bf1c6f6fe2c6bddddf193148360fbdaefe743a4981358f6a"},
]
nest-asyncio = [
{file = "nest_asyncio-1.5.5-py3-none-any.whl", hash = "sha256:b98e3ec1b246135e4642eceffa5a6c23a3ab12c82ff816a92c612d68205813b2"},
{file = "nest_asyncio-1.5.5.tar.gz", hash = "sha256:e442291cd942698be619823a17a86a5759eabe1f8613084790de189fe9e16d65"},
]
networkx = [
- {file = "networkx-2.8.3-py3-none-any.whl", hash = "sha256:f151edac6f9b0cf11fecce93e236ac22b499bb9ff8d6f8393b9fef5ad09506cc"},
- {file = "networkx-2.8.3.tar.gz", hash = "sha256:67fab04a955a73eb660fe7bf281b6fa71a003bc6e23a92d2f6227654c5223dbe"},
+ {file = "networkx-2.8.4-py3-none-any.whl", hash = "sha256:6933b9b3174a0bdf03c911bb4a1ee43a86ce3edeb813e37e1d4c553b3f4a2c4f"},
+ {file = "networkx-2.8.4.tar.gz", hash = "sha256:5e53f027c0d567cf1f884dbb283224df525644e43afd1145d64c9d88a3584762"},
]
nodeenv = [
- {file = "nodeenv-1.6.0-py2.py3-none-any.whl", hash = "sha256:621e6b7076565ddcacd2db0294c0381e01fd28945ab36bcf00f41c5daf63bef7"},
- {file = "nodeenv-1.6.0.tar.gz", hash = "sha256:3ef13ff90291ba2a4a7a4ff9a979b63ffdd00a464dbe04acf0ea6471517a4c2b"},
+ {file = "nodeenv-1.7.0-py2.py3-none-any.whl", hash = "sha256:27083a7b96a25f2f5e1d8cb4b6317ee8aeda3bdd121394e5ac54e498028a042e"},
+ {file = "nodeenv-1.7.0.tar.gz", hash = "sha256:e0e7f7dfb85fc5394c6fe1e8fa98131a2473e04311a45afb6508f7cf1836fa2b"},
]
nox = [
{file = "nox-2022.1.7-py3-none-any.whl", hash = "sha256:efee12f02d39405b16d68f60e7a06fe1fc450ae58669d6cdda8c7f48e3bae9e3"},
{file = "nox-2022.1.7.tar.gz", hash = "sha256:b375238cebb0b9df2fab74b8d0ce1a50cd80df60ca2e13f38f539454fcd97d7e"},
]
numpy = [
- {file = "numpy-1.22.4-cp310-cp310-macosx_10_14_x86_64.whl", hash = "sha256:ba9ead61dfb5d971d77b6c131a9dbee62294a932bf6a356e48c75ae684e635b3"},
- {file = "numpy-1.22.4-cp310-cp310-macosx_10_15_x86_64.whl", hash = "sha256:1ce7ab2053e36c0a71e7a13a7475bd3b1f54750b4b433adc96313e127b870887"},
- {file = "numpy-1.22.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:7228ad13744f63575b3a972d7ee4fd61815b2879998e70930d4ccf9ec721dce0"},
- {file = "numpy-1.22.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:43a8ca7391b626b4c4fe20aefe79fec683279e31e7c79716863b4b25021e0e74"},
- {file = "numpy-1.22.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a911e317e8c826ea632205e63ed8507e0dc877dcdc49744584dfc363df9ca08c"},
- {file = "numpy-1.22.4-cp310-cp310-win32.whl", hash = "sha256:9ce7df0abeabe7fbd8ccbf343dc0db72f68549856b863ae3dd580255d009648e"},
- {file = "numpy-1.22.4-cp310-cp310-win_amd64.whl", hash = "sha256:3e1ffa4748168e1cc8d3cde93f006fe92b5421396221a02f2274aab6ac83b077"},
- {file = "numpy-1.22.4-cp38-cp38-macosx_10_15_x86_64.whl", hash = "sha256:59d55e634968b8f77d3fd674a3cf0b96e85147cd6556ec64ade018f27e9479e1"},
- {file = "numpy-1.22.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:c1d937820db6e43bec43e8d016b9b3165dcb42892ea9f106c70fb13d430ffe72"},
- {file = "numpy-1.22.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d4c5d5eb2ec8da0b4f50c9a843393971f31f1d60be87e0fb0917a49133d257d6"},
- {file = "numpy-1.22.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64f56fc53a2d18b1924abd15745e30d82a5782b2cab3429aceecc6875bd5add0"},
- {file = "numpy-1.22.4-cp38-cp38-win32.whl", hash = "sha256:fb7a980c81dd932381f8228a426df8aeb70d59bbcda2af075b627bbc50207cba"},
- {file = "numpy-1.22.4-cp38-cp38-win_amd64.whl", hash = "sha256:e96d7f3096a36c8754207ab89d4b3282ba7b49ea140e4973591852c77d09eb76"},
- {file = "numpy-1.22.4-cp39-cp39-macosx_10_14_x86_64.whl", hash = "sha256:4c6036521f11a731ce0648f10c18ae66d7143865f19f7299943c985cdc95afb5"},
- {file = "numpy-1.22.4-cp39-cp39-macosx_10_15_x86_64.whl", hash = "sha256:b89bf9b94b3d624e7bb480344e91f68c1c6c75f026ed6755955117de00917a7c"},
- {file = "numpy-1.22.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:2d487e06ecbf1dc2f18e7efce82ded4f705f4bd0cd02677ffccfb39e5c284c7e"},
- {file = "numpy-1.22.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3eb268dbd5cfaffd9448113539e44e2dd1c5ca9ce25576f7c04a5453edc26fa"},
- {file = "numpy-1.22.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:37431a77ceb9307c28382c9773da9f306435135fae6b80b62a11c53cfedd8802"},
- {file = "numpy-1.22.4-cp39-cp39-win32.whl", hash = "sha256:cc7f00008eb7d3f2489fca6f334ec19ca63e31371be28fd5dad955b16ec285bd"},
- {file = "numpy-1.22.4-cp39-cp39-win_amd64.whl", hash = "sha256:f0725df166cf4785c0bc4cbfb320203182b1ecd30fee6e541c8752a92df6aa32"},
- {file = "numpy-1.22.4-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0791fbd1e43bf74b3502133207e378901272f3c156c4df4954cad833b1380207"},
- {file = "numpy-1.22.4.zip", hash = "sha256:425b390e4619f58d8526b3dcf656dde069133ae5c240229821f01b5f44ea07af"},
+ {file = "numpy-1.23.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:58bfd40eb478f54ff7a5710dd61c8097e169bc36cc68333d00a9bcd8def53b38"},
+ {file = "numpy-1.23.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:196cd074c3f97c4121601790955f915187736f9cf458d3ee1f1b46aff2b1ade0"},
+ {file = "numpy-1.23.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f1d88ef79e0a7fa631bb2c3dda1ea46b32b1fe614e10fedd611d3d5398447f2f"},
+ {file = "numpy-1.23.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d54b3b828d618a19779a84c3ad952e96e2c2311b16384e973e671aa5be1f6187"},
+ {file = "numpy-1.23.0-cp310-cp310-win32.whl", hash = "sha256:2b2da66582f3a69c8ce25ed7921dcd8010d05e59ac8d89d126a299be60421171"},
+ {file = "numpy-1.23.0-cp310-cp310-win_amd64.whl", hash = "sha256:97a76604d9b0e79f59baeca16593c711fddb44936e40310f78bfef79ee9a835f"},
+ {file = "numpy-1.23.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:d8cc87bed09de55477dba9da370c1679bd534df9baa171dd01accbb09687dac3"},
+ {file = "numpy-1.23.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f0f18804df7370571fb65db9b98bf1378172bd4e962482b857e612d1fec0f53e"},
+ {file = "numpy-1.23.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ac86f407873b952679f5f9e6c0612687e51547af0e14ddea1eedfcb22466babd"},
+ {file = "numpy-1.23.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ae8adff4172692ce56233db04b7ce5792186f179c415c37d539c25de7298d25d"},
+ {file = "numpy-1.23.0-cp38-cp38-win32.whl", hash = "sha256:fe8b9683eb26d2c4d5db32cd29b38fdcf8381324ab48313b5b69088e0e355379"},
+ {file = "numpy-1.23.0-cp38-cp38-win_amd64.whl", hash = "sha256:5043bcd71fcc458dfb8a0fc5509bbc979da0131b9d08e3d5f50fb0bbb36f169a"},
+ {file = "numpy-1.23.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1c29b44905af288b3919803aceb6ec7fec77406d8b08aaa2e8b9e63d0fe2f160"},
+ {file = "numpy-1.23.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:98e8e0d8d69ff4d3fa63e6c61e8cfe2d03c29b16b58dbef1f9baa175bbed7860"},
+ {file = "numpy-1.23.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:79a506cacf2be3a74ead5467aee97b81fca00c9c4c8b3ba16dbab488cd99ba10"},
+ {file = "numpy-1.23.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:092f5e6025813e64ad6d1b52b519165d08c730d099c114a9247c9bb635a2a450"},
+ {file = "numpy-1.23.0-cp39-cp39-win32.whl", hash = "sha256:d6ca8dabe696c2785d0c8c9b0d8a9b6e5fdbe4f922bde70d57fa1a2848134f95"},
+ {file = "numpy-1.23.0-cp39-cp39-win_amd64.whl", hash = "sha256:fc431493df245f3c627c0c05c2bd134535e7929dbe2e602b80e42bf52ff760bc"},
+ {file = "numpy-1.23.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:f9c3fc2adf67762c9fe1849c859942d23f8d3e0bee7b5ed3d4a9c3eeb50a2f07"},
+ {file = "numpy-1.23.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d0d2094e8f4d760500394d77b383a1b06d3663e8892cdf5df3c592f55f3bff66"},
+ {file = "numpy-1.23.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:94b170b4fa0168cd6be4becf37cb5b127bd12a795123984385b8cd4aca9857e5"},
+ {file = "numpy-1.23.0.tar.gz", hash = "sha256:bd3fa4fe2e38533d5336e1272fc4e765cabbbde144309ccee8675509d5cd7b05"},
]
numpydoc = [
- {file = "numpydoc-1.3.1-py3-none-any.whl", hash = "sha256:a49822cb225e71b7ef7889dd42576b5aa14c56ce62e0bc030f97abc8a3ae240f"},
- {file = "numpydoc-1.3.1.tar.gz", hash = "sha256:349ff29e00a5caf119141967e579f8f17b24d41c46740b13ea4e8dba9971b20f"},
+ {file = "numpydoc-1.4.0-py3-none-any.whl", hash = "sha256:fd26258868ebcc75c816fe68e1d41e3b55bd410941acfb969dee3eef6e5cf260"},
+ {file = "numpydoc-1.4.0.tar.gz", hash = "sha256:9494daf1c7612f59905fa09e65c9b8a90bbacb3804d91f7a94e778831e6fcfa5"},
]
orjson = [
- {file = "orjson-3.7.1-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:4175929ca77338e6a57ff232c0e80443411ac0b489bfff755988ae70e3f62a97"},
- {file = "orjson-3.7.1-cp310-cp310-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:349514d69ce089b0e39014345907318ddba8ceca32a187635601c391c36ecdd6"},
- {file = "orjson-3.7.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d4a380c164f8c40660ce8f96791695aa74d32a2a45d4038a2a4826d9ec2c61f6"},
- {file = "orjson-3.7.1-cp310-cp310-manylinux_2_24_aarch64.whl", hash = "sha256:26c64da280c9e097081d12047f13b4adba776b19885da5e488c093d4a1461056"},
- {file = "orjson-3.7.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:3a1e2dfa7ba8adb7511f3560e968ff2a66e7d0cd2f454219a0ab778c3c2e1a5c"},
- {file = "orjson-3.7.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:8d877467096dc117500a5ed38238085a81518252db3991793a8468ca97445e6c"},
- {file = "orjson-3.7.1-cp310-none-win_amd64.whl", hash = "sha256:b71915140261916e50dbf62dec1b448c159a789e23bc89b3ad1e6516f677036f"},
- {file = "orjson-3.7.1-cp37-cp37m-macosx_10_7_x86_64.whl", hash = "sha256:033dd5f91a8a967a007d3d05cbabec67040d6ff3e159ea17d3f681c3114a0d78"},
- {file = "orjson-3.7.1-cp37-cp37m-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:7e309db07b13d84bc5eb6ffdfd46f00b2301ce78871809e177f599db3f31fe24"},
- {file = "orjson-3.7.1-cp37-cp37m-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:aa0919735afbdeada9687347fa7963ca9732c60b1005832ac9d4853a9cb48be9"},
- {file = "orjson-3.7.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31367b5d8389373aff1742b66608c4bff318a9015d94981a8c1919e82fe983ca"},
- {file = "orjson-3.7.1-cp37-cp37m-manylinux_2_24_aarch64.whl", hash = "sha256:a23292d6093748eee3f7ed85dbe6c9abd24d4d399d7044bc323ff39834966d6c"},
- {file = "orjson-3.7.1-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:ad0faec8ee89cd50a486804b7d9a97016a1d2074298ceadbc75efc6691030b36"},
- {file = "orjson-3.7.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:43b9a44b42c67adbc02fc86efacf27a374b09971cd58e0cd9739b8a748d19be5"},
- {file = "orjson-3.7.1-cp37-none-win_amd64.whl", hash = "sha256:6fdef8939f528dd9386c4941e88227a3ccf124c8278ebc7e98533294ae446ef1"},
- {file = "orjson-3.7.1-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:f65b7d87534c567136d73f9bcdff46ad5ded2aabe8e89be7ba97395129d48e72"},
- {file = "orjson-3.7.1-cp38-cp38-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:37b41c8869347388b1794a1c92c5e981ab764638f62e026252026a650b1b266c"},
- {file = "orjson-3.7.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6165914b1a209458201bf0a99dcf5f44f58477ba23ac71d7b5e4ca197e174f30"},
- {file = "orjson-3.7.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c4d9d1edd7c92d0b35082ae3a230705206d6beb3c07a8c72ebdd980820598f9c"},
- {file = "orjson-3.7.1-cp38-cp38-manylinux_2_24_aarch64.whl", hash = "sha256:42376b0330cbbe5864b480de16a48f4c82aae95dca9cdcf81490e7ca87cc131a"},
- {file = "orjson-3.7.1-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:454a4d8c81882cbff19eaed90d0a4e42602970c686c8cd34071c8097e3dfdb5c"},
- {file = "orjson-3.7.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f6e691fcf0e03575bef7efd24b331282ad3c1df75855be368e4c3c2cfba6967f"},
- {file = "orjson-3.7.1-cp38-none-win_amd64.whl", hash = "sha256:4f657a16f81b0497e5c67b3c151d9eb8c99d2d3a7ab996500e9ae453e8b0e0fe"},
- {file = "orjson-3.7.1-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:fc7ecc7b38ebf6ac072efa209c5b8d02eb7af393a8a2c812fc01dcb037a86c48"},
- {file = "orjson-3.7.1-cp39-cp39-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:f7b628f4dcfc0b726ede7d2024cf79107d54851451e15edeae2f4ee55554f3c6"},
- {file = "orjson-3.7.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:696661c6b6e58361aba0b14a2c5977c049f481bc7fe41759a55e86b13b361905"},
- {file = "orjson-3.7.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:32f384ff9dd555ee21508887b12316d8bd04921b396c876b4d4c87a30a3c8a13"},
- {file = "orjson-3.7.1-cp39-cp39-manylinux_2_24_aarch64.whl", hash = "sha256:7b03a3f32cd5fcdd8460de690579d1dfa1965bd333b89bd3d202907c7b49ada4"},
- {file = "orjson-3.7.1-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:71c285488c5f767e102a389f9efb11e93e6345247d60043efeffa616a3056945"},
- {file = "orjson-3.7.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:906a33c2fe834cb47daafd09d77405260caa5fa1354219bec5df9ac2b4e909fa"},
- {file = "orjson-3.7.1-cp39-none-win_amd64.whl", hash = "sha256:5730e44fc20891cadea7d163a2dad723f95cf81199e1a02dac339a11437a999d"},
- {file = "orjson-3.7.1.tar.gz", hash = "sha256:3ae89fd45bc9c72dcc0a489aa2411f139ee8a32468c387188be21d25f20f83d3"},
+ {file = "orjson-3.7.5-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:bfa8d3412d1f73d8cceb5ce96cfb50ba897d3bc9acc5e2e13240a94a75fcdbdc"},
+ {file = "orjson-3.7.5-cp310-cp310-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:fad043bd582ad4e9177d4d32173c3d388a0ab5a71ab51ce5a01aaee8ddf338d4"},
+ {file = "orjson-3.7.5-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e481c8511bab48dbc2db7389944fa85b95fabceec6693fa5f0876cea4910cf73"},
+ {file = "orjson-3.7.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3caade3cea3003263cb87f2c193e9dcdc82e83ae936631a494f980f677fc8a46"},
+ {file = "orjson-3.7.5-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:928e1120057cea1f9a931740c840e0faeaf019ed3df767c499450057c3f878b7"},
+ {file = "orjson-3.7.5-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:d849b269a55564821b64f3e781b2e7746fb5c39f28bfe27cf6330a43eb89ed2b"},
+ {file = "orjson-3.7.5-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:c124fffb72a9393f21ba27519e8a619d473b6b94c591045a8a206344425e2093"},
+ {file = "orjson-3.7.5-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:8d0983147121e9e87dddda7997c08331fdf0b26fad22a11e11bfc333169bb305"},
+ {file = "orjson-3.7.5-cp310-none-win_amd64.whl", hash = "sha256:65a8d18fdaee45807b8bd7ead2898293d23d4d766cf42d63022a40758591a707"},
+ {file = "orjson-3.7.5-cp37-cp37m-macosx_10_7_x86_64.whl", hash = "sha256:fcdef342b0a810a15a559208b24f13198ce2252a441868e0265bdb82ba85b1eb"},
+ {file = "orjson-3.7.5-cp37-cp37m-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:ad035d250d438cfd2a1d38dd54d5b800ca67290a89e14b5345c55b2fb0e186e0"},
+ {file = "orjson-3.7.5-cp37-cp37m-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7316ea00623121178139240fddc0bc2de3f730b0d93081c7f5c6780eea75417b"},
+ {file = "orjson-3.7.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd46103e9d60a87e9497ccb72b6a32d0f12586d0abcd1a5a5a375bacad3b5f50"},
+ {file = "orjson-3.7.5-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:dca21c76552b0b539b38121ffd397e129af6b2b0d9c586a96e33652691d3e87d"},
+ {file = "orjson-3.7.5-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:376248b40b7c917c563311af1ce511f9e01c7e982bdf5c01d1287f7d9672cf95"},
+ {file = "orjson-3.7.5-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:4b0ab07bdc34fa54c56ab385926aadb89b6106498ed762297418189d97cc6c43"},
+ {file = "orjson-3.7.5-cp37-none-win_amd64.whl", hash = "sha256:0893ff1e7952267b7e54062b9ceee670da81e52825d81f04473a40a88239200c"},
+ {file = "orjson-3.7.5-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:d89e26508e50551acd97c96780be6ba7c755997985d5143c9222daab014f1bc2"},
+ {file = "orjson-3.7.5-cp38-cp38-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:314ae3957a096baf0d58f9650e062bc6d1a3e554f7c8847b19cb9846134bcf62"},
+ {file = "orjson-3.7.5-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:16acac2ba4ab747acc1a8b3c72487d0a88031925f5a8662fa2dc25d812edaf10"},
+ {file = "orjson-3.7.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:383ee9124f585cf75fc5fdfa01497fe89c0667b1d72c22bed2a07904bfb3d09d"},
+ {file = "orjson-3.7.5-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:57a0e86d59c61b8cf91d84a9959b4c7f903d1243cee9b3d269aa5ea65ce2686f"},
+ {file = "orjson-3.7.5-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:04468b96014ff953a0bc0db8ae89de4df33b5008cfa25de200deb7483eebdaa8"},
+ {file = "orjson-3.7.5-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:3202dafd6d54950b89eb872a74152f74440bc010bd5675eac39a31f940c5c545"},
+ {file = "orjson-3.7.5-cp38-none-win_amd64.whl", hash = "sha256:c63bbea3fe5087b4586cd94393ecc368c7cd62018c9470e0175a483ebbcf2a8c"},
+ {file = "orjson-3.7.5-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:828f4767fda30faeed7adaa2d004b400fb55d5ca3b8127ee053e9d62addadc16"},
+ {file = "orjson-3.7.5-cp39-cp39-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:9be3b6411a6ca75aa255f1d0139d36bdce80159738932d7311abaa82e28f2021"},
+ {file = "orjson-3.7.5-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed1ff20a4ee6571e2a4a432e2d4a6ad82cbc6a698acf860b5ceb066fdae3b859"},
+ {file = "orjson-3.7.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e19f12762ef7693999197be1860df204589f4e907d3a7c76ad6aa764ee94877"},
+ {file = "orjson-3.7.5-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:215102b89fec7e3f8a760a6667fee3ec9fd3f1e2a89293265141e884c5a239f5"},
+ {file = "orjson-3.7.5-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:a4a221479e6b52305ce572e9a8403bbc47815ef1ed1c41d378f4de8efaa7b57c"},
+ {file = "orjson-3.7.5-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:76e91ea4c73a9e28924bb2e1f8ebd29b969b7b3d4b6c30e17b408c36d5f79f46"},
+ {file = "orjson-3.7.5-cp39-none-win_amd64.whl", hash = "sha256:d444bb261b6ce58ab7c3998cbb72a0ca2b7b4d1e120e4e22b4d2c4a927960ca6"},
+ {file = "orjson-3.7.5.tar.gz", hash = "sha256:47c9d2b3f993b630b1efa58ad128b5d8a61cd7cd5c0cec8dad043a1ab9d02866"},
]
packaging = [
{file = "packaging-21.3-py3-none-any.whl", hash = "sha256:ef103e05f519cdc783ae24ea4e2e0f508a9c99b2d4969652eed6a2e1ea5bd522"},
{file = "packaging-21.3.tar.gz", hash = "sha256:dd47c42927d89ab911e606518907cc2d3a1f38bbd026385970643f9c5b8ecfeb"},
]
pandas = [
- {file = "pandas-1.4.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:be67c782c4f1b1f24c2f16a157e12c2693fd510f8df18e3287c77f33d124ed07"},
- {file = "pandas-1.4.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:5a206afa84ed20e07603f50d22b5f0db3fb556486d8c2462d8bc364831a4b417"},
- {file = "pandas-1.4.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0010771bd9223f7afe5f051eb47c4a49534345dfa144f2f5470b27189a4dd3b5"},
- {file = "pandas-1.4.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3228198333dd13c90b6434ddf61aa6d57deaca98cf7b654f4ad68a2db84f8cfe"},
- {file = "pandas-1.4.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5b79af3a69e5175c6fa7b4e046b21a646c8b74e92c6581a9d825687d92071b51"},
- {file = "pandas-1.4.2-cp310-cp310-win_amd64.whl", hash = "sha256:5586cc95692564b441f4747c47c8a9746792e87b40a4680a2feb7794defb1ce3"},
- {file = "pandas-1.4.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:061609334a8182ab500a90fe66d46f6f387de62d3a9cb9aa7e62e3146c712167"},
- {file = "pandas-1.4.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b8134651258bce418cb79c71adeff0a44090c98d955f6953168ba16cc285d9f7"},
- {file = "pandas-1.4.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:df82739e00bb6daf4bba4479a40f38c718b598a84654cbd8bb498fd6b0aa8c16"},
- {file = "pandas-1.4.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:385c52e85aaa8ea6a4c600a9b2821181a51f8be0aee3af6f2dcb41dafc4fc1d0"},
- {file = "pandas-1.4.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:295872bf1a09758aba199992c3ecde455f01caf32266d50abc1a073e828a7b9d"},
- {file = "pandas-1.4.2-cp38-cp38-win32.whl", hash = "sha256:95c1e422ced0199cf4a34385ff124b69412c4bc912011ce895582bee620dfcaa"},
- {file = "pandas-1.4.2-cp38-cp38-win_amd64.whl", hash = "sha256:5c54ea4ef3823108cd4ec7fb27ccba4c3a775e0f83e39c5e17f5094cb17748bc"},
- {file = "pandas-1.4.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c072c7f06b9242c855ed8021ff970c0e8f8b10b35e2640c657d2a541c5950f59"},
- {file = "pandas-1.4.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f549097993744ff8c41b5e8f2f0d3cbfaabe89b4ae32c8c08ead6cc535b80139"},
- {file = "pandas-1.4.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ff08a14ef21d94cdf18eef7c569d66f2e24e0bc89350bcd7d243dd804e3b5eb2"},
- {file = "pandas-1.4.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c5bf555b6b0075294b73965adaafb39cf71c312e38c5935c93d78f41c19828a"},
- {file = "pandas-1.4.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:51649ef604a945f781105a6d2ecf88db7da0f4868ac5d45c51cb66081c4d9c73"},
- {file = "pandas-1.4.2-cp39-cp39-win32.whl", hash = "sha256:d0d4f13e4be7ce89d7057a786023c461dd9370040bdb5efa0a7fe76b556867a0"},
- {file = "pandas-1.4.2-cp39-cp39-win_amd64.whl", hash = "sha256:09d8be7dd9e1c4c98224c4dfe8abd60d145d934e9fc1f5f411266308ae683e6a"},
- {file = "pandas-1.4.2.tar.gz", hash = "sha256:92bc1fc585f1463ca827b45535957815b7deb218c549b7c18402c322c7549a12"},
+ {file = "pandas-1.4.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d51674ed8e2551ef7773820ef5dab9322be0828629f2cbf8d1fc31a0c4fed640"},
+ {file = "pandas-1.4.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:16ad23db55efcc93fa878f7837267973b61ea85d244fc5ff0ccbcfa5638706c5"},
+ {file = "pandas-1.4.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:958a0588149190c22cdebbc0797e01972950c927a11a900fe6c2296f207b1d6f"},
+ {file = "pandas-1.4.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e48fbb64165cda451c06a0f9e4c7a16b534fcabd32546d531b3c240ce2844112"},
+ {file = "pandas-1.4.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6f803320c9da732cc79210d7e8cc5c8019aad512589c910c66529eb1b1818230"},
+ {file = "pandas-1.4.3-cp310-cp310-win_amd64.whl", hash = "sha256:2893e923472a5e090c2d5e8db83e8f907364ec048572084c7d10ef93546be6d1"},
+ {file = "pandas-1.4.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:24ea75f47bbd5574675dae21d51779a4948715416413b30614c1e8b480909f81"},
+ {file = "pandas-1.4.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:d5ebc990bd34f4ac3c73a2724c2dcc9ee7bf1ce6cf08e87bb25c6ad33507e318"},
+ {file = "pandas-1.4.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d6c0106415ff1a10c326c49bc5dd9ea8b9897a6ca0c8688eb9c30ddec49535ef"},
+ {file = "pandas-1.4.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:78b00429161ccb0da252229bcda8010b445c4bf924e721265bec5a6e96a92e92"},
+ {file = "pandas-1.4.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6dfbf16b1ea4f4d0ee11084d9c026340514d1d30270eaa82a9f1297b6c8ecbf0"},
+ {file = "pandas-1.4.3-cp38-cp38-win32.whl", hash = "sha256:48350592665ea3cbcd07efc8c12ff12d89be09cd47231c7925e3b8afada9d50d"},
+ {file = "pandas-1.4.3-cp38-cp38-win_amd64.whl", hash = "sha256:605d572126eb4ab2eadf5c59d5d69f0608df2bf7bcad5c5880a47a20a0699e3e"},
+ {file = "pandas-1.4.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:a3924692160e3d847e18702bb048dc38e0e13411d2b503fecb1adf0fcf950ba4"},
+ {file = "pandas-1.4.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:07238a58d7cbc8a004855ade7b75bbd22c0db4b0ffccc721556bab8a095515f6"},
+ {file = "pandas-1.4.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:755679c49460bd0d2f837ab99f0a26948e68fa0718b7e42afbabd074d945bf84"},
+ {file = "pandas-1.4.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41fc406e374590a3d492325b889a2686b31e7a7780bec83db2512988550dadbf"},
+ {file = "pandas-1.4.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1d9382f72a4f0e93909feece6fef5500e838ce1c355a581b3d8f259839f2ea76"},
+ {file = "pandas-1.4.3-cp39-cp39-win32.whl", hash = "sha256:0daf876dba6c622154b2e6741f29e87161f844e64f84801554f879d27ba63c0d"},
+ {file = "pandas-1.4.3-cp39-cp39-win_amd64.whl", hash = "sha256:721a3dd2f06ef942f83a819c0f3f6a648b2830b191a72bbe9451bcd49c3bd42e"},
+ {file = "pandas-1.4.3.tar.gz", hash = "sha256:2ff7788468e75917574f080cd4681b27e1a7bf36461fe968b49a87b5a54d007c"},
]
platformdirs = [
{file = "platformdirs-2.5.2-py3-none-any.whl", hash = "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"},
@@ -2252,37 +2263,37 @@ pyarrow = [
{file = "pyarrow-8.0.0.tar.gz", hash = "sha256:4a18a211ed888f1ac0b0ebcb99e2d9a3e913a481120ee9b1fe33d3fedb945d4e"},
]
pycares = [
- {file = "pycares-4.1.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:71b99b9e041ae3356b859822c511f286f84c8889ec9ed1fbf6ac30fb4da13e4c"},
- {file = "pycares-4.1.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c000942f5fc64e6e046aa61aa53b629b576ba11607d108909727c3c8f211a157"},
- {file = "pycares-4.1.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:b0e50ddc78252f2e2b6b5f2c73e5b2449dfb6bea7a5a0e21dfd1e2bcc9e17382"},
- {file = "pycares-4.1.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:6831e963a910b0a8cbdd2750ffcdf5f2bb0edb3f53ca69ff18484de2cc3807c4"},
- {file = "pycares-4.1.2-cp310-cp310-win32.whl", hash = "sha256:ad7b28e1b6bc68edd3d678373fa3af84e39d287090434f25055d21b4716b2fc6"},
- {file = "pycares-4.1.2-cp310-cp310-win_amd64.whl", hash = "sha256:27a6f09dbfb69bb79609724c0f90dfaa7c215876a7cd9f12d585574d1f922112"},
- {file = "pycares-4.1.2-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:e5a060f5fa90ae245aa99a4a8ad13ec39c2340400de037c7e8d27b081e1a3c64"},
- {file = "pycares-4.1.2-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:056330275dea42b7199494047a745e1d9785d39fb8c4cd469dca043532240b80"},
- {file = "pycares-4.1.2-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:0aa897543a786daba74ec5e19638bd38b2b432d179a0e248eac1e62de5756207"},
- {file = "pycares-4.1.2-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:cbceaa9b2c416aa931627466d3240aecfc905c292c842252e3d77b8630072505"},
- {file = "pycares-4.1.2-cp36-cp36m-win32.whl", hash = "sha256:112e1385c451069112d6b5ea1f9c378544f3c6b89882ff964e9a64be3336d7e4"},
- {file = "pycares-4.1.2-cp36-cp36m-win_amd64.whl", hash = "sha256:c6680f7fdc0f1163e8f6c2a11d11b9a0b524a61000d2a71f9ccd410f154fb171"},
- {file = "pycares-4.1.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:58a41a2baabcd95266db776c510d349d417919407f03510fc87ac7488730d913"},
- {file = "pycares-4.1.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a810d01c9a426ee8b0f36969c2aef5fb966712be9d7e466920beb328cd9cefa3"},
- {file = "pycares-4.1.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:b266cec81dcea2c3efbbd3dda00af8d7eb0693ae9e47e8706518334b21f27d4a"},
- {file = "pycares-4.1.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:8319afe4838e09df267c421ca93da408f770b945ec6217dda72f1f6a493e37e4"},
- {file = "pycares-4.1.2-cp37-cp37m-win32.whl", hash = "sha256:4d5da840aa0d9b15fa51107f09270c563a348cb77b14ae9653d0bbdbe326fcc2"},
- {file = "pycares-4.1.2-cp37-cp37m-win_amd64.whl", hash = "sha256:5632f21d92cc0225ba5ff906e4e5dec415ef0b3df322c461d138190681cd5d89"},
- {file = "pycares-4.1.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:8fd1ff17a26bb004f0f6bb902ba7dddd810059096ae0cc3b45e4f5be46315d19"},
- {file = "pycares-4.1.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:439799be4b7576e907139a7f9b3c8a01b90d3e38af4af9cd1fc6c1ee9a42b9e6"},
- {file = "pycares-4.1.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:40079ed58efa91747c50aac4edf8ecc7e570132ab57dc0a4030eb0d016a6cab8"},
- {file = "pycares-4.1.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:4e190471a015f8225fa38069617192e06122771cce2b169ac7a60bfdbd3d4ab2"},
- {file = "pycares-4.1.2-cp38-cp38-win32.whl", hash = "sha256:2b837315ed08c7df009b67725fe1f50489e99de9089f58ec1b243dc612f172aa"},
- {file = "pycares-4.1.2-cp38-cp38-win_amd64.whl", hash = "sha256:c7eba3c8354b730a54d23237d0b6445a2f68570fa68d0848887da23a3f3b71f3"},
- {file = "pycares-4.1.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:2f5f84fe9f83eab9cd68544b165b74ba6e3412d029cc9ab20098d9c332869fc5"},
- {file = "pycares-4.1.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569eef8597b5e02b1bc4644b9f272160304d8c9985357d7ecfcd054da97c0771"},
- {file = "pycares-4.1.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:e1489aa25d14dbf7176110ead937c01176ed5a0ebefd3b092bbd6b202241814c"},
- {file = "pycares-4.1.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:dc942692fca0e27081b7bb414bb971d34609c80df5e953f6d0c62ecc8019acd9"},
- {file = "pycares-4.1.2-cp39-cp39-win32.whl", hash = "sha256:ed71dc4290d9c3353945965604ef1f6a4de631733e9819a7ebc747220b27e641"},
- {file = "pycares-4.1.2-cp39-cp39-win_amd64.whl", hash = "sha256:ec00f3594ee775665167b1a1630edceefb1b1283af9ac57480dba2fb6fd6c360"},
- {file = "pycares-4.1.2.tar.gz", hash = "sha256:03490be0e7b51a0c8073f877bec347eff31003f64f57d9518d419d9369452837"},
+ {file = "pycares-4.2.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d83f193563b42360528167705b1c7bb91e2a09f990b98e3d6378835b72cd5c96"},
+ {file = "pycares-4.2.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b03f69df69f0ab3bfb8dbe54444afddff6ff9389561a08aade96b4f91207a655"},
+ {file = "pycares-4.2.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:3b78bdee2f2f1351d5fccc2d1b667aea2d15a55d74d52cb9fd5bea8b5e74c4dc"},
+ {file = "pycares-4.2.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f05223de13467bb26f9a1594a1799ce2d08ad8ea241489fecd9d8ed3bbbfc672"},
+ {file = "pycares-4.2.1-cp310-cp310-win32.whl", hash = "sha256:1f37f762414680063b4dfec5be809a84f74cd8e203d939aaf3ba9c807a9e7013"},
+ {file = "pycares-4.2.1-cp310-cp310-win_amd64.whl", hash = "sha256:1a9506d496efeb809a1b63647cb2f3f33c67fcf62bf80a2359af692fef2c1755"},
+ {file = "pycares-4.2.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:2fd53eb5b441c4f6f9c78d7900e05883e9998b34a14b804be4fc4c6f9fea89f3"},
+ {file = "pycares-4.2.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:061dd4c80fec73feb150455b159704cd51a122f20d36790033bd6375d4198579"},
+ {file = "pycares-4.2.1-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:a521d7f54f3e52ded4d34c306ba05cfe9eb5aaa2e5aaf83c96564b9369495588"},
+ {file = "pycares-4.2.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:99e00e397d07a79c9f43e4303e67f4f97bcabd013bda0d8f2d430509b7aef8a0"},
+ {file = "pycares-4.2.1-cp36-cp36m-win32.whl", hash = "sha256:d9cd826d8e0c270059450709bff994bfeb072f79d82fd3f11c701690ff65d0e7"},
+ {file = "pycares-4.2.1-cp36-cp36m-win_amd64.whl", hash = "sha256:f8e6942965465ca98e212376c4afb9aec501d8129054929744b2f4a487c8c14b"},
+ {file = "pycares-4.2.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:e75cbd4d3b3d9b02bba6e170846e39893a825e7a5fb1b96728fc6d7b964f8945"},
+ {file = "pycares-4.2.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d2e8ec4c8e07c986b70a3cc8f5b297c53b08ac755e5b9797512002a466e2de86"},
+ {file = "pycares-4.2.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:5333b51ef4ff3e8973b4a1b57cad5ada13e15552445ee3cd74bd77407dec9d44"},
+ {file = "pycares-4.2.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2113529004df4894783eaa61e9abc3a680756b6f033d942f2800301ae8c71c29"},
+ {file = "pycares-4.2.1-cp37-cp37m-win32.whl", hash = "sha256:e7a95763cdc20cf9ec357066e656ea30b8de6b03de6175cbb50890e22aa01868"},
+ {file = "pycares-4.2.1-cp37-cp37m-win_amd64.whl", hash = "sha256:7a901776163a04de5d67c42bd63a287cff9cb05fc041668ad1681fe3daa36445"},
+ {file = "pycares-4.2.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:66b5390a4885a578e687d3f2683689c35e1d4573f4d0ecf217431f7bb55c49a0"},
+ {file = "pycares-4.2.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15dd5cf21bc73ad539e8aabf7afe370d1df8af7bc6944cd7298f3bfef0c1a27c"},
+ {file = "pycares-4.2.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:4ee625d7571039038bca51ae049b047cbfcfc024b302aae6cc53d5d9aa8648a8"},
+ {file = "pycares-4.2.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:396ee487178e9de06ca4122a35a157474db3ce0a0db6038a31c831ebb9863315"},
+ {file = "pycares-4.2.1-cp38-cp38-win32.whl", hash = "sha256:e4dc37f732f7110ca6368e0128cbbd0a54f5211515a061b2add64da2ddb8e5ca"},
+ {file = "pycares-4.2.1-cp38-cp38-win_amd64.whl", hash = "sha256:3636fccf643c5192c34ee0183c514a2d09419e3a76ca2717cef626638027cb21"},
+ {file = "pycares-4.2.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6724573e830ea2345f4bcf0f968af64cc6d491dc2133e9c617f603445dcdfa58"},
+ {file = "pycares-4.2.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e9dbfcacbde6c21380c412c13d53ea44b257dea3f7b9d80be2c873bb20e21fee"},
+ {file = "pycares-4.2.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:c8a46839da642b281ac5f56d3c6336528e128b3c41eab9c5330d250f22325e9d"},
+ {file = "pycares-4.2.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:9b05c2cec644a6c66b55bcf6c24d4dfdaf2f7205b16e5c4ceee31db104fac958"},
+ {file = "pycares-4.2.1-cp39-cp39-win32.whl", hash = "sha256:8bd6ed3ad3a5358a635c1acf5d0f46be9afb095772b84427ff22283d2f31db1b"},
+ {file = "pycares-4.2.1-cp39-cp39-win_amd64.whl", hash = "sha256:fbd53728d798d07811898e11991e22209229c090eab265a53d12270b95d70d1a"},
+ {file = "pycares-4.2.1.tar.gz", hash = "sha256:735b4f75fd0f595c4e9184da18cd87737f46bc81a64ea41f4edce2b6b68d46d2"},
]
pycparser = [
{file = "pycparser-2.21-py2.py3-none-any.whl", hash = "sha256:8ee45429555515e1f6b185e78100aea234072576aa43ab53aefcae078162fca9"},
@@ -2354,8 +2365,8 @@ pytest-forked = [
{file = "pytest_forked-1.4.0-py3-none-any.whl", hash = "sha256:bbbb6717efc886b9d64537b41fb1497cfaf3c9601276be8da2cccfea5a3c8ad8"},
]
pytest-mock = [
- {file = "pytest-mock-3.7.0.tar.gz", hash = "sha256:5112bd92cc9f186ee96e1a92efc84969ea494939c3aead39c50f421c4cc69534"},
- {file = "pytest_mock-3.7.0-py3-none-any.whl", hash = "sha256:6cff27cec936bf81dc5ee87f07132b807bcda51106b5ec4b90a04331cba76231"},
+ {file = "pytest-mock-3.8.1.tar.gz", hash = "sha256:2c6d756d5d3bf98e2e80797a959ca7f81f479e7d1f5f571611b0fdd6d1745240"},
+ {file = "pytest_mock-3.8.1-py3-none-any.whl", hash = "sha256:d989f11ca4a84479e288b0cd1e6769d6ad0d3d7743dcc75e460d1416a5f2135a"},
]
pytest-xdist = [
{file = "pytest-xdist-2.5.0.tar.gz", hash = "sha256:4580deca3ff04ddb2ac53eba39d76cb5dd5edeac050cb6fbc768b0dd712b4edf"},
@@ -2409,12 +2420,12 @@ pyyaml = [
{file = "PyYAML-6.0.tar.gz", hash = "sha256:68fb519c14306fec9720a2a5b45bc9f0c8d1b9c72adf45c37baedfcd949c35a2"},
]
redis = [
- {file = "redis-4.3.3-py3-none-any.whl", hash = "sha256:f57f8df5d238a8ecf92f499b6b21467bfee6c13d89953c27edf1e2bc673622e7"},
- {file = "redis-4.3.3.tar.gz", hash = "sha256:2f7a57cf4af15cd543c4394bcbe2b9148db2606a37edba755368836e3a1d053e"},
+ {file = "redis-4.3.4-py3-none-any.whl", hash = "sha256:a52d5694c9eb4292770084fa8c863f79367ca19884b329ab574d5cb2036b3e54"},
+ {file = "redis-4.3.4.tar.gz", hash = "sha256:ddf27071df4adf3821c4f2ca59d67525c3a82e5f268bed97b813cb4fabf87880"},
]
requests = [
- {file = "requests-2.27.1-py2.py3-none-any.whl", hash = "sha256:f22fa1e554c9ddfd16e6e41ac79759e17be9e492b3587efa038054674760e72d"},
- {file = "requests-2.27.1.tar.gz", hash = "sha256:68d7c56fd5a8999887728ef304a6d12edc7be74f1cfa47714fc8b414525c9a61"},
+ {file = "requests-2.28.1-py3-none-any.whl", hash = "sha256:8fefa2a1a1365bf5520aac41836fbee479da67864514bdb821f31ce07ce65349"},
+ {file = "requests-2.28.1.tar.gz", hash = "sha256:7c5599b102feddaa661c826c56ab4fee28bfd17f5abca1ebbe3e7f19d7c97983"},
]
scipy = [
{file = "scipy-1.8.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:65b77f20202599c51eb2771d11a6b899b97989159b7975e9b5259594f1d35ef4"},
@@ -2502,8 +2513,9 @@ sphinxcontrib-serializinghtml = [
{file = "sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl", hash = "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd"},
]
tabulate = [
- {file = "tabulate-0.8.9-py3-none-any.whl", hash = "sha256:d7c013fe7abbc5e491394e10fa845f8f32fe54f8dc60c6622c6cf482d25d47e4"},
- {file = "tabulate-0.8.9.tar.gz", hash = "sha256:eb1d13f25760052e8931f2ef80aaf6045a6cceb47514db8beab24cded16f13a7"},
+ {file = "tabulate-0.8.10-py3-none-any.whl", hash = "sha256:0ba055423dbaa164b9e456abe7920c5e8ed33fcc16f6d1b2f2d152c8e1e8b4fc"},
+ {file = "tabulate-0.8.10-py3.8.egg", hash = "sha256:436f1c768b424654fce8597290d2764def1eea6a77cfa5c33be00b1bc0f4f63d"},
+ {file = "tabulate-0.8.10.tar.gz", hash = "sha256:6c57f3f3dd7ac2782770155f3adb2db0b1a269637e42f27599925e64b114f519"},
]
text-unidecode = [
{file = "text-unidecode-1.3.tar.gz", hash = "sha256:bad6603bb14d279193107714b288be206cac565dfa49aa5b105294dd5c4aab93"},
@@ -2556,8 +2568,8 @@ uvloop = [
{file = "uvloop-0.16.0.tar.gz", hash = "sha256:f74bc20c7b67d1c27c72601c78cf95be99d5c2cdd4514502b4f3eb0933ff1228"},
]
virtualenv = [
- {file = "virtualenv-20.14.1-py2.py3-none-any.whl", hash = "sha256:e617f16e25b42eb4f6e74096b9c9e37713cf10bf30168fb4a739f3fa8f898a3a"},
- {file = "virtualenv-20.14.1.tar.gz", hash = "sha256:ef589a79795589aada0c1c5b319486797c03b67ac3984c48c669c0e4f50df3a5"},
+ {file = "virtualenv-20.15.1-py2.py3-none-any.whl", hash = "sha256:b30aefac647e86af6d82bfc944c556f8f1a9c90427b2fb4e3bfbf338cb82becf"},
+ {file = "virtualenv-20.15.1.tar.gz", hash = "sha256:288171134a2ff3bfb1a2f54f119e77cd1b81c29fc1265a2356f3e8d14c7d58c4"},
]
wrapt = [
{file = "wrapt-1.14.1-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:1b376b3f4896e7930f1f772ac4b064ac12598d1c38d04907e696cc4d794b43d3"},
diff --git a/pyproject.toml b/pyproject.toml
index c80a299186c5..93f0400fae0f 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "nautilus_trader"
-version = "1.147.1"
+version = "1.148.0"
description = "A high-performance algorithmic trading platform and event-driven backtester"
authors = ["Nautech Systems "]
license = "LGPL-3.0-or-later"
@@ -35,7 +35,7 @@ include = [
requires = [
"setuptools",
"poetry-core>=1.0.8",
- "numpy>=1.22.4",
+ "numpy>=1.23.0",
"Cython==3.0.0a9", # Pinned at 3.0.0a9 due coverage
]
build-backend = "poetry.core.masonry.api"
@@ -53,22 +53,22 @@ click = "^8.1.3"
cloudpickle = "^2.1.0"
frozendict = "^2.3.2"
fsspec = "^2022.5.0"
-msgspec = "^0.6.0"
-numpy = "^1.22.4"
-orjson = "^3.7.1"
-pandas = "^1.4.2"
+msgspec = "^0.7.1"
+numpy = "^1.23.0"
+orjson = "^3.7.5"
+pandas = "^1.4.3"
psutil = "^5.9.1"
pyarrow = "^8.0.0"
pydantic = "^1.9.1"
pytz = "^2022.1"
-tabulate = "^0.8.9"
+tabulate = "^0.8.10"
toml = "^0.10.2"
tqdm = "^4.64.0"
uvloop = { version = "^0.16.0", markers = "sys_platform != 'win32'" }
hiredis = { version = "^2.0.0", optional = true }
hyperopt = { version = "^0.2.7", optional = true }
ib_insync = { version = "^0.9.70", optional = true }
-redis = { version = "^4.3.3", optional = true }
+redis = { version = "^4.3.4", optional = true }
# Removed due to 3.10 windows build issue - https://github.com/docker/docker-py/issues/2902
# docker = {version = "^5.0.3", optional = true }
@@ -77,16 +77,16 @@ redis = { version = "^4.3.3", optional = true }
# https://github.com/cython/cython/issues/3515
coverage = "4.5.4" # Pinned at 4.5.4 due Cython
nox = "^2022.1.7"
-numpydoc = "^1.3.1"
+numpydoc = "^1.4.0"
pre-commit = "^2.19.0"
pytest = "^7.1.2"
pytest-asyncio = "^0.18.3"
pytest-benchmark = "^3.4.1"
pytest-cov = "2.10.1" # Pinned at 2.10.1 due coverage 4.5.4
-pytest-mock = "^3.7.0"
+pytest-mock = "^3.8.1"
pytest-xdist = { version = "^2.5.0", extras = ["psutil"] }
linkify-it-py = "^2.0.0"
-myst-parser = "^0.17.2"
+myst-parser = "^0.18.0"
sphinx_comments = "^0.0.3"
sphinx_copybutton = "^0.5.0"
sphinx-external-toc = "^0.3.0"
diff --git a/tests/acceptance_tests/test_backtest_acceptance.py b/tests/acceptance_tests/test_backtest_acceptance.py
index d7594c3e14f5..dd75df25bd96 100644
--- a/tests/acceptance_tests/test_backtest_acceptance.py
+++ b/tests/acceptance_tests/test_backtest_acceptance.py
@@ -107,7 +107,7 @@ def test_run_ema_cross_strategy(self):
# Assert - Should return expected PnL
assert strategy.fast_ema.count == 2689
assert self.engine.iteration == 115044
- assert self.engine.portfolio.account(self.venue).balance_total(USD) == Money(992811.19, USD)
+ assert self.engine.portfolio.account(self.venue).balance_total(USD) == Money(993238.25, USD)
def test_rerun_ema_cross_strategy_returns_identical_performance(self):
# Arrange
@@ -167,7 +167,7 @@ def test_run_multiple_strategies(self):
assert strategy1.fast_ema.count == 2689
assert strategy2.fast_ema.count == 2689
assert self.engine.iteration == 115044
- assert self.engine.portfolio.account(self.venue).balance_total(USD) == Money(985622.38, USD)
+ assert self.engine.portfolio.account(self.venue).balance_total(USD) == Money(986476.50, USD)
class TestBacktestAcceptanceTestsGBPUSDBarsInternal:
@@ -227,7 +227,7 @@ def test_run_ema_cross_with_minute_bar_spec(self):
# Assert
assert strategy.fast_ema.count == 8353
assert self.engine.iteration == 120468
- assert self.engine.portfolio.account(self.venue).balance_total(GBP) == Money(931346.76, GBP)
+ assert self.engine.portfolio.account(self.venue).balance_total(GBP) == Money(949923.86, GBP)
class TestBacktestAcceptanceTestsGBPUSDBarsExternal:
@@ -370,8 +370,8 @@ def test_run_ema_cross_with_minute_trade_bars(self):
assert self.engine.iteration == 10000
btc_ending_balance = self.engine.portfolio.account(self.venue).balance_total(BTC)
usdt_ending_balance = self.engine.portfolio.account(self.venue).balance_total(USDT)
- assert btc_ending_balance == Money(9.57500000, BTC)
- assert usdt_ending_balance == Money(10016974.96985900, USDT)
+ assert btc_ending_balance == Money(9.57200000, BTC)
+ assert usdt_ending_balance == Money(10017117.03405900, USDT)
def test_run_ema_cross_with_trade_ticks_from_bar_data(self):
# Arrange
@@ -405,8 +405,8 @@ def test_run_ema_cross_with_trade_ticks_from_bar_data(self):
assert self.engine.iteration == 40000
btc_ending_balance = self.engine.portfolio.account(self.venue).balance_total(BTC)
usdt_ending_balance = self.engine.portfolio.account(self.venue).balance_total(USDT)
- assert btc_ending_balance == Money(9.57500000, BTC)
- assert usdt_ending_balance == Money(10016974.96985900, USDT)
+ assert btc_ending_balance == Money(9.57200000, BTC)
+ assert usdt_ending_balance == Money(10017117.03405900, USDT)
class TestBacktestAcceptanceTestsAUDUSD:
diff --git a/tests/integration_tests/adapters/betfair/test_betfair_persistence.py b/tests/integration_tests/adapters/betfair/test_betfair_persistence.py
index 94cf04af39f6..ffc51cba9c64 100644
--- a/tests/integration_tests/adapters/betfair/test_betfair_persistence.py
+++ b/tests/integration_tests/adapters/betfair/test_betfair_persistence.py
@@ -17,7 +17,7 @@
import pytest
from nautilus_trader.adapters.betfair.data_types import BSPOrderBookDelta
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.persistence.external.core import RawFile
from nautilus_trader.persistence.external.core import process_raw_file
from tests.integration_tests.adapters.betfair.test_kit import BetfairTestStubs
@@ -28,7 +28,7 @@
class TestBetfairPersistence:
def setup(self):
data_catalog_setup()
- self.catalog = DataCatalog.from_env()
+ self.catalog = ParquetDataCatalog.from_env()
self.fs = self.catalog.fs
self.reader = BetfairTestStubs.betfair_reader()
diff --git a/tests/integration_tests/adapters/binance/test_core_types.py b/tests/integration_tests/adapters/binance/test_core_types.py
index be24d881fdde..884da7e05818 100644
--- a/tests/integration_tests/adapters/binance/test_core_types.py
+++ b/tests/integration_tests/adapters/binance/test_core_types.py
@@ -50,14 +50,14 @@ def test_binance_ticker_repr(self):
first_id=28385,
last_id=28460,
count=76,
- ts_event=1500000000000,
- ts_init=1500000000000,
+ ts_event=1650000000000000000,
+ ts_init=1650000000000000000,
)
# Act, Assert
assert (
repr(ticker)
- == "BinanceTicker(instrument_id=BTCUSDT.BINANCE, price_change=-94.99999800, price_change_percent=-95.960, weighted_avg_price=0.29628482, prev_close_price=0.10002000, last_price=4.00000200, last_qty=200.00000000, bid_price=4.00000000, bid_qty=24.00000000, ask_price=4.00000200, ask_qty=24.00000200, open_price=99.00000000, high_price=100.00000000, low_price=0.10000000, volume=8913.30000000, quote_volume=15.30000000, open_time_ms=1499783499040, close_time_ms=1499869899040, first_id=28385, last_id=28460, count=76, ts_event=1500000000000, ts_init=1500000000000)" # noqa
+ == "BinanceTicker(instrument_id=BTCUSDT.BINANCE, price_change=-94.99999800, price_change_percent=-95.960, weighted_avg_price=0.29628482, prev_close_price=0.10002000, last_price=4.00000200, last_qty=200.00000000, bid_price=4.00000000, bid_qty=24.00000000, ask_price=4.00000200, ask_qty=24.00000200, open_price=99.00000000, high_price=100.00000000, low_price=0.10000000, volume=8913.30000000, quote_volume=15.30000000, open_time_ms=1499783499040, close_time_ms=1499869899040, first_id=28385, last_id=28460, count=76, ts_event=1650000000000000000, ts_init=1650000000000000000)" # noqa
)
def test_binance_ticker_pickle(self):
@@ -84,8 +84,8 @@ def test_binance_ticker_pickle(self):
first_id=28385,
last_id=28460,
count=76,
- ts_event=1500000000000,
- ts_init=1500000000000,
+ ts_event=1650000000000000000,
+ ts_init=1650000000000000000,
)
# Act
@@ -96,7 +96,7 @@ def test_binance_ticker_pickle(self):
assert unpickled == ticker
assert (
repr(unpickled)
- == "BinanceTicker(instrument_id=BTCUSDT.BINANCE, price_change=-94.99999800, price_change_percent=-95.960, weighted_avg_price=0.29628482, prev_close_price=0.10002000, last_price=4.00000200, last_qty=200.00000000, bid_price=4.00000000, bid_qty=24.00000000, ask_price=4.00000200, ask_qty=24.00000200, open_price=99.00000000, high_price=100.00000000, low_price=0.10000000, volume=8913.30000000, quote_volume=15.30000000, open_time_ms=1499783499040, close_time_ms=1499869899040, first_id=28385, last_id=28460, count=76, ts_event=1500000000000, ts_init=1500000000000)" # noqa
+ == "BinanceTicker(instrument_id=BTCUSDT.BINANCE, price_change=-94.99999800, price_change_percent=-95.960, weighted_avg_price=0.29628482, prev_close_price=0.10002000, last_price=4.00000200, last_qty=200.00000000, bid_price=4.00000000, bid_qty=24.00000000, ask_price=4.00000200, ask_qty=24.00000200, open_price=99.00000000, high_price=100.00000000, low_price=0.10000000, volume=8913.30000000, quote_volume=15.30000000, open_time_ms=1499783499040, close_time_ms=1499869899040, first_id=28385, last_id=28460, count=76, ts_event=1650000000000000000, ts_init=1650000000000000000)" # noqa
)
def test_binance_ticker_to_from_dict(self):
@@ -123,8 +123,8 @@ def test_binance_ticker_to_from_dict(self):
first_id=28385,
last_id=28460,
count=76,
- ts_event=1500000000000,
- ts_init=1500000000000,
+ ts_event=1650000000000000000,
+ ts_init=1650000000000000000,
)
# Act
@@ -155,8 +155,8 @@ def test_binance_ticker_to_from_dict(self):
"first_id": 28385,
"last_id": 28460,
"count": 76,
- "ts_event": 1500000000000,
- "ts_init": 1500000000000,
+ "ts_event": 1650000000000000000,
+ "ts_init": 1650000000000000000,
}
def test_binance_bar_repr(self):
@@ -167,7 +167,7 @@ def test_binance_bar_repr(self):
bar_spec=TestDataStubs.bar_spec_1min_last(),
),
open=Price.from_str("0.01634790"),
- high=Price.from_str("0.80000000"),
+ high=Price.from_str("0.01640000"),
low=Price.from_str("0.01575800"),
close=Price.from_str("0.01577100"),
volume=Quantity.from_str("148976.11427815"),
@@ -175,14 +175,14 @@ def test_binance_bar_repr(self):
count=100,
taker_buy_base_volume=Quantity.from_str("1756.87402397"),
taker_buy_quote_volume=Quantity.from_str("28.46694368"),
- ts_event=1500000000000,
- ts_init=1500000000000,
+ ts_event=1650000000000000000,
+ ts_init=1650000000000000000,
)
# Act, Assert
assert (
repr(bar)
- == "BinanceBar(bar_type=BTCUSDT.BINANCE-1-MINUTE-LAST-EXTERNAL, open=0.01634790, high=0.80000000, low=0.01575800, close=0.01577100, volume=148976.11427815, quote_volume=2434.19055334, count=100, taker_buy_base_volume=1756.87402397, taker_buy_quote_volume=28.46694368, taker_sell_base_volume=147219.24025418, taker_sell_quote_volume=2405.72360966, ts_event=1500000000000,ts_init=1500000000000)" # noqa
+ == "BinanceBar(bar_type=BTCUSDT.BINANCE-1-MINUTE-LAST-EXTERNAL, open=0.01634790, high=0.01640000, low=0.01575800, close=0.01577100, volume=148976.11427815, quote_volume=2434.19055334, count=100, taker_buy_base_volume=1756.87402397, taker_buy_quote_volume=28.46694368, taker_sell_base_volume=147219.24025418, taker_sell_quote_volume=2405.72360966, ts_event=1650000000000000000,ts_init=1650000000000000000)" # noqa
)
def test_binance_bar_to_from_dict(self):
@@ -193,7 +193,7 @@ def test_binance_bar_to_from_dict(self):
bar_spec=TestDataStubs.bar_spec_1min_last(),
),
open=Price.from_str("0.01634790"),
- high=Price.from_str("0.80000000"),
+ high=Price.from_str("0.01640000"),
low=Price.from_str("0.01575800"),
close=Price.from_str("0.01577100"),
volume=Quantity.from_str("148976.11427815"),
@@ -201,8 +201,8 @@ def test_binance_bar_to_from_dict(self):
count=100,
taker_buy_base_volume=Quantity.from_str("1756.87402397"),
taker_buy_quote_volume=Quantity.from_str("28.46694368"),
- ts_event=1500000000000,
- ts_init=1500000000000,
+ ts_event=1650000000000000000,
+ ts_init=1650000000000000000,
)
# Act
@@ -214,7 +214,7 @@ def test_binance_bar_to_from_dict(self):
"type": "BinanceBar",
"bar_type": "BTCUSDT.BINANCE-1-MINUTE-LAST-EXTERNAL",
"open": "0.01634790",
- "high": "0.80000000",
+ "high": "0.01640000",
"low": "0.01575800",
"close": "0.01577100",
"volume": "148976.11427815",
@@ -222,8 +222,8 @@ def test_binance_bar_to_from_dict(self):
"count": 100,
"taker_buy_base_volume": "1756.87402397",
"taker_buy_quote_volume": "28.46694368",
- "ts_event": 1500000000000,
- "ts_init": 1500000000000,
+ "ts_event": 1650000000000000000,
+ "ts_init": 1650000000000000000,
}
def test_binance_bar_pickling(self):
@@ -234,7 +234,7 @@ def test_binance_bar_pickling(self):
bar_spec=TestDataStubs.bar_spec_1min_last(),
),
open=Price.from_str("0.01634790"),
- high=Price.from_str("0.80000000"),
+ high=Price.from_str("0.01640000"),
low=Price.from_str("0.01575800"),
close=Price.from_str("0.01577100"),
volume=Quantity.from_str("148976.11427815"),
@@ -242,8 +242,8 @@ def test_binance_bar_pickling(self):
count=100,
taker_buy_base_volume=Quantity.from_str("1756.87402397"),
taker_buy_quote_volume=Quantity.from_str("28.46694368"),
- ts_event=1500000000000,
- ts_init=1500000000000,
+ ts_event=1650000000000000000,
+ ts_init=1650000000000000000,
)
# Act
@@ -254,5 +254,9 @@ def test_binance_bar_pickling(self):
assert unpickled == bar
assert (
repr(bar)
- == "BinanceBar(bar_type=BTCUSDT.BINANCE-1-MINUTE-LAST-EXTERNAL, open=0.01634790, high=0.80000000, low=0.01575800, close=0.01577100, volume=148976.11427815, quote_volume=2434.19055334, count=100, taker_buy_base_volume=1756.87402397, taker_buy_quote_volume=28.46694368, taker_sell_base_volume=147219.24025418, taker_sell_quote_volume=2405.72360966, ts_event=1500000000000,ts_init=1500000000000)" # noqa
+ == "BinanceBar(bar_type=BTCUSDT.BINANCE-1-MINUTE-LAST-EXTERNAL, open=0.01634790, high=0.01640000, low=0.01575800, close=0.01577100, volume=148976.11427815, quote_volume=2434.19055334, count=100, taker_buy_base_volume=1756.87402397, taker_buy_quote_volume=28.46694368, taker_sell_base_volume=147219.24025418, taker_sell_quote_volume=2405.72360966, ts_event=1650000000000000000,ts_init=1650000000000000000)" # noqa
)
+ assert unpickled.quote_volume == bar.quote_volume
+ assert unpickled.count == bar.count
+ assert unpickled.taker_buy_base_volume == bar.taker_buy_base_volume
+ assert unpickled.taker_buy_quote_volume == bar.taker_buy_quote_volume
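Aside (illustrative, not part of the patch): the stub timestamps above move from millisecond-scale placeholders to realistic UNIX-epoch nanoseconds, the resolution `ts_event` and `ts_init` carry throughout the platform. A minimal sketch of the conversion in plain Python:

```python
# Epoch seconds to the nanosecond values used in the updated stubs (~April 2022).
seconds = 1_650_000_000
ts_nanos = seconds * 1_000_000_000
assert ts_nanos == 1_650_000_000_000_000_000
```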
diff --git a/tests/integration_tests/adapters/interactive_brokers/test_execution.py b/tests/integration_tests/adapters/interactive_brokers/test_execution.py
index 1c08b8bccd2d..f1cf6572ba20 100644
--- a/tests/integration_tests/adapters/interactive_brokers/test_execution.py
+++ b/tests/integration_tests/adapters/interactive_brokers/test_execution.py
@@ -41,7 +41,9 @@ def setup(self):
def instrument_setup(self, instrument=None, contract_details=None):
instrument = instrument or self.instrument
contract_details = contract_details or self.contract_details
- self.exec_client._instrument_provider.contract_details[instrument.id] = contract_details
+ self.exec_client._instrument_provider.contract_details[
+ instrument.id.value
+ ] = contract_details
self.exec_client._instrument_provider.contract_id_to_instrument_id[
contract_details.contract.conId
] = instrument.id
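Aside (illustrative, not part of the patch): keying `contract_details` by `instrument.id.value` stores entries under the instrument ID's string form rather than the identifier object itself. A small sketch of that string form, assuming the identifiers used elsewhere in these tests:

```python
from nautilus_trader.model.identifiers import InstrumentId, Symbol, Venue

instrument_id = InstrumentId(Symbol("AUD/USD"), Venue("SIM"))
assert instrument_id.value == "AUD/USD.SIM"  # the string key now used for contract_details
```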
diff --git a/tests/integration_tests/adapters/interactive_brokers/test_historic.py b/tests/integration_tests/adapters/interactive_brokers/test_historic.py
index e104060c1a25..b559f96bb869 100644
--- a/tests/integration_tests/adapters/interactive_brokers/test_historic.py
+++ b/tests/integration_tests/adapters/interactive_brokers/test_historic.py
@@ -30,7 +30,7 @@
from nautilus_trader.model.data.bar import BarSpecification
from nautilus_trader.model.data.tick import QuoteTick
from nautilus_trader.model.data.tick import TradeTick
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from tests.integration_tests.adapters.interactive_brokers.test_kit import IBTestStubs
from tests.test_kit.mocks.data import data_catalog_setup
@@ -38,7 +38,7 @@
class TestInteractiveBrokersHistoric:
def setup(self):
data_catalog_setup()
- self.catalog = DataCatalog.from_env()
+ self.catalog = ParquetDataCatalog.from_env()
self.ib = mock.Mock()
@pytest.mark.skipif(sys.platform == "win32", reason="test path broken on Windows")
diff --git a/tests/integration_tests/orderbook/test_orderbook.py b/tests/integration_tests/orderbook/test_orderbook.py
index 3e8dfd5143cf..1d24e04127b6 100644
--- a/tests/integration_tests/orderbook/test_orderbook.py
+++ b/tests/integration_tests/orderbook/test_orderbook.py
@@ -29,7 +29,7 @@ def test_l3_feed():
size_precision=0,
)
# Updates that cause the book to fail integrity checks will be deleted
- # immediately, but we may get also delete later.
+ # immediately, but we may also receive deletes for them later.
skip_deletes = []
i = 0
for i, m in enumerate(TestDataStubs.l3_feed()): # noqa (B007)
diff --git a/tests/test_kit/mocks/data.py b/tests/test_kit/mocks/data.py
index 7a3678f29710..7c7a0ebb422b 100644
--- a/tests/test_kit/mocks/data.py
+++ b/tests/test_kit/mocks/data.py
@@ -30,7 +30,7 @@
from nautilus_trader.model.objects import Price
from nautilus_trader.model.objects import Quantity
from nautilus_trader.persistence.base import clear_singleton_instances
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.persistence.external.core import process_files
from nautilus_trader.persistence.external.readers import CSVReader
from nautilus_trader.persistence.external.readers import Reader
@@ -50,15 +50,15 @@ class NewsEventData(NewsEvent):
def data_catalog_setup():
"""
- Reset the filesystem and DataCatalog to a clean state
+ Reset the filesystem and ParquetDataCatalog to a clean state
"""
- clear_singleton_instances(DataCatalog)
+ clear_singleton_instances(ParquetDataCatalog)
fs = fsspec.filesystem("memory")
path = "/.nautilus/"
if not fs.exists(path):
fs.mkdir(path)
os.environ["NAUTILUS_PATH"] = f"memory://{path}"
- catalog = DataCatalog.from_env()
+ catalog = ParquetDataCatalog.from_env()
assert isinstance(catalog.fs, MemoryFileSystem)
try:
catalog.fs.rm("/", recursive=True)
@@ -95,7 +95,7 @@ def parse_csv_tick(df, instrument_id):
clock = TestClock()
logger = Logger(clock)
- catalog = DataCatalog.from_env()
+ catalog = ParquetDataCatalog.from_env()
instrument_provider = InstrumentProvider(
venue=venue,
logger=logger,
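Aside (illustrative, not part of the patch): the rename running through these tests swaps the old `DataCatalog` import for `ParquetDataCatalog` from `nautilus_trader.persistence.catalog.parquet`, built either explicitly or from the `NAUTILUS_PATH` environment variable. A minimal sketch of both construction paths used above:

```python
import os
import tempfile

from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog

# Explicit construction, as in test_persistence_files_cleaned_up.
catalog = ParquetDataCatalog(path=tempfile.mkdtemp(), fs_protocol="file")

# Or resolve from the environment, mirroring data_catalog_setup().
os.environ["NAUTILUS_PATH"] = "memory:///.nautilus/"
env_catalog = ParquetDataCatalog.from_env()
```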
diff --git a/tests/test_kit/stubs/component.py b/tests/test_kit/stubs/component.py
index 98f2fef3aed4..24a1355f6217 100644
--- a/tests/test_kit/stubs/component.py
+++ b/tests/test_kit/stubs/component.py
@@ -35,7 +35,7 @@
from nautilus_trader.model.instruments.base import Instrument
from nautilus_trader.model.objects import Money
from nautilus_trader.msgbus.bus import MessageBus
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.portfolio.portfolio import Portfolio
from nautilus_trader.trading.strategy import Strategy
from tests.test_kit.mocks.engines import MockLiveDataEngine
@@ -136,7 +136,7 @@ def order_factory():
@staticmethod
def backtest_node(
- catalog: DataCatalog,
+ catalog: ParquetDataCatalog,
engine_config: BacktestEngineConfig,
) -> BacktestNode:
run_config = TestConfigStubs.backtest_run_config(catalog=catalog, config=engine_config)
diff --git a/tests/test_kit/stubs/config.py b/tests/test_kit/stubs/config.py
index c9e88e7d9dd1..24be497357a5 100644
--- a/tests/test_kit/stubs/config.py
+++ b/tests/test_kit/stubs/config.py
@@ -27,7 +27,7 @@
from nautilus_trader.core.data import Data
from nautilus_trader.model.data.tick import QuoteTick
from nautilus_trader.model.identifiers import Venue
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
AAPL_US = TestInstrumentProvider.aapl_equity()
@@ -36,7 +36,7 @@
class TestConfigStubs:
@staticmethod
def streaming_config(
- catalog: DataCatalog,
+ catalog: ParquetDataCatalog,
kind: str = "backtest",
) -> StreamingConfig:
return StreamingConfig(
@@ -74,7 +74,7 @@ def backtest_engine_config(
bypass_risk=False,
allow_cash_position=True,
persist=False,
- catalog: Optional[DataCatalog] = None,
+ catalog: Optional[ParquetDataCatalog] = None,
strategies: List[ImportableStrategyConfig] = None,
) -> BacktestEngineConfig:
if persist:
@@ -90,7 +90,7 @@ def backtest_engine_config(
@staticmethod
def backtest_run_config(
- catalog: DataCatalog,
+ catalog: ParquetDataCatalog,
config: Optional[BacktestEngineConfig] = None,
instrument_ids: Optional[List[str]] = None,
data_types: Tuple[Data] = (QuoteTick,),
diff --git a/tests/unit_tests/accounting/test_accounting_calculators.py b/tests/unit_tests/accounting/test_accounting_calculators.py
index eb2d5ca58631..2a4ff98c2303 100644
--- a/tests/unit_tests/accounting/test_accounting_calculators.py
+++ b/tests/unit_tests/accounting/test_accounting_calculators.py
@@ -38,17 +38,6 @@
class TestExchangeRateCalculator:
- # TODO!
- # def test_get_rate_when_price_type_last_raises_value_error(self):
- # # Arrange
- # converter = ExchangeRateCalculator()
- # bid_rates = {"AUD/USD": 0.80000}
- # ask_rates = {"AUD/USD": 0.80010}
- #
- # # Act, Assert
- # with pytest.raises(ValueError):
- # converter.get_rate(USD, JPY, PriceType.LAST, bid_rates, ask_rates)
-
def test_get_rate_when_from_currency_equals_to_currency_returns_one(self):
# Arrange
converter = ExchangeRateCalculator()
diff --git a/tests/unit_tests/backtest/test_backtest_engine.py b/tests/unit_tests/backtest/test_backtest_engine.py
index c24fa80ee4ab..327d7e56b27c 100644
--- a/tests/unit_tests/backtest/test_backtest_engine.py
+++ b/tests/unit_tests/backtest/test_backtest_engine.py
@@ -52,7 +52,7 @@
from nautilus_trader.model.orderbook.data import OrderBookDelta
from nautilus_trader.model.orderbook.data import OrderBookDeltas
from nautilus_trader.model.orderbook.data import OrderBookSnapshot
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.trading.strategy import Strategy
from tests.test_kit.stubs import MyData
from tests.test_kit.stubs.component import TestComponentStubs
@@ -159,7 +159,7 @@ def test_account_state_timestamp(self):
def test_persistence_files_cleaned_up(self):
# Arrange
temp_dir = tempfile.mkdtemp()
- catalog = DataCatalog(
+ catalog = ParquetDataCatalog(
path=str(temp_dir),
fs_protocol="file",
)
@@ -527,7 +527,7 @@ def test_run_ema_cross_with_added_bars(self):
def test_dump_pickled_data(self):
# Arrange, # Act, # Assert
- assert len(self.engine.dump_pickled_data()) == 7229570
+ assert len(self.engine.dump_pickled_data()) == 5181010
def test_load_pickled_data(self):
# Arrange
diff --git a/tests/unit_tests/common/test_common_actor.py b/tests/unit_tests/common/test_common_actor.py
index 1cef3ce296a7..ed3bc8ae46b5 100644
--- a/tests/unit_tests/common/test_common_actor.py
+++ b/tests/unit_tests/common/test_common_actor.py
@@ -43,7 +43,7 @@
from nautilus_trader.model.identifiers import Symbol
from nautilus_trader.model.identifiers import Venue
from nautilus_trader.msgbus.bus import MessageBus
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.persistence.streaming import StreamingFeatherWriter
from nautilus_trader.trading.filters import NewsEvent
from nautilus_trader.trading.filters import NewsImpact
@@ -1557,7 +1557,7 @@ def test_publish_data_persist(self):
logger=self.logger,
)
data_catalog_setup()
- catalog = DataCatalog.from_env()
+ catalog = ParquetDataCatalog.from_env()
writer = StreamingFeatherWriter(
path=str(catalog.path),
fs_protocol=catalog.fs_protocol,
diff --git a/tests/unit_tests/common/test_common_clock.py b/tests/unit_tests/common/test_common_clock.py
index f59e24759726..de495f32a3ad 100644
--- a/tests/unit_tests/common/test_common_clock.py
+++ b/tests/unit_tests/common/test_common_clock.py
@@ -59,27 +59,6 @@ def test_local_now(self):
assert result == UNIX_EPOCH.astimezone(tz=pytz.timezone("Australia/Sydney"))
assert str(result) == "1970-01-01 10:00:00+10:00"
- def test_delta1(self):
- # Arrange
- start = self.clock.utc_now()
-
- # Act
- self.clock.set_time(1_000_000_000)
- result = self.clock.delta(start)
-
- # Assert
- assert result > timedelta(0)
- assert isinstance(result, timedelta)
-
- def test_delta2(self):
- # Arrange
- clock = TestClock()
-
- # Act
- events = clock.delta(UNIX_EPOCH - timedelta(minutes=9))
-
- assert events == timedelta(minutes=9)
-
def test_set_time_alert(self):
# Arrange
name = "TEST_ALERT"
@@ -270,7 +249,8 @@ def test_set_two_repeating_timers(self):
def test_instantiate_has_expected_time_and_properties(self):
# Arrange
initial_ns = 42_000_000
- clock = TestClock(initial_ns=initial_ns)
+ clock = TestClock()
+ clock.set_time(initial_ns)
# Act, Assert
assert clock.timestamp_ns() == initial_ns
@@ -314,7 +294,8 @@ def test_timestamp_ns_returns_expected_datetime(self):
def test_timestamp_returns_expected_double(self):
# Arrange
- clock = TestClock(60_000_000_000)
+ clock = TestClock()
+ clock.set_time(60_000_000_000)
# Act
result = clock.timestamp()
@@ -324,7 +305,8 @@ def test_timestamp_returns_expected_double(self):
def test_timestamp_ns_returns_expected_int64(self):
# Arrange
- clock = TestClock(60_000_000_000)
+ clock = TestClock()
+ clock.set_time(60_000_000_000)
# Act
result = clock.timestamp_ns()
@@ -643,18 +625,6 @@ def test_local_now(self):
assert isinstance(result, datetime)
assert str(result).endswith("+11:00") or str(result).endswith("+10:00")
- def test_delta(self):
- # Arrange
- start = self.clock.utc_now()
-
- # Act
- time.sleep(0.1)
- result = self.clock.delta(start)
-
- # Assert
- assert result > timedelta(0)
- assert isinstance(result, timedelta)
-
def test_set_time_alert(self):
# Arrange
name = "TEST_ALERT"
diff --git a/tests/unit_tests/data/test_data_aggregation.py b/tests/unit_tests/data/test_data_aggregation.py
index 9dd3aaf3fc7e..c245b68acfee 100644
--- a/tests/unit_tests/data/test_data_aggregation.py
+++ b/tests/unit_tests/data/test_data_aggregation.py
@@ -1173,19 +1173,18 @@ def test_instantiate_with_various_bar_specs(self, bar_spec, expected):
# Assert
assert aggregator.next_close_ns == expected
- def test_update_timed_with_test_clock_sends_single_bar_to_handler(self):
+ def test_update_timer_with_test_clock_sends_single_bar_to_handler(self):
# Arrange
clock = TestClock()
- bar_store = ObjectStorer()
- handler = bar_store.store
+ handler = []
instrument_id = TestIdStubs.audusd_id()
bar_spec = BarSpecification(1, BarAggregation.MINUTE, PriceType.MID)
bar_type = BarType(instrument_id, bar_spec)
aggregator = TimeBarAggregator(
AUDUSD_SIM,
bar_type,
- handler,
- TestClock(),
+ handler.append,
+ clock,
Logger(clock),
)
@@ -1215,20 +1214,24 @@ def test_update_timed_with_test_clock_sends_single_bar_to_handler(self):
ask=Price.from_str("1.00003"),
bid_size=Quantity.from_int(1),
ask_size=Quantity.from_int(1),
- ts_event=2 * 60 * 1_000_000_000, # 2 minutes in nanoseconds
- ts_init=2 * 60 * 1_000_000_000, # 2 minutes in nanoseconds
+ ts_event=1 * 60 * 1_000_000_000, # 1 minute in nanoseconds
+ ts_init=1 * 60 * 1_000_000_000, # 1 minute in nanoseconds
)
# Act
aggregator.handle_quote_tick(tick1)
aggregator.handle_quote_tick(tick2)
aggregator.handle_quote_tick(tick3)
+ events = clock.advance_time(tick3.ts_event)
+ events[0].handle()
# Assert
- assert len(bar_store.get_store()) == 1
- assert Price.from_str("1.000025") == bar_store.get_store()[0].open
- assert Price.from_str("1.000035") == bar_store.get_store()[0].high
- assert Price.from_str("1.000025") == bar_store.get_store()[0].low
- assert Price.from_str("1.000035") == bar_store.get_store()[0].close
- assert Quantity.from_int(2) == bar_store.get_store()[0].volume
- assert 60_000_000_000 == bar_store.get_store()[0].ts_init
+ bar = handler[0]
+ assert len(handler) == 1
+ assert Price.from_str("1.000025") == bar.open
+ assert Price.from_str("1.000035") == bar.high
+ assert Price.from_str("1.000015") == bar.low
+ assert Price.from_str("1.000015") == bar.close
+ assert Quantity.from_int(3) == bar.volume
+ assert 60_000_000_000 == bar.ts_init
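Aside (illustrative, not part of the patch): the reworked aggregation test shares one `TestClock` between the test and the aggregator, then drives the timer manually, because `advance_time()` returns the due time events and each must be handled for its callback to fire. A minimal standalone sketch of that pattern (no timers are registered here, so the list is empty; the shape is what matters):

```python
from nautilus_trader.common.clock import TestClock  # assumed import path

clock = TestClock()  # starts at time zero
events = clock.advance_time(60 * 1_000_000_000)  # advance by one minute (nanoseconds)
for event in events:  # empty without registered timers; shown for the pattern only
    event.handle()    # invokes the timer's registered callback (e.g. emits the bar)
```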
diff --git a/tests/unit_tests/indicators/test_amat.py b/tests/unit_tests/indicators/test_amat.py
new file mode 100644
index 000000000000..8d1c65d626f9
--- /dev/null
+++ b/tests/unit_tests/indicators/test_amat.py
@@ -0,0 +1,111 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.backtest.data.providers import TestInstrumentProvider
+from nautilus_trader.indicators.amat import ArcherMovingAveragesTrends
+from tests.test_kit.stubs.data import TestDataStubs
+
+
+AUDUSD_SIM = TestInstrumentProvider.default_fx_ccy("AUD/USD")
+
+
+class TestArcherMovingAveragesTrends:
+ def setup(self):
+ # Fixture Setup
+ self.amat = ArcherMovingAveragesTrends(5, 10, 5)
+
+ def test_name_returns_expected_string(self):
+ # Arrange, Act, Assert
+ assert self.amat.name == "ArcherMovingAveragesTrends"
+
+ def test_period(self):
+ # Arrange, Act, Assert
+ assert self.amat.fast_period == 5
+ assert self.amat.slow_period == 10
+ assert self.amat.signal_period == 5
+
+ def test_initialized_without_inputs_returns_false(self):
+ # Arrange, Act, Assert
+ assert self.amat.initialized is False
+
+ def test_initialized_with_required_inputs_returns_true(self):
+ # Arrange, Act
+ for _i in range(20):
+ self.amat.update_raw(109.61)
+
+ # Assert
+ assert self.amat.initialized is True
+
+ def test_handle_bar_updates_indicator(self):
+ # Arrange
+ indicator = ArcherMovingAveragesTrends(5, 10, 5)
+
+ bar = TestDataStubs.bar_5decimal()
+
+ # Act
+ indicator.handle_bar(bar)
+
+ # Assert
+ assert indicator.has_inputs
+ assert indicator.long_run == 0
+ assert indicator.short_run == 0
+
+ def test_value_with_one_input(self):
+ # Arrange, Act
+ self.amat.update_raw(109.93)
+
+ # Assert
+ assert self.amat.long_run == 0
+ assert self.amat.short_run == 0
+
+ def test_value_with_twenty_inputs(self):
+ # Arrange, Act
+ self.amat.update_raw(109.93)
+ self.amat.update_raw(110.0)
+ self.amat.update_raw(109.77)
+ self.amat.update_raw(109.96)
+ self.amat.update_raw(110.29)
+ self.amat.update_raw(110.53)
+ self.amat.update_raw(110.27)
+ self.amat.update_raw(110.21)
+ self.amat.update_raw(110.06)
+ self.amat.update_raw(110.19)
+ self.amat.update_raw(109.83)
+ self.amat.update_raw(109.9)
+ self.amat.update_raw(110.0)
+ self.amat.update_raw(110.03)
+ self.amat.update_raw(110.13)
+ self.amat.update_raw(109.95)
+ self.amat.update_raw(109.75)
+ self.amat.update_raw(110.15)
+ self.amat.update_raw(109.9)
+ self.amat.update_raw(110.04)
+
+ # Assert
+ assert self.amat.long_run == 0
+ assert self.amat.short_run == 1
+
+ def test_reset_successfully_returns_indicator_to_fresh_state(self):
+ # Arrange
+ for _i in range(1000):
+ self.amat.update_raw(109.93)
+
+ # Act
+ self.amat.reset()
+
+ # Assert
+ assert not self.amat.initialized
+ assert self.amat.long_run == 0
+ assert self.amat.short_run == 0
diff --git a/tests/unit_tests/indicators/test_aroon.py b/tests/unit_tests/indicators/test_aroon.py
new file mode 100644
index 000000000000..7c35a930e9b0
--- /dev/null
+++ b/tests/unit_tests/indicators/test_aroon.py
@@ -0,0 +1,112 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.backtest.data.providers import TestInstrumentProvider
+from nautilus_trader.indicators.aroon import AroonOscillator
+from tests.test_kit.stubs.data import TestDataStubs
+
+
+AUDUSD_SIM = TestInstrumentProvider.default_fx_ccy("AUD/USD")
+
+
+class TestAroonOscillator:
+ def setup(self):
+ # Fixture Setup
+ self.aroon = AroonOscillator(10)
+
+ def test_name_returns_expected_string(self):
+ # Arrange, Act, Assert
+ assert self.aroon.name == "AroonOscillator"
+
+ def test_period(self):
+ # Arrange, Act, Assert
+ assert self.aroon.period == 10
+
+ def test_initialized_without_inputs_returns_false(self):
+ # Arrange, Act, Assert
+ assert self.aroon.initialized is False
+
+ def test_initialized_with_required_inputs_returns_true(self):
+ # Arrange, Act
+ for _i in range(20):
+ self.aroon.update_raw(110.08, 109.61)
+
+ # Assert
+ assert self.aroon.initialized is True
+
+ def test_handle_bar_updates_indicator(self):
+ # Arrange
+ indicator = AroonOscillator(10)
+ bar = TestDataStubs.bar_5decimal()
+
+ # Act
+ indicator.handle_bar(bar)
+
+ # Assert
+ assert indicator.has_inputs
+ assert indicator.aroon_up == 100.0
+ assert indicator.aroon_down == 100.0
+ assert indicator.value == 0
+
+ def test_value_with_one_input(self):
+ # Arrange, Act
+ self.aroon.update_raw(110.08, 109.61)
+
+ # Assert
+ assert self.aroon.aroon_up == 100.0
+ assert self.aroon.aroon_down == 100.0
+ assert self.aroon.value == 0
+
+ def test_value_with_twenty_inputs(self):
+ # Arrange, Act
+ self.aroon.update_raw(110.08, 109.61)
+ self.aroon.update_raw(110.15, 109.91)
+ self.aroon.update_raw(110.1, 109.73)
+ self.aroon.update_raw(110.06, 109.77)
+ self.aroon.update_raw(110.29, 109.88)
+ self.aroon.update_raw(110.53, 110.29)
+ self.aroon.update_raw(110.61, 110.26)
+ self.aroon.update_raw(110.28, 110.17)
+ self.aroon.update_raw(110.3, 110.0)
+ self.aroon.update_raw(110.25, 110.01)
+ self.aroon.update_raw(110.25, 109.81)
+ self.aroon.update_raw(109.92, 109.71)
+ self.aroon.update_raw(110.21, 109.84)
+ self.aroon.update_raw(110.08, 109.95)
+ self.aroon.update_raw(110.2, 109.96)
+ self.aroon.update_raw(110.16, 109.95)
+ self.aroon.update_raw(109.99, 109.75)
+ self.aroon.update_raw(110.2, 109.73)
+ self.aroon.update_raw(110.1, 109.81)
+ self.aroon.update_raw(110.04, 109.96)
+
+ # Assert
+ assert self.aroon.aroon_up == 9.999999999999998
+ assert self.aroon.aroon_down == 19.999999999999996
+ assert self.aroon.value == -9.999999999999998
+
+ def test_reset_successfully_returns_indicator_to_fresh_state(self):
+ # Arrange
+ for _i in range(1000):
+ self.aroon.update_raw(110.08, 109.61)
+
+ # Act
+ self.aroon.reset()
+
+ # Assert
+ assert not self.aroon.initialized
+ assert self.aroon.aroon_up == 0
+ assert self.aroon.aroon_down == 0
+ assert self.aroon.value == 0
diff --git a/tests/unit_tests/indicators/test_bias.py b/tests/unit_tests/indicators/test_bias.py
new file mode 100644
index 000000000000..c5071ede2da6
--- /dev/null
+++ b/tests/unit_tests/indicators/test_bias.py
@@ -0,0 +1,116 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.backtest.data.providers import TestInstrumentProvider
+from nautilus_trader.indicators.bias import Bias
+from tests.test_kit.stubs.data import TestDataStubs
+
+
+AUDUSD_SIM = TestInstrumentProvider.default_fx_ccy("AUD/USD")
+
+
+class TestBias:
+ def setup(self):
+ # Fixture Setup
+ self.bias = Bias(10)
+
+ def test_name_returns_expected_string(self):
+ # Arrange, Act, Assert
+ assert self.bias.name == "Bias"
+
+ def test_str_repr_returns_expected_string(self):
+ # Arrange, Act, Assert
+ assert str(self.bias) == "Bias(10, SIMPLE)"
+ assert repr(self.bias) == "Bias(10, SIMPLE)"
+
+ def test_period_returns_expected_value(self):
+ # Arrange, Act, Assert
+ assert self.bias.period == 10
+
+ def test_initialized_without_inputs_returns_false(self):
+ # Arrange, Act, Assert
+ assert self.bias.initialized is False
+
+ def test_initialized_with_required_inputs_returns_true(self):
+ # Arrange
+ self.bias.update_raw(1.00000)
+ self.bias.update_raw(2.00000)
+ self.bias.update_raw(3.00000)
+ self.bias.update_raw(4.00000)
+ self.bias.update_raw(5.00000)
+ self.bias.update_raw(6.00000)
+ self.bias.update_raw(7.00000)
+ self.bias.update_raw(8.00000)
+ self.bias.update_raw(9.00000)
+ self.bias.update_raw(10.00000)
+
+ # Act, Assert
+ assert self.bias.initialized is True
+
+ def test_handle_bar_updates_indicator(self):
+ # Arrange
+ indicator = Bias(10)
+
+ bar = TestDataStubs.bar_5decimal()
+
+ # Act
+ indicator.handle_bar(bar)
+ # Assert
+ assert indicator.has_inputs
+ assert indicator.value == 0
+
+ def test_value_with_one_input_returns_expected_value(self):
+ # Arrange
+ self.bias.update_raw(1.00000)
+ # Act, Assert
+ assert self.bias.value == 0
+
+ def test_value_with_all_higher_inputs_returns_expected_value(self):
+ # Arrange
+ self.bias.update_raw(109.93)
+ self.bias.update_raw(110.0)
+ self.bias.update_raw(109.77)
+ self.bias.update_raw(109.96)
+ self.bias.update_raw(110.29)
+ self.bias.update_raw(110.53)
+ self.bias.update_raw(110.27)
+ self.bias.update_raw(110.21)
+ self.bias.update_raw(110.06)
+ self.bias.update_raw(110.19)
+ self.bias.update_raw(109.83)
+ self.bias.update_raw(109.9)
+ self.bias.update_raw(110.0)
+ self.bias.update_raw(110.03)
+ self.bias.update_raw(110.13)
+ self.bias.update_raw(109.95)
+ self.bias.update_raw(109.75)
+ self.bias.update_raw(110.15)
+ self.bias.update_raw(109.9)
+ self.bias.update_raw(110.04)
+ # Act, Assert
+ assert self.bias.value == 0.0006547359231776628
+
+ def test_reset_successfully_returns_indicator_to_fresh_state(self):
+ # Arrange
+ self.bias.update_raw(1.00020)
+ self.bias.update_raw(1.00030)
+ self.bias.update_raw(1.00050)
+
+ # Act
+ self.bias.reset()
+
+ # Assert
+ assert not self.bias.initialized
+ assert self.bias.value == 0
diff --git a/tests/unit_tests/indicators/test_cmo.py b/tests/unit_tests/indicators/test_cmo.py
new file mode 100644
index 000000000000..ebed3dd904f5
--- /dev/null
+++ b/tests/unit_tests/indicators/test_cmo.py
@@ -0,0 +1,116 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.backtest.data.providers import TestInstrumentProvider
+from nautilus_trader.indicators.cmo import ChandeMomentumOscillator
+from tests.test_kit.stubs.data import TestDataStubs
+
+
+AUDUSD_SIM = TestInstrumentProvider.default_fx_ccy("AUD/USD")
+
+
+class TestChandeMomentumOscillator:
+ def setup(self):
+ # Fixture Setup
+ self.cmo = ChandeMomentumOscillator(10)
+
+ def test_name_returns_expected_string(self):
+ # Arrange, Act, Assert
+ assert self.cmo.name == "ChandeMomentumOscillator"
+
+ def test_str_repr_returns_expected_string(self):
+ # Arrange, Act, Assert
+ assert str(self.cmo) == "ChandeMomentumOscillator(10, WILDER)"
+ assert repr(self.cmo) == "ChandeMomentumOscillator(10, WILDER)"
+
+ def test_period_returns_expected_value(self):
+ # Arrange, Act, Assert
+ assert self.cmo.period == 10
+
+ def test_initialized_without_inputs_returns_false(self):
+ # Arrange, Act, Assert
+ assert self.cmo.initialized is False
+
+ def test_initialized_with_required_inputs_returns_true(self):
+ # Arrange
+ self.cmo.update_raw(1.00000)
+ self.cmo.update_raw(2.00000)
+ self.cmo.update_raw(3.00000)
+ self.cmo.update_raw(4.00000)
+ self.cmo.update_raw(5.00000)
+ self.cmo.update_raw(6.00000)
+ self.cmo.update_raw(7.00000)
+ self.cmo.update_raw(8.00000)
+ self.cmo.update_raw(9.00000)
+ self.cmo.update_raw(10.00000)
+
+ # Act, Assert
+ assert self.cmo.initialized is True
+
+ def test_handle_bar_updates_indicator(self):
+ # Arrange
+ indicator = ChandeMomentumOscillator(10)
+
+ bar = TestDataStubs.bar_5decimal()
+
+ # Act
+ indicator.handle_bar(bar)
+ # Assert
+ assert indicator.has_inputs
+ assert indicator.value == 0
+
+ def test_value_with_one_input_returns_expected_value(self):
+ # Arrange
+ self.cmo.update_raw(1.00000)
+ # Act, Assert
+ assert self.cmo.value == 0
+
+ def test_value_with_all_higher_inputs_returns_expected_value(self):
+ # Arrange
+ self.cmo.update_raw(109.93)
+ self.cmo.update_raw(110.0)
+ self.cmo.update_raw(109.77)
+ self.cmo.update_raw(109.96)
+ self.cmo.update_raw(110.29)
+ self.cmo.update_raw(110.53)
+ self.cmo.update_raw(110.27)
+ self.cmo.update_raw(110.21)
+ self.cmo.update_raw(110.06)
+ self.cmo.update_raw(110.19)
+ self.cmo.update_raw(109.83)
+ self.cmo.update_raw(109.9)
+ self.cmo.update_raw(110.0)
+ self.cmo.update_raw(110.03)
+ self.cmo.update_raw(110.13)
+ self.cmo.update_raw(109.95)
+ self.cmo.update_raw(109.75)
+ self.cmo.update_raw(110.15)
+ self.cmo.update_raw(109.9)
+ self.cmo.update_raw(110.04)
+ # Act, Assert
+ assert self.cmo.value == 2.0896294562387054
+
+ def test_reset_successfully_returns_indicator_to_fresh_state(self):
+ # Arrange
+ self.cmo.update_raw(1.00020)
+ self.cmo.update_raw(1.00030)
+ self.cmo.update_raw(1.00050)
+
+ # Act
+ self.cmo.reset()
+
+ # Assert
+ assert not self.cmo.initialized
+ assert self.cmo.value == 0
diff --git a/tests/unit_tests/indicators/test_dema.py b/tests/unit_tests/indicators/test_dema.py
new file mode 100644
index 000000000000..424e7e93978d
--- /dev/null
+++ b/tests/unit_tests/indicators/test_dema.py
@@ -0,0 +1,134 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.backtest.data.providers import TestInstrumentProvider
+from nautilus_trader.indicators.average.dema import DoubleExponentialMovingAverage
+from nautilus_trader.model.enums import PriceType
+from tests.test_kit.stubs.data import TestDataStubs
+
+
+AUDUSD_SIM = TestInstrumentProvider.default_fx_ccy("AUD/USD")
+
+
+class TestDoubleExponentialMovingAverage:
+ def setup(self):
+ # Fixture Setup
+ self.dema = DoubleExponentialMovingAverage(10)
+
+ def test_name_returns_expected_string(self):
+ # Arrange, Act, Assert
+ assert self.dema.name == "DoubleExponentialMovingAverage"
+
+ def test_str_repr_returns_expected_string(self):
+ # Arrange, Act, Assert
+ assert str(self.dema) == "DoubleExponentialMovingAverage(10)"
+ assert repr(self.dema) == "DoubleExponentialMovingAverage(10)"
+
+ def test_period_returns_expected_value(self):
+ # Arrange, Act, Assert
+ assert self.dema.period == 10
+
+ def test_initialized_without_inputs_returns_false(self):
+ # Arrange, Act, Assert
+ assert self.dema.initialized is False
+
+ def test_initialized_with_required_inputs_returns_true(self):
+ # Arrange
+ self.dema.update_raw(1.00000)
+ self.dema.update_raw(2.00000)
+ self.dema.update_raw(3.00000)
+ self.dema.update_raw(4.00000)
+ self.dema.update_raw(5.00000)
+ self.dema.update_raw(6.00000)
+ self.dema.update_raw(7.00000)
+ self.dema.update_raw(8.00000)
+ self.dema.update_raw(9.00000)
+ self.dema.update_raw(10.00000)
+
+ # Act
+
+ # Assert
+ assert self.dema.initialized is True
+
+ def test_handle_quote_tick_updates_indicator(self):
+ # Arrange
+ indicator = DoubleExponentialMovingAverage(10, PriceType.MID)
+
+ tick = TestDataStubs.quote_tick_5decimal(AUDUSD_SIM.id)
+
+ # Act
+ indicator.handle_quote_tick(tick)
+
+ # Assert
+ assert indicator.has_inputs
+ assert indicator.value == 1.00002
+
+ def test_handle_trade_tick_updates_indicator(self):
+ # Arrange
+ indicator = DoubleExponentialMovingAverage(10)
+
+ tick = TestDataStubs.trade_tick_5decimal(AUDUSD_SIM.id)
+
+ # Act
+ indicator.handle_trade_tick(tick)
+
+ # Assert
+ assert indicator.has_inputs
+ assert indicator.value == 1.00001
+
+ def test_handle_bar_updates_indicator(self):
+ # Arrange
+ indicator = DoubleExponentialMovingAverage(10)
+
+ bar = TestDataStubs.bar_5decimal()
+
+ # Act
+ indicator.handle_bar(bar)
+
+ # Assert
+ assert indicator.has_inputs
+ assert indicator.value == 1.00003
+
+ def test_value_with_one_input_returns_expected_value(self):
+ # Arrange
+ self.dema.update_raw(1.00000)
+
+ # Act, Assert
+ assert self.dema.value == 1.0
+
+ def test_value_with_three_inputs_returns_expected_value(self):
+ # Arrange
+ self.dema.update_raw(1.00000)
+ self.dema.update_raw(2.00000)
+ self.dema.update_raw(3.00000)
+
+ # Act, Assert
+ assert self.dema.value == 1.904583020285499
+
+ def test_reset_successfully_returns_indicator_to_fresh_state(self):
+ # Arrange
+ for _i in range(1000):
+ self.dema.update_raw(1.00000)
+
+ # Act
+ self.dema.reset()
+
+ # Assert
+ assert not self.dema.initialized
+ assert self.dema.value == 0.0
diff --git a/tests/unit_tests/indicators/test_hilbert_snr.py b/tests/unit_tests/indicators/test_hilbert_snr.py
index f8c39327df2a..7ad959749dfd 100644
--- a/tests/unit_tests/indicators/test_hilbert_snr.py
+++ b/tests/unit_tests/indicators/test_hilbert_snr.py
@@ -15,6 +15,8 @@
import sys
+import pytest
+
from nautilus_trader.backtest.data.providers import TestInstrumentProvider
from nautilus_trader.indicators.hilbert_snr import HilbertSignalNoiseRatio
from tests.test_kit.stubs.data import TestDataStubs
@@ -111,7 +113,7 @@ def test_value_with_close_on_high_returns_expected_value(self):
self.snr.update_raw(high, low)
# Assert
- assert self.snr.value == 51.90000000000095
+ assert self.snr.value == pytest.approx(51.90)
def test_value_with_close_on_low_returns_expected_value(self):
# Arrange
@@ -125,7 +127,7 @@ def test_value_with_close_on_low_returns_expected_value(self):
self.snr.update_raw(high, low)
# Assert
- assert self.snr.value == 51.90000000000095
+ assert self.snr.value == pytest.approx(51.90)
def test_reset_successfully_returns_indicator_to_fresh_state(self):
# Arrange
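Aside (illustrative, not part of the patch): swapping the exact float comparison for `pytest.approx` keeps the assertion stable against floating-point noise, since `approx` compares within a relative tolerance (1e-6 by default):

```python
import pytest

# 51.90000000000095 differs from 51.90 by ~1e-12, well inside the default tolerance.
assert 51.90000000000095 == pytest.approx(51.90)
```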
diff --git a/tests/unit_tests/indicators/test_ma_factory.py b/tests/unit_tests/indicators/test_ma_factory.py
index eca54ed6812f..09800d6e920a 100644
--- a/tests/unit_tests/indicators/test_ma_factory.py
+++ b/tests/unit_tests/indicators/test_ma_factory.py
@@ -14,10 +14,12 @@
# -------------------------------------------------------------------------------------------------
from nautilus_trader.backtest.data.providers import TestInstrumentProvider
+from nautilus_trader.indicators.average.dema import DoubleExponentialMovingAverage
from nautilus_trader.indicators.average.ema import ExponentialMovingAverage
from nautilus_trader.indicators.average.hma import HullMovingAverage
from nautilus_trader.indicators.average.ma_factory import MovingAverageFactory
from nautilus_trader.indicators.average.moving_average import MovingAverageType
+from nautilus_trader.indicators.average.rma import WilderMovingAverage
from nautilus_trader.indicators.average.sma import SimpleMovingAverage
from nautilus_trader.indicators.average.wma import WeightedMovingAverage
@@ -53,3 +55,17 @@ def test_weighted_returns_expected_indicator(self):
# Assert
assert isinstance(indicator, WeightedMovingAverage)
+
+ def test_wilder_returns_expected_indicator(self):
+ # Arrange, Act
+ indicator = MovingAverageFactory.create(10, MovingAverageType.WILDER)
+
+ # Assert
+ assert isinstance(indicator, WilderMovingAverage)
+
+ def test_double_exponential_returns_expected_indicator(self):
+ # Arrange, Act
+ indicator = MovingAverageFactory.create(10, MovingAverageType.DOUBLEEXPONENTIAL)
+
+ # Assert
+ assert isinstance(indicator, DoubleExponentialMovingAverage)
diff --git a/tests/unit_tests/indicators/test_vhf.py b/tests/unit_tests/indicators/test_vhf.py
new file mode 100644
index 000000000000..e6ccd4a12128
--- /dev/null
+++ b/tests/unit_tests/indicators/test_vhf.py
@@ -0,0 +1,83 @@
+# -------------------------------------------------------------------------------------------------
+# Copyright (C) 2015-2022 Nautech Systems Pty Ltd. All rights reserved.
+# https://nautechsystems.io
+#
+# Licensed under the GNU Lesser General Public License Version 3.0 (the "License");
+# You may not use this file except in compliance with the License.
+# You may obtain a copy of the License at https://www.gnu.org/licenses/lgpl-3.0.en.html
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# -------------------------------------------------------------------------------------------------
+
+from nautilus_trader.backtest.data.providers import TestInstrumentProvider
+from nautilus_trader.indicators.vhf import VerticalHorizontalFilter
+from tests.test_kit.stubs.data import TestDataStubs
+
+
+AUDUSD_SIM = TestInstrumentProvider.default_fx_ccy("AUD/USD")
+
+
+class TestVerticalHorizontalFilter:
+ def setup(self):
+ # Fixture Setup
+ self.period = 10
+ self.vhf = VerticalHorizontalFilter(period=self.period)
+
+ def test_init(self):
+ assert not self.vhf.initialized
+ assert not self.vhf.has_inputs
+ assert self.vhf.period == self.period
+ assert self.vhf.value == 0
+
+ def test_name_returns_expected_string(self):
+ assert self.vhf.name == "VerticalHorizontalFilter"
+
+ def test_handle_bar_updates_indicator(self):
+ for _ in range(self.period):
+ self.vhf.handle_bar(TestDataStubs.bar_5decimal())
+
+ assert self.vhf.has_inputs
+ assert self.vhf.value == 0
+
+ def test_value_with_one_input(self):
+ self.vhf.update_raw(56.87)
+
+ assert self.vhf.value == 0
+
+ def test_value_with_twenty_inputs(self):
+ self.vhf.update_raw(56.87)
+ self.vhf.update_raw(56.96)
+ self.vhf.update_raw(57.17)
+ self.vhf.update_raw(57.54)
+ self.vhf.update_raw(57.88)
+ self.vhf.update_raw(57.85)
+ self.vhf.update_raw(57.86)
+ self.vhf.update_raw(57.97)
+ self.vhf.update_raw(58.07)
+ self.vhf.update_raw(58.04)
+ self.vhf.update_raw(57.96)
+ self.vhf.update_raw(57.98)
+ self.vhf.update_raw(58.05)
+ self.vhf.update_raw(57.94)
+ self.vhf.update_raw(57.99)
+ self.vhf.update_raw(58.11)
+ self.vhf.update_raw(58.22)
+ self.vhf.update_raw(58.19)
+ self.vhf.update_raw(58.04)
+ self.vhf.update_raw(58.02)
+
+ assert self.vhf.value == 0.36842105263158487
+
+ def test_reset(self):
+ self.vhf.update_raw(56.87)
+
+ self.vhf.reset()
+
+ assert not self.vhf.initialized
+ assert not self.vhf.has_inputs
+ assert self.vhf.period == self.period
+ assert self.vhf.value == 0
diff --git a/tests/unit_tests/model/test_model_bar.py b/tests/unit_tests/model/test_model_bar.py
index f8678748aea5..df20ea1345c8 100644
--- a/tests/unit_tests/model/test_model_bar.py
+++ b/tests/unit_tests/model/test_model_bar.py
@@ -13,6 +13,8 @@
# limitations under the License.
# -------------------------------------------------------------------------------------------------
+import pickle
+
import pytest
from nautilus_trader.model.data.bar import Bar
@@ -32,6 +34,7 @@
AUDUSD_SIM = TestIdStubs.audusd_id()
GBPUSD_SIM = TestIdStubs.gbpusd_id()
+
ONE_MIN_BID = BarSpecification(1, BarAggregation.MINUTE, PriceType.BID)
AUDUSD_1_MIN_BID = BarType(AUDUSD_SIM, ONE_MIN_BID)
GBPUSD_1_MIN_BID = BarType(GBPUSD_SIM, ONE_MIN_BID)
@@ -61,6 +64,17 @@ def test_bar_spec_comparison(self):
assert bar_spec1 > bar_spec3
assert bar_spec1 >= bar_spec3
+ def test_bar_spec_pickle(self):
+ # Arrange
+ bar_spec = BarSpecification(1000, BarAggregation.TICK, PriceType.LAST)
+
+ # Act
+ pickled = pickle.dumps(bar_spec)
+ unpickled = pickle.loads(pickled) # noqa S301 (pickle is safe here)
+
+ # Assert
+ assert unpickled == bar_spec
+
def test_bar_spec_hash_str_and_repr(self):
# Arrange
bar_spec = BarSpecification(1, BarAggregation.MINUTE, PriceType.BID)
@@ -163,6 +177,15 @@ def test_aggregation_queries(
== is_information_aggregated
)
+ def test_properties(self):
+ # Arrange, Act
+ bar_spec = BarSpecification(1, BarAggregation.HOUR, PriceType.BID)
+
+ # Assert
+ assert bar_spec.step == 1
+ assert bar_spec.aggregation == BarAggregation.HOUR
+ assert bar_spec.price_type == PriceType.BID
+
class TestBarType:
def test_bar_type_equality(self):
@@ -194,6 +217,19 @@ def test_bar_type_comparison(self):
assert bar_type3 > bar_type1
assert bar_type3 >= bar_type1
+ def test_bar_type_pickle(self):
+ # Arrange
+ instrument_id = InstrumentId(Symbol("AUD/USD"), Venue("SIM"))
+ bar_spec = BarSpecification(1, BarAggregation.MINUTE, PriceType.BID)
+ bar_type = BarType(instrument_id, bar_spec)
+
+ # Act
+ pickled = pickle.dumps(bar_type)
+ unpickled = pickle.loads(pickled) # noqa S301 (pickle is safe here)
+
+ # Assert
+ assert unpickled == bar_type
+
def test_bar_type_hash_str_and_repr(self):
# Arrange
instrument_id = InstrumentId(Symbol("AUD/USD"), Venue("SIM"))
@@ -267,6 +303,17 @@ def test_from_str_given_various_valid_string_returns_expected_specification(
# Assert
assert expected == bar_type
+ def test_properties(self):
+ # Arrange, Act
+ instrument_id = InstrumentId(Symbol("AUD/USD"), Venue("SIM"))
+ bar_spec = BarSpecification(1, BarAggregation.MINUTE, PriceType.BID)
+ bar_type = BarType(instrument_id, bar_spec, AggregationSource.EXTERNAL)
+
+ # Assert
+ assert bar_type.instrument_id == instrument_id
+ assert bar_type.spec == bar_spec
+ assert bar_type.aggregation_source == AggregationSource.EXTERNAL
+
class TestBar:
def test_fully_qualified_name(self):
@@ -369,6 +416,34 @@ def test_hash_str_repr(self):
== "Bar(AUD/USD.SIM-1-MINUTE-BID-EXTERNAL,1.00001,1.00004,1.00002,1.00003,100000,0)"
)
+ def test_is_single_price(self):
+ # Arrange
+ bar1 = Bar(
+ AUDUSD_1_MIN_BID,
+ Price.from_str("1.00000"),
+ Price.from_str("1.00000"),
+ Price.from_str("1.00000"),
+ Price.from_str("1.00000"),
+ Quantity.from_int(100000),
+ 0,
+ 0,
+ )
+
+ bar2 = Bar(
+ AUDUSD_1_MIN_BID,
+ Price.from_str("1.00000"),
+ Price.from_str("1.00004"),
+ Price.from_str("1.00002"),
+ Price.from_str("1.00003"),
+ Quantity.from_int(100000),
+ 0,
+ 0,
+ )
+
+ # Act, Assert
+ assert bar1.is_single_price()
+ assert not bar2.is_single_price()
+
def test_to_dict(self):
# Arrange
bar = Bar(
@@ -407,3 +482,23 @@ def test_from_dict_returns_expected_bar(self):
# Assert
assert result == bar
+
+ def test_pickle_bar(self):
+ # Arrange
+ bar = Bar(
+ AUDUSD_1_MIN_BID,
+ Price.from_str("1.00001"),
+ Price.from_str("1.00004"),
+ Price.from_str("1.00002"),
+ Price.from_str("1.00003"),
+ Quantity.from_int(100000),
+ 0,
+ 0,
+ )
+
+ # Act
+ pickled = pickle.dumps(bar)
+ unpickled = pickle.loads(pickled) # noqa S301 (pickle is safe here)
+
+ # Assert
+ assert unpickled == bar
diff --git a/tests/unit_tests/model/test_orderbook.py b/tests/unit_tests/model/test_orderbook.py
index f0fe469b73b9..9a5e53ebca4a 100644
--- a/tests/unit_tests/model/test_orderbook.py
+++ b/tests/unit_tests/model/test_orderbook.py
@@ -303,6 +303,8 @@ def test_orderbook_snapshot(empty_l2_book):
empty_l2_book.apply_snapshot(snapshot)
assert empty_l2_book.best_bid_price() == 1580.0
assert empty_l2_book.best_ask_price() == 1552.15
+ assert empty_l2_book.count == 4
+ assert empty_l2_book.last_update_id == 4
def test_orderbook_operation_update(empty_l2_book, clock):
@@ -321,6 +323,8 @@ def test_orderbook_operation_update(empty_l2_book, clock):
)
empty_l2_book.apply_delta(delta)
assert empty_l2_book.best_ask_price() == 0.5814
+ assert empty_l2_book.count == 1
+ assert empty_l2_book.last_update_id == 1
def test_orderbook_operation_add(empty_l2_book, clock):
@@ -339,6 +343,8 @@ def test_orderbook_operation_add(empty_l2_book, clock):
)
empty_l2_book.apply_delta(delta)
assert empty_l2_book.best_ask_price() == 0.59
+ assert empty_l2_book.count == 1
+ assert empty_l2_book.last_update_id == 1
def test_orderbook_operations(empty_l2_book):
@@ -377,6 +383,7 @@ def test_apply(empty_l2_book, clock):
)
empty_l2_book.apply_snapshot(snapshot)
assert empty_l2_book.best_ask_price() == 160
+ assert empty_l2_book.count == 2
delta = OrderBookDelta(
instrument_id=TestIdStubs.audusd_id(),
book_type=BookType.L2_MBP,
@@ -392,6 +399,7 @@ def test_apply(empty_l2_book, clock):
)
empty_l2_book.apply(delta)
assert empty_l2_book.best_ask_price() == 155
+ assert empty_l2_book.count == 3
def test_orderbook_midpoint(sample_book):
diff --git a/tests/unit_tests/persistence/external/test_core.py b/tests/unit_tests/persistence/external/test_core.py
index 1f79c206ce0b..6d4187a3510c 100644
--- a/tests/unit_tests/persistence/external/test_core.py
+++ b/tests/unit_tests/persistence/external/test_core.py
@@ -31,8 +31,8 @@
from nautilus_trader.model.data.tick import QuoteTick
from nautilus_trader.model.objects import Price
from nautilus_trader.model.objects import Quantity
-from nautilus_trader.persistence.catalog import DataCatalog
-from nautilus_trader.persistence.catalog import resolve_path
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
+from nautilus_trader.persistence.catalog.parquet import resolve_path
from nautilus_trader.persistence.external.core import RawFile
from nautilus_trader.persistence.external.core import _validate_dataset
from nautilus_trader.persistence.external.core import dicts_to_dataframes
@@ -174,7 +174,7 @@ def test_write_parquet_no_partitions(
df = pd.DataFrame(
{"value": np.random.random(5), "instrument_id": ["a", "a", "a", "b", "b"]}
)
- catalog = DataCatalog.from_env()
+ catalog = ParquetDataCatalog.from_env()
fs = catalog.fs
root = catalog.path
@@ -199,7 +199,7 @@ def test_write_parquet_partitions(
self,
):
# Arrange
- catalog = DataCatalog.from_env()
+ catalog = ParquetDataCatalog.from_env()
fs = catalog.fs
root = catalog.path
path = "sample.parquet"
@@ -358,7 +358,7 @@ def test_data_catalog_instruments_filter_by_instrument_id(self):
def test_repartition_dataset(self):
# Arrange
- catalog = DataCatalog.from_env()
+ catalog = ParquetDataCatalog.from_env()
fs = catalog.fs
root = catalog.path
path = "sample.parquet"
@@ -467,7 +467,7 @@ def test_catalog_generic_data_not_overwritten(self):
# Clear the catalog again
data_catalog_setup()
- self.catalog = DataCatalog.from_env()
+ self.catalog = ParquetDataCatalog.from_env()
assert (
len(self.catalog.generic_data(NewsEventData, raise_on_empty=False, as_nautilus=True))
diff --git a/tests/unit_tests/persistence/external/test_parsers.py b/tests/unit_tests/persistence/external/test_parsers.py
index 454eba3c332c..43a6a0578185 100644
--- a/tests/unit_tests/persistence/external/test_parsers.py
+++ b/tests/unit_tests/persistence/external/test_parsers.py
@@ -24,7 +24,7 @@
from nautilus_trader.backtest.data.wranglers import BarDataWrangler
from nautilus_trader.backtest.data.wranglers import QuoteTickDataWrangler
from nautilus_trader.model.instruments.currency_pair import CurrencyPair
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.persistence.external.core import make_raw_files
from nautilus_trader.persistence.external.core import process_files
from nautilus_trader.persistence.external.core import process_raw_file
@@ -47,7 +47,7 @@
class TestPersistenceParsers:
def setup(self):
data_catalog_setup()
- self.catalog = DataCatalog.from_env()
+ self.catalog = ParquetDataCatalog.from_env()
self.reader = MockReader()
self.line_preprocessor = TestLineProcessor()
diff --git a/tests/unit_tests/persistence/test_batching.py b/tests/unit_tests/persistence/test_batching.py
index f3891b6f4801..7dc09f77007f 100644
--- a/tests/unit_tests/persistence/test_batching.py
+++ b/tests/unit_tests/persistence/test_batching.py
@@ -13,7 +13,6 @@
# limitations under the License.
# -------------------------------------------------------------------------------------------------
-
import fsspec
from nautilus_trader.adapters.betfair.providers import BetfairInstrumentProvider
@@ -24,8 +23,8 @@
from nautilus_trader.model.data.venue import InstrumentStatusUpdate
from nautilus_trader.model.orderbook.data import OrderBookData
from nautilus_trader.persistence.batching import batch_files
-from nautilus_trader.persistence.catalog import DataCatalog
-from nautilus_trader.persistence.catalog import resolve_path
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
+from nautilus_trader.persistence.catalog.parquet import resolve_path
from nautilus_trader.persistence.external.core import process_files
from nautilus_trader.persistence.external.readers import CSVReader
from nautilus_trader.persistence.funcs import parse_bytes
@@ -42,7 +41,7 @@
class TestPersistenceBatching:
def setup(self):
data_catalog_setup()
- self.catalog = DataCatalog.from_env()
+ self.catalog = ParquetDataCatalog.from_env()
self.fs: fsspec.AbstractFileSystem = self.catalog.fs
self._loaded_data_into_catalog()
diff --git a/tests/unit_tests/persistence/test_catalog.py b/tests/unit_tests/persistence/test_catalog.py
index ce70daabbd09..90d08ae12128 100644
--- a/tests/unit_tests/persistence/test_catalog.py
+++ b/tests/unit_tests/persistence/test_catalog.py
@@ -38,8 +38,8 @@
from nautilus_trader.model.instruments.equity import Equity
from nautilus_trader.model.objects import Price
from nautilus_trader.model.objects import Quantity
-from nautilus_trader.persistence.catalog import DataCatalog
-from nautilus_trader.persistence.catalog import resolve_path
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
+from nautilus_trader.persistence.catalog.parquet import resolve_path
from nautilus_trader.persistence.external.core import dicts_to_dataframes
from nautilus_trader.persistence.external.core import process_files
from nautilus_trader.persistence.external.core import split_and_serialize
@@ -61,7 +61,7 @@
class TestPersistenceCatalog:
def setup(self):
data_catalog_setup()
- self.catalog = DataCatalog.from_env()
+ self.catalog = ParquetDataCatalog.from_env()
self.fs: fsspec.AbstractFileSystem = self.catalog.fs
self._load_data_into_catalog()
@@ -78,13 +78,13 @@ def _load_data_into_catalog(self):
def test_catalog_root_path_windows_local(self):
from tempfile import tempdir
- catalog = DataCatalog(path=tempdir, fs_protocol="file")
+ catalog = ParquetDataCatalog(path=tempdir, fs_protocol="file")
path = resolve_path(path=catalog.path / "test", fs=catalog.fs)
assert path == str(pathlib.Path(tempdir) / "test")
@pytest.mark.skipif(sys.platform != "win32", reason="windows only")
def test_catalog_root_path_windows_non_local(self):
- catalog = DataCatalog(path="/some/path", fs_protocol="memory")
+ catalog = ParquetDataCatalog(path="/some/path", fs_protocol="memory")
path = resolve_path(path=catalog.path / "test", fs=catalog.fs)
assert path == "/some/path/test"
@@ -123,7 +123,7 @@ def test_data_catalog_instruments_as_nautilus(self):
def test_data_catalog_currency_with_null_max_price_loads(self):
# Arrange
- catalog = DataCatalog.from_env()
+ catalog = ParquetDataCatalog.from_env()
instrument = TestInstrumentProvider.default_fx_ccy("AUD/USD", venue=Venue("SIM"))
write_objects(catalog=catalog, chunk=[instrument])
@@ -135,7 +135,7 @@ def test_data_catalog_currency_with_null_max_price_loads(self):
def test_data_catalog_instrument_ids_correctly_unmapped(self):
# Arrange
- catalog = DataCatalog.from_env()
+ catalog = ParquetDataCatalog.from_env()
instrument = TestInstrumentProvider.default_fx_ccy("AUD/USD", venue=Venue("SIM"))
trade_tick = TradeTick(
instrument_id=instrument.id,
@@ -318,7 +318,7 @@ def test_catalog_persists_equity(self):
)
# Act
- catalog = DataCatalog.from_env()
+ catalog = ParquetDataCatalog.from_env()
write_objects(catalog=catalog, chunk=[instrument, quote_tick])
instrument_from_catalog = catalog.instruments(
as_nautilus=True,
diff --git a/tests/unit_tests/persistence/test_metadata.py b/tests/unit_tests/persistence/test_metadata.py
index e8f20ce28eec..39bd5df322a8 100644
--- a/tests/unit_tests/persistence/test_metadata.py
+++ b/tests/unit_tests/persistence/test_metadata.py
@@ -17,7 +17,7 @@
from nautilus_trader.backtest.data.providers import TestInstrumentProvider
from nautilus_trader.model.identifiers import Venue
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.persistence.external.core import write_objects
from nautilus_trader.persistence.external.metadata import load_mappings
from tests.test_kit.mocks.data import data_catalog_setup
@@ -27,7 +27,7 @@
class TestPersistenceBatching:
def setup(self):
data_catalog_setup()
- self.catalog = DataCatalog.from_env()
+ self.catalog = ParquetDataCatalog.from_env()
self.fs: fsspec.AbstractFileSystem = self.catalog.fs
def test_metadata_multiple_instruments(self):
diff --git a/tests/unit_tests/persistence/test_streaming.py b/tests/unit_tests/persistence/test_streaming.py
index d7c2eecce36f..5ed72d8c290f 100644
--- a/tests/unit_tests/persistence/test_streaming.py
+++ b/tests/unit_tests/persistence/test_streaming.py
@@ -25,8 +25,8 @@
from nautilus_trader.config import BacktestRunConfig
from nautilus_trader.core.data import Data
from nautilus_trader.model.data.venue import InstrumentStatusUpdate
-from nautilus_trader.persistence.catalog import DataCatalog
-from nautilus_trader.persistence.catalog import resolve_path
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
+from nautilus_trader.persistence.catalog.parquet import resolve_path
from nautilus_trader.persistence.external.core import process_files
from nautilus_trader.persistence.external.readers import CSVReader
from nautilus_trader.persistence.streaming import generate_signal_class
@@ -40,7 +40,7 @@
class TestPersistenceStreaming:
def setup(self):
data_catalog_setup()
- self.catalog = DataCatalog.from_env()
+ self.catalog = ParquetDataCatalog.from_env()
self.fs = self.catalog.fs
self._load_data_into_catalog()
diff --git a/tests/unit_tests/serialization/test_serialization_arrow.py b/tests/unit_tests/serialization/test_serialization_arrow.py
index 1223de35c0ca..16a1f447d7be 100644
--- a/tests/unit_tests/serialization/test_serialization_arrow.py
+++ b/tests/unit_tests/serialization/test_serialization_arrow.py
@@ -38,7 +38,7 @@
from nautilus_trader.model.orderbook.data import OrderBookDeltas
from nautilus_trader.model.orderbook.data import OrderBookSnapshot
from nautilus_trader.model.position import Position
-from nautilus_trader.persistence.catalog import DataCatalog
+from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog
from nautilus_trader.persistence.external.core import write_objects
from nautilus_trader.serialization.arrow.serializer import ParquetSerializer
from tests.test_kit.stubs.data import TestDataStubs
@@ -55,7 +55,7 @@
def _reset():
"""Cleanup resources before each test run"""
os.environ["NAUTILUS_PATH"] = "memory:///.nautilus/"
- catalog = DataCatalog.from_env()
+ catalog = ParquetDataCatalog.from_env()
assert isinstance(catalog.fs, MemoryFileSystem)
try:
catalog.fs.rm("/", recursive=True)
@@ -69,7 +69,7 @@ class TestParquetSerializer:
def setup(self):
# Fixture Setup
_reset()
- self.catalog = DataCatalog(path="/root", fs_protocol="memory")
+ self.catalog = ParquetDataCatalog(path="/root", fs_protocol="memory")
self.order_factory = OrderFactory(
trader_id=TraderId("T-001"),
strategy_id=StrategyId("S-001"),
diff --git a/version.json b/version.json
index 336f292c63c0..86a794fc6f9b 100644
--- a/version.json
+++ b/version.json
@@ -1,6 +1,6 @@
{
"schemaVersion": 1,
"label": "",
- "message": "v1.147.1",
+ "message": "v1.148.0",
"color": "orange"
}
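
Note on the catalog rename running through the hunks above: this release renames DataCatalog to ParquetDataCatalog and moves it from nautilus_trader.persistence.catalog to nautilus_trader.persistence.catalog.parquet. Below is a minimal sketch of how downstream code might update its imports, assuming only the import path and class name changed (which is all the diff shows); the explicit path and protocol values are illustrative, taken from the test setup above.

    # Before (v1.147.x)
    # from nautilus_trader.persistence.catalog import DataCatalog
    # catalog = DataCatalog.from_env()

    # After (v1.148.0)
    from nautilus_trader.persistence.catalog.parquet import ParquetDataCatalog

    # Either construct from the NAUTILUS_PATH environment variable (as the tests do)...
    catalog = ParquetDataCatalog.from_env()

    # ...or point the catalog at an explicit path and filesystem protocol.
    catalog = ParquetDataCatalog(path="/root", fs_protocol="memory")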