From e433cc6c3664df73702431eddf7b15966d7d1019 Mon Sep 17 00:00:00 2001
From: Guillaume Dalle <22795598+gdalle@users.noreply.github.com>
Date: Fri, 5 Apr 2024 11:01:13 +0200
Subject: [PATCH] Update refs

---
 paper/paper.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/paper/paper.md b/paper/paper.md
index dd851b8f..d5ad5cfb 100644
--- a/paper/paper.md
+++ b/paper/paper.md
@@ -48,8 +48,8 @@ In this industrial use case, the observations were marked temporal point process
 
 Unfortunately, nearly all implementations of HMMs we surveyed (in Julia and Python) expect the observations to be generated by a _predefined set of distributions_, with _no temporal heterogeneity_.
 In Julia, the previous reference package `HMMBase.jl` [@mouchetHMMBaseJlHidden2023] requires compliance with the `Distributions.jl` [@besanconDistributionsJlDefinition2021] interface, which precludes anything not scalar- or array-valued, let alone point processes.
-In Python, the `numpy`-based `hmmlearn` [@hmmlearndevelopersHmmlearnHiddenMarkov2023] and the `PyTorch`-based `pomegranate` [@schreiberPomegranateFastFlexible2018] each offer a catalogue of discrete and continuous distributions, but do not allow for easy extension by the user.
-The more recent `JAX`-based `dynamax` [@changDynamaxStateSpace2024] is the only package adopting an extensible interface with optional controls, similar to ours.
+In Python, the `numpy`-based `hmmlearn` [@hmmlearndevelopersHmmlearnHiddenMarkov2023] and the `PyTorch`-based `pomegranate` [@schreiberPomegranateFastFlexible2018; @schreiberJmschreiPomegranate2024] each offer a catalogue of discrete and continuous distributions, but do not allow for easy extension by the user.
+The more recent `JAX`-based `dynamax` [@changDynamaxStateSpace2024; @murphyProbabilisticMachineLearning2023; @sarkkaBayesianFilteringSmoothing2023] is the only package adopting an extensible interface with optional controls, similar to ours.
 Focusing on Julia specifically, other downsides of `HMMBase.jl` include the lack of support for _multiple observation sequences_, _automatic differentiation_, _sparse transition matrices_ or _number types beyond 64-bit floating point_.
 Two other Julia packages each provide a subset of functionalities that `HMMBase.jl` lacks, namely `HMMGradients.jl` [@antonelloHMMGradientsJlEnables2021] and `MarkovModels.jl` [@ondelGPUAcceleratedForwardBackwardAlgorithm2022], but they are less developed and ill-suited to uninformed users.
 
@@ -60,7 +60,7 @@ Two other Julia packages each provide a subset of functionalities that `HMMBase.
 
 Our package is _generic_.
 Observations can be arbitrary objects, and the associated distributions only need to implement two methods: a loglikelihood `logdensityof(dist, x)` and a sampler `rand(rng, dist)`.
-Number types are not restricted, and automatic differentiation of the sequence loglikelihood [@qinDirectOptimizationApproach2000] is supported both in forward and reverse mode, partly thanks to `ChainRulesCore.jl` [@whiteJuliaDiffChainRulesJl2022].
+Number types are not restricted, and automatic differentiation of the sequence loglikelihood [@qinDirectOptimizationApproach2000] is supported both in forward and reverse mode, partly thanks to `ChainRulesCore.jl` [@whiteJuliaDiffChainRulesJl2022a].
 The extendable `AbstractHMM` interface allows incorporating features such as priors or structured transitions, as well as temporal or control dependency, simply by redefining three methods:
 
 ```julia
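# Aside: a minimal sketch of the two-method contract for observation
# distributions mentioned above. The `Coin` type is hypothetical, chosen only
# to show that observations need not be scalars or arrays; `logdensityof` is
# assumed here to be the DensityInterface.jl function.
using Random: AbstractRNG
import DensityInterface: logdensityof

struct Coin
    p::Float64  # probability of heads
end

# loglikelihood of a single observation x (true = heads)
logdensityof(d::Coin, x::Bool) = x ? log(d.p) : log1p(-d.p)

# sampler drawing a single observation from the distribution
Base.rand(rng::AbstractRNG, d::Coin) = rand(rng) < d.p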