MiniRocket (PyTorch): A Very Fast (Almost) Deterministic Transform for Time Series Classification.

This PyTorch implementation of MiniRocket was developed by Malcolm McLean and Ignacio Oguiza.

Dempster, A., Schmidt, D. F., & Webb, G. I. (2020). MINIROCKET: A Very Fast (Almost) Deterministic Transform for Time Series Classification. arXiv preprint arXiv:2012.08791.

Original paper: https://arxiv.org/abs/2012.08791

Original code: https://github.com/angus924/minirocket

class MiniRocketFeatures[source]

MiniRocketFeatures(c_in, seq_len, num_features=10000, max_dilations_per_kernel=32, random_state=None) :: Module

Computes MiniRocket features from a batch of time series of shape (samples, variables, sequence length). This PyTorch implementation was developed by Malcolm McLean and Ignacio Oguiza, based on the original MiniRocket code by Dempster et al. (see the paper citation and links at the top of this page).
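A minimal usage sketch (not part of the original documentation): fit the module's kernel parameters on the training split, then call it like any other nn.Module to obtain the features. The dataset and shapes mirror the examples further below; the tsai.models.MINIROCKET_Pytorch import path and the .float() cast are assumptions.

import torch
from tsai.data.all import *  # provides get_UCR_data
from tsai.models.MINIROCKET_Pytorch import MiniRocketFeatures  # assumed import path

X, y, splits = get_UCR_data('ECGFiveDays', split_data=False)
mrf = MiniRocketFeatures(c_in=X.shape[1], seq_len=X.shape[2])
mrf.fit(torch.from_numpy(X[splits[0]]).float())   # set dilations/biases from the train split only
feats = mrf(torch.from_numpy(X[:8]).float())      # forward pass returns the MiniRocket features
print(feats.shape)                                # ~num_features pooled features per sample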

get_minirocket_features[source]

get_minirocket_features(o, model, chunksize=1024, use_cuda=None, to_np=False)

Computes MiniRocket features for a large dataset by splitting it into chunks, avoiding out-of-memory (OOM) errors.
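Continuing the sketch above (reusing X and the fitted mrf, both assumptions carried over from that sketch), the whole array can be transformed in memory-bounded chunks; chunksize and to_np are the parameters shown in the signature:

X_feat = get_minirocket_features(X, mrf, chunksize=512, to_np=True)  # 512 samples per chunk, NumPy output
print(X_feat.shape, X_feat.dtype)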

class MiniRocketHead[source]

MiniRocketHead(c_in, c_out, seq_len=1, bn=True, fc_dropout=0.0) :: Sequential

A sequential container. Modules will be added to it in the order they are passed in the constructor. Alternatively, an ordered dict of modules can also be passed in.

To make it easier to understand, here is a small example:

import torch.nn as nn
from collections import OrderedDict

# Example of using Sequential
model = nn.Sequential(
          nn.Conv2d(1, 20, 5),
          nn.ReLU(),
          nn.Conv2d(20, 64, 5),
          nn.ReLU()
        )

# Example of using Sequential with OrderedDict
model = nn.Sequential(OrderedDict([
          ('conv1', nn.Conv2d(1, 20, 5)),
          ('relu1', nn.ReLU()),
          ('conv2', nn.Conv2d(20, 64, 5)),
          ('relu2', nn.ReLU())
        ]))
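The example above illustrates plain nn.Sequential; for MiniRocketHead itself, here is a minimal sketch (not from the original docs) that maps precomputed MiniRocket features to class logits. The flat (batch, features) dummy input and the 10,000/2 sizes are illustrative assumptions.

import torch
from tsai.models.MINIROCKET_Pytorch import MiniRocketHead  # assumed import path

head = MiniRocketHead(c_in=10_000, c_out=2, fc_dropout=0.1)  # 10,000 features -> 2 classes
dummy_feats = torch.randn(8, 10_000)                         # batch of 8 precomputed feature vectors
logits = head(dummy_feats)
print(logits.shape)                                          # expected: torch.Size([8, 2])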

class MiniRocket[source]

MiniRocket(c_in, c_out, seq_len, num_features=10000, max_dilations_per_kernel=32, random_state=None, bn=True, fc_dropout=0) :: Sequential

A sequential container. Modules will be added to it in the order they are passed in the constructor. Alternatively, an ordered dict of modules can also be passed in.

import torch
from fastai.torch_core import default_device
from fastai.metrics import accuracy
from fastai.callback.tracker import ReduceLROnPlateau
from tsai.data.all import *
from tsai.learner import *
from tsai.models.MINIROCKET_Pytorch import *

# Offline feature calculation: fit MiniRocketFeatures once, precompute the features
# for the whole dataset, then train a small head (MiniRocketHead) on top of them.
dsid = 'ECGFiveDays'
X, y, splits = get_UCR_data(dsid, split_data=False)
mrf = MiniRocketFeatures(c_in=X.shape[1], seq_len=X.shape[2]).to(default_device())
X_train = torch.from_numpy(X[splits[0]]).to(default_device())
mrf.fit(X_train)                                    # fit kernel dilations/biases on the train split only
X_tfm = get_minirocket_features(X, mrf)             # transform the full dataset in chunks
tfms = [None, TSClassification()]
batch_tfms = TSStandardize(by_var=True)
dls = get_ts_dls(X_tfm, y, splits=splits, tfms=tfms, batch_tfms=batch_tfms, bs=256)
learn = ts_learner(dls, MiniRocketHead, metrics=accuracy)
learn.fit(1, 1e-4, cbs=ReduceLROnPlateau(factor=0.5, min_lr=1e-8, patience=10))
epoch  train_loss  valid_loss  accuracy  time
0      0.693147    0.531333    0.757259  00:01
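As a hedged follow-up (not part of the original output above), predictions for the validation split can be pulled from the trained learner with fastai's standard get_preds; for genuinely new series you would first transform them with get_minirocket_features(new_X, mrf), reusing the already-fitted mrf.

probas, targets = learn.get_preds(ds_idx=1)   # ds_idx=1 selects the validation set of dls
print(probas.shape, targets.shape)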
# Online feature calculation: train MiniRocket end to end on the raw series,
# computing the features on the fly inside the model.
from fastai.torch_core import default_device
from fastai.metrics import accuracy
from tsai.data.all import *
from tsai.learner import *
from tsai.models.MINIROCKET_Pytorch import *

dsid = 'ECGFiveDays'
X, y, splits = get_UCR_data(dsid, split_data=False)
tfms = [None, TSClassification()]
batch_tfms = TSStandardize()
dls = get_ts_dls(X, y, splits=splits, tfms=tfms, batch_tfms=batch_tfms, bs=256)
learn = ts_learner(dls, MiniRocket, metrics=accuracy)
learn.fit_one_cycle(1, 1e-2)
epoch  train_loss  valid_loss  accuracy  time
0      0.693147    0.715753    0.502904  00:18
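A final sketch, assuming the model and transforms pickle cleanly: the trained end-to-end learner can be saved with fastai's standard Learner.export and reloaded with load_learner. The file name is hypothetical.

from fastai.learner import load_learner

learn.export('minirocket_e2e.pkl')                    # hypothetical file name
learn2 = load_learner('minirocket_e2e.pkl', cpu=True)
# the exported learner is saved without its training data; wrap new series in a
# test DataLoader (e.g. learn2.dls.test_dl(...)) before calling learn2.get_preds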