
How can I generate the x_enc, x_mark_enc, x_dec, x_mark_dec arguments? #672

Closed
msasen opened this issue Feb 5, 2025 · 2 comments

msasen commented Feb 5, 2025

I am trying to export the TimesNet model I have trained. However, the forward() function requires four arguments: x_enc, x_mark_enc, x_dec, and x_mark_dec, and I don't understand what these arguments represent or how to construct them.

If you can provide guidance on how to generate these arguments, I will prepare complete documentation on how to export the models found in the Time-Series-Library.

Any pointers on this issue would be greatly appreciated. Thank you.

import torch
from models.TimesNet import Model

class Config:
    def __init__(self):
        self.task_name = "long_term_forecast"
        self.is_training = 1
        self.model_id = "test"  # Updated from ETTh1_96_96
        self.model = "Autoformer"  # Updated from TimesNet
        self.data = "ETTh1"
        self.root_path = "./data/ETT/"  # Updated from ./dataset/ETT-small/
        self.data_path = "ETTh1.csv"
        self.features = "M"
        self.target = "OT"
        self.freq = "h"
        self.checkpoints = "./checkpoints/"


        # Forecasting task
        self.seq_len = 96
        self.label_len = 48
        self.pred_len = 96
        self.seasonal_patterns = "Monthly"
        self.inverse = False

        # Imputation task
        self.mask_rate = 0.25

        # Anomaly detection task
        self.anomaly_ratio = 0.25


        self.expand = 2
        self.d_conv = 4
        self.top_k = 5
        self.num_kernels = 6
        self.enc_in = 7
        self.dec_in = 7
        self.c_out = 7
        self.d_model = 512
        self.n_heads = 8
        self.e_layers = 2
        self.d_layers = 1
        self.d_ff = 2048
        self.moving_avg = 25
        self.factor = 1

        self.seg_len = 96

        # Optimization
        self.num_workers = 10
        self.itr = 1
        self.train_epochs = 10
        self.batch_size = 32
        self.patience = 3
        self.learning_rate = 0.0001
        self.des = "test"
        self.loss = "MSE"
        self.lradj = "type1"
        self.use_amp = False


        self.use_gpu = True
        self.gpu = 0
        self.gpu_type = "cuda"
        self.use_multi_gpu = False
        self.devices = "0,1,2,3"

        # De-stationary projector parameters
        self.p_hidden_dims = [128, 128]
        self.p_hidden_layers = 2

        # Metrics (DTW)
        self.use_dtw = False

        # Augmentation
        self.augmentation_ratio = 0
        self.seed = 2
        self.jitter = False
        self.scaling = False
        self.permutation = False
        self.randompermutation = False

        self.magwarp = False
        self.timewarp = False
        self.windowslice = False
        self.windowwarp = False
        self.rotation = False
        self.spawner = False
        self.dtwwarp = False
        self.shapedtwwarp = False
        self.wdba = False
        self.discdtw = False
        self.discsdtw = False
        self.extra_tag = ""
        self.dropout = 0.1
        # TimeXer
        self.patch_len = 16


        # Final values matching the trained TimesNet checkpoint
        # (these override the defaults assigned above)
        self.task_name = "long_term_forecast"
        self.is_training = 1
        self.root_path = "./dataset/ETT-small/"
        self.data_path = "ETTh1.csv"
        self.model_id = "ETTh1_96_96"
        self.model = "TimesNet"
        self.data = "ETTh1"
        self.features = "M"
        self.seq_len = 96
        self.label_len = 48
        self.pred_len = 96
        self.e_layers = 2
        self.d_layers = 1
        self.factor = 3
        self.enc_in = 7
        self.dec_in = 7
        self.c_out = 7
        self.d_model = 16
        self.d_ff = 32
        self.des = "Exp"
        self.itr = 1
        self.top_k = 5
        self.num_kernels = 6
        self.embed = "timeF"
        self.freq = "h"
        #self.activation = "relu"
        #self.output_attention = True
        #self.do_predict = True
configs = Config()

checkpoint = torch.load('checkpoints/long_term_forecast_ETTh1_96_96_TimesNet_ETTh1_ftM_sl96_ll48_pl96_dm16_nh8_el2_dl1_df32_expand2_dc4_fc3_ebtimeF_dtTrue_Exp_0/checkpoint.pth')

model = Model(configs)
model.load_state_dict(checkpoint)
model.eval()
with torch.no_grad():
    predictions = model(...)

print("Predictions:", predictions)

TypeError: forward() missing 4 required positional arguments: 'x_enc', 'x_mark_enc', 'x_dec', and 'x_mark_dec'
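For reference, a minimal sketch of the input shapes the repository's long-term-forecasting training loop constructs before calling `model(x_enc, x_mark_enc, x_dec, x_mark_dec)`. The random values here are placeholders; in practice the data loader supplies the series values and time-stamp features. The assumption that `freq="h"` with `embed="timeF"` yields 4 time features (hour, weekday, day-of-month, day-of-year) follows the Informer-family convention:

```python
import torch

batch_size = 1
seq_len, label_len, pred_len = 96, 48, 96
enc_in, dec_in = 7, 7
n_time_feats = 4  # assumed for embed="timeF" with freq="h"

# x_enc: the encoder input window of raw series values, [B, seq_len, enc_in]
x_enc = torch.randn(batch_size, seq_len, enc_in)
# x_mark_enc: time-stamp features for the encoder window, [B, seq_len, n_time_feats]
x_mark_enc = torch.randn(batch_size, seq_len, n_time_feats)

# x_dec: the last label_len known values followed by zeros as
# placeholders for the pred_len forecast horizon, [B, label_len + pred_len, dec_in]
x_dec = torch.cat(
    [torch.randn(batch_size, label_len, dec_in),
     torch.zeros(batch_size, pred_len, dec_in)], dim=1)
# x_mark_dec: time features covering the label_len + pred_len steps
x_mark_dec = torch.randn(batch_size, label_len + pred_len, n_time_feats)

# with torch.no_grad():
#     out = model(x_enc, x_mark_enc, x_dec, x_mark_dec)  # [B, pred_len, c_out]
print(x_enc.shape, x_dec.shape, x_mark_dec.shape)
```

With these four tensors in place, the `TypeError` above goes away; the real values should come from the same dataset/loader used during training so the scaling and time encoding match.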

wuhaixu2016 (Member) commented:
Hi, hope your issue has been resolved.

Also, a pull request adding code for exporting well-trained models would be very appreciated.
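Once the four dummy inputs exist, exporting becomes mechanical. A minimal sketch using `torch.jit.trace`: `TinyStandIn` below is a hypothetical placeholder with the same four-argument `forward()` signature; in real use you would trace the loaded `Model(configs)` instead:

```python
import torch
import torch.nn as nn

class TinyStandIn(nn.Module):
    """Hypothetical stand-in mimicking TimesNet's forward() signature."""
    def __init__(self, c_out=7, pred_len=96):
        super().__init__()
        self.pred_len = pred_len
        self.proj = nn.Linear(7, c_out)

    def forward(self, x_enc, x_mark_enc, x_dec, x_mark_dec):
        # The real model returns [B, pred_len, c_out]; mimic that shape
        # by projecting the last pred_len decoder steps.
        return self.proj(x_dec[:, -self.pred_len:, :])

model = TinyStandIn().eval()
dummy = (torch.randn(1, 96, 7),    # x_enc
         torch.randn(1, 96, 4),    # x_mark_enc
         torch.randn(1, 144, 7),   # x_dec (label_len + pred_len)
         torch.randn(1, 144, 4))   # x_mark_dec

# Trace with the dummy inputs; the traced module can be saved and reloaded
# without the original Python class definition.
traced = torch.jit.trace(model, dummy)
out = traced(*dummy)
print(out.shape)
```

The same dummy tuple works as the `args` argument of `torch.onnx.export` if an ONNX artifact is preferred over TorchScript.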
