
Hands-On Time-Series Forecasting (19): A Modified Informer Model for Rolling Long-Term Prediction (research edition, with result visualization)

Category: Artificial Intelligence · 1.01 MB · 30 · Points required: 1

Resource description:

In a previous article we already covered the Informer model. The official prediction code, however, is quite rudimentary: it can only take a fixed-length input and predict a fixed horizon into the future, which is often not enough when you want to publish a paper. So, on top of the official code, I added a rolling long-term prediction feature. It works as follows: the model first predicts the next 24 time steps, then those 24 predicted values are fed back into the model to predict the following 24 steps (the back-filling is automatic; no manual work is required). This is very practical: it lets us evaluate predictions over a fixed horizon accurately, and in real deployments you can combine it with automatic data collection to put the model to practical use. The modifications in this article are entirely my own work. Writing it was not easy, so if it helps you, please like the article and follow this column (free to read); the column keeps reproducing methods from top conferences and can help whether you are aiming for a top venue or other papers.

Time-series forecasting is a key ingredient in many domains. In these scenarios we can exploit large amounts of historical time-series data for long-horizon prediction, i.e. long sequence time-series forecasting (LSTF). Most existing methods, however, are designed for short-term problems such as predicting 48 points or fewer. As the sequence length grows, a model's predictive ability is challenged; for example, once the prediction horizon exceeds 48 points, the forecasts of LSTM networks …
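The rolling idea described above (predict 24 steps, append the predictions to the window, predict again) can be sketched in a few lines. This is a minimal illustration, not the author's exact code: the `model` callable, `seq_len`, and `rounds` are assumptions for demonstration only.

```python
import numpy as np

def rolling_forecast(model, history, pred_len=24, rounds=3, seq_len=96):
    """Rolling long-term prediction sketch.

    history: 1-D array of observed values.
    model:   any callable mapping a window of seq_len values -> pred_len values.
    """
    window = list(history[-seq_len:])
    forecasts = []
    for _ in range(rounds):
        # predict the next pred_len steps from the most recent window
        pred = model(np.array(window[-seq_len:]))
        forecasts.extend(pred)
        # automatically "fill in" the predictions as if they were observations
        window.extend(pred)
    return np.array(forecasts)

# toy stand-in model: naive persistence of the last window value
naive = lambda w: np.repeat(w[-1], 24)
out = rolling_forecast(naive, np.arange(100.0))
print(out.shape)  # (72,) -- three rolling rounds of 24 steps each
```

With a real Informer checkpoint, `model` would wrap the network's forward pass; the loop structure stays the same.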
import torch
import numpy as np
import torch.nn as nn
import torch.nn.functional as F
from torch import Tensor
from typing import List, Tuple
import math
from functools import partial
from torch import nn, einsum, diagonal
from math import log2, ceil
import pdb
from sympy import Poly, legendre, Symbol, chebyshevt
from scipy.special import eval_legendre


def legendreDer(k, x):
    def _legendre(k, x):
        return (2 * k + 1) * eval_legendre(k, x)

    out = 0
    for i in np.arange(k - 1, -1, -2):
        out += _legendre(i, x)
    return out


def phi_(phi_c, x, lb=0, ub=1):
    mask = np.logical_or(x < lb, x > ub) * 1.0
    return np.polynomial.polynomial.Polynomial(phi_c)(x) * (1 - mask)


def get_phi_psi(k, base):
    x = Symbol('x')
    phi_coeff = np.zeros((k, k))
    phi_2x_coeff = np.zeros((k, k))
    if base == 'legendre':
        for ki in range(k):
            coeff_ = Poly(legendre(ki, 2 * x - 1), x).all_coeffs()
            phi_coeff[ki, :ki + 1] = np.flip(np.sqrt(2 * ki + 1) * np.array(coeff_).astype(np.float64))
            coeff_ = Poly(legendre(ki, 4 * x - 1), x).all_coeffs()
            phi_2x_coeff[ki, :ki + 1] = np.flip(np.sqrt(2) * np.sqrt(2 * ki + 1) * np.array(coeff_).astype(np.float64))

        psi1_coeff = np.zeros((k, k))
        psi2_coeff = np.zeros((k, k))
        for ki in range(k):
            psi1_coeff[ki, :] = phi_2x_coeff[ki, :]
            for i in range(k):
                a = phi_2x_coeff[ki, :ki + 1]
                b = phi_coeff[i, :i + 1]
                prod_ = np.convolve(a, b)
                prod_[np.abs(prod_) < 1e-8] = 0
                proj_ = (prod_ * 1 / (np.arange(len(prod_)) + 1) * np.power(0.5, 1 + np.arange(len(prod_)))).sum()
                psi1_coeff[ki, :] -= proj_ * phi_coeff[i, :]
                psi2_coeff[ki, :] -= proj_ * phi_coeff[i, :]
            for j in range(ki):
                a = phi_2x_coeff[ki, :ki + 1]
                b = psi1_coeff[j, :]
                prod_ = np.convolve(a, b)
                prod_[np.abs(prod_) < 1e-8] = 0
                proj_ = (prod_ * 1 / (np.arange(len(prod_)) + 1) * np.power(0.5, 1 + np.arange(len(prod_)))).sum()
                psi1_coeff[ki, :] -= proj_ * psi1_coeff[j, :]
                psi2_coeff[ki, :] -= proj_ * psi2_coeff[j, :]

            a = psi1_coeff[ki, :]
            prod_ = np.convolve(a, a)
            prod_[np.abs(prod_) < 1e-8] = 0
            norm1 = (prod_ * 1 / (np.arange(len(prod_)) + 1) * np.power(0.5, 1 + np.arange(len(prod_)))).sum()

            a = psi2_coeff[ki, :]
            prod_ = np.convolve(a, a)
            prod_[np.abs(prod_) < 1e-8] = 0
            norm2 = (prod_ * 1 / (np.arange(len(prod_)) + 1) * (1 - np.power(0.5, 1 + np.arange(len(prod_))))).sum()

            norm_ = np.sqrt(norm1 + norm2)
            psi1_coeff[ki, :] /= norm_
            psi2_coeff[ki, :] /= norm_
            psi1_coeff[np.abs(psi1_coeff) < 1e-8] = 0
            psi2_coeff[np.abs(psi2_coeff) < 1e-8] = 0

        phi = [np.poly1d(np.flip(phi_coeff[i, :])) for i in range(k)]
        psi1 = [np.poly1d(np.flip(psi1_coeff[i, :])) for i in range(k)]
        psi2 = [np.poly1d(np.flip(psi2_coeff[i, :])) for i in range(k)]

    elif base == 'chebyshev':
        for ki in range(k):
            if ki == 0:
                phi_coeff[ki, :ki + 1] = np.sqrt(2 / np.pi)
                phi_2x_coeff[ki, :ki + 1] = np.sqrt(2 / np.pi) * np.sqrt(2)
            else:
                coeff_ = Poly(chebyshevt(ki, 2 * x - 1), x).all_coeffs()
                phi_coeff[ki, :ki + 1] = np.flip(2 / np.sqrt(np.pi) * np.array(coeff_).astype(np.float64))
                coeff_ = Poly(chebyshevt(ki, 4 * x - 1), x).all_coeffs()
                phi_2x_coeff[ki, :ki + 1] = np.flip(
                    np.sqrt(2) * 2 / np.sqrt(np.pi) * np.array(coeff_).astype(np.float64))

        phi = [partial(phi_, phi_coeff[i, :]) for i in range(k)]

        x = Symbol('x')
        kUse = 2 * k
        roots = Poly(chebyshevt(kUse, 2 * x - 1)).all_roots()
        x_m = np.array([rt.evalf(20) for rt in roots]).astype(np.float64)
        # x_m[x_m==0.5] = 0.5 + 1e-8  # add small noise to avoid the case of 0.5 belonging to both phi(2x) and phi(2x-1)
        # not needed for our purpose here, we use even k always to avoid
        wm = np.pi / kUse / 2

        psi1_coeff = np.zeros((k, k))
        psi2_coeff = np.zeros((k, k))

        psi1 = [[] for _ in range(k)]
        psi2 = [[] for _ in range(k)]

        for ki in range(k):
            psi1_coeff[ki, :] = phi_2x_coeff[ki, :]
            for i in range(k):
                proj_ = (wm * phi[i](x_m) * np.sqrt(2) * phi[ki](2 * x_m)).sum()
                psi1_coeff[ki, :] -= proj_ * phi_coeff[i, :]
                psi2_coeff[ki, :] -= proj_ * phi_coeff[i, :]

            for j in range(ki):
                proj_ = (wm * psi1[j](x_m) * np.sqrt(2) * phi[ki](2 * x_m)).sum()
                psi1_coeff[ki, :] -= proj_ * psi1_coeff[j, :]
                psi2_coeff[ki, :] -= proj_ * psi2_coeff[j, :]

            psi1[ki] = partial(phi_, psi1_coeff[ki, :], lb=0, ub=0.5)
            psi2[ki] = partial(phi_, psi2_coeff[ki, :], lb=0.5, ub=1)

            norm1 = (wm * psi1[ki](x_m) * psi1[ki](x_m)).sum()
            norm2 = (wm * psi2[ki](x_m) * psi2[ki](x_m)).sum()

            norm_ = np.sqrt(norm1 + norm2)
            psi1_coeff[ki, :] /= norm_
            psi2_coeff[ki, :] /= norm_
            psi1_coeff[np.abs(psi1_coeff) < 1e-8] = 0
            psi2_coeff[np.abs(psi2_coeff) < 1e-8] = 0

            psi1[ki] = partial(phi_, psi1_coeff[ki, :], lb=0, ub=0.5 + 1e-16)
            psi2[ki] = partial(phi_, psi2_coeff[ki, :], lb=0.5 + 1e-16, ub=1)

    return phi, psi1, psi2


def get_filter(base, k):
    def psi(psi1, psi2, i, inp):
        mask = (inp <= 0.5) * 1.0
        return psi1[i](inp) * mask + psi2[i](inp) * (1 - mask)

    if base not in ['legendre', 'chebyshev']:
        raise Exception('Base not supported')

    x = Symbol('x')
    H0 = np.zeros((k, k))
    H1 = np.zeros((k, k))
    G0 = np.zeros((k, k))
    G1 = np.zeros((k, k))
    PHI0 = np.zeros((k, k))
    PHI1 = np.zeros((k, k))
    phi, psi1, psi2 = get_phi_psi(k, base)
    if base == 'legendre':
        roots = Poly(legendre(k, 2 * x - 1)).all_roots()
        x_m = np.array([rt.evalf(20) for rt in roots]).astype(np.float64)
        wm = 1 / k / legendreDer(k, 2 * x_m - 1) / eval_legendre(k - 1, 2 * x_m - 1)

        for ki in range(k):
            for kpi in range(k):
                H0[ki, kpi] = 1 / np.sqrt(2) * (wm * phi[ki](x_m / 2) * phi[kpi](x_m)).sum()
                G0[ki, kpi] = 1 / np.sqrt(2) * (wm * psi(psi1, psi2, ki, x_m / 2) * phi[kpi](x_m)).sum()
                H1[ki, kpi] = 1 / np.sqrt(2) * (wm * phi[ki]((x_m + 1) / 2) * phi[kpi](x_m)).sum()
                G1[ki, kpi] = 1 / np.sqrt(2) * (wm * psi(psi1, psi2, ki, (x_m + 1) / 2) * phi[kpi](x_m)).sum()

        PHI0 = np.eye(k)
        PHI1 = np.eye(k)

    elif base == 'chebyshev':
        x = Symbol('x')
        kUse = 2 * k
        roots = Poly(chebyshevt(kUse, 2 * x - 1)).all_roots()
        x_m = np.array([rt.evalf(20) for rt in roots]).astype(np.float64)
        # x_m[x_m==0.5] = 0.5 + 1e-8  # add small noise to avoid the case of 0.5 belonging to both phi(2x) and phi(2x-1)
        # not needed for our purpose here, we use even k always to avoid
        wm = np.pi / kUse / 2

        for ki in range(k):
            for kpi in range(k):
                H0[ki, kpi] = 1 / np.sqrt(2) * (wm * phi[ki](x_m / 2) * phi[kpi](x_m)).sum()
                G0[ki, kpi] = 1 / np.sqrt(2) * (wm * psi(psi1, psi2, ki, x_m / 2) * phi[kpi](x_m)).sum()
                H1[ki, kpi] = 1 / np.sqrt(2) * (wm * phi[ki]((x_m + 1) / 2) * phi[kpi](x_m)).sum()
                G1[ki, kpi] = 1 / np.sqrt(2) * (wm * psi(psi1, psi2, ki, (x_m + 1) / 2) * phi[kpi](x_m)).sum()

                PHI0[ki, kpi] = (wm * phi[ki](2 * x_m) * phi[kpi](2 * x_m)).sum() * 2
                PHI1[ki, kpi] = (wm * phi[ki](2 * x_m - 1) * phi[kpi](2 * x_m - 1)).sum()
                # (the snippet is truncated here in the original resource description)

Resource file list:

model.zip contains approximately 147 files
  1. model/.idea/
  2. model/.idea/.gitignore 184B
  3. model/.idea/inspectionProfiles/
  4. model/.idea/inspectionProfiles/profiles_settings.xml 174B
  5. model/.idea/inspectionProfiles/Project_Default.xml 886B
  6. model/.idea/misc.xml 288B
  7. model/.idea/model.iml 488B
  8. model/.idea/modules.xml 269B
  9. model/.idea/workspace.xml 6.63KB
  10. model/__pycache__/
  11. model/checkpoints/
  12. model/data/
  13. model/data/.ipynb_checkpoints/
  14. model/data/.ipynb_checkpoints/data_loader-checkpoint.py 13.52KB
  15. model/data/.ipynb_checkpoints/MSST2trainData-checkpoint.csv 455.11KB
  16. model/data/.ipynb_checkpoints/T1testData-checkpoint.csv 189.15KB
  17. model/data/.ipynb_checkpoints/T1trainData-checkpoint.csv 560.76KB
  18. model/data/__init__.py 1B
  19. model/data/__pycache__/
  20. model/data/__pycache__/__init__.cpython-38.pyc 115B
  21. model/data/__pycache__/__init__.cpython-39.pyc 152B
  22. model/data/__pycache__/data_loader.cpython-38.pyc 8.88KB
  23. model/data/__pycache__/data_loader.cpython-39.pyc 8.82KB
  24. model/data/data_loader.py 13.47KB
  25. model/environment.yml 198B
  26. model/ETTh1.csv 2.47MB
  27. model/ETTh1-Test.csv 38.37KB
  28. model/exp/
  29. model/exp/.ipynb_checkpoints/
  30. model/exp/.ipynb_checkpoints/exp_informer (4)-checkpoint.py 15.96KB
  31. model/exp/.ipynb_checkpoints/exp_informer-checkpoint.py 15.98KB
  32. model/exp/__init__.py
  33. model/exp/__pycache__/
  34. model/exp/__pycache__/__init__.cpython-38.pyc 114B
  35. model/exp/__pycache__/__init__.cpython-39.pyc 151B
  36. model/exp/__pycache__/exp_basic.cpython-38.pyc 1.5KB
  37. model/exp/__pycache__/exp_basic.cpython-39.pyc 1.54KB
  38. model/exp/__pycache__/exp_informer.cpython-38.pyc 8.7KB
  39. model/exp/__pycache__/exp_informer.cpython-39.pyc 10.01KB
  40. model/exp/exp_basic.py 875B
  41. model/exp/exp_informer.py 15.3KB
  42. model/layers/
  43. model/layers/__init__.py
  44. model/layers/__pycache__/
  45. model/layers/__pycache__/__init__.cpython-39.pyc 154B
  46. model/layers/__pycache__/AutoCorrelation.cpython-39.pyc 5.36KB
  47. model/layers/__pycache__/Autoformer_EncDec.cpython-39.pyc 6.85KB
  48. model/layers/__pycache__/Conv_Blocks.cpython-39.pyc 2.41KB
  49. model/layers/__pycache__/Crossformer_EncDec.cpython-39.pyc 4.38KB
  50. model/layers/__pycache__/Embed.cpython-39.pyc 7.26KB
  51. model/layers/__pycache__/Embedding.cpython-39.pyc 6.5KB
  52. model/layers/__pycache__/ETSformer_EncDec.cpython-39.pyc 11.86KB
  53. model/layers/__pycache__/FourierCorrelation.cpython-39.pyc 4.89KB
  54. model/layers/__pycache__/Invertible.cpython-39.pyc 3.68KB
  55. model/layers/__pycache__/MultiWaveletCorrelation.cpython-39.pyc 18.06KB
  56. model/layers/__pycache__/Projection.cpython-39.pyc 1.19KB
  57. model/layers/__pycache__/Pyraformer_EncDec.cpython-39.pyc 6.66KB
  58. model/layers/__pycache__/SelfAttention_Family.cpython-39.pyc 8.88KB
  59. model/layers/__pycache__/Transformer_EncDec.cpython-39.pyc 4.48KB
  60. model/layers/__pycache__/TransformerBlocks.cpython-39.pyc 5.26KB
  61. model/layers/AutoCorrelation.py 6.29KB
  62. model/layers/Autoformer_EncDec.py 6.67KB
  63. model/layers/Conv_Blocks.py 2.31KB
  64. model/layers/Crossformer_EncDec.py 4.23KB
  65. model/layers/Embed.py 6.85KB
  66. model/layers/Embedding.py 4.83KB
  67. model/layers/ETSformer_EncDec.py 11.13KB
  68. model/layers/FourierCorrelation.py 7.17KB
  69. model/layers/Invertible.py 3.22KB
  70. model/layers/MultiWaveletCorrelation.py 22.5KB
  71. model/layers/Projection.py 745B
  72. model/layers/Pyraformer_EncDec.py 7.26KB
  73. model/layers/SelfAttention_Family.py 11.78KB
  74. model/layers/Transformer_EncDec.py 4.81KB
  75. model/layers/TransformerBlocks.py 5.2KB
  76. model/main_informer.py 7.62KB
  77. model/models/
  78. model/models/__init__.py
  79. model/models/__pycache__/
  80. model/models/__pycache__/__init__.cpython-39.pyc 154B
  81. model/models/__pycache__/attn.cpython-38.pyc 5KB
  82. model/models/__pycache__/attn.cpython-39.pyc 5.02KB
  83. model/models/__pycache__/Autoformer.cpython-39.pyc 4.3KB
  84. model/models/__pycache__/Crossformer.cpython-39.pyc 4.46KB
  85. model/models/__pycache__/decoder.cpython-38.pyc 1.93KB
  86. model/models/__pycache__/decoder.cpython-39.pyc 1.95KB
  87. model/models/__pycache__/DLinear.cpython-39.pyc 3.21KB
  88. model/models/__pycache__/embed.cpython-38.pyc 5.02KB
  89. model/models/__pycache__/embed.cpython-39.pyc 5.04KB
  90. model/models/__pycache__/encoder.cpython-38.pyc 3.46KB
  91. model/models/__pycache__/encoder.cpython-39.pyc 3.49KB
  92. model/models/__pycache__/ETSformer.cpython-39.pyc 3.63KB
  93. model/models/__pycache__/FEDformer.cpython-39.pyc 5.04KB
  94. model/models/__pycache__/FiLM.cpython-39.pyc 8.95KB
  95. model/models/__pycache__/Informer.cpython-39.pyc 5.67KB
  96. model/models/__pycache__/iTransformer.cpython-39.pyc 3.96KB
  97. model/models/__pycache__/LightTS.cpython-39.pyc 4.23KB
  98. model/models/__pycache__/MICN.cpython-39.pyc 7KB
  99. model/models/__pycache__/model.cpython-38.pyc 4.81KB
  100. model/models/__pycache__/model.cpython-39.pyc 4.65KB
  101. model/models/__pycache__/Nonstationary_Transformer.cpython-39.pyc 6.11KB
  102. model/models/__pycache__/PatchTST.cpython-39.pyc 5.21KB
  103. model/models/__pycache__/Pyraformer.cpython-39.pyc 3.17KB
  104. model/models/__pycache__/Reformer.cpython-39.pyc 3.75KB
  105. model/models/__pycache__/TiDE.cpython-39.pyc 5.44KB
  106. model/models/__pycache__/TimesNet.cpython-39.pyc 5.65KB
  107. model/models/__pycache__/Transformer.cpython-39.pyc 3.75KB
  108. model/models/attn.py 6.03KB
  109. model/models/Autoformer.py 6.7KB
  110. model/models/Crossformer.py 6.21KB
  111. model/models/decoder.py 1.73KB
  112. model/models/DLinear.py 4.48KB
  113. model/models/embed.py 4.04KB
  114. model/models/encoder.py 3.47KB
  115. model/models/ETSformer.py 4.49KB
  116. model/models/FEDformer.py 8.25KB
  117. model/models/FiLM.py 11.39KB
  118. model/models/Informer.py 7.27KB
  119. model/models/iTransformer.py 5.67KB
  120. model/models/LightTS.py 5.21KB
  121. model/models/MICN.py 9.67KB
  122. model/models/model.py 6.97KB
  123. model/models/Nonstationary_Transformer.py 9.57KB
  124. model/models/PatchTST.py 8.54KB
  125. model/models/Pyraformer.py 4.12KB
  126. model/models/Reformer.py 4.97KB
  127. model/models/TiDE.py 6.86KB
  128. model/models/TimesNet.py 8.48KB
  129. model/models/Transformer.py 5.5KB
  130. model/myplot.png 59.13KB
  131. model/utils/
  132. model/utils/__init__.py
  133. model/utils/__pycache__/
  134. model/utils/__pycache__/__init__.cpython-38.pyc 116B
  135. model/utils/__pycache__/__init__.cpython-39.pyc 153B
  136. model/utils/__pycache__/masking.cpython-38.pyc 1.38KB
  137. model/utils/__pycache__/masking.cpython-39.pyc 1.43KB
  138. model/utils/__pycache__/metrics.cpython-38.pyc 1.38KB
  139. model/utils/__pycache__/metrics.cpython-39.pyc 1.42KB
  140. model/utils/__pycache__/timefeatures.cpython-38.pyc 7.11KB
  141. model/utils/__pycache__/timefeatures.cpython-39.pyc 7.16KB
  142. model/utils/__pycache__/tools.cpython-38.pyc 3.18KB
  143. model/utils/__pycache__/tools.cpython-39.pyc 3.21KB
  144. model/utils/masking.py 851B
  145. model/utils/metrics.py 826B
  146. model/utils/timefeatures.py 5.43KB
  147. model/utils/tools.py 2.76KB