How to get the parameters of a fitted model in sklearn
Published: 2021-05-06

import time
import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import GridSearchCV
from sklearn.kernel_ridge import KernelRidge


def gener(sta, end, num):
    # Generate a noisy test curve: y = sin(pi*x)/(pi*x) + 0.1*x + Gaussian noise
    sampleNo = num
    mu = 0.01       # noise mean
    sigma = 0.2     # noise standard deviation
    np.random.seed(0)
    s = np.random.normal(mu, sigma, sampleNo)
    X = np.linspace(sta, end, num)  # evenly spaced points in [sta, end]
    pix = np.pi * X
    Y = np.sin(pix) / pix + 0.1 * X + s
    x = X.reshape(-1, 1)
    y = Y
    print(y.shape)
    return x, y, num


X, y, num = gener(-3, 3, 50)
X_plot, y_plot, NUM = gener(-3, 3, 1000)

# #############################################################################
# Fit regression model
train_size = 40
kr = GridSearchCV(KernelRidge(kernel='rbf', gamma=0.1), cv=5,
                  param_grid={"alpha": [1e0, 0.1, 1e-2, 1e-3],
                              "gamma": np.logspace(-2, 2, 5)})
t0 = time.time()
kr.fit(X[:train_size], y[:train_size])
kr_fit = time.time() - t0
print("KRR complexity and bandwidth selected and model fitted in %.3f s"
      % kr_fit)

t0 = time.time()
y_kr = kr.predict(X_plot)
kr_predict = time.time() - t0
print("KRR prediction for %d inputs in %.3f s"
      % (X_plot.shape[0], kr_predict))

# #############################################################################
# Look at the results
plt.scatter(X[:100], y[:100], c='k', label='data', zorder=1,
            edgecolors=(0, 0, 0))
plt.plot(X_plot, y_kr, c='g',
         label='KRR (fit: %.3fs, predict: %.3fs)' % (kr_fit, kr_predict))
plt.xlabel('data')
plt.ylabel('target')
plt.legend()
plt.show()

# Inspect what the fitted model exposes
print(dir(kr))
print(dir(kr.best_estimator_))
print(kr.best_estimator_.get_params())   # hyper-parameters chosen by the grid search
coef = kr.best_estimator_.dual_coef_     # fitted dual coefficients, one per training sample


def ridge(X, x, NUM, num, gamma):
    # Rebuild the RBF (Gaussian) kernel matrix by hand:
    # K_ij = exp(-gamma * (X_i - x_j)^2)
    x2 = x * x   # square each test input
    X2 = X * X   # square each training input... here X is the plotting grid, x the training set
    temp = np.tile(X2, num) + np.tile(x2.T, (NUM, 1)) - 2 * X @ x.T  # squared distances
    return np.exp(-gamma * temp)


# Reconstruct the prediction manually from the dual coefficients:
# y_hat = K(X_plot, X_train) @ dual_coef_
gamma = kr.best_estimator_.gamma   # use the gamma selected by the grid search
k = ridge(X_plot, X[:train_size], NUM, train_size, gamma)
y_manual = k @ coef
plt.plot(X_plot, y_manual, label='coef')
plt.legend()
plt.show()
print(coef)
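
One way to double-check the manual reconstruction is to build the same kernel matrix with scikit-learn's own rbf_kernel and compare against kr.predict. A minimal sketch, assuming the variables from the script above (kr, X, X_plot, train_size):

from sklearn.metrics.pairwise import rbf_kernel

best = kr.best_estimator_
K = rbf_kernel(X_plot, X[:train_size], gamma=best.gamma)  # kernel between plot grid and training points
y_check = K @ best.dual_coef_                             # KRR prediction: K(test, train) @ dual_coef_
print(np.allclose(y_check, kr.predict(X_plot)))           # should print True

If the two disagree, the usual culprit is building the kernel with a different gamma than the one the grid search actually selected.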

KRR is used here, so there is only a single set of (dual) coefficients to pull out; other, more complex models are not covered in detail here.

kr.best_estimator_.dual_coef_
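
The same convention applies to other scikit-learn estimators: get_params() returns the hyper-parameters that were passed to the constructor, while the values learned during fit() live in attributes whose names end with a trailing underscore (coef_, intercept_, dual_coef_, and so on). A short sketch with a plain linear Ridge model, reusing the training data from above (the variable name lin is just for illustration):

from sklearn.linear_model import Ridge

lin = Ridge(alpha=1.0).fit(X[:train_size], y[:train_size])
print(lin.get_params())           # hyper-parameters, e.g. {'alpha': 1.0, ...}
print(lin.coef_, lin.intercept_)  # learned weights and bias (trailing-underscore attributes)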
