applied-ai-018 committed
Commit af78e4f · verified · 1 parent: d42a351

Add files using upload-large-folder tool

This view is limited to 50 files because it contains too many changes.
Files changed (50):
  1. ckpts/universal/global_step40/zero/23.post_attention_layernorm.weight/exp_avg.pt +3 -0
  2. ckpts/universal/global_step40/zero/23.post_attention_layernorm.weight/exp_avg_sq.pt +3 -0
  3. ckpts/universal/global_step40/zero/23.post_attention_layernorm.weight/fp32.pt +3 -0
  4. ckpts/universal/global_step40/zero/25.post_attention_layernorm.weight/exp_avg.pt +3 -0
  5. venv/lib/python3.10/site-packages/sklearn/_loss/link.py +280 -0
  6. venv/lib/python3.10/site-packages/sklearn/cross_decomposition/__init__.py +3 -0
  7. venv/lib/python3.10/site-packages/sklearn/cross_decomposition/__pycache__/__init__.cpython-310.pyc +0 -0
  8. venv/lib/python3.10/site-packages/sklearn/cross_decomposition/__pycache__/_pls.cpython-310.pyc +0 -0
  9. venv/lib/python3.10/site-packages/sklearn/cross_decomposition/_pls.py +1083 -0
  10. venv/lib/python3.10/site-packages/sklearn/cross_decomposition/tests/__init__.py +0 -0
  11. venv/lib/python3.10/site-packages/sklearn/cross_decomposition/tests/__pycache__/__init__.cpython-310.pyc +0 -0
  12. venv/lib/python3.10/site-packages/sklearn/cross_decomposition/tests/__pycache__/test_pls.cpython-310.pyc +0 -0
  13. venv/lib/python3.10/site-packages/sklearn/cross_decomposition/tests/test_pls.py +646 -0
  14. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/__init__.cpython-310.pyc +0 -0
  15. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_arff_parser.cpython-310.pyc +0 -0
  16. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_base.cpython-310.pyc +0 -0
  17. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_california_housing.cpython-310.pyc +0 -0
  18. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_covtype.cpython-310.pyc +0 -0
  19. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_kddcup99.cpython-310.pyc +0 -0
  20. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_lfw.cpython-310.pyc +0 -0
  21. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_olivetti_faces.cpython-310.pyc +0 -0
  22. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_openml.cpython-310.pyc +0 -0
  23. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_rcv1.cpython-310.pyc +0 -0
  24. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_samples_generator.cpython-310.pyc +0 -0
  25. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_species_distributions.cpython-310.pyc +0 -0
  26. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_svmlight_format_io.cpython-310.pyc +0 -0
  27. venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_twenty_newsgroups.cpython-310.pyc +0 -0
  28. venv/lib/python3.10/site-packages/sklearn/datasets/data/__init__.py +0 -0
  29. venv/lib/python3.10/site-packages/sklearn/datasets/data/__pycache__/__init__.cpython-310.pyc +0 -0
  30. venv/lib/python3.10/site-packages/sklearn/datasets/data/boston_house_prices.csv +508 -0
  31. venv/lib/python3.10/site-packages/sklearn/datasets/data/breast_cancer.csv +0 -0
  32. venv/lib/python3.10/site-packages/sklearn/datasets/data/iris.csv +151 -0
  33. venv/lib/python3.10/site-packages/sklearn/datasets/data/linnerud_exercise.csv +21 -0
  34. venv/lib/python3.10/site-packages/sklearn/datasets/data/linnerud_physiological.csv +21 -0
  35. venv/lib/python3.10/site-packages/sklearn/datasets/data/wine_data.csv +179 -0
  36. venv/lib/python3.10/site-packages/sklearn/datasets/descr/__init__.py +0 -0
  37. venv/lib/python3.10/site-packages/sklearn/datasets/descr/__pycache__/__init__.cpython-310.pyc +0 -0
  38. venv/lib/python3.10/site-packages/sklearn/datasets/descr/breast_cancer.rst +122 -0
  39. venv/lib/python3.10/site-packages/sklearn/datasets/descr/california_housing.rst +46 -0
  40. venv/lib/python3.10/site-packages/sklearn/datasets/descr/covtype.rst +30 -0
  41. venv/lib/python3.10/site-packages/sklearn/datasets/descr/diabetes.rst +38 -0
  42. venv/lib/python3.10/site-packages/sklearn/datasets/descr/digits.rst +50 -0
  43. venv/lib/python3.10/site-packages/sklearn/datasets/descr/iris.rst +67 -0
  44. venv/lib/python3.10/site-packages/sklearn/datasets/descr/kddcup99.rst +94 -0
  45. venv/lib/python3.10/site-packages/sklearn/datasets/descr/lfw.rst +128 -0
  46. venv/lib/python3.10/site-packages/sklearn/datasets/descr/linnerud.rst +28 -0
  47. venv/lib/python3.10/site-packages/sklearn/datasets/descr/olivetti_faces.rst +44 -0
  48. venv/lib/python3.10/site-packages/sklearn/datasets/descr/rcv1.rst +72 -0
  49. venv/lib/python3.10/site-packages/sklearn/datasets/descr/species_distributions.rst +36 -0
  50. venv/lib/python3.10/site-packages/sklearn/datasets/descr/twenty_newsgroups.rst +264 -0
ckpts/universal/global_step40/zero/23.post_attention_layernorm.weight/exp_avg.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:542a1d36c56cc53701c26f9bbee1575654b9af6161edb63b8410d85571b67305
+ size 9372
ckpts/universal/global_step40/zero/23.post_attention_layernorm.weight/exp_avg_sq.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:07ff39584f214afcda19b6ec8476ccf71f055428bd1f52636fa1aef25e87abef
+ size 9387
ckpts/universal/global_step40/zero/23.post_attention_layernorm.weight/fp32.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d8f0ef0a78640094a3bf9d8da99a12147afb951afc26e9ec25b75a98b90f9ecb
+ size 9293
ckpts/universal/global_step40/zero/25.post_attention_layernorm.weight/exp_avg.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:45701f48be911fabde3695f2eee8ce049a3493b8cfbe8d8a81189e0d01b3d888
+ size 9372
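
Note: the four `.pt` entries above are Git LFS pointer files — the repository stores only the `version`/`oid`/`size` stub, while the real payload (Adam-style `exp_avg`/`exp_avg_sq` moment estimates and an `fp32` master copy of a layer-norm weight) lives in LFS storage. A minimal inspection sketch, assuming the LFS objects have been fetched (e.g. with `git lfs pull`) and that each file deserializes to a single tensor, which is typical for DeepSpeed universal checkpoints but is an assumption here:

import torch

# Hypothetical inspection of one optimizer-state shard; the path is taken
# from the file list above. map_location="cpu" avoids requiring a GPU.
state = torch.load(
    "ckpts/universal/global_step40/zero/23.post_attention_layernorm.weight/exp_avg.pt",
    map_location="cpu",
)
print(type(state))                   # expected: torch.Tensor (an assumption)
if hasattr(state, "shape"):
    print(state.shape, state.dtype)  # first-moment estimate for this weight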
venv/lib/python3.10/site-packages/sklearn/_loss/link.py ADDED
@@ -0,0 +1,280 @@
+ """
+ Module contains classes for invertible (and differentiable) link functions.
+ """
+ # Author: Christian Lorentzen <[email protected]>
+
+ from abc import ABC, abstractmethod
+ from dataclasses import dataclass
+
+ import numpy as np
+ from scipy.special import expit, logit
+ from scipy.stats import gmean
+
+ from ..utils.extmath import softmax
+
+
+ @dataclass
+ class Interval:
+     low: float
+     high: float
+     low_inclusive: bool
+     high_inclusive: bool
+
+     def __post_init__(self):
+         """Check that low <= high"""
+         if self.low > self.high:
+             raise ValueError(
+                 f"One must have low <= high; got low={self.low}, high={self.high}."
+             )
+
+     def includes(self, x):
+         """Test whether all values of x are in interval range.
+
+         Parameters
+         ----------
+         x : ndarray
+             Array whose elements are tested to be in interval range.
+
+         Returns
+         -------
+         result : bool
+         """
+         if self.low_inclusive:
+             low = np.greater_equal(x, self.low)
+         else:
+             low = np.greater(x, self.low)
+
+         if not np.all(low):
+             return False
+
+         if self.high_inclusive:
+             high = np.less_equal(x, self.high)
+         else:
+             high = np.less(x, self.high)
+
+         # Note: np.all returns numpy.bool_
+         return bool(np.all(high))
+
+
+ def _inclusive_low_high(interval, dtype=np.float64):
+     """Generate values low and high to be within the interval range.
+
+     This is used in tests only.
+
+     Returns
+     -------
+     low, high : tuple
+         The returned values low and high lie within the interval.
+     """
+     eps = 10 * np.finfo(dtype).eps
+     if interval.low == -np.inf:
+         low = -1e10
+     elif interval.low < 0:
+         low = interval.low * (1 - eps) + eps
+     else:
+         low = interval.low * (1 + eps) + eps
+
+     if interval.high == np.inf:
+         high = 1e10
+     elif interval.high < 0:
+         high = interval.high * (1 + eps) - eps
+     else:
+         high = interval.high * (1 - eps) - eps
+
+     return low, high
+
+
+ class BaseLink(ABC):
+     """Abstract base class for differentiable, invertible link functions.
+
+     Convention:
+         - link function g: raw_prediction = g(y_pred)
+         - inverse link h: y_pred = h(raw_prediction)
+
+     For (generalized) linear models, `raw_prediction = X @ coef` is the so
+     called linear predictor, and `y_pred = h(raw_prediction)` is the predicted
+     conditional (on X) expected value of the target `y_true`.
+
+     The methods are not implemented as staticmethods in case a link function needs
+     parameters.
+     """
+
+     is_multiclass = False  # used for testing only
+
+     # Usually, raw_prediction may be any real number and y_pred is an open
+     # interval.
+     # interval_raw_prediction = Interval(-np.inf, np.inf, False, False)
+     interval_y_pred = Interval(-np.inf, np.inf, False, False)
+
+     @abstractmethod
+     def link(self, y_pred, out=None):
+         """Compute the link function g(y_pred).
+
+         The link function maps (predicted) target values to raw predictions,
+         i.e. `g(y_pred) = raw_prediction`.
+
+         Parameters
+         ----------
+         y_pred : array
+             Predicted target values.
+         out : array
+             A location into which the result is stored. If provided, it must
+             have a shape that the inputs broadcast to. If not provided or None,
+             a freshly-allocated array is returned.
+
+         Returns
+         -------
+         out : array
+             Output array, element-wise link function.
+         """
+
+     @abstractmethod
+     def inverse(self, raw_prediction, out=None):
+         """Compute the inverse link function h(raw_prediction).
+
+         The inverse link function maps raw predictions to predicted target
+         values, i.e. `h(raw_prediction) = y_pred`.
+
+         Parameters
+         ----------
+         raw_prediction : array
+             Raw prediction values (in link space).
+         out : array
+             A location into which the result is stored. If provided, it must
+             have a shape that the inputs broadcast to. If not provided or None,
+             a freshly-allocated array is returned.
+
+         Returns
+         -------
+         out : array
+             Output array, element-wise inverse link function.
+         """
+
+
+ class IdentityLink(BaseLink):
+     """The identity link function g(x)=x."""
+
+     def link(self, y_pred, out=None):
+         if out is not None:
+             np.copyto(out, y_pred)
+             return out
+         else:
+             return y_pred
+
+     inverse = link
+
+
+ class LogLink(BaseLink):
+     """The log link function g(x)=log(x)."""
+
+     interval_y_pred = Interval(0, np.inf, False, False)
+
+     def link(self, y_pred, out=None):
+         return np.log(y_pred, out=out)
+
+     def inverse(self, raw_prediction, out=None):
+         return np.exp(raw_prediction, out=out)
+
+
+ class LogitLink(BaseLink):
+     """The logit link function g(x)=logit(x)."""
+
+     interval_y_pred = Interval(0, 1, False, False)
+
+     def link(self, y_pred, out=None):
+         return logit(y_pred, out=out)
+
+     def inverse(self, raw_prediction, out=None):
+         return expit(raw_prediction, out=out)
+
+
+ class HalfLogitLink(BaseLink):
+     """Half the logit link function g(x)=1/2 * logit(x).
+
+     Used for the exponential loss.
+     """
+
+     interval_y_pred = Interval(0, 1, False, False)
+
+     def link(self, y_pred, out=None):
+         out = logit(y_pred, out=out)
+         out *= 0.5
+         return out
+
+     def inverse(self, raw_prediction, out=None):
+         return expit(2 * raw_prediction, out)
+
+
+ class MultinomialLogit(BaseLink):
+     """The symmetric multinomial logit function.
+
+     Convention:
+         - y_pred.shape = raw_prediction.shape = (n_samples, n_classes)
+
+     Notes:
+         - The inverse link h is the softmax function.
+         - The sum is over the second axis, i.e. axis=1 (n_classes).
+
+     We have to choose additional constraints in order to make
+
+         y_pred[k] = exp(raw_pred[k]) / sum(exp(raw_pred[k]), k=0..n_classes-1)
+
+     for n_classes classes identifiable and invertible.
+     We choose the symmetric side constraint where the geometric mean response
+     is set as reference category, see [2]:
+
+     The symmetric multinomial logit link function for a single data point is
+     then defined as
+
+         raw_prediction[k] = g(y_pred[k]) = log(y_pred[k]/gmean(y_pred))
+                           = log(y_pred[k]) - mean(log(y_pred)).
+
+     Note that this is equivalent to the definition in [1] and implies mean
+     centered raw predictions:
+
+         sum(raw_prediction[k], k=0..n_classes-1) = 0.
+
+     For linear models with raw_prediction = X @ coef, this corresponds to
+     sum(coef[k], k=0..n_classes-1) = 0, i.e. the sum over classes for every
+     feature is zero.
+
+     Reference
+     ---------
+     .. [1] Friedman, Jerome; Hastie, Trevor; Tibshirani, Robert. "Additive
+         logistic regression: a statistical view of boosting" Ann. Statist.
+         28 (2000), no. 2, 337--407. doi:10.1214/aos/1016218223.
+         https://projecteuclid.org/euclid.aos/1016218223
+
+     .. [2] Zahid, Faisal Maqbool and Gerhard Tutz. "Ridge estimation for
+         multinomial logit models with symmetric side constraints."
+         Computational Statistics 28 (2013): 1017-1034.
+         http://epub.ub.uni-muenchen.de/11001/1/tr067.pdf
+     """
+
+     is_multiclass = True
+     interval_y_pred = Interval(0, 1, False, False)
+
+     def symmetrize_raw_prediction(self, raw_prediction):
+         return raw_prediction - np.mean(raw_prediction, axis=1)[:, np.newaxis]
+
+     def link(self, y_pred, out=None):
+         # geometric mean as reference category
+         gm = gmean(y_pred, axis=1)
+         return np.log(y_pred / gm[:, np.newaxis], out=out)
+
+     def inverse(self, raw_prediction, out=None):
+         if out is None:
+             return softmax(raw_prediction, copy=True)
+         else:
+             np.copyto(out, raw_prediction)
+             softmax(out, copy=False)
+             return out
+
+
+ _LINKS = {
+     "identity": IdentityLink,
+     "log": LogLink,
+     "logit": LogitLink,
+     "half_logit": HalfLogitLink,
+     "multinomial_logit": MultinomialLogit,
+ }
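
Each class in `link.py` pairs a link `g` with its inverse `h`, and `MultinomialLogit` additionally enforces the zero-sum symmetric side constraint described in its docstring. A minimal round-trip sketch using only attributes defined in the diff above, assuming scikit-learn is installed so the private module `sklearn._loss.link` resolves (an internal path that may move between releases):

import numpy as np
from sklearn._loss.link import LogitLink, MultinomialLogit

link = LogitLink()
p = np.array([0.1, 0.5, 0.9])
raw = link.link(p)                                  # g(y_pred) -> raw_prediction
np.testing.assert_allclose(link.inverse(raw), p)    # h(g(p)) recovers p

ml = MultinomialLogit()
proba = np.array([[0.2, 0.3, 0.5]])
raw = ml.link(proba)                                # log(p_k) - mean(log(p)) per row
print(raw.sum(axis=1))                              # ~[0.]: symmetric side constraint
np.testing.assert_allclose(ml.inverse(raw), proba)  # softmax inverts the link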
venv/lib/python3.10/site-packages/sklearn/cross_decomposition/__init__.py ADDED
@@ -0,0 +1,3 @@
+ from ._pls import CCA, PLSSVD, PLSCanonical, PLSRegression
+
+ __all__ = ["PLSCanonical", "PLSRegression", "PLSSVD", "CCA"]
venv/lib/python3.10/site-packages/sklearn/cross_decomposition/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (332 Bytes)
venv/lib/python3.10/site-packages/sklearn/cross_decomposition/__pycache__/_pls.cpython-310.pyc ADDED
Binary file (29.4 kB)
venv/lib/python3.10/site-packages/sklearn/cross_decomposition/_pls.py ADDED
@@ -0,0 +1,1083 @@
+ """
+ The :mod:`sklearn.pls` module implements Partial Least Squares (PLS).
+ """
+
+ # Author: Edouard Duchesnay <[email protected]>
+ # License: BSD 3 clause
+
+ import warnings
+ from abc import ABCMeta, abstractmethod
+ from numbers import Integral, Real
+
+ import numpy as np
+ from scipy.linalg import svd
+
+ from ..base import (
+     BaseEstimator,
+     ClassNamePrefixFeaturesOutMixin,
+     MultiOutputMixin,
+     RegressorMixin,
+     TransformerMixin,
+     _fit_context,
+ )
+ from ..exceptions import ConvergenceWarning
+ from ..utils import check_array, check_consistent_length
+ from ..utils._param_validation import Interval, StrOptions
+ from ..utils.extmath import svd_flip
+ from ..utils.fixes import parse_version, sp_version
+ from ..utils.validation import FLOAT_DTYPES, check_is_fitted
+
+ __all__ = ["PLSCanonical", "PLSRegression", "PLSSVD"]
+
+
+ if sp_version >= parse_version("1.7"):
+     # Starting in scipy 1.7 pinv2 was deprecated in favor of pinv.
+     # pinv now uses the svd to compute the pseudo-inverse.
+     from scipy.linalg import pinv as pinv2
+ else:
+     from scipy.linalg import pinv2
+
+
+ def _pinv2_old(a):
+     # Used previous scipy pinv2 that was updated in:
+     # https://github.com/scipy/scipy/pull/10067
+     # We can not set `cond` or `rcond` for pinv2 in scipy >= 1.3 to keep the
+     # same behavior of pinv2 for scipy < 1.3, because the condition used to
+     # determine the rank is dependent on the output of svd.
+     u, s, vh = svd(a, full_matrices=False, check_finite=False)
+
+     t = u.dtype.char.lower()
+     factor = {"f": 1e3, "d": 1e6}
+     cond = np.max(s) * factor[t] * np.finfo(t).eps
+     rank = np.sum(s > cond)
+
+     u = u[:, :rank]
+     u /= s[:rank]
+     return np.transpose(np.conjugate(np.dot(u, vh[:rank])))
+
+
+ def _get_first_singular_vectors_power_method(
+     X, Y, mode="A", max_iter=500, tol=1e-06, norm_y_weights=False
+ ):
+     """Return the first left and right singular vectors of X'Y.
+
+     Provides an alternative to the svd(X'Y) and uses the power method instead.
+     With norm_y_weights to True and in mode A, this corresponds to the
+     algorithm section 11.3 of the Wegelin's review, except this starts at the
+     "update saliences" part.
+     """
+
+     eps = np.finfo(X.dtype).eps
+     try:
+         y_score = next(col for col in Y.T if np.any(np.abs(col) > eps))
+     except StopIteration as e:
+         raise StopIteration("Y residual is constant") from e
+
+     x_weights_old = 100  # init to big value for first convergence check
+
+     if mode == "B":
+         # Precompute pseudo inverse matrices
+         # Basically: X_pinv = (X.T X)^-1 X.T
+         # Which requires inverting a (n_features, n_features) matrix.
+         # As a result, and as detailed in the Wegelin's review, CCA (i.e. mode
+         # B) will be unstable if n_features > n_samples or n_targets >
+         # n_samples
+         X_pinv, Y_pinv = _pinv2_old(X), _pinv2_old(Y)
+
+     for i in range(max_iter):
+         if mode == "B":
+             x_weights = np.dot(X_pinv, y_score)
+         else:
+             x_weights = np.dot(X.T, y_score) / np.dot(y_score, y_score)
+
+         x_weights /= np.sqrt(np.dot(x_weights, x_weights)) + eps
+         x_score = np.dot(X, x_weights)
+
+         if mode == "B":
+             y_weights = np.dot(Y_pinv, x_score)
+         else:
+             y_weights = np.dot(Y.T, x_score) / np.dot(x_score.T, x_score)
+
+         if norm_y_weights:
+             y_weights /= np.sqrt(np.dot(y_weights, y_weights)) + eps
+
+         y_score = np.dot(Y, y_weights) / (np.dot(y_weights, y_weights) + eps)
+
+         x_weights_diff = x_weights - x_weights_old
+         if np.dot(x_weights_diff, x_weights_diff) < tol or Y.shape[1] == 1:
+             break
+         x_weights_old = x_weights
+
+     n_iter = i + 1
+     if n_iter == max_iter:
+         warnings.warn("Maximum number of iterations reached", ConvergenceWarning)
+
+     return x_weights, y_weights, n_iter
+
+
+ def _get_first_singular_vectors_svd(X, Y):
+     """Return the first left and right singular vectors of X'Y.
+
+     Here the whole SVD is computed.
+     """
+     C = np.dot(X.T, Y)
+     U, _, Vt = svd(C, full_matrices=False)
+     return U[:, 0], Vt[0, :]
+
+
+ def _center_scale_xy(X, Y, scale=True):
+     """Center X, Y and scale if the scale parameter==True
+
+     Returns
+     -------
+     X, Y, x_mean, y_mean, x_std, y_std
+     """
+     # center
+     x_mean = X.mean(axis=0)
+     X -= x_mean
+     y_mean = Y.mean(axis=0)
+     Y -= y_mean
+     # scale
+     if scale:
+         x_std = X.std(axis=0, ddof=1)
+         x_std[x_std == 0.0] = 1.0
+         X /= x_std
+         y_std = Y.std(axis=0, ddof=1)
+         y_std[y_std == 0.0] = 1.0
+         Y /= y_std
+     else:
+         x_std = np.ones(X.shape[1])
+         y_std = np.ones(Y.shape[1])
+     return X, Y, x_mean, y_mean, x_std, y_std
+
+
+ def _svd_flip_1d(u, v):
+     """Same as svd_flip but works on 1d arrays, and is inplace"""
+     # svd_flip would force us to convert to 2d array and would also return 2d
+     # arrays. We don't want that.
+     biggest_abs_val_idx = np.argmax(np.abs(u))
+     sign = np.sign(u[biggest_abs_val_idx])
+     u *= sign
+     v *= sign
+
+
+ class _PLS(
+     ClassNamePrefixFeaturesOutMixin,
+     TransformerMixin,
+     RegressorMixin,
+     MultiOutputMixin,
+     BaseEstimator,
+     metaclass=ABCMeta,
+ ):
+     """Partial Least Squares (PLS)
+
+     This class implements the generic PLS algorithm.
+
+     Main ref: Wegelin, a survey of Partial Least Squares (PLS) methods,
+     with emphasis on the two-block case
+     https://stat.uw.edu/sites/default/files/files/reports/2000/tr371.pdf
+     """
+
+     _parameter_constraints: dict = {
+         "n_components": [Interval(Integral, 1, None, closed="left")],
+         "scale": ["boolean"],
+         "deflation_mode": [StrOptions({"regression", "canonical"})],
+         "mode": [StrOptions({"A", "B"})],
+         "algorithm": [StrOptions({"svd", "nipals"})],
+         "max_iter": [Interval(Integral, 1, None, closed="left")],
+         "tol": [Interval(Real, 0, None, closed="left")],
+         "copy": ["boolean"],
+     }
+
+     @abstractmethod
+     def __init__(
+         self,
+         n_components=2,
+         *,
+         scale=True,
+         deflation_mode="regression",
+         mode="A",
+         algorithm="nipals",
+         max_iter=500,
+         tol=1e-06,
+         copy=True,
+     ):
+         self.n_components = n_components
+         self.deflation_mode = deflation_mode
+         self.mode = mode
+         self.scale = scale
+         self.algorithm = algorithm
+         self.max_iter = max_iter
+         self.tol = tol
+         self.copy = copy
+
+     @_fit_context(prefer_skip_nested_validation=True)
+     def fit(self, X, Y):
+         """Fit model to data.
+
+         Parameters
+         ----------
+         X : array-like of shape (n_samples, n_features)
+             Training vectors, where `n_samples` is the number of samples and
+             `n_features` is the number of predictors.
+
+         Y : array-like of shape (n_samples,) or (n_samples, n_targets)
+             Target vectors, where `n_samples` is the number of samples and
+             `n_targets` is the number of response variables.
+
+         Returns
+         -------
+         self : object
+             Fitted model.
+         """
+         check_consistent_length(X, Y)
+         X = self._validate_data(
+             X, dtype=np.float64, copy=self.copy, ensure_min_samples=2
+         )
+         Y = check_array(
+             Y, input_name="Y", dtype=np.float64, copy=self.copy, ensure_2d=False
+         )
+         if Y.ndim == 1:
+             self._predict_1d = True
+             Y = Y.reshape(-1, 1)
+         else:
+             self._predict_1d = False
+
+         n = X.shape[0]
+         p = X.shape[1]
+         q = Y.shape[1]
+
+         n_components = self.n_components
+         # With PLSRegression n_components is bounded by the rank of (X.T X) see
+         # Wegelin page 25. With CCA and PLSCanonical, n_components is bounded
+         # by the rank of X and the rank of Y: see Wegelin page 12
+         rank_upper_bound = p if self.deflation_mode == "regression" else min(n, p, q)
+         if n_components > rank_upper_bound:
+             raise ValueError(
+                 f"`n_components` upper bound is {rank_upper_bound}. "
+                 f"Got {n_components} instead. Reduce `n_components`."
+             )
+
+         self._norm_y_weights = self.deflation_mode == "canonical"  # 1.1
+         norm_y_weights = self._norm_y_weights
+
+         # Scale (in place)
+         Xk, Yk, self._x_mean, self._y_mean, self._x_std, self._y_std = _center_scale_xy(
+             X, Y, self.scale
+         )
+
+         self.x_weights_ = np.zeros((p, n_components))  # U
+         self.y_weights_ = np.zeros((q, n_components))  # V
+         self._x_scores = np.zeros((n, n_components))  # Xi
+         self._y_scores = np.zeros((n, n_components))  # Omega
+         self.x_loadings_ = np.zeros((p, n_components))  # Gamma
+         self.y_loadings_ = np.zeros((q, n_components))  # Delta
+         self.n_iter_ = []
+
+         # This whole thing corresponds to the algorithm in section 4.1 of the
+         # review from Wegelin. See above for a notation mapping from code to
+         # paper.
+         Y_eps = np.finfo(Yk.dtype).eps
+         for k in range(n_components):
+             # Find first left and right singular vectors of the X.T.dot(Y)
+             # cross-covariance matrix.
+             if self.algorithm == "nipals":
+                 # Replace columns that are all close to zero with zeros
+                 Yk_mask = np.all(np.abs(Yk) < 10 * Y_eps, axis=0)
+                 Yk[:, Yk_mask] = 0.0
+
+                 try:
+                     (
+                         x_weights,
+                         y_weights,
+                         n_iter_,
+                     ) = _get_first_singular_vectors_power_method(
+                         Xk,
+                         Yk,
+                         mode=self.mode,
+                         max_iter=self.max_iter,
+                         tol=self.tol,
+                         norm_y_weights=norm_y_weights,
+                     )
+                 except StopIteration as e:
+                     if str(e) != "Y residual is constant":
+                         raise
+                     warnings.warn(f"Y residual is constant at iteration {k}")
+                     break
+
+                 self.n_iter_.append(n_iter_)
+
+             elif self.algorithm == "svd":
+                 x_weights, y_weights = _get_first_singular_vectors_svd(Xk, Yk)
+
+             # inplace sign flip for consistency across solvers and archs
+             _svd_flip_1d(x_weights, y_weights)
+
+             # compute scores, i.e. the projections of X and Y
+             x_scores = np.dot(Xk, x_weights)
+             if norm_y_weights:
+                 y_ss = 1
+             else:
+                 y_ss = np.dot(y_weights, y_weights)
+             y_scores = np.dot(Yk, y_weights) / y_ss
+
+             # Deflation: subtract rank-one approx to obtain Xk+1 and Yk+1
+             x_loadings = np.dot(x_scores, Xk) / np.dot(x_scores, x_scores)
+             Xk -= np.outer(x_scores, x_loadings)
+
+             if self.deflation_mode == "canonical":
+                 # regress Yk on y_score
+                 y_loadings = np.dot(y_scores, Yk) / np.dot(y_scores, y_scores)
+                 Yk -= np.outer(y_scores, y_loadings)
+             if self.deflation_mode == "regression":
+                 # regress Yk on x_score
+                 y_loadings = np.dot(x_scores, Yk) / np.dot(x_scores, x_scores)
+                 Yk -= np.outer(x_scores, y_loadings)
+
+             self.x_weights_[:, k] = x_weights
+             self.y_weights_[:, k] = y_weights
+             self._x_scores[:, k] = x_scores
+             self._y_scores[:, k] = y_scores
+             self.x_loadings_[:, k] = x_loadings
+             self.y_loadings_[:, k] = y_loadings
+
+         # X was approximated as Xi . Gamma.T + X_(R+1)
+         # Xi . Gamma.T is a sum of n_components rank-1 matrices. X_(R+1) is
+         # whatever is left to fully reconstruct X, and can be 0 if X is of rank
+         # n_components.
+         # Similarly, Y was approximated as Omega . Delta.T + Y_(R+1)
+
+         # Compute transformation matrices (rotations_). See User Guide.
+         self.x_rotations_ = np.dot(
+             self.x_weights_,
+             pinv2(np.dot(self.x_loadings_.T, self.x_weights_), check_finite=False),
+         )
+         self.y_rotations_ = np.dot(
+             self.y_weights_,
+             pinv2(np.dot(self.y_loadings_.T, self.y_weights_), check_finite=False),
+         )
+         self.coef_ = np.dot(self.x_rotations_, self.y_loadings_.T)
+         self.coef_ = (self.coef_ * self._y_std).T
+         self.intercept_ = self._y_mean
+         self._n_features_out = self.x_rotations_.shape[1]
+         return self
+
+     def transform(self, X, Y=None, copy=True):
+         """Apply the dimension reduction.
+
+         Parameters
+         ----------
+         X : array-like of shape (n_samples, n_features)
+             Samples to transform.
+
+         Y : array-like of shape (n_samples, n_targets), default=None
+             Target vectors.
+
+         copy : bool, default=True
+             Whether to copy `X` and `Y`, or perform in-place normalization.
+
+         Returns
+         -------
+         x_scores, y_scores : array-like or tuple of array-like
+             Return `x_scores` if `Y` is not given, `(x_scores, y_scores)` otherwise.
+         """
+         check_is_fitted(self)
+         X = self._validate_data(X, copy=copy, dtype=FLOAT_DTYPES, reset=False)
+         # Normalize
+         X -= self._x_mean
+         X /= self._x_std
+         # Apply rotation
+         x_scores = np.dot(X, self.x_rotations_)
+         if Y is not None:
+             Y = check_array(
+                 Y, input_name="Y", ensure_2d=False, copy=copy, dtype=FLOAT_DTYPES
+             )
+             if Y.ndim == 1:
+                 Y = Y.reshape(-1, 1)
+             Y -= self._y_mean
+             Y /= self._y_std
+             y_scores = np.dot(Y, self.y_rotations_)
+             return x_scores, y_scores
+
+         return x_scores
+
+     def inverse_transform(self, X, Y=None):
+         """Transform data back to its original space.
+
+         Parameters
+         ----------
+         X : array-like of shape (n_samples, n_components)
+             New data, where `n_samples` is the number of samples
+             and `n_components` is the number of pls components.
+
+         Y : array-like of shape (n_samples, n_components)
+             New target, where `n_samples` is the number of samples
+             and `n_components` is the number of pls components.
+
+         Returns
+         -------
+         X_reconstructed : ndarray of shape (n_samples, n_features)
+             Return the reconstructed `X` data.
+
+         Y_reconstructed : ndarray of shape (n_samples, n_targets)
+             Return the reconstructed `X` target. Only returned when `Y` is given.
+
+         Notes
+         -----
+         This transformation will only be exact if `n_components=n_features`.
+         """
+         check_is_fitted(self)
+         X = check_array(X, input_name="X", dtype=FLOAT_DTYPES)
+         # From pls space to original space
+         X_reconstructed = np.matmul(X, self.x_loadings_.T)
+         # Denormalize
+         X_reconstructed *= self._x_std
+         X_reconstructed += self._x_mean
+
+         if Y is not None:
+             Y = check_array(Y, input_name="Y", dtype=FLOAT_DTYPES)
+             # From pls space to original space
+             Y_reconstructed = np.matmul(Y, self.y_loadings_.T)
+             # Denormalize
+             Y_reconstructed *= self._y_std
+             Y_reconstructed += self._y_mean
+             return X_reconstructed, Y_reconstructed
+
+         return X_reconstructed
+
+     def predict(self, X, copy=True):
+         """Predict targets of given samples.
+
+         Parameters
+         ----------
+         X : array-like of shape (n_samples, n_features)
+             Samples.
+
+         copy : bool, default=True
+             Whether to copy `X` and `Y`, or perform in-place normalization.
+
+         Returns
+         -------
+         y_pred : ndarray of shape (n_samples,) or (n_samples, n_targets)
+             Returns predicted values.
+
+         Notes
+         -----
+         This call requires the estimation of a matrix of shape
+         `(n_features, n_targets)`, which may be an issue in high dimensional
+         space.
+         """
+         check_is_fitted(self)
+         X = self._validate_data(X, copy=copy, dtype=FLOAT_DTYPES, reset=False)
+         # Normalize
+         X -= self._x_mean
+         X /= self._x_std
+         Ypred = X @ self.coef_.T + self.intercept_
+         return Ypred.ravel() if self._predict_1d else Ypred
+
+     def fit_transform(self, X, y=None):
+         """Learn and apply the dimension reduction on the train data.
+
+         Parameters
+         ----------
+         X : array-like of shape (n_samples, n_features)
+             Training vectors, where `n_samples` is the number of samples and
+             `n_features` is the number of predictors.
+
+         y : array-like of shape (n_samples, n_targets), default=None
+             Target vectors, where `n_samples` is the number of samples and
+             `n_targets` is the number of response variables.
+
+         Returns
+         -------
+         self : ndarray of shape (n_samples, n_components)
+             Return `x_scores` if `Y` is not given, `(x_scores, y_scores)` otherwise.
+         """
+         return self.fit(X, y).transform(X, y)
+
+     def _more_tags(self):
+         return {"poor_score": True, "requires_y": False}
+
+
+ class PLSRegression(_PLS):
+     """PLS regression.
+
+     PLSRegression is also known as PLS2 or PLS1, depending on the number of
+     targets.
+
+     For a comparison between other cross decomposition algorithms, see
+     :ref:`sphx_glr_auto_examples_cross_decomposition_plot_compare_cross_decomposition.py`.
+
+     Read more in the :ref:`User Guide <cross_decomposition>`.
+
+     .. versionadded:: 0.8
+
+     Parameters
+     ----------
+     n_components : int, default=2
+         Number of components to keep. Should be in `[1, min(n_samples,
+         n_features, n_targets)]`.
+
+     scale : bool, default=True
+         Whether to scale `X` and `Y`.
+
+     max_iter : int, default=500
+         The maximum number of iterations of the power method when
+         `algorithm='nipals'`. Ignored otherwise.
+
+     tol : float, default=1e-06
+         The tolerance used as convergence criteria in the power method: the
+         algorithm stops whenever the squared norm of `u_i - u_{i-1}` is less
+         than `tol`, where `u` corresponds to the left singular vector.
+
+     copy : bool, default=True
+         Whether to copy `X` and `Y` in :term:`fit` before applying centering,
+         and potentially scaling. If `False`, these operations will be done
+         inplace, modifying both arrays.
+
+     Attributes
+     ----------
+     x_weights_ : ndarray of shape (n_features, n_components)
+         The left singular vectors of the cross-covariance matrices of each
+         iteration.
+
+     y_weights_ : ndarray of shape (n_targets, n_components)
+         The right singular vectors of the cross-covariance matrices of each
+         iteration.
+
+     x_loadings_ : ndarray of shape (n_features, n_components)
+         The loadings of `X`.
+
+     y_loadings_ : ndarray of shape (n_targets, n_components)
+         The loadings of `Y`.
+
+     x_scores_ : ndarray of shape (n_samples, n_components)
+         The transformed training samples.
+
+     y_scores_ : ndarray of shape (n_samples, n_components)
+         The transformed training targets.
+
+     x_rotations_ : ndarray of shape (n_features, n_components)
+         The projection matrix used to transform `X`.
+
+     y_rotations_ : ndarray of shape (n_targets, n_components)
+         The projection matrix used to transform `Y`.
+
+     coef_ : ndarray of shape (n_target, n_features)
+         The coefficients of the linear model such that `Y` is approximated as
+         `Y = X @ coef_.T + intercept_`.
+
+     intercept_ : ndarray of shape (n_targets,)
+         The intercepts of the linear model such that `Y` is approximated as
+         `Y = X @ coef_.T + intercept_`.
+
+         .. versionadded:: 1.1
+
+     n_iter_ : list of shape (n_components,)
+         Number of iterations of the power method, for each
+         component.
+
+     n_features_in_ : int
+         Number of features seen during :term:`fit`.
+
+     feature_names_in_ : ndarray of shape (`n_features_in_`,)
+         Names of features seen during :term:`fit`. Defined only when `X`
+         has feature names that are all strings.
+
+         .. versionadded:: 1.0
+
+     See Also
+     --------
+     PLSCanonical : Partial Least Squares transformer and regressor.
+
+     Examples
+     --------
+     >>> from sklearn.cross_decomposition import PLSRegression
+     >>> X = [[0., 0., 1.], [1.,0.,0.], [2.,2.,2.], [2.,5.,4.]]
+     >>> Y = [[0.1, -0.2], [0.9, 1.1], [6.2, 5.9], [11.9, 12.3]]
+     >>> pls2 = PLSRegression(n_components=2)
+     >>> pls2.fit(X, Y)
+     PLSRegression()
+     >>> Y_pred = pls2.predict(X)
+
+     For a comparison between PLS Regression and :class:`~sklearn.decomposition.PCA`, see
+     :ref:`sphx_glr_auto_examples_cross_decomposition_plot_pcr_vs_pls.py`.
+     """
+
+     _parameter_constraints: dict = {**_PLS._parameter_constraints}
+     for param in ("deflation_mode", "mode", "algorithm"):
+         _parameter_constraints.pop(param)
+
+     # This implementation provides the same results that 3 PLS packages
+     # provided in the R language (R-project):
+     #     - "mixOmics" with function pls(X, Y, mode = "regression")
+     #     - "plspm " with function plsreg2(X, Y)
+     #     - "pls" with function oscorespls.fit(X, Y)
+
+     def __init__(
+         self, n_components=2, *, scale=True, max_iter=500, tol=1e-06, copy=True
+     ):
+         super().__init__(
+             n_components=n_components,
+             scale=scale,
+             deflation_mode="regression",
+             mode="A",
+             algorithm="nipals",
+             max_iter=max_iter,
+             tol=tol,
+             copy=copy,
+         )
+
+     def fit(self, X, Y):
+         """Fit model to data.
+
+         Parameters
+         ----------
+         X : array-like of shape (n_samples, n_features)
+             Training vectors, where `n_samples` is the number of samples and
+             `n_features` is the number of predictors.
+
+         Y : array-like of shape (n_samples,) or (n_samples, n_targets)
+             Target vectors, where `n_samples` is the number of samples and
+             `n_targets` is the number of response variables.
+
+         Returns
+         -------
+         self : object
+             Fitted model.
+         """
+         super().fit(X, Y)
+         # expose the fitted attributes `x_scores_` and `y_scores_`
+         self.x_scores_ = self._x_scores
+         self.y_scores_ = self._y_scores
+         return self
+
+
+ class PLSCanonical(_PLS):
+     """Partial Least Squares transformer and regressor.
+
+     For a comparison between other cross decomposition algorithms, see
+     :ref:`sphx_glr_auto_examples_cross_decomposition_plot_compare_cross_decomposition.py`.
+
+     Read more in the :ref:`User Guide <cross_decomposition>`.
+
+     .. versionadded:: 0.8
+
+     Parameters
+     ----------
+     n_components : int, default=2
+         Number of components to keep. Should be in `[1, min(n_samples,
+         n_features, n_targets)]`.
+
+     scale : bool, default=True
+         Whether to scale `X` and `Y`.
+
+     algorithm : {'nipals', 'svd'}, default='nipals'
+         The algorithm used to estimate the first singular vectors of the
+         cross-covariance matrix. 'nipals' uses the power method while 'svd'
+         will compute the whole SVD.
+
+     max_iter : int, default=500
+         The maximum number of iterations of the power method when
+         `algorithm='nipals'`. Ignored otherwise.
+
+     tol : float, default=1e-06
+         The tolerance used as convergence criteria in the power method: the
+         algorithm stops whenever the squared norm of `u_i - u_{i-1}` is less
+         than `tol`, where `u` corresponds to the left singular vector.
+
+     copy : bool, default=True
+         Whether to copy `X` and `Y` in fit before applying centering, and
+         potentially scaling. If False, these operations will be done inplace,
+         modifying both arrays.
+
+     Attributes
+     ----------
+     x_weights_ : ndarray of shape (n_features, n_components)
+         The left singular vectors of the cross-covariance matrices of each
+         iteration.
+
+     y_weights_ : ndarray of shape (n_targets, n_components)
+         The right singular vectors of the cross-covariance matrices of each
+         iteration.
+
+     x_loadings_ : ndarray of shape (n_features, n_components)
+         The loadings of `X`.
+
+     y_loadings_ : ndarray of shape (n_targets, n_components)
+         The loadings of `Y`.
+
+     x_rotations_ : ndarray of shape (n_features, n_components)
+         The projection matrix used to transform `X`.
+
+     y_rotations_ : ndarray of shape (n_targets, n_components)
+         The projection matrix used to transform `Y`.
+
+     coef_ : ndarray of shape (n_targets, n_features)
+         The coefficients of the linear model such that `Y` is approximated as
+         `Y = X @ coef_.T + intercept_`.
+
+     intercept_ : ndarray of shape (n_targets,)
+         The intercepts of the linear model such that `Y` is approximated as
+         `Y = X @ coef_.T + intercept_`.
+
+         .. versionadded:: 1.1
+
+     n_iter_ : list of shape (n_components,)
+         Number of iterations of the power method, for each
+         component. Empty if `algorithm='svd'`.
+
+     n_features_in_ : int
+         Number of features seen during :term:`fit`.
+
+     feature_names_in_ : ndarray of shape (`n_features_in_`,)
+         Names of features seen during :term:`fit`. Defined only when `X`
+         has feature names that are all strings.
+
+         .. versionadded:: 1.0
+
+     See Also
+     --------
+     CCA : Canonical Correlation Analysis.
+     PLSSVD : Partial Least Square SVD.
+
+     Examples
+     --------
+     >>> from sklearn.cross_decomposition import PLSCanonical
+     >>> X = [[0., 0., 1.], [1.,0.,0.], [2.,2.,2.], [2.,5.,4.]]
+     >>> Y = [[0.1, -0.2], [0.9, 1.1], [6.2, 5.9], [11.9, 12.3]]
+     >>> plsca = PLSCanonical(n_components=2)
+     >>> plsca.fit(X, Y)
+     PLSCanonical()
+     >>> X_c, Y_c = plsca.transform(X, Y)
+     """
+
+     _parameter_constraints: dict = {**_PLS._parameter_constraints}
+     for param in ("deflation_mode", "mode"):
+         _parameter_constraints.pop(param)
+
+     # This implementation provides the same results that the "plspm" package
+     # provided in the R language (R-project), using the function plsca(X, Y).
+     # Results are equal or collinear with the function
+     # ``pls(..., mode = "canonical")`` of the "mixOmics" package. The
+     # difference relies in the fact that mixOmics implementation does not
+     # exactly implement the Wold algorithm since it does not normalize
+     # y_weights to one.
+
+     def __init__(
+         self,
+         n_components=2,
+         *,
+         scale=True,
+         algorithm="nipals",
+         max_iter=500,
+         tol=1e-06,
+         copy=True,
+     ):
+         super().__init__(
+             n_components=n_components,
+             scale=scale,
+             deflation_mode="canonical",
+             mode="A",
+             algorithm=algorithm,
+             max_iter=max_iter,
+             tol=tol,
+             copy=copy,
+         )
+
+
+ class CCA(_PLS):
+     """Canonical Correlation Analysis, also known as "Mode B" PLS.
+
+     For a comparison between other cross decomposition algorithms, see
+     :ref:`sphx_glr_auto_examples_cross_decomposition_plot_compare_cross_decomposition.py`.
+
+     Read more in the :ref:`User Guide <cross_decomposition>`.
+
+     Parameters
+     ----------
+     n_components : int, default=2
+         Number of components to keep. Should be in `[1, min(n_samples,
+         n_features, n_targets)]`.
+
+     scale : bool, default=True
+         Whether to scale `X` and `Y`.
+
+     max_iter : int, default=500
+         The maximum number of iterations of the power method.
+
+     tol : float, default=1e-06
+         The tolerance used as convergence criteria in the power method: the
+         algorithm stops whenever the squared norm of `u_i - u_{i-1}` is less
+         than `tol`, where `u` corresponds to the left singular vector.
+
+     copy : bool, default=True
+         Whether to copy `X` and `Y` in fit before applying centering, and
+         potentially scaling. If False, these operations will be done inplace,
+         modifying both arrays.
+
+     Attributes
+     ----------
+     x_weights_ : ndarray of shape (n_features, n_components)
+         The left singular vectors of the cross-covariance matrices of each
+         iteration.
+
+     y_weights_ : ndarray of shape (n_targets, n_components)
+         The right singular vectors of the cross-covariance matrices of each
+         iteration.
+
+     x_loadings_ : ndarray of shape (n_features, n_components)
+         The loadings of `X`.
+
+     y_loadings_ : ndarray of shape (n_targets, n_components)
+         The loadings of `Y`.
+
+     x_rotations_ : ndarray of shape (n_features, n_components)
+         The projection matrix used to transform `X`.
+
+     y_rotations_ : ndarray of shape (n_targets, n_components)
+         The projection matrix used to transform `Y`.
+
+     coef_ : ndarray of shape (n_targets, n_features)
+         The coefficients of the linear model such that `Y` is approximated as
+         `Y = X @ coef_.T + intercept_`.
+
+     intercept_ : ndarray of shape (n_targets,)
+         The intercepts of the linear model such that `Y` is approximated as
+         `Y = X @ coef_.T + intercept_`.
+
+         .. versionadded:: 1.1
+
+     n_iter_ : list of shape (n_components,)
+         Number of iterations of the power method, for each
+         component.
+
+     n_features_in_ : int
+         Number of features seen during :term:`fit`.
+
+     feature_names_in_ : ndarray of shape (`n_features_in_`,)
+         Names of features seen during :term:`fit`. Defined only when `X`
+         has feature names that are all strings.
+
+         .. versionadded:: 1.0
+
+     See Also
+     --------
+     PLSCanonical : Partial Least Squares transformer and regressor.
+     PLSSVD : Partial Least Square SVD.
+
+     Examples
+     --------
+     >>> from sklearn.cross_decomposition import CCA
+     >>> X = [[0., 0., 1.], [1.,0.,0.], [2.,2.,2.], [3.,5.,4.]]
+     >>> Y = [[0.1, -0.2], [0.9, 1.1], [6.2, 5.9], [11.9, 12.3]]
+     >>> cca = CCA(n_components=1)
+     >>> cca.fit(X, Y)
+     CCA(n_components=1)
+     >>> X_c, Y_c = cca.transform(X, Y)
+     """
+
+     _parameter_constraints: dict = {**_PLS._parameter_constraints}
+     for param in ("deflation_mode", "mode", "algorithm"):
+         _parameter_constraints.pop(param)
+
+     def __init__(
+         self, n_components=2, *, scale=True, max_iter=500, tol=1e-06, copy=True
+     ):
+         super().__init__(
+             n_components=n_components,
+             scale=scale,
+             deflation_mode="canonical",
+             mode="B",
+             algorithm="nipals",
+             max_iter=max_iter,
+             tol=tol,
+             copy=copy,
+         )
+
+
+ class PLSSVD(ClassNamePrefixFeaturesOutMixin, TransformerMixin, BaseEstimator):
+     """Partial Least Square SVD.
+
+     This transformer simply performs a SVD on the cross-covariance matrix
+     `X'Y`. It is able to project both the training data `X` and the targets
+     `Y`. The training data `X` is projected on the left singular vectors, while
+     the targets are projected on the right singular vectors.
+
+     Read more in the :ref:`User Guide <cross_decomposition>`.
+
+     .. versionadded:: 0.8
+
+     Parameters
+     ----------
+     n_components : int, default=2
+         The number of components to keep. Should be in `[1,
+         min(n_samples, n_features, n_targets)]`.
+
+     scale : bool, default=True
+         Whether to scale `X` and `Y`.
+
+     copy : bool, default=True
+         Whether to copy `X` and `Y` in fit before applying centering, and
+         potentially scaling. If `False`, these operations will be done inplace,
+         modifying both arrays.
+
+     Attributes
+     ----------
+     x_weights_ : ndarray of shape (n_features, n_components)
+         The left singular vectors of the SVD of the cross-covariance matrix.
+         Used to project `X` in :meth:`transform`.
+
+     y_weights_ : ndarray of (n_targets, n_components)
+         The right singular vectors of the SVD of the cross-covariance matrix.
+         Used to project `X` in :meth:`transform`.
+
+     n_features_in_ : int
+         Number of features seen during :term:`fit`.
+
+     feature_names_in_ : ndarray of shape (`n_features_in_`,)
+         Names of features seen during :term:`fit`. Defined only when `X`
+         has feature names that are all strings.
+
+         .. versionadded:: 1.0
+
+     See Also
+     --------
+     PLSCanonical : Partial Least Squares transformer and regressor.
+     CCA : Canonical Correlation Analysis.
+
+     Examples
+     --------
+     >>> import numpy as np
+     >>> from sklearn.cross_decomposition import PLSSVD
+     >>> X = np.array([[0., 0., 1.],
+     ...               [1., 0., 0.],
+     ...               [2., 2., 2.],
+     ...               [2., 5., 4.]])
+     >>> Y = np.array([[0.1, -0.2],
+     ...               [0.9, 1.1],
+     ...               [6.2, 5.9],
+     ...               [11.9, 12.3]])
+     >>> pls = PLSSVD(n_components=2).fit(X, Y)
+     >>> X_c, Y_c = pls.transform(X, Y)
+     >>> X_c.shape, Y_c.shape
+     ((4, 2), (4, 2))
+     """
+
+     _parameter_constraints: dict = {
+         "n_components": [Interval(Integral, 1, None, closed="left")],
+         "scale": ["boolean"],
+         "copy": ["boolean"],
+     }
+
+     def __init__(self, n_components=2, *, scale=True, copy=True):
+         self.n_components = n_components
+         self.scale = scale
+         self.copy = copy
+
+     @_fit_context(prefer_skip_nested_validation=True)
+     def fit(self, X, Y):
+         """Fit model to data.
+
+         Parameters
+         ----------
+         X : array-like of shape (n_samples, n_features)
+             Training samples.
+
+         Y : array-like of shape (n_samples,) or (n_samples, n_targets)
+             Targets.
+
+         Returns
+         -------
+         self : object
+             Fitted estimator.
+         """
+         check_consistent_length(X, Y)
+         X = self._validate_data(
+             X, dtype=np.float64, copy=self.copy, ensure_min_samples=2
+         )
+         Y = check_array(
+             Y, input_name="Y", dtype=np.float64, copy=self.copy, ensure_2d=False
+         )
+         if Y.ndim == 1:
+             Y = Y.reshape(-1, 1)
+
+         # we'll compute the SVD of the cross-covariance matrix = X.T.dot(Y)
+         # This matrix rank is at most min(n_samples, n_features, n_targets) so
+         # n_components cannot be bigger than that.
+         n_components = self.n_components
+         rank_upper_bound = min(X.shape[0], X.shape[1], Y.shape[1])
+         if n_components > rank_upper_bound:
+             raise ValueError(
+                 f"`n_components` upper bound is {rank_upper_bound}. "
+                 f"Got {n_components} instead. Reduce `n_components`."
+             )
+
+         X, Y, self._x_mean, self._y_mean, self._x_std, self._y_std = _center_scale_xy(
+             X, Y, self.scale
+         )
+
+         # Compute SVD of cross-covariance matrix
+         C = np.dot(X.T, Y)
+         U, s, Vt = svd(C, full_matrices=False)
+         U = U[:, :n_components]
+         Vt = Vt[:n_components]
+         U, Vt = svd_flip(U, Vt)
+         V = Vt.T
+
+         self.x_weights_ = U
+         self.y_weights_ = V
+         self._n_features_out = self.x_weights_.shape[1]
+         return self
+
+     def transform(self, X, Y=None):
+         """
+         Apply the dimensionality reduction.
+
+         Parameters
+         ----------
+         X : array-like of shape (n_samples, n_features)
+             Samples to be transformed.
+
+         Y : array-like of shape (n_samples,) or (n_samples, n_targets), \
+                 default=None
+             Targets.
+
+         Returns
+         -------
+         x_scores : array-like or tuple of array-like
+             The transformed data `X_transformed` if `Y is not None`,
+             `(X_transformed, Y_transformed)` otherwise.
+         """
+         check_is_fitted(self)
+         X = self._validate_data(X, dtype=np.float64, reset=False)
+         Xr = (X - self._x_mean) / self._x_std
+         x_scores = np.dot(Xr, self.x_weights_)
+         if Y is not None:
+             Y = check_array(Y, input_name="Y", ensure_2d=False, dtype=np.float64)
+             if Y.ndim == 1:
+                 Y = Y.reshape(-1, 1)
+             Yr = (Y - self._y_mean) / self._y_std
+             y_scores = np.dot(Yr, self.y_weights_)
+             return x_scores, y_scores
+         return x_scores
+
+     def fit_transform(self, X, y=None):
+         """Learn and apply the dimensionality reduction.
+
+         Parameters
+         ----------
+         X : array-like of shape (n_samples, n_features)
+             Training samples.
+
+         y : array-like of shape (n_samples,) or (n_samples, n_targets), \
+                 default=None
+             Targets.
+
+         Returns
+         -------
+         out : array-like or tuple of array-like
+             The transformed data `X_transformed` if `Y is not None`,
+             `(X_transformed, Y_transformed)` otherwise.
+         """
+         return self.fit(X, y).transform(X, y)
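
The `_pls.py` diff above implements the NIPALS loop (`_get_first_singular_vectors_power_method` plus deflation inside `_PLS.fit`) behind the public estimators. A short sketch of the fit/transform/predict cycle on the Linnerud data also used by `test_pls.py` below, assuming a standard scikit-learn installation; it exercises only APIs shown in the diff:

import numpy as np
from sklearn.cross_decomposition import PLSCanonical, PLSRegression
from sklearn.datasets import load_linnerud

X, Y = load_linnerud(return_X_y=True)

pls = PLSRegression(n_components=2).fit(X, Y)
# coef_ has shape (n_targets, n_features); predict computes X @ coef_.T + intercept_
print(pls.coef_.shape, pls.predict(X).shape)

plsca = PLSCanonical(n_components=2).fit(X, Y)
X_scores, Y_scores = plsca.transform(X, Y)  # project both blocks onto their scores
X_back = plsca.inverse_transform(X_scores)  # rank-n_components reconstruction
print(np.abs(X_back - X).max())             # exact only when n_components == n_features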
venv/lib/python3.10/site-packages/sklearn/cross_decomposition/tests/__init__.py ADDED
File without changes
venv/lib/python3.10/site-packages/sklearn/cross_decomposition/tests/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (201 Bytes)
venv/lib/python3.10/site-packages/sklearn/cross_decomposition/tests/__pycache__/test_pls.cpython-310.pyc ADDED
Binary file (16.5 kB)
venv/lib/python3.10/site-packages/sklearn/cross_decomposition/tests/test_pls.py ADDED
@@ -0,0 +1,646 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
+import warnings
+
+import numpy as np
+import pytest
+from numpy.testing import assert_allclose, assert_array_almost_equal, assert_array_equal
+
+from sklearn.cross_decomposition import CCA, PLSSVD, PLSCanonical, PLSRegression
+from sklearn.cross_decomposition._pls import (
+    _center_scale_xy,
+    _get_first_singular_vectors_power_method,
+    _get_first_singular_vectors_svd,
+    _svd_flip_1d,
+)
+from sklearn.datasets import load_linnerud, make_regression
+from sklearn.ensemble import VotingRegressor
+from sklearn.exceptions import ConvergenceWarning
+from sklearn.linear_model import LinearRegression
+from sklearn.utils import check_random_state
+from sklearn.utils.extmath import svd_flip
+
+
+def assert_matrix_orthogonal(M):
+    K = np.dot(M.T, M)
+    assert_array_almost_equal(K, np.diag(np.diag(K)))
+
+
+def test_pls_canonical_basics():
+    # Basic checks for PLSCanonical
+    d = load_linnerud()
+    X = d.data
+    Y = d.target
+
+    pls = PLSCanonical(n_components=X.shape[1])
+    pls.fit(X, Y)
+
+    assert_matrix_orthogonal(pls.x_weights_)
+    assert_matrix_orthogonal(pls.y_weights_)
+    assert_matrix_orthogonal(pls._x_scores)
+    assert_matrix_orthogonal(pls._y_scores)
+
+    # Check X = TP' and Y = UQ'
+    T = pls._x_scores
+    P = pls.x_loadings_
+    U = pls._y_scores
+    Q = pls.y_loadings_
+    # Need to scale first
+    Xc, Yc, x_mean, y_mean, x_std, y_std = _center_scale_xy(
+        X.copy(), Y.copy(), scale=True
+    )
+    assert_array_almost_equal(Xc, np.dot(T, P.T))
+    assert_array_almost_equal(Yc, np.dot(U, Q.T))
+
+    # Check that rotations on training data lead to scores
+    Xt = pls.transform(X)
+    assert_array_almost_equal(Xt, pls._x_scores)
+    Xt, Yt = pls.transform(X, Y)
+    assert_array_almost_equal(Xt, pls._x_scores)
+    assert_array_almost_equal(Yt, pls._y_scores)
+
+    # Check that inverse_transform works
+    X_back = pls.inverse_transform(Xt)
+    assert_array_almost_equal(X_back, X)
+    _, Y_back = pls.inverse_transform(Xt, Yt)
+    assert_array_almost_equal(Y_back, Y)
+
+
+def test_sanity_check_pls_regression():
+    # Sanity check for PLSRegression
+    # The results were checked against the R-packages plspm, misOmics and pls
+
+    d = load_linnerud()
+    X = d.data
+    Y = d.target
+
+    pls = PLSRegression(n_components=X.shape[1])
+    X_trans, _ = pls.fit_transform(X, Y)
+
+    # FIXME: one would expect y_trans == pls.y_scores_ but this is not
+    # the case.
+    # xref: https://github.com/scikit-learn/scikit-learn/issues/22420
+    assert_allclose(X_trans, pls.x_scores_)
+
+    expected_x_weights = np.array(
+        [
+            [-0.61330704, -0.00443647, 0.78983213],
+            [-0.74697144, -0.32172099, -0.58183269],
+            [-0.25668686, 0.94682413, -0.19399983],
+        ]
+    )
+
+    expected_x_loadings = np.array(
+        [
+            [-0.61470416, -0.24574278, 0.78983213],
+            [-0.65625755, -0.14396183, -0.58183269],
+            [-0.51733059, 1.00609417, -0.19399983],
+        ]
+    )
+
+    expected_y_weights = np.array(
+        [
+            [+0.32456184, 0.29892183, 0.20316322],
+            [+0.42439636, 0.61970543, 0.19320542],
+            [-0.13143144, -0.26348971, -0.17092916],
+        ]
+    )
+
+    expected_y_loadings = np.array(
+        [
+            [+0.32456184, 0.29892183, 0.20316322],
+            [+0.42439636, 0.61970543, 0.19320542],
+            [-0.13143144, -0.26348971, -0.17092916],
+        ]
+    )
+
+    assert_array_almost_equal(np.abs(pls.x_loadings_), np.abs(expected_x_loadings))
+    assert_array_almost_equal(np.abs(pls.x_weights_), np.abs(expected_x_weights))
+    assert_array_almost_equal(np.abs(pls.y_loadings_), np.abs(expected_y_loadings))
+    assert_array_almost_equal(np.abs(pls.y_weights_), np.abs(expected_y_weights))
+
+    # The R / Python difference in the signs should be consistent across
+    # loadings, weights, etc.
+    x_loadings_sign_flip = np.sign(pls.x_loadings_ / expected_x_loadings)
+    x_weights_sign_flip = np.sign(pls.x_weights_ / expected_x_weights)
+    y_weights_sign_flip = np.sign(pls.y_weights_ / expected_y_weights)
+    y_loadings_sign_flip = np.sign(pls.y_loadings_ / expected_y_loadings)
+    assert_array_almost_equal(x_loadings_sign_flip, x_weights_sign_flip)
+    assert_array_almost_equal(y_loadings_sign_flip, y_weights_sign_flip)
+
+
+def test_sanity_check_pls_regression_constant_column_Y():
+    # Check behavior when the first column of Y is constant
+    # The results are checked against a modified version of plsreg2
+    # from the R-package plsdepot
+    d = load_linnerud()
+    X = d.data
+    Y = d.target
+    Y[:, 0] = 1
+    pls = PLSRegression(n_components=X.shape[1])
+    pls.fit(X, Y)
+
+    expected_x_weights = np.array(
+        [
+            [-0.6273573, 0.007081799, 0.7786994],
+            [-0.7493417, -0.277612681, -0.6011807],
+            [-0.2119194, 0.960666981, -0.1794690],
+        ]
+    )
+
+    expected_x_loadings = np.array(
+        [
+            [-0.6273512, -0.22464538, 0.7786994],
+            [-0.6643156, -0.09871193, -0.6011807],
+            [-0.5125877, 1.01407380, -0.1794690],
+        ]
+    )
+
+    expected_y_loadings = np.array(
+        [
+            [0.0000000, 0.0000000, 0.0000000],
+            [0.4357300, 0.5828479, 0.2174802],
+            [-0.1353739, -0.2486423, -0.1810386],
+        ]
+    )
+
+    assert_array_almost_equal(np.abs(expected_x_weights), np.abs(pls.x_weights_))
+    assert_array_almost_equal(np.abs(expected_x_loadings), np.abs(pls.x_loadings_))
+    # For the PLSRegression with default parameters, y_loadings == y_weights
+    assert_array_almost_equal(np.abs(pls.y_loadings_), np.abs(expected_y_loadings))
+    assert_array_almost_equal(np.abs(pls.y_weights_), np.abs(expected_y_loadings))
+
+    x_loadings_sign_flip = np.sign(expected_x_loadings / pls.x_loadings_)
+    x_weights_sign_flip = np.sign(expected_x_weights / pls.x_weights_)
+    # we ignore the first full-zeros row for y
+    y_loadings_sign_flip = np.sign(expected_y_loadings[1:] / pls.y_loadings_[1:])
+
+    assert_array_equal(x_loadings_sign_flip, x_weights_sign_flip)
+    assert_array_equal(x_loadings_sign_flip[1:], y_loadings_sign_flip)
+
+
+def test_sanity_check_pls_canonical():
+    # Sanity check for PLSCanonical
+    # The results were checked against the R-package plspm
+
+    d = load_linnerud()
+    X = d.data
+    Y = d.target
+
+    pls = PLSCanonical(n_components=X.shape[1])
+    pls.fit(X, Y)
+
+    expected_x_weights = np.array(
+        [
+            [-0.61330704, 0.25616119, -0.74715187],
+            [-0.74697144, 0.11930791, 0.65406368],
+            [-0.25668686, -0.95924297, -0.11817271],
+        ]
+    )
+
+    expected_x_rotations = np.array(
+        [
+            [-0.61330704, 0.41591889, -0.62297525],
+            [-0.74697144, 0.31388326, 0.77368233],
+            [-0.25668686, -0.89237972, -0.24121788],
+        ]
+    )
+
+    expected_y_weights = np.array(
+        [
+            [+0.58989127, 0.7890047, 0.1717553],
+            [+0.77134053, -0.61351791, 0.16920272],
+            [-0.23887670, -0.03267062, 0.97050016],
+        ]
+    )
+
+    expected_y_rotations = np.array(
+        [
+            [+0.58989127, 0.7168115, 0.30665872],
+            [+0.77134053, -0.70791757, 0.19786539],
+            [-0.23887670, -0.00343595, 0.94162826],
+        ]
+    )
+
+    assert_array_almost_equal(np.abs(pls.x_rotations_), np.abs(expected_x_rotations))
+    assert_array_almost_equal(np.abs(pls.x_weights_), np.abs(expected_x_weights))
+    assert_array_almost_equal(np.abs(pls.y_rotations_), np.abs(expected_y_rotations))
+    assert_array_almost_equal(np.abs(pls.y_weights_), np.abs(expected_y_weights))
+
+    x_rotations_sign_flip = np.sign(pls.x_rotations_ / expected_x_rotations)
+    x_weights_sign_flip = np.sign(pls.x_weights_ / expected_x_weights)
+    y_rotations_sign_flip = np.sign(pls.y_rotations_ / expected_y_rotations)
+    y_weights_sign_flip = np.sign(pls.y_weights_ / expected_y_weights)
+    assert_array_almost_equal(x_rotations_sign_flip, x_weights_sign_flip)
+    assert_array_almost_equal(y_rotations_sign_flip, y_weights_sign_flip)
+
+    assert_matrix_orthogonal(pls.x_weights_)
+    assert_matrix_orthogonal(pls.y_weights_)
+
+    assert_matrix_orthogonal(pls._x_scores)
+    assert_matrix_orthogonal(pls._y_scores)
+
+
+def test_sanity_check_pls_canonical_random():
+    # Sanity check for PLSCanonical on random data
+    # The results were checked against the R-package plspm
+    n = 500
+    p_noise = 10
+    q_noise = 5
+    # 2 latents vars:
+    rng = check_random_state(11)
+    l1 = rng.normal(size=n)
+    l2 = rng.normal(size=n)
+    latents = np.array([l1, l1, l2, l2]).T
+    X = latents + rng.normal(size=4 * n).reshape((n, 4))
+    Y = latents + rng.normal(size=4 * n).reshape((n, 4))
+    X = np.concatenate((X, rng.normal(size=p_noise * n).reshape(n, p_noise)), axis=1)
+    Y = np.concatenate((Y, rng.normal(size=q_noise * n).reshape(n, q_noise)), axis=1)
+
+    pls = PLSCanonical(n_components=3)
+    pls.fit(X, Y)
+
+    expected_x_weights = np.array(
+        [
+            [0.65803719, 0.19197924, 0.21769083],
+            [0.7009113, 0.13303969, -0.15376699],
+            [0.13528197, -0.68636408, 0.13856546],
+            [0.16854574, -0.66788088, -0.12485304],
+            [-0.03232333, -0.04189855, 0.40690153],
+            [0.1148816, -0.09643158, 0.1613305],
+            [0.04792138, -0.02384992, 0.17175319],
+            [-0.06781, -0.01666137, -0.18556747],
+            [-0.00266945, -0.00160224, 0.11893098],
+            [-0.00849528, -0.07706095, 0.1570547],
+            [-0.00949471, -0.02964127, 0.34657036],
+            [-0.03572177, 0.0945091, 0.3414855],
+            [0.05584937, -0.02028961, -0.57682568],
+            [0.05744254, -0.01482333, -0.17431274],
+        ]
+    )
+
+    expected_x_loadings = np.array(
+        [
+            [0.65649254, 0.1847647, 0.15270699],
+            [0.67554234, 0.15237508, -0.09182247],
+            [0.19219925, -0.67750975, 0.08673128],
+            [0.2133631, -0.67034809, -0.08835483],
+            [-0.03178912, -0.06668336, 0.43395268],
+            [0.15684588, -0.13350241, 0.20578984],
+            [0.03337736, -0.03807306, 0.09871553],
+            [-0.06199844, 0.01559854, -0.1881785],
+            [0.00406146, -0.00587025, 0.16413253],
+            [-0.00374239, -0.05848466, 0.19140336],
+            [0.00139214, -0.01033161, 0.32239136],
+            [-0.05292828, 0.0953533, 0.31916881],
+            [0.04031924, -0.01961045, -0.65174036],
+            [0.06172484, -0.06597366, -0.1244497],
+        ]
+    )
+
+    expected_y_weights = np.array(
+        [
+            [0.66101097, 0.18672553, 0.22826092],
+            [0.69347861, 0.18463471, -0.23995597],
+            [0.14462724, -0.66504085, 0.17082434],
+            [0.22247955, -0.6932605, -0.09832993],
+            [0.07035859, 0.00714283, 0.67810124],
+            [0.07765351, -0.0105204, -0.44108074],
+            [-0.00917056, 0.04322147, 0.10062478],
+            [-0.01909512, 0.06182718, 0.28830475],
+            [0.01756709, 0.04797666, 0.32225745],
+        ]
+    )
+
+    expected_y_loadings = np.array(
+        [
+            [0.68568625, 0.1674376, 0.0969508],
+            [0.68782064, 0.20375837, -0.1164448],
+            [0.11712173, -0.68046903, 0.12001505],
+            [0.17860457, -0.6798319, -0.05089681],
+            [0.06265739, -0.0277703, 0.74729584],
+            [0.0914178, 0.00403751, -0.5135078],
+            [-0.02196918, -0.01377169, 0.09564505],
+            [-0.03288952, 0.09039729, 0.31858973],
+            [0.04287624, 0.05254676, 0.27836841],
+        ]
+    )
+
+    assert_array_almost_equal(np.abs(pls.x_loadings_), np.abs(expected_x_loadings))
+    assert_array_almost_equal(np.abs(pls.x_weights_), np.abs(expected_x_weights))
+    assert_array_almost_equal(np.abs(pls.y_loadings_), np.abs(expected_y_loadings))
+    assert_array_almost_equal(np.abs(pls.y_weights_), np.abs(expected_y_weights))
+
+    x_loadings_sign_flip = np.sign(pls.x_loadings_ / expected_x_loadings)
+    x_weights_sign_flip = np.sign(pls.x_weights_ / expected_x_weights)
+    y_weights_sign_flip = np.sign(pls.y_weights_ / expected_y_weights)
+    y_loadings_sign_flip = np.sign(pls.y_loadings_ / expected_y_loadings)
+    assert_array_almost_equal(x_loadings_sign_flip, x_weights_sign_flip)
+    assert_array_almost_equal(y_loadings_sign_flip, y_weights_sign_flip)
+
+    assert_matrix_orthogonal(pls.x_weights_)
+    assert_matrix_orthogonal(pls.y_weights_)
+
+    assert_matrix_orthogonal(pls._x_scores)
+    assert_matrix_orthogonal(pls._y_scores)
+
+
+def test_convergence_fail():
+    # Make sure ConvergenceWarning is raised if max_iter is too small
+    d = load_linnerud()
+    X = d.data
+    Y = d.target
+    pls_nipals = PLSCanonical(n_components=X.shape[1], max_iter=2)
+    with pytest.warns(ConvergenceWarning):
+        pls_nipals.fit(X, Y)
+
+
+@pytest.mark.parametrize("Est", (PLSSVD, PLSRegression, PLSCanonical))
+def test_attibutes_shapes(Est):
+    # Make sure attributes are of the correct shape depending on n_components
+    d = load_linnerud()
+    X = d.data
+    Y = d.target
+    n_components = 2
+    pls = Est(n_components=n_components)
+    pls.fit(X, Y)
+    assert all(
+        attr.shape[1] == n_components for attr in (pls.x_weights_, pls.y_weights_)
+    )
+
+
+@pytest.mark.parametrize("Est", (PLSRegression, PLSCanonical, CCA))
+def test_univariate_equivalence(Est):
+    # Ensure 2D Y with 1 column is equivalent to 1D Y
+    d = load_linnerud()
+    X = d.data
+    Y = d.target
+
+    est = Est(n_components=1)
+    one_d_coeff = est.fit(X, Y[:, 0]).coef_
+    two_d_coeff = est.fit(X, Y[:, :1]).coef_
+
+    assert one_d_coeff.shape == two_d_coeff.shape
+    assert_array_almost_equal(one_d_coeff, two_d_coeff)
+
+
+@pytest.mark.parametrize("Est", (PLSRegression, PLSCanonical, CCA, PLSSVD))
+def test_copy(Est):
+    # check that the "copy" keyword works
+    d = load_linnerud()
+    X = d.data
+    Y = d.target
+    X_orig = X.copy()
+
+    # copy=True won't modify inplace
+    pls = Est(copy=True).fit(X, Y)
+    assert_array_equal(X, X_orig)
+
+    # copy=False will modify inplace
+    with pytest.raises(AssertionError):
+        Est(copy=False).fit(X, Y)
+        assert_array_almost_equal(X, X_orig)
+
+    if Est is PLSSVD:
+        return  # PLSSVD does not support copy param in predict or transform
+
+    X_orig = X.copy()
+    with pytest.raises(AssertionError):
+        pls.transform(X, Y, copy=False),
+        assert_array_almost_equal(X, X_orig)
+
+    X_orig = X.copy()
+    with pytest.raises(AssertionError):
+        pls.predict(X, copy=False),
+        assert_array_almost_equal(X, X_orig)
+
+    # Make sure copy=True gives same transform and predictions as predict=False
+    assert_array_almost_equal(
+        pls.transform(X, Y, copy=True), pls.transform(X.copy(), Y.copy(), copy=False)
+    )
+    assert_array_almost_equal(
+        pls.predict(X, copy=True), pls.predict(X.copy(), copy=False)
+    )
+
+
+def _generate_test_scale_and_stability_datasets():
+    """Generate dataset for test_scale_and_stability"""
+    # dataset for non-regression 7818
+    rng = np.random.RandomState(0)
+    n_samples = 1000
+    n_targets = 5
+    n_features = 10
+    Q = rng.randn(n_targets, n_features)
+    Y = rng.randn(n_samples, n_targets)
+    X = np.dot(Y, Q) + 2 * rng.randn(n_samples, n_features) + 1
+    X *= 1000
+    yield X, Y
+
+    # Data set where one of the features is constraint
+    X, Y = load_linnerud(return_X_y=True)
+    # causes X[:, -1].std() to be zero
+    X[:, -1] = 1.0
+    yield X, Y
+
+    X = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0], [2.0, 2.0, 2.0], [3.0, 5.0, 4.0]])
+    Y = np.array([[0.1, -0.2], [0.9, 1.1], [6.2, 5.9], [11.9, 12.3]])
+    yield X, Y
+
+    # Seeds that provide a non-regression test for #18746, where CCA fails
+    seeds = [530, 741]
+    for seed in seeds:
+        rng = np.random.RandomState(seed)
+        X = rng.randn(4, 3)
+        Y = rng.randn(4, 2)
+        yield X, Y
+
+
+@pytest.mark.parametrize("Est", (CCA, PLSCanonical, PLSRegression, PLSSVD))
+@pytest.mark.parametrize("X, Y", _generate_test_scale_and_stability_datasets())
+def test_scale_and_stability(Est, X, Y):
+    """scale=True is equivalent to scale=False on centered/scaled data
+    This allows to check numerical stability over platforms as well"""
+
+    X_s, Y_s, *_ = _center_scale_xy(X, Y)
+
+    X_score, Y_score = Est(scale=True).fit_transform(X, Y)
+    X_s_score, Y_s_score = Est(scale=False).fit_transform(X_s, Y_s)
+
+    assert_allclose(X_s_score, X_score, atol=1e-4)
+    assert_allclose(Y_s_score, Y_score, atol=1e-4)
+
+
+@pytest.mark.parametrize("Estimator", (PLSSVD, PLSRegression, PLSCanonical, CCA))
+def test_n_components_upper_bounds(Estimator):
+    """Check the validation of `n_components` upper bounds for `PLS` regressors."""
+    rng = np.random.RandomState(0)
+    X = rng.randn(10, 5)
+    Y = rng.randn(10, 3)
+    est = Estimator(n_components=10)
+    err_msg = "`n_components` upper bound is .*. Got 10 instead. Reduce `n_components`."
+    with pytest.raises(ValueError, match=err_msg):
+        est.fit(X, Y)
+
+
+@pytest.mark.parametrize("n_samples, n_features", [(100, 10), (100, 200)])
+def test_singular_value_helpers(n_samples, n_features, global_random_seed):
+    # Make sure SVD and power method give approximately the same results
+    X, Y = make_regression(
+        n_samples, n_features, n_targets=5, random_state=global_random_seed
+    )
+    u1, v1, _ = _get_first_singular_vectors_power_method(X, Y, norm_y_weights=True)
+    u2, v2 = _get_first_singular_vectors_svd(X, Y)
+
+    _svd_flip_1d(u1, v1)
+    _svd_flip_1d(u2, v2)
+
+    rtol = 1e-3
+    # Setting atol because some coordinates are very close to zero
+    assert_allclose(u1, u2, atol=u2.max() * rtol)
+    assert_allclose(v1, v2, atol=v2.max() * rtol)
+
+
+def test_one_component_equivalence(global_random_seed):
+    # PLSSVD, PLSRegression and PLSCanonical should all be equivalent when
+    # n_components is 1
+    X, Y = make_regression(100, 10, n_targets=5, random_state=global_random_seed)
+    svd = PLSSVD(n_components=1).fit(X, Y).transform(X)
+    reg = PLSRegression(n_components=1).fit(X, Y).transform(X)
+    canonical = PLSCanonical(n_components=1).fit(X, Y).transform(X)
+
+    rtol = 1e-3
+    # Setting atol because some entries are very close to zero
+    assert_allclose(svd, reg, atol=reg.max() * rtol)
+    assert_allclose(svd, canonical, atol=canonical.max() * rtol)
+
+
+def test_svd_flip_1d():
+    # Make sure svd_flip_1d is equivalent to svd_flip
+    u = np.array([1, -4, 2])
+    v = np.array([1, 2, 3])
+
+    u_expected, v_expected = svd_flip(u.reshape(-1, 1), v.reshape(1, -1))
+    _svd_flip_1d(u, v)  # inplace
+
+    assert_allclose(u, u_expected.ravel())
+    assert_allclose(u, [-1, 4, -2])
+
+    assert_allclose(v, v_expected.ravel())
+    assert_allclose(v, [-1, -2, -3])
+
+
+def test_loadings_converges(global_random_seed):
+    """Test that CCA converges. Non-regression test for #19549."""
+    X, y = make_regression(
+        n_samples=200, n_features=20, n_targets=20, random_state=global_random_seed
+    )
+
+    cca = CCA(n_components=10, max_iter=500)
+
+    with warnings.catch_warnings():
+        warnings.simplefilter("error", ConvergenceWarning)
+
+        cca.fit(X, y)
+
+    # Loadings converges to reasonable values
+    assert np.all(np.abs(cca.x_loadings_) < 1)
+
+
+def test_pls_constant_y():
+    """Checks warning when y is constant. Non-regression test for #19831"""
+    rng = np.random.RandomState(42)
+    x = rng.rand(100, 3)
+    y = np.zeros(100)
+
+    pls = PLSRegression()
+
+    msg = "Y residual is constant at iteration"
+    with pytest.warns(UserWarning, match=msg):
+        pls.fit(x, y)
+
+    assert_allclose(pls.x_rotations_, 0)
+
+
+@pytest.mark.parametrize("PLSEstimator", [PLSRegression, PLSCanonical, CCA])
+def test_pls_coef_shape(PLSEstimator):
+    """Check the shape of `coef_` attribute.
+
+    Non-regression test for:
+    https://github.com/scikit-learn/scikit-learn/issues/12410
+    """
+    d = load_linnerud()
+    X = d.data
+    Y = d.target
+
+    pls = PLSEstimator(copy=True).fit(X, Y)
+
+    n_targets, n_features = Y.shape[1], X.shape[1]
+    assert pls.coef_.shape == (n_targets, n_features)
+
+
+@pytest.mark.parametrize("scale", [True, False])
+@pytest.mark.parametrize("PLSEstimator", [PLSRegression, PLSCanonical, CCA])
+def test_pls_prediction(PLSEstimator, scale):
+    """Check the behaviour of the prediction function."""
+    d = load_linnerud()
+    X = d.data
+    Y = d.target
+
+    pls = PLSEstimator(copy=True, scale=scale).fit(X, Y)
+    Y_pred = pls.predict(X, copy=True)
+
+    y_mean = Y.mean(axis=0)
+    X_trans = X - X.mean(axis=0)
+    if scale:
+        X_trans /= X.std(axis=0, ddof=1)
+
+    assert_allclose(pls.intercept_, y_mean)
+    assert_allclose(Y_pred, X_trans @ pls.coef_.T + pls.intercept_)
+
+
+@pytest.mark.parametrize("Klass", [CCA, PLSSVD, PLSRegression, PLSCanonical])
+def test_pls_feature_names_out(Klass):
+    """Check `get_feature_names_out` cross_decomposition module."""
+    X, Y = load_linnerud(return_X_y=True)
+
+    est = Klass().fit(X, Y)
+    names_out = est.get_feature_names_out()
+
+    class_name_lower = Klass.__name__.lower()
+    expected_names_out = np.array(
+        [f"{class_name_lower}{i}" for i in range(est.x_weights_.shape[1])],
+        dtype=object,
+    )
+    assert_array_equal(names_out, expected_names_out)
+
+
+@pytest.mark.parametrize("Klass", [CCA, PLSSVD, PLSRegression, PLSCanonical])
+def test_pls_set_output(Klass):
+    """Check `set_output` in cross_decomposition module."""
+    pd = pytest.importorskip("pandas")
+    X, Y = load_linnerud(return_X_y=True, as_frame=True)
+
+    est = Klass().set_output(transform="pandas").fit(X, Y)
+    X_trans, y_trans = est.transform(X, Y)
+    assert isinstance(y_trans, np.ndarray)
+    assert isinstance(X_trans, pd.DataFrame)
+    assert_array_equal(X_trans.columns, est.get_feature_names_out())
+
+
+def test_pls_regression_fit_1d_y():
+    """Check that when fitting with 1d `y`, prediction should also be 1d.
+
+    Non-regression test for Issue #26549.
+    """
+    X = np.array([[1, 1], [2, 4], [3, 9], [4, 16], [5, 25], [6, 36]])
+    y = np.array([2, 6, 12, 20, 30, 42])
+    expected = y.copy()
+
+    plsr = PLSRegression().fit(X, y)
+    y_pred = plsr.predict(X)
+    assert y_pred.shape == expected.shape
+
+    # Check that it works in VotingRegressor
+    lr = LinearRegression().fit(X, y)
+    vr = VotingRegressor([("lr", lr), ("plsr", plsr)])
+    y_pred = vr.fit(X, y).predict(X)
+    assert y_pred.shape == expected.shape
+    assert_allclose(y_pred, expected)
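
The sign-flip checks that recur throughout this test file all rely on the same convention for resolving the sign indeterminacy of singular vectors. A minimal sketch mirroring `test_svd_flip_1d` above with the public `svd_flip` helper (expected outputs taken from that test):

import numpy as np
from sklearn.utils.extmath import svd_flip

# Each column of u is flipped so that its largest-absolute-value entry ends up
# positive, and the matching row of v is flipped with it; this makes the
# otherwise arbitrary signs of singular vectors deterministic.
u = np.array([1.0, -4.0, 2.0]).reshape(-1, 1)
v = np.array([1.0, 2.0, 3.0]).reshape(1, -1)
u_flipped, v_flipped = svd_flip(u, v)
print(u_flipped.ravel())  # [-1.  4. -2.], matching the expectations in the test
print(v_flipped.ravel())  # [-1. -2. -3.]
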
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (4.75 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_arff_parser.cpython-310.pyc ADDED
Binary file (14.2 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_base.cpython-310.pyc ADDED
Binary file (42 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_california_housing.cpython-310.pyc ADDED
Binary file (5.94 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_covtype.cpython-310.pyc ADDED
Binary file (6.59 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_kddcup99.cpython-310.pyc ADDED
Binary file (10.9 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_lfw.cpython-310.pyc ADDED
Binary file (15.5 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_olivetti_faces.cpython-310.pyc ADDED
Binary file (4.86 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_openml.cpython-310.pyc ADDED
Binary file (33 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_rcv1.cpython-310.pyc ADDED
Binary file (8.1 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_samples_generator.cpython-310.pyc ADDED
Binary file (60.7 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_species_distributions.cpython-310.pyc ADDED
Binary file (8.55 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_svmlight_format_io.cpython-310.pyc ADDED
Binary file (17.5 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/__pycache__/_twenty_newsgroups.cpython-310.pyc ADDED
Binary file (16.4 kB).
venv/lib/python3.10/site-packages/sklearn/datasets/data/__init__.py ADDED
File without changes
venv/lib/python3.10/site-packages/sklearn/datasets/data/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (189 Bytes).
venv/lib/python3.10/site-packages/sklearn/datasets/data/boston_house_prices.csv ADDED
@@ -0,0 +1,508 @@
+506,13,,,,,,,,,,,,
+"CRIM","ZN","INDUS","CHAS","NOX","RM","AGE","DIS","RAD","TAX","PTRATIO","B","LSTAT","MEDV"
+0.00632,18,2.31,0,0.538,6.575,65.2,4.09,1,296,15.3,396.9,4.98,24
+0.02731,0,7.07,0,0.469,6.421,78.9,4.9671,2,242,17.8,396.9,9.14,21.6
+0.02729,0,7.07,0,0.469,7.185,61.1,4.9671,2,242,17.8,392.83,4.03,34.7
+0.03237,0,2.18,0,0.458,6.998,45.8,6.0622,3,222,18.7,394.63,2.94,33.4
+0.06905,0,2.18,0,0.458,7.147,54.2,6.0622,3,222,18.7,396.9,5.33,36.2
+0.02985,0,2.18,0,0.458,6.43,58.7,6.0622,3,222,18.7,394.12,5.21,28.7
+0.08829,12.5,7.87,0,0.524,6.012,66.6,5.5605,5,311,15.2,395.6,12.43,22.9
+0.14455,12.5,7.87,0,0.524,6.172,96.1,5.9505,5,311,15.2,396.9,19.15,27.1
+0.21124,12.5,7.87,0,0.524,5.631,100,6.0821,5,311,15.2,386.63,29.93,16.5
+0.17004,12.5,7.87,0,0.524,6.004,85.9,6.5921,5,311,15.2,386.71,17.1,18.9
+0.22489,12.5,7.87,0,0.524,6.377,94.3,6.3467,5,311,15.2,392.52,20.45,15
+0.11747,12.5,7.87,0,0.524,6.009,82.9,6.2267,5,311,15.2,396.9,13.27,18.9
+0.09378,12.5,7.87,0,0.524,5.889,39,5.4509,5,311,15.2,390.5,15.71,21.7
+0.62976,0,8.14,0,0.538,5.949,61.8,4.7075,4,307,21,396.9,8.26,20.4
+0.63796,0,8.14,0,0.538,6.096,84.5,4.4619,4,307,21,380.02,10.26,18.2
+0.62739,0,8.14,0,0.538,5.834,56.5,4.4986,4,307,21,395.62,8.47,19.9
+1.05393,0,8.14,0,0.538,5.935,29.3,4.4986,4,307,21,386.85,6.58,23.1
+0.7842,0,8.14,0,0.538,5.99,81.7,4.2579,4,307,21,386.75,14.67,17.5
+0.80271,0,8.14,0,0.538,5.456,36.6,3.7965,4,307,21,288.99,11.69,20.2
+0.7258,0,8.14,0,0.538,5.727,69.5,3.7965,4,307,21,390.95,11.28,18.2
+1.25179,0,8.14,0,0.538,5.57,98.1,3.7979,4,307,21,376.57,21.02,13.6
+0.85204,0,8.14,0,0.538,5.965,89.2,4.0123,4,307,21,392.53,13.83,19.6
+1.23247,0,8.14,0,0.538,6.142,91.7,3.9769,4,307,21,396.9,18.72,15.2
+0.98843,0,8.14,0,0.538,5.813,100,4.0952,4,307,21,394.54,19.88,14.5
+0.75026,0,8.14,0,0.538,5.924,94.1,4.3996,4,307,21,394.33,16.3,15.6
+0.84054,0,8.14,0,0.538,5.599,85.7,4.4546,4,307,21,303.42,16.51,13.9
+0.67191,0,8.14,0,0.538,5.813,90.3,4.682,4,307,21,376.88,14.81,16.6
+0.95577,0,8.14,0,0.538,6.047,88.8,4.4534,4,307,21,306.38,17.28,14.8
+0.77299,0,8.14,0,0.538,6.495,94.4,4.4547,4,307,21,387.94,12.8,18.4
+1.00245,0,8.14,0,0.538,6.674,87.3,4.239,4,307,21,380.23,11.98,21
+1.13081,0,8.14,0,0.538,5.713,94.1,4.233,4,307,21,360.17,22.6,12.7
+1.35472,0,8.14,0,0.538,6.072,100,4.175,4,307,21,376.73,13.04,14.5
+1.38799,0,8.14,0,0.538,5.95,82,3.99,4,307,21,232.6,27.71,13.2
+1.15172,0,8.14,0,0.538,5.701,95,3.7872,4,307,21,358.77,18.35,13.1
+1.61282,0,8.14,0,0.538,6.096,96.9,3.7598,4,307,21,248.31,20.34,13.5
+0.06417,0,5.96,0,0.499,5.933,68.2,3.3603,5,279,19.2,396.9,9.68,18.9
+0.09744,0,5.96,0,0.499,5.841,61.4,3.3779,5,279,19.2,377.56,11.41,20
+0.08014,0,5.96,0,0.499,5.85,41.5,3.9342,5,279,19.2,396.9,8.77,21
+0.17505,0,5.96,0,0.499,5.966,30.2,3.8473,5,279,19.2,393.43,10.13,24.7
+0.02763,75,2.95,0,0.428,6.595,21.8,5.4011,3,252,18.3,395.63,4.32,30.8
+0.03359,75,2.95,0,0.428,7.024,15.8,5.4011,3,252,18.3,395.62,1.98,34.9
+0.12744,0,6.91,0,0.448,6.77,2.9,5.7209,3,233,17.9,385.41,4.84,26.6
+0.1415,0,6.91,0,0.448,6.169,6.6,5.7209,3,233,17.9,383.37,5.81,25.3
+0.15936,0,6.91,0,0.448,6.211,6.5,5.7209,3,233,17.9,394.46,7.44,24.7
+0.12269,0,6.91,0,0.448,6.069,40,5.7209,3,233,17.9,389.39,9.55,21.2
+0.17142,0,6.91,0,0.448,5.682,33.8,5.1004,3,233,17.9,396.9,10.21,19.3
+0.18836,0,6.91,0,0.448,5.786,33.3,5.1004,3,233,17.9,396.9,14.15,20
+0.22927,0,6.91,0,0.448,6.03,85.5,5.6894,3,233,17.9,392.74,18.8,16.6
+0.25387,0,6.91,0,0.448,5.399,95.3,5.87,3,233,17.9,396.9,30.81,14.4
+0.21977,0,6.91,0,0.448,5.602,62,6.0877,3,233,17.9,396.9,16.2,19.4
+0.08873,21,5.64,0,0.439,5.963,45.7,6.8147,4,243,16.8,395.56,13.45,19.7
+0.04337,21,5.64,0,0.439,6.115,63,6.8147,4,243,16.8,393.97,9.43,20.5
+0.0536,21,5.64,0,0.439,6.511,21.1,6.8147,4,243,16.8,396.9,5.28,25
+0.04981,21,5.64,0,0.439,5.998,21.4,6.8147,4,243,16.8,396.9,8.43,23.4
+0.0136,75,4,0,0.41,5.888,47.6,7.3197,3,469,21.1,396.9,14.8,18.9
+0.01311,90,1.22,0,0.403,7.249,21.9,8.6966,5,226,17.9,395.93,4.81,35.4
+0.02055,85,0.74,0,0.41,6.383,35.7,9.1876,2,313,17.3,396.9,5.77,24.7
+0.01432,100,1.32,0,0.411,6.816,40.5,8.3248,5,256,15.1,392.9,3.95,31.6
+0.15445,25,5.13,0,0.453,6.145,29.2,7.8148,8,284,19.7,390.68,6.86,23.3
+0.10328,25,5.13,0,0.453,5.927,47.2,6.932,8,284,19.7,396.9,9.22,19.6
+0.14932,25,5.13,0,0.453,5.741,66.2,7.2254,8,284,19.7,395.11,13.15,18.7
+0.17171,25,5.13,0,0.453,5.966,93.4,6.8185,8,284,19.7,378.08,14.44,16
+0.11027,25,5.13,0,0.453,6.456,67.8,7.2255,8,284,19.7,396.9,6.73,22.2
+0.1265,25,5.13,0,0.453,6.762,43.4,7.9809,8,284,19.7,395.58,9.5,25
+0.01951,17.5,1.38,0,0.4161,7.104,59.5,9.2229,3,216,18.6,393.24,8.05,33
+0.03584,80,3.37,0,0.398,6.29,17.8,6.6115,4,337,16.1,396.9,4.67,23.5
+0.04379,80,3.37,0,0.398,5.787,31.1,6.6115,4,337,16.1,396.9,10.24,19.4
+0.05789,12.5,6.07,0,0.409,5.878,21.4,6.498,4,345,18.9,396.21,8.1,22
+0.13554,12.5,6.07,0,0.409,5.594,36.8,6.498,4,345,18.9,396.9,13.09,17.4
+0.12816,12.5,6.07,0,0.409,5.885,33,6.498,4,345,18.9,396.9,8.79,20.9
+0.08826,0,10.81,0,0.413,6.417,6.6,5.2873,4,305,19.2,383.73,6.72,24.2
+0.15876,0,10.81,0,0.413,5.961,17.5,5.2873,4,305,19.2,376.94,9.88,21.7
+0.09164,0,10.81,0,0.413,6.065,7.8,5.2873,4,305,19.2,390.91,5.52,22.8
+0.19539,0,10.81,0,0.413,6.245,6.2,5.2873,4,305,19.2,377.17,7.54,23.4
+0.07896,0,12.83,0,0.437,6.273,6,4.2515,5,398,18.7,394.92,6.78,24.1
+0.09512,0,12.83,0,0.437,6.286,45,4.5026,5,398,18.7,383.23,8.94,21.4
+0.10153,0,12.83,0,0.437,6.279,74.5,4.0522,5,398,18.7,373.66,11.97,20
+0.08707,0,12.83,0,0.437,6.14,45.8,4.0905,5,398,18.7,386.96,10.27,20.8
+0.05646,0,12.83,0,0.437,6.232,53.7,5.0141,5,398,18.7,386.4,12.34,21.2
+0.08387,0,12.83,0,0.437,5.874,36.6,4.5026,5,398,18.7,396.06,9.1,20.3
+0.04113,25,4.86,0,0.426,6.727,33.5,5.4007,4,281,19,396.9,5.29,28
+0.04462,25,4.86,0,0.426,6.619,70.4,5.4007,4,281,19,395.63,7.22,23.9
+0.03659,25,4.86,0,0.426,6.302,32.2,5.4007,4,281,19,396.9,6.72,24.8
+0.03551,25,4.86,0,0.426,6.167,46.7,5.4007,4,281,19,390.64,7.51,22.9
+0.05059,0,4.49,0,0.449,6.389,48,4.7794,3,247,18.5,396.9,9.62,23.9
+0.05735,0,4.49,0,0.449,6.63,56.1,4.4377,3,247,18.5,392.3,6.53,26.6
+0.05188,0,4.49,0,0.449,6.015,45.1,4.4272,3,247,18.5,395.99,12.86,22.5
+0.07151,0,4.49,0,0.449,6.121,56.8,3.7476,3,247,18.5,395.15,8.44,22.2
+0.0566,0,3.41,0,0.489,7.007,86.3,3.4217,2,270,17.8,396.9,5.5,23.6
+0.05302,0,3.41,0,0.489,7.079,63.1,3.4145,2,270,17.8,396.06,5.7,28.7
+0.04684,0,3.41,0,0.489,6.417,66.1,3.0923,2,270,17.8,392.18,8.81,22.6
+0.03932,0,3.41,0,0.489,6.405,73.9,3.0921,2,270,17.8,393.55,8.2,22
+0.04203,28,15.04,0,0.464,6.442,53.6,3.6659,4,270,18.2,395.01,8.16,22.9
+0.02875,28,15.04,0,0.464,6.211,28.9,3.6659,4,270,18.2,396.33,6.21,25
+0.04294,28,15.04,0,0.464,6.249,77.3,3.615,4,270,18.2,396.9,10.59,20.6
+0.12204,0,2.89,0,0.445,6.625,57.8,3.4952,2,276,18,357.98,6.65,28.4
+0.11504,0,2.89,0,0.445,6.163,69.6,3.4952,2,276,18,391.83,11.34,21.4
+0.12083,0,2.89,0,0.445,8.069,76,3.4952,2,276,18,396.9,4.21,38.7
+0.08187,0,2.89,0,0.445,7.82,36.9,3.4952,2,276,18,393.53,3.57,43.8
+0.0686,0,2.89,0,0.445,7.416,62.5,3.4952,2,276,18,396.9,6.19,33.2
+0.14866,0,8.56,0,0.52,6.727,79.9,2.7778,5,384,20.9,394.76,9.42,27.5
+0.11432,0,8.56,0,0.52,6.781,71.3,2.8561,5,384,20.9,395.58,7.67,26.5
+0.22876,0,8.56,0,0.52,6.405,85.4,2.7147,5,384,20.9,70.8,10.63,18.6
+0.21161,0,8.56,0,0.52,6.137,87.4,2.7147,5,384,20.9,394.47,13.44,19.3
+0.1396,0,8.56,0,0.52,6.167,90,2.421,5,384,20.9,392.69,12.33,20.1
+0.13262,0,8.56,0,0.52,5.851,96.7,2.1069,5,384,20.9,394.05,16.47,19.5
+0.1712,0,8.56,0,0.52,5.836,91.9,2.211,5,384,20.9,395.67,18.66,19.5
+0.13117,0,8.56,0,0.52,6.127,85.2,2.1224,5,384,20.9,387.69,14.09,20.4
+0.12802,0,8.56,0,0.52,6.474,97.1,2.4329,5,384,20.9,395.24,12.27,19.8
+0.26363,0,8.56,0,0.52,6.229,91.2,2.5451,5,384,20.9,391.23,15.55,19.4
+0.10793,0,8.56,0,0.52,6.195,54.4,2.7778,5,384,20.9,393.49,13,21.7
+0.10084,0,10.01,0,0.547,6.715,81.6,2.6775,6,432,17.8,395.59,10.16,22.8
+0.12329,0,10.01,0,0.547,5.913,92.9,2.3534,6,432,17.8,394.95,16.21,18.8
+0.22212,0,10.01,0,0.547,6.092,95.4,2.548,6,432,17.8,396.9,17.09,18.7
+0.14231,0,10.01,0,0.547,6.254,84.2,2.2565,6,432,17.8,388.74,10.45,18.5
+0.17134,0,10.01,0,0.547,5.928,88.2,2.4631,6,432,17.8,344.91,15.76,18.3
+0.13158,0,10.01,0,0.547,6.176,72.5,2.7301,6,432,17.8,393.3,12.04,21.2
+0.15098,0,10.01,0,0.547,6.021,82.6,2.7474,6,432,17.8,394.51,10.3,19.2
+0.13058,0,10.01,0,0.547,5.872,73.1,2.4775,6,432,17.8,338.63,15.37,20.4
+0.14476,0,10.01,0,0.547,5.731,65.2,2.7592,6,432,17.8,391.5,13.61,19.3
+0.06899,0,25.65,0,0.581,5.87,69.7,2.2577,2,188,19.1,389.15,14.37,22
+0.07165,0,25.65,0,0.581,6.004,84.1,2.1974,2,188,19.1,377.67,14.27,20.3
+0.09299,0,25.65,0,0.581,5.961,92.9,2.0869,2,188,19.1,378.09,17.93,20.5
+0.15038,0,25.65,0,0.581,5.856,97,1.9444,2,188,19.1,370.31,25.41,17.3
+0.09849,0,25.65,0,0.581,5.879,95.8,2.0063,2,188,19.1,379.38,17.58,18.8
+0.16902,0,25.65,0,0.581,5.986,88.4,1.9929,2,188,19.1,385.02,14.81,21.4
+0.38735,0,25.65,0,0.581,5.613,95.6,1.7572,2,188,19.1,359.29,27.26,15.7
+0.25915,0,21.89,0,0.624,5.693,96,1.7883,4,437,21.2,392.11,17.19,16.2
+0.32543,0,21.89,0,0.624,6.431,98.8,1.8125,4,437,21.2,396.9,15.39,18
+0.88125,0,21.89,0,0.624,5.637,94.7,1.9799,4,437,21.2,396.9,18.34,14.3
+0.34006,0,21.89,0,0.624,6.458,98.9,2.1185,4,437,21.2,395.04,12.6,19.2
+1.19294,0,21.89,0,0.624,6.326,97.7,2.271,4,437,21.2,396.9,12.26,19.6
+0.59005,0,21.89,0,0.624,6.372,97.9,2.3274,4,437,21.2,385.76,11.12,23
+0.32982,0,21.89,0,0.624,5.822,95.4,2.4699,4,437,21.2,388.69,15.03,18.4
+0.97617,0,21.89,0,0.624,5.757,98.4,2.346,4,437,21.2,262.76,17.31,15.6
+0.55778,0,21.89,0,0.624,6.335,98.2,2.1107,4,437,21.2,394.67,16.96,18.1
+0.32264,0,21.89,0,0.624,5.942,93.5,1.9669,4,437,21.2,378.25,16.9,17.4
+0.35233,0,21.89,0,0.624,6.454,98.4,1.8498,4,437,21.2,394.08,14.59,17.1
+0.2498,0,21.89,0,0.624,5.857,98.2,1.6686,4,437,21.2,392.04,21.32,13.3
+0.54452,0,21.89,0,0.624,6.151,97.9,1.6687,4,437,21.2,396.9,18.46,17.8
+0.2909,0,21.89,0,0.624,6.174,93.6,1.6119,4,437,21.2,388.08,24.16,14
+1.62864,0,21.89,0,0.624,5.019,100,1.4394,4,437,21.2,396.9,34.41,14.4
+3.32105,0,19.58,1,0.871,5.403,100,1.3216,5,403,14.7,396.9,26.82,13.4
+4.0974,0,19.58,0,0.871,5.468,100,1.4118,5,403,14.7,396.9,26.42,15.6
+2.77974,0,19.58,0,0.871,4.903,97.8,1.3459,5,403,14.7,396.9,29.29,11.8
+2.37934,0,19.58,0,0.871,6.13,100,1.4191,5,403,14.7,172.91,27.8,13.8
+2.15505,0,19.58,0,0.871,5.628,100,1.5166,5,403,14.7,169.27,16.65,15.6
+2.36862,0,19.58,0,0.871,4.926,95.7,1.4608,5,403,14.7,391.71,29.53,14.6
+2.33099,0,19.58,0,0.871,5.186,93.8,1.5296,5,403,14.7,356.99,28.32,17.8
+2.73397,0,19.58,0,0.871,5.597,94.9,1.5257,5,403,14.7,351.85,21.45,15.4
+1.6566,0,19.58,0,0.871,6.122,97.3,1.618,5,403,14.7,372.8,14.1,21.5
+1.49632,0,19.58,0,0.871,5.404,100,1.5916,5,403,14.7,341.6,13.28,19.6
+1.12658,0,19.58,1,0.871,5.012,88,1.6102,5,403,14.7,343.28,12.12,15.3
+2.14918,0,19.58,0,0.871,5.709,98.5,1.6232,5,403,14.7,261.95,15.79,19.4
+1.41385,0,19.58,1,0.871,6.129,96,1.7494,5,403,14.7,321.02,15.12,17
+3.53501,0,19.58,1,0.871,6.152,82.6,1.7455,5,403,14.7,88.01,15.02,15.6
+2.44668,0,19.58,0,0.871,5.272,94,1.7364,5,403,14.7,88.63,16.14,13.1
+1.22358,0,19.58,0,0.605,6.943,97.4,1.8773,5,403,14.7,363.43,4.59,41.3
+1.34284,0,19.58,0,0.605,6.066,100,1.7573,5,403,14.7,353.89,6.43,24.3
+1.42502,0,19.58,0,0.871,6.51,100,1.7659,5,403,14.7,364.31,7.39,23.3
+1.27346,0,19.58,1,0.605,6.25,92.6,1.7984,5,403,14.7,338.92,5.5,27
+1.46336,0,19.58,0,0.605,7.489,90.8,1.9709,5,403,14.7,374.43,1.73,50
+1.83377,0,19.58,1,0.605,7.802,98.2,2.0407,5,403,14.7,389.61,1.92,50
+1.51902,0,19.58,1,0.605,8.375,93.9,2.162,5,403,14.7,388.45,3.32,50
+2.24236,0,19.58,0,0.605,5.854,91.8,2.422,5,403,14.7,395.11,11.64,22.7
+2.924,0,19.58,0,0.605,6.101,93,2.2834,5,403,14.7,240.16,9.81,25
+2.01019,0,19.58,0,0.605,7.929,96.2,2.0459,5,403,14.7,369.3,3.7,50
+1.80028,0,19.58,0,0.605,5.877,79.2,2.4259,5,403,14.7,227.61,12.14,23.8
+2.3004,0,19.58,0,0.605,6.319,96.1,2.1,5,403,14.7,297.09,11.1,23.8
+2.44953,0,19.58,0,0.605,6.402,95.2,2.2625,5,403,14.7,330.04,11.32,22.3
+1.20742,0,19.58,0,0.605,5.875,94.6,2.4259,5,403,14.7,292.29,14.43,17.4
+2.3139,0,19.58,0,0.605,5.88,97.3,2.3887,5,403,14.7,348.13,12.03,19.1
+0.13914,0,4.05,0,0.51,5.572,88.5,2.5961,5,296,16.6,396.9,14.69,23.1
+0.09178,0,4.05,0,0.51,6.416,84.1,2.6463,5,296,16.6,395.5,9.04,23.6
+0.08447,0,4.05,0,0.51,5.859,68.7,2.7019,5,296,16.6,393.23,9.64,22.6
+0.06664,0,4.05,0,0.51,6.546,33.1,3.1323,5,296,16.6,390.96,5.33,29.4
+0.07022,0,4.05,0,0.51,6.02,47.2,3.5549,5,296,16.6,393.23,10.11,23.2
+0.05425,0,4.05,0,0.51,6.315,73.4,3.3175,5,296,16.6,395.6,6.29,24.6
+0.06642,0,4.05,0,0.51,6.86,74.4,2.9153,5,296,16.6,391.27,6.92,29.9
+0.0578,0,2.46,0,0.488,6.98,58.4,2.829,3,193,17.8,396.9,5.04,37.2
+0.06588,0,2.46,0,0.488,7.765,83.3,2.741,3,193,17.8,395.56,7.56,39.8
+0.06888,0,2.46,0,0.488,6.144,62.2,2.5979,3,193,17.8,396.9,9.45,36.2
+0.09103,0,2.46,0,0.488,7.155,92.2,2.7006,3,193,17.8,394.12,4.82,37.9
+0.10008,0,2.46,0,0.488,6.563,95.6,2.847,3,193,17.8,396.9,5.68,32.5
+0.08308,0,2.46,0,0.488,5.604,89.8,2.9879,3,193,17.8,391,13.98,26.4
+0.06047,0,2.46,0,0.488,6.153,68.8,3.2797,3,193,17.8,387.11,13.15,29.6
+0.05602,0,2.46,0,0.488,7.831,53.6,3.1992,3,193,17.8,392.63,4.45,50
+0.07875,45,3.44,0,0.437,6.782,41.1,3.7886,5,398,15.2,393.87,6.68,32
+0.12579,45,3.44,0,0.437,6.556,29.1,4.5667,5,398,15.2,382.84,4.56,29.8
+0.0837,45,3.44,0,0.437,7.185,38.9,4.5667,5,398,15.2,396.9,5.39,34.9
+0.09068,45,3.44,0,0.437,6.951,21.5,6.4798,5,398,15.2,377.68,5.1,37
+0.06911,45,3.44,0,0.437,6.739,30.8,6.4798,5,398,15.2,389.71,4.69,30.5
+0.08664,45,3.44,0,0.437,7.178,26.3,6.4798,5,398,15.2,390.49,2.87,36.4
+0.02187,60,2.93,0,0.401,6.8,9.9,6.2196,1,265,15.6,393.37,5.03,31.1
+0.01439,60,2.93,0,0.401,6.604,18.8,6.2196,1,265,15.6,376.7,4.38,29.1
+0.01381,80,0.46,0,0.422,7.875,32,5.6484,4,255,14.4,394.23,2.97,50
+0.04011,80,1.52,0,0.404,7.287,34.1,7.309,2,329,12.6,396.9,4.08,33.3
+0.04666,80,1.52,0,0.404,7.107,36.6,7.309,2,329,12.6,354.31,8.61,30.3
+0.03768,80,1.52,0,0.404,7.274,38.3,7.309,2,329,12.6,392.2,6.62,34.6
+0.0315,95,1.47,0,0.403,6.975,15.3,7.6534,3,402,17,396.9,4.56,34.9
+0.01778,95,1.47,0,0.403,7.135,13.9,7.6534,3,402,17,384.3,4.45,32.9
+0.03445,82.5,2.03,0,0.415,6.162,38.4,6.27,2,348,14.7,393.77,7.43,24.1
+0.02177,82.5,2.03,0,0.415,7.61,15.7,6.27,2,348,14.7,395.38,3.11,42.3
+0.0351,95,2.68,0,0.4161,7.853,33.2,5.118,4,224,14.7,392.78,3.81,48.5
+0.02009,95,2.68,0,0.4161,8.034,31.9,5.118,4,224,14.7,390.55,2.88,50
+0.13642,0,10.59,0,0.489,5.891,22.3,3.9454,4,277,18.6,396.9,10.87,22.6
+0.22969,0,10.59,0,0.489,6.326,52.5,4.3549,4,277,18.6,394.87,10.97,24.4
+0.25199,0,10.59,0,0.489,5.783,72.7,4.3549,4,277,18.6,389.43,18.06,22.5
+0.13587,0,10.59,1,0.489,6.064,59.1,4.2392,4,277,18.6,381.32,14.66,24.4
+0.43571,0,10.59,1,0.489,5.344,100,3.875,4,277,18.6,396.9,23.09,20
+0.17446,0,10.59,1,0.489,5.96,92.1,3.8771,4,277,18.6,393.25,17.27,21.7
+0.37578,0,10.59,1,0.489,5.404,88.6,3.665,4,277,18.6,395.24,23.98,19.3
+0.21719,0,10.59,1,0.489,5.807,53.8,3.6526,4,277,18.6,390.94,16.03,22.4
+0.14052,0,10.59,0,0.489,6.375,32.3,3.9454,4,277,18.6,385.81,9.38,28.1
+0.28955,0,10.59,0,0.489,5.412,9.8,3.5875,4,277,18.6,348.93,29.55,23.7
+0.19802,0,10.59,0,0.489,6.182,42.4,3.9454,4,277,18.6,393.63,9.47,25
+0.0456,0,13.89,1,0.55,5.888,56,3.1121,5,276,16.4,392.8,13.51,23.3
+0.07013,0,13.89,0,0.55,6.642,85.1,3.4211,5,276,16.4,392.78,9.69,28.7
+0.11069,0,13.89,1,0.55,5.951,93.8,2.8893,5,276,16.4,396.9,17.92,21.5
+0.11425,0,13.89,1,0.55,6.373,92.4,3.3633,5,276,16.4,393.74,10.5,23
+0.35809,0,6.2,1,0.507,6.951,88.5,2.8617,8,307,17.4,391.7,9.71,26.7
+0.40771,0,6.2,1,0.507,6.164,91.3,3.048,8,307,17.4,395.24,21.46,21.7
+0.62356,0,6.2,1,0.507,6.879,77.7,3.2721,8,307,17.4,390.39,9.93,27.5
+0.6147,0,6.2,0,0.507,6.618,80.8,3.2721,8,307,17.4,396.9,7.6,30.1
+0.31533,0,6.2,0,0.504,8.266,78.3,2.8944,8,307,17.4,385.05,4.14,44.8
+0.52693,0,6.2,0,0.504,8.725,83,2.8944,8,307,17.4,382,4.63,50
+0.38214,0,6.2,0,0.504,8.04,86.5,3.2157,8,307,17.4,387.38,3.13,37.6
+0.41238,0,6.2,0,0.504,7.163,79.9,3.2157,8,307,17.4,372.08,6.36,31.6
+0.29819,0,6.2,0,0.504,7.686,17,3.3751,8,307,17.4,377.51,3.92,46.7
+0.44178,0,6.2,0,0.504,6.552,21.4,3.3751,8,307,17.4,380.34,3.76,31.5
+0.537,0,6.2,0,0.504,5.981,68.1,3.6715,8,307,17.4,378.35,11.65,24.3
+0.46296,0,6.2,0,0.504,7.412,76.9,3.6715,8,307,17.4,376.14,5.25,31.7
+0.57529,0,6.2,0,0.507,8.337,73.3,3.8384,8,307,17.4,385.91,2.47,41.7
+0.33147,0,6.2,0,0.507,8.247,70.4,3.6519,8,307,17.4,378.95,3.95,48.3
+0.44791,0,6.2,1,0.507,6.726,66.5,3.6519,8,307,17.4,360.2,8.05,29
+0.33045,0,6.2,0,0.507,6.086,61.5,3.6519,8,307,17.4,376.75,10.88,24
+0.52058,0,6.2,1,0.507,6.631,76.5,4.148,8,307,17.4,388.45,9.54,25.1
+0.51183,0,6.2,0,0.507,7.358,71.6,4.148,8,307,17.4,390.07,4.73,31.5
+0.08244,30,4.93,0,0.428,6.481,18.5,6.1899,6,300,16.6,379.41,6.36,23.7
+0.09252,30,4.93,0,0.428,6.606,42.2,6.1899,6,300,16.6,383.78,7.37,23.3
+0.11329,30,4.93,0,0.428,6.897,54.3,6.3361,6,300,16.6,391.25,11.38,22
+0.10612,30,4.93,0,0.428,6.095,65.1,6.3361,6,300,16.6,394.62,12.4,20.1
+0.1029,30,4.93,0,0.428,6.358,52.9,7.0355,6,300,16.6,372.75,11.22,22.2
+0.12757,30,4.93,0,0.428,6.393,7.8,7.0355,6,300,16.6,374.71,5.19,23.7
+0.20608,22,5.86,0,0.431,5.593,76.5,7.9549,7,330,19.1,372.49,12.5,17.6
+0.19133,22,5.86,0,0.431,5.605,70.2,7.9549,7,330,19.1,389.13,18.46,18.5
+0.33983,22,5.86,0,0.431,6.108,34.9,8.0555,7,330,19.1,390.18,9.16,24.3
+0.19657,22,5.86,0,0.431,6.226,79.2,8.0555,7,330,19.1,376.14,10.15,20.5
+0.16439,22,5.86,0,0.431,6.433,49.1,7.8265,7,330,19.1,374.71,9.52,24.5
+0.19073,22,5.86,0,0.431,6.718,17.5,7.8265,7,330,19.1,393.74,6.56,26.2
+0.1403,22,5.86,0,0.431,6.487,13,7.3967,7,330,19.1,396.28,5.9,24.4
+0.21409,22,5.86,0,0.431,6.438,8.9,7.3967,7,330,19.1,377.07,3.59,24.8
+0.08221,22,5.86,0,0.431,6.957,6.8,8.9067,7,330,19.1,386.09,3.53,29.6
+0.36894,22,5.86,0,0.431,8.259,8.4,8.9067,7,330,19.1,396.9,3.54,42.8
+0.04819,80,3.64,0,0.392,6.108,32,9.2203,1,315,16.4,392.89,6.57,21.9
+0.03548,80,3.64,0,0.392,5.876,19.1,9.2203,1,315,16.4,395.18,9.25,20.9
+0.01538,90,3.75,0,0.394,7.454,34.2,6.3361,3,244,15.9,386.34,3.11,44
+0.61154,20,3.97,0,0.647,8.704,86.9,1.801,5,264,13,389.7,5.12,50
+0.66351,20,3.97,0,0.647,7.333,100,1.8946,5,264,13,383.29,7.79,36
+0.65665,20,3.97,0,0.647,6.842,100,2.0107,5,264,13,391.93,6.9,30.1
+0.54011,20,3.97,0,0.647,7.203,81.8,2.1121,5,264,13,392.8,9.59,33.8
+0.53412,20,3.97,0,0.647,7.52,89.4,2.1398,5,264,13,388.37,7.26,43.1
+0.52014,20,3.97,0,0.647,8.398,91.5,2.2885,5,264,13,386.86,5.91,48.8
+0.82526,20,3.97,0,0.647,7.327,94.5,2.0788,5,264,13,393.42,11.25,31
+0.55007,20,3.97,0,0.647,7.206,91.6,1.9301,5,264,13,387.89,8.1,36.5
+0.76162,20,3.97,0,0.647,5.56,62.8,1.9865,5,264,13,392.4,10.45,22.8
+0.7857,20,3.97,0,0.647,7.014,84.6,2.1329,5,264,13,384.07,14.79,30.7
+0.57834,20,3.97,0,0.575,8.297,67,2.4216,5,264,13,384.54,7.44,50
+0.5405,20,3.97,0,0.575,7.47,52.6,2.872,5,264,13,390.3,3.16,43.5
+0.09065,20,6.96,1,0.464,5.92,61.5,3.9175,3,223,18.6,391.34,13.65,20.7
+0.29916,20,6.96,0,0.464,5.856,42.1,4.429,3,223,18.6,388.65,13,21.1
+0.16211,20,6.96,0,0.464,6.24,16.3,4.429,3,223,18.6,396.9,6.59,25.2
+0.1146,20,6.96,0,0.464,6.538,58.7,3.9175,3,223,18.6,394.96,7.73,24.4
+0.22188,20,6.96,1,0.464,7.691,51.8,4.3665,3,223,18.6,390.77,6.58,35.2
+0.05644,40,6.41,1,0.447,6.758,32.9,4.0776,4,254,17.6,396.9,3.53,32.4
+0.09604,40,6.41,0,0.447,6.854,42.8,4.2673,4,254,17.6,396.9,2.98,32
+0.10469,40,6.41,1,0.447,7.267,49,4.7872,4,254,17.6,389.25,6.05,33.2
+0.06127,40,6.41,1,0.447,6.826,27.6,4.8628,4,254,17.6,393.45,4.16,33.1
+0.07978,40,6.41,0,0.447,6.482,32.1,4.1403,4,254,17.6,396.9,7.19,29.1
+0.21038,20,3.33,0,0.4429,6.812,32.2,4.1007,5,216,14.9,396.9,4.85,35.1
+0.03578,20,3.33,0,0.4429,7.82,64.5,4.6947,5,216,14.9,387.31,3.76,45.4
+0.03705,20,3.33,0,0.4429,6.968,37.2,5.2447,5,216,14.9,392.23,4.59,35.4
+0.06129,20,3.33,1,0.4429,7.645,49.7,5.2119,5,216,14.9,377.07,3.01,46
+0.01501,90,1.21,1,0.401,7.923,24.8,5.885,1,198,13.6,395.52,3.16,50
+0.00906,90,2.97,0,0.4,7.088,20.8,7.3073,1,285,15.3,394.72,7.85,32.2
+0.01096,55,2.25,0,0.389,6.453,31.9,7.3073,1,300,15.3,394.72,8.23,22
+0.01965,80,1.76,0,0.385,6.23,31.5,9.0892,1,241,18.2,341.6,12.93,20.1
+0.03871,52.5,5.32,0,0.405,6.209,31.3,7.3172,6,293,16.6,396.9,7.14,23.2
+0.0459,52.5,5.32,0,0.405,6.315,45.6,7.3172,6,293,16.6,396.9,7.6,22.3
+0.04297,52.5,5.32,0,0.405,6.565,22.9,7.3172,6,293,16.6,371.72,9.51,24.8
+0.03502,80,4.95,0,0.411,6.861,27.9,5.1167,4,245,19.2,396.9,3.33,28.5
+0.07886,80,4.95,0,0.411,7.148,27.7,5.1167,4,245,19.2,396.9,3.56,37.3
+0.03615,80,4.95,0,0.411,6.63,23.4,5.1167,4,245,19.2,396.9,4.7,27.9
+0.08265,0,13.92,0,0.437,6.127,18.4,5.5027,4,289,16,396.9,8.58,23.9
+0.08199,0,13.92,0,0.437,6.009,42.3,5.5027,4,289,16,396.9,10.4,21.7
+0.12932,0,13.92,0,0.437,6.678,31.1,5.9604,4,289,16,396.9,6.27,28.6
+0.05372,0,13.92,0,0.437,6.549,51,5.9604,4,289,16,392.85,7.39,27.1
+0.14103,0,13.92,0,0.437,5.79,58,6.32,4,289,16,396.9,15.84,20.3
+0.06466,70,2.24,0,0.4,6.345,20.1,7.8278,5,358,14.8,368.24,4.97,22.5
+0.05561,70,2.24,0,0.4,7.041,10,7.8278,5,358,14.8,371.58,4.74,29
+0.04417,70,2.24,0,0.4,6.871,47.4,7.8278,5,358,14.8,390.86,6.07,24.8
+0.03537,34,6.09,0,0.433,6.59,40.4,5.4917,7,329,16.1,395.75,9.5,22
+0.09266,34,6.09,0,0.433,6.495,18.4,5.4917,7,329,16.1,383.61,8.67,26.4
+0.1,34,6.09,0,0.433,6.982,17.7,5.4917,7,329,16.1,390.43,4.86,33.1
+0.05515,33,2.18,0,0.472,7.236,41.1,4.022,7,222,18.4,393.68,6.93,36.1
+0.05479,33,2.18,0,0.472,6.616,58.1,3.37,7,222,18.4,393.36,8.93,28.4
+0.07503,33,2.18,0,0.472,7.42,71.9,3.0992,7,222,18.4,396.9,6.47,33.4
+0.04932,33,2.18,0,0.472,6.849,70.3,3.1827,7,222,18.4,396.9,7.53,28.2
+0.49298,0,9.9,0,0.544,6.635,82.5,3.3175,4,304,18.4,396.9,4.54,22.8
+0.3494,0,9.9,0,0.544,5.972,76.7,3.1025,4,304,18.4,396.24,9.97,20.3
+2.63548,0,9.9,0,0.544,4.973,37.8,2.5194,4,304,18.4,350.45,12.64,16.1
+0.79041,0,9.9,0,0.544,6.122,52.8,2.6403,4,304,18.4,396.9,5.98,22.1
+0.26169,0,9.9,0,0.544,6.023,90.4,2.834,4,304,18.4,396.3,11.72,19.4
+0.26938,0,9.9,0,0.544,6.266,82.8,3.2628,4,304,18.4,393.39,7.9,21.6
+0.3692,0,9.9,0,0.544,6.567,87.3,3.6023,4,304,18.4,395.69,9.28,23.8
+0.25356,0,9.9,0,0.544,5.705,77.7,3.945,4,304,18.4,396.42,11.5,16.2
+0.31827,0,9.9,0,0.544,5.914,83.2,3.9986,4,304,18.4,390.7,18.33,17.8
+0.24522,0,9.9,0,0.544,5.782,71.7,4.0317,4,304,18.4,396.9,15.94,19.8
+0.40202,0,9.9,0,0.544,6.382,67.2,3.5325,4,304,18.4,395.21,10.36,23.1
+0.47547,0,9.9,0,0.544,6.113,58.8,4.0019,4,304,18.4,396.23,12.73,21
+0.1676,0,7.38,0,0.493,6.426,52.3,4.5404,5,287,19.6,396.9,7.2,23.8
+0.18159,0,7.38,0,0.493,6.376,54.3,4.5404,5,287,19.6,396.9,6.87,23.1
+0.35114,0,7.38,0,0.493,6.041,49.9,4.7211,5,287,19.6,396.9,7.7,20.4
+0.28392,0,7.38,0,0.493,5.708,74.3,4.7211,5,287,19.6,391.13,11.74,18.5
+0.34109,0,7.38,0,0.493,6.415,40.1,4.7211,5,287,19.6,396.9,6.12,25
+0.19186,0,7.38,0,0.493,6.431,14.7,5.4159,5,287,19.6,393.68,5.08,24.6
+0.30347,0,7.38,0,0.493,6.312,28.9,5.4159,5,287,19.6,396.9,6.15,23
+0.24103,0,7.38,0,0.493,6.083,43.7,5.4159,5,287,19.6,396.9,12.79,22.2
+0.06617,0,3.24,0,0.46,5.868,25.8,5.2146,4,430,16.9,382.44,9.97,19.3
+0.06724,0,3.24,0,0.46,6.333,17.2,5.2146,4,430,16.9,375.21,7.34,22.6
+0.04544,0,3.24,0,0.46,6.144,32.2,5.8736,4,430,16.9,368.57,9.09,19.8
+0.05023,35,6.06,0,0.4379,5.706,28.4,6.6407,1,304,16.9,394.02,12.43,17.1
+0.03466,35,6.06,0,0.4379,6.031,23.3,6.6407,1,304,16.9,362.25,7.83,19.4
+0.05083,0,5.19,0,0.515,6.316,38.1,6.4584,5,224,20.2,389.71,5.68,22.2
+0.03738,0,5.19,0,0.515,6.31,38.5,6.4584,5,224,20.2,389.4,6.75,20.7
+0.03961,0,5.19,0,0.515,6.037,34.5,5.9853,5,224,20.2,396.9,8.01,21.1
+0.03427,0,5.19,0,0.515,5.869,46.3,5.2311,5,224,20.2,396.9,9.8,19.5
+0.03041,0,5.19,0,0.515,5.895,59.6,5.615,5,224,20.2,394.81,10.56,18.5
+0.03306,0,5.19,0,0.515,6.059,37.3,4.8122,5,224,20.2,396.14,8.51,20.6
+0.05497,0,5.19,0,0.515,5.985,45.4,4.8122,5,224,20.2,396.9,9.74,19
+0.06151,0,5.19,0,0.515,5.968,58.5,4.8122,5,224,20.2,396.9,9.29,18.7
+0.01301,35,1.52,0,0.442,7.241,49.3,7.0379,1,284,15.5,394.74,5.49,32.7
+0.02498,0,1.89,0,0.518,6.54,59.7,6.2669,1,422,15.9,389.96,8.65,16.5
+0.02543,55,3.78,0,0.484,6.696,56.4,5.7321,5,370,17.6,396.9,7.18,23.9
+0.03049,55,3.78,0,0.484,6.874,28.1,6.4654,5,370,17.6,387.97,4.61,31.2
+0.03113,0,4.39,0,0.442,6.014,48.5,8.0136,3,352,18.8,385.64,10.53,17.5
+0.06162,0,4.39,0,0.442,5.898,52.3,8.0136,3,352,18.8,364.61,12.67,17.2
+0.0187,85,4.15,0,0.429,6.516,27.7,8.5353,4,351,17.9,392.43,6.36,23.1
+0.01501,80,2.01,0,0.435,6.635,29.7,8.344,4,280,17,390.94,5.99,24.5
+0.02899,40,1.25,0,0.429,6.939,34.5,8.7921,1,335,19.7,389.85,5.89,26.6
+0.06211,40,1.25,0,0.429,6.49,44.4,8.7921,1,335,19.7,396.9,5.98,22.9
+0.0795,60,1.69,0,0.411,6.579,35.9,10.7103,4,411,18.3,370.78,5.49,24.1
+0.07244,60,1.69,0,0.411,5.884,18.5,10.7103,4,411,18.3,392.33,7.79,18.6
+0.01709,90,2.02,0,0.41,6.728,36.1,12.1265,5,187,17,384.46,4.5,30.1
+0.04301,80,1.91,0,0.413,5.663,21.9,10.5857,4,334,22,382.8,8.05,18.2
+0.10659,80,1.91,0,0.413,5.936,19.5,10.5857,4,334,22,376.04,5.57,20.6
+8.98296,0,18.1,1,0.77,6.212,97.4,2.1222,24,666,20.2,377.73,17.6,17.8
+3.8497,0,18.1,1,0.77,6.395,91,2.5052,24,666,20.2,391.34,13.27,21.7
+5.20177,0,18.1,1,0.77,6.127,83.4,2.7227,24,666,20.2,395.43,11.48,22.7
+4.26131,0,18.1,0,0.77,6.112,81.3,2.5091,24,666,20.2,390.74,12.67,22.6
+4.54192,0,18.1,0,0.77,6.398,88,2.5182,24,666,20.2,374.56,7.79,25
+3.83684,0,18.1,0,0.77,6.251,91.1,2.2955,24,666,20.2,350.65,14.19,19.9
+3.67822,0,18.1,0,0.77,5.362,96.2,2.1036,24,666,20.2,380.79,10.19,20.8
+4.22239,0,18.1,1,0.77,5.803,89,1.9047,24,666,20.2,353.04,14.64,16.8
+3.47428,0,18.1,1,0.718,8.78,82.9,1.9047,24,666,20.2,354.55,5.29,21.9
+4.55587,0,18.1,0,0.718,3.561,87.9,1.6132,24,666,20.2,354.7,7.12,27.5
+3.69695,0,18.1,0,0.718,4.963,91.4,1.7523,24,666,20.2,316.03,14,21.9
+13.5222,0,18.1,0,0.631,3.863,100,1.5106,24,666,20.2,131.42,13.33,23.1
+4.89822,0,18.1,0,0.631,4.97,100,1.3325,24,666,20.2,375.52,3.26,50
+5.66998,0,18.1,1,0.631,6.683,96.8,1.3567,24,666,20.2,375.33,3.73,50
+6.53876,0,18.1,1,0.631,7.016,97.5,1.2024,24,666,20.2,392.05,2.96,50
+9.2323,0,18.1,0,0.631,6.216,100,1.1691,24,666,20.2,366.15,9.53,50
+8.26725,0,18.1,1,0.668,5.875,89.6,1.1296,24,666,20.2,347.88,8.88,50
+11.1081,0,18.1,0,0.668,4.906,100,1.1742,24,666,20.2,396.9,34.77,13.8
+18.4982,0,18.1,0,0.668,4.138,100,1.137,24,666,20.2,396.9,37.97,13.8
+19.6091,0,18.1,0,0.671,7.313,97.9,1.3163,24,666,20.2,396.9,13.44,15
+15.288,0,18.1,0,0.671,6.649,93.3,1.3449,24,666,20.2,363.02,23.24,13.9
+9.82349,0,18.1,0,0.671,6.794,98.8,1.358,24,666,20.2,396.9,21.24,13.3
+23.6482,0,18.1,0,0.671,6.38,96.2,1.3861,24,666,20.2,396.9,23.69,13.1
+17.8667,0,18.1,0,0.671,6.223,100,1.3861,24,666,20.2,393.74,21.78,10.2
+88.9762,0,18.1,0,0.671,6.968,91.9,1.4165,24,666,20.2,396.9,17.21,10.4
+15.8744,0,18.1,0,0.671,6.545,99.1,1.5192,24,666,20.2,396.9,21.08,10.9
+9.18702,0,18.1,0,0.7,5.536,100,1.5804,24,666,20.2,396.9,23.6,11.3
+7.99248,0,18.1,0,0.7,5.52,100,1.5331,24,666,20.2,396.9,24.56,12.3
+20.0849,0,18.1,0,0.7,4.368,91.2,1.4395,24,666,20.2,285.83,30.63,8.8
+16.8118,0,18.1,0,0.7,5.277,98.1,1.4261,24,666,20.2,396.9,30.81,7.2
+24.3938,0,18.1,0,0.7,4.652,100,1.4672,24,666,20.2,396.9,28.28,10.5
+22.5971,0,18.1,0,0.7,5,89.5,1.5184,24,666,20.2,396.9,31.99,7.4
+14.3337,0,18.1,0,0.7,4.88,100,1.5895,24,666,20.2,372.92,30.62,10.2
+8.15174,0,18.1,0,0.7,5.39,98.9,1.7281,24,666,20.2,396.9,20.85,11.5
+6.96215,0,18.1,0,0.7,5.713,97,1.9265,24,666,20.2,394.43,17.11,15.1
+5.29305,0,18.1,0,0.7,6.051,82.5,2.1678,24,666,20.2,378.38,18.76,23.2
+11.5779,0,18.1,0,0.7,5.036,97,1.77,24,666,20.2,396.9,25.68,9.7
+8.64476,0,18.1,0,0.693,6.193,92.6,1.7912,24,666,20.2,396.9,15.17,13.8
+13.3598,0,18.1,0,0.693,5.887,94.7,1.7821,24,666,20.2,396.9,16.35,12.7
+8.71675,0,18.1,0,0.693,6.471,98.8,1.7257,24,666,20.2,391.98,17.12,13.1
+5.87205,0,18.1,0,0.693,6.405,96,1.6768,24,666,20.2,396.9,19.37,12.5
+7.67202,0,18.1,0,0.693,5.747,98.9,1.6334,24,666,20.2,393.1,19.92,8.5
+38.3518,0,18.1,0,0.693,5.453,100,1.4896,24,666,20.2,396.9,30.59,5
+9.91655,0,18.1,0,0.693,5.852,77.8,1.5004,24,666,20.2,338.16,29.97,6.3
+25.0461,0,18.1,0,0.693,5.987,100,1.5888,24,666,20.2,396.9,26.77,5.6
+14.2362,0,18.1,0,0.693,6.343,100,1.5741,24,666,20.2,396.9,20.32,7.2
+9.59571,0,18.1,0,0.693,6.404,100,1.639,24,666,20.2,376.11,20.31,12.1
+24.8017,0,18.1,0,0.693,5.349,96,1.7028,24,666,20.2,396.9,19.77,8.3
+41.5292,0,18.1,0,0.693,5.531,85.4,1.6074,24,666,20.2,329.46,27.38,8.5
+67.9208,0,18.1,0,0.693,5.683,100,1.4254,24,666,20.2,384.97,22.98,5
+20.7162,0,18.1,0,0.659,4.138,100,1.1781,24,666,20.2,370.22,23.34,11.9
+11.9511,0,18.1,0,0.659,5.608,100,1.2852,24,666,20.2,332.09,12.13,27.9
+7.40389,0,18.1,0,0.597,5.617,97.9,1.4547,24,666,20.2,314.64,26.4,17.2
+14.4383,0,18.1,0,0.597,6.852,100,1.4655,24,666,20.2,179.36,19.78,27.5
+51.1358,0,18.1,0,0.597,5.757,100,1.413,24,666,20.2,2.6,10.11,15
+14.0507,0,18.1,0,0.597,6.657,100,1.5275,24,666,20.2,35.05,21.22,17.2
+18.811,0,18.1,0,0.597,4.628,100,1.5539,24,666,20.2,28.79,34.37,17.9
+28.6558,0,18.1,0,0.597,5.155,100,1.5894,24,666,20.2,210.97,20.08,16.3
+45.7461,0,18.1,0,0.693,4.519,100,1.6582,24,666,20.2,88.27,36.98,7
+18.0846,0,18.1,0,0.679,6.434,100,1.8347,24,666,20.2,27.25,29.05,7.2
+10.8342,0,18.1,0,0.679,6.782,90.8,1.8195,24,666,20.2,21.57,25.79,7.5
+25.9406,0,18.1,0,0.679,5.304,89.1,1.6475,24,666,20.2,127.36,26.64,10.4
+73.5341,0,18.1,0,0.679,5.957,100,1.8026,24,666,20.2,16.45,20.62,8.8
+11.8123,0,18.1,0,0.718,6.824,76.5,1.794,24,666,20.2,48.45,22.74,8.4
+11.0874,0,18.1,0,0.718,6.411,100,1.8589,24,666,20.2,318.75,15.02,16.7
+7.02259,0,18.1,0,0.718,6.006,95.3,1.8746,24,666,20.2,319.98,15.7,14.2
+12.0482,0,18.1,0,0.614,5.648,87.6,1.9512,24,666,20.2,291.55,14.1,20.8
+7.05042,0,18.1,0,0.614,6.103,85.1,2.0218,24,666,20.2,2.52,23.29,13.4
+8.79212,0,18.1,0,0.584,5.565,70.6,2.0635,24,666,20.2,3.65,17.16,11.7
+15.8603,0,18.1,0,0.679,5.896,95.4,1.9096,24,666,20.2,7.68,24.39,8.3
+12.2472,0,18.1,0,0.584,5.837,59.7,1.9976,24,666,20.2,24.65,15.69,10.2
+37.6619,0,18.1,0,0.679,6.202,78.7,1.8629,24,666,20.2,18.82,14.52,10.9
+7.36711,0,18.1,0,0.679,6.193,78.1,1.9356,24,666,20.2,96.73,21.52,11
+9.33889,0,18.1,0,0.679,6.38,95.6,1.9682,24,666,20.2,60.72,24.08,9.5
+8.49213,0,18.1,0,0.584,6.348,86.1,2.0527,24,666,20.2,83.45,17.64,14.5
+10.0623,0,18.1,0,0.584,6.833,94.3,2.0882,24,666,20.2,81.33,19.69,14.1
+6.44405,0,18.1,0,0.584,6.425,74.8,2.2004,24,666,20.2,97.95,12.03,16.1
+5.58107,0,18.1,0,0.713,6.436,87.9,2.3158,24,666,20.2,100.19,16.22,14.3
+13.9134,0,18.1,0,0.713,6.208,95,2.2222,24,666,20.2,100.63,15.17,11.7
+11.1604,0,18.1,0,0.74,6.629,94.6,2.1247,24,666,20.2,109.85,23.27,13.4
+14.4208,0,18.1,0,0.74,6.461,93.3,2.0026,24,666,20.2,27.49,18.05,9.6
+15.1772,0,18.1,0,0.74,6.152,100,1.9142,24,666,20.2,9.32,26.45,8.7
+13.6781,0,18.1,0,0.74,5.935,87.9,1.8206,24,666,20.2,68.95,34.02,8.4
+9.39063,0,18.1,0,0.74,5.627,93.9,1.8172,24,666,20.2,396.9,22.88,12.8
+22.0511,0,18.1,0,0.74,5.818,92.4,1.8662,24,666,20.2,391.45,22.11,10.5
+9.72418,0,18.1,0,0.74,6.406,97.2,2.0651,24,666,20.2,385.96,19.52,17.1
+5.66637,0,18.1,0,0.74,6.219,100,2.0048,24,666,20.2,395.69,16.59,18.4
+9.96654,0,18.1,0,0.74,6.485,100,1.9784,24,666,20.2,386.73,18.85,15.4
+12.8023,0,18.1,0,0.74,5.854,96.6,1.8956,24,666,20.2,240.52,23.79,10.8
+10.6718,0,18.1,0,0.74,6.459,94.8,1.9879,24,666,20.2,43.06,23.98,11.8
+6.28807,0,18.1,0,0.74,6.341,96.4,2.072,24,666,20.2,318.01,17.79,14.9
+9.92485,0,18.1,0,0.74,6.251,96.6,2.198,24,666,20.2,388.52,16.44,12.6
+9.32909,0,18.1,0,0.713,6.185,98.7,2.2616,24,666,20.2,396.9,18.13,14.1
+7.52601,0,18.1,0,0.713,6.417,98.3,2.185,24,666,20.2,304.21,19.31,13
+6.71772,0,18.1,0,0.713,6.749,92.6,2.3236,24,666,20.2,0.32,17.44,13.4
+5.44114,0,18.1,0,0.713,6.655,98.2,2.3552,24,666,20.2,355.29,17.73,15.2
+5.09017,0,18.1,0,0.713,6.297,91.8,2.3682,24,666,20.2,385.09,17.27,16.1
+8.24809,0,18.1,0,0.713,7.393,99.3,2.4527,24,666,20.2,375.87,16.74,17.8
+9.51363,0,18.1,0,0.713,6.728,94.1,2.4961,24,666,20.2,6.68,18.71,14.9
+4.75237,0,18.1,0,0.713,6.525,86.5,2.4358,24,666,20.2,50.92,18.13,14.1
+4.66883,0,18.1,0,0.713,5.976,87.9,2.5806,24,666,20.2,10.48,19.01,12.7
+8.20058,0,18.1,0,0.713,5.936,80.3,2.7792,24,666,20.2,3.5,16.94,13.5
+7.75223,0,18.1,0,0.713,6.301,83.7,2.7831,24,666,20.2,272.21,16.23,14.9
+6.80117,0,18.1,0,0.713,6.081,84.4,2.7175,24,666,20.2,396.9,14.7,20
+4.81213,0,18.1,0,0.713,6.701,90,2.5975,24,666,20.2,255.23,16.42,16.4
+3.69311,0,18.1,0,0.713,6.376,88.4,2.5671,24,666,20.2,391.43,14.65,17.7
+6.65492,0,18.1,0,0.713,6.317,83,2.7344,24,666,20.2,396.9,13.99,19.5
+5.82115,0,18.1,0,0.713,6.513,89.9,2.8016,24,666,20.2,393.82,10.29,20.2
+7.83932,0,18.1,0,0.655,6.209,65.4,2.9634,24,666,20.2,396.9,13.22,21.4
+3.1636,0,18.1,0,0.655,5.759,48.2,3.0665,24,666,20.2,334.4,14.13,19.9
+3.77498,0,18.1,0,0.655,5.952,84.7,2.8715,24,666,20.2,22.01,17.15,19
+4.42228,0,18.1,0,0.584,6.003,94.5,2.5403,24,666,20.2,331.29,21.32,19.1
+15.5757,0,18.1,0,0.58,5.926,71,2.9084,24,666,20.2,368.74,18.13,19.1
+13.0751,0,18.1,0,0.58,5.713,56.7,2.8237,24,666,20.2,396.9,14.76,20.1
+4.34879,0,18.1,0,0.58,6.167,84,3.0334,24,666,20.2,396.9,16.29,19.9
+4.03841,0,18.1,0,0.532,6.229,90.7,3.0993,24,666,20.2,395.33,12.87,19.6
475
+ 3.56868,0,18.1,0,0.58,6.437,75,2.8965,24,666,20.2,393.37,14.36,23.2
476
+ 4.64689,0,18.1,0,0.614,6.98,67.6,2.5329,24,666,20.2,374.68,11.66,29.8
477
+ 8.05579,0,18.1,0,0.584,5.427,95.4,2.4298,24,666,20.2,352.58,18.14,13.8
478
+ 6.39312,0,18.1,0,0.584,6.162,97.4,2.206,24,666,20.2,302.76,24.1,13.3
479
+ 4.87141,0,18.1,0,0.614,6.484,93.6,2.3053,24,666,20.2,396.21,18.68,16.7
480
+ 15.0234,0,18.1,0,0.614,5.304,97.3,2.1007,24,666,20.2,349.48,24.91,12
481
+ 10.233,0,18.1,0,0.614,6.185,96.7,2.1705,24,666,20.2,379.7,18.03,14.6
482
+ 14.3337,0,18.1,0,0.614,6.229,88,1.9512,24,666,20.2,383.32,13.11,21.4
483
+ 5.82401,0,18.1,0,0.532,6.242,64.7,3.4242,24,666,20.2,396.9,10.74,23
484
+ 5.70818,0,18.1,0,0.532,6.75,74.9,3.3317,24,666,20.2,393.07,7.74,23.7
485
+ 5.73116,0,18.1,0,0.532,7.061,77,3.4106,24,666,20.2,395.28,7.01,25
486
+ 2.81838,0,18.1,0,0.532,5.762,40.3,4.0983,24,666,20.2,392.92,10.42,21.8
487
+ 2.37857,0,18.1,0,0.583,5.871,41.9,3.724,24,666,20.2,370.73,13.34,20.6
488
+ 3.67367,0,18.1,0,0.583,6.312,51.9,3.9917,24,666,20.2,388.62,10.58,21.2
489
+ 5.69175,0,18.1,0,0.583,6.114,79.8,3.5459,24,666,20.2,392.68,14.98,19.1
490
+ 4.83567,0,18.1,0,0.583,5.905,53.2,3.1523,24,666,20.2,388.22,11.45,20.6
491
+ 0.15086,0,27.74,0,0.609,5.454,92.7,1.8209,4,711,20.1,395.09,18.06,15.2
492
+ 0.18337,0,27.74,0,0.609,5.414,98.3,1.7554,4,711,20.1,344.05,23.97,7
493
+ 0.20746,0,27.74,0,0.609,5.093,98,1.8226,4,711,20.1,318.43,29.68,8.1
494
+ 0.10574,0,27.74,0,0.609,5.983,98.8,1.8681,4,711,20.1,390.11,18.07,13.6
495
+ 0.11132,0,27.74,0,0.609,5.983,83.5,2.1099,4,711,20.1,396.9,13.35,20.1
496
+ 0.17331,0,9.69,0,0.585,5.707,54,2.3817,6,391,19.2,396.9,12.01,21.8
497
+ 0.27957,0,9.69,0,0.585,5.926,42.6,2.3817,6,391,19.2,396.9,13.59,24.5
498
+ 0.17899,0,9.69,0,0.585,5.67,28.8,2.7986,6,391,19.2,393.29,17.6,23.1
499
+ 0.2896,0,9.69,0,0.585,5.39,72.9,2.7986,6,391,19.2,396.9,21.14,19.7
500
+ 0.26838,0,9.69,0,0.585,5.794,70.6,2.8927,6,391,19.2,396.9,14.1,18.3
501
+ 0.23912,0,9.69,0,0.585,6.019,65.3,2.4091,6,391,19.2,396.9,12.92,21.2
502
+ 0.17783,0,9.69,0,0.585,5.569,73.5,2.3999,6,391,19.2,395.77,15.1,17.5
503
+ 0.22438,0,9.69,0,0.585,6.027,79.7,2.4982,6,391,19.2,396.9,14.33,16.8
504
+ 0.06263,0,11.93,0,0.573,6.593,69.1,2.4786,1,273,21,391.99,9.67,22.4
505
+ 0.04527,0,11.93,0,0.573,6.12,76.7,2.2875,1,273,21,396.9,9.08,20.6
506
+ 0.06076,0,11.93,0,0.573,6.976,91,2.1675,1,273,21,396.9,5.64,23.9
507
+ 0.10959,0,11.93,0,0.573,6.794,89.3,2.3889,1,273,21,393.45,6.48,22
508
+ 0.04741,0,11.93,0,0.573,6.03,80.8,2.505,1,273,21,396.9,7.88,11.9
venv/lib/python3.10/site-packages/sklearn/datasets/data/breast_cancer.csv ADDED
The diff for this file is too large to render. See raw diff
 
venv/lib/python3.10/site-packages/sklearn/datasets/data/iris.csv ADDED
@@ -0,0 +1,151 @@
1
+ 150,4,setosa,versicolor,virginica
2
+ 5.1,3.5,1.4,0.2,0
3
+ 4.9,3.0,1.4,0.2,0
4
+ 4.7,3.2,1.3,0.2,0
5
+ 4.6,3.1,1.5,0.2,0
6
+ 5.0,3.6,1.4,0.2,0
7
+ 5.4,3.9,1.7,0.4,0
8
+ 4.6,3.4,1.4,0.3,0
9
+ 5.0,3.4,1.5,0.2,0
10
+ 4.4,2.9,1.4,0.2,0
11
+ 4.9,3.1,1.5,0.1,0
12
+ 5.4,3.7,1.5,0.2,0
13
+ 4.8,3.4,1.6,0.2,0
14
+ 4.8,3.0,1.4,0.1,0
15
+ 4.3,3.0,1.1,0.1,0
16
+ 5.8,4.0,1.2,0.2,0
17
+ 5.7,4.4,1.5,0.4,0
18
+ 5.4,3.9,1.3,0.4,0
19
+ 5.1,3.5,1.4,0.3,0
20
+ 5.7,3.8,1.7,0.3,0
21
+ 5.1,3.8,1.5,0.3,0
22
+ 5.4,3.4,1.7,0.2,0
23
+ 5.1,3.7,1.5,0.4,0
24
+ 4.6,3.6,1.0,0.2,0
25
+ 5.1,3.3,1.7,0.5,0
26
+ 4.8,3.4,1.9,0.2,0
27
+ 5.0,3.0,1.6,0.2,0
28
+ 5.0,3.4,1.6,0.4,0
29
+ 5.2,3.5,1.5,0.2,0
30
+ 5.2,3.4,1.4,0.2,0
31
+ 4.7,3.2,1.6,0.2,0
32
+ 4.8,3.1,1.6,0.2,0
33
+ 5.4,3.4,1.5,0.4,0
34
+ 5.2,4.1,1.5,0.1,0
35
+ 5.5,4.2,1.4,0.2,0
36
+ 4.9,3.1,1.5,0.2,0
37
+ 5.0,3.2,1.2,0.2,0
38
+ 5.5,3.5,1.3,0.2,0
39
+ 4.9,3.6,1.4,0.1,0
40
+ 4.4,3.0,1.3,0.2,0
41
+ 5.1,3.4,1.5,0.2,0
42
+ 5.0,3.5,1.3,0.3,0
43
+ 4.5,2.3,1.3,0.3,0
44
+ 4.4,3.2,1.3,0.2,0
45
+ 5.0,3.5,1.6,0.6,0
46
+ 5.1,3.8,1.9,0.4,0
47
+ 4.8,3.0,1.4,0.3,0
48
+ 5.1,3.8,1.6,0.2,0
49
+ 4.6,3.2,1.4,0.2,0
50
+ 5.3,3.7,1.5,0.2,0
51
+ 5.0,3.3,1.4,0.2,0
52
+ 7.0,3.2,4.7,1.4,1
53
+ 6.4,3.2,4.5,1.5,1
54
+ 6.9,3.1,4.9,1.5,1
55
+ 5.5,2.3,4.0,1.3,1
56
+ 6.5,2.8,4.6,1.5,1
57
+ 5.7,2.8,4.5,1.3,1
58
+ 6.3,3.3,4.7,1.6,1
59
+ 4.9,2.4,3.3,1.0,1
60
+ 6.6,2.9,4.6,1.3,1
61
+ 5.2,2.7,3.9,1.4,1
62
+ 5.0,2.0,3.5,1.0,1
63
+ 5.9,3.0,4.2,1.5,1
64
+ 6.0,2.2,4.0,1.0,1
65
+ 6.1,2.9,4.7,1.4,1
66
+ 5.6,2.9,3.6,1.3,1
67
+ 6.7,3.1,4.4,1.4,1
68
+ 5.6,3.0,4.5,1.5,1
69
+ 5.8,2.7,4.1,1.0,1
70
+ 6.2,2.2,4.5,1.5,1
71
+ 5.6,2.5,3.9,1.1,1
72
+ 5.9,3.2,4.8,1.8,1
73
+ 6.1,2.8,4.0,1.3,1
74
+ 6.3,2.5,4.9,1.5,1
75
+ 6.1,2.8,4.7,1.2,1
76
+ 6.4,2.9,4.3,1.3,1
77
+ 6.6,3.0,4.4,1.4,1
78
+ 6.8,2.8,4.8,1.4,1
79
+ 6.7,3.0,5.0,1.7,1
80
+ 6.0,2.9,4.5,1.5,1
81
+ 5.7,2.6,3.5,1.0,1
82
+ 5.5,2.4,3.8,1.1,1
83
+ 5.5,2.4,3.7,1.0,1
84
+ 5.8,2.7,3.9,1.2,1
85
+ 6.0,2.7,5.1,1.6,1
86
+ 5.4,3.0,4.5,1.5,1
87
+ 6.0,3.4,4.5,1.6,1
88
+ 6.7,3.1,4.7,1.5,1
89
+ 6.3,2.3,4.4,1.3,1
90
+ 5.6,3.0,4.1,1.3,1
91
+ 5.5,2.5,4.0,1.3,1
92
+ 5.5,2.6,4.4,1.2,1
93
+ 6.1,3.0,4.6,1.4,1
94
+ 5.8,2.6,4.0,1.2,1
95
+ 5.0,2.3,3.3,1.0,1
96
+ 5.6,2.7,4.2,1.3,1
97
+ 5.7,3.0,4.2,1.2,1
98
+ 5.7,2.9,4.2,1.3,1
99
+ 6.2,2.9,4.3,1.3,1
100
+ 5.1,2.5,3.0,1.1,1
101
+ 5.7,2.8,4.1,1.3,1
102
+ 6.3,3.3,6.0,2.5,2
103
+ 5.8,2.7,5.1,1.9,2
104
+ 7.1,3.0,5.9,2.1,2
105
+ 6.3,2.9,5.6,1.8,2
106
+ 6.5,3.0,5.8,2.2,2
107
+ 7.6,3.0,6.6,2.1,2
108
+ 4.9,2.5,4.5,1.7,2
109
+ 7.3,2.9,6.3,1.8,2
110
+ 6.7,2.5,5.8,1.8,2
111
+ 7.2,3.6,6.1,2.5,2
112
+ 6.5,3.2,5.1,2.0,2
113
+ 6.4,2.7,5.3,1.9,2
114
+ 6.8,3.0,5.5,2.1,2
115
+ 5.7,2.5,5.0,2.0,2
116
+ 5.8,2.8,5.1,2.4,2
117
+ 6.4,3.2,5.3,2.3,2
118
+ 6.5,3.0,5.5,1.8,2
119
+ 7.7,3.8,6.7,2.2,2
120
+ 7.7,2.6,6.9,2.3,2
121
+ 6.0,2.2,5.0,1.5,2
122
+ 6.9,3.2,5.7,2.3,2
123
+ 5.6,2.8,4.9,2.0,2
124
+ 7.7,2.8,6.7,2.0,2
125
+ 6.3,2.7,4.9,1.8,2
126
+ 6.7,3.3,5.7,2.1,2
127
+ 7.2,3.2,6.0,1.8,2
128
+ 6.2,2.8,4.8,1.8,2
129
+ 6.1,3.0,4.9,1.8,2
130
+ 6.4,2.8,5.6,2.1,2
131
+ 7.2,3.0,5.8,1.6,2
132
+ 7.4,2.8,6.1,1.9,2
133
+ 7.9,3.8,6.4,2.0,2
134
+ 6.4,2.8,5.6,2.2,2
135
+ 6.3,2.8,5.1,1.5,2
136
+ 6.1,2.6,5.6,1.4,2
137
+ 7.7,3.0,6.1,2.3,2
138
+ 6.3,3.4,5.6,2.4,2
139
+ 6.4,3.1,5.5,1.8,2
140
+ 6.0,3.0,4.8,1.8,2
141
+ 6.9,3.1,5.4,2.1,2
142
+ 6.7,3.1,5.6,2.4,2
143
+ 6.9,3.1,5.1,2.3,2
144
+ 5.8,2.7,5.1,1.9,2
145
+ 6.8,3.2,5.9,2.3,2
146
+ 6.7,3.3,5.7,2.5,2
147
+ 6.7,3.0,5.2,2.3,2
148
+ 6.3,2.5,5.0,1.9,2
149
+ 6.5,3.0,5.2,2.0,2
150
+ 6.2,3.4,5.4,2.3,2
151
+ 5.9,3.0,5.1,1.8,2
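The first row of this CSV ("150,4,setosa,versicolor,virginica") is a metadata header in scikit-learn's bundled-CSV format: the sample count, the feature count, and the target class names; each remaining row holds the four measurements followed by an integer class label. A minimal sketch of how :func:`sklearn.datasets.load_iris` exposes this file, assuming a standard scikit-learn install:

    >>> from sklearn.datasets import load_iris
    >>> iris = load_iris()
    >>> iris.data.shape
    (150, 4)
    >>> iris.target_names.tolist()
    ['setosa', 'versicolor', 'virginica']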
venv/lib/python3.10/site-packages/sklearn/datasets/data/linnerud_exercise.csv ADDED
@@ -0,0 +1,21 @@
1
+ Chins Situps Jumps
2
+ 5 162 60
3
+ 2 110 60
4
+ 12 101 101
5
+ 12 105 37
6
+ 13 155 58
7
+ 4 101 42
8
+ 8 101 38
9
+ 6 125 40
10
+ 15 200 40
11
+ 17 251 250
12
+ 17 120 38
13
+ 13 210 115
14
+ 14 215 105
15
+ 1 50 50
16
+ 6 70 31
17
+ 12 210 120
18
+ 4 60 25
19
+ 11 230 80
20
+ 15 225 73
21
+ 2 110 43
venv/lib/python3.10/site-packages/sklearn/datasets/data/linnerud_physiological.csv ADDED
@@ -0,0 +1,21 @@
1
+ Weight Waist Pulse
2
+ 191 36 50
3
+ 189 37 52
4
+ 193 38 58
5
+ 162 35 62
6
+ 189 35 46
7
+ 182 36 56
8
+ 211 38 56
9
+ 167 34 60
10
+ 176 31 74
11
+ 154 33 56
12
+ 169 34 50
13
+ 166 33 52
14
+ 154 34 64
15
+ 247 46 50
16
+ 193 36 46
17
+ 202 37 62
18
+ 176 37 54
19
+ 157 32 52
20
+ 156 33 54
21
+ 138 33 68
venv/lib/python3.10/site-packages/sklearn/datasets/data/wine_data.csv ADDED
@@ -0,0 +1,179 @@
1
+ 178,13,class_0,class_1,class_2
2
+ 14.23,1.71,2.43,15.6,127,2.8,3.06,0.28,2.29,5.64,1.04,3.92,1065,0
3
+ 13.2,1.78,2.14,11.2,100,2.65,2.76,0.26,1.28,4.38,1.05,3.4,1050,0
4
+ 13.16,2.36,2.67,18.6,101,2.8,3.24,0.3,2.81,5.68,1.03,3.17,1185,0
5
+ 14.37,1.95,2.5,16.8,113,3.85,3.49,0.24,2.18,7.8,0.86,3.45,1480,0
6
+ 13.24,2.59,2.87,21,118,2.8,2.69,0.39,1.82,4.32,1.04,2.93,735,0
7
+ 14.2,1.76,2.45,15.2,112,3.27,3.39,0.34,1.97,6.75,1.05,2.85,1450,0
8
+ 14.39,1.87,2.45,14.6,96,2.5,2.52,0.3,1.98,5.25,1.02,3.58,1290,0
9
+ 14.06,2.15,2.61,17.6,121,2.6,2.51,0.31,1.25,5.05,1.06,3.58,1295,0
10
+ 14.83,1.64,2.17,14,97,2.8,2.98,0.29,1.98,5.2,1.08,2.85,1045,0
11
+ 13.86,1.35,2.27,16,98,2.98,3.15,0.22,1.85,7.22,1.01,3.55,1045,0
12
+ 14.1,2.16,2.3,18,105,2.95,3.32,0.22,2.38,5.75,1.25,3.17,1510,0
13
+ 14.12,1.48,2.32,16.8,95,2.2,2.43,0.26,1.57,5,1.17,2.82,1280,0
14
+ 13.75,1.73,2.41,16,89,2.6,2.76,0.29,1.81,5.6,1.15,2.9,1320,0
15
+ 14.75,1.73,2.39,11.4,91,3.1,3.69,0.43,2.81,5.4,1.25,2.73,1150,0
16
+ 14.38,1.87,2.38,12,102,3.3,3.64,0.29,2.96,7.5,1.2,3,1547,0
17
+ 13.63,1.81,2.7,17.2,112,2.85,2.91,0.3,1.46,7.3,1.28,2.88,1310,0
18
+ 14.3,1.92,2.72,20,120,2.8,3.14,0.33,1.97,6.2,1.07,2.65,1280,0
19
+ 13.83,1.57,2.62,20,115,2.95,3.4,0.4,1.72,6.6,1.13,2.57,1130,0
20
+ 14.19,1.59,2.48,16.5,108,3.3,3.93,0.32,1.86,8.7,1.23,2.82,1680,0
21
+ 13.64,3.1,2.56,15.2,116,2.7,3.03,0.17,1.66,5.1,0.96,3.36,845,0
22
+ 14.06,1.63,2.28,16,126,3,3.17,0.24,2.1,5.65,1.09,3.71,780,0
23
+ 12.93,3.8,2.65,18.6,102,2.41,2.41,0.25,1.98,4.5,1.03,3.52,770,0
24
+ 13.71,1.86,2.36,16.6,101,2.61,2.88,0.27,1.69,3.8,1.11,4,1035,0
25
+ 12.85,1.6,2.52,17.8,95,2.48,2.37,0.26,1.46,3.93,1.09,3.63,1015,0
26
+ 13.5,1.81,2.61,20,96,2.53,2.61,0.28,1.66,3.52,1.12,3.82,845,0
27
+ 13.05,2.05,3.22,25,124,2.63,2.68,0.47,1.92,3.58,1.13,3.2,830,0
28
+ 13.39,1.77,2.62,16.1,93,2.85,2.94,0.34,1.45,4.8,0.92,3.22,1195,0
29
+ 13.3,1.72,2.14,17,94,2.4,2.19,0.27,1.35,3.95,1.02,2.77,1285,0
30
+ 13.87,1.9,2.8,19.4,107,2.95,2.97,0.37,1.76,4.5,1.25,3.4,915,0
31
+ 14.02,1.68,2.21,16,96,2.65,2.33,0.26,1.98,4.7,1.04,3.59,1035,0
32
+ 13.73,1.5,2.7,22.5,101,3,3.25,0.29,2.38,5.7,1.19,2.71,1285,0
33
+ 13.58,1.66,2.36,19.1,106,2.86,3.19,0.22,1.95,6.9,1.09,2.88,1515,0
34
+ 13.68,1.83,2.36,17.2,104,2.42,2.69,0.42,1.97,3.84,1.23,2.87,990,0
35
+ 13.76,1.53,2.7,19.5,132,2.95,2.74,0.5,1.35,5.4,1.25,3,1235,0
36
+ 13.51,1.8,2.65,19,110,2.35,2.53,0.29,1.54,4.2,1.1,2.87,1095,0
37
+ 13.48,1.81,2.41,20.5,100,2.7,2.98,0.26,1.86,5.1,1.04,3.47,920,0
38
+ 13.28,1.64,2.84,15.5,110,2.6,2.68,0.34,1.36,4.6,1.09,2.78,880,0
39
+ 13.05,1.65,2.55,18,98,2.45,2.43,0.29,1.44,4.25,1.12,2.51,1105,0
40
+ 13.07,1.5,2.1,15.5,98,2.4,2.64,0.28,1.37,3.7,1.18,2.69,1020,0
41
+ 14.22,3.99,2.51,13.2,128,3,3.04,0.2,2.08,5.1,0.89,3.53,760,0
42
+ 13.56,1.71,2.31,16.2,117,3.15,3.29,0.34,2.34,6.13,0.95,3.38,795,0
43
+ 13.41,3.84,2.12,18.8,90,2.45,2.68,0.27,1.48,4.28,0.91,3,1035,0
44
+ 13.88,1.89,2.59,15,101,3.25,3.56,0.17,1.7,5.43,0.88,3.56,1095,0
45
+ 13.24,3.98,2.29,17.5,103,2.64,2.63,0.32,1.66,4.36,0.82,3,680,0
46
+ 13.05,1.77,2.1,17,107,3,3,0.28,2.03,5.04,0.88,3.35,885,0
47
+ 14.21,4.04,2.44,18.9,111,2.85,2.65,0.3,1.25,5.24,0.87,3.33,1080,0
48
+ 14.38,3.59,2.28,16,102,3.25,3.17,0.27,2.19,4.9,1.04,3.44,1065,0
49
+ 13.9,1.68,2.12,16,101,3.1,3.39,0.21,2.14,6.1,0.91,3.33,985,0
50
+ 14.1,2.02,2.4,18.8,103,2.75,2.92,0.32,2.38,6.2,1.07,2.75,1060,0
51
+ 13.94,1.73,2.27,17.4,108,2.88,3.54,0.32,2.08,8.9,1.12,3.1,1260,0
52
+ 13.05,1.73,2.04,12.4,92,2.72,3.27,0.17,2.91,7.2,1.12,2.91,1150,0
53
+ 13.83,1.65,2.6,17.2,94,2.45,2.99,0.22,2.29,5.6,1.24,3.37,1265,0
54
+ 13.82,1.75,2.42,14,111,3.88,3.74,0.32,1.87,7.05,1.01,3.26,1190,0
55
+ 13.77,1.9,2.68,17.1,115,3,2.79,0.39,1.68,6.3,1.13,2.93,1375,0
56
+ 13.74,1.67,2.25,16.4,118,2.6,2.9,0.21,1.62,5.85,0.92,3.2,1060,0
57
+ 13.56,1.73,2.46,20.5,116,2.96,2.78,0.2,2.45,6.25,0.98,3.03,1120,0
58
+ 14.22,1.7,2.3,16.3,118,3.2,3,0.26,2.03,6.38,0.94,3.31,970,0
59
+ 13.29,1.97,2.68,16.8,102,3,3.23,0.31,1.66,6,1.07,2.84,1270,0
60
+ 13.72,1.43,2.5,16.7,108,3.4,3.67,0.19,2.04,6.8,0.89,2.87,1285,0
61
+ 12.37,0.94,1.36,10.6,88,1.98,0.57,0.28,0.42,1.95,1.05,1.82,520,1
62
+ 12.33,1.1,2.28,16,101,2.05,1.09,0.63,0.41,3.27,1.25,1.67,680,1
63
+ 12.64,1.36,2.02,16.8,100,2.02,1.41,0.53,0.62,5.75,0.98,1.59,450,1
64
+ 13.67,1.25,1.92,18,94,2.1,1.79,0.32,0.73,3.8,1.23,2.46,630,1
65
+ 12.37,1.13,2.16,19,87,3.5,3.1,0.19,1.87,4.45,1.22,2.87,420,1
66
+ 12.17,1.45,2.53,19,104,1.89,1.75,0.45,1.03,2.95,1.45,2.23,355,1
67
+ 12.37,1.21,2.56,18.1,98,2.42,2.65,0.37,2.08,4.6,1.19,2.3,678,1
68
+ 13.11,1.01,1.7,15,78,2.98,3.18,0.26,2.28,5.3,1.12,3.18,502,1
69
+ 12.37,1.17,1.92,19.6,78,2.11,2,0.27,1.04,4.68,1.12,3.48,510,1
70
+ 13.34,0.94,2.36,17,110,2.53,1.3,0.55,0.42,3.17,1.02,1.93,750,1
71
+ 12.21,1.19,1.75,16.8,151,1.85,1.28,0.14,2.5,2.85,1.28,3.07,718,1
72
+ 12.29,1.61,2.21,20.4,103,1.1,1.02,0.37,1.46,3.05,0.906,1.82,870,1
73
+ 13.86,1.51,2.67,25,86,2.95,2.86,0.21,1.87,3.38,1.36,3.16,410,1
74
+ 13.49,1.66,2.24,24,87,1.88,1.84,0.27,1.03,3.74,0.98,2.78,472,1
75
+ 12.99,1.67,2.6,30,139,3.3,2.89,0.21,1.96,3.35,1.31,3.5,985,1
76
+ 11.96,1.09,2.3,21,101,3.38,2.14,0.13,1.65,3.21,0.99,3.13,886,1
77
+ 11.66,1.88,1.92,16,97,1.61,1.57,0.34,1.15,3.8,1.23,2.14,428,1
78
+ 13.03,0.9,1.71,16,86,1.95,2.03,0.24,1.46,4.6,1.19,2.48,392,1
79
+ 11.84,2.89,2.23,18,112,1.72,1.32,0.43,0.95,2.65,0.96,2.52,500,1
80
+ 12.33,0.99,1.95,14.8,136,1.9,1.85,0.35,2.76,3.4,1.06,2.31,750,1
81
+ 12.7,3.87,2.4,23,101,2.83,2.55,0.43,1.95,2.57,1.19,3.13,463,1
82
+ 12,0.92,2,19,86,2.42,2.26,0.3,1.43,2.5,1.38,3.12,278,1
83
+ 12.72,1.81,2.2,18.8,86,2.2,2.53,0.26,1.77,3.9,1.16,3.14,714,1
84
+ 12.08,1.13,2.51,24,78,2,1.58,0.4,1.4,2.2,1.31,2.72,630,1
85
+ 13.05,3.86,2.32,22.5,85,1.65,1.59,0.61,1.62,4.8,0.84,2.01,515,1
86
+ 11.84,0.89,2.58,18,94,2.2,2.21,0.22,2.35,3.05,0.79,3.08,520,1
87
+ 12.67,0.98,2.24,18,99,2.2,1.94,0.3,1.46,2.62,1.23,3.16,450,1
88
+ 12.16,1.61,2.31,22.8,90,1.78,1.69,0.43,1.56,2.45,1.33,2.26,495,1
89
+ 11.65,1.67,2.62,26,88,1.92,1.61,0.4,1.34,2.6,1.36,3.21,562,1
90
+ 11.64,2.06,2.46,21.6,84,1.95,1.69,0.48,1.35,2.8,1,2.75,680,1
91
+ 12.08,1.33,2.3,23.6,70,2.2,1.59,0.42,1.38,1.74,1.07,3.21,625,1
92
+ 12.08,1.83,2.32,18.5,81,1.6,1.5,0.52,1.64,2.4,1.08,2.27,480,1
93
+ 12,1.51,2.42,22,86,1.45,1.25,0.5,1.63,3.6,1.05,2.65,450,1
94
+ 12.69,1.53,2.26,20.7,80,1.38,1.46,0.58,1.62,3.05,0.96,2.06,495,1
95
+ 12.29,2.83,2.22,18,88,2.45,2.25,0.25,1.99,2.15,1.15,3.3,290,1
96
+ 11.62,1.99,2.28,18,98,3.02,2.26,0.17,1.35,3.25,1.16,2.96,345,1
97
+ 12.47,1.52,2.2,19,162,2.5,2.27,0.32,3.28,2.6,1.16,2.63,937,1
98
+ 11.81,2.12,2.74,21.5,134,1.6,0.99,0.14,1.56,2.5,0.95,2.26,625,1
99
+ 12.29,1.41,1.98,16,85,2.55,2.5,0.29,1.77,2.9,1.23,2.74,428,1
100
+ 12.37,1.07,2.1,18.5,88,3.52,3.75,0.24,1.95,4.5,1.04,2.77,660,1
101
+ 12.29,3.17,2.21,18,88,2.85,2.99,0.45,2.81,2.3,1.42,2.83,406,1
102
+ 12.08,2.08,1.7,17.5,97,2.23,2.17,0.26,1.4,3.3,1.27,2.96,710,1
103
+ 12.6,1.34,1.9,18.5,88,1.45,1.36,0.29,1.35,2.45,1.04,2.77,562,1
104
+ 12.34,2.45,2.46,21,98,2.56,2.11,0.34,1.31,2.8,0.8,3.38,438,1
105
+ 11.82,1.72,1.88,19.5,86,2.5,1.64,0.37,1.42,2.06,0.94,2.44,415,1
106
+ 12.51,1.73,1.98,20.5,85,2.2,1.92,0.32,1.48,2.94,1.04,3.57,672,1
107
+ 12.42,2.55,2.27,22,90,1.68,1.84,0.66,1.42,2.7,0.86,3.3,315,1
108
+ 12.25,1.73,2.12,19,80,1.65,2.03,0.37,1.63,3.4,1,3.17,510,1
109
+ 12.72,1.75,2.28,22.5,84,1.38,1.76,0.48,1.63,3.3,0.88,2.42,488,1
110
+ 12.22,1.29,1.94,19,92,2.36,2.04,0.39,2.08,2.7,0.86,3.02,312,1
111
+ 11.61,1.35,2.7,20,94,2.74,2.92,0.29,2.49,2.65,0.96,3.26,680,1
112
+ 11.46,3.74,1.82,19.5,107,3.18,2.58,0.24,3.58,2.9,0.75,2.81,562,1
113
+ 12.52,2.43,2.17,21,88,2.55,2.27,0.26,1.22,2,0.9,2.78,325,1
114
+ 11.76,2.68,2.92,20,103,1.75,2.03,0.6,1.05,3.8,1.23,2.5,607,1
115
+ 11.41,0.74,2.5,21,88,2.48,2.01,0.42,1.44,3.08,1.1,2.31,434,1
116
+ 12.08,1.39,2.5,22.5,84,2.56,2.29,0.43,1.04,2.9,0.93,3.19,385,1
117
+ 11.03,1.51,2.2,21.5,85,2.46,2.17,0.52,2.01,1.9,1.71,2.87,407,1
118
+ 11.82,1.47,1.99,20.8,86,1.98,1.6,0.3,1.53,1.95,0.95,3.33,495,1
119
+ 12.42,1.61,2.19,22.5,108,2,2.09,0.34,1.61,2.06,1.06,2.96,345,1
120
+ 12.77,3.43,1.98,16,80,1.63,1.25,0.43,0.83,3.4,0.7,2.12,372,1
121
+ 12,3.43,2,19,87,2,1.64,0.37,1.87,1.28,0.93,3.05,564,1
122
+ 11.45,2.4,2.42,20,96,2.9,2.79,0.32,1.83,3.25,0.8,3.39,625,1
123
+ 11.56,2.05,3.23,28.5,119,3.18,5.08,0.47,1.87,6,0.93,3.69,465,1
124
+ 12.42,4.43,2.73,26.5,102,2.2,2.13,0.43,1.71,2.08,0.92,3.12,365,1
125
+ 13.05,5.8,2.13,21.5,86,2.62,2.65,0.3,2.01,2.6,0.73,3.1,380,1
126
+ 11.87,4.31,2.39,21,82,2.86,3.03,0.21,2.91,2.8,0.75,3.64,380,1
127
+ 12.07,2.16,2.17,21,85,2.6,2.65,0.37,1.35,2.76,0.86,3.28,378,1
128
+ 12.43,1.53,2.29,21.5,86,2.74,3.15,0.39,1.77,3.94,0.69,2.84,352,1
129
+ 11.79,2.13,2.78,28.5,92,2.13,2.24,0.58,1.76,3,0.97,2.44,466,1
130
+ 12.37,1.63,2.3,24.5,88,2.22,2.45,0.4,1.9,2.12,0.89,2.78,342,1
131
+ 12.04,4.3,2.38,22,80,2.1,1.75,0.42,1.35,2.6,0.79,2.57,580,1
132
+ 12.86,1.35,2.32,18,122,1.51,1.25,0.21,0.94,4.1,0.76,1.29,630,2
133
+ 12.88,2.99,2.4,20,104,1.3,1.22,0.24,0.83,5.4,0.74,1.42,530,2
134
+ 12.81,2.31,2.4,24,98,1.15,1.09,0.27,0.83,5.7,0.66,1.36,560,2
135
+ 12.7,3.55,2.36,21.5,106,1.7,1.2,0.17,0.84,5,0.78,1.29,600,2
136
+ 12.51,1.24,2.25,17.5,85,2,0.58,0.6,1.25,5.45,0.75,1.51,650,2
137
+ 12.6,2.46,2.2,18.5,94,1.62,0.66,0.63,0.94,7.1,0.73,1.58,695,2
138
+ 12.25,4.72,2.54,21,89,1.38,0.47,0.53,0.8,3.85,0.75,1.27,720,2
139
+ 12.53,5.51,2.64,25,96,1.79,0.6,0.63,1.1,5,0.82,1.69,515,2
140
+ 13.49,3.59,2.19,19.5,88,1.62,0.48,0.58,0.88,5.7,0.81,1.82,580,2
141
+ 12.84,2.96,2.61,24,101,2.32,0.6,0.53,0.81,4.92,0.89,2.15,590,2
142
+ 12.93,2.81,2.7,21,96,1.54,0.5,0.53,0.75,4.6,0.77,2.31,600,2
143
+ 13.36,2.56,2.35,20,89,1.4,0.5,0.37,0.64,5.6,0.7,2.47,780,2
144
+ 13.52,3.17,2.72,23.5,97,1.55,0.52,0.5,0.55,4.35,0.89,2.06,520,2
145
+ 13.62,4.95,2.35,20,92,2,0.8,0.47,1.02,4.4,0.91,2.05,550,2
146
+ 12.25,3.88,2.2,18.5,112,1.38,0.78,0.29,1.14,8.21,0.65,2,855,2
147
+ 13.16,3.57,2.15,21,102,1.5,0.55,0.43,1.3,4,0.6,1.68,830,2
148
+ 13.88,5.04,2.23,20,80,0.98,0.34,0.4,0.68,4.9,0.58,1.33,415,2
149
+ 12.87,4.61,2.48,21.5,86,1.7,0.65,0.47,0.86,7.65,0.54,1.86,625,2
150
+ 13.32,3.24,2.38,21.5,92,1.93,0.76,0.45,1.25,8.42,0.55,1.62,650,2
151
+ 13.08,3.9,2.36,21.5,113,1.41,1.39,0.34,1.14,9.4,0.57,1.33,550,2
152
+ 13.5,3.12,2.62,24,123,1.4,1.57,0.22,1.25,8.6,0.59,1.3,500,2
153
+ 12.79,2.67,2.48,22,112,1.48,1.36,0.24,1.26,10.8,0.48,1.47,480,2
154
+ 13.11,1.9,2.75,25.5,116,2.2,1.28,0.26,1.56,7.1,0.61,1.33,425,2
155
+ 13.23,3.3,2.28,18.5,98,1.8,0.83,0.61,1.87,10.52,0.56,1.51,675,2
156
+ 12.58,1.29,2.1,20,103,1.48,0.58,0.53,1.4,7.6,0.58,1.55,640,2
157
+ 13.17,5.19,2.32,22,93,1.74,0.63,0.61,1.55,7.9,0.6,1.48,725,2
158
+ 13.84,4.12,2.38,19.5,89,1.8,0.83,0.48,1.56,9.01,0.57,1.64,480,2
159
+ 12.45,3.03,2.64,27,97,1.9,0.58,0.63,1.14,7.5,0.67,1.73,880,2
160
+ 14.34,1.68,2.7,25,98,2.8,1.31,0.53,2.7,13,0.57,1.96,660,2
161
+ 13.48,1.67,2.64,22.5,89,2.6,1.1,0.52,2.29,11.75,0.57,1.78,620,2
162
+ 12.36,3.83,2.38,21,88,2.3,0.92,0.5,1.04,7.65,0.56,1.58,520,2
163
+ 13.69,3.26,2.54,20,107,1.83,0.56,0.5,0.8,5.88,0.96,1.82,680,2
164
+ 12.85,3.27,2.58,22,106,1.65,0.6,0.6,0.96,5.58,0.87,2.11,570,2
165
+ 12.96,3.45,2.35,18.5,106,1.39,0.7,0.4,0.94,5.28,0.68,1.75,675,2
166
+ 13.78,2.76,2.3,22,90,1.35,0.68,0.41,1.03,9.58,0.7,1.68,615,2
167
+ 13.73,4.36,2.26,22.5,88,1.28,0.47,0.52,1.15,6.62,0.78,1.75,520,2
168
+ 13.45,3.7,2.6,23,111,1.7,0.92,0.43,1.46,10.68,0.85,1.56,695,2
169
+ 12.82,3.37,2.3,19.5,88,1.48,0.66,0.4,0.97,10.26,0.72,1.75,685,2
170
+ 13.58,2.58,2.69,24.5,105,1.55,0.84,0.39,1.54,8.66,0.74,1.8,750,2
171
+ 13.4,4.6,2.86,25,112,1.98,0.96,0.27,1.11,8.5,0.67,1.92,630,2
172
+ 12.2,3.03,2.32,19,96,1.25,0.49,0.4,0.73,5.5,0.66,1.83,510,2
173
+ 12.77,2.39,2.28,19.5,86,1.39,0.51,0.48,0.64,9.899999,0.57,1.63,470,2
174
+ 14.16,2.51,2.48,20,91,1.68,0.7,0.44,1.24,9.7,0.62,1.71,660,2
175
+ 13.71,5.65,2.45,20.5,95,1.68,0.61,0.52,1.06,7.7,0.64,1.74,740,2
176
+ 13.4,3.91,2.48,23,102,1.8,0.75,0.43,1.41,7.3,0.7,1.56,750,2
177
+ 13.27,4.28,2.26,20,120,1.59,0.69,0.43,1.35,10.2,0.59,1.56,835,2
178
+ 13.17,2.59,2.37,20,120,1.65,0.68,0.53,1.46,9.3,0.6,1.62,840,2
179
+ 14.13,4.1,2.74,24.5,96,2.05,0.76,0.56,1.35,9.2,0.61,1.6,560,2
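As with the other bundled CSVs, the header row ("178,13,class_0,class_1,class_2") encodes the sample count, feature count, and target names. A quick, hedged check with :func:`sklearn.datasets.load_wine`, assuming a standard install:

    >>> from sklearn.datasets import load_wine
    >>> wine = load_wine()
    >>> wine.data.shape
    (178, 13)
    >>> wine.target_names.tolist()
    ['class_0', 'class_1', 'class_2']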
venv/lib/python3.10/site-packages/sklearn/datasets/descr/__init__.py ADDED
File without changes
venv/lib/python3.10/site-packages/sklearn/datasets/descr/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (190 Bytes). View file
 
venv/lib/python3.10/site-packages/sklearn/datasets/descr/breast_cancer.rst ADDED
@@ -0,0 +1,122 @@
1
+ .. _breast_cancer_dataset:
2
+
3
+ Breast cancer wisconsin (diagnostic) dataset
4
+ --------------------------------------------
5
+
6
+ **Data Set Characteristics:**
7
+
8
+ :Number of Instances: 569
9
+
10
+ :Number of Attributes: 30 numeric, predictive attributes and the class
11
+
12
+ :Attribute Information:
13
+ - radius (mean of distances from center to points on the perimeter)
14
+ - texture (standard deviation of gray-scale values)
15
+ - perimeter
16
+ - area
17
+ - smoothness (local variation in radius lengths)
18
+ - compactness (perimeter^2 / area - 1.0)
19
+ - concavity (severity of concave portions of the contour)
20
+ - concave points (number of concave portions of the contour)
21
+ - symmetry
22
+ - fractal dimension ("coastline approximation" - 1)
23
+
24
+ The mean, standard error, and "worst" or largest (mean of the three
25
+ worst/largest values) of these features were computed for each image,
26
+ resulting in 30 features. For instance, field 0 is Mean Radius, field
27
+ 10 is Radius SE, field 20 is Worst Radius.
28
+
29
+ - class:
30
+ - WDBC-Malignant
31
+ - WDBC-Benign
32
+
33
+ :Summary Statistics:
34
+
35
+ ===================================== ====== ======
36
+ Min Max
37
+ ===================================== ====== ======
38
+ radius (mean): 6.981 28.11
39
+ texture (mean): 9.71 39.28
40
+ perimeter (mean): 43.79 188.5
41
+ area (mean): 143.5 2501.0
42
+ smoothness (mean): 0.053 0.163
43
+ compactness (mean): 0.019 0.345
44
+ concavity (mean): 0.0 0.427
45
+ concave points (mean): 0.0 0.201
46
+ symmetry (mean): 0.106 0.304
47
+ fractal dimension (mean): 0.05 0.097
48
+ radius (standard error): 0.112 2.873
49
+ texture (standard error): 0.36 4.885
50
+ perimeter (standard error): 0.757 21.98
51
+ area (standard error): 6.802 542.2
52
+ smoothness (standard error): 0.002 0.031
53
+ compactness (standard error): 0.002 0.135
54
+ concavity (standard error): 0.0 0.396
55
+ concave points (standard error): 0.0 0.053
56
+ symmetry (standard error): 0.008 0.079
57
+ fractal dimension (standard error): 0.001 0.03
58
+ radius (worst): 7.93 36.04
59
+ texture (worst): 12.02 49.54
60
+ perimeter (worst): 50.41 251.2
61
+ area (worst): 185.2 4254.0
62
+ smoothness (worst): 0.071 0.223
63
+ compactness (worst): 0.027 1.058
64
+ concavity (worst): 0.0 1.252
65
+ concave points (worst): 0.0 0.291
66
+ symmetry (worst): 0.156 0.664
67
+ fractal dimension (worst): 0.055 0.208
68
+ ===================================== ====== ======
69
+
70
+ :Missing Attribute Values: None
71
+
72
+ :Class Distribution: 212 - Malignant, 357 - Benign
73
+
74
+ :Creator: Dr. William H. Wolberg, W. Nick Street, Olvi L. Mangasarian
75
+
76
+ :Donor: Nick Street
77
+
78
+ :Date: November, 1995
79
+
80
+ This is a copy of the UCI ML Breast Cancer Wisconsin (Diagnostic) dataset.
81
+ https://goo.gl/U2Uwz2
82
+
83
+ Features are computed from a digitized image of a fine needle
84
+ aspirate (FNA) of a breast mass. They describe
85
+ characteristics of the cell nuclei present in the image.
86
+
87
+ The separating plane described above was obtained using
88
+ Multisurface Method-Tree (MSM-T) [K. P. Bennett, "Decision Tree
89
+ Construction Via Linear Programming." Proceedings of the 4th
90
+ Midwest Artificial Intelligence and Cognitive Science Society,
91
+ pp. 97-101, 1992], a classification method which uses linear
92
+ programming to construct a decision tree. Relevant features
93
+ were selected using an exhaustive search in the space of 1-4
94
+ features and 1-3 separating planes.
95
+
96
+ The actual linear program used to obtain the separating plane
97
+ in the 3-dimensional space is that described in:
98
+ [K. P. Bennett and O. L. Mangasarian: "Robust Linear
99
+ Programming Discrimination of Two Linearly Inseparable Sets",
100
+ Optimization Methods and Software 1, 1992, 23-34].
101
+
102
+ This database is also available through the UW CS ftp server:
103
+
104
+ ftp ftp.cs.wisc.edu
105
+ cd math-prog/cpo-dataset/machine-learn/WDBC/
106
+
107
+ |details-start|
108
+ **References**
109
+ |details-split|
110
+
111
+ - W.N. Street, W.H. Wolberg and O.L. Mangasarian. Nuclear feature extraction
112
+ for breast tumor diagnosis. IS&T/SPIE 1993 International Symposium on
113
+ Electronic Imaging: Science and Technology, volume 1905, pages 861-870,
114
+ San Jose, CA, 1993.
115
+ - O.L. Mangasarian, W.N. Street and W.H. Wolberg. Breast cancer diagnosis and
116
+ prognosis via linear programming. Operations Research, 43(4), pages 570-577,
117
+ July-August 1995.
118
+ - W.H. Wolberg, W.N. Street, and O.L. Mangasarian. Machine learning techniques
119
+ to diagnose breast cancer from fine-needle aspirates. Cancer Letters 77 (1994)
120
+ 163-171.
121
+
122
+ |details-end|
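A minimal sketch illustrating the figures quoted above (569 instances, 30 features, 212 malignant / 357 benign), using the bundled loader; outputs assume a standard scikit-learn install:

    >>> from sklearn.datasets import load_breast_cancer
    >>> cancer = load_breast_cancer()
    >>> cancer.data.shape
    (569, 30)
    >>> cancer.target_names.tolist()
    ['malignant', 'benign']
    >>> int((cancer.target == 0).sum()), int((cancer.target == 1).sum())
    (212, 357)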
venv/lib/python3.10/site-packages/sklearn/datasets/descr/california_housing.rst ADDED
@@ -0,0 +1,46 @@
1
+ .. _california_housing_dataset:
2
+
3
+ California Housing dataset
4
+ --------------------------
5
+
6
+ **Data Set Characteristics:**
7
+
8
+ :Number of Instances: 20640
9
+
10
+ :Number of Attributes: 8 numeric, predictive attributes and the target
11
+
12
+ :Attribute Information:
13
+ - MedInc median income in block group
14
+ - HouseAge median house age in block group
15
+ - AveRooms average number of rooms per household
16
+ - AveBedrms average number of bedrooms per household
17
+ - Population block group population
18
+ - AveOccup average number of household members
19
+ - Latitude block group latitude
20
+ - Longitude block group longitude
21
+
22
+ :Missing Attribute Values: None
23
+
24
+ This dataset was obtained from the StatLib repository.
25
+ https://www.dcc.fc.up.pt/~ltorgo/Regression/cal_housing.html
26
+
27
+ The target variable is the median house value for California districts,
28
+ expressed in hundreds of thousands of dollars ($100,000).
29
+
30
+ This dataset was derived from the 1990 U.S. census, using one row per census
31
+ block group. A block group is the smallest geographical unit for which the U.S.
32
+ Census Bureau publishes sample data (a block group typically has a population
33
+ of 600 to 3,000 people).
34
+
35
+ A household is a group of people residing within a home. Since the average
36
+ number of rooms and bedrooms in this dataset are provided per household, these
37
+ columns may take surprisingly large values for block groups with few households
38
+ and many empty houses, such as vacation resorts.
39
+
40
+ It can be downloaded/loaded using the
41
+ :func:`sklearn.datasets.fetch_california_housing` function.
42
+
43
+ .. topic:: References
44
+
45
+ - Pace, R. Kelley and Ronald Barry, Sparse Spatial Autoregressions,
46
+ Statistics and Probability Letters, 33 (1997) 291-297
venv/lib/python3.10/site-packages/sklearn/datasets/descr/covtype.rst ADDED
@@ -0,0 +1,30 @@
1
+ .. _covtype_dataset:
2
+
3
+ Forest covertypes
4
+ -----------------
5
+
6
+ The samples in this dataset correspond to 30×30m patches of forest in the US,
7
+ collected for the task of predicting each patch's cover type,
8
+ i.e. the dominant species of tree.
9
+ There are seven covertypes, making this a multiclass classification problem.
10
+ Each sample has 54 features, described on the
11
+ `dataset's homepage <https://archive.ics.uci.edu/ml/datasets/Covertype>`__.
12
+ Some of the features are boolean indicators,
13
+ while others are discrete or continuous measurements.
14
+
15
+ **Data Set Characteristics:**
16
+
17
+ ================= ============
18
+ Classes 7
19
+ Samples total 581012
20
+ Dimensionality 54
21
+ Features int
22
+ ================= ============
23
+
24
+ :func:`sklearn.datasets.fetch_covtype` will load the covertype dataset;
25
+ it returns a dictionary-like 'Bunch' object
26
+ with the feature matrix in the ``data`` member
27
+ and the target values in ``target``. If the optional argument 'as_frame' is
28
+ set to 'True', it will return ``data`` and ``target`` as pandas
29
+ data frames, and there will be an additional ``frame`` member as well.
30
+ The dataset will be downloaded from the web if necessary.
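A brief sketch of the access pattern just described (network access is required on the first call; shapes follow the table above):

    >>> from sklearn.datasets import fetch_covtype
    >>> covtype = fetch_covtype()  # doctest: +SKIP
    >>> covtype.data.shape  # doctest: +SKIP
    (581012, 54)
    >>> covtype_frame = fetch_covtype(as_frame=True).frame  # doctest: +SKIP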
venv/lib/python3.10/site-packages/sklearn/datasets/descr/diabetes.rst ADDED
@@ -0,0 +1,38 @@
1
+ .. _diabetes_dataset:
2
+
3
+ Diabetes dataset
4
+ ----------------
5
+
6
+ Ten baseline variables, age, sex, body mass index, average blood
7
+ pressure, and six blood serum measurements were obtained for each of n =
8
+ 442 diabetes patients, as well as the response of interest, a
9
+ quantitative measure of disease progression one year after baseline.
10
+
11
+ **Data Set Characteristics:**
12
+
13
+ :Number of Instances: 442
14
+
15
+ :Number of Attributes: First 10 columns are numeric predictive values
16
+
17
+ :Target: Column 11 is a quantitative measure of disease progression one year after baseline
18
+
19
+ :Attribute Information:
20
+ - age age in years
21
+ - sex
22
+ - bmi body mass index
23
+ - bp average blood pressure
24
+ - s1 tc, total serum cholesterol
25
+ - s2 ldl, low-density lipoproteins
26
+ - s3 hdl, high-density lipoproteins
27
+ - s4 tch, total cholesterol / HDL
28
+ - s5 ltg, possibly log of serum triglycerides level
29
+ - s6 glu, blood sugar level
30
+
31
+ Note: Each of these 10 feature variables has been mean centered and scaled by the standard deviation times the square root of `n_samples` (i.e. the sum of squares of each column totals 1).
32
+
33
+ Source URL:
34
+ https://www4.stat.ncsu.edu/~boos/var.select/diabetes.html
35
+
36
+ For more information see:
37
+ Bradley Efron, Trevor Hastie, Iain Johnstone and Robert Tibshirani (2004) "Least Angle Regression," Annals of Statistics (with discussion), 407-499.
38
+ (https://web.stanford.edu/~hastie/Papers/LARS/LeastAngle_2002.pdf)
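The scaling note above can be verified directly; a minimal sketch (the sum of squares of each column should be 1, up to storage precision):

    >>> import numpy as np
    >>> from sklearn.datasets import load_diabetes
    >>> X, y = load_diabetes(return_X_y=True)
    >>> X.shape
    (442, 10)
    >>> np.allclose((X ** 2).sum(axis=0), 1.0)
    True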
venv/lib/python3.10/site-packages/sklearn/datasets/descr/digits.rst ADDED
@@ -0,0 +1,50 @@
1
+ .. _digits_dataset:
2
+
3
+ Optical recognition of handwritten digits dataset
4
+ --------------------------------------------------
5
+
6
+ **Data Set Characteristics:**
7
+
8
+ :Number of Instances: 1797
9
+ :Number of Attributes: 64
10
+ :Attribute Information: 8x8 image of integer pixels in the range 0..16.
11
+ :Missing Attribute Values: None
12
+ :Creator: E. Alpaydin (alpaydin '@' boun.edu.tr)
13
+ :Date: July; 1998
14
+
15
+ This is a copy of the test set of the UCI ML hand-written digits dataset
16
+ https://archive.ics.uci.edu/ml/datasets/Optical+Recognition+of+Handwritten+Digits
17
+
18
+ The data set contains images of hand-written digits: 10 classes where
19
+ each class refers to a digit.
20
+
21
+ Preprocessing programs made available by NIST were used to extract
22
+ normalized bitmaps of handwritten digits from a preprinted form. From a
23
+ total of 43 people, 30 contributed to the training set and a different 13
24
+ to the test set. 32x32 bitmaps are divided into nonoverlapping blocks of
25
+ 4x4 and the number of on pixels are counted in each block. This generates
26
+ an input matrix of 8x8 where each element is an integer in the range
27
+ 0..16. This reduces dimensionality and gives invariance to small
28
+ distortions.
29
+
30
+ For info on NIST preprocessing routines, see M. D. Garris, J. L. Blue, G.
31
+ T. Candela, D. L. Dimmick, J. Geist, P. J. Grother, S. A. Janet, and C.
32
+ L. Wilson, NIST Form-Based Handprint Recognition System, NISTIR 5469,
33
+ 1994.
34
+
35
+ |details-start|
36
+ **References**
37
+ |details-split|
38
+
39
+ - C. Kaynak (1995) Methods of Combining Multiple Classifiers and Their
40
+ Applications to Handwritten Digit Recognition, MSc Thesis, Institute of
41
+ Graduate Studies in Science and Engineering, Bogazici University.
42
+ - E. Alpaydin, C. Kaynak (1998) Cascading Classifiers, Kybernetika.
43
+ - Ken Tang and Ponnuthurai N. Suganthan and Xi Yao and A. Kai Qin.
44
+ Linear dimensionality reduction using relevance weighted LDA. School of
45
+ Electrical and Electronic Engineering Nanyang Technological University.
46
+ 2005.
47
+ - Claudio Gentile. A New Approximate Maximal Margin Classification
48
+ Algorithm. NIPS. 2000.
49
+
50
+ |details-end|
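A minimal sketch of the 8x8 / 0..16 structure described above, assuming a standard install:

    >>> from sklearn.datasets import load_digits
    >>> digits = load_digits()
    >>> digits.data.shape
    (1797, 64)
    >>> digits.images.shape
    (1797, 8, 8)
    >>> float(digits.data.min()), float(digits.data.max())
    (0.0, 16.0)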
venv/lib/python3.10/site-packages/sklearn/datasets/descr/iris.rst ADDED
@@ -0,0 +1,67 @@
1
+ .. _iris_dataset:
2
+
3
+ Iris plants dataset
4
+ --------------------
5
+
6
+ **Data Set Characteristics:**
7
+
8
+ :Number of Instances: 150 (50 in each of three classes)
9
+ :Number of Attributes: 4 numeric, predictive attributes and the class
10
+ :Attribute Information:
11
+ - sepal length in cm
12
+ - sepal width in cm
13
+ - petal length in cm
14
+ - petal width in cm
15
+ - class:
16
+ - Iris-Setosa
17
+ - Iris-Versicolour
18
+ - Iris-Virginica
19
+
20
+ :Summary Statistics:
21
+
22
+ ============== ==== ==== ======= ===== ====================
23
+ Min Max Mean SD Class Correlation
24
+ ============== ==== ==== ======= ===== ====================
25
+ sepal length: 4.3 7.9 5.84 0.83 0.7826
26
+ sepal width: 2.0 4.4 3.05 0.43 -0.4194
27
+ petal length: 1.0 6.9 3.76 1.76 0.9490 (high!)
28
+ petal width: 0.1 2.5 1.20 0.76 0.9565 (high!)
29
+ ============== ==== ==== ======= ===== ====================
30
+
31
+ :Missing Attribute Values: None
32
+ :Class Distribution: 33.3% for each of 3 classes.
33
+ :Creator: R.A. Fisher
34
+ :Donor: Michael Marshall (MARSHALL%[email protected])
35
+ :Date: July, 1988
36
+
37
+ The famous Iris database, first used by Sir R.A. Fisher. The dataset is taken
38
+ from Fisher's paper. Note that it's the same as in R, but not as in the UCI
39
+ Machine Learning Repository, which has two wrong data points.
40
+
41
+ This is perhaps the best known database to be found in the
42
+ pattern recognition literature. Fisher's paper is a classic in the field and
43
+ is referenced frequently to this day. (See Duda & Hart, for example.) The
44
+ data set contains 3 classes of 50 instances each, where each class refers to a
45
+ type of iris plant. One class is linearly separable from the other 2; the
46
+ latter are NOT linearly separable from each other.
47
+
48
+ |details-start|
49
+ **References**
50
+ |details-split|
51
+
52
+ - Fisher, R.A. "The use of multiple measurements in taxonomic problems"
53
+ Annual Eugenics, 7, Part II, 179-188 (1936); also in "Contributions to
54
+ Mathematical Statistics" (John Wiley, NY, 1950).
55
+ - Duda, R.O., & Hart, P.E. (1973) Pattern Classification and Scene Analysis.
56
+ (Q327.D83) John Wiley & Sons. ISBN 0-471-22361-1. See page 218.
57
+ - Dasarathy, B.V. (1980) "Nosing Around the Neighborhood: A New System
58
+ Structure and Classification Rule for Recognition in Partially Exposed
59
+ Environments". IEEE Transactions on Pattern Analysis and Machine
60
+ Intelligence, Vol. PAMI-2, No. 1, 67-71.
61
+ - Gates, G.W. (1972) "The Reduced Nearest Neighbor Rule". IEEE Transactions
62
+ on Information Theory, May 1972, 431-433.
63
+ - See also: 1988 MLC Proceedings, 54-64. Cheeseman et al.'s AUTOCLASS II
64
+ conceptual clustering system finds 3 classes in the data.
65
+ - Many, many more ...
66
+
67
+ |details-end|
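The linear-separability claim above is easy to check on the petal-length column alone; a minimal sketch:

    >>> from sklearn.datasets import load_iris
    >>> X, y = load_iris(return_X_y=True)
    >>> # setosa's petal lengths never overlap those of the other two classes
    >>> bool(X[y == 0, 2].max() < X[y != 0, 2].min())
    True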
venv/lib/python3.10/site-packages/sklearn/datasets/descr/kddcup99.rst ADDED
@@ -0,0 +1,94 @@
1
+ .. _kddcup99_dataset:
2
+
3
+ Kddcup 99 dataset
4
+ -----------------
5
+
6
+ The KDD Cup '99 dataset was created by processing the tcpdump portions
7
+ of the 1998 DARPA Intrusion Detection System (IDS) Evaluation dataset,
8
+ created by MIT Lincoln Lab [2]_. The artificial data (described on the `dataset's
9
+ homepage <https://kdd.ics.uci.edu/databases/kddcup99/kddcup99.html>`_) was
10
+ generated using a closed network and hand-injected attacks to produce a
11
+ large number of different types of attack with normal activity in the
12
+ background. As the initial goal was to produce a large training set for
13
+ supervised learning algorithms, there is a large proportion (80.1%) of
14
+ abnormal data which is unrealistic in real world, and inappropriate for
15
+ unsupervised anomaly detection which aims at detecting 'abnormal' data, i.e.:
16
+
17
+ * qualitatively different from normal data
18
+ * in large minority among the observations.
19
+
20
+ We thus transform the KDD Data set into two different data sets: SA and SF.
21
+
22
+ * SA is obtained by simply selecting all the normal data, and a small
23
+ proportion of abnormal data to gives an anomaly proportion of 1%.
24
+
25
+ * SF is obtained as in [3]_
26
+ by simply picking up the data whose attribute logged_in is positive, thus
27
+ focusing on the intrusion attack, which gives a proportion of 0.3% of
28
+ attack.
29
+
30
+ * http and smtp are two subsets of SF corresponding with third feature
31
+ equal to 'http' (resp. to 'smtp').
32
+
33
+ General KDD structure:
34
+
35
+ ================ ==========================================
36
+ Samples total 4898431
37
+ Dimensionality 41
38
+ Features discrete (int) or continuous (float)
39
+ Targets str, 'normal.' or name of the anomaly type
40
+ ================ ==========================================
41
+
42
+ SA structure:
43
+
44
+ ================ ==========================================
45
+ Samples total 976158
46
+ Dimensionality 41
47
+ Features discrete (int) or continuous (float)
48
+ Targets str, 'normal.' or name of the anomaly type
49
+ ================ ==========================================
50
+
51
+ SF structure:
52
+
53
+ ================ ==========================================
54
+ Samples total 699691
55
+ Dimensionality 4
56
+ Features discrete (int) or continuous (float)
57
+ Targets str, 'normal.' or name of the anomaly type
58
+ ================ ==========================================
59
+
60
+ http structure:
61
+
62
+ ================ ==========================================
63
+ Samples total 619052
64
+ Dimensionality 3
65
+ Features discrete (int) or continuous (float)
66
+ Targets str, 'normal.' or name of the anomaly type
67
+ ================ ==========================================
68
+
69
+ smtp structure:
70
+
71
+ ================ ==========================================
72
+ Samples total 95373
73
+ Dimensionality 3
74
+ Features discrete (int) or continuous (float)
75
+ Targets str, 'normal.' or name of the anomaly type
76
+ ================ ==========================================
77
+
78
+ :func:`sklearn.datasets.fetch_kddcup99` will load the kddcup99 dataset; it
79
+ returns a dictionary-like object with the feature matrix in the ``data`` member
80
+ and the target values in ``target``. The "as_frame" optional argument converts
81
+ ``data`` into a pandas DataFrame and ``target`` into a pandas Series. The
82
+ dataset will be downloaded from the web if necessary.
83
+
84
+ .. topic:: References
85
+
86
+ .. [2] Analysis and Results of the 1999 DARPA Off-Line Intrusion
87
+ Detection Evaluation, Richard Lippmann, Joshua W. Haines,
88
+ David J. Fried, Jonathan Korba, Kumar Das.
89
+
90
+ .. [3] K. Yamanishi, J.-I. Takeuchi, G. Williams, and P. Milne. Online
91
+ unsupervised outlier detection using finite mixtures with
92
+ discounting learning algorithms. In Proceedings of the sixth
93
+ ACM SIGKDD international conference on Knowledge discovery
94
+ and data mining, pages 320-324. ACM Press, 2000.
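A hedged sketch of loading one of the derived subsets described above; the full download is several hundred MB, so the 10% subsample is used and the statements are skipped:

    >>> from sklearn.datasets import fetch_kddcup99
    >>> sf = fetch_kddcup99(subset='SF', percent10=True)  # doctest: +SKIP
    >>> sf.data.shape[1]  # 4 features, per the SF table above  # doctest: +SKIP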
venv/lib/python3.10/site-packages/sklearn/datasets/descr/lfw.rst ADDED
@@ -0,0 +1,128 @@
1
+ .. _labeled_faces_in_the_wild_dataset:
2
+
3
+ The Labeled Faces in the Wild face recognition dataset
4
+ ------------------------------------------------------
5
+
6
+ This dataset is a collection of JPEG pictures of famous people collected
7
+ over the internet, all details are available on the official website:
8
+
9
+ http://vis-www.cs.umass.edu/lfw/
10
+
11
+ Each picture is centered on a single face. The typical task is called
12
+ Face Verification: given a pair of two pictures, a binary classifier
13
+ must predict whether the two images are from the same person.
14
+
15
+ An alternative task, Face Recognition or Face Identification is:
16
+ given the picture of the face of an unknown person, identify the name
17
+ of the person by referring to a gallery of previously seen pictures of
18
+ identified persons.
19
+
20
+ Both Face Verification and Face Recognition are tasks that are typically
21
+ performed on the output of a model trained to perform Face Detection. The
22
+ most popular model for Face Detection is called Viola-Jones and is
23
+ implemented in the OpenCV library. The LFW faces were extracted by this
24
+ face detector from various online websites.
25
+
26
+ **Data Set Characteristics:**
27
+
28
+ ================= =======================
29
+ Classes 5749
30
+ Samples total 13233
31
+ Dimensionality 5828
32
+ Features real, between 0 and 255
33
+ ================= =======================
34
+
35
+ |details-start|
36
+ **Usage**
37
+ |details-split|
38
+
39
+ ``scikit-learn`` provides two loaders that will automatically download,
40
+ cache, parse the metadata files, decode the jpeg and convert the
41
+ interesting slices into memmapped numpy arrays. This dataset is more than
42
+ 200 MB in size. The first load typically takes more than a couple of minutes
43
+ to fully decode the relevant part of the JPEG files into numpy arrays. Once
44
+ the dataset has been loaded, subsequent loads take less than 200 ms thanks to
45
+ a memmapped version memoized on disk in the
46
+ ``~/scikit_learn_data/lfw_home/`` folder using ``joblib``.
47
+
48
+ The first loader is used for the Face Identification task: a multi-class
49
+ classification task (hence supervised learning)::
50
+
51
+ >>> from sklearn.datasets import fetch_lfw_people
52
+ >>> lfw_people = fetch_lfw_people(min_faces_per_person=70, resize=0.4)
53
+
54
+ >>> for name in lfw_people.target_names:
55
+ ... print(name)
56
+ ...
57
+ Ariel Sharon
58
+ Colin Powell
59
+ Donald Rumsfeld
60
+ George W Bush
61
+ Gerhard Schroeder
62
+ Hugo Chavez
63
+ Tony Blair
64
+
65
+ The default slice is a rectangular shape around the face, removing
66
+ most of the background::
67
+
68
+ >>> lfw_people.data.dtype
69
+ dtype('float32')
70
+
71
+ >>> lfw_people.data.shape
72
+ (1288, 1850)
73
+
74
+ >>> lfw_people.images.shape
75
+ (1288, 50, 37)
76
+
77
+ Each of the ``1288`` faces is assigned to a single person id in the ``target``
78
+ array::
79
+
80
+ >>> lfw_people.target.shape
81
+ (1288,)
82
+
83
+ >>> list(lfw_people.target[:10])
84
+ [5, 6, 3, 1, 0, 1, 3, 4, 3, 0]
85
+
86
+ The second loader is typically used for the face verification task: each sample
87
+ is a pair of two pictures that may or may not belong to the same person::
88
+
89
+ >>> from sklearn.datasets import fetch_lfw_pairs
90
+ >>> lfw_pairs_train = fetch_lfw_pairs(subset='train')
91
+
92
+ >>> list(lfw_pairs_train.target_names)
93
+ ['Different persons', 'Same person']
94
+
95
+ >>> lfw_pairs_train.pairs.shape
96
+ (2200, 2, 62, 47)
97
+
98
+ >>> lfw_pairs_train.data.shape
99
+ (2200, 5828)
100
+
101
+ >>> lfw_pairs_train.target.shape
102
+ (2200,)
103
+
104
+ Both for the :func:`sklearn.datasets.fetch_lfw_people` and
105
+ :func:`sklearn.datasets.fetch_lfw_pairs` functions, it is
106
+ possible to get an additional dimension with the RGB color channels by
107
+ passing ``color=True``, in that case the shape will be
108
+ ``(2200, 2, 62, 47, 3)``.
109
+
110
+ The :func:`sklearn.datasets.fetch_lfw_pairs` dataset is subdivided into
111
+ 3 subsets: the development ``train`` set, the development ``test`` set and
112
+ an evaluation ``10_folds`` set meant to compute performance metrics using a
113
+ 10-fold cross-validation scheme.
114
+
115
+ |details-end|
116
+
117
+ .. topic:: References:
118
+
119
+ * `Labeled Faces in the Wild: A Database for Studying Face Recognition
120
+ in Unconstrained Environments.
121
+ <http://vis-www.cs.umass.edu/lfw/lfw.pdf>`_
122
+ Gary B. Huang, Manu Ramesh, Tamara Berg, and Erik Learned-Miller.
123
+ University of Massachusetts, Amherst, Technical Report 07-49, October, 2007.
124
+
125
+
126
+ .. topic:: Examples:
127
+
128
+ * :ref:`sphx_glr_auto_examples_applications_plot_face_recognition.py`
venv/lib/python3.10/site-packages/sklearn/datasets/descr/linnerud.rst ADDED
@@ -0,0 +1,28 @@
1
+ .. _linnerrud_dataset:
2
+
3
+ Linnerrud dataset
4
+ -----------------
5
+
6
+ **Data Set Characteristics:**
7
+
8
+ :Number of Instances: 20
9
+ :Number of Attributes: 3
10
+ :Missing Attribute Values: None
11
+
12
+ The Linnerud dataset is a multi-output regression dataset. It consists of three
13
+ exercise (data) and three physiological (target) variables collected from
14
+ twenty middle-aged men in a fitness club:
15
+
16
+ - *physiological* - CSV containing 20 observations on 3 physiological variables:
17
+ Weight, Waist and Pulse.
18
+ - *exercise* - CSV containing 20 observations on 3 exercise variables:
19
+ Chins, Situps and Jumps.
20
+
21
+ |details-start|
22
+ **References**
23
+ |details-split|
24
+
25
+ * Tenenhaus, M. (1998). La regression PLS: theorie et pratique. Paris:
26
+ Editions Technic.
27
+
28
+ |details-end|
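A minimal sketch of the multi-output layout described above: three exercise columns in ``data`` and three physiological columns in ``target``:

    >>> from sklearn.datasets import load_linnerud
    >>> linnerud = load_linnerud()
    >>> linnerud.data.shape, linnerud.target.shape
    ((20, 3), (20, 3))
    >>> linnerud.feature_names
    ['Chins', 'Situps', 'Jumps']
    >>> linnerud.target_names
    ['Weight', 'Waist', 'Pulse']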
venv/lib/python3.10/site-packages/sklearn/datasets/descr/olivetti_faces.rst ADDED
@@ -0,0 +1,44 @@
1
+ .. _olivetti_faces_dataset:
2
+
3
+ The Olivetti faces dataset
4
+ --------------------------
5
+
6
+ `This dataset contains a set of face images`_ taken between April 1992 and
7
+ April 1994 at AT&T Laboratories Cambridge. The
8
+ :func:`sklearn.datasets.fetch_olivetti_faces` function is the data
9
+ fetching / caching function that downloads the data
10
+ archive from AT&T.
11
+
12
+ .. _This dataset contains a set of face images: https://cam-orl.co.uk/facedatabase.html
13
+
14
+ As described on the original website:
15
+
16
+ There are ten different images of each of 40 distinct subjects. For some
17
+ subjects, the images were taken at different times, varying the lighting,
18
+ facial expressions (open / closed eyes, smiling / not smiling) and facial
19
+ details (glasses / no glasses). All the images were taken against a dark
20
+ homogeneous background with the subjects in an upright, frontal position
21
+ (with tolerance for some side movement).
22
+
23
+ **Data Set Characteristics:**
24
+
25
+ ================= =====================
26
+ Classes 40
27
+ Samples total 400
28
+ Dimensionality 4096
29
+ Features real, between 0 and 1
30
+ ================= =====================
31
+
32
+ The image is quantized to 256 grey levels and stored as unsigned 8-bit
33
+ integers; the loader will convert these to floating point values on the
34
+ interval [0, 1], which are easier to work with for many algorithms.
35
+
36
+ The "target" for this database is an integer from 0 to 39 indicating the
37
+ identity of the person pictured; however, with only 10 examples per class, this
38
+ relatively small dataset is more interesting from an unsupervised or
39
+ semi-supervised perspective.
40
+
41
+ The original dataset consisted of 92 x 112 images, while the version available here
42
+ consists of 64x64 images.
43
+
44
+ When using these images, please give credit to AT&T Laboratories Cambridge.
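A minimal sketch of the loader and the [0, 1] scaling described above; the first call downloads the archive, so the statements are skipped:

    >>> from sklearn.datasets import fetch_olivetti_faces
    >>> faces = fetch_olivetti_faces()  # doctest: +SKIP
    >>> faces.images.shape  # doctest: +SKIP
    (400, 64, 64)
    >>> faces.data.shape  # doctest: +SKIP
    (400, 4096)
    >>> float(faces.data.max()) <= 1.0  # doctest: +SKIP
    True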
venv/lib/python3.10/site-packages/sklearn/datasets/descr/rcv1.rst ADDED
@@ -0,0 +1,72 @@
1
+ .. _rcv1_dataset:
2
+
3
+ RCV1 dataset
4
+ ------------
5
+
6
+ Reuters Corpus Volume I (RCV1) is an archive of over 800,000 manually
7
+ categorized newswire stories made available by Reuters, Ltd. for research
8
+ purposes. The dataset is extensively described in [1]_.
9
+
10
+ **Data Set Characteristics:**
11
+
12
+ ============== =====================
13
+ Classes 103
14
+ Samples total 804414
15
+ Dimensionality 47236
16
+ Features real, between 0 and 1
17
+ ============== =====================
18
+
19
+ :func:`sklearn.datasets.fetch_rcv1` will load the following
20
+ version: RCV1-v2, vectors, full sets, topics multilabels::
21
+
22
+ >>> from sklearn.datasets import fetch_rcv1
23
+ >>> rcv1 = fetch_rcv1()
24
+
25
+ It returns a dictionary-like object, with the following attributes:
26
+
27
+ ``data``:
28
+ The feature matrix is a scipy CSR sparse matrix, with 804414 samples and
29
+ 47236 features. Non-zero values contains cosine-normalized, log TF-IDF vectors.
30
+ A nearly chronological split is proposed in [1]_: The first 23149 samples are
31
+ the training set. The last 781265 samples are the testing set. This follows
32
+ the official LYRL2004 chronological split. The array has 0.16% of non zero
33
+ values::
34
+
35
+ >>> rcv1.data.shape
36
+ (804414, 47236)
37
+
38
+ ``target``:
39
+ The target values are stored in a scipy CSR sparse matrix, with 804414 samples
40
+ and 103 categories. Each sample has a value of 1 in its categories, and 0 in
41
+ others. The array has 3.15% of non zero values::
42
+
43
+ >>> rcv1.target.shape
44
+ (804414, 103)
45
+
46
+ ``sample_id``:
47
+ Each sample can be identified by its ID, ranging (with gaps) from 2286
48
+ to 810596::
49
+
50
+ >>> rcv1.sample_id[:3]
51
+ array([2286, 2287, 2288], dtype=uint32)
52
+
53
+ ``target_names``:
54
+ The target values are the topics of each sample. Each sample belongs to at
55
+ least one topic, and to up to 17 topics. There are 103 topics, each
56
+ represented by a string. Their corpus frequencies span five orders of
57
+ magnitude, from 5 occurrences for 'GMIL', to 381327 for 'CCAT'::
58
+
59
+ >>> rcv1.target_names[:3].tolist() # doctest: +SKIP
60
+ ['E11', 'ECAT', 'M11']
61
+
62
+ The dataset will be downloaded from the `rcv1 homepage`_ if necessary.
63
+ The compressed size is about 656 MB.
64
+
65
+ .. _rcv1 homepage: http://jmlr.csail.mit.edu/papers/volume5/lewis04a/
66
+
67
+
68
+ .. topic:: References
69
+
70
+ .. [1] Lewis, D. D., Yang, Y., Rose, T. G., & Li, F. (2004).
71
+ RCV1: A new benchmark collection for text categorization research.
72
+ The Journal of Machine Learning Research, 5, 361-397.
venv/lib/python3.10/site-packages/sklearn/datasets/descr/species_distributions.rst ADDED
@@ -0,0 +1,36 @@
1
+ .. _species_distribution_dataset:
2
+
3
+ Species distribution dataset
4
+ ----------------------------
5
+
6
+ This dataset represents the geographic distribution of two species in Central and
7
+ South America. The two species are:
8
+
9
+ - `"Bradypus variegatus" <http://www.iucnredlist.org/details/3038/0>`_ ,
10
+ the Brown-throated Sloth.
11
+
12
+ - `"Microryzomys minutus" <http://www.iucnredlist.org/details/13408/0>`_ ,
13
+ also known as the Forest Small Rice Rat, a rodent that lives in Peru,
14
+ Colombia, Ecuador, and Venezuela.
15
+
16
+ The dataset is not a typical dataset since a :class:`~sklearn.datasets.base.Bunch`
17
+ containing the attributes `data` and `target` is not returned. Instead, we have
18
+ information that allows creating a "density" map of the different species.
19
+
20
+ The grid for the map can be built using the attributes `x_left_lower_corner`,
21
+ `y_left_lower_corner`, `Nx`, `Ny` and `grid_size`, which respectively correspond
22
+ to the x and y coordinates of the lower left corner of the grid, the number of
23
+ points along the x- and y-axis and the size of the step on the grid.
24
+
25
+ The density at each location of the grid is contained in the `coverages` attribute.
26
+
27
+ Finally, the `train` and `test` attributes contain information regarding the location
28
+ of a species at a specific location.
29
+
30
+ The dataset is provided by Phillips et al. (2006).
31
+
32
+ .. topic:: References
33
+
34
+ * `"Maximum entropy modeling of species geographic distributions"
35
+ <http://rob.schapire.net/papers/ecolmod.pdf>`_ S. J. Phillips,
36
+ R. P. Anderson, R. E. Schapire - Ecological Modelling, 190:231-259, 2006.
venv/lib/python3.10/site-packages/sklearn/datasets/descr/twenty_newsgroups.rst ADDED
@@ -0,0 +1,264 @@
+ .. _20newsgroups_dataset:
+
+ The 20 newsgroups text dataset
+ ------------------------------
+
+ The 20 newsgroups dataset comprises around 18000 newsgroups posts on
+ 20 topics split into two subsets: one for training (or development)
+ and the other one for testing (or for performance evaluation). The split
+ between the train and test set is based upon messages posted before
+ and after a specific date.
+
+ This module contains two loaders. The first one,
+ :func:`sklearn.datasets.fetch_20newsgroups`,
+ returns a list of the raw texts that can be fed to text feature
+ extractors such as :class:`~sklearn.feature_extraction.text.CountVectorizer`
+ with custom parameters so as to extract feature vectors.
+ The second one, :func:`sklearn.datasets.fetch_20newsgroups_vectorized`,
+ returns ready-to-use features, i.e., it is not necessary to use a feature
+ extractor.
+
+ **Data Set Characteristics:**
+
+ =================   ==========
+ Classes                     20
+ Samples total            18846
+ Dimensionality               1
+ Features                  text
+ =================   ==========
+
+ |details-start|
+ **Usage**
+ |details-split|
+
+ The :func:`sklearn.datasets.fetch_20newsgroups` function is a data
+ fetching / caching function that downloads the data archive from
+ the original `20 newsgroups website`_, extracts the archive contents
+ in the ``~/scikit_learn_data/20news_home`` folder and calls
+ :func:`sklearn.datasets.load_files` on either the training or
+ testing set folder, or both of them::
+
+     >>> from sklearn.datasets import fetch_20newsgroups
+     >>> newsgroups_train = fetch_20newsgroups(subset='train')
+
+     >>> from pprint import pprint
+     >>> pprint(list(newsgroups_train.target_names))
+     ['alt.atheism',
+      'comp.graphics',
+      'comp.os.ms-windows.misc',
+      'comp.sys.ibm.pc.hardware',
+      'comp.sys.mac.hardware',
+      'comp.windows.x',
+      'misc.forsale',
+      'rec.autos',
+      'rec.motorcycles',
+      'rec.sport.baseball',
+      'rec.sport.hockey',
+      'sci.crypt',
+      'sci.electronics',
+      'sci.med',
+      'sci.space',
+      'soc.religion.christian',
+      'talk.politics.guns',
+      'talk.politics.mideast',
+      'talk.politics.misc',
+      'talk.religion.misc']
+
+ The real data lies in the ``filenames`` and ``target`` attributes. The target
+ attribute is the integer index of the category::
+
+     >>> newsgroups_train.filenames.shape
+     (11314,)
+     >>> newsgroups_train.target.shape
+     (11314,)
+     >>> newsgroups_train.target[:10]
+     array([ 7,  4,  4,  1, 14, 16, 13,  3,  2,  4])
+
+ It is possible to load only a sub-selection of the categories by passing the
+ list of the categories to load to the
+ :func:`sklearn.datasets.fetch_20newsgroups` function::
+
+     >>> cats = ['alt.atheism', 'sci.space']
+     >>> newsgroups_train = fetch_20newsgroups(subset='train', categories=cats)
+
+     >>> list(newsgroups_train.target_names)
+     ['alt.atheism', 'sci.space']
+     >>> newsgroups_train.filenames.shape
+     (1073,)
+     >>> newsgroups_train.target.shape
+     (1073,)
+     >>> newsgroups_train.target[:10]
+     array([0, 1, 1, 1, 0, 1, 1, 0, 0, 0])
+
+ |details-end|
+
+ |details-start|
+ **Converting text to vectors**
+ |details-split|
+
+ In order to feed predictive or clustering models with the text data,
+ one first needs to turn the text into vectors of numerical values suitable
+ for statistical analysis. This can be achieved with the utilities of the
+ ``sklearn.feature_extraction.text`` module, as demonstrated in the following
+ example that extracts `TF-IDF`_ vectors of unigram tokens
+ from a subset of 20news::
+
+     >>> from sklearn.feature_extraction.text import TfidfVectorizer
+     >>> categories = ['alt.atheism', 'talk.religion.misc',
+     ...               'comp.graphics', 'sci.space']
+     >>> newsgroups_train = fetch_20newsgroups(subset='train',
+     ...                                       categories=categories)
+     >>> vectorizer = TfidfVectorizer()
+     >>> vectors = vectorizer.fit_transform(newsgroups_train.data)
+     >>> vectors.shape
+     (2034, 34118)
+
+ The extracted TF-IDF vectors are very sparse, with an average of 159 non-zero
+ components per sample in a more than 30000-dimensional space
+ (less than 0.5% non-zero features)::
+
+     >>> vectors.nnz / float(vectors.shape[0])
+     159.01327...
+
+ :func:`sklearn.datasets.fetch_20newsgroups_vectorized` is a function which
+ returns ready-to-use token count features instead of file names.
+
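+ A minimal usage sketch (the exact feature count may differ across
+ scikit-learn versions)::
+
+     >>> from sklearn.datasets import fetch_20newsgroups_vectorized
+     >>> newsgroups_vectorized = fetch_20newsgroups_vectorized(subset='train')
+     >>> newsgroups_vectorized.data.shape  # doctest: +SKIP
+     (11314, 130107)
+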
+ .. _`20 newsgroups website`: http://people.csail.mit.edu/jrennie/20Newsgroups/
+ .. _`TF-IDF`: https://en.wikipedia.org/wiki/Tf-idf
+
+ |details-end|
+
+ |details-start|
+ **Filtering text for more realistic training**
+ |details-split|
+
+ It is easy for a classifier to overfit on particular things that appear in the
+ 20 Newsgroups data, such as newsgroup headers. Many classifiers achieve very
+ high F-scores, but their results would not generalize to other documents that
+ aren't from this window of time.
+
+ For example, let's look at the results of a multinomial Naive Bayes classifier,
+ which is fast to train and achieves a decent F-score::
+
+     >>> from sklearn.naive_bayes import MultinomialNB
+     >>> from sklearn import metrics
+     >>> newsgroups_test = fetch_20newsgroups(subset='test',
+     ...                                      categories=categories)
+     >>> vectors_test = vectorizer.transform(newsgroups_test.data)
+     >>> clf = MultinomialNB(alpha=.01)
+     >>> clf.fit(vectors, newsgroups_train.target)
+     MultinomialNB(alpha=0.01, class_prior=None, fit_prior=True)
+
+     >>> pred = clf.predict(vectors_test)
+     >>> metrics.f1_score(newsgroups_test.target, pred, average='macro')
+     0.88213...
+
+ (The example :ref:`sphx_glr_auto_examples_text_plot_document_classification_20newsgroups.py` shuffles
+ the training and test data, instead of segmenting by time, and in that case
+ multinomial Naive Bayes gets a much higher F-score of 0.88. Are you suspicious
+ yet of what's going on inside this classifier?)
+
+ Let's take a look at what the most informative features are:
+
+     >>> import numpy as np
+     >>> def show_top10(classifier, vectorizer, categories):
+     ...     feature_names = vectorizer.get_feature_names_out()
+     ...     for i, category in enumerate(categories):
+     ...         top10 = np.argsort(classifier.coef_[i])[-10:]
+     ...         print("%s: %s" % (category, " ".join(feature_names[top10])))
+     ...
+     >>> show_top10(clf, vectorizer, newsgroups_train.target_names)
+     alt.atheism: edu it and in you that is of to the
+     comp.graphics: edu in graphics it is for and of to the
+     sci.space: edu it that is in and space to of the
+     talk.religion.misc: not it you in is that and to of the
+
+
+ You can now see many things that these features have overfit to:
+
+ - Almost every group is distinguished by whether headers such as
+   ``NNTP-Posting-Host:`` and ``Distribution:`` appear more or less often.
+ - Another significant feature involves whether the sender is affiliated with
+   a university, as indicated either by their headers or their signature.
+ - The word "article" is a significant feature, based on how often people quote
+   previous posts like this: "In article [article ID], [name] <[e-mail address]>
+   wrote:"
+ - Other features match the names and e-mail addresses of particular people who
+   were posting at the time.
+
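+ To see such clues directly, one can peek at the raw text of a post (a sketch;
+ the exact content depends on the document)::
+
+     >>> print(newsgroups_train.data[0][:100])  # doctest: +SKIP
+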
+ With such an abundance of clues that distinguish newsgroups, the classifiers
+ barely have to identify topics from text at all, and they all perform at the
+ same high level.
+
+ For this reason, the functions that load 20 Newsgroups data provide a
+ parameter called **remove**, telling it what kinds of information to strip out
+ of each file. **remove** should be a tuple containing any subset of
+ ``('headers', 'footers', 'quotes')``, telling it to remove headers, signature
+ blocks, and quotation blocks respectively.
+
+     >>> newsgroups_test = fetch_20newsgroups(subset='test',
+     ...                                      remove=('headers', 'footers', 'quotes'),
+     ...                                      categories=categories)
+     >>> vectors_test = vectorizer.transform(newsgroups_test.data)
+     >>> pred = clf.predict(vectors_test)
+     >>> metrics.f1_score(pred, newsgroups_test.target, average='macro')
+     0.77310...
+
+ This classifier lost a lot of its F-score, just because we removed
+ metadata that has little to do with topic classification.
+ It loses even more if we also strip this metadata from the training data:
+
+     >>> newsgroups_train = fetch_20newsgroups(subset='train',
+     ...                                       remove=('headers', 'footers', 'quotes'),
+     ...                                       categories=categories)
+     >>> vectors = vectorizer.fit_transform(newsgroups_train.data)
+     >>> clf = MultinomialNB(alpha=.01)
+     >>> clf.fit(vectors, newsgroups_train.target)
+     MultinomialNB(alpha=0.01, class_prior=None, fit_prior=True)
+
+     >>> vectors_test = vectorizer.transform(newsgroups_test.data)
+     >>> pred = clf.predict(vectors_test)
+     >>> metrics.f1_score(newsgroups_test.target, pred, average='macro')
+     0.76995...
+
+ Some other classifiers cope better with this harder version of the task. Try the
+ :ref:`sphx_glr_auto_examples_model_selection_plot_grid_search_text_feature_extraction.py`
+ example with and without the `remove` option to compare the results.
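+
+ For instance (a sketch, not part of the original document; scores are not
+ reproduced here), a linear model can be trained on the stripped vectors in
+ exactly the same way::
+
+     >>> from sklearn.linear_model import SGDClassifier
+     >>> clf = SGDClassifier(alpha=1e-4).fit(vectors, newsgroups_train.target)  # doctest: +SKIP
+     >>> metrics.f1_score(newsgroups_test.target, clf.predict(vectors_test), average='macro')  # doctest: +SKIP
+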
+ |details-end|
+
+ .. topic:: Data Considerations
+
+     The Cleveland Indians is a major league baseball team based in Cleveland,
+     Ohio, USA. In December 2020, it was reported that "After several months of
+     discussion sparked by the death of George Floyd and a national reckoning over
+     race and colonialism, the Cleveland Indians have decided to change their
+     name." Team owner Paul Dolan "did make it clear that the team will not make
+     its informal nickname -- the Tribe -- its new team name." "It's not going to
+     be a half-step away from the Indians," Dolan said. "We will not have a Native
+     American-themed name."
+
+     https://www.mlb.com/news/cleveland-indians-team-name-change
+
242
+ .. topic:: Recommendation
243
+
244
+ - When evaluating text classifiers on the 20 Newsgroups data, you
245
+ should strip newsgroup-related metadata. In scikit-learn, you can do this
246
+ by setting ``remove=('headers', 'footers', 'quotes')``. The F-score will be
247
+ lower because it is more realistic.
248
+ - This text dataset contains data which may be inappropriate for certain NLP
249
+ applications. An example is listed in the "Data Considerations" section
250
+ above. The challenge with using current text datasets in NLP for tasks such
251
+ as sentence completion, clustering, and other applications is that text
252
+ that is culturally biased and inflammatory will propagate biases. This
253
+ should be taken into consideration when using the dataset, reviewing the
254
+ output, and the bias should be documented.
255
+
256
+ .. topic:: Examples
257
+
258
+ * :ref:`sphx_glr_auto_examples_model_selection_plot_grid_search_text_feature_extraction.py`
259
+
260
+ * :ref:`sphx_glr_auto_examples_text_plot_document_classification_20newsgroups.py`
261
+
262
+ * :ref:`sphx_glr_auto_examples_text_plot_hashing_vs_dict_vectorizer.py`
263
+
264
+ * :ref:`sphx_glr_auto_examples_text_plot_document_clustering.py`