__results___files ("add models")
(three files, names not captured in the listing: 245 Bytes, 2.62 MB, 249 Bytes; each "add models")
label_encoder.pkl (375 Bytes, "add models")
Detected Pickle imports (4):
- numpy.ndarray
- joblib.numpy_pickle.NumpyArrayWrapper
- numpy.dtype
- sklearn.preprocessing._label.LabelEncoder
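Each "Detected Pickle imports" list above is produced by statically scanning the pickle's opcode stream for the globals it would import on load; nothing is executed during the scan. A minimal sketch of such a scan using only the standard library (the STACK_GLOBAL handling is a simplification that assumes the module and attribute names are the two most recently pushed strings, which holds for straightforward pickles):

```python
import pickletools

def pickle_imports(data: bytes) -> set[str]:
    """Statically list the module.attr globals a pickle would import,
    without unpickling (and therefore without running) anything."""
    found = set()
    pushed_strings = []  # crude record of string arguments, for STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # protocols <= 3: argument is "module attr" in one string
            module, attr = arg.split(" ", 1)
            found.add(f"{module}.{attr}")
        elif opcode.name == "STACK_GLOBAL":
            # protocols >= 4: module and attr were pushed as strings
            if len(pushed_strings) >= 2:
                found.add(f"{pushed_strings[-2]}.{pushed_strings[-1]}")
        elif isinstance(arg, str):
            pushed_strings.append(arg)
    return found
```

Running this over the bytes of label_encoder.pkl should reproduce entries like the four listed above.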
lightgbm.pkl (192 kB, "add models")
Detected Pickle imports (8):
- collections.OrderedDict
- joblib.numpy_pickle.NumpyArrayWrapper
- lightgbm.sklearn.LGBMClassifier
- numpy.dtype
- numpy.ndarray
- numpy.core.multiarray.scalar
- collections.defaultdict
- lightgbm.basic.Booster
results.zip (2.87 MB, "add models")
Detected Pickle imports (23):
- numpy.ndarray
- joblib.numpy_pickle.NumpyArrayWrapper
- numpy.dtype
- numpy.ndarray
- joblib.numpy_pickle.NumpyArrayWrapper
- numpy.dtype
- sklearn.preprocessing._label.LabelEncoder
- collections.OrderedDict
- joblib.numpy_pickle.NumpyArrayWrapper
- lightgbm.sklearn.LGBMClassifier
- numpy.dtype
- numpy.ndarray
- numpy.core.multiarray.scalar
- collections.defaultdict
- lightgbm.basic.Booster
- joblib.numpy_pickle.NumpyArrayWrapper
- numpy.dtype
- numpy.ndarray
- numpy.core.multiarray.scalar
- sklearn.preprocessing._data.StandardScaler
- xgboost.core.Booster
- xgboost.sklearn.XGBClassifier
- builtins.bytearray
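The usual answer to "How to fix it?" when a pickle must be loaded at all is to never unpickle untrusted data, and if the file comes from a source you partially trust, to allowlist exactly the classes you expect. A sketch adapted from the restricted-unpickler example in the `pickle` documentation; the SAFE set below is illustrative and would need to be extended with the (module, attr) pairs shown in the scans above:

```python
import io
import pickle

# Illustrative allowlist: extend with exactly the pairs you expect,
# e.g. ("sklearn.preprocessing._label", "LabelEncoder").
SAFE = {
    ("collections", "OrderedDict"),
}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Called for every global the pickle references; refuse anything
        # not explicitly allowlisted.
        if (module, name) in SAFE:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked global: {module}.{name}")

def restricted_loads(data: bytes):
    """Unpickle, refusing any global not on the allowlist."""
    return RestrictedUnpickler(io.BytesIO(data)).load()
```

Note that an allowlist this coarse only prevents arbitrary globals from being imported; a class that is itself allowlisted can still run code in its constructor or `__setstate__`, so native export formats remain the safer fix.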
scaler.pkl (32.7 kB, "add models")
Detected Pickle imports (5):
- joblib.numpy_pickle.NumpyArrayWrapper
- numpy.dtype
- numpy.ndarray
- numpy.core.multiarray.scalar
- sklearn.preprocessing._data.StandardScaler
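For a simple estimator like the StandardScaler behind scaler.pkl, a common fix is to avoid pickle entirely: persist the learned parameters (its `mean_` and `scale_` arrays) in a plain data format such as JSON and reapply the arithmetic yourself. A sketch with made-up parameter values (the real ones would come from the fitted scaler, converted with `.tolist()`):

```python
import json

# Hypothetical fitted parameters; values here are invented for illustration.
payload = json.dumps({"mean": [0.5, 1.2], "scale": [0.1, 0.4]})

# Loading JSON is safe: it yields plain data, with no code-execution paths.
params = json.loads(payload)

def standardize(row, p):
    """Apply (x - mean) / scale per feature, as StandardScaler.transform does."""
    return [(x - m) / s for x, m, s in zip(row, p["mean"], p["scale"])]
```

For example, `standardize([0.6, 1.6], params)` yields values close to `[1.0, 1.0]` under these parameters.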
xgboost.pkl (150 kB, "add models")
Detected Pickle imports (3):
- xgboost.core.Booster
- xgboost.sklearn.XGBClassifier
- builtins.bytearray