Latest commit: Update app.py (3183cd1, verified)

pages/                                           Update pages/Score_predicter.py
(unnamed)                            1.52 kB     initial commit
Cricket_players_data_cleaned1.csv    83.2 kB     Upload Cricket_players_data_cleaned1.csv
(unnamed)                            243 Bytes   initial commit
app.py                               2.67 kB     Update app.py
cric_final.csv                       84.4 kB     Upload cric_final.csv
label_encoder.pkl                    687 Bytes   Upload 3 files
  Detected Pickle imports (4): numpy.ndarray, numpy.dtype,
  sklearn.preprocessing._label.LabelEncoder, numpy._core.multiarray._reconstruct
requirements.txt                     98 Bytes    Create requirements.txt
score_prediction_model.pkl           109 kB      Upload 3 files
  Detected Pickle imports (8): numpy.ndarray, sklearn.ensemble._forest.RandomForestRegressor,
  joblib.numpy_pickle.NumpyArrayWrapper, sklearn.tree._tree.Tree, _codecs.encode,
  numpy._core.multiarray._reconstruct, numpy.dtype, sklearn.tree._classes.DecisionTreeRegressor
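The import lists above are what scikit-learn models saved with joblib pull in when unpickled: the estimator classes themselves plus numpy reconstruction helpers and joblib's NumpyArrayWrapper. A minimal sketch of that save/load round trip, using toy data and hypothetical filenames (the Space's actual training code and data are not reproduced here):

```python
import joblib
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.preprocessing import LabelEncoder

# Toy stand-in for a score model: two features (current runs, wickets down).
X = np.array([[120, 4], [150, 3], [90, 6], [180, 2]], dtype=float)
y = np.array([160.0, 185.0, 130.0, 210.0])
model = RandomForestRegressor(n_estimators=10, random_state=0).fit(X, y)

# A LabelEncoder like the one label_encoder.pkl's import list suggests.
enc = LabelEncoder().fit(["Australia", "England", "India"])

# joblib.dump serializes numpy arrays through NumpyArrayWrapper, which is
# why that name appears in the pickle scans above.
joblib.dump(model, "score_model_demo.pkl")
joblib.dump(enc, "label_encoder_demo.pkl")

# Loading restores fully fitted objects, ready to predict.
loaded_model = joblib.load("score_model_demo.pkl")
loaded_enc = joblib.load("label_encoder_demo.pkl")
pred = loaded_model.predict(np.array([[140.0, 4.0]]))
```

Note that `joblib.load` (like `pickle.load`) can execute arbitrary code embedded in the file, so only load .pkl files from sources you trust.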
svc_face_classifier.pkl              212 MB      Upload 3 files
  Detected Pickle imports (7): numpy._core.multiarray._reconstruct, numpy._core.multiarray.scalar,
  sklearn.pipeline.Pipeline, sklearn.preprocessing._data.MinMaxScaler, sklearn.svm._classes.SVC,
  numpy.ndarray, numpy.dtype
winner_prediction_model.pkl          531 kB      Upload winner_prediction_model.pkl
  Detected Pickle imports (7): numpy.ndarray, joblib.numpy_pickle.NumpyArrayWrapper,
  sklearn.ensemble._forest.RandomForestClassifier, sklearn.tree._classes.DecisionTreeClassifier,
  _codecs.encode, numpy._core.multiarray._reconstruct, numpy.dtype
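The "Detected Pickle imports" notes exist because unpickling executes whatever callables the file names via its GLOBAL/STACK_GLOBAL opcodes, so the Hub lists those names up front without ever loading the file. A rough, stdlib-only sketch of such a scan (it handles only the straightforward opcode patterns, not memoized module strings, so treat it as illustrative rather than a complete scanner):

```python
import pickle
import pickletools

def scan_imports(data: bytes) -> list[str]:
    """Collect module.name references from a pickle without executing it."""
    names, strings = [], []
    # pickletools.genops walks the opcode stream; nothing is ever unpickled.
    for op, arg, _pos in pickletools.genops(data):
        if op.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            strings.append(arg)  # candidate module/attribute strings
        elif op.name == "STACK_GLOBAL" and len(strings) >= 2:
            # STACK_GLOBAL consumes the two most recently pushed strings.
            names.append(f"{strings[-2]}.{strings[-1]}")
        elif op.name == "GLOBAL":  # older protocols: "module name" in arg
            names.append(arg.replace(" ", "."))
    return names

# A pickle of plain data references no globals at all...
plain = scan_imports(pickle.dumps({"overs": 20}))          # → []
# ...while pickling a function reference records its import path.
func = scan_imports(pickle.dumps(len))                     # → ["builtins.len"]
```

A real model file would surface entries like `sklearn.svm._classes.SVC` this way; any unexpected name (e.g. `os.system`) is a sign the file should not be loaded.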